The present invention relates to an information processing system.
There has been known a group messaging system in which a messenger server and a Cloud server operate in cooperation so that, when an activity such as file registration occurs with respect to a file managed in a shared group by the Cloud server, the occurrence of the activity is reported via a group chat room of the messenger mapped to the shared group, thereby enabling group file management using the messenger (see, for example, Patent Document 1).
A user may share a file among a plurality of users by using an information processing apparatus, such as a file server, that can perform file sharing among the users. Further, the users who share the file may exchange comments on the file by using an information processing apparatus such as a chat server.
However, there has been no scheme available to coordinate a function of the file sharing with a function of exchanging comments on the file in a terminal device that performs the file sharing and exchanges comments on the file among the users.
An embodiment of the present invention is made in light of this problem, and may provide an information processing system capable of coordinating the function of file sharing and the function of exchanging comments on the file so that they work together.
According to an aspect of the present invention, an information processing system includes one or more information processing apparatuses; and two or more terminal devices, including first and second terminal devices, which are connected to the one or more information processing apparatuses. Further, each of the information processing apparatuses includes a storage unit storing a file, and a first transmission unit transmitting, in response to a request from one of the terminal devices, the file stored in the storage unit to the one of the terminal devices. Further, the first terminal device includes an acquisition unit sending the request to the one or more information processing apparatuses to acquire the file stored in the storage unit, and acquiring the file, a first display unit including first and second display areas, the first display area displaying the file acquired by the acquisition unit, the second display area displaying messages transmitted to and received from the second terminal device, a reception unit receiving a selection of a certain area of the file displayed in the first display area by the first display unit and an operation to transmit the certain area as one of the messages transmitted to and received from the second terminal device, and a second transmission unit transmitting a message, which includes information indicating the certain area received by the reception unit, to the second terminal device. Further, the second terminal device includes a second display unit including first and second display areas, the first display area displaying the file, the second display area displaying messages transmitted to and received from the first terminal device. Further, the second display unit displays the message, which includes the information indicating the certain area and is transmitted from the first terminal device, in the second display area, and, upon receiving a selection of the displayed message, displays the file based on the information indicating the certain area included in the displayed message.
According to an aspect of the present invention, it becomes possible to coordinate a function of file sharing and a function of exchanging comments on the file so that they work together.
Next, embodiments of the present invention are described in detail.
The relay server 11, the chat server 12, and at least a part of the smart devices 13 are connected to a network N1 such as the Internet. Further, the file server 14 and at least a part of the smart devices 13 are connected to a network N2 such as a Local Area Network (LAN). The network N1 is connected to the network N2 via the FW 15.
The relay server 11 receives a “request”, which is addressed from the chat server 12 or a smart device 13 connected to the network N1 to the file server 14 connected to the network N2, and relays (outputs) the request to the file server 14.
The chat server 12 receives conversation content, etc., from the smart devices 13 to perform chatting among the smart devices 13, and distributes the conversation content, etc. The smart device 13 refers to a terminal device which is used by a user.
In the file server 14, for example, files shared by the users and logs of the conversation content of the conversations performed by the users are stored. The file server 14 is connected to the network N2. Therefore, the relay server 11, the chat server 12, and the smart devices 13 which are connected to the network N1 cannot directly access the file server 14. On the other hand, the file server 14 can access the relay server 11, the chat server 12, and the smart devices 13 which are connected to the network N1.
The file server 14 repeatedly makes an inquiry of the relay server 11 to determine whether the relay server 11 has received a “request”. When determining that the relay server 11 has received the request, the file server 14 acquires the request from the relay server 11 and performs processing on the request. Further, the file server 14 reports a processing result of the request to the relay server 11. The smart device 13 which sent the request can then receive the processing result of the request from the relay server 11. As described, a request from a smart device 13 connected to the network N1 to the file server 14 connected to the network N2 can be transmitted indirectly via the relay server 11.
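For illustration only, the following is a minimal sketch of this polling exchange, assuming hypothetical HTTP endpoints on the relay server 11 and hypothetical JSON field names; it shows only the outbound polling pattern described above and is not a definitive implementation.

```python
# Sketch of the file server 14 polling the relay server 11.
# Endpoint paths and field names are hypothetical assumptions.
import time
import requests

RELAY_URL = "https://relay.example.com"   # relay server 11 on the network N1 (hypothetical)
FILE_SERVER_ID = "fileserver-14"

def handle_request(req):
    # Placeholder: process the request, e.g. read or register a file.
    return {"status": "ok"}

def poll_and_process():
    while True:
        # Inquire whether a request addressed to this file server has been received.
        resp = requests.get(f"{RELAY_URL}/requests",
                            params={"file_server_id": FILE_SERVER_ID})
        for req in resp.json().get("requests", []):
            result = handle_request(req)
            # Report the processing result, which the relay server 11 passes
            # on to the smart device 13 that sent the request.
            requests.post(f"{RELAY_URL}/results",
                          json={"request_id": req["id"], "result": result})
        time.sleep(1)  # repeat the inquiry
```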
The relay server 11, the chat server 12, and the smart devices 13, which are connected to the network N1, can communicate with each other. Similarly, the smart devices 13 and the file server 14 which are connected to the network N2 can communicate with each other. In
Note that the configuration of the information processing system 1 of
The relay server 11, the chat server 12, and the file server 14 can be realized by a computer that has a hardware configuration as illustrated in
A computer 100 of
The input device 101 includes a keyboard, a mouse, a touch panel, etc., and is used to input various operation signals to the computer 100. The display device 102 includes a display, etc., and displays a processing result by the computer 100. The communication I/F 107 is an interface to connect the computer 100 to the networks N1 and N2. Via the communication I/F 107, the computer 100 can perform data communications with another computer 100.
The HDD 108 is a non-volatile storage device storing programs and data. The programs and data stored in the HDD 108 include, for example, an Operating System (OS), which is fundamental software to control the entire computer 100, and application software which provides various functions running on the OS. Further, the HDD 108 manages the programs and the data stored therein based on a predetermined file system and/or database (DB).
The external I/F 103 is an interface with an external device. The external device includes a recording medium 103a, etc. The computer 100 can read and write data from and to the recording medium 103a via the external I/F 103. The recording medium 103a includes a flexible disk, a Compact Disk (CD), a Digital Versatile Disk (DVD), an SD memory card, a Universal Serial Bus (USB) memory, etc.
The ROM 105 is a non-volatile semiconductor memory (storage device) which can hold programs and data stored therein even when power thereto is turned off. In the ROM 105, programs and data such as BIOS, which is executed when the computer 100 starts up, OS settings, network settings, etc., are stored. The RAM 104 is a volatile semiconductor memory (storage device) which temporarily stores programs and data.
The CPU 106 reads (loads) the programs and data from storage devices such as the ROM 105 and the HDD 108 onto the RAM 104 and executes processes, thereby realizing the control and the functions of the entire computer 100.
By having the hardware configuration described above, the computer according to an embodiment can execute various processes described below.
The smart device 13 according to an embodiment can be realized based on, for example, the processing blocks as illustrated in
The display section 21 displays the content of the file, the conversation content of chat, a file selection screen, etc., to a user. The operation receiving section 22 receives an operation from a user. The two-dimensional code read section 23 reads a two-dimensional code.
The image information generation section 24 generates image positional information such as the position and the file name of a partial image selected by a user from an image of the file displayed on the display section 21. The image generation section 25 generates an image based on the image positional information. The setting storage section 26 stores settings such as a user name, a password, a group, etc.
The data transmission section 27 transmits the conversation content of chat, the image positional information, etc. The data receiving section 28 receives the conversation content of chat, the image positional information, the file, etc. The file management section 29 stores and deletes a cache of the received file. The text information generation section 30 generates character string information such as the position and the file name of a character string selected by a user from the file displayed on the display section 21.
The chat server 12 according to an embodiment can be realized by, for example, processing blocks as illustrated in
The data transmission section 41 transmits data such as conversation content of chat. The data receiving section 42 receives data such as conversation content of chat. The user group management section 43 manages users who are participating in chat and a group to which conversation content of chat is to be transmitted. The data transmission destination determination section 44 determines the group to which conversation content of chat is to be transmitted. The chat server 12 provides chat functions.
The relay server 11 according to an embodiment can be realized by, for example, processing blocks as illustrated in
The data receiving section 51 receives, for example, data from the smart device 13 connected to the network N1, a smart device ID of the transmission source of the data, a file server ID of the transmission destination of the data, etc. The data storage section 52 stores various data, which are received by the data receiving section 51, in an associated manner. The request receiving section 53 receives, from the file server 14, the inquiry as to whether a “request” has been received.
The data determination section 54 determines whether there are stored data which are associated with the file server ID of the file server 14 from which the request receiving section 53 receives the inquiry. The data transmission section 55 transmits the stored data to the file server 14 from which the inquiry is received when the data determination section 54 determines that there are stored data.
The file server 14 according to an embodiment can be realized by, for example, processing blocks as illustrated in
The data transmission section 61 transmits a file and data such as a processing result of the request. The data receiving section 62 receives data such as a file, a log of conversation content of chat, the request from other smart devices 13, etc. The user group management section 63 manages users who are participating in chat and a group to which conversation content of chat is to be transmitted.
The file management section 64 stores the received file, reads the stored file, etc. The log management section 65 stores a log of conversation content of chat. The request inquiry section 66 queries the relay server 11 to determine whether there exists the request. The request processing section 67 performs processing on the request based on the content of the request.
In the following, details of the processing performed by the information processing system 1 according to an embodiment are described.
In the information processing system 1 according to an embodiment, it is necessary to register the smart devices 13 that are permitted to access the file server 14. For example, in the information processing system 1, the smart devices 13 that are permitted to access the file server 14 are registered (paired) by using a two-dimensional code as described below.
Note that the Web UI of
In step S2, the smart device 13 accesses the link to be used for activation (i.e., the address for the activation) while transmitting the smart device ID of the smart device 13.
In step S3, after accessing the file server 14 using the link to be used for the activation, the smart device 13 determines whether the smart device 13 is registered in the file server 14. In step S4, when accessing the file server 14 using the link to be used for the activation and determining that the smart device 13 is registered in the file server 14, the smart device 13 displays a successful screen as illustrated in
The flowchart of
The file server 14 does not permit access from the smart device 13 that has not performed the smart device registration process of
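For illustration only, a minimal sketch of this activation flow on the smart device 13 side is shown below; the payload and response fields are assumptions, since the text specifies only that the smart device ID is transmitted when the activation link read from the two-dimensional code is accessed.

```python
# Sketch of the activation (pairing) of a smart device 13 with the file server 14.
# The request payload and the "registered" response field are hypothetical.
import requests

def activate(activation_url: str, smart_device_id: str) -> bool:
    """Access the activation link while transmitting the smart device ID,
    then report whether the smart device is registered in the file server."""
    resp = requests.post(activation_url, json={"smart_device_id": smart_device_id})
    return resp.status_code == 200 and resp.json().get("registered", False)

# A smart device for which activate(...) has never succeeded would simply be
# refused access by the file server 14.
```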
In the information processing system 1 according to an embodiment, it is necessary to generate a group to which conversation content of chat is to be transmitted. For example, the information processing system 1 generates a group to which conversation content of chat is to be transmitted as described below.
In step S13, the smart device 13 displays a group generation screen as illustrated in
In step S14, a user operates the smart device 13 to input a group name in the group generation screen. Further, in step S15, the user operates the smart device 13 to select users who will participate in the group in the group generation screen. In step S16, the user operates the smart device 13 to finish the operation by pressing, for example, a “finish” button of the group generation screen.
When the user performs the finish operation, the process goes to step S17, where the smart device 13 sends a request to the file server 14 to generate the group by using the group name, which is input in step S14, and the users who are selected in step S15. Then, the file server 14, which receives the request to generate the group, generates the group by using the group name, which is input in step S14, and the users who are selected in step S15, and manages the group in association with the users.
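For illustration only, the group generation request of step S17 might look as sketched below; the endpoint and payload shape are hypothetical, and in the network topology described above the request would actually travel to the file server 14 via the relay server 11.

```python
# Sketch of the group generation request (step S17).
# Endpoint and payload are hypothetical assumptions.
import requests

def create_group(file_server_url: str, group_name: str, member_user_ids: list[str]) -> dict:
    # The file server 14 generates the group from the given name and members
    # and manages the group in association with the users.
    resp = requests.post(f"{file_server_url}/groups",
                         json={"group_name": group_name, "members": member_user_ids})
    resp.raise_for_status()
    return resp.json()   # e.g. an identifier of the generated group

# Example (hypothetical): create_group("https://fileserver.local", "design-review", ["userA", "userB"])
```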
In the information processing system 1 according to an embodiment, chat is performed among the smart devices 13 operated by the users who are participating in the same group.
The smart device 13, which is operated by a user of the group to perform chatting, displays a chat screen as illustrated, for example, in
On the left side of the chat screen of
When the “switch” button on the upper side of the chat screen of
On the left side of the file selection screen of
For example, on the upper side of the chat screen of
In the chat screen of
By performing the range selection operation in a part where the content (image) of the file is displayed in the chat screen of
When a user performs the range selection operation, the smart device 13 performs different processes depending on whether a start point of the range selection operation by the user is on a character string or on an image in step S21 of
When it is determined that the start point of the range selection operation by the user is on an image, the process goes to step S22, where the selection range of the image is displayed. On the other hand, when it is determined that the start point of the range selection operation by the user is on a character string, the process goes to step S23, where the selection range of the character string is displayed.
Further, it is assumed that the file selected in this embodiment is a file described in an electronic document format, such as PDF, in which an image and a character string can be distinguished from each other, or a file described in an application-specific format. In the following, the process to proceed to step S22 and the process to proceed to step S23 in
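For illustration only, the branch of step S21 can be sketched as below, assuming a hypothetical page model that reports what lies under a given coordinate; the two return values correspond to steps S22 and S23.

```python
# Sketch of step S21: branch on whether the start point of the range selection
# operation is on an image or on a character string.
# The page model API (element_at, kind) is a hypothetical assumption.
def on_range_selection_start(page, start_x: float, start_y: float) -> str:
    element = page.element_at(start_x, start_y)   # what lies under the start point
    if element.kind == "image":
        return "display_image_selection_range"    # step S22
    return "display_string_selection_range"       # step S23
```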
In step S33, the display section 21 receives an operation by the user to add (append) the selection range of the image to the part where the conversation content of chat is displayed (e.g., a drag-and-drop operation). By the operation by the user of adding the selection range of the image to the part where the conversation content of chat is displayed, the display section 21 displays the selection range of the image in the part where the conversation content of chat is displayed. As illustrated in
When the start point of the range selection operation performed by a user is on an image, the information processing system 1 according to this embodiment performs a process, for example, as illustrated in
In step S31, a user operates the smart device 13A to perform the range selection operation on the image. In step S32, for example, the display section 21 of the smart device 13A displays a frame of the selection range of the image as illustrated in
In step S34, the image information generation section 24 of the smart device 13A generates image positional information of the partial image based on the selection range of the image on which the adding is performed to the part where the conversation content of chat is displayed. Further, in step S35, the image generation section 25 of the smart device 13A generates an image corresponding to the image positional information (“partial image”).
In step S36, the data transmission section 27 of the smart device 13A transmits the image positional information and the partial image to the chat server 12. The chat server 12 determines the group in chat to which the received image positional information and the partial image are to be transmitted.
In step S37, the chat server 12 distributes the image positional information and the partial image, which are received from the smart device 13A, to, for example, a smart device 13B operated by a user of the group in chat. In step S38, the data receiving section 28 of the smart device 13B receives the image positional information and the partial image from the chat server 12. The file management section 29 stores the received image positional information and the partial image.
In step S39, the display section 21 of the smart device 13B displays the received partial image in the part where the conversation content of chat is displayed. Further, in step S40, the display section 21 of the smart device 13A displays the image (partial image) corresponding to the image positional information in the part where the conversation content of chat is displayed (“chat display part”).
As described above, the information processing system 1 according to this embodiment can use a part of the image of the file in chat by displaying the part of the image of the file in the area where the conversation content of chat is displayed.
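For illustration only, steps S36 through S39 can be sketched as below; the chat server endpoint, payload shape, and encoding are hypothetical assumptions, and only the transmit-then-distribute pattern described above is taken from the text.

```python
# Sketch of steps S36-S39: the smart device 13A transmits the image positional
# information and the partial image to the chat server 12, which distributes
# them to the smart devices 13 of the group. Endpoint and fields are hypothetical.
import base64
import requests

CHAT_SERVER_URL = "https://chat.example.com"   # chat server 12 (hypothetical)

def send_partial_image(group_id: str, positional_info: dict, partial_image_png: bytes) -> None:
    # S36: transmit the image positional information and the partial image.
    payload = {
        "group_id": group_id,
        "positional_info": positional_info,   # file path, page, x, y, width, height, etc.
        "partial_image": base64.b64encode(partial_image_png).decode("ascii"),
    }
    requests.post(f"{CHAT_SERVER_URL}/messages", json=payload)
    # S37-S39: the chat server 12 determines the destination group and distributes
    # the message; each receiving smart device 13 stores the data and displays the
    # partial image in the chat display part.
```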
Here, with reference to the sequence diagram of
The file server 14, which generates the partial image, may transmit the partial image to the smart device 13A that sends the request to generate the partial image, or may transmit the partial image to the chat server 12. In a case where the partial image is transmitted to the smart device 13A, the process of and after step S36 in
In
The image positional information generated in step S34 has, for example, a configuration as illustrated in
The information to identify the image of the file includes information to uniquely identify the file server 14, information to distinguish between image and character string, and a file path and a page number of the file, which are being displayed, on the file server 14. On the other hand, the information to identify the position of the partial image includes the position in the X axis direction of the partial image, the position in the Y axis direction of the partial image, the width of the partial image, and the height of the partial image.
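For illustration only, the image positional information can be sketched as the following data structure; the field names are assumptions, while the fields themselves follow the items listed above.

```python
# Sketch of the image positional information (field names are hypothetical).
from dataclasses import dataclass

@dataclass
class ImagePositionalInfo:
    # Information to identify the image of the file
    file_server_id: str   # uniquely identifies the file server 14
    content_kind: str     # distinguishes between image and character string
    file_path: str        # path of the displayed file on the file server 14
    page_number: int      # page number of the file being displayed
    # Information to identify the position of the partial image
    x: float              # position in the X axis direction
    y: float              # position in the Y axis direction
    width: float          # width of the partial image
    height: float         # height of the partial image
```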
In step S53, the display section 21 receives an operation by the user to add (append) the selection range of the character string to the part where the conversation content of chat is displayed (e.g., the drag-and-drop operation).
By the operation by the user of adding the selection range of the character string to the part where the conversation content of chat is displayed, the display section 21 displays the selection range of the character string in the part where the conversation content of chat is displayed. As illustrated in
When the start point of the range selection operation performed by a user is on a character string, the information processing system 1 according to this embodiment performs a process, for example, as illustrated in
In step S61, a user operates the smart device 13A to perform the range selection operation on the character string. In step S62, for example, the display section 21 of the smart device 13A highlights the selection range of the character string as illustrated in
In step S63, the user performs a process of adding the selection range of the character string to the area where the conversation content of chat is displayed. In step S64, the text information generation section 30 of the smart device 13A generates character string information based on the selection range of the character string on which the adding is performed to the part where the conversation content of chat is displayed.
In step S65, the data transmission section 27 of the smart device 13A transmits the character string information to the chat server 12. The chat server 12 determines the group in chat to which the received character string information is to be transmitted.
In step S66, the chat server 12 distributes the character string information, which is received from the smart device 13A, to, for example, the smart device 13B operated by a user of the group in chat. In step S67, the data receiving section 28 of the smart device 13B receives the character string information from the chat server 12. The file management section 29 stores the received character string information. Further, the display section 21 of the smart device 13B extracts the character string to be displayed based on the received character string information.
In step S68, the display section 21 of the smart device 13B displays the character string, which is extracted from the character string information, in the part where the conversation content of chat is displayed. Further, in step S69, the display section 21 of the smart device 13A displays the character string corresponding to the character string information in the part where the conversation content of chat is displayed (“chat display part”).
As described above, the information processing system 1 according to this embodiment can use a part of the character string in chat by displaying the part of the character string of the file selected by the user in the part where the conversation content of chat is displayed.
In
The character string information generated in step S64 has, for example, a configuration as illustrated in
The information to identify the image of the file includes information to uniquely identify the file server 14, information to distinguish between image and character string, and a file path and a page number of the file, which is being displayed, on the file server 14. The information to identify the position of the character string includes the position in the X axis direction of the character string, the position in the Y axis direction of the character string, the width of the character string, and the height of the character string. The information to identify the position of the character string relative to all the character strings includes the start position of the character string and the end position of the character string. Therefore, it is possible to change the display of the file by using the character string information.
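For illustration only, the character string information can be sketched as the following data structure, extending the items of the image positional information with the start and end positions relative to all the character strings; the field names are assumptions.

```python
# Sketch of the character string information (field names are hypothetical).
from dataclasses import dataclass

@dataclass
class CharacterStringInfo:
    # Information to identify the image (content) of the file
    file_server_id: str
    content_kind: str     # "string" in this case
    file_path: str
    page_number: int
    # Information to identify the position of the character string on the page
    x: float
    y: float
    width: float
    height: float
    # Information to identify the position relative to all the character strings
    start_index: int      # start position of the character string
    end_index: int        # end position of the character string
```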
For example, the display section of the smart device 13A displays a received character string “AGCDEFG” as a hyperlink. The character string “AGCDEFG” displayed as a hyperlink includes the character string information described above as meta information.
In step S111, by selecting a character string displayed as a hyperlink in the chat area, the user who operates the smart device 13B can acquire the character string information stored as the meta information of the character string.
In step S112, the display section 21 of the smart device 13B can open the file indicated by the character string information, based on the information to identify the image of the file, and highlight the character string selected by the user within the content of the opened file in accordance with the acquired character string information. If the file is already open, it is sufficient to highlight the character string selected by the user.
Further,
First, the smart device 13B determines whether the selected message includes meta information (information of the area selected by the smart device 13A) (step S151). When determining that meta information is included (YES in step S151), the smart device 13B further determines whether a file indicated by the meta information is displayed in a file display area (step S152). Here, whether the file is displayed is determined based on a comparison between the meta information illustrated in
When determining that a file indicated by the meta information is displayed in a file display area (YES in step S152), the smart device 13B further determines whether a page indicated by the meta information is displayed in the file display area (step S153). When determining that a page indicated by the meta information is displayed in the file display area (YES in step S153), the smart device 13B highlights the area indicated by the meta information based on the positional information of the meta information (step S159).
On the other hand, when determining that a file indicated by the meta information is not displayed in the file display area (NO in step S152), the smart device 13B further determines whether the file indicated by the meta information is stored in the smart device 13B (step S155). When determining that the file indicated by the meta information is stored in the smart device 13B (YES in step S155), the smart device 13B displays a page indicated by the meta information of the stored file (step S156), and highlights the area indicated by the meta information (step S159). On the other hand, when determining that the file indicated by the meta information is not stored in the smart device 13B (NO in step S155), the smart device 13B acquires the file, which is indicated by the meta information, from the file server indicated by the meta information (step S157), displays the page indicated by the meta information of the acquired file (step S158), and highlights the area indicated by the meta information (step S159).
Further, when determining that a page indicated by the meta information is not displayed in the file display area (NO in step S153), the smart device 13B displays the page indicated by the meta information (step S154), and highlights the area indicated by the meta information (step S159).
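For illustration only, the branching of steps S151 through S159 can be sketched as below; the viewer, file cache, and file server client APIs are hypothetical assumptions, and the step comments map to the flowchart described above.

```python
# Sketch of the processing when a message is selected on the smart device 13B.
# viewer, file_cache, and file_server are hypothetical objects.
def on_message_selected(message: dict, viewer, file_cache, file_server) -> None:
    meta = message.get("meta")                         # S151: meta information included?
    if meta is None:
        return                                         # nothing to jump to

    if viewer.current_file() != meta["file_path"]:     # S152: is the file displayed?
        if file_cache.has(meta["file_path"]):          # S155: is the file stored locally?
            viewer.open(file_cache.get(meta["file_path"]))
            viewer.show_page(meta["page_number"])      # S156: display the indicated page
        else:
            data = file_server.fetch(meta["file_path"])  # S157: acquire from the file server
            file_cache.put(meta["file_path"], data)
            viewer.open(data)
            viewer.show_page(meta["page_number"])      # S158: display the indicated page
    elif viewer.current_page() != meta["page_number"]:  # S153: is the page displayed?
        viewer.show_page(meta["page_number"])          # S154: display the indicated page

    # S159: highlight the area indicated by the positional information.
    viewer.highlight(meta["x"], meta["y"], meta["width"], meta["height"])
```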
Further,
By doing this, it becomes possible for a user B to easily know the part of the file indicated by a user A.
Note that the process illustrated in
The configuration of the information processing system 1 of
An information processing system 1A includes the chat server 12, a plurality of smart devices 13, and the file server 14, which are connected to the network N2 such as a LAN. There are no communications over the FW 15 in the information processing system 1A of
According to an embodiment of the present invention, it becomes possible to visibly share the partial images and character strings among the users who are participating in chat, by displaying the content of chat and the content of the file and adding the partial image and the character string of the file to the part where the content of chat is displayed. Therefore, according to an embodiment, it becomes possible for the users who are participating in chat to easily comment on and point out, by chat, the partial image and the character string of the file which are visibly shared among the users.
According to an embodiment, it becomes possible to coordinate the functions provided by the file server 14 and the functions provided by the chat server 12 so that they work together in the smart device 13.
Note that the present invention is not limited to the embodiments described above, and various modifications and changes may be made without departing from a scope of the present invention. Here, the file server 14 is an example of a claimed “file storage unit”. The chat server 12 is an example of a “distribution unit”. The display section 21 is an example of a “display unit”. The data transmission section 27 is an example of a “transmission unit”. The image information generation section 24 is an example of an “image information generation unit”.
The image generation section 25 is an example of an “image generation unit”. The text information generation section 30 is an example of a “character string information generation unit”. The operation receiving section 22 is an example of an “operation receiving unit”.
Note that embodiments of the present invention do not limit the scope of the present invention. Namely, the present invention is not limited to the configurations as illustrated in
Although the invention has been described with respect to specific embodiments for a complete and clear disclosure, the appended claims are not to be thus limited but are to be construed as embodying all modifications and alternative constructions that may occur to one skilled in the art that fairly fall within the basic teachings herein set forth.
The present application is based on and claims the benefit of priority of Japanese Patent Application Nos. 2014-007277 filed Jan. 17, 2014, and 2015-000719 filed Jan. 6, 2015, the entire contents of which are hereby incorporated herein by reference.
Number | Date | Country | Kind
--- | --- | --- | ---
2014-007277 | Jan 2014 | JP | national
2015-000719 | Jan 2015 | JP | national

Filing Document | Filing Date | Country | Kind
--- | --- | --- | ---
PCT/JP2015/051431 | 1/14/2015 | WO | 00