The present disclosure relates to an information processing device, an information processing method, a computer program, and a content display system.
Mechanisms (content display systems) that allow a plurality of users in physically distant locations to view the same content over a network include, for example, a videoconference system (see, for example, Japanese Unexamined Patent Publication No. 2008-289094). In such a system, a plurality of users in physically distant locations can have a conversation while looking at the same screen.
Basically, in such an existing content display system, content is displayed on a screen, and users can communicate with one another by voice or the like while looking at the screen. For example, when a certain user displays a document file on the screen, other users can view the document file, and the users can communicate with one another about the details of the document file.
However, no mechanism has been developed that, when a plurality of users in physically distant locations view the same content (a document, a video, a Web site on the Internet, etc.) over a network, supports viewing of the content by allowing those users to modify the content directly and at the same time. In the existing content display system, only a specific user can modify the content displayed on the screen, and it has been difficult for a plurality of users to modify the content on the screen at the same time and communicate with one another.
In light of the foregoing, it is desirable to provide a novel and improved information processing device, information processing method, computer program, and content display system that can provide a mechanism that supports viewing of content when a plurality of users in physically distant locations view the same content (document, video, Web site on the Internet etc.) over a network.
Accordingly, there is provided a method for initiating display of information relating to content having a plurality of portions. The method comprises acquiring a capability of a first user device in a first location and a capability of a second user device in a second location. The method further comprises respectively acquiring, from the first and second user devices, information identifying first and second ones of the content portions. The method still further comprises generating signals for respectively displaying representations of the first and second user devices as indications of the first and second content portions.
In a second aspect, there is provided a method for displaying information relating to content having a plurality of portions. The method comprises sending to a server, from a first user device in a first location, information identifying a first portion of the content, the first user device having a first capability. The method further comprises receiving, from the server, signals for displaying: a representation of the first user device as an indication of the first content portion; and a representation of a second user device in a second location as an indication of a second content portion associated with the second user device, the second user device having a second capability. The method still further comprises displaying the representations of the first and second user devices.
In a third aspect, there is provided an apparatus for displaying information relating to content having a plurality of portions, comprising a memory and a processor executing instructions stored in the memory. The processor executes instructions to send to a server, from a first user device in a first location, information identifying a first portion of the content, the first user device having a first capability. The processor further executes instructions to receive, from the server, signals for displaying: a representation of the first user device as an indication of the first content portion; and a representation of a second user device in a second location as an indication of a second content portion associated with the second user device, the second user device having a second capability. The processor still further executes instructions to display the representations of the first and second user devices.
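Purely as a non-limiting illustration of the first aspect, the following Python sketch shows how a server might acquire device capabilities, acquire content-portion identifiers, and generate display signals. All names (ContentServer, DeviceInfo, register_device and so on) are hypothetical assumptions and are not part of the disclosure.

    from dataclasses import dataclass

    @dataclass
    class DeviceInfo:
        device_id: str
        location: str
        capability: str        # e.g. "audio output" or "video output"
        content_portion: str   # identifier of the content portion the device indicates

    class ContentServer:
        def __init__(self):
            self.devices = {}  # device_id -> DeviceInfo

        def register_device(self, device_id, location, capability):
            # Acquire the capability of a user device in a given location.
            self.devices[device_id] = DeviceInfo(device_id, location, capability, None)

        def update_content_portion(self, device_id, portion_id):
            # Acquire, from a user device, information identifying a content portion.
            self.devices[device_id].content_portion = portion_id

        def generate_display_signals(self):
            # Generate signals for displaying a representation of each device
            # as an indication of the content portion it identifies.
            return [{"device": d.device_id,
                     "portion": d.content_portion,
                     "capability": d.capability}
                    for d in self.devices.values()]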
According to the embodiments of the present disclosure described above, it is possible to provide an information processing device, an information processing method, a computer program, and a content display system that can provide a mechanism that supports viewing of content when a plurality of users in physically distant locations view the same content (document, video, Web site on the Internet etc.) over a network.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
Preferred embodiments of the disclosure will be described hereinafter in the following order.
<1. One Embodiment of Disclosure>
[1-1. Configuration of Content Display System]
[1-2. Configuration of Information Processing Device]
[1-3. Operation of Information Processing Device]
[1-4. Exemplary Hardware Configuration of Information Processing Device]
<2. Summary>
A configuration of a content display system according to one embodiment of the disclosure is described first.
Referring to
The content server 10 stores content having a plurality of portions to be displayed by the information processing devices 100A to 100D and acquires capabilities of the information processing devices 100A to 100D, such as audio output or video output. The content server 10 provides appropriate content to the information processing devices 100A to 100D in response to a request from the information processing devices 100A to 100D. The information processing devices 100A to 100D, which are associated with different users in different locations, receive the content provided from the content server 10 and display the content on a screen. The content provided from the content server 10 includes, for example, document files, presentation files, video files, and Web pages on the Internet, although the content is not limited to these examples.
In this embodiment, the content server 10 can provide different content to the information processing devices 100A to 100D and can further provide the same content to the information processing devices 100A to 100D. Then, the information processing devices 100A to 100D allow users to simultaneously view the same content provided from the content server 10 and modify the content.
Specifically, users of the information processing devices 100A to 100D can share the same content provided from the content server 10 and simultaneously view the same content.
The application server 11 stores an application that allows the information processing devices 100A to 100D to share, among themselves, the same content provided from the content server 10.
Note that, although the application server 11 is separated from the content server 10 in this embodiment, the disclosure is not limited to such an example, and the function of the application server 11 may be incorporated into the content server 10.
The information processing devices 100A to 100D may be, for example, a desktop personal computer, a notebook personal computer, a mobile phone, a television set, a stationary game machine, a portable game machine or the like, and they are connected to the content server 10 through the network 20. Further, the information processing devices 100A to 100D are connected to one another through the network 20. Note that, in the following description of the information processing devices 100A to 100D, they are sometimes referred to simply as the information processing device 100.
The information processing devices 100A to 100D can acquire the content from the content server 10 through the network 20 by transmitting a content acquisition request to the content server 10. Then, under control of the application stored in the application server 11, the information processing devices 100A to 100D can operate so that the same content provided from the content server 10, together with user information and modifications, can be viewed simultaneously.
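As one possible, non-limiting sketch of such a content acquisition request, the following Python code assumes an HTTP-style interface on the content server; the URL pattern and field names are hypothetical and chosen only for illustration.

    import json
    import urllib.request

    def acquire_content(server_url, device_id, content_id):
        # Transmit a content acquisition request to the content server over the network.
        request = urllib.request.Request(
            f"{server_url}/content/{content_id}?device={device_id}")
        with urllib.request.urlopen(request) as response:
            # The response is assumed to contain the content portions together with
            # the settings of the sharing application stored in the application server.
            return json.loads(response.read())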
The configuration of the content display system according to one embodiment of the disclosure is described above with reference to
Referring to
The control unit 101 controls the information processing device 100 as a whole by controlling the respective functional units illustrated in
The communication unit 102 is a communication interface that receives content such as video data through the network 20 according to a given communication standard.
The operating unit 103 accepts an operation input to the information processing device 100, and it may be an input device such as a keyboard or a mouse, for example. When an operation is performed on the operating unit 103 by a user of the information processing device 100, the control unit 101 detects the meaning of the operation on the operating unit 103 and performs processing in accordance with the operation.
The storage unit 104 is nonvolatile as described above and stores computer programs to be executed by the control unit 101. Specifically, the storage unit 104 stores a program for viewing the content provided from the content server 10, a program for performing decompression of video data or audio data in the image data decompression unit 105 or the audio data decompression unit 107, a program for performing compression of video data or audio data in the image data compression unit 110 or the audio data compression unit 112, and so on.
The image data decompression unit 105 is a module (decoder) that decompresses (decodes) video data that is included in the content supplied from the control unit 101 in accordance with a given standard and supplies the data to the image output unit 106. The image data decompression unit 105 is implemented by hardware or software.
The image output unit 106 has a frame memory function that temporarily stores the video data decompressed by the image data decompression unit 105, a display controller function that outputs frame data stored in the frame memory, and an image display function that displays an image by the display controller.
The audio data decompression unit 107 is a module (decoder) that decompresses (decodes) audio data that is included in the content supplied from the control unit 101 in accordance with a given standard and supplies the data to the audio output unit 108. The audio data decompression unit 107 is implemented by hardware or software.
The audio output unit 108 has a sound driver function that converts the digital audio data decompressed by the audio data decompression unit 107 into analog audio data, amplifies it, and then outputs it, and a speaker function that outputs the audio from the sound driver.
The image input unit 109 processes an image signal that is input from an imaging device such as a camera, for example, and outputs the signal to the image data compression unit 110.
The image data compression unit 110 is a module (encoder) that compresses (encodes) the image signal supplied from the image input unit 109 in accordance with a given standard and outputs the signal to the control unit 101.
The audio input unit 111 has a sound driver function that converts an analog audio signal captured by a sound capturing device such as a microphone into a digital audio signal and then outputs the signal to the audio data compression unit 112.
The audio data compression unit 112 is a module (encoder) that compresses (encodes) the digital audio signal supplied from the audio input unit 111 in accordance with a given standard and outputs the signal to the control unit 101.
With such a configuration, the information processing device 100 can display the content provided from the content server 10 on the image output unit 106. Further, the information processing device 100 allows its user to perform communication by text input or voice input with another user who is simultaneously viewing the same content provided from the content server 10.
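The flow through the functional units described above may be sketched, purely for illustration, as follows; the class and method names are assumptions and the codec details are omitted.

    class InformationProcessingDevice:
        def __init__(self, decoder, output, encoder, network):
            self.decoder = decoder    # image/audio data decompression units 105, 107
            self.output = output      # image/audio output units 106, 108
            self.encoder = encoder    # image/audio data compression units 110, 112
            self.network = network    # communication unit 102

        def play_content(self, compressed_content):
            # Received content is decompressed and passed to the output units.
            frames, samples = self.decoder.decode(compressed_content)
            self.output.present(frames, samples)

        def send_user_input(self, captured_image, captured_audio):
            # Camera and microphone input is compressed and sent to the other users.
            payload = self.encoder.encode(captured_image, captured_audio)
            self.network.send(payload)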
The configuration of the information processing device 100 according to one embodiment of the disclosure is described above with reference to
Referring to
The content detection unit 121 detects details of the content that is displayed on the image output unit 106 in the information processing device 100. According to the details of the content detected by the content detection unit 121, the display control unit 123, which is described later, controls information to be displayed on the image output unit 106.
The user state detection unit 122 is connected to the network 20 and detects a state (user state) of another information processing device 100 that is displaying the same content. According to the state (user state) of another information processing device 100 detected by the user state detection unit 122, the display control unit 123, which is described later, controls information to be displayed on the image output unit 106.
The display control unit 123 performs control of information to be displayed on the image output unit 106 according to the user operation on the operating unit 103, the details of the content detected by the content detection unit 121, and the state (user state) of another information processing device 100 detected by the user state detection unit 122. The display control of information on the image output unit 106 by the display control unit 123 is described in detail later in reference to specific examples.
It should be noted that, although the configuration in which the content detection unit 121, the user state detection unit 122 and the display control unit 123 are included in the control unit 101 of the information processing device 100 is illustrated in this embodiment, the disclosure is not limited thereto. Some or all of those elements may be included in a device different from the information processing device 100, such as the application server 11, for example.
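A minimal sketch of how the control unit 101 might compose these three blocks is given below; the interfaces are assumptions made only for illustration.

    class ControlUnit:
        def __init__(self, content_detection, user_state_detection, display_control):
            self.content_detection = content_detection        # content detection unit 121
            self.user_state_detection = user_state_detection  # user state detection unit 122
            self.display_control = display_control            # display control unit 123

        def refresh_display(self, user_operation=None):
            # The display reflects the user operation, the detected content details,
            # and the detected states of the other information processing devices.
            content_details = self.content_detection.detect()
            remote_states = self.user_state_detection.detect()
            self.display_control.update(user_operation, content_details, remote_states)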
The functional configuration of the control unit 101 included in the information processing device 100 according to one embodiment of the disclosure is described above. Hereinafter, an operation of the information processing device 100 according to one embodiment of the disclosure is described.
The information processing device 100 makes a connection to the content server 10 through the network 20 based on user operation and acquires content having a plurality of portions from the content server 10. Then, the information processing device 100 displays the content acquired from the content server 10 on the image output unit 106 (step S101). The display control of the content on the image output unit 106 is mainly performed by the control unit 101. Particularly, the display control of the content on the image output unit 106 is performed by the display control unit 123 shown in
The content that is acquired from the content server 10 through the network 20 by the information processing device 100 may be a homepage on the Internet, for example. In addition to the homepage on the Internet, the content acquired from the content server 10 through the network 20 may be a still image, a moving image, a document file, a presentation file or the like.
After the information processing device 100 acquires the content from the content server 10 through the network 20 and displays the content on the image output unit 106, the information processing device 100 then displays user information, which is information specific to the users, in superposition upon the content, on the image output unit 106 according to user operation on the operating unit 103 (step S102). The display control of the user information on the image output unit 106 is mainly performed by the control unit 101, and particularly performed by the display control unit 123 shown in
Note that the icon may be prepared by a user; alternatively, when the information processing device 100 is equipped with an imaging device, an image of a user captured by the imaging device may be displayed in real time. The icon prepared by a user may be a head shot of the user, or an image (avatar) of the user in a social networking service (SNS), for example. The icon is not limited to such examples, and a user can display an arbitrary image as the icon.
A user of each information processing device 100 can arbitrarily move the user information 130a to 130d to different content portions by operating the operating unit 103. Further, a user of the information processing device 100 can view the user information indicating a content portion selected by a user of another information processing device 100 through the image output unit 106 and thereby find in what state other users are and what kind of modifications other users are performing on the content displayed on the image output unit 106 in real time. The content server 10 acquires information identifying the content portions selected by the users and causes the display control units 123 of the information processing devices 100 to display the user information 130a to 130d within the appropriate content portions on the image output unit 106.
Further, a user of each information processing device 100 can communicate with other users by text input or voice input. The text that is input by a user is displayed superposed on the content and is also displayed on the information processing devices 100 that share the same content and are operated by other users. The content server 10 acquires a character string that is input by a user and causes the display control units 123 of the information processing devices 100 to display the character string within the content portion selected by that user on the image output unit 106, in the same manner as the user information 130a to 130d. Further, the voice that is input by a user is also output from the information processing devices 100 that share the same content and are operated by other users.
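As a non-limiting sketch of this relaying behavior, the following code assumes that the content server keeps, per user, the selected content portion and the latest input text, and pushes them to every display control unit; all identifiers are illustrative.

    class SharedViewingSession:
        def __init__(self, display_controls):
            self.display_controls = display_controls  # device_id -> display control proxy
            self.user_info = {}                       # user_id -> {"portion": ..., "text": ...}

        def on_portion_selected(self, user_id, portion_id):
            self.user_info.setdefault(user_id, {})["portion"] = portion_id
            self._broadcast()

        def on_text_input(self, user_id, text):
            self.user_info.setdefault(user_id, {})["text"] = text
            self._broadcast()

        def _broadcast(self):
            # Each display control unit draws the user information superposed
            # within the appropriate content portion.
            for display_control in self.display_controls.values():
                display_control.show_user_info(self.user_info)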
It should be noted that the content that is displayed on the image output unit 106 of the information processing devices 100 in the content display system 1 according to one embodiment of the disclosure may be displayed all over the screen of the image output unit 106 or displayed on a part of the screen of the image output unit 106. In the case of displaying the content on a part of the screen of the image output unit 106, the content may be displayed inside a window, which is a function provided by the operating system (OS).
In the above manner, the information processing device 100 displays the user information, which is information specific to each user, superposed upon the content on the image output unit 106 by use of the display control unit 123 according to user operation on the operating unit 103. Then, the information processing device 100 performs display of the content on the image output unit 106 according to user operation on the operating unit 103 (step S103). The display control of the content on the image output unit 106 according to user operation is mainly performed by the control unit 101, and particularly performed by the display control unit 123 shown in
By displaying the user information on the image output unit 106 as shown in
In this manner, when a user of the information processing device 100A and a user of the information processing device 100B input text using the operating unit 103, the text is displayed as the user information, thereby enabling communication among different users.
In order to share the content with other users and communicate with other users as described above, a user may log into the system having such a function. The system for communicating with other users may be provided by the application server 11, for example.
As described above, a plurality of users can simultaneously view the same content and thereby make communication with one another about the content. However, if a plurality of pieces of user information are displayed on the screen, there may be cases where the displayed user information interferes with the viewing of the content provided from the content server 10 and thus interferes with communication among users.
In light of the above, when text input or voice input is not made in the information processing device 100, the display control unit 123 may display the user information in a simplified manner so as not to interfere with the viewing of the content provided from the content server 10.
By simplifying the display of the user information as described above, it is possible to display the user information in a way that does not interfere with the viewing of the content. Then, when a user of the information processing device 100 starts inputting text or voice, the content server 10 acquires the input and causes the display control units 123 of the information processing devices 100 to switch the user information of that user from the simplified display to the display including the icon.
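One way to express this simplified-display rule, given only as an assumption-laden sketch, is shown below; the shape of the returned display description is hypothetical.

    def render_user_info(user, is_inputting):
        # Full display while the user is entering text or voice.
        if is_inputting:
            return {"icon": user["icon"], "name": user["name"], "text": user.get("text")}
        # Simplified display otherwise, so that the content stays visible.
        return {"marker": user["name"][0]}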
In the example shown in
In the example shown in
Regarding the user information that is displayed on the image output unit 106 by display control of the display control unit 123, the display position of an icon displayed as the user information or text displayed near the user information may be shifted according to the display position of the user information on the image output unit 106. By shifting the display position of an icon or text according to the display position of the user information on the image output unit 106, the user information can be controlled so as not to extend off screen.
When a user inputs text in the state where the user information is displayed as shown in
Note that, although the information indicating that text is being input is displayed as the user information by characters or symbols in the examples shown in
A user of the information processing device 100 according to the embodiment can communicate with other users by inputting voice, not only text, to the information processing device 100. In such a case, the content server 10 may acquire the voice input and the volume level detected by the display control unit 123 of the information processing devices 100, modify the user information 130a to 130d based on the volume level, and cause the display control unit 123 of the information processing devices 100 to display the modified user information 130a to 130d on the image output unit 106.
In this manner, the display control unit 123 detects the volume level of voice input by a user and modifies the display of user information based on the volume level, thereby allowing the other users who are viewing the same content to recognize which user is entering a comment by inputting voice.
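A minimal sketch of such a volume-dependent modification, assuming the volume level is normalized between 0.0 and 1.0, is given below; the scaling rule is illustrative only.

    def scale_user_icon(base_size, volume_level):
        # Louder voice input enlarges the icon so that the other users can see
        # which user is speaking; silence keeps the base size.
        volume_level = max(0.0, min(1.0, volume_level))
        return int(base_size * (1.0 + volume_level))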
As described above, the user information of a plurality of users is displayed superposed on the same content, thus enabling communication among the users of the information processing devices 100 that are connected to one another through the network.
However, a user does not always operate the information processing device 100 and may be away from it for a while. If the user information of a user who is not operating the information processing device 100 and is currently away from it remains displayed on the screen, other users may vainly try to communicate with that user by entering a comment or the like, without knowing that the user is temporarily away from the information processing device 100.
To avoid this, the display control unit 123 may perform display control to move the user information of a user who is not operating the information processing device and currently away from the information processing device 100 to the corner of the screen, for example, thereby notifying other users that the user is in the idle state.
In this manner, the display control unit 123 performs display control to move the user information of a user who is not operating the information processing device and is temporarily away from the information processing device 100 to the corner of the screen, thereby notifying other users that the user is in the idle state.
After a certain user enters the idle state as shown in
In this manner, when a user does not operate the information processing device 100 for a given length of time, the user is forced to leave the system and the user information of the user is removed from the screen, so that the other users who are viewing the same content can find that the user has left the system and is no longer viewing the content.
Note that the cases where a user shifts to the idle state may include, in addition to the case where the user does not operate the information processing device 100 for a given length of time as described above, a case where focus is not on an application, such as a browser, that displays the content, a case where the mouse cursor is not on a window of an application, such as a browser, that displays the content, and so on.
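These transitions to the idle and away states may be sketched, under assumed time limits, as follows; the thresholds and state names are hypothetical.

    import time

    IDLE_AFTER = 60.0    # seconds without operation before the idle state (assumed)
    AWAY_AFTER = 300.0   # seconds without operation before the user leaves the system (assumed)

    def user_state(last_operation_time, has_focus, now=None):
        now = time.time() if now is None else now
        elapsed = now - last_operation_time
        if elapsed > AWAY_AFTER:
            return "away"      # the user information is removed from the screen
        if elapsed > IDLE_AFTER or not has_focus:
            return "idle"      # the user information is moved to the corner of the screen
        return "active"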
Some content that is displayed on the image output unit 106 does not fit within one screen, and the whole content can be seen only by scrolling up and down or side to side. In this case, when a certain user scrolls the screen in order to view the content, the screen may also be scrolled for the other users, so that the same area of the content is brought into view for all users.
However, when the screen is scrolled forcibly, a case may occur where the content which is viewed by a certain user is scrolled off-screen and not visible. Therefore, in consideration of such a case, the scrolling of the screen may not be synchronized across all users.
When the scrolling of the screen is not synchronized across users and a certain user (e.g. the user A) scrolls the screen, the user information of another user (e.g. the user B) may be scrolled off the display range of the image output unit 106. Thus, a case may occur where, even when the user B, whose user information is off the display range of the image output unit 106, inputs text, the text input by the user B is not visible to the user A.
In the example of
When there is user information that is not displayed on the image output unit 106, the display control unit 123 may perform control to display, on the scroll bar for example, an icon indicating the existence of that user information.
In this manner, the display control unit 123 performs control to display, on the scroll bar for example, the icon indicating the existence of user information that is not displayed on the image output unit 106, so that the users can be aware that such user information exists.
Then, when a user of the information processing device 100 places the cursor on the icon displayed on the scroll bar as shown in
When the user A places the cursor on the icon displayed on the scroll bar and operates the operating unit 103, the display area of the image output unit 106 of the user A and the display area of the image output unit 106 of the user B come to coincide as shown in
In this manner, the display control unit 123 changes the display area of the content according to user operation, thereby enabling the display area of the content to be synchronous across a plurality of users, so that the user information that has not been displayed on the image output unit 106 can be displayed on the image output unit 106.
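The detection of off-screen user information and the jump to it may be sketched as follows; the one-dimensional scroll geometry and the function names are assumptions made only for illustration.

    def offscreen_users(user_positions, scroll_top, viewport_height):
        # user_positions maps a user name to a vertical position within the content.
        return [name for name, y in user_positions.items()
                if not scroll_top <= y < scroll_top + viewport_height]

    def jump_to_user(user_positions, name, viewport_height):
        # Selecting the scroll-bar icon scrolls so that the display areas coincide.
        return max(0, user_positions[name] - viewport_height // 2)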
When two or more users modify a certain content portion in the case where a plurality of users are simultaneously viewing the same content, processing that is different from processing when one user modifies the content portion may be performed.
In the example shown in
Note that, when a context menu is displayed by the operation of one user as shown in
On the other hand, when two or more users modify the same portion of the content, the context menu that is displayed on the screen by the display control unit 123 may be different from the one shown in
As described above, when two or more users modify a certain portion of the content in the case where a plurality of users are simultaneously viewing the same content, processing that is different from processing when one user modifies the content portion is performed. This offers a wider variety of operations for the content.
Note that, when a context menu is displayed by the operation of a plurality of users as shown in
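As a non-limiting sketch, the choice of context menu may depend on how many users are operating on the same content portion; the menu entries below are purely illustrative.

    def context_menu(portion_id, selections):
        # selections maps a user id to the content portion that user is modifying.
        users_here = [u for u, p in selections.items() if p == portion_id]
        if len(users_here) >= 2:
            # A different menu when a plurality of users meet on the same portion.
            return ["edit together", "send direct message", "move to another room"]
        return ["copy", "edit", "comment"]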
Further, in this embodiment, when user information of two or more users gets close to each other in the case where a plurality of users are simultaneously viewing the same content, transmission and reception of a direct message between the users or entrance to another room by the users, for example, may be allowed, in addition to the display of the context menu as described above.
Further, in this disclosure, when user information of two or more users gets close to each other in the case where a plurality of users are simultaneously viewing the same content, a master-slave relationship among the plurality of users may be set. Specifically, the device may be designed so that only a specific user is allowed to perform modifications on the content displayed on the screen, and the other users are allowed only to view the modifications on the content by the specific user and are not allowed to perform modifications on the content. In this case, the content detection unit 121 may detect the details of the content, and the user state detection unit 122 may detect the state of each user, and thereby the display control unit 123 may control modifications on the content according to the content and the user.
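A master-slave permission check could, as a minimal sketch under assumed identifiers, look like the following.

    def apply_modification(user_id, master_id, content, modification):
        # Only the designated master user may modify the shared content;
        # the other users merely view the master's modifications.
        if user_id != master_id:
            return content               # the modification is ignored
        return modification(content)     # the master's modification is applied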
The image output unit 106 of the information processing device 100 which is used by each user does not necessarily have the same resolution. There may be cases where one information processing device 100 can display the entire content on the image output unit 106, whereas another information processing device 100 can display only a part of the content.
In such a case, the display control unit 123 may control the display area of the content so as to display user information of another user.
In such a case, when the user A operates the operating unit 103 to move the user information of the user A, the display control unit 123 of the information processing device 100 which is used by the user B may change the display range of the content according to the movement of the user information of the user A.
In this manner, according to the movement of user information of another user, the display control unit 123 controls the display area of the content to display the user information on the image output unit 106, so that the information processing device 100 according to the embodiment allows a user to view the same range of the content as that of another user.
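A sketch of following another user's information on a smaller display, assuming a one-dimensional scroll position, is shown below; the geometry is simplified and hypothetical.

    def follow_remote_user(remote_position, viewport_height, content_height):
        # Scroll so that the other user's information stays inside the local view,
        # clamped to the extent of the content.
        top = remote_position - viewport_height // 2
        return max(0, min(top, max(0, content_height - viewport_height)))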
An example of a hardware configuration of the information processing device 100 according to one embodiment of the disclosure described above is described hereinafter.
Referring to
The CPU 901 serves as a processing unit and a control unit, and it controls the whole or a part of the operation in the information processing device 100 according to programs stored in the ROM 903, the RAM 905, the storage device 919 or a removable recording medium 927. The ROM 903 stores a program to be used by the CPU 901, a processing parameter and so on. The RAM 905 primarily stores a program to be used in the execution on the CPU 901, a parameter that varies during the execution and so on. The CPU 901, the ROM 903 and the RAM 905 are connected with one another through the host bus 907, which is an internal bus such as a CPU bus.
The host bus 907 is connected to the external bus 911 such as a Peripheral Component Interconnect/Interface (PCI) bus via the bridge 909.
The input device 915 is an operating means to be operated by a user, such as a mouse, a keyboard, a touch panel, a button, a switch or a lever, for example. The input device 915 may be a remote controlling means (a so-called remote control) using an infrared ray or another radio wave, or externally connected equipment 929 compatible with the operation of the information processing device 100, such as a mobile phone or a PDA. Further, the input device 915 may be an input control circuit that generates an input signal based on information input by a user using the above operating means and outputs it to the CPU 901, for example. A user of the information processing device 100 operates the input device 915 to thereby input various kinds of data or give instructions for processing operations to the information processing device 100.
The output device 917 may be a device for visually or auditorily presenting a user with acquired information, such as a display device like a CRT display device, a liquid crystal display device, a plasma display device, an EL display device or a lamp, an audio output device like a speaker or a headphone, a printer, a mobile phone or a facsimile machine, for example. The output device 917 outputs results of performing various kinds of processing by the information processing device 100, for example. Specifically, the display device displays results of performing various kinds of processing by the information processing device 100 with text or images. The audio output device converts an audio signal containing reproduced audio data, acoustic data or the like into an analog signal and outputs the signal.
The storage device 919 may be a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, or a magneto-optical storage device, for example. The storage device 919 stores programs to be executed by the CPU 901, various kinds of data, acoustic signal data or image signal data acquired from the outside, and so on.
The drive 921 is a reader/writer for a recording medium, which is built in the information processing device 100 or attached externally. The drive 921 reads information that is recorded in the removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk or a semiconductor memory which is attached thereto and outputs the information to the RAM 905. Further, the drive 921 can write information into the removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk or a semiconductor memory which is attached thereto. The removable recording medium 927 may be a DVD medium, a Blu-ray medium, a compact flash (CF) (registered trademark), a memory stick, a secure digital (SD) memory card or the like. Further, the removable recording medium 927 may be an integrated circuit (IC) card or an electronic device incorporating a contactless IC chip, for example.
The connection port 923 is a port for directly connecting equipment to the information processing device 100, such as a universal serial bus (USB) port, an IEEE 1394 port such as i.Link, a small computer system interface (SCSI) port, an RS-232C port, an optical audio terminal, or a high-definition multimedia interface (HDMI) port. By connecting the externally connected equipment 929 to the connection port 923, the information processing device 100 can directly acquire acoustic signal data or image signal data from the externally connected equipment 929 or supply acquired acoustic signal data or image signal data to the externally connected equipment 929.
The communication device 925 is a communication interface which is a communication device or the like for establishing a connection with a communication network 931, for example. The communication device 925 may be a communication card for wired or wireless local area network (LAN), Bluetooth or wireless USB (WUSB), a router for optical communication, a router for asymmetric digital subscriber line (ADSL) or a modem for various kinds of communications, for example. The communication device 925 can transmit and receive signals or the like to and from the Internet or another communication device in conformity to a prescribed protocol such as TCP/IP, for example. Further, the communication network 931 that is connected to the communication device 925 may be a network or the like connected by wired or wireless means, and it may be the Internet, home LAN, infrared data communication, radio wave communication, satellite communication or the like, for example.
One example of the hardware configuration of the information processing device 100 according to one embodiment of the disclosure is described in the foregoing. In the information processing device 100 having the above configuration, the CPU 901 reads computer programs stored in the storage device 919 or the like and sequentially executes the programs, for example, thereby implementing the operation of the information processing device 100 according to one embodiment of the disclosure described above.
As described above, the information processing device 100 according to one embodiment of the disclosure enables viewing of the same content, which has a plurality of portions and is provided from the content server 10, together with another information processing device 100 to which it is connected through the network 20.
The information processing device 100 according to one embodiment of the disclosure displays, on the screen within a portion of the content, user information composed of a cursor operated by the user of each information processing device 100 and a user name and an icon of that user displayed near the cursor. The information processing device 100 according to one embodiment of the disclosure thereby allows the users to grasp which content portion each user is interested in.
When the information processing device 100 according to one embodiment of the disclosure displays the same content as the information processing device 100 to which it is connected through the network 20, the information processing device 100 accepts text input or voice input and outputs the input text or voice to the other information processing devices 100, thereby enabling communication with other users.
Further, with the information processing device 100 according to one embodiment of the disclosure, a plurality of pieces of user information can be brought close to one another, thereby enabling execution of processing effective only for those users.
Although preferred embodiments of the disclosure are described in detail above with reference to the appended drawings, the disclosure is not limited thereto. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
For example, the above description illustrates the case where, when a user of the information processing device 100 does not perform operations on the information processing device 100, the display is controlled to shift the state of the user to the idle state or the away state; however, the disclosure is not limited to such an example. For example, when a user of the information processing device 100 does not perform operations on the information processing device 100 because the user is walking, on a train, or driving a car, the display control unit 123 may perform control to output the state of the user to the image output unit 106. To implement this, the information processing device 100 may include an acceleration sensor, a GPS receiver or the like.
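A coarse classification of such states from sensor data might, purely as an assumption-labelled sketch, look like the following; the thresholds and labels are illustrative only.

    def motion_state(speed_m_per_s, acceleration_variance):
        # The speed may come from a GPS receiver, the variance from an acceleration sensor.
        if speed_m_per_s > 8.0:
            return "riding or driving"
        if acceleration_variance > 1.0:
            return "walking"
        return "stationary"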
Further, the above description illustrates the case where a plurality of information processing devices 100 share the content that is provided from the content server 10 and display the same content at the same time; however, the disclosure is not limited to such an example. For example, the disclosure may be applied in the same manner to a case where another information processing device 100 (e.g. the information processing device 100B) accesses content (a document file, a presentation file, etc.) that is stored in a certain information processing device 100 (e.g. the information processing device 100A).
The present disclosure contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2010-179697 filed in the Japan Patent Office on Aug. 10, 2010, the entire content of which is hereby incorporated by reference.