This application is a National Stage patent application of PCT International Patent Application No. PCT/JP2014/073106 (filed on Sep. 2, 2014) under 35 U.S.C. § 371, which claims priority to Japanese Patent Application No. 2013-273369 (filed on Dec. 27, 2013), which are all hereby incorporated by reference in their entirety.
The present disclosure relates to a display control device, a display control method, and a program.
Devices displaying various kinds of information through manipulations on touch panels, such as smartphones or tablet terminals, have become widespread. In tablet terminals, the sizes of screens have also increased and uses of simultaneous manipulations of a plurality of users are considered. In the related art, projectors have been used as devices that display information.
Many technologies for efficiently displaying information have been proposed in the related art. For example, Patent Literature 1 below proposes a method of simultaneously displaying a plurality of windows at the time of display of information. Specifically, in a portion in which first and second windows are superimposed, the display information of the window on the rear side is displayed more faintly than the display information of the window on the front side, so that the display information of both windows can be viewed.
Patent Literature 1: JP H8-123652A
When devices such as smartphones, tablet terminals, and projectors display information, the environments in which the information is displayed and the situations of the displayed information cannot always be said to be constant. In view of the foregoing circumstances, it is necessary to execute control such that information can be displayed more appropriately and efficiently according to the environment in which the information is displayed or the situation of the displayed information.
It is desirable to propose a novel and improved display control device, a novel and improved display control method, and a novel and improved program capable of executing control such that information can be displayed more appropriately and efficiently according to an environment in which information is displayed or a situation of displayed information.
According to the present disclosure, there is provided a display control device including: a display control unit configured to decide a display region of a display object to be displayed on a display surface according to information regarding a real object on the display surface.
According to the present disclosure, there is provided a display control method including: deciding, by a processor, a display region of a display object to be displayed on a display surface according to information regarding a real object on the display surface.
According to the present disclosure, there is provided a program causing a computer to function as: a display control unit configured to decide a display region of a display object to be displayed on a display surface according to information regarding a real object on the display surface.
According to the present disclosure described above, it is possible to provide a novel and improved display control device, a novel and improved display control method, and a novel and improved program capable of executing control such that information can be displayed more appropriately and efficiently according to environments in which information is displayed or situations of displayed information.
Note that the effects described above are not necessarily limited, and along with or instead of the effects, any effect that is desired to be introduced in the present specification or other effects that can be expected from the present specification may be exhibited.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the appended drawings. In this specification and the drawings, elements that have substantially the same function and structure are denoted with the same reference signs, and repeated explanation is omitted.
The description will be made in the following order.
<1. Embodiment of the present disclosure>
(1.1. System configuration example)
(1.2. Functional configuration example)
(1.3. Display control example)
<2. Specific examples of user interface>
<3. Hardware configuration example>
<4. Conclusion>
First, an example of the configuration of an information processing system according to an embodiment of the present disclosure will be described with reference to the drawings.
As illustrated in
The input unit 110a is a device that inputs manipulation content of the user using the information processing system 100a or the shape or design of an object placed on the table 140a. In the example illustrated in
When the camera that images the table 140a using one lens is used as the input unit 110a, the information processing system 100a can detect an object placed on the table 140a by analyzing an image captured by the camera. When the stereo camera is used as the input unit 110a, for example, a visible light camera or an infrared camera can be used in the stereo camera. When the stereo camera is used as the input unit 110a, the input unit 110a can acquire depth information. When the input unit 110a acquires the depth information, the information processing system 100a can detect, for example, a hand or an object placed on the table 140a. When the input unit 110a acquires the depth information, the information processing system 100a can detect touch or approach of a hand of the user to the table 140a or can detect separation of the hand from the table 140a. In the following description, a user touching or approaching an information display surface with a manipulator such as a hand is also collectively referred to simply as “touch.”
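As an illustration only, the following is a minimal sketch, in Python, of how touch, approach, and separation of a hand might be classified from depth information of the kind described above; the threshold values and function names are hypothetical assumptions and not part of the configuration of the input unit 110a.

```python
# Minimal sketch (hypothetical): classifying touch, approach, and separation of a
# hand from a depth map of the table surface acquired by a stereo camera.
# The thresholds and names below are illustrative assumptions.

TOUCH_MM = 10        # hand closer than 10 mm to the surface -> "touch"
APPROACH_MM = 50     # within 50 mm -> "approach"

def classify_hand_state(surface_depth_mm, hand_depth_mm):
    """Classify the hand state from the distance between the hand and the table surface."""
    gap = surface_depth_mm - hand_depth_mm  # height of the hand above the table
    if gap <= TOUCH_MM:
        return "touch"
    if gap <= APPROACH_MM:
        return "approach"
    return "separated"

if __name__ == "__main__":
    # Surface measured at 1000 mm from the camera, hand at 995 mm -> 5 mm gap.
    print(classify_hand_state(1000.0, 995.0))  # touch
    print(classify_hand_state(1000.0, 960.0))  # approach
    print(classify_hand_state(1000.0, 900.0))  # separated
```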
When the microphone is used as the input unit 110a, a microphone array collecting a sound in a specific direction can be used as the microphone. When the microphone array is used as the input unit 110a, the information processing system 100a may adjust a sound collection direction of the microphone array to any direction.
Hereinafter, a case in which a manipulation by the user is detected from an image captured by the input unit 110a will be mainly described. However, the present disclosure is not limited to the related example. A manipulation by the user may be detected by a touch panel that detects touch of a finger of the user. Additionally, examples of the user manipulation which can be acquired by the input unit 110a can include a stylus manipulation on an information display surface and a gesture manipulation on a camera.
The output unit 130a is a device that displays information on the table 140a according to information regarding manipulation content input through the input unit 110a by the user using the information processing system 100a, content of information output by the output unit 130a, or the shape or design of an object placed on the table 140a or that outputs a sound. For example, a projector or a speaker is used as the output unit 130a. In the example illustrated in
When the information processing system 100a is of a projection type, as illustrated in
The user using the information processing system 100a can place his or her finger or the like on the table 140a to manipulate information displayed on the table 140a by the output unit 130a. The user using the information processing system 100a can also place an object on the table 140a, cause the input unit 110a to recognize the object, and execute various manipulations on the recognized object.
Although not illustrated in
In the present disclosure, the information processing system is not limited to the form illustrated in
In the following description, the configuration of the information processing system 100a, as illustrated in
Next, an example of a functional configuration of the information processing system according to an embodiment of the present disclosure will be described.
As illustrated in
The input unit 110 inputs manipulation content on the information processing system 100 from a user using the information processing system 100 or the shape or design of an object placed on a surface (for example, the table 140a illustrated in
When the information processing system 100 is of a projection type, the input unit 110 can be configured of, for example, a camera configured of one lens, a stereo camera configured of two lenses, or a microphone.
The control unit 120 executes control on each unit of the information processing system 100. For example, the control unit 120 generates information to be output from the output unit 130 using information input by the input unit 110. As illustrated in
For example, when the information processing system 100 is of a projection type illustrated in
The control unit 120 may be configured of, for example, a central processing unit (CPU). When the control unit 120 is configured of a device such as a CPU, the device can be configured of an electronic circuit.
Although not illustrated in
The output unit 130 outputs information according to information regarding manipulation content input through the input unit 110 by the user using the information processing system 100, content of information output by the output unit 130, and the shape or design of an object placed on a surface (for example, the table 140a illustrated in
The information processing system 100 illustrated in
The example of the functional configuration of the information processing system 100 according to the embodiment of the present disclosure has been described above with reference to
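As an illustration only, the following minimal Python sketch shows one way the functional configuration described above (the input unit 110, the control unit 120 composed of the detection unit 121 and the output control unit 122, and the output unit 130) might be organized; the class and method names are hypothetical and the data passed between them is greatly simplified.

```python
# Minimal sketch (hypothetical) of the functional configuration described above.
# Class names loosely mirror the units of the information processing system 100;
# they are illustrative, not a definitive implementation.

class InputUnit:
    def read(self):
        # In practice: camera image, depth map, sound, etc.
        return {"touch_points": [(120, 80)], "objects": []}

class DetectionUnit:                      # corresponds to detection unit 121
    def detect(self, raw_input):
        return {"touches": raw_input["touch_points"],
                "real_objects": raw_input["objects"]}

class OutputControlUnit:                  # corresponds to output control unit 122
    def build_output(self, detection):
        if detection["touches"]:
            return f"draw menu near {detection['touches'][0]}"
        return "idle"

class OutputUnit:
    def output(self, content):
        print(content)                    # in practice: projector / display / speaker

class ControlUnit:                        # corresponds to control unit 120
    def __init__(self):
        self.detection = DetectionUnit()
        self.output_control = OutputControlUnit()

    def step(self, input_unit, output_unit):
        detection = self.detection.detect(input_unit.read())
        output_unit.output(self.output_control.build_output(detection))

if __name__ == "__main__":
    ControlUnit().step(InputUnit(), OutputUnit())
```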
The information processing system 100 according to the embodiment of the present disclosure acquires, using the input unit 110, manipulation content from the user on the GUI of an application output to the information display surface by the output unit 130. The information processing system 100 allows a user to touch the display surface with a manipulator such as his or her hand, or to move the manipulator along the display surface while touching it, and receives such actions as manipulations on the GUI of the application output to the information display surface by the output unit 130.
When the user touches any menu button in the menu button group 1110 with his or her finger or the like and the user moves his or her finger or the like along a row of the menu button group 1110 on the information display surface in the touch state, the information processing system 100 tracks the manipulation from the user and displays the menu button group 1110 so that the menu button group 1110 is rotated about the menu button 1100.
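As an illustration only, the following minimal Python sketch shows one way the menu button group 1110 might be laid out so that it rotates about the menu button 1100 to track the angle of the user's finger; the function name, the coordinate system, and the geometry are hypothetical assumptions.

```python
# Minimal sketch (hypothetical): rotating a menu button group about the center
# menu button so that the row of buttons tracks the angle of the user's finger.
import math

def layout_rotated_menu(center, radius, n_buttons, finger_pos):
    """Place n_buttons on a circle around `center`, rotated so that the first
    button follows the direction of `finger_pos` relative to the center."""
    base = math.atan2(finger_pos[1] - center[1], finger_pos[0] - center[0])
    step = 2 * math.pi / n_buttons
    return [(center[0] + radius * math.cos(base + i * step),
             center[1] + radius * math.sin(base + i * step))
            for i in range(n_buttons)]

if __name__ == "__main__":
    for pos in layout_rotated_menu(center=(0.0, 0.0), radius=100.0,
                                   n_buttons=4, finger_pos=(70.0, 70.0)):
        print(round(pos[0], 1), round(pos[1], 1))
```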
When the information processing system 100 according to the embodiment of the present disclosure outputs such a GUI and displays menus in an initial state set in advance, various problems may occur according to the position of another window or a state of the information display surface, for example, a state of an object placed on the table 140a illustrated in
Accordingly, the information processing system 100 according to the embodiment of the present disclosure detects the position of another window or the state of the information display surface and controls the position of a menu based on the detection result. Specifically, the information processing system 100 according to the embodiment of the present disclosure detects, for example, a state of an object placed on the table 140a illustrated in
When the user of the information processing system 100 executes a predetermined manipulation to display a menu, the information processing system 100 sets a menu movement destination at which the menu is displayed to a current menu position (step S1001). The process of step S1001 is executed by, for example, the output control unit 122. Subsequently, the information processing system 100 determines whether the state of a window displayed according to the manipulation executed by the user is related to the menu position (step S1002). This determination is executed by, for example, the detection unit 121. Specifically, in step S1002, the information processing system 100 determines whether the window is maximized. When the window is maximized, the information processing system 100 determines that the state of the window is related to the menu position. The fact that the window is maximized means that the window is displayed in a maximum range which can be displayed by the output unit 130.
When it is determined in step S1002 that the state of the window displayed according to a manipulation executed by the user is related to the menu position (Yes in step S1002), that is, the window is maximized, the information processing system 100 subsequently executes a process of applying an offset to the menu position set in step S1001 according to the state of the window (step S1003). That is, the information processing system 100 assigns the offset to the menu position set in step S1001 so that the menu position comes near the inside of the window by a predetermined amount. The process of step S1003 is executed by, for example, the output control unit 122.
Subsequently, the information processing system 100 determines whether the menu movement destination set in step S1001 is inside a screen, that is, inside a screen which can be displayed by the output unit 130 (step S1004). This determination is executed by, for example, the detection unit 121.
When the menu movement destination set in step S1001 is inside the screen (Yes in step S1004), the information processing system 100 subsequently determines whether the menu movement destination set in step S1001 is covered with another window displayed by the information processing system 100 (step S1005). This determination is executed by, for example, the detection unit 121.
When the menu movement destination set in step S1001 is not covered with the other window displayed by the information processing system 100 (Yes in step S1005), the information processing system 100 subsequently determines whether the menu movement destination set in step S1001 is located at a proper position according to the position of the user or a manipulation direction of the user (step S1006). Specifically, the information processing system 100 determines whether the menu movement destination set in step S1001 is located at the proper position according to the position of the user or the manipulation direction of the user by comparing the menu movement destination set in step S1001 with the position of the user or the manipulation direction of the user. This determination is executed by, for example, the detection unit 121.
When it is determined in step S1006 that the menu movement destination set in step S1001 is located at the proper position according to the position of the user or the manipulation direction of the user (Yes in step S1006), the information processing system 100 subsequently determines whether the menu movement destination set in step S1001 interferes with an object placed on the information display surface displayed by the information processing system 100 (step S1007). This determination is executed by, for example, the detection unit 121. An example of the information display surface displayed by the information processing system 100 includes the top surface of the table 140a illustrated in
When the menu movement destination set in step S1001 does not interfere with the object placed on the information display surface (Yes in step S1007), the information processing system 100 moves a menu called by the user to the menu movement destination set in step S1001 (step S1008). The process of step S1008 is executed by, for example, the output control unit 122.
Conversely, when at least one of the conditions is not satisfied in the determination of the foregoing steps S1004 to S1007 (No in steps S1004 to S1007), the information processing system 100 subsequently determines whether all of the menu movement destinations are examined (step S1009). The determination of whether all of the menu movement destinations are examined is executed by, for example, the detection unit 121.
When it is determined in the foregoing step S1009 that not all of the menu movement destinations are examined (No in step S1009), the information processing system 100 executes the determinations of the foregoing steps S1004 to S1007 on other movement destinations. First, the information processing system 100 determines whether the position of the user is confirmed (step S1010). The determination of whether the position of the user is confirmed is executed by, for example, the detection unit 121. Specifically, in step S1010, it is determined whether the position of the user is confirmed through, for example, recognition of the body, face, head, or the like of the user by a camera or recognition of the direction of a sound by a microphone.
When it is determined in the foregoing step S1010 that the position of the user is confirmed (Yes in step S1010), the information processing system 100 subsequently sets the menu movement destination to an unexamined position closest to the position of the user (step S1011). The process of step S1011 is executed by, for example, the output control unit 122. When the menu movement destination is set to the unexamined position closest to the position of the user, the information processing system 100 subsequently executes the determinations of the foregoing steps S1004 to S1007 again. When the menu has the button format illustrated in
Conversely, when it is determined in the foregoing step S1010 that the position of the user is not confirmed (No in step S1010), the information processing system 100 subsequently determines whether an object frequently used by the user is recognized on the information display surface (step S1012). The recognition of the object frequently used by the user on the information display surface is executed by, for example, the detection unit 121. The object frequently used by the user may be any object such as a mobile phone, a smartphone, a tablet terminal, a key, a book, a newspaper, a magazine, tableware, or a toy. The information processing system 100 may determine whether there is an object frequently used by the user by recognizing an object placed on the information display surface and comparing the object recognized in advance to the object placed on the information display surface at a time point at which the menu is to be displayed.
The information processing system 100 can store a history of objects placed on the information display surface by maintaining information acquired by the input unit 110. It is needless to say that the history of the objects placed on the information display surface may be stored in another device connected to the information processing system 100 via a network or the like.
In the determination of whether an object placed on the information display surface is the object frequently used by the user, the information processing system 100 may determine, for example, whether the object is placed on the information display surface with more than a predetermined frequency or may determine, for example, whether the object is an object registered as the object frequently used by the user.
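As an illustration only, the following minimal Python sketch shows one way the determination of a frequently used object might be made, either from a placement frequency exceeding a prescribed threshold or from explicit registration; the identifiers and the threshold value are hypothetical assumptions.

```python
# Minimal sketch (hypothetical): deciding whether an object recognized on the
# display surface is "frequently used", either because it was placed there more
# than a prescribed number of times or because it was explicitly registered.
from collections import Counter

placement_history = Counter()        # object_id -> times placed on the surface
registered_objects = {"smartphone_of_user_a"}
FREQUENCY_THRESHOLD = 5              # prescribed frequency (assumed value)

def record_placement(object_id):
    placement_history[object_id] += 1

def is_frequently_used(object_id):
    return (object_id in registered_objects
            or placement_history[object_id] > FREQUENCY_THRESHOLD)

if __name__ == "__main__":
    for _ in range(6):
        record_placement("coffee_cup")
    print(is_frequently_used("coffee_cup"))            # True (frequency)
    print(is_frequently_used("smartphone_of_user_a"))  # True (registered)
    print(is_frequently_used("magazine"))              # False
```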
When it is determined in the foregoing step S1012 that the object frequently used by the user is recognized on the information display surface (Yes in step S1012), the information processing system 100 subsequently sets the menu movement destination to the position which is the closest to the position of the object frequently used by the user and is not examined (step S1013). The process of step S1013 is executed by, for example, the output control unit 122.
Conversely, when it is determined in the foregoing step S1012 that the object frequently used by the user is not recognized on the information display surface (No in step S1012), the information processing system 100 subsequently determines whether the menu movement destination can be decided using a manipulation history of the user (step S1014). Whether the menu movement destination can be decided using the manipulation history of the user is determined by, for example, the detection unit 121. The information processing system 100 can store the manipulation history of the user by maintaining information regarding user manipulations acquired by the input unit 110. It is needless to say that the manipulation history of the user may be stored in another device connected to the information processing system 100 via a network or the like.
When it is determined in the foregoing step S1014 that the menu movement destination can be decided using the manipulation history of the user (Yes in step S1014), the information processing system 100 subsequently sets the menu movement destination to an unexamined position which is frequently manipulated by the user (step S1015). The process of step S1015 is executed by, for example, the output control unit 122.
Conversely, when it is determined in the foregoing step S1014 that the menu movement destination cannot be decided using the manipulation history of the user (No in step S1014), the information processing system 100 subsequently sets the menu movement destination to the unexamined position closest to the original menu position (step S1016). The process of step S1016 is executed by, for example, the output control unit 122.
When it is determined in the foregoing step S1009 that all of the menu movement destinations are examined (Yes in step S1009), the information processing system 100 subsequently determines whether there is a position to which the menu can be moved at any position inside the window displayed by the information processing system 100 (step S1017). The process of step S1017 is executed by, for example, the detection unit 121.
When it is determined in the foregoing step S1017 that there is a position to which the menu can be moved at any position inside the window (Yes in step S1017), the information processing system 100 sets the menu movement destination to a position inside the window displayed on the screen that is closest to the initial position, even though the position does not satisfy the conditions examined in the above-described processes (step S1018). The process of step S1018 is executed by, for example, the output control unit 122.
Conversely, when it is determined in the foregoing step S1017 that there is no position to which the menu can be moved at any position inside the window (No in step S1017), the information processing system 100 subsequently determines whether there is only one window inside the screen (step S1019). The process of step S1019 is executed by, for example, the detection unit 121.
When it is determined in the foregoing step S1019 that there is only one window inside the screen (Yes in step S1019), the information processing system 100 sets the menu movement destination to a position outside the window displayed on the screen that is closest to the initial position, even though the position does not satisfy the conditions examined in the above-described processes, since there is no concern of confusion with a menu of another window (step S1020). The process of step S1020 is executed by, for example, the output control unit 122. Conversely, when it is determined in the foregoing step S1019 that there are a plurality of windows inside the screen, the information processing system 100 directly ends the process without changing the menu movement destination.
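As an illustration only, the following minimal Python sketch summarizes the decision flow of steps S1001 to S1020 described above: candidate menu movement destinations are examined in order, and the first candidate that is inside the screen, is not covered by another window, is proper for the position of the user, and does not interfere with a real object is adopted; the geometry, the candidate ordering, and all names are hypothetical simplifications.

```python
# Minimal sketch (hypothetical) of the decision flow of steps S1001 to S1020.
# All thresholds, geometry, and the candidate ordering are illustrative.

def is_valid(candidate, screen, windows, user_pos, real_objects):
    x, y = candidate
    if not (0 <= x < screen[0] and 0 <= y < screen[1]):            # step S1004
        return False
    if any(w[0] <= x < w[0] + w[2] and w[1] <= y < w[1] + w[3]     # step S1005
           for w in windows):
        return False
    if user_pos is not None and abs(x - user_pos[0]) > screen[0] / 2:  # step S1006 (crude)
        return False
    if any(abs(x - ox) < 50 and abs(y - oy) < 50                   # step S1007
           for ox, oy in real_objects):
        return False
    return True

def decide_menu_position(current, candidates, screen, windows, user_pos, real_objects):
    for candidate in [current] + candidates:                       # step S1001 + retries
        if is_valid(candidate, screen, windows, user_pos, real_objects):
            return candidate                                       # step S1008
    return current                                                 # fallback: no change

if __name__ == "__main__":
    print(decide_menu_position(
        current=(10, 10),
        candidates=[(200, 10), (200, 200)],
        screen=(1280, 720),
        windows=[(0, 0, 100, 100)],          # another window covers (10, 10)
        user_pos=(300, 700),
        real_objects=[(205, 15)]))           # a cup near (200, 10)
    # -> (200, 200)
```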
By executing the above-described series of processes, the information processing system 100 can execute control such that the position of the menu is moved to a proper position in addition to the form of
When the user moves a real object on the dining table onto which the menu is projected to a location onto which the menu is not projected, the user can manipulate the projected menu. Likewise, when the user executes a manipulation of moving the projected window so that the menu is no longer projected onto a real object, the user can manipulate the projected menu. However, forcing the user to execute such manipulations imposes a large burden on the user.
Accordingly, by executing the above-described series of processes, the information processing system 100a automatically changes the display position of the menu button 1100 so that the display position does not overlap the position of a real object (the piece of cake 1201 or the cup of coffee 1202) on the dining table, as in
By executing the above-described series of processes, the information processing system 100 according to the embodiment of the present disclosure can detect the position of another window or the state of the information display surface, for example, the state of an object placed on the table 140a illustrated in
The information processing system 100 according to the embodiment of the present disclosure executes the above-described series of processes so that the user can manipulate the menu without necessarily executing a step of moving the position of the window or moving the real object placed on the information display surface. Accordingly, the information processing system 100 according to the embodiment of the present disclosure executes the above-described series of processes, and thus the number of steps and a time until the user executes an intended manipulation are reduced.
The information processing system 100 according to the embodiment of the present disclosure executes the above-described series of processes, and thus it is possible to reduce the effort of manipulating a window pushed outside the screen in a GUI in which windows can be manipulated omnidirectionally and may therefore frequently move outside the screen. Since the effort of manipulating a window pushed outside the screen is reduced, the information processing system 100 according to the embodiment of the present disclosure enables the user to use the screen broadly.
Even when a plurality of windows are displayed on a screen by the users, the information processing system 100 according to the embodiment of the present disclosure controls the display position of the menu such that the menu can be viewed normally, and thus it is possible to obtain the advantage that the user can easily specify an intended application.
In the case of a form in which the information processing system 100 according to the embodiment of the present disclosure projects a screen, as illustrated in
For example, when a form in which the information processing system 100 according to the embodiment of the present disclosure projects information to a table and causes a user to manipulate the information is adopted, as illustrated in
However, when a plurality of users own portable terminals that are substantially the same, place the portable terminals on a table simultaneously and individually, and cause the information processing system 100 to recognize the portable terminals, the information processing system 100 may not be able to determine which portable terminal it is better to link to the information processing system 100.
Accordingly, in an embodiment of the present disclosure, the information processing system 100 capable of easily specifying a portable terminal to be linked even when a plurality of users own portable terminals that are substantially the same and place the portable terminals on a table simultaneously and individually will be described.
In the information processing system 100 according to the embodiment of the present disclosure, the detection unit 121 identifies a portable terminal to be linked using an image recognition technology and detects the position and posture of the identified portable terminal and a distance from the input unit 110. Accordingly, the information processing system 100 according to the embodiment of the present disclosure has the feature amount data necessary to identify the portable terminals, and the portable terminals to be recognized hold the image data to be discovered by the information processing system 100.
In linking the information processing system 100 and the portable terminals, the following techniques are considered. For example, there is a method in which the owner of each portable terminal selects a preferred image and causes the information processing system 100 to register this image in advance. After the image is registered, the owner of the portable terminal causes his or her portable terminal to display the registered image and causes the information processing system 100 to recognize the image. In this way, the information processing system 100 and the portable terminal can be linked.
There is also a method in which the owner of the portable terminal installs, in the portable terminal in advance, a recognition-dedicated application including the image data to be recognized by the information processing system 100. When the information processing system 100 has the feature amount data of the image data included in the application in advance, it is possible to suppress the detection process burden on the information processing system 100.
There is a method in which the information processing system 100 is caused to recognize a screen, such as a lock screen or a home screen, generated by a system of the portable terminal as a recognition target image. When the information processing system 100 is caused to recognize the screen generated by the system of the portable terminal, the screen may be recognized through a dedicated application or the user may capture a screen by himself or herself and may cause the information processing system 100 to recognize the captured image.
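As an illustration only, the following minimal Python sketch shows one way feature amount data of a recognition image might be registered in advance; a tiny average hash over a grayscale image stands in for a real image-recognition feature, and all names are hypothetical assumptions.

```python
# Minimal sketch (hypothetical): registering "feature amount data" of a
# recognition image so that a portable terminal can later be identified.
# The average hash below is a stand-in for a real image feature.

def average_hash(gray):
    """Build a bit pattern: 1 where a pixel is brighter than the mean."""
    pixels = [p for row in gray for p in row]
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

registered_images = {}   # image_id -> feature amount (here: hash value)

def register_recognition_image(image_id, gray):
    registered_images[image_id] = average_hash(gray)

if __name__ == "__main__":
    register_recognition_image("terminal_A_lock_screen",
                               [[10, 200], [220, 15]])
    print(registered_images)
```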
The portable terminal linked to the information processing system 100 displays a recognition screen for causing the information processing system 100 to recognize the portable terminal according to a predetermined manipulation from the user (step S1101). The information processing system 100 causes a mode to proceed to a mode of recognizing the portable terminal according to a predetermined manipulation from the user (hereinafter also referred to as a “recognition mode”) (step S1111).
The user places the portable terminal displaying the recognition screen for causing the information processing system 100 to recognize the portable terminal in the foregoing step S1101 inside a recognizable area for causing the information processing system 100 to recognize the portable terminal (step S1102). As the recognizable area, any region can be set by the information processing system 100. For example, in the case of a system projecting information onto a table, the entire area onto which the information is projected on the table may be the recognizable area, or a predetermined partial region may be the recognizable area. When a predetermined partial region is set as the recognizable area, the information processing system 100 may output, from the output unit 130, display that allows the user to understand the recognizable area.
When the mode proceeds to the recognition mode in the foregoing step S1111, the information processing system 100 subsequently retrieves a recognition image registered in the information processing system 100 (step S1112). The process of retrieving the recognition image is executed by, for example, the detection unit 121. The information processing system 100 may start the retrieval process of step S1112 when the portable terminal displaying an image recognition screen is placed in the recognizable area, or may start the retrieval process before the portable terminal is placed in the recognizable area.
When the retrieval process of the foregoing step S1112 starts, the information processing system 100 determines whether the registered image is discovered through the retrieval process of the foregoing step S1112 (step S1113). This determination process is executed by, for example, the detection unit 121. When it is determined in step S1113 that the registered image is not discovered (No in step S1113), the information processing system 100 subsequently determines whether a given time has passed after the retrieval process started (step S1114). The determination process is executed by, for example, the detection unit 121. When it is determined in step S1114 that the given time has passed and the registered image is not discovered (Yes in step S1114), the information processing system 100 ends the process and exits the recognition mode. Conversely, when it is determined in step S1114 that the given time has not passed (No in step S1114), the retrieval process of step S1112 is executed again.
When it is determined in the foregoing step S1113 that the registered image is discovered (Yes in step S1113), the information processing system 100 subsequently displays an effect indicating that the registered image is discovered (step S1115). The display process of step S1115 is executed by, for example, the output control unit 122. Any effect may be used as the effect indicating that the registered image is discovered. For example, the information processing system 100 executes, for example, display showing ripples spreading from the location in which the portable terminal is placed. When the effect indicating that the registered image is discovered overlaps an image displayed by the portable terminal, the recognition process in the information processing system 100 may be affected. Therefore, the information processing system 100 preferably outputs the effect indicating that the registered image is discovered so that the effect does not overlap the portable terminal.
When the information processing system 100 recognizes an image of the portable terminal and the luminance of a display of the portable terminal is too bright or too dark, the recognition in the information processing system 100 is affected. When the information processing system 100 is in the recognition mode, for example, the user of the portable terminal may adjust the luminance of the display of the portable terminal so that an image can be easily recognized by the information processing system 100.
When the effect indicating that the registered image is discovered in the foregoing step S1115 is displayed, the information processing system 100 subsequently determines whether an application currently executed in the information processing system 100 is an application for which it is necessary to continuously recognize the image (step S1116). This determination process is executed by, for example, the detection unit 121. An example of the application for which it is necessary to continuously recognize the image includes an application for which it is necessary to continuously display information by tracking the recognized image.
When it is determined in the foregoing step S1116 that the application currently executed in the information processing system 100 is not the application for which it is necessary to continuously recognize the image (No in step S1116), it is not necessary for the portable terminal to remain in the recognizable area. Therefore, the information processing system 100 subsequently displays information prompting the user to remove the recognized portable terminal from the recognizable area (step S1117). The display process of step S1117 is executed by, for example, the output control unit 122. Any information may be used as the information prompting the user to remove the portable terminal. However, when the information prompting the user to remove the portable terminal overlaps an image displayed by the portable terminal, the recognition process in the information processing system 100 is affected. Therefore, the information processing system 100 preferably outputs the information prompting the user to remove the portable terminal so that the information does not overlap the portable terminal.
After the information is displayed in the foregoing step S1117, the information processing system 100 determines whether the image registered in the information processing system 100 disappears from the inside of the screen (the inside of the recognizable area) (step S1118). The determination process is executed by, for example, the detection unit 121. When it is determined in step S1118 that the image registered in the information processing system 100 does not disappear from the inside of the screen (the inside of the recognizable area) (No in step S1118), the information processing system 100 continuously displays the information displayed in step S1117. Conversely, when the user removes the portable terminal from the recognizable area in step S1103 and it is determined that the registered image disappears from the inside of the screen (Yes in step S1118), the information processing system 100 stops the image recognition process (step S1119).
Conversely, when it is determined in the foregoing step S1116 that the currently executed application is the application for which it is necessary to continuously recognize the image (Yes in step S1116), the information processing system 100 skips the processes of the foregoing steps S1117 to S1119.
When the image recognition process stops in the foregoing step S1119, the information processing system 100 subsequently records the ID of the image discovered in the foregoing step S1113 (step S1120). The process of step S1120 is executed by, for example, the detection unit 121. Then, the information processing system 100 performs matching of the ID of the image and starts a communication process with the portable terminal displaying the image (step S1121). The communication between the information processing system 100 and the portable terminal is executed through, for example, the Internet, Wi-Fi, or Bluetooth (registered trademark). The information processing system 100 records the position, the posture, and the size of the image discovered in the foregoing step S1113 (step S1122). The process of step S1122 is executed by, for example, the detection unit 121.
Then, the information processing system 100 executes display indicating a connection state with the portable terminal on the information display surface using information regarding the position, the posture, and the size of the image discovered in the foregoing step S1113 (step S1123). The display process of step S1123 is executed by, for example, the output control unit 122. The display indicating the connection state with the portable terminal in step S1123 is also referred to as a “connection mark” below. The information processing system 100 may display, for example, the same image as the recognition screen displayed by the recognized portable terminal as the connection mark. By displaying the same image as the recognition screen displayed by the recognized portable terminal as the connection mark, the information processing system 100 can easily allow the user to comprehend which connection mark corresponds to which portable terminal.
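As an illustration only, the following minimal Python sketch summarizes the recognition-mode flow of steps S1111 to S1123 described above: a registered image is retrieved within a time limit, and when it is discovered, its ID, position, posture, and size are recorded, communication with the portable terminal is started, and a connection mark is displayed; the camera stub, the timeout value, and all names are hypothetical assumptions.

```python
# Minimal sketch (hypothetical) of the recognition-mode flow of steps S1111-S1123.
import time

RECOGNITION_TIMEOUT_S = 10.0          # assumed "given time" of step S1114

def find_registered_image(capture_frame, registered_ids):
    """Return the detection dict for one frame if it matches a registered image."""
    detection = capture_frame()       # stubbed camera input
    if detection and detection["image_id"] in registered_ids:
        return detection
    return None

def recognition_mode(capture_frame, registered_ids, show, connect):
    deadline = time.monotonic() + RECOGNITION_TIMEOUT_S                # step S1114
    while time.monotonic() < deadline:
        found = find_registered_image(capture_frame, registered_ids)   # steps S1112/S1113
        if found:
            show(f"ripple effect at {found['position']}")              # step S1115
            connect(found["image_id"])                                 # steps S1120/S1121
            show(f"connection mark at {found['position']}, "
                 f"posture {found['posture']}, size {found['size']}")  # steps S1122/S1123
            return found
        time.sleep(0.1)
    return None                                                        # timed out

if __name__ == "__main__":
    frames = iter([None, {"image_id": "terminal_A_lock_screen",
                          "position": (320, 240), "posture": 15.0, "size": (90, 160)}])
    recognition_mode(lambda: next(frames, None),
                     {"terminal_A_lock_screen"},
                     show=print,
                     connect=lambda i: print("start communication with", i))
```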
As illustrated in
The connection state between the information processing system 100 and the portable terminal may be released through an active connection releasing manipulation from the user or may be automatically released when no operation is executed on the portable terminal or the connection mark for a given time. When the connection state between the information processing system 100 and the portable terminal is released, the information processing system 100 may eliminate the connection mark displayed in the foregoing step S1123. The information processing system 100 can present end of the connection state to the user by eliminating the connection mark displayed in the foregoing step S1123.
The information processing system 100 according to the embodiment of the present disclosure can offer the user various experiences by executing the above-described series of processes and displaying the connection mark on the information display surface. Hereinafter, examples of the experiences offered to the user through the display of the connection mark by the information processing system 100 will be described.
The information processing system 100 enables sharing of image data stored in the portable terminal by displaying the connection mark on the information display surface.
When the connection marks 1303 and 1304 are displayed on the information display surface, as illustrated in
After the information processing system 100 displays the connection mark by executing the above-described series of processes, the user can freely carry the portable terminal. Accordingly, an application in which a photo captured by the portable terminal linked to the information processing system 100 is displayed by the information processing system 100 is also possible.
The information processing system 100 enables sharing of music data stored in the portable terminal by displaying the connection mark on the information display surface.
When the connection marks 1303 and 1304 are displayed, as illustrated in
The information processing system 100 can share various kinds of data with the portable terminal linked to the information processing system 100 in addition to the image data or the music data. The information processing system 100 can enable, for example, websites or bookmarks of browsers displayed by the portable terminal linked to the information processing system 100 to be shared, as in the above-described GUI. For the portable terminal linked to the information processing system 100 to continuously display a website displayed by the information processing system 100, the information processing system 100 can also offer a manipulation of dragging a predetermined menu button of a browser executed by the information processing system 100 to the connection mark.
The information processing system 100 enables sharing of contact address data stored in the portable terminal by displaying the connection mark on the information display surface.
When the information processing system 100 displays the connection marks 1303 and 1304 on the information display surface, as illustrated in
The portable terminal linked to the information processing system 100 can add functions by installing various applications. The information processing system 100 can also realize a GUI in which an application can be given and received between the portable terminals by displaying the connection marks through the above-described processes.
The information processing system 100 according to the embodiment of the present disclosure can acquire the position, the posture, the size, and the like of the portable terminal and then can be linked to execute communication with the portable terminal by recognizing the image displayed by the portable terminal even when a dedicated application is not activated by the portable terminal.
The information processing system 100 according to the embodiment of the present disclosure causes the portable terminal to display any image and registers the displayed image before the device linkage with the portable terminal. The information processing system 100 according to the embodiment of the present disclosure can make image selection more fun for the user through such an image registration process. When the information processing system 100 according to the embodiment of the present disclosure completes the recognition of the image displayed by the portable terminal, the information processing system 100 can allow the user to easily recognize the user of the connection mark by continuously displaying the image as the connection mark on the screen.
The information processing system 100 according to the embodiment of the present disclosure causes the portable terminal to display any image and registers the displayed image before the device linkage with the portable terminal. Therefore, even when there are a plurality of portable terminals of substantially the same kind, the portable terminals can be uniquely identified by proper use of the recognition images. When a plurality of users have the same kind of device, there is a possibility of the users incidentally selecting substantially the same image as the recognition image. Accordingly, the information processing system 100 according to the embodiment of the present disclosure may not be linked to a portable terminal unless the portable terminal is caused to register the selected recognition image in the information processing system 100. By causing the portable terminal to register the selected recognition image, the information processing system 100 can determine whether the image selected by the portable terminal duplicates an already registered recognition image.
When substantially the same image is selected as the recognition image, a problem may occur if a plurality of users have the same kind of device. When the exteriors of the devices are similar despite being different kinds of devices, a similar problem may occur if substantially the same image is selected as the recognition image. Accordingly, the information processing system 100 may cause the portable terminals to be linked to select and register recognition images in the information processing system 100 so that the recognition images of all of the portable terminals are unique.
The information processing system 100 according to the embodiment of the present disclosure can receive manipulations on a menu in various directions from a plurality of users, for example, as illustrated in
In a state in which the manipulations of the plurality of users on the menu in various directions are received and a plurality of windows are displayed, it is hard for each user to determine which window he or she displayed. In addition, when an application is activated from one menu, login is necessary for each user at the time of start of the application, and thus the inconvenience may increase as the number of users increases.
Accordingly, the information processing system 100 according to the embodiment of the present disclosure is configured to receive manipulations of users, as will be described below, so that an improvement in operability and convenience is achieved at the time of reception of the manipulations of the plurality of users on the menu in various directions.
An example of an operation of the information processing system 100 when an improvement in operability and convenience is achieved at the time of reception of the manipulations of a plurality of users on a menu in various directions will be described.
Hereinafter, the drag manipulation on the menu button 1100 or the menu button group 1110 is also simply referred to as a drag manipulation (on the menu) in some cases.
When the information processing system 100 detects that the user executes the drag manipulation on the menu displayed by the information processing system 100 (step S1201), the information processing system 100 determines whether the menu is pressed at one point and the menu is dragged at another point in the drag manipulation (step S1202). The processes of the foregoing steps S1201 and S1202 are executed by, for example, the detection unit 121.
When it is determined in the foregoing step S1202 that the manipulation of pressing the menu at one point and dragging the menu at another point is executed (Yes in step S1202), the information processing system 100 generates a copy of the dragged menu (step S1203). The generation of the copy of the menu in step S1203 is executed by, for example, the output control unit 122.
Conversely, when it is determined in the foregoing step S1202 that the manipulation of pressing the menu at one point and dragging the menu at another point is not executed (No in step S1202), the information processing system 100 subsequently determines whether the menu is pressed at two points and the menu is dragged at one point in the drag manipulation detected in the foregoing step S1201 (step S1204). The process of step S1204 is executed by, for example, the detection unit 121.
When it is determined in the foregoing step S1204 that the manipulation of pressing the menu at two points and dragging the menu at one point is executed (Yes in step S1204), the information processing system 100 subsequently determines whether the menu is a folder menu indicating a folder (step S1205). The process of step S1205 is executed by, for example, the detection unit 121. When it is determined in step S1205 that the dragged menu is not the folder menu (No in step S1205), the information processing system 100 generates a copy of the dragged menu, as in the case of the manipulation of pressing the menu at one point and dragging the menu at another point (step S1203). Conversely, when it is determined in step S1205 that the dragged menu is the folder menu (Yes in step S1205), the information processing system 100 generates a shortcut to the menu (the folder menu) (step S1206). The generation of the shortcut to the menu in step S1206 is executed by, for example, the output control unit 122. The shortcut is assumed to refer to a menu which functions as a reference to another menu and has no substance.
The information processing system 100 may generate the copy menu button 1111 of the menu button (B) illustrated in
A difference between generation of a copy of the menu and generation of a shortcut of the menu will be described. When a copy of a menu is generated based on a manipulation from a user and another menu is added to one side menu (for example, a menu of a copy source), the information processing system 100 does not add the added menu to the other side menu (for example, a menu of a copy destination). On the other hand, when a shortcut of a menu is generated based on a manipulation from a user and another menu is added to one side menu (for example, a menu of a shortcut source), the information processing system 100 also adds the added menu to the other side menu (for example, a menu of a shortcut destination).
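As an illustration only, the following minimal Python sketch makes the difference described above concrete: a menu added to the copy source does not appear in the copy, whereas a menu added to the shortcut source also appears through the shortcut; the data structure is a hypothetical simplification.

```python
# Minimal sketch (hypothetical): copy vs. shortcut semantics for a menu.
# A copy has its own independent substance; a shortcut merely refers to the
# original menu and has no substance of its own.
import copy

original = {"name": "folder menu", "children": ["A", "B"]}

menu_copy = copy.deepcopy(original)   # generated copy (independent substance)
menu_shortcut = original              # generated shortcut (reference only)

original["children"].append("C")      # another menu is added to the source

print(menu_copy["children"])          # ['A', 'B']        -> not added to the copy
print(menu_shortcut["children"])      # ['A', 'B', 'C']   -> also added via the shortcut
```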
Conversely, when it is determined in the foregoing step S1204 that the manipulation of pressing the menu at two points and dragging the menu at one point is not executed (No in step S1204), the information processing system 100 subsequently determines whether an angle formed by a row of the menu and a drag direction of the menu is equal to or greater than a prescribed value (step S1207). The process of step S1207 is executed by, for example, the detection unit 121.
When it is determined in the foregoing step S1207 that the angle formed by the row of the menu and the drag direction is equal to or greater than the prescribed value (Yes in step S1207), the information processing system 100 subsequently determines whether the dragged menu is a menu separable from the menu button group (step S1208). The process of step S1208 is executed by, for example, the detection unit 121. When it is determined in step S1208 that the dragged menu is not the menu separable from the menu button group (No in step S1208), the information processing system 100 generates a copy of the dragged menu (step S1203). Conversely, when it is determined in step S1208 that the dragged menu is the separable menu (Yes in step S1208), the information processing system 100 separates the menu from the menu button group (step S1209). The process of separating the menu from the menu button group in step S1209 is executed by, for example, the output control unit 122.
Conversely, when it is determined in the foregoing step S1207 that the angle formed by the row of the menu and the drag direction is not equal to or greater than the prescribed value (No in step S1207), the information processing system 100 executes a drag manipulation from the user as a normal behavior (step S1210). The process of step S1210 is executed by, for example, the output control unit 122. The normal behavior is, for example, a behavior in which the menu button 1100 is moved to track a manipulation from the user or a behavior in which the menu button group 1110 tracks a manipulation from the user and is rotated about the menu button 1100.
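As an illustration only, the following minimal Python sketch summarizes the branching of steps S1202 to S1210 described above, deciding among a copy, a shortcut, a separated menu, and a normal behavior according to the content of the drag manipulation; the parameters and the angle threshold are hypothetical assumptions.

```python
# Minimal sketch (hypothetical) of the branching of steps S1202 to S1210.
# pressed_points: points holding the menu; dragged_points: points dragging it.

def classify_drag(pressed_points, dragged_points, is_folder, is_separable,
                  row_direction_deg, drag_direction_deg, angle_threshold_deg=45):
    if pressed_points == 1 and dragged_points == 1:          # step S1202
        return "copy"                                        # step S1203
    if pressed_points == 2 and dragged_points == 1:          # step S1204
        return "shortcut" if is_folder else "copy"           # steps S1205/S1206
    angle = abs(drag_direction_deg - row_direction_deg) % 360
    angle = min(angle, 360 - angle)
    if angle >= angle_threshold_deg:                         # step S1207
        return "separate" if is_separable else "copy"        # steps S1208/S1209
    return "normal"                                          # step S1210

if __name__ == "__main__":
    print(classify_drag(1, 1, False, False, 0, 10))   # copy
    print(classify_drag(2, 1, True,  False, 0, 10))   # shortcut
    print(classify_drag(0, 1, False, True,  0, 90))   # separate
    print(classify_drag(0, 1, False, False, 0, 10))   # normal
```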
The information processing system 100 according to the embodiment of the present disclosure can allow the user to copy the menu, generate the shortcut of the menu, or separate the menu through a simple manipulation by executing the above-described operation according to content of a drag manipulation on the menu button by the user.
Next, an example of an operation of the information processing system 100 according to content of a drop manipulation on the menu button by the user will be described.
Hereinafter, the drop manipulation on the menu button 1100 or the menu button group 1110 is also simply referred to as a drop manipulation (on the menu) in some cases.
When the information processing system 100 detects that the user executes a drop manipulation on a menu displayed by the information processing system 100 (step S1211), the information processing system 100 determines whether a distance dragged by the user is equal to or less than a prescribed distance (step S1212). The processes of the foregoing steps S1211 and S1212 are executed by, for example, the detection unit 121.
When it is determined in the foregoing step S1212 that the distance dragged by the user is equal to or less than the prescribed distance (Yes in step S1212), the information processing system 100 executes functions assigned to the dropped menu (step S1213). The functions assigned to the menu are, for example, various functions such as activation of an application, display of a website, display of image data, and reproduction of music data and are not limited to specific functions.
Conversely, when it is determined in the foregoing step S1212 that the distance dragged by the user exceeds the prescribed distance (No in step S1212), the information processing system 100 determines whether the menu is dropped on a menu which is another menu and in which a dragged distance is equal to or less than the prescribed distance (step S1214). The determination of step S1214 is executed by, for example, the detection unit 121.
When it is determined in the foregoing step S1214 that the menu is dropped on another menu other than the dropped menu and in which the dragged distance is equal to or less than the prescribed distance (Yes in step S1214), the information processing system 100 subsequently determines whether the dropped menu is a menu which accepts the drop (step S1215). The determination of step S1215 is executed by, for example, the detection unit 121.
When it is determined in the foregoing step S1215 that the dropped menu is the menu that accepts the drop (Yes in step S1215), the information processing system 100 subsequently determines whether the dropped menu is a folder menu (step S1216). The determination of step S1216 is executed by, for example, the detection unit 121.
When it is determined in the foregoing step S1216 that the dropped menu is the folder menu (Yes in step S1216), the information processing system 100 subsequently adds the dropped menu to a menu (subordinate menu) in a lower hierarchy of the drop destination (step S1218). The addition process of step S1218 is executed by, for example, the output control unit 122.
Conversely, when it is determined in the foregoing step S1216 that the dropped menu is not the folder menu (No in step S1216), the information processing system 100 subsequently determines whether an item corresponding to the menu dropped by the user is handleable in the dropped menu (step S1217). The determination of step S1217 is executed by, for example, the detection unit 121.
When it is determined in the foregoing step S1217 that the item dropped by the user is handleable in the dropped menu (Yes in step S1217), the information processing system 100 subsequently delivers information linked to the menu dropped by the user to the menu receiving the drop (step S1219). The process of step S1219 is executed by, for example, the output control unit 122.
Conversely, when it is determined in step S1217 that the item dropped by the user is not handleable in the dropped menu (No in step S1217), the information processing system 100 subsequently executes a process of generating a new menu having the menu of the drop source and the menu of the drop destination as subordinate components (step S1220). The process of step S1220 is executed by, for example, the output control unit 122.
When it is determined in the foregoing step S1214 that the menu is not dropped on the menu which is the other menu other than the dropped menu and in which the dragged distance is equal to or less than the prescribed distance (No in step S1214), the information processing system 100 subsequently determines whether a menu other than the dropped menu approaches the menu on which the menu is dropped by a distance equal to or less than the prescribed distance in a state in which the other menu is pressed at one point (step S1221). The determination of step S1221 is executed by, for example, the detection unit 121.
When it is determined in the foregoing step S1221 that the menu other than the dropped menu approaches the dropped menu by the distance equal to or less than the prescribed distance in the state in which the other menu is pressed at one point (Yes in step S1221), the information processing system 100 subsequently determines whether the dropped menu and the other menu can be merged (step S1222). The determination of step S1222 is executed by, for example, the detection unit 121.
When it is determined in the foregoing step S1222 that the dropped menu and the other menu can be merged (Yes in step S1222), the information processing system 100 subsequently executes a process of merging a subordinate menu of the dropped menu and a subordinate menu of the other menu (step S1223). The process of step S1223 is executed by, for example, the output control unit 122. When it is determined in the foregoing step S1222 that the dropped menu and the other menu may not be merged (No in step S1222), the information processing system 100 subsequently executes a process of returning the dropped menu to the position before the drag (step S1226). The process of step S1226 is executed by, for example, the output control unit 122.
When it is determined in the foregoing step S1221 that the menu other than the dropped menu does not approach the dropped menu by the distance equal to or less than the prescribed distance in the state in which the other menu is pressed at one point (No in step S1221), the information processing system 100 subsequently determines whether the dropped menu is dropped on a location within a fixed distance from each of two menus of the same hierarchy (step S1224). The determination of step S1224 is executed by, for example, the detection unit 121.
When it is determined in the foregoing step S1224 that the dropped menu is dropped on the location within the fixed distance from each of the two menus of the same hierarchy (Yes in step S1224), the information processing system 100 subsequently executes a process of inserting the dragged and dropped menu between the two menus (step S1225). The process of step S1225 is executed by, for example, the output control unit 122.
Conversely, when it is determined in the foregoing step S1224 that the dropped menu is not dropped on the location within the fixed distance from each of the two menus of the same hierarchy (No in step S1224), the information processing system 100 subsequently determines whether the menu is dragged at a speed equal to or greater than a fixed speed until the menu is dropped (step S1227). The determination of step S1227 is executed by, for example, the detection unit 121.
When it is determined in the foregoing step S1227 that the menu is dragged at a speed equal to or greater than the fixed speed until the menu is dropped (Yes in step S1227), the information processing system 100 subsequently determines whether the dropped menu can be deleted (step S1228). The determination of step S1228 is executed by, for example, the detection unit 121.
When it is determined in the foregoing step S1228 that the dropped menu can be deleted (Yes in step S1228), the information processing system 100 subsequently executes a process of deleting the dragged menu (step S1230). The process of step S1230 is executed by, for example, the output control unit 122. Conversely, when it is determined in the foregoing step S1228 that the dropped menu may not be deleted (No in step S1228), the process of returning the dropped menu to the position before the drag is executed (step S1226). The process of step S1226 is executed by, for example, the output control unit 122.
When it is determined in the foregoing step S1227 that the menu is dragged at a speed less than the fixed speed until the menu is dropped (No in step S1227), the information processing system 100 subsequently determines whether the drop location is outside the screen (step S1229). The determination of step S1229 is executed by, for example, the detection unit 121.
When it is determined in the foregoing step S1229 that the drop location is outside the screen (Yes in step S1229), the information processing system 100 subsequently determines whether the dropped menu can be deleted in the foregoing step S1228. Conversely, when it is determined in the foregoing step S1229 that the drop location is not outside the screen (No in step S1229), the information processing system 100 subsequently executes a process of moving the menu to the drop location (step S1231). The process of step S1231 is executed by, for example, the output control unit 122.
The information processing system 100 can change the state of the menu dropped by the user according to the drop location, the speed of the drag, and the like by executing the above-described series of processes.
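As a purely illustrative aid, the series of determinations in steps S1212 to S1231 can be summarized as a single decision function. The following Python sketch is a minimal, hypothetical rendering of that flow; the DropEvent fields and the returned step labels are assumptions introduced only for illustration and are not part of the present disclosure.

```python
from dataclasses import dataclass

# Hypothetical data holder for one drag-and-drop; the field names are
# illustrative and do not appear in the present disclosure.
@dataclass
class DropEvent:
    drag_distance: float
    drag_speed: float
    outside_screen: bool
    dropped_on_other_menu: bool = False    # condition of step S1214
    target_accepts_drop: bool = False      # condition of step S1215
    target_is_folder: bool = False         # condition of step S1216
    target_can_handle_item: bool = False   # condition of step S1217
    pressed_menu_nearby: bool = False      # condition of step S1221
    can_merge_with_pressed: bool = False   # condition of step S1222
    between_two_siblings: bool = False     # condition of step S1224
    deletable: bool = True                 # condition of step S1228

def decide_drop_action(e: DropEvent, prescribed_distance: float,
                       fixed_speed: float) -> str:
    """Return a label for the process that the flow would execute."""
    if e.drag_distance <= prescribed_distance:                 # S1212
        return "S1213: execute the function assigned to the menu"
    if e.dropped_on_other_menu:                                # S1214
        if not e.target_accepts_drop:                          # S1215
            return "no action (not specified in the flow above)"
        if e.target_is_folder:                                 # S1216
            return "S1218: add the menu to the subordinate menu"
        if e.target_can_handle_item:                           # S1217
            return "S1219: deliver the linked information"
        return "S1220: generate a new menu with both menus as subordinates"
    if e.pressed_menu_nearby:                                  # S1221
        if e.can_merge_with_pressed:                           # S1222
            return "S1223: merge the subordinate menus"
        return "S1226: return the menu to the position before the drag"
    if e.between_two_siblings:                                 # S1224
        return "S1225: insert the menu between the two menus"
    if e.drag_speed >= fixed_speed or e.outside_screen:        # S1227 / S1229
        if e.deletable:                                        # S1228
            return "S1230: delete the dragged menu"
        return "S1226: return the menu to the position before the drag"
    return "S1231: move the menu to the drop location"

# Example: a long but slow drag that ends on an empty spot inside the screen.
print(decide_drop_action(
    DropEvent(drag_distance=120.0, drag_speed=10.0, outside_screen=False),
    prescribed_distance=50.0, fixed_speed=800.0))
```

The branch taken when the drop-destination menu does not accept the drop (No in step S1215) is not specified in the flow described above and is therefore left as a no-op in this sketch.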
The above-described examples of the operations will be further described giving examples of specific GUIs. First, an example of a GUI in which the menu button is deleted will be described.
The deletion of the menu button is not limited to the example described above.
The examples of the GUIs in which the menu button is deleted have been described above. Next, an example of a GUI in which the menu button is added to the menu button group will be described.
The example of the GUI in which the menu button is added to the menu button group has been described above. Next, examples of GUIs in which the menu is added to a subordinate item of a drop destination menu will be described.
The examples of the GUIs in which the menu is added to the subordinate item of the drop destination menu have been described above. The examples of the case in which the menu button of the drop destination is not pressed with a finger of the user have been described above. Examples of cases in which a menu button of a drop destination is pressed with a finger of the user will be described below.
In this way, the information processing system 100 merges the menus when the user drops another menu button on a certain menu button while pressing the menu button with his or her finger.
The examples of the cases in which the menu button of the drop destination is pressed with a finger of the user have been described above. Next, examples of manipulations on menus in the information processing system 100 according to an embodiment of the present disclosure will be further described giving specific examples.
When another menu button 1111 is dropped toward the newly generated menu button group 1110, the information processing system 100 adds the dropped menu button 1111 to the newly generated menu button group 1110, as illustrated in
By receiving the drag manipulation or the drop manipulation by the user, the information processing system 100 according to the embodiment of the present disclosure can ensure ease of customization of the menu. By receiving the drag manipulation or the drop manipulation by the user, the information processing system 100 according to the embodiment of the present disclosure can allow a plurality of users to simultaneously use the same menu.
Another embodiment will be described.
For example, by generating shortcut buttons of menus frequently used by a plurality of users according to the number of users, the information processing system 100 can allow any user to reach the menu through one manipulation.
By generating the copy or the shortcut of the menu, as described above, the information processing system 100 can allow, for example, family members to generate a common menu or can allow the family members to generate separate menus.
In the lower right of
The information processing system 100 can allow the users to generate, for example, bookmarks of websites easily and intuitively by generating the copies of the menus, as described above.
The information processing system 100 according to the embodiment of the present disclosure is configured to receive menu manipulations from a plurality of users, and thus a situation in which the same application or similar applications are activated by a plurality of users and are executed simultaneously can occur. When the same application or similar applications are executed simultaneously by a plurality of users, a situation in which it is difficult to comprehend who activates which application may occur. Accordingly, the information processing system 100 according to the embodiment of the present disclosure supplies a structure capable of binding menus with applications and releasing the binding through a simple user manipulation.
For example, in a state in which the menu button 1100, the menu button groups 1110 and 1120, and the web browser 1140 which is an example of an application are displayed by the information processing system 100, as illustrated on the left side of
When the information processing system 100 detects that the user executes a predetermined manipulation, for example, the user executes a manipulation of cutting the binding at a speed equal to or greater than a predetermined speed in the display of the binding, the information processing system 100 executes a process of releasing the binding of the one menu button in the menu button group 1120 and the menu button 1141 of the web browser 1140. When the binding is released, the information processing system 100 executes a process of closing the web browser 1140. When the process of closing the web browser 1140 is executed, the information processing system 100 may execute a display process of gradually thinning the web browser 1140 and finally removing the display, as illustrated in
The information processing system 100 according to the embodiment of the present disclosure can bind the menu with the application and can release the binding through a simple user manipulation.
By binding the menu with the application, as illustrated in
In
By executing the display process of moving the window of the application according to the execution of the user manipulation of bringing the display of the binding close to the user's hand, the information processing system 100 can improve convenience of the user manipulation. In
By binding the menu with the application, as illustrated in
In
By manipulating the application at the location away from the window of the application in this way, the information processing system 100 can improve the convenience of the user manipulation. For example, the projection type information processing system 100a illustrated in
In the example of
In the above-described examples, the information processing system 100 displaying the GUI for displaying the menu button groups 1110 and 1120 using the menu button 1100 as a starting point has been described. However, the starting point of the menu button groups 1110 and 1120 is not limited to the menu button 1100. For example, the information processing system 100 may display the menu button groups 1110 and 1120 using a mobile phone, a smartphone, or another portable terminal owned by the user as a starting point. In the above-described examples, the examples in which the information processing system 100 and the portable terminal are linked have been described. The information processing system 100 can also display the menu button groups 1110 and 1120 using the portable terminal linked to the information processing system 100 as a starting point.
By storing information regarding the menu button groups 1110 and 1120 in the portable terminal 1310, it is possible to display the menu button groups 1110 and 1120 in substantially the same layout even when the portable terminal 1310 is linked to another information processing system 100. For example, the user can edit the layout of the menu button groups 1110 and 1120 at home, store the layout in the portable terminal 1310, bring the portable terminal 1310 to his or her friend's home, and display the menu button groups 1110 and 1120 that he or she edited at home using the information processing system 100 at his or her friend's home.
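One conceivable way to carry the layout of the menu button groups 1110 and 1120 on the portable terminal 1310 is to serialize the menu hierarchy into a portable format. The following sketch assumes a hypothetical JSON representation; the field names and the file path are illustrative and do not appear in the present disclosure.

```python
import json

# Hypothetical, minimal representation of a menu button hierarchy; the field
# names ("label", "children") are illustrative only.
menu_layout = {
    "label": "menu button 1100",
    "children": [
        {"label": "menu button group 1110", "children": []},
        {"label": "menu button group 1120", "children": []},
    ],
}

def save_layout_to_terminal(layout: dict, path: str) -> None:
    """Store the edited layout, e.g., in the portable terminal's storage."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump(layout, f, ensure_ascii=False, indent=2)

def load_layout_from_terminal(path: str) -> dict:
    """Read the layout back on another information processing system."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)

save_layout_to_terminal(menu_layout, "menu_layout.json")
restored = load_layout_from_terminal("menu_layout.json")
assert restored == menu_layout  # the same layout can be reproduced elsewhere
```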
The information processing system 100 according to the embodiment of the present disclosure can allow each user, for example, each family member, to generate each different menu by generating the shortcut button, as illustrated in
A form in which the information processing system 100 according to the embodiment of the present disclosure is simultaneously used by a plurality of users can be assumed. Accordingly, when a menu customized for each user is generated, as described above, a situation in which a certain user uses the menu of another user can occur. If a user's menu is not locked, anyone can freely use that user's menu.
Accordingly, the information processing system 100 according to the embodiment of the present disclosure supplies a structure in which the menu is not usable unless authentication is obtained. The authentication scheme may be a password scheme or may be a device authentication scheme using the portable terminal used by the user. In the following example, a structure in which access to the menu is authenticated in accordance with the device authentication scheme will be described.
Then, when the information processing system 100 detects that the portable terminal used by the father is placed near the menu used by the father, the information processing system 100 recognizes the portable terminal. When the information processing system 100 recognizes that the portable terminal is the portable terminal of the father, the key to the menu used by the father is released. The information processing system 100 may recognize the portable terminal through the above-described image recognition or may recognize the portable terminal through near field communication (NFC), Wi-Fi communication, Bluetooth (registered trademark) communication, or the like. When the authentication is completed, the information processing system 100 releases the key locking the menu and executes control such that the user can access the menu.
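A minimal sketch of this device authentication check is given below, assuming that the recognized portable terminal reports an identifier (obtained through image recognition, NFC, Wi-Fi, or Bluetooth communication) and that each locked menu stores the identifier of its owner's terminal. The identifiers, the registration table, and the function names are hypothetical.

```python
# Hedged sketch: unlock a menu only when the recognized portable terminal's
# identifier matches the identifier registered for that menu's owner.

registered_owner_terminals = {
    "menu_father": "terminal-id-father",
    "menu_mother": "terminal-id-mother",
}

locked_menus = {"menu_father", "menu_mother"}

def on_terminal_recognized(menu_id: str, recognized_terminal_id: str) -> bool:
    """Release the lock when the terminal placed near the menu is its owner's."""
    if menu_id in locked_menus and \
            registered_owner_terminals.get(menu_id) == recognized_terminal_id:
        locked_menus.discard(menu_id)   # authentication completed: release the key
        return True                     # the user can now access the menu
    return False                        # access remains restricted

# A recognized terminal reports its identifier; here the father's terminal
# unlocks the father's menu.
print(on_terminal_recognized("menu_father", "terminal-id-father"))  # True
print("menu_father" in locked_menus)                                # False
```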
In this way, using the structure in which the access to the menu is authenticated using the device authentication scheme, the information processing system 100 according to the embodiment of the present disclosure can restrict the access to the menu by an unauthenticated user.
The information processing system 100 according to the embodiment of the present disclosure can display the menu button using the portable terminal as the starting point, as described above. Here, the information processing system 100 according to the embodiment of the present disclosure supplies a structure for controlling authority over the portable terminal through the menu button displayed using the portable terminal as the starting point.
For example, it is assumed that the user executes a manipulation of copying certain authority (for example, payment authority for a price corresponding to 1000 yen) from the menu button group 1120 displayed using the portable terminal 1310 as the starting point to the menu button group 1120 displayed using the portable terminal 1320 as the starting point. The information processing system 100 executes a process of copying the authority maintained in the portable terminal 1310 to the portable terminal 1320 according to the user manipulation.
By allowing the authority to be copied between the portable terminals through the manipulation on the menu button in this way, the information processing system 100 according to the embodiment of the present disclosure can transfer the authority in the portable terminal through a simple manipulation.
The information processing system 100 according to the embodiment of the present disclosure supplies a function of delivering data to an application based on the drag and drop manipulations on the menu button to the window of the application.
In
In this way, by executing the function corresponding to the dropped menu button according to the drag and drop manipulations by the user, the information processing system 100 according to the embodiment of the present disclosure can offer an intuitive manipulation to the user.
The function executed by each menu button of the menu button group 1120 in
Hereinafter, specific examples of user interfaces (UIs) which can be realized by the above-described information processing system 100 will be described. Hereinafter, the projection type information processing system 100a will be assumed for description. However, the UIs related to the specific examples to be described below can also be realized in any type of information processing system described with reference to
The information processing system 100 according to the present specific example supplies a semicircular menu rotated according to the shape of a manipulation object. When a menu is displayed regardless of the shape of a manipulation object, for example, display of the menu may overlap a hand, and thus visibility deteriorates in some cases. Accordingly, the information processing system 100 according to the present specific example displays a menu in a region other than a region in which it would overlap a manipulation object. Hereinafter, the specific example will be described in detail with reference to
As illustrated in
To execute such a display process, the detection unit 121 first detects a manipulation object overlapping the display surface. The manipulation object may be a part of the body of the user such as a finger or a hand, may be any object such as a manipulation stick to be manipulated by the user, or may be a robot arm or the like. The detection unit 121 detects the shape, an orientation of a longer side, an orientation of a shorter side, and a height of the manipulation object overlapping the display surface based on depth information obtained by a stereo camera. When the manipulation object is a finger, the detection unit 121 may detect a direction in which the finger points. In the example illustrated in
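As one possible way to derive the orientations of the longer and shorter sides of the manipulation object from the depth information, the pixels judged to belong to the object can be analyzed by principal component analysis. The following sketch uses NumPy and a synthetic mask in place of real stereo-camera depth data; it is an illustration only and not the detection method of the present disclosure.

```python
import numpy as np

def estimate_object_orientation(mask: np.ndarray):
    """Return (longer-side angle, shorter-side angle) in radians for the
    region of the display surface covered by the manipulation object.

    mask: boolean array, True where the depth information indicates an object
    above the display surface (e.g., obtained by thresholding stereo depth)."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)
    pts -= pts.mean(axis=0)                      # center the point cloud
    cov = np.cov(pts, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
    longer = eigvecs[:, np.argmax(eigvals)]      # principal axis = longer side
    shorter = eigvecs[:, np.argmin(eigvals)]
    return np.arctan2(longer[1], longer[0]), np.arctan2(shorter[1], shorter[0])

# Synthetic example: an elongated region lying roughly along the x axis.
mask = np.zeros((100, 100), dtype=bool)
mask[45:55, 20:80] = True
long_angle, short_angle = estimate_object_orientation(mask)
# Longer side roughly along x, shorter side roughly along y (signs may differ).
print(np.degrees(long_angle), np.degrees(short_angle))
```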
Then, the output control unit 122 controls the output unit 130 such that a menu with a circular shape in which a region overlapping the manipulation object detected by the detection unit 121 is omitted (a semicircular shape) is displayed on the display surface. For example, in the example illustrated in
As means for generating the menu with the circular shape in which the region overlapping the manipulation object is omitted, the output control unit 122 may increase or decrease at least one of the number of icons displayed or the display sizes of the icons according to the size of the region in which the manipulation object overlaps the display surface. For example, the output control unit 122 controls the output unit 130 such that the number of displayed icons increases or decreases according to the size of the hand touching the menu. Specifically, as illustrated in
As illustrated in
Since the output control unit 122 executes such display control so that the icons do not overlap the finger of the user, the user can easily comprehend the entire menu. Further, an erroneous operation caused by unintentionally touching an icon whose display overlaps a finger can be avoided. The output control unit 122 can display the icons utilizing the available area as much as possible by controlling the menu display in accordance with the size of the hand. The adjustment of the number of icons displayed is particularly effective when the total number of icons exceeds the number that can be displayed, that is, when not all of the icons can be displayed at once.
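A minimal sketch of laying out a menu with a circular shape in which the region overlapping the manipulation object is omitted, and of scaling the number of displayed icons with the size of the hand, is shown below. The scaling rule, the radius, and the parameter names are illustrative assumptions and not values taken from the present disclosure.

```python
import math

def layout_semicircular_menu(num_icons_available: int, hand_angle_deg: float,
                             hand_arc_deg: float, hand_size_ratio: float,
                             radius: float = 120.0):
    """Place menu icons on a circle while omitting the arc covered by the hand.

    hand_angle_deg / hand_arc_deg: direction and angular width of the region in
    which the manipulation object overlaps the display surface.
    hand_size_ratio: 0..1, relative size of the hand touching the menu, used
    here only as an illustrative rule for how many icons to display."""
    free_arc = 360.0 - hand_arc_deg
    shown = min(num_icons_available,
                max(1, int(round(num_icons_available * hand_size_ratio))))
    start = hand_angle_deg + hand_arc_deg / 2.0   # first angle after the hand
    step = free_arc / shown
    positions = []
    for i in range(shown):
        a = math.radians(start + step * (i + 0.5))
        positions.append((radius * math.cos(a), radius * math.sin(a)))
    return positions

# Example: the hand enters from 270 degrees and covers a 120-degree arc.
for x, y in layout_semicircular_menu(8, 270.0, 120.0, hand_size_ratio=0.75):
    print(round(x, 1), round(y, 1))
```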
The output control unit 122 may control the direction of the menu to be displayed according to the direction of the manipulation object. Specifically, the output control unit 122 may control the output unit such that the menu in which the items are disposed based on the orientation of the longer side of the manipulation object detected by the detection unit 121 is displayed on the display surface. For example, in the example illustrated in
The information processing system 100 can also identify an individual according to the detection result of the detection unit 121 and execute an individualized output according to a manipulation history of the identified individual or the like. For example, the information processing system 100 can identify the individual according to the thicknesses of fingers. Therefore, for example, even in a state in which the user logs in to the information processing system 100 used at home with a family sharing account, the individualized output can be supplied to the logged-in family member without orienting a camera toward the face of the user and identifying the user through face recognition. For example, when the information processing system 100 is of a projection type, the user can be supplied with an individualized output even without looking up. The information processing system 100 can also execute the individualized output in an environment, such as a bar, in which an unspecified large number of users execute touch manipulations. However, for example, when the user executes a touch while wearing gloves on a snowy mountain, a case in which the thicknesses of the fingers change even for the same individual is considered.
The present specific example is a form in which input and output are optimized for the user by estimating a direction in which the user is located from a direction in which a hand or a finger is observed when the positions of a camera, a microphone, a projector, and a speaker are known. When the position of the user is not considered in sound acquisition by the microphone, it is difficult to acquire a clear sound in some cases. When the position of the user is not considered in a sound output from the speaker, it is difficult to output a sound with a sense of presence in some cases. Accordingly, the information processing system 100 according to the present specific example estimates the position of the user and executes input and output optimized for the position of the user. Hereinafter, the present specific example will be described in detail with reference to
For example, an example in which the microphone functioning as the input unit 110 is optimized for the user will be described here. The detection unit 121 according to the present specific example controls directivity of the microphone functioning as the input unit 110 and orients the directivity of the microphone toward the mouth of the user. The detection unit 121 controls the directivity using a microphone array in which a plurality of microphones are combined.
To execute such directivity control, the detection unit 121 first detects the manipulation object. The detection unit 121 detects the shape, the direction of the longer side, the direction of the shorter side, and the height of the manipulation object overlapping the display surface. When the manipulation object is a finger, the detection unit 121 may detect a direction pointed by the finger. Then, the detection unit 121 functions as an estimation unit that estimates a direction in which the user manipulating the manipulation object is located based on the detected manipulation object. For example, the detection unit 121 detects a hand or a finger of the user as a manipulation object and estimates the direction in which the user is located based on the position and direction of the hand or the finger. Hereinafter, a specific estimation method will be described.
For example, in the example illustrated in
For example, in the example illustrated in
For example, in the example illustrated in
For example, in the example illustrated in
In this way, the detection unit 121 estimates the direction in which the user is located. Then, the detection unit 121 controls the input unit 110 such that an input having directivity in the estimated direction in which the user is located is executed. For example, in the example described above with reference to
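One common way to realize such directivity with a microphone array is delay-and-sum beamforming, in which each microphone signal is delayed so that sound arriving from the estimated direction of the user adds coherently. The following sketch assumes a linear array and uses a whole-sample shift for brevity; it is an illustration, not the control method of the input unit 110 itself.

```python
import numpy as np

def delay_and_sum(signals: np.ndarray, mic_spacing_m: float, fs: int,
                  steer_angle_deg: float, c: float = 343.0) -> np.ndarray:
    """Steer a linear microphone array toward steer_angle_deg (0 = broadside).

    signals: array of shape (num_mics, num_samples), one row per microphone."""
    num_mics, num_samples = signals.shape
    out = np.zeros(num_samples)
    angle = np.radians(steer_angle_deg)
    for m in range(num_mics):
        # Delay (in samples) that aligns the wavefront arriving from the
        # steered direction across the array; np.roll is used for brevity,
        # a real implementation would apply fractional delays.
        delay = m * mic_spacing_m * np.sin(angle) / c * fs
        out += np.roll(signals[m], -int(round(delay)))
    return out / num_mics

# Example: four microphones spaced 5 cm apart, steered toward the direction
# in which the user was estimated to be located (here, 30 degrees).
fs = 16000
signals = np.random.randn(4, fs)     # stand-in for captured audio
enhanced = delay_and_sum(signals, mic_spacing_m=0.05, fs=fs,
                         steer_angle_deg=30.0)
print(enhanced.shape)
```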
The output control unit 122 controls the output unit 130 such that an output having directivity in the direction which is estimated by the detection unit 121 and in which the user is located is executed. For example, the output control unit 122 controls a speaker functioning as the output unit 130 such that a channel is configured to output a sound in the direction in which the user is located. In addition to the sound, the output control unit 122 may control the output unit 130 such that an image is output, for example, in the estimated direction in which the user is located so that the image directly faces the user.
As described above, according to the present specific example, by orienting the directivity of the microphone according to the position of the user, a success ratio of sound recognition or of a sound command can be improved even in an environment in which there is daily-life noise and the microphone is far from the position of the user. According to the present specific example, by changing the channel configuration of the speakers according to the position of the user, it is possible to realize an acoustic space with a better sense of presence. For example, the information processing system 100 according to the present specific example can reproduce content for which a channel configuration is designed on the assumption of use of a home television, in accordance with an intention of a content generator. Additionally, according to the present specific example, the information processing system 100 can also display an application such as a web browser so that the application directly faces the user after the sound recognition is completed.
(Photo Application)
Here, referring to
For example, the application acquires a photographing position from exchangeable image file format (Exif) information incidental to a photo and estimates an azimuth of the photographing position viewed from the table 140a. As illustrated in
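The azimuth of the photographing position viewed from the table 140a can be estimated, for example, as the initial bearing from the latitude and longitude of the table to the latitude and longitude stored in the photo's Exif GPS tags. The sketch below takes the two coordinate pairs as inputs and treats the extraction of the Exif tags themselves as outside its scope; the coordinates in the example are illustrative.

```python
import math

def initial_bearing_deg(lat1: float, lon1: float,
                        lat2: float, lon2: float) -> float:
    """Initial bearing (degrees clockwise from north) from point 1 to point 2.

    Point 1 would be the location of the table 140a and point 2 the
    photographing position read from the photo's Exif GPS tags."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0

# Example: from Tokyo Station toward Sapporo (a roughly northward bearing).
print(round(initial_bearing_deg(35.681, 139.767, 43.062, 141.354), 1))
```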
The photos displayed on the projection surface 140a by the photo application can be manipulated and browsed simultaneously by many people in many directions. For example, the users at four sides of the projection surface 140a can simultaneously select photos in many directions and move their positions while changing the directions, edit the photos, or add new photos.
The present specific example is a form in which the state of the projection surface and the state of an application during activation are managed and an illuminator (a light) is controlled as necessary. In general, when a projector and an illuminator are used together, a projected image may become unclear due to the brightness of the illuminator. For this reason, a person turns off the illuminator in a room or turns off only the illuminator in the vicinity of the projection surface. When a projector and an illuminator are used together, the information processing system 100 according to the present specific example controls the illuminator such that an image projected by the projector is clearly displayed, and thus such effort by the person can be reduced. The information processing system 100 according to the present specific example is of a projection type, and it is assumed that a controllable illumination unit is integrated with the body or that a separate illumination unit can be remotely adjusted. The illumination unit is assumed to be capable of changing a radiation range and a radiation direction. Hereinafter, the present specific example will be described specifically with reference to
As illustrated in
For example, when the projector 2012 projects nothing to the projection surface of the table 140a as in
As in
The output control unit 122 can control the illumination units 2010 based on a kind of object placed on the table 140a. For example, when the object placed on the projection surface is detected by the detection unit 121, the output control unit 122 may control the illumination units 2010 such that the projection surface is irradiated with an amount of light according to whether the detected object is an illuminant that emits light or a reflector (which does not emit light) reflecting light. Specifically, the output control unit 122 controls the illumination units 2010 such that the projection surface is irradiated with a small amount of light when the object is an illuminant and such that the projection surface is irradiated with a large amount of light when the object is a reflector. Hereinafter, illumination control when a smartphone is placed as an illuminant and a plate is placed as a reflector on the table 140a will be described with reference to
As illustrated in the left drawing of
For example, when the object is registered as a smartphone in the server, the output control unit 122 controls the illumination units 2010 such that the projection surface is irradiated with an original small amount of light, as illustrated in the right drawing of
On the other hand, when the object is registered as a plate in the server, the output control unit 122 controls the illumination units 2010 such that the projection surface is irradiated with a large amount of light, as illustrated in the left drawing of
The detection unit 121 may detect whether the object is an illuminant or a reflector with reference to an image captured by the camera 2011 while adjusting the amount of light by the illumination units 2010. For example, the detection unit 121 may recognize that the object is an illuminant when the object can be detected even in a dark state, and may recognize that the object is a reflector when the object can first be detected in a bright state. According to such a scheme, the detection unit 121 can identify even an object unregistered in the server as an illuminant or a reflector without inquiring of the server.
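A minimal sketch of this identification scheme, in which the object is first sought in a dark state and the illumination is raised only when that fails, is shown below. The functions standing in for the illumination units 2010, the camera 2011, and the detection unit 121 are hypothetical stand-ins.

```python
def classify_object(set_illumination, capture_image, object_detected) -> str:
    """Classify an object on the projection surface as illuminant or reflector."""
    set_illumination(0.1)                 # first try a dark state
    if object_detected(capture_image()):
        return "illuminant"               # visible even when dark: it emits light
    set_illumination(0.9)                 # then raise the illumination
    if object_detected(capture_image()):
        return "reflector"                # visible only when brightly illuminated
    return "unknown"

# Dummy stand-ins: the "camera" reports the current illumination level and the
# "detector" can see the object only when that level is high, mimicking a
# reflector such as a plate.
state = {"level": 0.0}

def set_illumination(level: float) -> None:
    state["level"] = level

def capture_image() -> float:
    return state["level"]                 # the "image" is just the brightness here

def object_detected(image: float) -> bool:
    return image > 0.5                    # a plate is visible only when bright

print(classify_object(set_illumination, capture_image, object_detected))  # reflector
```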
The information processing system 100 can improve search precision (detection precision) of the object through control of an illumination area. For example, as illustrated in the left drawing of
The information processing system 100 can improve the search precision of the object through control of the amount of light. Specifically, the detection unit 121 adjusts the amount of light of the illumination units 2010 according to a material of the marker. For example, when the marker is formed of a glossy material such as glass or plastic and a highlight occurs, it may be difficult to detect the marker from the image captured by the camera 2011. Therefore, the detection unit 121 controls the illumination units 2010 such that the projection surface is irradiated with a small amount of light by which the marker can be detected. On the other hand, for example, when the marker is formed of a glossless (matte) material such as cloth, paper, or wood, the marker can be easily detected when an environment is bright. Therefore, the detection unit 121 controls the illumination units 2010 such that the projection surface is irradiated with as large an amount of light as possible. The detection unit 121 can determine the material of the marker, for example, with reference to information indicating the material of the marker registered in the server. As illustrated in
As illustrated in
As described above, according to the present specific example, the information processing system 100 can project a clear image to the projection surface by adjusting the illumination intensity and the illumination range by the illumination units according to the state of the projection surface. The information processing system 100 can suppress an influence on an environment in which an entire room unintentionally becomes dark or bright by adjusting the illumination range so that a necessary spot is irradiated. The information processing system 100 can improve recognition precision of the object placed on the table 140a by adjusting the illumination intensity and the illumination range of the illumination units.
The present specific example is a form in which an excess of the number of recognizable manipulation objects is fed back. For example, when there is no feedback even when the table 140a is touched with a finger, the user may not discern whether the touch has failed to be recognized, whether a UI has failed to respond despite the touch being recognized, or whether he or she has failed to execute a manipulation. Here, when a recognizable number is exceeded, the information processing system 100 fails to detect a touch corresponding to the excess, and thus it is difficult to give the user feedback.
Accordingly, the information processing system 100 according to the present specific example defines a number obtained by subtracting 1 from a computationally recognizable upper limit of the manipulation object as a recognizable upper limit based on specifications. The computationally recognizable upper limit means an upper limit of manipulation objects which can be detected by the detection unit 121. That is, one buffer is provided and the recognizable upper limit based on specifications is defined. Of course, the number of buffers may be any number other than 1. When the recognizable upper limit based on specifications is exceeded by 1, that is, when the number reaches the computationally recognizable upper limit, the information processing system 100 gives the user the feedback indicating that the manipulation object is unrecognizable. Specifically, the detection unit 121 detects a touched manipulation object on the table 140a. When the number of manipulation objects detected by the detection unit 121 is the computationally recognizable upper limit, the output control unit 122 controls the output unit 130 such that a warning is output. Hereinafter, the description will be made specifically with reference to
When the number of fingers touching the table 140a has already reached the recognizable upper limit based on specifications, the output control unit 122 may give feedback indicating whether a finger is recognizable, for example, at a timing at which the finger enters the view angle of the camera before the finger touches the table. Further, the output control unit 122 may give, for example, feedback indicating that a touch cannot be detected when hands are clasped together and the touch is therefore not available. In addition to the recognizable upper limit of the detection unit 121, the output control unit 122 may give, for example, feedback according to a recognizable upper limit defined in an application, such as a two-player game in which each player uses one finger. For example, when the recognizable upper limit is 4 and the number of fingers touching the table is 6, two of the fingers are unrecognizable, and feedback is given in preference to the top left side for scanning convenience. When there is a vacancy within the recognizable upper limit, recognition can likewise be assigned in preference to the top left side. Of course, the preferential position is not limited to the top left side, and any position may be preferred according to product design.
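A minimal sketch of the feedback rule described above, in which one detection slot is kept as a buffer and a warning is output as soon as the computationally recognizable upper limit is reached, is shown below; the numeric limit is an illustrative value.

```python
COMPUTATIONAL_LIMIT = 5              # upper limit the detection unit can detect (illustrative)
SPEC_LIMIT = COMPUTATIONAL_LIMIT - 1 # recognizable upper limit based on specifications

def on_touches_detected(num_detected: int) -> str:
    """Warn when the number of detected manipulation objects reaches the
    computational limit, i.e., exceeds the specification-based limit by one."""
    if num_detected >= COMPUTATIONAL_LIMIT:
        return "warning: additional manipulation objects are unrecognizable"
    return f"{num_detected} manipulation object(s) recognized"

print(on_touches_detected(SPEC_LIMIT))           # still within the usable range
print(on_touches_detected(COMPUTATIONAL_LIMIT))  # buffer consumed: warning is output
```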
According to the present specific example, it is possible to explicitly give the user feedback indicating that the recognizable upper limit of manipulation objects is exceeded. Accordingly, it is possible to prevent a situation from deteriorating into the user misunderstanding that the UI is not responding and repeatedly hitting the table 140a.
The present specific example is a form in which a manipulation mode is changed according to a hand with which no manipulation is executed. The user can manipulate an image, text, an application, or the like projected to the table 140a with his or her finger. However, when the user has to temporarily stop what he or she is doing, open a menu, and select a manipulation mode in order to change the manipulation mode, it is difficult for him or her to continue working without interruption. Accordingly, the information processing system 100 according to the present specific example changes the manipulation mode based on a recognition result of the hand with which no manipulation is executed.
Specifically, the detection unit 121 detects one pair of hands of the user. For example, two hands detected on the same side are detected as the one pair of hands by the detection unit 121. The output control unit 122 controls the output unit 130 such that an output is executed to cause one hand belonging to the one pair of hands detected by the detection unit 121 to function as an action point. For example, the output control unit 122 expresses an interaction of a scroll or the like according to a touched position by causing the right hand touching the table 140a with a finger to function as an action point to manipulate an application projected to the table 140a. The output control unit 122 controls the output unit 130 such that an output is executed to cause the other hand to function as a switcher which switches classification of an action at the action point according to the shape of the one hand. For example, the output control unit 122 switches the manipulation mode of a manipulation by the right hand according to the shape of the left hand. Of course, the functions of the right and left hands may be reversed. Hereinafter, the description will be made specifically with reference to
For example, when the left hand is in the shape of a rock of the rock-paper-scissors game, the detection unit 121 recognizes the shape of the left hand based on a captured image and the output control unit 122 switches the manipulation mode to “paperweight mode.” In the “paperweight mode,” as illustrated in the upper drawing of
The output control unit 122 may explicitly give the user a feedback of the manipulation mode by projecting display indicating the current manipulation mode to one of the right hand functioning as the action point and the left hand functioning as the switcher. For example, the output control unit 122 controls the output unit 130 such that a scissors mark is projected to a fingernail or the back of the hand when the manipulation mode is the scissors mode. The output control unit 122 may switch the classification of the action at the action point according to the shape of the right hand functioning as the action point. For example, the output control unit 122 may control the output unit 130 such that a fine line is drawn when one finger of the right hand is spread and a thick line is drawn when two fingers of the right hand are spread. The output control unit 122 may maintain the manipulation mode even when the left hand functioning as the switcher is off the table 140a and the recognition of the detection unit 121 fails. For example, even when the left hand of the user is in the scissors shape, and the manipulation mode is switched to the scissors mode, and if the user subsequently pulls back his or her left hand, the output control unit 122 may maintain the scissors mode.
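A minimal sketch of switching the manipulation mode according to the recognized shape of the switcher hand, and of maintaining the current mode when the switcher hand leaves the table and recognition fails, is shown below. The shape labels and the recognizer that would produce them are assumptions introduced for illustration.

```python
# The mode names follow the description above; the shape labels are assumed.
SHAPE_TO_MODE = {
    "rock": "paperweight mode",
    "scissors": "scissors mode",
}

class ModeSwitcher:
    def __init__(self, default_mode: str = "normal mode"):
        self.mode = default_mode

    def update(self, switcher_hand_shape):
        """switcher_hand_shape is None when the switcher hand has left the
        table and recognition fails; the current mode is then maintained."""
        if switcher_hand_shape is not None:
            self.mode = SHAPE_TO_MODE.get(switcher_hand_shape, self.mode)
        return self.mode

switcher = ModeSwitcher()
print(switcher.update("rock"))       # -> paperweight mode
print(switcher.update("scissors"))   # -> scissors mode
print(switcher.update(None))         # hand pulled back: scissors mode is kept
```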
As described above, according to the present specific example, the user can switch the manipulation mode with the hand with which no manipulation is executed. Therefore, the user can seamlessly switch the manipulation mode without interruption of his or her current task, and thus continuous work is possible. Since the user can intuitively switch the manipulation mode, a learning cost related to the switching of the manipulation mode is low.
The present specific example is a form in which constituent elements such as a camera and a projector are formed in units of modules and each module can be replaced as necessary by enabling connection via a standardized interface. When the information processing system 100 is formed as an integrated product, it may be difficult to extend functions by any method other than replacing the entire information processing system 100. Accordingly, in the information processing system 100 according to the present specific example, the constituent elements are modularized and can be exchanged in module units.
Specifically, a CPU, a camera, a projector, an LED light, a microphone, a speaker, and the like included in the information processing system 100 are stored in standardized modules. Such constituent elements may be individually stored or a plurality of constituent elements may be combined and stored in one module. For example, a module storing the CPU, the projector, and the camera may be comprehended as a core module and a module storing the other constituent elements may be comprehended as a sub-module. Mutual communication and power feeding can be achieved by connecting the modules via a common interface and all of the connected modules can function as the information processing system 100. It is also possible for only the core module to function as the information processing system 100. The interface may be realized through wireless communication, may be realized through wired communication, or may be connected physically by terminals. Hereinafter, the present specific example will be described specifically with reference to
When illuminators are provided away from the core module, the core module may recognize a positional relation between the core module and the illuminators by specifying light-emitting positions through image recognition while causing the illuminators to sequentially emit light. Accordingly, the core module can cause the illuminators provided at proper positions to selectively emit light according to the state of the projection surface or the like. Additionally, the core module may notify the user that an expiration date is approaching by recording an installation date of the illuminators and projecting a message, "These lights will soon expire," for example. When a speaker with a broad range is fitted, the core module may output a sound mainly using this speaker and may use another speaker for balance adjustment.
According to the present specific example, since each module can be replaced partially rather than the entire product, the replacement cost of the product is suppressed and resources are saved. According to the present specific example, it is possible to easily realize the extension of the function by replacing the module. For example, the user can improve performance such as the processing capability of the CPU, the resolution of the camera, and the recognition precision by substituting the core module. The user can enjoy a design variation of the abundantly developed speakers and illuminators, for example, by substituting the sub-module.
The present specific example is a form in which display of screens is synchronized when a plurality of screens of the same application are displayed. Generally, an application is displayed in a single direction. However, when it is assumed that a plurality of users surround the table 140a and use one application, it is hard, for example, for users standing on opposite sides of the table to both view the application. Accordingly, the information processing system 100 according to the present specific example displays a plurality of screens of the same application and switches between synchronization (mirroring) and non-synchronization (release of the mirroring) of the screens as necessary.
Specifically, the detection unit 121 detects a manipulation object. The output control unit 122 controls the output unit 130 such that at least two screens are displayed on the display surface according to the manipulation object detected by the detection unit 121. For example, the output control unit 122 displays the screens of the same application based on the directions of fingers detected by the detection unit 121 so that the screens directly face the plurality of users surrounding the table 140a. When the screens are synchronized, the output control unit 122 controls the output unit 130 such that display is executed to similarly reflect a manipulation on one screen with the manipulation object detected by the detection unit 121 on the other screen. For example, when one user scrolls the screen with his or her finger, the screen displayed for the other user is scrolled similarly. Hereinafter, the description will be made specifically with reference to
In addition to the scroll, the output control unit 122 can, for example, synchronously display text entry, input of a marker in a map application, etc. on all screens. Additionally, for example, when a plurality of users browse a certain entire web page, the output control unit 122 may display the positions of regions displayed by other users in rectangular forms or may display the directions of the regions with arrows.
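A minimal sketch of the mirroring itself, in which a manipulation applied to one screen is also applied to every other screen of the same application while synchronization is enabled, is shown below; the class names and the event representation are illustrative.

```python
class AppScreen:
    def __init__(self, name: str):
        self.name = name
        self.scroll_y = 0

    def apply(self, event: dict):
        if event["type"] == "scroll":
            self.scroll_y += event["dy"]

class ScreenGroup:
    def __init__(self, screens, synchronized: bool = True):
        self.screens = screens
        self.synchronized = synchronized

    def on_user_event(self, source: AppScreen, event: dict):
        source.apply(event)
        if self.synchronized:                        # mirroring enabled
            for screen in self.screens:
                if screen is not source:
                    screen.apply(event)              # reflect the manipulation

group = ScreenGroup([AppScreen("user A"), AppScreen("user B")])
group.on_user_event(group.screens[0], {"type": "scroll", "dy": 120})
print([s.scroll_y for s in group.screens])           # [120, 120] while synchronized
group.synchronized = False                            # release the mirroring
group.on_user_event(group.screens[1], {"type": "scroll", "dy": -40})
print([s.scroll_y for s in group.screens])           # [120, 80]
```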
The output control unit 122 may divide (branch) one screen into two screens or may unify (join) two screens into one screen. The output control unit 122 may display the plurality of branched screens synchronously or asynchronously. The output control unit 122 may rejoin the plurality of branched screens. In this case, the output control unit 122 displays one screen serving as a master as the joined screen.
Various methods of deciding the screen serving as the master at the time of joining are considered. For example, the output control unit 122 may set the screen first selected to be joined as a master and set the other screen as a slave. At this time, the output control unit 122 may display a dialog “Would you like to join?” on another screen and set a screen on which the user agrees to join as a slave. When the joined screens are branched again, the output control unit 122 may display a screen displayed originally by a slave as the screen of the slave. For example, an example in which the output control unit 122 synchronizes a master displaying web page “A” with a slave displaying web page “B” to display one web page “A” is assumed. Thereafter, when the web pages are branched again, the output control unit 122 may cause the master to display web page “A” and cause the slave to display web page “B.”
Various opportunities to execute the branching and the joining are considered. For example, the output control unit 122 may control the output unit 130 such that the branching and the joining are executed by a user's selection of a menu item detected by the detection unit 121. Additionally, the output control unit 122 may execute the branching when the detection unit 121 detects an operation of dragging fingers touching one screen apart to the right and left. Additionally, the output control unit 122 may execute the branching when the detection unit 121 detects an operation in which two users touch one screen and pull it apart to the right and left as if to cut the screen. In contrast, the output control unit 122 may execute the joining when the detection unit 121 detects an operation of moving fingers touching two screens so that the fingers overlap. The detection unit 121 may distinguish manipulations indicating the branching and the joining from manipulations such as pinch-in and pinch-out according to the number of fingers or the like. The output control unit 122 may permit only the master to decide the joining or non-joining, may permit only the slave to decide the joining, or may permit all of the screens including the slave to decide the joining.
As described above, according to the present specific example, since a plurality of screens can be displayed in directions according to the positions of the users, it is possible to realize high visibility from different directions. According to the present specific example, since the plurality of screens can be switched between synchronous and asynchronous display as necessary, it is possible to realize display and manipulations that flexibly follow the state of an application. According to the present specific example, when the screens are synchronized, a manipulation by another person is fed back, so the user can easily recognize which manipulation the other person executes and how the application operates.
The present specific example is a form in which a subject on the table 140a is recorded and is reproduced with its original size. The subject on the table 140a is, for example, an object such as a picture or a photo placed on the table 140a, or an image projected to the table 140a. The information processing system 100 according to the present specific example images a subject on the table 140a at a certain time point and causes a projector (projection unit) to project the captured image so that the subject is subsequently displayed with its real size on the table 140a. Here, when the environment, such as the projection distance or the projection view angle, changes between recording of a state and reproduction of that state, it may be difficult for the information processing system 100 to reproduce the state with the original size in some cases. Accordingly, the information processing system 100 according to the present specific example stores the projection distance between the projector and the table 140a and the projection view angle of the projector at the recording time point and changes (calibrates) the projection size according to the projection distance and the projection view angle at the reproduction time point.
As a prerequisite process for such calibration, the control unit 120 executes an adjustment process of matching an imaged size obtained by the camera and a projected size produced by the projector. Specifically, the control unit 120 functions as an adjustment unit executing adjustment so that the projected size of the subject matches the real size of the subject when an image obtained by imaging the subject on the table 140a by the camera is projected to the table 140a by the projector. The control unit 120 executes, for example, position alignment of 4 points on the table 140a and executes homography conversion as the adjustment process. In an environment realized by the projector, the camera, and the table 140a in which such an adjustment process has been executed, for example, the information processing system 100 can capture a picture placed on the table 140a and project the picture with the same size at a later date. However, when the settings of the projector such as the projection distance or the projection view angle are changed, the projected size of the image projected by the projector is also changed. Accordingly, the information processing system 100 realizes projection of the subject with the original size by storing the captured image and the setting information in association therewith and adjusting the projected size according to a change in the setting information.
Specifically, in an environment (first environment) at a certain time point at which the adjustment process is executed, the control unit 120 first stores a captured image obtained by the camera (first imaging unit) imaging the subject on the table 140a (first projection surface) and the setting information of the projector. The control unit 120 may function as a storage unit that stores the setting information. The setting information is information that includes information indicating a projection distance which is a distance between the projector and the projection surface. The projection distance may be information indicating a distance between the projector and the subject. The setting information may further include information indicating a projection view angle which is a view angle of the projector.
When the projection distance or the projection view angle is changed, the control unit 120 first executes the adjustment process. Subsequently, in the environment subjected to the adjustment process (second environment), the information processing system 100 compares the setting information of the projector after the change (second projection unit) to the stored setting information before the change. Based on the comparison result, the information processing system 100 controls the projector such that the subject of the stored captured image is projected with the real size. Hereinafter, the description will be made specifically with reference to
As illustrated in the upper drawing of
When the projection distance is changed, for example, when the table 140a is replaced, the output control unit 122 compares the stored setting information to the setting information after the change and controls the projector such that the expansion or reduction display is executed. For example, as illustrated in the middle drawing of
When the projection view angle is changed, for example, when the projector is replaced, the output control unit 122 compares the stored setting information to the setting information after the change and controls the projector such that expansion or reduction display is executed. For example, as illustrated in the lower drawing of
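The scale factor used for such expansion or reduction can be derived from the stored and current setting information: the projected width of the full image is proportional to the projection distance multiplied by the tangent of half the projection view angle, so the stored image is scaled by the ratio of the stored projected width to the current projected width. The sketch below illustrates this relation; for narrow view angles the ratio is approximately the ratio of the view angles themselves, which roughly matches the halving described above. The numeric values are illustrative.

```python
import math

def reproduction_scale(stored_distance_m: float, stored_view_angle_deg: float,
                       current_distance_m: float,
                       current_view_angle_deg: float) -> float:
    """Digital scale factor applied to the stored captured image so that the
    subject is reprojected with its real size."""
    stored_width = stored_distance_m * math.tan(
        math.radians(stored_view_angle_deg) / 2)
    current_width = current_distance_m * math.tan(
        math.radians(current_view_angle_deg) / 2)
    return stored_width / current_width

# Projection distance doubled (e.g., the table 140a was replaced): scale to 0.5.
print(reproduction_scale(1.0, 60.0, 2.0, 60.0))             # 0.5
# Projection view angle widened (e.g., the projector was replaced): scale < 1.
print(round(reproduction_scale(1.0, 30.0, 1.0, 60.0), 3))   # ~0.464
```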
Various timings at which such a calibration function is invoked are considered. For example, as illustrated in
The information processing system 100 may automatically adjust the height using the elevation function so that a distance at which the projection surface is used as broadly as possible is set. Specifically, the output control unit 122 controls the projector such that a predetermined pattern is projected, and the detection unit 121 controls the camera such that the projected image is captured. Then, the control unit 120 adjusts the height using the elevation function until a height is reached at which the captured image shows the projected predetermined pattern falling entirely on the table 140a. For example, in the example illustrated in the upper drawing of
The information processing system 100 may execute the calibration during elevation of the elevation function. For example, as illustrated in the upper drawing of
According to the present specific example, as described above, it is possible to maintain the display of the original size even when a change in an environment, such as replacement of the table, a change in an installation location, or a change in the projector occurs. Accordingly, it is possible to record and reproduce, for example, a memorable work without change.
In the case of the projection type illustrated in
Here, convergence times were measured by suspending the device included in the information processing system 100 using the following four materials. Exterior examples of a steel shaft, a carbon fiber reinforced plastic (FRP) shaft, and carbon shafts containing power lines, viewed in a side direction, are illustrated in
As shown in Table 1, the convergence time in which there was no problem in actual use was confirmed when the device included in the information processing system 100 was suspended using the carbon shaft. Further, by using two carbon shafts containing power lines, the convergence time in which there was no problem in actual use was confirmed even in a material with a small outer diameter. As a fiber used for the material of the shafts, for example, glass, aramid, boron, bamboo, hemp, polyethylene terephthalate (PET), polyethylene (PE), or polypropylene (PP) can be used in addition to carbon.
The present specific example is a form in which an application activation location is automatically selected. When an application is normally activated at a predetermined location, for example, an object placed on the table 140a may be an obstacle, and thus it may be difficult to display the entire application screen. Even when the application screen is intended to be moved, an object placed on the table 140a may be an obstacle, and thus it may be difficult to move the application screen. Accordingly, the information processing system 100 according to the present specific example recognizes an object on the table 140a at the time of activation of an application, searches for a position satisfying constraint conditions (display conditions) set for each application, and displays an application screen. When an object is moved or a new object is placed and a state on the table 140a is thus changed, the information processing system 100 moves the application screen to automatically avoid the object after the change.
Specifically, the detection unit 121 first detects an object on the table 140a by acquiring depth information. Then, the output control unit 122 controls the output unit 130 such that an image is displayed in a region other than a region overlapping the object detected by the detection unit 121. Accordingly, an application screen is displayed in a region in which there is no object. Hereinafter, the description will be made specifically with reference to
For example, in an example illustrated in
The detection unit 121 functions as an estimation unit that estimates the position of the user based on the position and the direction of a hand or a finger. The output control unit 122 controls the output unit 130 such that an image is displayed at a position corresponding to the estimated position of the user according to display conditions set as a relation with the user. For example, when displaying the image near the position of the user is set as a display condition, the output control unit 122 controls the output unit 130 such that the application screen is displayed near the position of the user. The display conditions regarding the display position may be comprehended as setting of the weight working on the application screen. For example, the display conditions of the application illustrated in
Web Browser 2021
Music Player 2022
According to the display conditions, the output control unit 122 displays the web browser 2021 and the music player 2022 at positions close to the user and on flat surfaces satisfying the minimum sizes, as illustrated in
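By way of illustration only, the region search described above can be outlined by the following minimal Python sketch; the names choose_display_region, Rect, candidates, obstacles, min_size, and user_pos, as well as the distance-based cost standing in for the weight toward the user, are assumptions introduced here and are not taken from the present disclosure.

import math
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def overlaps(self, other):
        return not (self.x + self.w <= other.x or other.x + other.w <= self.x or
                    self.y + self.h <= other.y or other.y + other.h <= self.y)

    def center(self):
        return (self.x + self.w / 2.0, self.y + self.h / 2.0)

def choose_display_region(candidates, obstacles, min_size, user_pos):
    # candidates: Rect proposals on the projection surface
    # obstacles:  Rect of real objects detected from the depth information
    # min_size:   (min_w, min_h) required by the application (a display condition)
    # user_pos:   (x, y) estimated from the position and direction of the user's hand
    best, best_cost = None, float("inf")
    for r in candidates:
        if r.w < min_size[0] or r.h < min_size[1]:
            continue  # too small for the application screen
        if any(r.overlaps(o) for o in obstacles):
            continue  # would overlap a real object on the table 140a
        cx, cy = r.center()
        cost = math.hypot(cx - user_pos[0], cy - user_pos[1])  # prefer regions near the user
        if cost < best_cost:
            best, best_cost = r, cost
    return best  # None indicates that the search for a proper region has failed

When None is returned, fallback processes such as those discussed later for the case in which searching for a proper projection surface fails would apply.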
When movement of an object is detected by the detection unit 121, the output control unit 122 may display an image according to the positional relation of the moved object on the table 140a at a position at which the display conditions are more matched. For example, in the example illustrated in
The output control unit 122 may control the output unit 130 such that an image is displayed at a position according to the display conditions set as a relation with an object on the table 140a. For example, when the image is set such that the image is displayed adjacent to an end (edge) of the object on the table 140a, the output control unit 122 controls the output unit 130 such that an application screen is displayed adjacent to an object detected by the detection unit 121. For example, the display conditions of the application illustrated in
Brook Application 2023
According to the display conditions, as illustrated in
The output control unit 122 may control the output unit 130 such that an image is displayed according to display conditions set as a relation with the table 140a (projection surface). For example, the display conditions of an application illustrated in
Candle Application 2024
According to the display conditions, as illustrated in
Various processes when searching for a proper projection surface fails are considered. For example, as illustrated in the upper drawing of
According to the present specific example, by automatically detecting and displaying a flat surface that is a proper display region for each application, it is possible to execute optimum display so that it is not necessary for the user to execute a manipulation. According to the present specific example, by dynamically searching for the display regions satisfying the display conditions such as the minimum size and the weight defined for each application, it is possible to automatically execute the optimum display.
The present specific example is a form in which control of sound output is executed so that a sound is audible from a sound source displayed on the table 140a. In the projection type information processing system 100, a video is projected to the projection surface (the table 140a) located therebelow and a sound is produced from the body located thereabove. Therefore, a sense of unity between the video and the sound is lost when a distance between the body and the table 140a is far. Accordingly, the information processing system 100 according to the present specific example causes a sound to be reflected from the projection surface by a directional speaker so that the sound is oriented toward the user. In particular, the information processing system 100 yields the sense of unity between the video and the sound by changing a position from which the sound is reflected in conformity with a manipulation and the position of the user according to the characteristics of an application.
The detection unit 121 functions as an estimation unit that estimates the position of the user based on the position and the direction of a hand or a finger. The output control unit 122 controls the speaker such that a sound output for an image displayed on the display surface is reflected to reach the position of the user estimated by the estimation unit. The information processing system 100 according to the present specific example includes a plurality of directional speakers and is assumed to be able to control direction and a directional range of each speaker. The output control unit 122 selects the speaker installed at a position at which a reflected sound can reach the user at the time of production of the sound toward the application screen based on a positional relation between the position of the user and an application display position and controls the speaker such that the sound is produced. Hereinafter, the description will be made specifically with reference to
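Purely for illustration, the speaker selection described above can be sketched with a simple mirror-image reflection model in Python; the assumptions that the projection surface is the plane z = 0, that speakers and the user are given as (x, y, z) coordinates, and the names reflection_point and select_speaker are introduced here and are not taken from the present disclosure.

def reflection_point(speaker, user):
    # Point on the projection surface (z = 0) at which sound emitted by the speaker
    # must be reflected to reach the user, using a mirror-image model.
    sx, sy, sz = speaker
    ux, uy, uz = user
    t = sz / (sz + uz)  # parameter at which the speaker-to-mirrored-user line crosses z = 0
    return (sx + t * (ux - sx), sy + t * (uy - sy))

def select_speaker(speakers, user, screen_center):
    # Pick the directional speaker whose reflection point lies nearest to the
    # application screen, so that the sound appears to come from the screen itself.
    def miss(speaker):
        rx, ry = reflection_point(speaker, user)
        return (rx - screen_center[0]) ** 2 + (ry - screen_center[1]) ** 2
    return min(speakers, key=miss)

# Hypothetical usage: two speakers in the body above the table, user seated at the edge.
print(select_speaker([(0.2, 0.0, 1.2), (0.8, 0.0, 1.2)], (0.9, 0.6, 0.4), (0.6, 0.3)))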
For example, when the application is an application in which a sound source is clear, such as a music player, as illustrated in
Various methods of yielding the sense of unity between a video and a sound are considered in addition to the cases in which the sound is reflected from the projection surface by the directional speaker. For example, the output control unit 122 may control the speaker such that a sound image is localized to the position of an application to be displayed. When a plurality of applications are used by a plurality of users, the information processing system 100 may emit only a sound of the application used by each user to the user. Additionally, when a plurality of users view the same moving image, the information processing system 100 may reproduce a sound in the native language of each user for that user. The information processing system 100 may emit a sound to the front side of the application screen, that is, in a direction in which a user executing the manipulation is normally located.
As described above, according to the present specific example, by controlling the position of the sound source according to the display position of the application, it is possible to provide the user with a sense of sound similar to the sound produced from the application screen itself. According to the present specific example, by controlling the position of the reflection according to a manipulation from the user, the sense of unity between a video and a sound can be yielded even when there is no prior information regarding a sound source such as a web browser. According to the present specific example, by controlling the channel configuration of LR according to the position of the user, it is possible to yield the sense of presence as if the user is viewing a home television.
The present specific example is a form in which a pre-set function is provoked when a specific condition is satisfied on the table 140a. A condition in which a function of an application is provoked can normally be set only by a vendor supplying the application. Depending on a use environment, a function is not provoked in a behavior defined in an application in some cases. Accordingly, the information processing system 100 according to the present specific example is configured such that a function to be provoked and a provoking condition can be freely set by the user.
The user generates a program in which a condition regarding the state on a display surface is associated with an output instruction. The information processing system 100 receiving the program executes an output based on a corresponding output instruction when the state on the display surface satisfies a condition defined by the program. Examples of the condition regarding the state on the display surface include placement of a specific object on the table 140a, a temperature on the table 140a, and a change in depth information. Hereinafter, the condition regarding the state on the display surface is also referred to as an output condition.
First, the detection unit 121 recognizes a manipulation object such as a finger touching the table 140a and detects programming by the user. Then, the control unit 120 stores a program in which an output instruction is associated with the output condition based on a detection result of the programming obtained by the detection unit 121. The control unit 120 may function as a storage unit that stores the program. Hereinafter, the description will be made specifically with reference to
As an example of the program, a program by which a temperature on the table 140a is set as an output condition is considered. In the program, a region in which the condition determination is executed can be set. For example, as illustrated in
Thereafter, the detection unit 121 detects, for example, an object on the table 140a and the pattern, temperature, humidity, or the like of the surface of the object as the state on the table 140a. When the state on the table 140a detected by the detection unit 121 satisfies the stored output condition, the output control unit 122 controls the output unit 130 such that an output according to the output instruction stored in association with the output condition is executed. For example, in regard to the program illustrated in
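As a reference sketch only, the association between an output condition and an output instruction can be expressed in Python as follows; the names ProgramStore and UserProgram and the temperature values used in the example are assumptions introduced here and do not reproduce the program illustrated in the figure.

from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class UserProgram:
    output_condition: Callable[[Dict], bool]    # predicate on the detected state of the table 140a
    output_instruction: Callable[[Dict], None]  # output executed when the condition is satisfied

class ProgramStore:
    # Stores user-generated programs and fires them against the detected state,
    # corresponding to the roles described for the control unit 120 and the output control unit 122.
    def __init__(self):
        self.programs: List[UserProgram] = []

    def register(self, program):
        self.programs.append(program)

    def on_state(self, state):
        for p in self.programs:
            if p.output_condition(state):
                p.output_instruction(state)

# Hypothetical example: a temperature condition evaluated in a user-set region of the table.
store = ProgramStore()
store.register(UserProgram(
    output_condition=lambda s: s.get("region") == "milk_bottle" and 36.0 <= s.get("temperature_c", 0.0) <= 37.0,
    output_instruction=lambda s: print("Projecting a notification: the milk has reached skin temperature."),
))
store.on_state({"region": "milk_bottle", "temperature_c": 36.5})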
The program will be further exemplified. For example, a program that displays the temperature around milk for a baby and notifies the user when the temperature reaches the temperature of human skin, such as 36 degrees to 37 degrees, is considered. Further, a program automatically turning on an illuminator and taking photos when a birthday cake is monitored, its candles are blown out, and the temperature sharply drops is considered. Furthermore, a program displaying news when a black drink (assumed to be a cup of coffee) is placed in front of the user at a morning hour is considered. The output control unit 122 may execute display, such as rectangular display indicated by a broken line, indicating that some program is executed in a programmed region, that is, a condition determination region. Of course, such display can be set not to be executed when the presence of a program is desired to be concealed for the purpose of surprise.
In addition to the above-described temperatures and patterns, for example, a change in depth can be set as an output condition. The detection unit 121 detects an object located on the table 140a based on depth information, and the output control unit 122 controls the output unit 130 such that an output is executed according to an output instruction stored in association when the detected state of the object satisfies the output condition. For example, the fact that a player has laid down a winning hand in mah-jong can be detected based on a change in the depth information. Accordingly, for example, a program recognizing a role, which is a state of a player's hand, based on a captured image and automatically calculating scores when the player lays down a winning hand in mah-jong is considered. The fact that the cover of a cake is removed and the content of the cake appears can also be detected based on a change in depth information. Accordingly, a program reproducing a birthday song, for example, when the cover of the box of a birthday cake placed in the middle of the table 140a is removed is considered. In a board game, the fact that a piece of a real object is stopped in a specific frame can also be detected based on a change in depth information. Accordingly, a program displaying an effect, for example, when the piece of the real object is stopped in the specific frame is considered.
According to the present specific example, not only a program by a vendor supplying an application but also a free and simple program by the user can be set. Accordingly, the provoking of the functions suitable for detailed circumstances on the table 140a is realized.
The present specific example is a form in which it is determined to whom an object placed on the table 140a belongs. According to use of an application, it may be necessary to be able to determine to whom the object placed on the table 140a belongs. Accordingly, in the present specific example, a hand placing an object on the table 140a is detected and it is determined to whom the object belongs by associating the detected hand with the object. It can also be comprehended that the user owns the object which belongs to the user.
First, the detection unit 121 detects that a manipulation object and an object entering a predetermined region in a contact state are separated. For example, based on depth information, the detection unit 121 detects that a hand holding an object enters the table 140a and the hand is separated from the object. Hereinafter, the description will be made specifically with reference to
As illustrated in
Subsequently, the detection unit 121 functions as a recognition unit that recognizes the detected manipulation objects and the objects separated from the manipulation objects in association with each other. Specifically, based on the depth information, the detection unit 121 recognizes the hands indicated by the closed curves adjoined to the sides of the table 140a and the objects indicated by the closed curves which are separated from the hands and are not adjoined to the sides of the table 140a in association with each other. When the objects are separated, the detection unit 121 may recognize the hands located at positions closest to the objects in association with the objects. The detection unit 121 recognizes the hands associated with the objects as destinations to which the objects belong. For example, as illustrated in
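For illustration only, the association between a separated object and the nearest hand can be sketched in Python as follows; the names nearest_hand and on_object_separated and the coordinates in the usage example are assumptions introduced here and are not taken from the present disclosure.

import math

ownership = {}  # object identifier -> label of the user to whom the object belongs

def nearest_hand(object_pos, hands):
    # object_pos: (x, y) centroid of a closed curve that is not adjoined to a side of the table 140a
    # hands:      {label: (x, y)} for closed curves adjoined to the sides of the table 140a
    return min(hands, key=lambda label: math.dist(object_pos, hands[label]))

def on_object_separated(object_id, object_pos, hands):
    # Called when the detection unit detects that a hand and the object it carried are separated.
    owner = nearest_hand(object_pos, hands)
    ownership[object_id] = owner  # the hand closest to the object is recognized as its destination
    return owner

# Hypothetical usage: hands enter from the left and right sides of the table.
print(on_object_separated("cup-1", (0.30, 0.40),
                          {"user_left": (0.25, 0.42), "user_right": (0.80, 0.10)}))
# -> "user_left"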
The present specific example can be applied to, for example, a roulette game. The detection unit 121 detects the values of chips placed on the table 140a using a pattern recognized from a captured image and a height recognized from the depth information and detects users betting with the chips. Then, the output control unit 122 controls the output unit 130 such that an obtainable amount of chips is displayed at a hand of the user having won the bet. Accordingly, the user can bring the chips from a pool in person with reference to the displayed amount of chips. Since a dealer is not necessary, all members can participate in the game.
The present specific example can also be applied to, for example, a board game. The detection unit 121 detects a user spinning a roulette wheel on the table 140a based on the direction in which the hand extends. Then, the output control unit 122 controls the output unit 130 such that display of a move of the user spinning the roulette wheel is executed automatically according to a roulette number. The output control unit 122 may execute warning display when the user attempts to spin the roulette wheel out of turn.
The output control unit 122 may execute a warning when the user has an object which the user should not have. For example, as illustrated in the upper drawing of
The detection unit 121 may detect transition in ownership (belonging destination). As rules of the transition, for example, "first victory" in which the ownership is fixed to the person who first touches an object and "final victory" in which the ownership transitions to the person who touches the object last are considered. Additionally, as a rule of the transition, "user selection" in which the ownership transitions according to selection of a user is considered. For example, as illustrated in
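As a simple illustration, the transition rules mentioned above can be sketched in Python as follows; the function name transfer_ownership and the rule identifiers are assumptions introduced here.

def transfer_ownership(rule, current_owner, toucher, confirmed=False):
    # rule: "first_victory", "final_victory", or "user_selection"
    if rule == "first_victory":
        return current_owner if current_owner is not None else toucher  # fixed to the first toucher
    if rule == "final_victory":
        return toucher  # ownership always follows the latest toucher
    if rule == "user_selection":
        return toucher if confirmed else current_owner  # transitions only when the user confirms
    raise ValueError("unknown rule: " + rule)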
The detection unit 121 may set the ownership according to division of an object. The detection unit 121 detects one closed curve as one object and detects division of an object when it is detected that two or more closed curves appear from the one closed curve. For example, the detection unit 121 detects division of an object when coins stacked in a plurality of layers for betting collapse. For example, as illustrated in
As described above, according to the present specific example, to whom the object placed on the table 140a belongs can be identified, and this can be treated as attribute information in an application, a game, or the like. Accordingly, for example, the output control unit 122 can execute an output to support game progress according to the ownership of the object. The output control unit 122 can visualize belonging information in the real world by presenting information indicating the ownership to the user.
The present specific example is a form in which a window projected to the table 140a can freely be manipulated. When a plurality of people surround the table 140a and the positions of the users are moved, the window projected to the table 140a is preferably moved, rotated, expanded, or reduced according to intentions of the users. Accordingly, in the present specific example, user interfaces for receiving manipulations on the window, such as movement of the window, are supplied. Hereinafter, the description will be made more specifically with reference to
In an example illustrated in
The detection unit 121 may comprehend the outer circumference of the window as a handle for a window manipulation and detect a user manipulation on the handle to realize the window manipulation. For example, when the detection unit 121 detects that the handle is touched, the information processing system 100 switches the manipulation mode to a window manipulation mode. The window manipulation mode is a manipulation mode in which a user manipulation is detected as a window manipulation. A manipulation mode in which a user manipulation is detected as a manipulation on an application of a scroll or the like is also referred to as a normal mode.
When the manipulation mode is switched to the window manipulation mode, the output control unit 122 controls the output unit 130 such that a handle 2041 is displayed as display indicating that the manipulation mode is the window manipulation mode, as illustrated in the left drawing of
When the detection unit 121 detects that dragging is executed from the outside of the window to the inside of the window, the detection unit 121 may switch the manipulation mode from the normal mode to the window manipulation mode. For example, as illustrated in the left drawing of
The detection unit 121 may switch the manipulation mode from the normal mode to the window manipulation mode in accordance with the number of fingers touching the window. For example, when the detection unit 121 detects that two fingers are touching the window, the manipulation mode may be switched to the window manipulation mode. Specifically, as illustrated in
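For illustration only, the switching between the normal mode and the window manipulation mode can be sketched in Python as follows; the name decide_manipulation_mode, the handle_width parameter, and its value are assumptions introduced here and are not taken from the present disclosure.

NORMAL_MODE = "normal"
WINDOW_MANIPULATION_MODE = "window_manipulation"

def decide_manipulation_mode(touch_points, window_rect, drag_started_outside=False,
                             handle_width=0.02):
    # touch_points:         list of (x, y) finger positions on the projection surface
    # window_rect:          (x, y, w, h) of the displayed window
    # drag_started_outside: True when the current drag began outside the window
    # handle_width:         thickness of the outer-circumference handle (illustrative value)
    x, y, w, h = window_rect

    def inside(p, margin=0.0):
        return (x + margin <= p[0] <= x + w - margin and
                y + margin <= p[1] <= y + h - margin)

    on_handle = any(inside(p) and not inside(p, handle_width) for p in touch_points)
    two_fingers_on_window = sum(1 for p in touch_points if inside(p)) >= 2

    if on_handle or drag_started_outside or two_fingers_on_window:
        return WINDOW_MANIPULATION_MODE  # moving, rotating, expanding, or reducing the window
    return NORMAL_MODE                   # scrolling and other manipulations go to the application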
According to the present specific example, the user can freely manipulate the window, and thus usability is improved.
Hereinafter, specific examples of applications which can be executed by the control unit 120 of the above-described information processing system 100 will be described.
(A: Karuta Card Assistance Application)
A karuta card assistance application is an application that assists in a karuta card game in which karuta cards arranged on the table 140a are used. The karuta card assistance application has a reading-phrase automatic read-aloud function, an answer display function, and a hint supply function. The reading-phrase automatic read-aloud function is a function of causing the output unit 130 to sequentially sound and output reading phrases registered in advance. The answer display function is a function of recognizing each karuta card from a captured image and generating effect display when the karuta card of an answer overlaps a hand of the user. In the hint supply function, when an answer is not presented despite elapse of a predetermined time from the read-aloud of the reading phrase, display indicating a hint range including a karuta card of the answer may be caused to be projected by the output unit 130 and the hint range may be further narrowed according to elapse of time, as illustrated in
(B: Conversation Assistance Application)
A conversation assistance application is an application that supports an excitement atmosphere during conversation of users. For example, the conversation assistance application can execute sound recognition on conversation of the users, extract keywords through syntax analysis on text from the conversation, and cause the output unit 130 to project an image corresponding to the keywords. Hereinafter, the description will be made more specifically with reference to
X: “I took a trip to Japan recently.”
Y: “How long by airplane?”
X: “About 5 hours. Surprisingly, a city.”
Y: “Did you see Mt. Fuji?”
X: “Mt. Fuji was lovely. I also saw the sunrise.”
In this case, the conversation assistance application extracts, for example, “Japan,” “airplane,” and “Mt. Fuji” as keywords from the conversation of the two users and causes the output unit 130 to project a map of Japan, an airplane, and Mt. Fuji, as illustrated in
(C: Projection Surface Tracking Application)
A projection surface tracking application is an application that executes proper projection according to a state of the projection surface. For example, the projection surface tracking application corrects and projects a projected image so that the projected image is displayed to directly face the user according to a state of the projection surface, such as inclination of the projection surface or unevenness on the projection surface. Hereinafter, the description will be made more specifically with reference to
The projection surface tracking application can also detect a user manipulation according to the state of the projection surface. In general, the projection surface tracking application detects a maximum flat surface as the projection surface at a specific timing such as the time of calibration and activation of a product and detects a user manipulation based on a difference in height between the projection surface and the finger. For example, as illustrated in FIG. 131, when the projection surface is a flat surface, the projection surface tracking application detects a user manipulation by detecting whether a finger is touching the projection surface according to a distance between the finger of the user and the flat surface. When a solid object is placed on the projection surface, the projection surface tracking application may detect a user manipulation by detecting a local difference between the finger of the user and the solid object. For example, as illustrated in
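For illustration only, the touch detection based on the difference in height between the finger and the surface below it can be sketched in Python as follows; the 10 mm threshold and the names is_touching and surface_below are assumptions introduced here and are not values from the present disclosure.

TOUCH_THRESHOLD_MM = 10.0  # illustrative threshold, not a value from the disclosure

def surface_below(finger_xy, object_depth_map, flat_surface_depth_mm):
    # Use the locally measured depth when a solid object is placed on the projection
    # surface; otherwise fall back to the calibrated maximum flat surface.
    local = object_depth_map.get(finger_xy)
    return local if local is not None else flat_surface_depth_mm

def is_touching(finger_depth_mm, finger_xy, object_depth_map, flat_surface_depth_mm):
    # Depth is measured from the camera above, so a smaller value means a higher position.
    surface_depth_mm = surface_below(finger_xy, object_depth_map, flat_surface_depth_mm)
    return (surface_depth_mm - finger_depth_mm) <= TOUCH_THRESHOLD_MM

# Hypothetical usage: the flat surface is 1000 mm from the camera, a solid object
# occupies cell (3, 4) at 900 mm, and the fingertip is detected at 893 mm above that cell.
print(is_touching(893.0, (3, 4), {(3, 4): 900.0}, 1000.0))  # -> True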
The projection surface tracking application can realize individual display using a mirror. In the individual display, a peripheral device including a mirror and a screen illustrated in
(D: Meal Assistance Application)
A meal assistance application is an application that supports progress of a meal of the user. For example, the meal assistance application recognizes how much food remains on a dish, i.e., a progress status of the meal, by storing in advance an empty state of the dish on which the food is put and comparing the empty dish with the current dish. The meal assistance application can cause the output unit 130 to project a pattern, a message, or the like according to the progress status of the meal. Hereinafter, the description will be made more specifically with reference to
As described above, the meal assistance application can support the progress of the meal of the user by causing the output unit 130 to project a pattern, a message, or the like according to the progress status of the meal and improving motivation for the meal of the user.
(E: Motion Effect Application)
A motion effect application can cause the output unit 130 to project an animation as if a picture were moving based on the picture placed on the table 140a. For example, as illustrated in
(F: Lunch Box Preparation Supporting Application)
A lunch box preparation supporting application is an application that supports the user in expressing various patterns with food ingredients. For example, when a target image is designated by the user, the lunch box preparation supporting application analyzes a color structure of the target image and specifies food ingredients, amounts, and an arrangement to express the target image as a pattern based on the analysis result. The lunch box preparation supporting application causes the output unit 130 to project guide display for guiding specified food ingredients, amounts, and arrangement. The user can generate a lunch box expressing a pattern imitating the target image by arranging the food ingredients according to the guide display. Hereinafter, the description will be made more specifically with reference to
The example in which the package on which the target image is formed is placed on the table 140a has been described above as the method of designating the target image. However, the method of designating the target image is not limited to the example. For example, the user can also designate an image included in a website output to the table 140a by the output unit 130 as a target image. The character image has been described above as an example of the target image. However, the target image may be an image of a vehicle, a landscape, a map, a toy, or the like.
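As an illustrative sketch only, the color-analysis step of the lunch box preparation supporting application can be expressed in Python as follows; the palette of representative ingredient colors and the names nearest_ingredient and plan_pattern are assumptions introduced here.

def nearest_ingredient(pixel_rgb, palette):
    # Map one pixel of the target image to the ingredient whose representative color is closest.
    def dist2(color):
        return sum((a - b) ** 2 for a, b in zip(pixel_rgb, color))
    return min(palette, key=lambda name: dist2(palette[name]))

def plan_pattern(target_pixels, palette):
    # Count, per ingredient, how many cells of the pattern it should fill; the counts can
    # then be converted into amounts and into the projected guide display.
    counts = {}
    for pixel in target_pixels:
        name = nearest_ingredient(pixel, palette)
        counts[name] = counts.get(name, 0) + 1
    return counts

# Hypothetical palette and a 2 x 2 target image.
palette = {"rice": (250, 250, 245), "seaweed": (20, 25, 20), "carrot": (237, 145, 33)}
print(plan_pattern([(255, 255, 255), (10, 10, 10), (230, 140, 40), (250, 250, 240)], palette))
# -> {'rice': 2, 'seaweed': 1, 'carrot': 1}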
(G: Daily Assistance Application)
A daily assistance application is an application that supports a behavior, such as learning, a hobby, and work, done every day by the user. For example, the daily assistance application can support a behavior done by the user by causing the output unit 130 to project useful information display for the user to an object in the real space. Hereinafter, the description will be made more specifically with reference to
As illustrated in
As illustrated in
(H: Dining Table Representation Application)
A dining table representation application is an application that executes colorizing for dining table representation. For example, the dining table representation application can recognize an object on a dining table from a captured image of the dining table and cause the output unit 130 to project display according to the object to the object. Hereinafter, the description will be made specifically with reference to
(I: Food Recommendation Application)
The food recommendation application is an application that recommends food to the user. For example, the food recommendation application can recognize food on a dining table from a captured image of the dining table, calculate nutrient balance of the food based on a recognition result, and recommend food supplementing deficient nutrients. For example, as illustrated in
(J: Mood Representation Application)
A mood representation application is an application that causes the output unit 130 to project a presentation according to food. For example, the mood representation application can recognize food on a dining table from a captured image of the dining table and cause the output unit 130 to project an image of an object having an affinity with the food. As a specific example, when the food on the dining table is Osechi-ryori, the mood representation application may cause the output unit 130 to project images of pine, bamboo, and plum. When the food on the dining table is thin wheat noodles, the mood representation application may cause the output unit 130 to project an image of a riverbed or a brook. In such a configuration, it is possible to improve the mood of the dining table according to the food.
(K: Tableware Effect Application)
A tableware effect application is an application that generates an effect according to placement of tableware on a dining table. For example, the tableware effect application may recognize classification of tableware placed on a dining table and cause the output unit 130 to output a display effect and a sound according to the classification. Hereinafter, the description will be made more specifically with reference to
As described above, the tableware effect application can provide a new type of enjoyment on the dining table by generating an effect according to the placement of the tableware on the dining table.
(L: Inter-Room Linking Application)
An inter-room linking application is an application that shares and links an application used by the user between rooms when the information processing systems 100 are installed in a plurality of rooms. For example, the information processing system 100 installed in a user's room acquires information regarding the application used in a living room by the user and enables the user to use the application continuously from the use in the living room after the user moves to the user's room. Hereinafter, the description will be made more specifically with reference to
The inter-room linking application can share the history information of the user between the rooms. The inter-room linking application continues to supply an application used in another room, with reference to the history information stored in that room, even after the user moves out of the room. For example, when the user selects a cooking recipe with the information processing system 100 installed in a living room and then moves to a kitchen, the selected cooking recipe is projected by the information processing system 100 installed in the kitchen. Accordingly, the user can continuously use the same application even after moving to another room.
Here, an example of an illumination control process according to a state of an application or a projection surface will be described.
When a video is projected to a projection surface such as a desk or a screen using a projector, the video of the projector becomes unclear due to environmental factors such as brightness of an illuminator or outside light in some cases. When the video of the projector becomes unclear, the user is forced to execute a task of darkening the entire room or turning off only an illuminator near the projection surface in order to clarify the video, and thus convenience for the user is damaged.
Accordingly, hereinafter, an illumination control process of clearly displaying a video of the projector on the projection surface by acquiring the state of the application or the projection surface and automatically controlling an illuminator according to a status of the projection surface so that the user need not execute a task of adjusting the illumination will be described.
First, definitions of terms in the flowchart illustrated in
In the examples of
For an environment map, an obtainable unit of ambient light is set as a map. The number of cells is decided according to the obtainable unit of ambient light. The obtainable unit of ambient light defining the number of cells of the environment map is, for example, a disposition location of an illuminance sensor or a pixel value of an imaging device that images a projection surface.
Numerical values of each cell of the environment map are normalized to numerical values of each cell of the illumination map. By normalizing the numerical values of each cell of the environment map to the numerical values of each cell of the illumination map, the illuminance of ambient light and brightness of an illuminator are associated.
The information processing system 100 adopts various methods as methods of obtaining the illuminance of ambient light. For example, the illuminance of ambient light may be calculated from a pixel value of an image obtained by imaging the projection surface using the camera of the input unit 110, or an illuminance sensor may be provided in the input unit 110 to obtain the illuminance of ambient light from a value of the illuminance sensor.
In the examples of
Association between the illuminance map and the environment map will be described.
When 3 sensors are arranged in the same direction in the illuminance sensor 3020, the number of cells of the environment map 3021 is 3 and the cells are defined as A00, A01, and A02. In the embodiment, by matching the number of cells of the environment map with the number of cells of the illumination map, the illumination map and the environment map are associated. In an environment map 3021′ after conversion, the number of cells is converted from 3 to 8. The cells after the conversion are defined as B00 to B03 and B10 to B13. A conversion formula at the time of the conversion can be defined as follows, for example.
B00=B10=A00
B01=B11=0.5×A00+0.5×A01
B02=B12=0.5×A01+0.5×A02
B03=B13=A02
The information processing system 100 according to the embodiment of the present disclosure can control the illumination device 3010 based on the value of each cell of the environment map after the conversion by adapting the number of cells of the environment map to the number of cells of the illumination map in this way.
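For reference only, the conversion above can be expressed in Python as follows; the generalization to an arbitrary number of cells (copying the end cells, averaging adjacent cells for the intermediate columns, and duplicating the row) is an assumption extrapolated from the formulas for B00 to B13 and is not itself recited in the present disclosure.

def convert_environment_map(a_cells, rows=2):
    # a_cells: one row of the environment map, e.g. [A00, A01, A02]
    # rows:    number of rows of the illumination map to which the row is duplicated
    n = len(a_cells)
    row = ([a_cells[0]] +
           [0.5 * a_cells[i - 1] + 0.5 * a_cells[i] for i in range(1, n)] +
           [a_cells[-1]])
    return [list(row) for _ in range(rows)]

# Reproduces B00 to B03 and B10 to B13 from A00 to A02:
print(convert_environment_map([100, 200, 300]))
# -> [[100, 150.0, 250.0, 300], [100, 150.0, 250.0, 300]]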
Next, the flowchart illustrated in
When the value of the system setting is set in each cell of the illumination map in the foregoing step S3001, the information processing system 100 subsequently acquires an execution state of a managed application and determines whether there are unprocessed applications (step S3002). The determination of step S3002 can be executed by, for example, the control unit 120.
When it is determined in the foregoing step S3002 that there are unprocessed applications (Yes in step S3002), the information processing system 100 subsequently acquires the application drawn on the innermost side among the unprocessed applications (step S3003). By executing the process from the innermost application, the information processing system 100 can reflect the value of the older application in the illumination map.
When the process of the foregoing step S3003 ends, the information processing system 100 subsequently confirms whether the corresponding application is defined in an application illumination association table (step S3004). The process of step S3004 can be executed by, for example, the control unit 120.
The application illumination association table is a table in which an application to be executed by the information processing system 100 is associated with brightness (brightness realized in accordance with illumination light of the illumination device 3010 with ambient light) at the time of execution of the application. When an application is not defined in the application illumination association table, it means that the application can be used even under the system setting or illumination setting of another application without special illumination control.
The application illumination association table may have information regarding whether to notify the user when each application controls the brightness of the illumination device 3010. For example, it can be understood from the application illumination association table illustrated in
When it is determined in the foregoing step S3004 that the corresponding application is not defined in the application illumination association table (No in step S3004), the information processing system 100 assumes that the application is processed and returns the process to the process of the foregoing step S3002. Conversely, when it is determined in the foregoing step S3004 that the corresponding application is defined in the application illumination association table (Yes in step S3004), the information processing system 100 acquires the value of the brightness of the application from the application illumination association table (step S3005). The process of step S3005 can be executed by, for example, the control unit 120.
When the process of the foregoing step S3005 ends, the information processing system 100 subsequently acquires a display area of the application in the display region by the projector (step S3006). The process of step S3006 can be executed by, for example, the control unit 120. Here, when there is no display area, for example, when there is the application but the application is iconized, the information processing system 100 may not set the application as a processing target.
When the process of the foregoing step S3006 ends, the information processing system 100 subsequently sets the value acquired from the application illumination association table in the cell of the illumination map corresponding to the display area acquired in the foregoing step S3006 (step S3007). The process of step S3007 can be executed by, for example, the control unit 120. When the process of the foregoing step S3007 ends, the application is assumed to be processed and the process returns to the process of the foregoing step S3002.
When it is determined in the foregoing step S3002 that there is no unprocessed application (No in step S3002), the information processing system 100 subsequently acquires the ambient light and sets a value in the environment map (step S3008). The process of step S3008 can be executed by, for example, the control unit 120.
The information processing system 100 normalizes the value of the ambient light (or the illuminance of the illumination device 3010) to the value of an illumination output (brightness) at the time of the process of setting the value in the environment map in step S3008. How much light (0% to 100%) the illuminators of the illumination device 3010 output and the degrees of illuminance obtained under the illuminators may be associated in advance at the time of factory shipment.
When the process of the foregoing step S3008 ends, the information processing system 100 subsequently executes association to determine how much of an influence the environment map has on the range of the illumination map (step S3009). The process of step S3009 can be executed by, for example, the control unit 120. The association between the illumination map and the environment map is executed by causing the number of cells of the illumination map to match the number of cells of the environment map, as described above. The association between the illumination map and the environment map may be executed in advance at the time of factory shipment.
When the process of the foregoing step S3009 ends, the information processing system 100 subsequently determines whether there is an unprocessed cell of the illumination map (step S3010). The process of step S3010 can be executed by, for example, the control unit 120. The order of the processes on the illumination map may begin, for example, from lower numbers assigned to the cells of the illumination map.
When it is determined in the foregoing step S3010 that there is an unprocessed cell (Yes in step S3010), the information processing system 100 compares the value of the processing target cell of the illumination map to the value of the ambient light corresponding to the processing target cell (the value of the corresponding cell of the environment map). Then, the information processing system 100 determines whether the value of the ambient light corresponding to the processing target cell of the illumination map is equal to or less than the value of the processing target cell of the illumination map (step S3011). The process of step S3011 can be executed by, for example, the control unit 120.
When it is determined in the foregoing step S3011 that the value of the ambient light corresponding to the processing target cell of the illumination map is equal to or less than the value of the processing target cell of the illumination map (Yes in step S3011), it means that the brightness necessary for the application is not achieved by the ambient light alone. Therefore, the information processing system 100 sets a value obtained by subtracting the value of the ambient light from the value of the processing target cell of the illumination map as the new illumination value of the cell (step S3012). The process of step S3012 can be executed by, for example, the output control unit 122.
Conversely, when it is determined in the foregoing step S3011 that the value of the ambient light corresponding to the value of the processing target cell of the illumination map is greater than the value of the processing target cell of the illumination map (No in step S3011), it means that brightness necessary for the application is exceeded by only the ambient light. Therefore, the information processing system 100 sets 0 as the value of the processing target cell of the illumination map (step S3013). The process of step S3013 can be executed by, for example, the output control unit 122.
When the process of the foregoing step S3013 ends, the information processing system 100 subsequently executes notification only on the application for which the notification is necessary when the brightness necessary for the application is exceeded by only the ambient light. Therefore, it is determined whether notification is set in the processing target application in the application illumination association table (step S3014). The process of step S3014 can be executed by, for example, the output control unit 122.
When it is determined in the foregoing step S3014 that the notification is set in the processing target application in the application illumination association table (Yes in step S3014), the information processing system 100 subsequently gives notification of the processing target application to the application (step S3015). The process of step S3015 can be executed by, for example, the output control unit 122.
The application receiving the notification in the foregoing step S3015 executes a notification process of displaying a message or outputting a sound, for example, “Surroundings are too bright. Please darken environment” or “Light is brighter than expected and recognition precision may deteriorate.”
Conversely, when it is determined in the foregoing step S3014 that the notification is not set in the processing target application in the application illumination association table (No in step S3014), the information processing system 100 returns to the process of the foregoing step S3010 assuming that the processing target cell of the illumination map is processed.
When it is determined in the foregoing step S3010 that there is no unprocessed cell (No in step S3010), the information processing system 100 subsequently sets the value of the illumination map set in the above-described series of processes in the output unit 130 (step S3016). The process of step S3016 can be executed by, for example, the output control unit 122. The output unit 130 controls the brightness of each illuminator of the illumination device 3010 based on the value of the illumination map.
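For illustration only, the per-cell computation corresponding to steps S3011 to S3016 can be sketched in Python as follows; the function name compute_illumination_map, the notify callback, and the numerical values in the example are assumptions introduced here.

def compute_illumination_map(requested, ambient, notify_flags, notify):
    # requested:    brightness requested for each cell (values written into the illumination map
    #               from the system setting and the application illumination association table)
    # ambient:      ambient-light value for each cell (environment map after conversion)
    # notify_flags: True for cells whose application requests notification
    # notify:       callback invoked when the ambient light alone exceeds the requested brightness
    output = []
    for req, amb, wants_notice in zip(requested, ambient, notify_flags):
        if amb <= req:
            output.append(req - amb)  # make up the shortfall with the illuminator (step S3012)
        else:
            output.append(0)          # ambient light alone is already bright enough (step S3013)
            if wants_notice:          # steps S3014 and S3015
                notify("Surroundings are too bright. Please darken environment.")
    return output                     # values handed to the output unit 130 (step S3016)

# Hypothetical single-row illumination map with three cells:
print(compute_illumination_map([80, 20, 50], [30, 40, 50], [False, True, False], notify=print))
# prints the warning for the second cell and returns [50, 0, 0]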
The information processing system 100 according to the embodiment of the present disclosure can execute the illumination control according to the application to be executed and the status of the projection surface such as the brightness of the ambient light by executing the above-described series of operations. The information processing system 100 according to the embodiment of the present disclosure can optimize the illumination according to the purpose of the user by executing the illumination control according to the status of the projection surface.
The process of acquiring the ambient light in step S3008 of the flowchart illustrated in
Hereinafter, the illumination control according to the application to be executed and the status of the projection surface such as the brightness of the ambient light will be described exemplifying specific applications.
A cinema moving image player is an application which darkens the surroundings at the time of output of content to create a sense of immersion in a movie.
Referring to the application illumination association table illustrated in
A child moving image player is an application which brightens the surroundings at the time of output of content so that a child does not view an animation or the like in a dark place.
Referring to the application illumination association table illustrated in
When the illuminator of the illumination device 3010 is turned on and the brightness becomes the brightness defined in the application illumination association table, the information processing system 100 notifies the child moving image player that the brightness has become the brightness defined in the application illumination association table. The notified child moving image player determines that a safe environment has been prepared so that a child can view the content and starts reproducing the content.
Referring to the application illumination association table illustrated in
When there is outside light entering from a window, the information processing system 100 reflects the outside light to the environment map 3021′.
A candle application is an application which projects a candle video to the projection surface such as a table.
Referring to the application illumination association table illustrated in
However, in the example illustrated in
Accordingly, when the brightness is excessive despite the control of the illumination device 3010, for example, the information processing system 100 notifies the candle application that the brightness is excessive. The candle application notified by the information processing system 100 prompts the user to close a curtain, for example, and darken the ambient light by displaying, for example, “Surroundings are too bright. Please darken environment.”
A projection mapping application is an application that aims to project an image to a wall surface of a room or an object. When the projection mapping application is executed, the user installs, for example, a hemisphere in a mirror state on the projection surface so that an image output from the projector of the output unit 130 is reflected from the hemisphere. By reflecting the image output from the projector of the output unit 130 from the hemisphere in the mirror state, it is possible to project an image to a wall surface of a room or an object.
Referring to the application illumination association table illustrated in
As the projection mapping application, for example, there are a planetarium application projecting an image of a starry sky to a wall surface of a room and a revolving lantern application realizing a revolving lantern by projecting an image to Japanese paper installed on the projection surface. Hereinafter, the planetarium application and the revolving lantern application will be exemplified as the projection mapping applications in the description.
A globe application is an application that aims to express a globe by projecting an image to a hemisphere installed on the projection surface. When the globe application is executed, the user installs, for example, a hemisphere with a matte surface on the projection surface so that an image output from the projector of the output unit 130 is projected to the hemisphere. By projecting the image output from the projector of the output unit 130 to the hemisphere with the matte surface, it is possible to express the globe using the hemisphere.
A screen recognition application of a mobile terminal is an application which recognizes a screen of a mobile terminal installed on the projection surface with the camera of the input unit 110 and executes a process according to the recognized screen. When the screen of the mobile terminal is recognized and the illuminator of the illumination device 3010 is turned off and darkened, the screen can be easily recognized with the camera of the input unit 110.
However, when the illuminator of the illumination device 3010 is turned off and darkened, the screen may be whitened depending on a dynamic range of the camera of the input unit 110.
Accordingly, in the embodiment, in order to easily recognize the luminescent screen, the illuminator of the illumination device 3010 is not turned off, but the illuminator of the illumination device 3010 is set to be dark for the purpose of reducing highlight, as illustrated in the application illumination association table of
A food package recognition application is an application which recognizes the surface of a food package installed on the projection surface with the camera of the input unit 110 and executes a process according to the recognized food package. When the food package is recognized and the illuminator of the illumination device 3010 is turned on and brightened, the food package which is a reflector is easily recognized with the camera of the input unit 110.
A general object recognition application is an application which recognizes the surface of an object installed on the projection surface with the camera of the input unit 110 and executes a process according to the recognized object. When the object is recognized, the illuminator of the illumination device 3010 is brightened to about half brightness because it is not known in advance which object will be placed on the projection surface. By brightening the illuminator of the illumination device 3010 to about half brightness, the surface of the object is easily recognized with the camera of the input unit 110.
The information processing system 100 according to the embodiment can optimize the illumination for each purpose of the user by executing the illumination control according to a use status of the projection surface. The information processing system 100 according to the embodiment can adjust illumination of only a necessary spot of the projection surface by executing the illumination control using the environment map and the illumination map divided as the cells. By adjusting the illumination of only a necessary spot of the projection surface, the plurality of users can execute different tasks on content projected to the projection surface by the projector of the output unit 130 without stress.
The information processing system 100 according to the embodiment can clarify a video by detecting a portion in which an application is executed on the projection surface and adjusting the illumination. For example, the input unit 110 detects a portion of the projection surface on which eating is taking place, and the brightness of the illuminator of the illumination device 3010 is adjusted accordingly, so that a neighboring area in which eating is taking place can be prevented from being darkened.
The information processing system 100 according to the embodiment executes the illumination control when the general object recognition application is executed, and thus it is possible to improve recognition precision. The information processing system 100 according to the embodiment changes a control method for the illumination device 3010 according to a recognition target object, and thus it is possible to improve the recognition precision of the object placed on the projection surface.
The information processing system 100 according to the embodiment may control the brightness based on meta information or attribute information of content to be output from the output unit 130. For example, when the attribute information of the content to be output is set as an animation for children, the information processing system 100 controls the illumination device 3010 to brighten. When the attribute information is set as a movie for adults, the information processing system 100 may control the brightness of the illuminator of the illumination device 3010 to darken.
The information processing system 100 can change the brightness of individual content even for the same application by controlling the brightness based on meta information or attribute information of content to be output in this way.
The information processing system 100 may execute the control for immediate target brightness when the brightness of the illuminator of the illumination device 3010 is controlled. Additionally, the information processing system 100 may execute control through gradual brightening or darkening such that target brightness is ultimately achieved.
Here, an example of a process of provoking a preset function when a specific condition on the projection surface is satisfied will be described. A condition for provoking a function of an application can normally be set only by a vendor supplying the application. Depending on the use environment, a function is not provoked in a behavior defined in an application in some cases.
Accordingly, the information processing system 100 according to the embodiment is configured to allow the user to freely set a function to be provoked and a provoking condition. In the information processing system 100 according to the embodiment, the user is allowed to freely set a function to be provoked and a provoking condition, so that various representations can be expressed.
Various objects are placed on the table 140a by the users, and various interactions are executed daily on the table 140a. The information processing system 100 according to the embodiment allows the users to use these interactions as chances (triggers) to provoke functions. The information processing system 100 according to the embodiment delivers data to be used at the time of provoking of the functions to the processes (actions) to be executed when the functions are provoked, according to occurrence of the interactions.
Examples of the interactions to be used as provoking conditions (triggers) of the functions by the information processing system 100 according to the embodiment are as follows. Of course, the interactions are not limited to the following interactions:
Examples of the triggers when a sound on the table surface is picked up are as follows:
As data delivered to an action when a pattern of a sound is detected, for example, there is ID information for identifying the detected pattern. As data to be delivered to an action when a change in volume, the time for which a sound continues, or the direction from which a sound emanates is detected, for example, there are the volume, the sound duration, and the sound direction.
Examples of the triggers when an AR marker generated by the user is recognized are as follows:
As data to be delivered to an action when an AR marker is discovered, for example, there are ID information of a discovered marker, discovered coordinates, a discovered posture, a discovered size, and a time at which the AR marker is discovered. As data to be delivered to an action when a discovered AR marker is lost, for example, there are ID information of the lost marker, coordinates at which the marker was last seen, a posture at which the marker was last seen, a size in which the marker was last seen, and a time at which the marker is lost.
Examples of the triggers when depth information is recognized are as follows:
As data to be delivered to an action when a mass is recognized, for example, there are a location in which the mass is discovered, an area of the mass, a cubic volume of the mass, and a time at which the mass is recognized. As data to be delivered to an action when the disposition of an object on the table surface is recognized, for example, there are a location in which the disposition is changed, a time at which the disposition is changed, and a change amount (of area or cubic volume). As data to be delivered to an action when the change from the standard flat surface is detected, for example, there are an exterior, a location, an area, and a cubic volume of an object changed from the standard state and a date on which the object is first placed. As data to be delivered to an action when a motion is detected, for example, there are activeness of the motion, coordinates or an area of a region in which the motion is mainly done, and a date on which the motion is done. The activeness of the motion is, for example, an index obtained by multiplying an area in which the motion is done by a speed of the motion.
An example of the trigger when the brightness of the table surface is detected is as follows:
As data to be delivered to an action when the change in the brightness is detected, for example, there is information regarding the brightness.
Examples of the triggers when a hand state on the table surface is recognized are as follows:
As data to be delivered to an action when the number of hands is recognized, for example, there are the number of recognized hands, the number of spread hands, and the positions of recognized hands.
An example of the trigger when the device placed on the table surface is recognized is as follows:
As data to be delivered to an action when the connection of the device is recognized, for example, there are ID information of the recognized device, the position of the device, the posture of the device, and the size of the device.
Examples of the triggers when arrival of a predetermined time is recognized are as follows:
As data to be delivered to an action when a designated time arrives, for example, there is time information. As data to be delivered to an action when elapse of a predetermined time is recognized, for example, there are a starting time, an elapsed time, and a current time.
An example of the trigger when detection of a temperature is recognized is as follows:
As data to be delivered to an action when the change in temperature is detected, for example, there are a temperature change amount, an area in which the temperature is changed, an absolute temperature, and a date on which the change in temperature is detected.
An example of the trigger when detection of the concentration of carbon dioxide is recognized is as follows:
As data to be delivered to an action when the change in the concentration of carbon dioxide is detected, for example, there are a change amount of a concentration of carbon dioxide, absolute concentration, and a date on which the change in the concentration is detected.
An example of the trigger when detection of a smell is recognized is as follows:
As data to be delivered to an action when a smell is detected, for example, there are a detection amount and a date on which the smell is detected.
Examples of functions executed according to the above-described interactions by the information processing system 100 according to the embodiment of the present disclosure are as follows:
As an action when projection of a video or an image is executed, for example, there is projection of a visual effect to the projection surface. As display to be projected to the projection surface, for example, there are display of a visual effect registered in advance (an explosion effect, a glittering effect, or the like) and display of a visual effect generated by the user. An example of the visual effect may be an effect recorded in advance on the table surface, an effect drawn based on a movement trajectory of a hand or a finger on the table surface by the user, or an effect obtained by using an illustration drawn by a paint application or an image searched for and discovered on the Internet by the user.
As a use of data to be delivered from the trigger when the action of projecting the visual effect to the projection surface is executed, for example, there is a change in an amount of the visual effect in proportion to the magnitude of a sound generated in tapping on the table surface.
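One possible realization of this use of trigger data is sketched below in Python; the parameter names and the loudness-to-particle mapping are assumptions chosen only for illustration.

    # Illustrative sketch only: scaling a projected visual effect in
    # proportion to the loudness delivered by a sound trigger.
    def effect_particle_count(tap_loudness_db: float,
                              min_db: float = 40.0,
                              max_db: float = 90.0,
                              max_particles: int = 500) -> int:
        # Clamp the delivered loudness to an expected range, then map it
        # linearly to the amount (particle count) of the visual effect.
        clamped = max(min_db, min(max_db, tap_loudness_db))
        ratio = (clamped - min_db) / (max_db - min_db)
        return int(ratio * max_particles)

    print(effect_particle_count(75.0))  # a louder tap yields a larger effect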
As an action when reproduction of music is executed, for example, there is an action of reproducing a sound or music. Specifically, for example, there is an action of outputting a sound effect registered in advance or an action of outputting favorite music registered by the user. As a use of data to be delivered from the trigger when an action of reproducing a sound or music is executed, for example, there is reflection of an increase or decrease in given data to loudness of a sound.
As actions when an application is activated, for example, there are activation of a general application, activation of an application designating an argument, and activation of a plurality of applications. As a specific example of the general application, for example, there is an application manipulating a television or an application displaying a clock. As the activation of the application designating an argument, for example, there is activation of a browser designating a URL. As a specific example of the activation of the plurality of applications, for example, there is reproduction of the positions, window sizes, and inclinations of the plurality of stored applications.
As an action of imaging a photo, for example, there are an action of imaging the entire projection surface and an action of imaging a part of the projection surface. As a use of data to be delivered as the trigger when an action of imaging a photo is executed, for example, there is an action of imaging a predetermined range centering on a recognized hand.
As actions when illumination is adjusted or the brightness of a projected image is adjusted, for example, there are adjustment of the brightness and an action of turning off an illuminator. As a specific example of the adjustment of the brightness, for example, the illumination is brightened or darkened, or a starting point and an ending point are designated and movement is executed between the points in a certain time. As a use of data to be delivered from the trigger when the action of adjusting the illumination is executed, for example, there are reflection of a delivered value in the brightness and adjustment of the brightness according to ambient light.
As actions when the volume is adjusted, for example, there are adjustment of the volume and muting of the volume. As a specific example of the adjustment of the volume, for example, a sound is increased or decreased, or a starting point and an ending point are designated and movement is executed between the points in a certain time. As a use of data to be delivered from the trigger when the action of adjusting the volume is executed, for example, there are reflection of a delivered value in the volume and adjustment of the volume according to surrounding volume.
As an action when an alert is displayed, for example, there is display (projection) of an alert message. As a use of data to be delivered from the trigger when the action of displaying an alert is executed, for example, there is an output of an alert message “Manipulation may not be executed with that hand” around a newly recognized hand when the number of recognized hands exceeds a threshold value.
In this way, the relations between the function to be provoked and the provoking condition are diverse, and the user needs to be able to set the function to be provoked and the provoking condition simply. Hereinafter, examples of GUIs which are output to the projection surface by the information processing system 100 when the user is allowed to set the function to be provoked and the provoking condition will be described.
When the user is allowed to set the provoking condition, the information processing system 100 according to the embodiment of the present disclosure first outputs the GUI 3200 for allowing the user to select a channel which is to be used as the provoking condition, as illustrated in
Here, when the user selects the sound as the channel, the information processing system 100 subsequently outputs the GUI 3200 for allowing the user to select the trigger, as illustrated in
Here, when the user selects the pattern of the sound as the trigger, the information processing system 100 subsequently outputs the GUI 3200 for allowing the user to record the pattern of the sound. When the user is ready to record, he or she touches a recording button illustrated in
When the recording of the sound is completed, the information processing system 100 subsequently outputs the GUI 3200 illustrated in
When the user reproduces the recorded sound and the information processing system 100 recognizes a sound reproduced by the user as the pattern of the sound, the information processing system 100 outputs the GUI 3200 illustrated in
When the information processing system 100 recognizes the pattern of the sound, the information processing system 100 outputs the GUI 3200 illustrated in
The information processing system 100 according to the embodiment of the present disclosure can allow the user to simply set the function to be provoked and the provoking condition by outputting the GUI illustrated in
The example of the GUI 3200 output by the information processing system 100 when the sound produced by the user is registered as the trigger has been described above. Next, an example of a GUI output by the information processing system 100 when a marker placed on the projection surface by the user is registered as a trigger will be described.
When the user is allowed to set the provoking condition, the information processing system 100 according to the embodiment of the present disclosure first outputs the GUI 3200 for allowing the user to select a channel which is to be used as the provoking condition, as illustrated in
Here, when the user selects the marker as the channel, the information processing system 100 subsequently outputs the GUI 3200 for allowing the user to select the trigger, as illustrated in
Here, when the user selects the mass recognition as the trigger, the information processing system 100 subsequently outputs the GUI 3200 for allowing the user to designate an area in which the recognition is executed, as illustrated in
When the user sets the area in which the object is recognized and which triggers the provoking of the function, the information processing system 100 subsequently outputs the GUI 3200 for allowing the user to actually place the object in the area set by the user and recognizing the object, as illustrated in
When the information processing system 100 recognizes the object placed in the area in which the object is recognized, the information processing system 100 outputs the GUI 3200 indicating that the object placed on the desk surface by the user is registered as a trigger, as illustrated in
When the user is allowed to set the function to be provoked, the information processing system 100 according to the embodiment of the present disclosure first outputs the GUI 3200 for allowing the user to select a channel which is used as the function to be provoked, as illustrated in
Here, when the user selects pictures/videos as the channel, the information processing system 100 subsequently outputs the GUI 3200 for allowing the user to select an action (function to be provoked), as illustrated in
When the user sets the action, the information processing system 100 subsequently outputs the GUI 3200 for allowing the user to confirm the action set by the user, as illustrated in
When the user registers the function to be provoked and the provoking condition, as described above, it is desirable to control visibility of the function to be provoked in accordance with the effect or implication of the function to be provoked. For example, a function set up as a surprise is preferably concealed so that others do not notice the function.
As described above, the information processing system 100 can allow the user to register various provoking conditions of the functions. However, a case in which the user tries to assign the same condition to other functions is also considered. In that case, when the provoking condition that the user tries to register considerably resembles a previously registered condition, the information processing system 100 may reject the registration of the provoking condition.
For example, when the user tries to register a desk-tapping trigger whose rhythm differs only slightly from that of a registered trigger while the number of taps is the same, or when the user registers a trigger for placing an object on the desk and the reactive area overlaps a previously registered area, the data may considerably resemble the registered condition and thus have high similarity. Additionally, for example, a case in which the user tries to assign recognition of objects with only slightly different patterns, shapes, or hues to other functions is also considered. In this way, there are patterns in which the conditions have high similarity.
Accordingly, at a time point at which the similarity is proven to be sufficiently high, the information processing system 100 may display an indication reporting that the similarity is high or output a GUI prompting the user to register the provoking condition again or cancel the provoking condition.
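A minimal sketch of such a similarity check is given below in Python, assuming a tap-pattern trigger represented as a list of inter-tap intervals; the representation and the threshold are assumptions for illustration only.

    # Illustrative sketch only: rejecting a newly registered tap-pattern
    # trigger when it is too similar to a previously registered one.
    def tap_pattern_similarity(a, b):
        # a and b are lists of inter-tap intervals in seconds.
        if len(a) != len(b):
            return 0.0                      # different numbers of taps
        mean_diff = sum(abs(x - y) for x, y in zip(a, b)) / len(a)
        return max(0.0, 1.0 - mean_diff / 0.5)

    def try_register(new_pattern, registered, threshold=0.8):
        for existing in registered:
            if tap_pattern_similarity(new_pattern, existing) >= threshold:
                return False                # report high similarity; prompt re-registration
        registered.append(new_pattern)
        return True

    registered = [[0.3, 0.3, 0.6]]
    print(try_register([0.32, 0.30, 0.58], registered))  # False: too similar
    print(try_register([0.2, 0.2], registered))          # True: clearly different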
The information processing system 100 may generate a new trigger by combining a plurality of triggers registered by the user.
An example of a GUI when the user registers the function to be provoked and the provoking condition, as described above, and the function set by the user to be provoked is bound up with the provoking condition will be described.
In
The information processing system 100 according to the embodiment of the present disclosure allows the user to freely set the function to be provoked and the provoking condition, so that the user can set a program freely and simply in addition to a program by a vendor supplying an application. Accordingly, the provoking of the functions suitable for detailed circumstances on the table 140a is realized.
Users place various objects on the table 140a, and various interactions are executed there daily. The information processing system 100 according to the embodiment allows the user to freely set the function to be provoked and the provoking condition, so that interactions executed every day can be used as chances to provoke the functions by the user. Thus, it is possible to adapt experiences to a daily life of the user.
Herein, examples of a manipulation method and a mode of a window displayed by the information processing system 100 will be described.
An image, text, and other content can be displayed in the window displayed by the information processing system 100. Various kinds of content can be displayed in the window, and thus there may be cases in which not all of the content can be displayed in the region of the window. In such cases, the user browses the content by executing a manipulation of scrolling, moving, expanding, or reducing the content, and the information processing system 100 has to distinguish the manipulation on the content from a manipulation on the window in which the content is displayed. This is because, when the manipulation on the content is not correctly distinguished from the manipulation on the window, the manipulation on the window may be executed instead of the manipulation on the content even though the user intends to manipulate the content.
When an operation is switched from the content manipulation mode to the window manipulation mode, the information processing system 100 may execute display (for example, changing the color of the entire window 3300) indicating that the window 3300 enters the window manipulation mode.
When an operation is switched from the content manipulation mode to the window manipulation mode by pressing and holding a predetermined region in the window 3300 in this way, there is the advantage that erroneous manipulations by the user decrease. However, it is necessary for the user to execute a manipulation of pressing and holding the predetermined region in the window 3300.
In the method of distinguishing the manipulation on the content from the manipulation on the window 3300 by providing the outside frame around the window 3300 in this way, the user is not forced to execute a press and hold manipulation. Therefore, there is the advantage that the manipulations can be distinguished by one manipulation. However, an erroneous manipulation may occur due to interference or the like of a manipulation area because the outside frame is provided.
Accordingly, in the following description, an example will be described in which the user is not forced to execute a complex manipulation when manipulating the window 3300 displayed by the information processing system 100 and in which the possibility of an erroneous manipulation occurring decreases.
First, a movement concept of a window in which the content is scrolled and a window in which the content is not scrolled will be described.
In
In
First, when the information processing system 100 detects a touch manipulation on the window by the user (step S3301), the information processing system 100 subsequently determines whether the manipulation is a moving manipulation (step S3302).
When the touch manipulation on the window by the user is the moving manipulation (Yes in step S3302), the information processing system 100 subsequently determines whether the touch manipulation on the window by the user is a manipulation in the window (step S3303).
When the touch manipulation on the window by the user is the manipulation in the window (Yes in step S3303), the information processing system 100 subsequently determines whether the user manipulation is a manipulation with two hands or the content displayed in the window is the content which is not scrolled (step S3304).
When the user manipulation is the manipulation with two hands or the content displayed in the window is the content which is not scrolled (Yes in step S3304), the information processing system 100 subsequently executes a process of moving a manipulation target window (step S3305). Conversely, when the user manipulation is a manipulation with one hand or the content displayed in the window is the content which is scrolled (No in step S3304), the information processing system 100 subsequently executes a process of scrolling the content displayed in the manipulation target window (step S3306).
When the touch manipulation on the window by the user is not the manipulation in the window (No in step S3303), the information processing system 100 subsequently executes a process of moving the manipulation target window (step S3305).
When it is determined in the foregoing step S3302 that the touch manipulation on the window by the user is not the moving manipulation (No in step S3302), the information processing system 100 subsequently determines whether the touch manipulation on the window by the user is a rotating manipulation (step S3307).
When the touch manipulation on the window by the user is the rotating manipulation (Yes in step S3307), the information processing system 100 subsequently determines whether the touch manipulation on the window by the user is a manipulation in the window (step S3308).
When the touch manipulation on the window by the user is the manipulation in the window (Yes in step S3308), the information processing system 100 subsequently determines whether the user manipulation is a manipulation with two hands or the content displayed in the window is the content which is not rotated in the window (step S3309).
When the user manipulation is the manipulation with two hands or the content displayed in the window is the content which is not rotated in the window (Yes in step S3309), the information processing system 100 subsequently executes a process of rotating the manipulation target window (step S3310). Conversely, when the user manipulation is the manipulation with one hand or the content displayed in the window is the content which is rotated (No in step S3309), the information processing system 100 subsequently executes a process of rotating the content displayed in the manipulation target window (step S3311).
When the touch manipulation on the window by the user is not the manipulation in the window (No in step S3308), the information processing system 100 subsequently executes a process of rotating the manipulation target window (step S3310).
When it is determined in the foregoing step S3307 that the touch manipulation on the window by the user is not the rotating manipulation (No in step S3307), the information processing system 100 subsequently determines whether the touch manipulation on the window by the user is a scaling manipulation (step S3312).
When the touch manipulation on the window by the user is the scaling manipulation (Yes in step S3312), the information processing system 100 subsequently determines whether the touch manipulation on the window by the user is a manipulation in the window (step S3313).
When the touch manipulation on the window by the user is the manipulation in the window (Yes in step S3313), the information processing system 100 subsequently determines whether the user manipulation is a manipulation with two hands or the content displayed in the window is the content which is not scaled in the window (step S3314).
When the user manipulation is the manipulation with two hands or the content displayed in the window is the content which is not scaled in the window (Yes in step S3314), the information processing system 100 subsequently executes a process of scaling the manipulation target window (step S3315). Conversely, when the user manipulation is the manipulation with one hand or the content displayed in the window is the content which is scaled (No in step S3314), the information processing system 100 subsequently executes a process of scaling the content displayed in the manipulation target window (step S3316).
When the touch manipulation on the window by the user is not the manipulation in the window (No in step S3313), the information processing system 100 subsequently executes a process of scaling the manipulation target window (step S3315).
When it is determined in the foregoing step S3312 that the touch manipulation on the window by the user is not the scaling manipulation (No in step S3312), the information processing system 100 subsequently executes handling according to the application which is being executed in response to the user manipulation (step S3317). For example, as a case in which a touch manipulation on the window by the user is not moving, rotating, or scaling, there is a tap manipulation by the user. When the tap manipulation is executed by the user, the information processing system 100 may execute a process (for example, displaying an image, reproducing a video, or activating another application) on the content which is the tap manipulation target.
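The dispatch described in steps S3301 to S3317 can be summarized by the following Python sketch; the Window and Gesture types are assumptions introduced only to make the branching explicit.

    # Illustrative sketch only: distinguishing window manipulations from
    # content manipulations (steps S3301 to S3317).
    from dataclasses import dataclass

    @dataclass
    class Gesture:
        kind: str             # "move", "rotate", "scale", or anything else (e.g. "tap")
        inside_window: bool   # whether the touch is a manipulation in the window
        two_hands: bool       # whether the manipulation is executed with two hands

    @dataclass
    class Window:
        content_scrollable: bool
        content_rotatable: bool
        content_scalable: bool

    def handle_touch(window: Window, g: Gesture) -> str:
        if g.kind == "move":          # S3302
            if not g.inside_window or g.two_hands or not window.content_scrollable:
                return "move window (S3305)"
            return "scroll content (S3306)"
        if g.kind == "rotate":        # S3307
            if not g.inside_window or g.two_hands or not window.content_rotatable:
                return "rotate window (S3310)"
            return "rotate content (S3311)"
        if g.kind == "scale":         # S3312
            if not g.inside_window or g.two_hands or not window.content_scalable:
                return "scale window (S3315)"
            return "scale content (S3316)"
        return "handled by application (S3317)"

    w = Window(content_scrollable=True, content_rotatable=False, content_scalable=True)
    print(handle_touch(w, Gesture("move", inside_window=True, two_hands=False)))
    # scroll content (S3306)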
The moving manipulation, the rotating manipulation, and the scaling manipulation by the user can be executed simultaneously. In this case, for example, the information processing system 100 may determine which manipulation is the closest to a manipulation executed by the user among the moving manipulation, the rotating manipulation, and the scaling manipulation.
The example of the operation of the information processing system 100 according to the embodiment of the present disclosure has been described with reference to
The information processing system 100 may distinguish display control on a window from display control on content even for the same rotating or scaling by detecting whether two fingers are fingers of the same hand or different hands.
Similarly, when the user executes a scaling manipulation with one hand, as illustrated in (1) of
When the content that is not scrolled is displayed in the window, the content is scrolled in the window by rotating or scaling the content in some cases. As an application capable of displaying such a window, for example, there is an application drawing an illustration or an application displaying a map.
In such a window, the information processing system 100 may allow the rotated or scaled window to transition to a process for a window in which the content is scrolled.
When the content which is not scrolled is displayed in the window and a rotating manipulation is executed on the content by the user, a scroll margin is generated, as illustrated in (3) of
A modification example of the rotating manipulation on the window will be described.
A case in which the user rotates the window 180 degrees when the information processing system 100 executes such display control will be described. As illustrated in (2) of
Next, a display control example according to a user manipulation on the window in which the content which is scrolled is displayed will be described.
When the user executes a manipulation inside the window in this state, the information processing system 100 allows the manipulation to operate on the content displayed in the window, as illustrated in (2) of
On the other hand, when the user executes a manipulation on the edge of the window, the information processing system 100 displays a window handle for manipulating the window 3300 around the window 3300, as illustrated in (3) of
In
In
In
Next, a display control example in which the window is rotated according to a user manipulation will be described.
The information processing system 100 may execute display control such that an outside frame is provided around the window and the window is moved while the window is rotated according to a moving manipulation on the outside frame.
The information processing system 100 may execute display control such that the window is moved while the window is rotated by a special gesture by the user. For example, the information processing system 100 may execute display control on the window by assuming that a manipulation other than a tap manipulation on the content is a manipulation on the window.
The information processing system 100 may execute display control on the window by assuming that a one-hand manipulation on the window in which the content is scrolled, as described above, is a manipulation on the content and that a manipulation on the window with both hands is a manipulation on the window. Alternatively, the information processing system 100 may display an outside frame around the window only for a predetermined time at the time of a manipulation and assume that a manipulation on the outside frame is a manipulation on the window.
Next, an example of a process when there is a window outside a screen will be described.
When the user removes his or her finger from the window 3300 after executing a moving manipulation of moving the finger holding the window 3300 to a predetermined region of the screen (display region) by a force stronger than the above-described reaction, the information processing system 100 may execute display control such that the window 3300 is closed or the window 3300 is minimized, as in
The information processing system 100 may execute display control using the law of inertia at the time of display control of the window.
Next, a display control example when windows interfere with each other will be described.
When the area of a region in which two windows overlap is equal to or less than a predetermined amount, or the ratio of the overlapping region is equal to or less than a predetermined value, the information processing system 100 may execute display control such that the windows merely overlap without executing special control, as illustrated in (1) of
For example, as illustrated in (3) and (4) of
When the user moves a window to overlap another window, the information processing system 100 executes display control such that the other window moves out of the way in an animated manner in real time. When the other window is moved to an end of the screen (display region), the information processing system 100 may execute display control such that the size of the other window is decreased.
When the other window moving out of the way for the window manipulated to be moved by the user is moved to the end of the screen (display region) in this way, the information processing system 100 may execute display control such that the other window is minimized or erased at a time point at which the user removes his or her finger. When the other window that is moving is moved to the end of the screen (display region) in this way, the information processing system 100 may execute display control such that the other window is rotated.
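As a rough illustration of such interference control, the following Python sketch decides, from the overlapping area of two rectangular windows, whether to leave them overlapping or to move the other window out of the way; the rectangle model and the threshold are assumptions for illustration only.

    # Illustrative sketch only: window interference control based on the
    # ratio of the overlapping region to the other window's area.
    from dataclasses import dataclass

    @dataclass
    class Rect:
        x: float
        y: float
        w: float
        h: float
        def area(self) -> float:
            return self.w * self.h

    def overlap_area(a: Rect, b: Rect) -> float:
        dx = min(a.x + a.w, b.x + b.w) - max(a.x, b.x)
        dy = min(a.y + a.h, b.y + b.h) - max(a.y, b.y)
        return dx * dy if dx > 0 and dy > 0 else 0.0

    def interference_action(moved: Rect, other: Rect, ratio_threshold: float = 0.3) -> str:
        ratio = overlap_area(moved, other) / other.area()
        if ratio <= ratio_threshold:
            return "merely overlap (no special control)"
        return "move the other window out of the way"

    print(interference_action(Rect(0, 0, 4, 3), Rect(3, 2, 4, 3)))
    # merely overlap (no special control)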
When the position of the user can be detected by the information processing system 100, the information processing system 100 may control motions of windows according to the position of the user.
At this time, the information processing system 100 may execute display control such that only a window bound up with the user approaching the screen (display region) is automatically rotated to face the user from the user's viewpoint and approaches the user. The information processing system 100 may determine whether the window and the user are bound up together, for example, by determining whether the user and the window are originally set to be bound up together or determining whether the user is the last user to have touched the window.
When the position of the user can be detected by the information processing system 100, the information processing system 100 may control granularity of content to be displayed in a window according to the position of the user.
The information processing system 100 may control, for example, an image size or font sizes of letters as the granularity of the content displayed in the window. That is, the information processing system 100 may decrease the image size or the font size in a window close to the user and may increase the image size or the font size in a window far from the user.
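A minimal sketch of this granularity control follows; the distance range and font sizes are assumptions for illustration only.

    # Illustrative sketch only: choosing a font size from the distance
    # between the user and the window, so that content in a window far
    # from the user is displayed with larger letters.
    def font_size_for_distance(distance_m: float,
                               near_m: float = 0.3, far_m: float = 1.5,
                               min_pt: int = 12, max_pt: int = 36) -> int:
        clamped = max(near_m, min(far_m, distance_m))
        ratio = (clamped - near_m) / (far_m - near_m)
        return int(min_pt + ratio * (max_pt - min_pt))

    print(font_size_for_distance(0.4))  # window close to the user: small letters
    print(font_size_for_distance(1.4))  # window far from the user: large letters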
Next, a display control example when a window interferes with a real object placed on the projection surface will be described.
When the information processing system 100 executes the display control such that the position of the window 3300 is automatically moved to the position at which the window 3300 does not interfere with the object 3310, as illustrated in (3) and (4) of
Here, linking between the information processing system 100 and another device and an example of a GUI output by the information processing system 100 at the time of the linking will be described.
As described above, for example, when a form in which the information processing system 100 according to the embodiment of the present disclosure projects information to a table and causes a user to manipulate the information is adopted, as illustrated in
However, when a plurality of users own portable terminals that are substantially the same, place the portable terminals on a table simultaneously and individually, and cause the information processing system 100 to recognize the portable terminals, the information processing system 100 may not be able to determine which portable terminal it is better to link to the information processing system 100.
When the linking with the device is executed without using object recognition, information regarding a positional relation between the information processing system 100 and the device to be linked is unusable. Accordingly, when the linking with the device is executed without using object recognition, the device is handled similarly regardless of the location at which the linking with the information processing system 100 starts. For example, when a plurality of users can simultaneously use information omnidirectionally and shared information is all displayed in the same direction or at the same position, the direction of the information is opposite to the direction of the users in some cases. Thus, it may be difficult for the users to handle the information.
Accordingly, the information processing system 100 capable of easily specifying the portable terminal to be linked even when a plurality of users own substantially the same portable terminals and place them on the table simultaneously and individually as described above has been described. The information processing system 100 capable of displaying shared content omnidirectionally, at a position and in a direction at and in which each of the manipulating users can easily use the content, has also been described.
In the information processing system 100 according to the embodiment of the present disclosure, a portable terminal such as a smartphone placed on a table can be linked by executing, for example, the operation illustrated in
In the information processing system 100 according to the embodiment of the present disclosure, a connection mark can be displayed on the screen by executing, for example, the operation illustrated in
Hereinafter, an example of a GUI output by the information processing system 100 when the information processing system 100 is linked to a portable terminal such as a smartphone will be described.
When the mode proceeds to the recognition mode, as illustrated in (2) of
When the mode proceeds to the recognition mode, as illustrated in (3) of
When the object is removed after the recognition of the object, the information processing system 100 displays a GUI for indicating connection with the object.
In
When the object is taken away after the recognition of the object, the information processing system 100 displays a GUI for allowing the user to execute a process linked to the object.
In
In
Of course, the information processing system 100 can output various GUIs in addition to the GUI 3400 illustrated in
Next, an example of a GUI for improving operability or visibility of content stored by a linked device will be described.
For example, when a book, a magazine, a dish, or another object has already been placed on a table, the information processing system 100 may recognize the placed object and output its icon, its thumbnail, or the like, avoiding the object. In
When the information processing system 100 outputs a GUI for manipulating content owned by another device after the other device is linked, the information processing system 100 may change the GUI in which an icon, a thumbnail, or the like maintained by the linked device is output according to a location in which the device is placed. In
For example, the information processing system 100 normally displays a window displaying the content on the right side of the image which is the same as the marker for recognition. When the device is placed at the right end of the display region, the window displaying the content is displayed on the left side of that image instead. The information processing system 100 changes the GUI according to the location in which the device is placed in this way, and thus it is possible to improve operability for the user.
Next, an example of a GUI regarding an end timing of object recognition will be described.
The information processing system 100 recognizing the object placed on a table or the like may output a GUI such as a window immediately after the recognition. In
The information processing system 100 recognizing the object placed on the table or the like may continue the recognition process while the object is placed on the table. In
When the content taken from the device to be shared is manipulated, it is not necessary to execute the process of recognizing the object. Therefore, to reduce a calculation cost of the process of recognizing the object, the information processing system 100 may stop the recognition process when the object is taken away from the table.
The information processing system 100 recognizing the object placed on the table or the like may stop the recognition process at a time point at which the object is recognized and may output the window or the like according to the recognized object. In
The information processing system 100 recognizing the object placed on the table or the like can considerably reduce the calculation cost of the process of recognizing the object by stopping the recognition process at the time point at which the object is recognized.
Next, an example of a GUI for releasing the connection with the linked device will be described.
In
In
In
When the connection between the information processing system 100 and the device is released, the information processing system 100 erases the displayed connection mark in any GUI 3400 illustrated in
Of course, the GUIs output by the information processing system 100 when the information processing system 100 and the portable terminal are linked are not limited to the above-described GUIs.
The present example is an embodiment of the above-described specific example 2. More specifically, the present example focuses on sound, and display for optimizing a sound input and output for the user is executed. Hereinafter, the projection type information processing system 100a will be assumed in the description. However, any type of information processing system described with reference to
(Overview)
Accordingly, as illustrated in
The microphone icon 4102 may be displayed immediately below or near the microphone or may be displayed in the middle, the vicinity or any position of the screen or the table 140a. Hereinafter, an example in which various kinds of display of the beamforming range, volume, and the like are executed using the microphone icon 4102 as a center point will be described, but the present example is not limited to this example. For example, display other than the microphone icon 4102 may be the center point or an object indicating the center point may not be displayed. For example, various kinds of display may be executed using any position on the table 140a as the center point or various kinds of display may be executed using a position immediately below or near the microphone as the center point.
Further, the information processing system 100 according to the present example displays information indicating the beamforming range. Accordingly, the user can know that the beamforming related to the sound input is executed. For example, when the user is located in the beamforming range, the user can know that there is no need to shout for the sound not to be buried in ambient noise. Accordingly, the psychological burden on the user is reduced. Further, since the user does not shout, the physical burden is reduced. When the user is not located in the beamforming range, the user can know why the sound uttered by the user is not recognized. Thus, the psychological burden on the user is reduced. Further, the user can move inside the beamforming range. In this case, the precision of the sound recognition by the information processing system 100 is improved.
The sound input has been described in detail in this section, but a similar application may also be achieved in any other output such as a sound output. For example, in the information processing system 100, a speaker may execute the beamforming to display information indicating a sound output range. In this case, the user can know that the beamforming related to the sound output is executed, and thus can adjust the volume appropriately and move into the beamforming range. A similar application may be achieved in any input other than the sound input.
The overview of the information processing system 100 according to the present example has been described above. Next, a specific configuration of the information processing system 100 according to the present example will be described.
The input unit 110 according to the present example has a function as a sound input unit. For example, the function of the input unit 110 as the sound input unit is realized by a mic (microphone). In particular, in the present example, a mic array in which a plurality of mics are combined is realized as a mic capable of executing the beamforming. The input unit 110 may include a mic amplifier circuit and an A-to-D converter that amplify and digitize the sound signal obtained by the mic, and a signal processing circuit that executes processes such as noise removal or sound source separation on the sound data. The input unit 110 outputs the processed sound data to the control unit 120.
The detection unit 121 according to the present example has a function as an input control unit controlling directivity of the mic which functions as the sound input unit. For example, as described above with reference to
The detection unit 121 has a user position estimation function of estimating the position of the user executing a sound input. The detection unit 121 can estimate the position of the user by various kinds of means. Hereinafter, examples of the user position estimation function will be described with reference to
As described above, the detection unit 121 estimates the user position using the touch on the sound input start object 4108 as an opportunity, but this function is not limited to the example. Hereinafter, another example of the user position estimation function will be described with reference to
Additionally, for example, the detection unit 121 may estimate the user position using the fact that the user is pulling the sound input start object 4108 toward his or her hand as an opportunity. In this case, the detection unit 121 can estimate that the user is located near the position to which the sound input start object 4108 is pulled or on an extension line of a direction from the mic icon 4102 to the sound input start object 4108.
The examples of the user position estimation function have been described above.
The detection unit 121 controls the sound input unit such that the estimated user position is included in the beamforming range, to form the directivity. The detection unit 121 may control the sound input unit such that a plurality of beamforming ranges are formed. For example, when the beamforming range is formed for each of the plurality of users, it is possible to improve the precision of the sound recognition of the sound input from each user.
The detection unit 121 may update the range of the formed directivity according to an update of the estimation result of the user position. For example, the detection unit 121 causes the beamforming range to track the user when the user moves. Additionally, when the estimation result of the user position is stabilized, it is predicted that the user remains at the same position and that the estimation result is correct, and therefore the detection unit 121 may narrow the beamforming range. In such a case, it is possible to improve the precision of the sound recognition. The detection unit 121 may also change the range of the directivity of the sound input unit based on a user input. For example, the detection unit 121 vertically or horizontally moves, broadens, or narrows the range of the directivity formed by the sound input unit according to a user input.
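The directivity control described above can be sketched as follows in Python; the coordinate model, angles, and widths are assumptions for illustration only.

    # Illustrative sketch only: forming a beamforming range (center angle
    # and width) toward the estimated user position, and narrowing the
    # range once the position estimate has stabilized.
    import math

    def beamforming_range(mic_xy, user_xy, estimate_stable: bool):
        dx = user_xy[0] - mic_xy[0]
        dy = user_xy[1] - mic_xy[1]
        center_deg = math.degrees(math.atan2(dy, dx))  # direction toward the user
        width_deg = 30.0 if estimate_stable else 60.0  # narrower when stable
        return center_deg, width_deg

    print(beamforming_range((0.0, 0.0), (1.0, 1.0), estimate_stable=False))  # (45.0, 60.0)
    print(beamforming_range((0.0, 0.0), (1.0, 1.0), estimate_stable=True))   # (45.0, 30.0)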
The output control unit 122 according to the present example has a function as a display control unit that controls display indicating the range of the directivity (beamforming range) formed when the sound input unit executes the beamforming. Specifically, the output control unit 122 controls the output unit 130 such that information indicating the beamforming range of the sound input unit is displayed. This display may be the same as the range of the directivity formed by the detection unit 121 or may be different from the range of the directivity. For example, the output control unit 122 may execute display indicating a range deformed by expanding, simplifying, or scaling the range of the directivity formed by the detection unit 121. The output control unit 122 may control display indicating the beamforming range of the sound output unit or any other input or output unit. The output control unit 122 may control any other output indicating the beamforming range in addition to the display. Hereinafter, a specific example of the display indicating the beamforming range will be described.
The output control unit 122 may reflect the position of the user estimated by the detection unit 121 in at least one of the position or the shape of the display indicating the beamforming range. For example, the output control unit 122 may execute display indicating the beamforming range near the estimated user position or may execute display indicating the beamforming range in a shape spreading or narrowing toward the estimated user position. Hereinafter, an example of a user interface according to the user position according to the present example will be described with reference to
Various UIs indicating the beamforming range are considered. Variations of the UIs will be described in detail below.
The output control unit 122 may control display indicating volume of a sound obtained by the sound input unit. Accordingly, the user can know the volume of the input sound of the user and can also compare the volume to the volume of ambient noise. Hereinafter, an example of a user interface indicating volume according to the present example will be described with reference to
The output control unit 122 may simultaneously display information indicating the range of the directivity and information indicating volume. For example, the output control unit 122 may simultaneously execute display indicating the beamforming range and display indicating volume on the same display surface. Hereinafter, an example of a user interface when the display indicating the beamforming range and the display indicating the volume are simultaneously displayed will be described with reference to
The output control unit 122 may display information regarding the volume of the sound obtained inside the range of the directivity and information indicating the volume of the sound obtained outside the range of the directivity by distinguishing the display methods. For example, the output control unit 122 may reflect information indicating the inside or the outside of the beamforming range in the display indicating the volume. Specifically, the output control unit 122 executes the display indicating the volume by distinguishing display methods such as hue, shade, height in stereoscopic display, and a broken line or a solid line, depending on whether the volume is that of a sound obtained inside the beamforming range or that of a sound obtained outside the beamforming range. From another viewpoint, the output control unit 122 may reflect the information indicating the volume in the display indicating the beamforming range. In this way, the output control unit 122 can execute 3-dimensional display in which an axis of the volume is added to 2-dimensional display indicating a position range such as the beamforming range. In this case, since two meanings of the beamforming range and the volume can be expressed with one kind of display, the display region on the display surface can be saved and the user can understand the display more easily.
Various UIs in which the display indicating the beamforming range and the display indicating the volume are simultaneously executed are considered. Variations of the UIs will be described in detail below.
The output control unit 122 may display information indicating a result of the sound recognition based on the sound acquired by the sound input unit. For example, the output control unit 122 may activate an application such as a browser or may execute a sound input, such as a search word input, on an application based on the sound acquired by the sound input unit. Hereinafter, an example of a user interface related to the display indicating a result of the sound recognition will be described with reference to
The output control unit 122 may execute the display indicating the beamforming range a plurality of times or may dynamically change the display indicating the beamforming range. For example, when the plurality of beamforming ranges are formed by the detection unit 121, the output control unit 122 executes the display indicating the beamforming range a plurality of times accordingly. When the beamforming range is dynamically changed by the detection unit 121, the output control unit 122 can accordingly change the beamforming range. Variations of the UIs will be described in detail below.
The output unit 130 according to the present example has a function as a display unit that displays an image. For example, as illustrated in
The example of the configuration of the information processing system 100 which is characteristic of the present example has been described above.
(Variations of UI)
Display Indicating Beamforming Range
Hereinafter, a variation of a UI according to the present example will be described. First, a variation of a user interface related to display indicating the beamforming range will be described with reference to
The variations of the user interfaces related to the display indicating the beamforming range have been described above.
Display Indicating Beamforming Range and Volume
Next, variations of user interfaces related to simultaneous display of the display indicating the beamforming range and the display indicating the volume will be described with reference to
The variations of the user interfaces related to the simultaneous display of the display indicating the beamforming range and the display indicating the volume have been described above.
Display Indicating Plurality of Beamforming Ranges
Next, variations of user interfaces related to a plurality of displays of the display indicating the beamforming range will be described with reference to
Here, as in the example illustrated in
Here, when there are a plurality of applications receiving sound inputs, a plurality of mic icons may be displayed. Hereinafter, a variation of a user interface related to the display of the plurality of mic icons will be described with reference to
Here, as illustrated in
The variations of the user interfaces related to the display indicating the plurality of beamforming ranges have been described above.
Dynamic Change of Display Indicating Beamforming Range
Next, variations of user interfaces related to a dynamic change of display indicating the beamforming range will be described with reference to
The examples in which the display indicating the beamforming range is changed on the side of the information processing system 100 have been described above, but the present technology is not limited to the examples. For example, the user can change the display indicating the beamforming range. For example, when the displays indicating the plurality of beamforming ranges illustrated in
In the examples illustrated in
The variations of the UIs according to the present example have been described above. Next, an operation process executed in the information processing system 100 according to the present example will be described with reference to
(Operation Process)
As illustrated in
Subsequently, in step S4104, the detection unit 121 executes the beamforming. For example, the detection unit 121 controls the sound input unit such that the directivity is formed so that the estimated user position is included in the beamforming range.
Subsequently, in step S4106, the output control unit 122 outputs the display indicating the beamforming range. For example, as described above with reference to
Subsequently, in step S4108, the output control unit 122 updates the display indicating the beamforming range. For example, as described above with reference to
Subsequently, in step S4110, the output control unit 122 determines whether the display indicating the beamforming range ends. For example, the output control unit 122 determines whether the display indicating the beamforming range ends based on whether the user executes a manipulation of designating the end of the application or the display indicating the beamforming range.
When it is determined that the display does not end (No in S4110), the process proceeds to step S4108 again. When it is determined that the display ends (Yes in S4110), the process ends.
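The flow of steps S4102 to S4110 can be summarized by the following Python sketch; the helper callables are placeholders introduced only for illustration.

    # Illustrative sketch only: the operation process of estimating the
    # user position (S4102), executing the beamforming (S4104), outputting
    # the display indicating the range (S4106), and updating it (S4108)
    # until an end condition is met (S4110).
    def run_beamforming_display(estimate_user_position, form_beam,
                                show_range, update_range, should_end):
        position = estimate_user_position()        # S4102
        beam = form_beam(position)                 # S4104
        show_range(beam)                           # S4106
        while not should_end():                    # S4110
            position = estimate_user_position()
            update_range(form_beam(position))      # S4108

    # Example wiring with trivial stand-ins:
    remaining = iter([False, False, True])
    run_beamforming_display(lambda: (0.5, 0.2),
                            lambda pos: {"center": pos, "width_deg": 60},
                            lambda beam: print("show", beam),
                            lambda beam: print("update", beam),
                            lambda: next(remaining))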
The operation process executed in the information processing system 100 according to the present example has been described above.
The present example is an embodiment of the above-described specific example 9. In this example, details of an internal process according to the specific example 9 are described with reference to
(Overview)
In
In the projection type information processing system 100a, a real object is placed on the table 140a or an existing window is displayed in some cases, and the user position is also variable. Therefore, when an appropriate display region is not decided, a situation may occur in which the activation window is far from the user and out of reach, the user must expend effort to adjust the activation window so that it is easily visible, or the existing window is hidden by the activation window.
Accordingly, the information processing system 100 according to the present example appropriately decides the display region of the activation window based on the relation among the real object, the display object, and the user. Accordingly, since the activation window is displayed in a location in which the user can easily execute a manipulation, the user can manipulate the activation window immediately after the activation. When the activation window is displayed to face the user, the user can confirm information regarding the activation window without executing any manipulation. Further, when the activation window is displayed so that the existing window is not covered and hidden, the activation window is prevented from interfering with a manipulation on the existing window. In the present example, convenience for the user is improved in this way.
(1) Input Unit 110
The input unit 110 according to the present example is realized by, for example, a camera acquiring an image (a still image or a moving image) and a stereo camera acquiring depth information. For example, the input unit 110 outputs the acquired captured image and the acquired depth information to the control unit 120. The input unit 110 may be realized by a touch sensor provided on the table 140a or any input device such as a remote controller. The input unit 110 can acquire a user manipulation such as a sound input or a touch manipulation and output the user manipulation to the control unit 120.
(2) Output Unit 130
The output unit 130 according to the present example has a function as a display unit that displays an image. For example, as illustrated in
(3) Control Unit 120
The control unit 120 according to the present example executes various processes to decide a display region of an application in the relation among the real object, the existing display object, and the user. As illustrated in
(3-1) Real Object Recognition Unit 4201
The real object recognition unit 4201 has a function of recognizing a real object on the table 140a (display surface). For example, the real object recognition unit 4201 recognizes the presence region of the real object on the table 140a from the depth information output from the input unit 110. Hereinafter, an example of a real object recognition function of the real object recognition unit 4201 will be described with reference to
The real object recognition unit 4201 may recognize content of the real object. For example, the real object recognition unit 4201 can recognize which real object is present on the table 140a by recognizing an image of a portion corresponding to the presence region of the real object in a captured image obtained by imaging a state on the table 140a.
(3-2) Display Object Recognition Unit 4202
The display object recognition unit 4202 has a function of recognizing an existing display object which has already been displayed on the table 140a. For example, the display object recognition unit 4202 monitors a display control process by the output control unit 122 and recognizes a display region and content (a corresponding application) of the existing display object.
(3-3) User Recognition Unit 4203
The user recognition unit 4203 has a function of recognizing a user who is an application activation subject. For example, the user recognition unit 4203 recognizes the position and the direction of the user touching the activation instruction object 4224 based on a captured image output from the input unit 110. Various recognition processes by the user recognition unit 4203 are considered. The user recognition process will be described in detail below with reference to
(3-4) Output Control Unit 122
The output control unit 122 functions as a display control unit that decides a display region of the activation window and controls the output unit 130 such that the activation window is displayed in the decided display region. Various display region deciding processes by the output control unit 122 are considered. In the present specification, two examples of the display region decision process will be described.
For example, the output control unit 122 decides a display region of a display object (activation window) of an application to be activated based on at least one of the relation with the real object, the relation with the existing display object, and the relation with the user. For example, the output control unit 122 decides the display region of the activation window from at least one of the position, the size, and the angle. For example, the output control unit 122 decides a region in which the existing display object and the real object do not overlap as a window activatable area. The window activatable area is an area inside which the display region of the activation window can be decided. The output control unit 122 decides, as the display region of the activation window, a region which is inside the window activatable area and which is located at a position close to the user, has a size at which it does not overlap the existing display object or the real object, and has an angle at which it faces the user.
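A minimal sketch of this first decision process is given below in Python; the rectangle model, candidate set, and distance criterion are assumptions for illustration only, and the angle toward the user is omitted for brevity.

    # Illustrative sketch only: deciding the display region of the
    # activation window inside the window activatable area (the region
    # not overlapping real objects or existing display objects), at the
    # candidate position closest to the user.
    from dataclasses import dataclass

    @dataclass
    class Rect:
        x: float
        y: float
        w: float
        h: float
        def intersects(self, other: "Rect") -> bool:
            return (self.x < other.x + other.w and other.x < self.x + self.w and
                    self.y < other.y + other.h and other.y < self.y + self.h)

    def decide_display_region(candidates, real_objects, existing_windows, user_xy):
        def blocked(c: Rect) -> bool:
            return any(c.intersects(r) for r in real_objects + existing_windows)
        def distance_to_user(c: Rect) -> float:
            cx, cy = c.x + c.w / 2, c.y + c.h / 2
            return ((cx - user_xy[0]) ** 2 + (cy - user_xy[1]) ** 2) ** 0.5
        activatable = [c for c in candidates if not blocked(c)]  # window activatable area
        return min(activatable, key=distance_to_user) if activatable else None

    candidates = [Rect(0, 0, 2, 1.5), Rect(3, 0, 2, 1.5), Rect(0, 2, 2, 1.5)]
    print(decide_display_region(candidates, [Rect(0.5, 0.5, 1, 1)], [], (3.5, 0.5)))
    # Rect(x=3, y=0, w=2, h=1.5): closest unblocked candidate to the user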
Relation with Real Object
The output control unit 122 decides the display region of the activation window displayed on the display surface according to information regarding the real object on the display surface. For example, the output control unit 122 decides the display region of the activation window so that the display region of the activation window does not overlap the real object. Additionally, the output control unit 122 may decide the display region of the activation window so that the display region of the activation window overlaps the real object.
The information regarding the real object is, for example, not only information regarding the presence region of the real object but also information including attribute information regarding the real object. The attribute information can include various kinds of information. For example, the attribute information may include information regarding the difficulty in moving the real object. The difficulty in moving the real object can be calculated from, for example, a movement history, the weight, and the size of the real object. For example, when the difficulty in moving the real object is considerable, the output control unit 122 decides the display region of the activation window so that the display region of the activation window does not overlap the real object. Conversely, when the difficulty in moving the real object is small, the output control unit 122 decides the display region of the activation window so that the display region of the activation window overlaps the real object and prompts the user to move the real object. The attribute information may include, for example, information regarding a relation with the activation window. For example, when the activation window is an activation window related to the real object, such as display of information for explaining food on the table 140a, the output control unit 122 decides the display region so that the display region is displayed near the real object or overlaps the real object. Accordingly, it is possible to improve convenience for the user.
Hereinafter, the display region decided based on the relation with the real object will be described with reference to
Relation with Existing Display Object
The output control unit 122 may decide the display region of the activation window further according to information regarding another display object (existing display object) which has already been displayed on the display surface.
For example, the output control unit 122 may decide the display region of the activation window in front of an existing display object. Accordingly, the user can view and manipulate the activation window immediately after the activation. The output control unit 122 may decide the display region of the activation window so that the display region of the activation window does not overlap the display region of the existing display object. In this case, the user can view and manipulate the activation window immediately after the activation and can also manipulate the existing window. The output control unit 122 may change the display region of the existing window so that the display region of the existing window does not overlap the display region of the activation window. Accordingly, for example, since the existing display object moves so that a location is cleared out for the activation window, the output control unit 122 can display the activation window in a region in which the user can more easily execute a manipulation.
The information regarding the existing display object includes, for example, information indicating the display region of the existing display object and information regarding the relation with the activation window. For example, when information related to the existing display object is included in the activation window, the output control unit 122 decides the display region of the activation window near the existing display object. Accordingly, it is possible to improve convenience for the user.
Hereinafter, the display region decided based on the relation with the existing display object will be described with reference to
Relation with User
The output control unit 122 may decide the display region of the activation window further according to information regarding the user to whom the display object of the activation window is to be displayed. The information regarding the user includes, for example, information indicating at least one of the position and the direction of the user. For example, the output control unit 122 can decide the display region of the activation window at a position close to the user inside the window activatable area and at an inclination at which the display region of the activation window faces the user according to the direction of the user.
Hereinafter, the display region decided based on the relation with the user will be described with reference to
Combination
The output control unit 122 may decide the display region of the activation window 4221 by combining at least one of the relation with the real object, the relation with the existing display object, and the relation with the user, as described above. Hereinafter, a display region decided by combining the relation with the real object, the relation with the existing display object, and the relation with the user will be described with reference to
The examples of the display region decision process for the activation window by the output control unit 122 have been described above.
For example, the output control unit 122 may decide the display region using an evaluation function evaluating a candidate of the display region of the activation window. The evaluation function can be designed so that the content described above in regard to the relation with the real object, the relation with the existing display object, and the relation with the user is reflected. More specifically, the following factors of the evaluation function are considered, for example.
The superimposition evaluation value with the existing window is an evaluation value in regard to superimposition of the activation window and the existing window. When a plurality of existing windows are displayed, the superimposition evaluation value with the existing windows can be calculated as a statistical value such as a total sum or an average of the evaluation values calculated for the individual existing windows. When calculating the superimposition evaluation value with the existing window, the output control unit 122 may lower the degree of influence of an old existing window on the evaluation by multiplying the superimposition evaluation value by (1/index). Here, index is an index (sequence) assigned to the existing window; 1 is given to the newest window, and a larger value is given to an older window. When calculating the evaluation value, the output control unit 122 may also consider the mutual compatibility of the applications.
The superimposition evaluation value with the real object is an evaluation value in regard to superimposition of the activation window and the real object. When there are a plurality of real objects, the superimposition evaluation value with the real objects can be calculated as a statistical value such as a total sum or an average of the evaluation values calculated for the individual real objects. When calculating the evaluation value, the output control unit 122 may consider the shape of the real object, such as its height and area, or a previous movement history.
The distance evaluation value from a user position is an evaluation value in regard to a distance between the activation window and the user position. When the distance is closer, a better evaluation value is calculated.
The distance evaluation value from a user side is an evaluation value in regard to the distance between the activation window and the user side. When the distance is closer, a better evaluation value is calculated.
The distance evaluation value with a touch position is an evaluation value in regard to the distance between the activation window and the touch position. When the distance is closer, a better evaluation value is calculated.
The coincidence evaluation value with a finger direction is an evaluation value in regard to coincidence between the direction of the activation window and the direction of the touching finger. When the directions are more coincident, a better evaluation value is calculated. For example, when the directions are coincident, 1 can be calculated. When the directions are not coincident, 0 can be calculated.
The examples of the factors of the evaluation function have been described above. The output control unit 122 can design the evaluation function as the following expression by weighting the factors.
Evaluation value=Superimposition evaluation value with existing window×30+
superimposition evaluation value with real object×200+
distance evaluation value from user position×1+
distance evaluation value from user side×1+
distance evaluation value from touch position×1+
coincidence evaluation value with finger direction×10
Any weighted parameters can be set, and any function design (parameter design) can be used for the factors. The output control unit 122 may learn various parameters in the evaluation function. The output control unit 122 may also set parameters of the evaluation function according to a user manipulation. Accordingly, the user can set the parameters so that the activation window is displayed in a display region the user prefers. For example, by setting good compatibility between applications that are frequently used side by side, the user can have those applications activated adjacent to each other.
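The following is a minimal sketch of such a weighted evaluation; only the weights (30, 200, 1, 1, 1, 10) come from the example expression above, while the factor functions themselves are hypothetical placeholders supplied by the caller.

from typing import Any, Callable, Dict

DEFAULT_WEIGHTS: Dict[str, float] = {
    "superimposition_with_existing_window": 30,
    "superimposition_with_real_object": 200,
    "distance_from_user_position": 1,
    "distance_from_user_side": 1,
    "distance_from_touch_position": 1,
    "finger_direction_coincidence": 10,
}

def evaluate(candidate: Any,
             factors: Dict[str, Callable[[Any], float]],
             weights: Dict[str, float] = DEFAULT_WEIGHTS) -> float:
    """Weighted sum of the factor evaluation values for one candidate display
    region; higher is better, following the convention used in the text."""
    return sum(weights[name] * factor(candidate)
               for name, factor in factors.items())

# Example usage with dummy factors that all return 1.0:
dummy_factors = {name: (lambda c: 1.0) for name in DEFAULT_WEIGHTS}
print(evaluate(candidate=None, factors=dummy_factors))  # 30+200+1+1+1+10 = 244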
The output control unit 122 decides a candidate with the best evaluation value calculated by the evaluation function among candidates for the display region as the display region of the activation window. For example, the output control unit 122 can search for the display region with the best evaluation value by adopting any scheme such as a hill-climbing method or a genetic algorithm. In the present specification, the examples in which higher evaluation values are set to be better have been described, but lower evaluation values may also be set to be better. That is, the searching of the display region by the output control unit 122 may be a minimization problem or a maximization problem.
Hereinafter, specific design of the evaluation function will be described. For example, the output control unit 122 can use the following formula 1 as the evaluation function.
In the foregoing formula, appm means an m-th application window and indicates the activation window. Further, appn means an n-th application window and indicates an existing window, objn means an n-th real object, and p means the display region of the activation window. The foregoing formula 1 calculates an evaluation value for the case in which p is used as the display region of the activation window.
The first term of the foregoing formula 1 is an evaluation term regarding the relation with the existing window. Further, wapp is a weighted parameter, and Napp indicates the number of existing windows. A total sum calculation target is an evaluation function in regard to the compatibility and the distance between the activation window appm and the existing window appn, and is defined by the following formula, for example.
Here, {Napp−(n−1)}/Napp is an evaluation value in regard to the display sequence, and indicates that the value is lower as the display sequence is earlier. A lower index n is given to an existing window that is newer in the display sequence.
Here, c (appm, appn) is an evaluation value in regard to the compatibility between the activation window appm and the existing window appn. For example, the better the compatibility is, the higher the value is. As the compatibility, for example, the compatibility between a photo file and a photo edit application is considered to be good and the compatibility between a music file and a music player is considered to be good. The evaluation value in regard to the compatibility can be decided as shown in the following table which is an example of a compatibility table.
Table 2 is an example of the compatibility table for deciding evaluation values in regard to the compatibility. The evaluation values in regard to the compatibility include a portion which is decided statically and a portion which is dynamically changed. The portion which is statically decided is decided by, for example, compatibility between applications decided in advance at the time of installation or whether the same media (files) can be handled. The portion which is dynamically changed is changed according to, for example, whether the same user (the same hand or a hand coming from the same direction) activates an application or compatibility indicated by a previous manipulation history. Additionally, as an example of the portion which is dynamically changed, the evaluation value can increase when one application is activated and the other application is subsequently activated within a predetermined time or when one application is activated and subsequently approaches the other application within a predetermined time. Conversely, the evaluation value can decrease when one application is activated and is subsequently distanced from the other application within a predetermined time.
In the example shown in Table 2, the evaluation value in regard to compatibility between a still image edit application and a photo application becomes “3,” obtained by dynamically adding “1” to the static “2.” Further, the evaluation value in regard to compatibility between a moving image edit application and the photo application becomes “2,” obtained by dynamically adding “1” to the static “1.” The evaluation value in regard to compatibility between the moving image edit application and the still image edit application becomes “0,” obtained from the static “0” by dynamically adding “1” and then subtracting “1.”
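A minimal sketch of such a compatibility table follows; the application names and the +1/−1 update amounts mirror the Table 2 walkthrough above, while the function names and the dictionary representation are illustrative assumptions.

# Compatibility table with a statically decided portion and a dynamically
# changed portion, mirroring the Table 2 example.
static_compat = {
    ("still_image_edit", "photo"): 2,
    ("moving_image_edit", "photo"): 1,
    ("moving_image_edit", "still_image_edit"): 0,
}
dynamic_compat = {key: 0 for key in static_compat}

def key_of(a: str, b: str):
    return (a, b) if (a, b) in static_compat else (b, a)

def on_activated_nearby(a: str, b: str) -> None:
    # One application activated and the other activated (or brought close)
    # within a predetermined time: raise the dynamic portion.
    dynamic_compat[key_of(a, b)] += 1

def on_moved_apart(a: str, b: str) -> None:
    # One application distanced from the other within a predetermined time:
    # lower the dynamic portion.
    dynamic_compat[key_of(a, b)] -= 1

def compatibility(a: str, b: str) -> int:
    k = key_of(a, b)
    return static_compat[k] + dynamic_compat[k]

on_activated_nearby("still_image_edit", "photo")
on_activated_nearby("moving_image_edit", "photo")
on_activated_nearby("moving_image_edit", "still_image_edit")
on_moved_apart("moving_image_edit", "still_image_edit")
print(compatibility("still_image_edit", "photo"),    # 2 + 1 = 3
      compatibility("moving_image_edit", "photo"),   # 1 + 1 = 2
      compatibility("moving_image_edit", "still_image_edit"))  # 0 + 1 - 1 = 0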
The compatibility table has been described above. The foregoing formula 2 will be described again. {d−(r1+r2)} is an evaluation value in regard to superimposition between the activation window appm and the existing window appn. An example of a relation among r1, r2, and d is illustrated in
The first term of the foregoing formula 1 has been described above. Next, the second term of the foregoing formula 1 will be described.
The second term of the foregoing formula 1 is an evaluation term regarding the relation with the real object. Here, wobj is a weighted parameter. Nobj indicates the number of real objects. A total sum calculation target is an evaluation function in regard to the compatibility and the distance between the activation window appm and the real object objn and is defined by the following formula, for example.
[Math. 3] (an evaluation function for the pair of the activation window appm and the real object objn; its terms c (appm, objn), {d−(r1+r2)}, a/Vobj, b/∫vdt, and c/F are described below)
Here, c (appm, objn) is an evaluation value in regard to the compatibility between the activation window appm and the real object objn. For example, the better the compatibility is, the higher the value is. As the compatibility, for example, the compatibility between a ruler application and the real object is considered to be good, the compatibility between a game application using an obstacle and the real object which can be the obstacle is considered to be good, and the compatibility between an application of a ramen timer and a round real object with a predetermined size is considered to be good. The evaluation value in regard to the compatibility can be decided by the compatibility table similar to Table 2 shown above.
{d−(r1+r2)} is an evaluation value in regard to superimposition between the activation window appm and the real object objn. An example of a relation among r1, r2, and d is illustrated in
Here, a/Vobj is an evaluation value in regard to the convex hull volume of an object. This evaluation value is low, for example, when the real object objn is small or thin. Accordingly, an influence of a small or thin real object on the evaluation function is small.
Here, b/∫vdt is an evaluation value in regard to a recent movement distance. This evaluation value is lower, for example, as the recent movement distance is larger. Accordingly, an influence of a real object of which the recent movement distance is large on the evaluation function is small. This is because an object with a large movement distance is either a movable object or an object placed recently, and thus the user is considered to be able to remove it.
Here, c/F is an evaluation value in regard to the shape of an object. This evaluation value is lower, for example, as an influence of the shape on an image to be projected is smaller. For example, a low value can be given to a pyramid. Accordingly, the influence on the evaluation function of a real object whose shape has only a small influence on an image to be projected is small.
The second term of the foregoing formula 1 has been described above. Next, terms subsequent to the third term of the foregoing formula 1 will be described.
Terms subsequent to the third term of the foregoing formula 1 are evaluation terms regarding the relation with the user. Here, wup is a weighted parameter and fup(p) is an evaluation function regarding a distance between the user position and the activation window. Further, wus is a weighted parameter and fus(p) is an evaluation function regarding a distance between the user side and the activation window. Further, wtp is a weighted parameter and ftp(p) is an evaluation function regarding a distance between a touch position and the activation window. Further, wfd is a weighted parameter and ffd(p) is an evaluation function regarding the coincidence of a direction of a finger of the user and a direction in which the activation window is oriented.
The specific example of the evaluation function has been described above.
The example of the configuration of the information processing system 100 has been described above. Next, an example of an operation process executed in the information processing system 100 according to the present example will be described with reference to
(Operation Process)
As illustrated in
Subsequently, in step S4202, the detection unit 121 recognizes the real object, the existing display object, and the user. For example, the real object recognition unit 4201 recognizes the presence region of the real object on the table 140a from the depth information output from the input unit 110. For example, the display object recognition unit 4202 monitors a display control process by the output control unit 122 and recognizes the display region of the existing window. For example, the user recognition unit 4203 recognizes the position and the direction of the user touching the activation instruction object based on a captured image output from the input unit 110.
Next, in step S4203, the output control unit 122 executes the display region decision process for the activation window. Since specific content of the display region decision process will be described in detail below, the detailed description thereof will be omitted here.
In step S4204, the output control unit 122 executes the display process. For example, the output control unit 122 controls the output unit 130 such that the activation window is displayed in the display region decided in the foregoing step S4203.
The flow of the display control process executed in the information processing system 100 has been described above. Next, the flow of the display region decision process of the foregoing step S4203 will be described with reference to
As illustrated in
Subsequently, in step S4222, the output control unit 122 determines whether the estimation of the position of the user is successful. For example, the output control unit 122 executes the determination with reference to the recognition result by the user recognition unit 4203.
When the output control unit 122 determines that the estimation of the position of the user is successful (YES in S4222), the output control unit 122 determines in step S4223 whether the estimation of the direction of the user is successful. For example, the output control unit 122 executes the determination with reference to the recognition result by the user recognition unit 4203.
When the output control unit 122 determines that the estimation of the direction of the user is successful (YES in S4223), the output control unit 122 decides the display region of the activation window based on the position and the direction of the user in step S4224. For example, the output control unit 122 decides the display region of the activation window in the window activatable area decided in the foregoing step S4221 at a position close to the user and at an inclination at which the display region faces the user according to the direction of the user.
Conversely, when the estimation of the position of the user is successful and the estimation of the direction of the user fails (NO in S4223), the output control unit 122 decides the display region of the activation window based on the position of the user in step S4225. For example, the output control unit 122 decides the display region of the activation window at the position close to the user and in the direction corresponding to the user side in the window activatable area decided in the foregoing step S4221.
When the output control unit 122 determines that the estimation of the position of the user fails (NO in S4222), the output control unit 122 decides the display region of the activation window based on the information regarding the real object and the existing window in step S4226. For example, the output control unit 122 decides any region as the display region of the activation window in the window activatable area decided in the foregoing step S4221.
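A compact sketch of this branching (steps S4222 to S4226) is shown below; the orientation convention and the simple nearest-position rule are assumptions used only for illustration.

import math
from typing import List, Optional, Tuple

Position = Tuple[float, float]

def decide_display_region(activatable_positions: List[Position],
                          user_position: Optional[Position],
                          user_direction: Optional[float]) -> Tuple[Position, float]:
    """Return an (anchor position, orientation in degrees) pair."""
    if user_position is None:
        # S4226: position estimation failed; any region inside the window
        # activatable area is acceptable.
        return activatable_positions[0], 0.0
    ux, uy = user_position
    # Position inside the window activatable area closest to the user.
    pos = min(activatable_positions,
              key=lambda p: (p[0] - ux) ** 2 + (p[1] - uy) ** 2)
    if user_direction is None:
        # S4225: direction estimation failed; orient the window toward the
        # user side (approximated here as facing the user position).
        return pos, math.degrees(math.atan2(uy - pos[1], ux - pos[0]))
    # S4224: both estimations succeeded; face the user according to the
    # estimated direction.
    return pos, user_direction

print(decide_display_region([(1.0, 1.0), (4.0, 2.0)], (5.0, 2.0), None))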
The example of the flow of the display region decision process has been described above. Next, another example of the flow of the display region decision process will be described with reference to
As illustrated in
Subsequently, in step S4232, the output control unit 122 evaluates the neighborhood coordinates. For example, the output control unit 122 calculates an evaluation value of the evaluation function for each of 8 points shifted vertically and horizontally by one pixel from the coordinates selected in the foregoing step S4231. At this time, the output control unit 122 may calculate the evaluation value while also changing the size and the inclination of the display region.
Next, in step S4233, the output control unit 122 updates the coordinates of the evaluation target to the coordinates at which the evaluation value is the best. At this time, when coordinates at which the evaluation value is better are present among the 8 neighborhood points (NO in S4234), the process returns to step S4232 again and the evaluation (S4232) and the update (S4233) of the coordinates of the evaluation target are repeated.
When the coordinates at which the evaluation value is better are not present at the 8 neighborhood points (YES in S4234), the output control unit 122 determines whether the evaluation (S4232 to S4234) is completed on the N coordinates in step S4235.
When it is determined that the evaluation is not completed (NO in S4235), the process returns to step S4232 again. Accordingly, the output control unit 122 executes the process related to the foregoing steps S4232 to S4234 at the unevaluated coordinates.
When it is determined that the evaluation is completed (YES in S4235), the output control unit 122 decides the display region in which the evaluation value is the best in step S4236.
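The following sketch mirrors this flow: random starting coordinates, evaluation of the 8 neighboring points, and repetition until no neighbor improves, keeping the overall best. The toy evaluation function in the last line is an assumption for demonstration only.

import random
from typing import Callable, Tuple

def hill_climb(evaluate: Callable[[Tuple[int, int]], float],
               width: int, height: int, n_starts: int = 8,
               seed: int = 0) -> Tuple[Tuple[int, int], float]:
    rng = random.Random(seed)
    best_point, best_value = None, float("-inf")
    for _ in range(n_starts):                      # S4231: N random coordinates
        point = (rng.randrange(width), rng.randrange(height))
        value = evaluate(point)
        while True:
            # S4232: evaluate the 8 neighbors shifted by one pixel.
            neighbors = [(point[0] + dx, point[1] + dy)
                         for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                         if (dx, dy) != (0, 0)
                         and 0 <= point[0] + dx < width
                         and 0 <= point[1] + dy < height]
            candidate = max(neighbors, key=evaluate)
            if evaluate(candidate) <= value:       # S4234: no better neighbor
                break
            point, value = candidate, evaluate(candidate)  # S4233: move there
        if value > best_value:                     # S4236: keep the best region
            best_point, best_value = point, value
    return best_point, best_value

# Example: prefer coordinates close to (10, 10) on a 32x32 surface.
print(hill_climb(lambda p: -((p[0] - 10) ** 2 + (p[1] - 10) ** 2), 32, 32))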
The example of the flow of the display region decision process has been described above.
The present example is an embodiment of the above-described specific example 8. In this example, details of an internal process according to the specific example 8 are described with reference to
(1) Input Unit 110
The input unit 110 according to the present example is realized by, for example, a camera acquiring an image (a still image or a moving image) and a stereo camera acquiring depth information. For example, the input unit 110 outputs the acquired captured image and the acquired depth information to the control unit 120. The input unit 110 acquires the depth information and outputs the depth information to the control unit 120 not only at the time of imaging but also at the time of projection of a captured image by the output unit 130. The input unit 110 may be realized by a touch sensor provided on the table 140a or any input device such as a remote controller. The input unit 110 can acquire a user manipulation such as a sound input or a touch manipulation and output the user manipulation to the control unit 120.
(2) Output Unit 130
The output unit 130 according to the present example has a function as a display unit that displays an image. For example, as illustrated in
(3) Control Unit 120
The control unit 120 according to the present example executes various processes to reproduce a captured image obtained by imaging a subject on the table 140a with the original size. As illustrated in
(3-1) Environment Information Acquisition Unit 4301
The environment information acquisition unit 4301 has a function of acquiring environment information. The environment information is information regarding an environment of the information processing system 100 which has an influence on an input to the information processing system 100 or an output from the information processing system 100. For example, the environment information at the time of imaging is information which can influence the size of a subject shown in a captured image. Examples of the environment information at the time of imaging include a distance from the camera to the table 140a at the time of imaging, a view angle of the camera at the time of imaging, and the number of pixels of the camera at the time of imaging. The environment information at the time of projection is information which can influence the size of a subject shown in a captured image to be projected. Examples of the environment information at the time of projection include a distance from the projector to the table 140a at the time of projection, a view angle of the projector at the time of projection, and the number of pixels of the projector at the time of projection.
(3-2) Environment Information Accumulation Unit 4303
The environment information accumulation unit 4303 has a function of accumulating the environment information acquired by the environment information acquisition unit 4301. For example, the environment information accumulation unit 4303 associates the environment information acquired at the time of imaging of a captured image with the captured image captured by a camera for storage. For example, the environment information may be stored as metadata of the captured image.
(3-3) Setting Information Generation Unit 4305
The setting information generation unit 4305 has a function of generating setting information for projecting the captured image by the output unit 130. In particular, the setting information generation unit 4305 generates the setting information for reproducing the form of the subject at the time of imaging without change based on the environment information at the time of imaging and the environment information at the time of projection. For example, the setting information generation unit 4305 can calculate a projection magnification for reproducing the subject with the original size and generate the projection magnification as the setting information. Hereinafter, a calculation example of the projection magnification for reproducing the subject with the original size by the setting information generation unit 4305 will be described with reference to
Y1 indicates the number of pixels in the vertical direction of the camera at the time of imaging, L1 indicates a distance from the camera at the time of imaging to the table 140a, and θ1 indicates a view angle in the vertical direction of the camera at the time of imaging. Y2 indicates the number of pixels in the vertical direction of the projector at the time of projection, L2 indicates a distance at the time of projection from the projector to the table 140a, and θ2 indicates a view angle in the vertical direction of the projector at the time of projection.
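As one plausible reading of this geometry (a sketch under the assumption of simple pinhole-style projection in the vertical direction, not a reproduction of formula 4 itself), the physical length covered by one camera pixel on the table is 2·L1·tan(θ1/2)/Y1 and the length covered by one projector pixel is 2·L2·tan(θ2/2)/Y2, so reproducing the subject with the original size amounts to scaling the captured image by their ratio.

import math

def projection_magnification(y1: int, l1: float, theta1_deg: float,
                             y2: int, l2: float, theta2_deg: float) -> float:
    # Physical length on the table covered by one pixel at imaging time and
    # at projection time, respectively (assumed geometry).
    per_camera_pixel = 2 * l1 * math.tan(math.radians(theta1_deg) / 2) / y1
    per_projector_pixel = 2 * l2 * math.tan(math.radians(theta2_deg) / 2) / y2
    return per_camera_pixel / per_projector_pixel

# Example: identical camera and projector geometry gives a magnification of 1.0.
print(projection_magnification(1080, 1.0, 60.0, 1080, 1.0, 60.0))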
The example in which the projection magnification is calculated using the number of pixels Y in the vertical direction has been described in
Various kinds of setting information generated by the setting information generation unit 4305 are considered in addition to the projection magnification. For example, the setting information generation unit 4305 may calculate luminance, contrast, or the like for reproducing hue of a subject or an arrival status of outside light and generate the luminance, the contrast, or the like as setting information.
Based on the setting information generated by the setting information generation unit 4305, the output control unit 122 controls the output unit 130 such that the captured image is projected. Specifically, the output control unit 122 controls the projector using the projection magnification calculated by the setting information generation unit 4305 such that the captured image is expanded or reduced to be projected.
The output control unit 122 may control the output unit 130 such that a notification to the user is output in order to reproduce the form of the subject at the time of imaging without change. For example, when the captured image is expanded and projected, the expanded captured image may not fit on the table 140a. Therefore, the output control unit 122 may output, to the user, a notification requesting the user to execute adjustment related to the table 140a so that the expanded captured image fits on the table 140a.
The example of the configuration of the information processing system 100 according to the present example has been described above. Next, a specific example of a user interface according to the present example will be described with reference to
First, a specific example of a user interface when the projection magnification is changed according to an environment change from the time of imaging to the time of projection will be described with reference to
The specific example of the user interface when the projection magnification is changed according to the environment change from the time of imaging to the time of projection has been described above. Next, a specific example of a notification requesting the user to execute adjustment to reproduce the form of a subject at the time of imaging without change will be described with reference to
The specific example of the user interface according to the example has been described above. Next, an example of an operation process by the information processing system 100 according to the present example will be described with reference to
(Operation Process)
As illustrated in
Subsequently, in step S4304, the environment information acquisition unit 4301 acquires the current environment information. The environment information includes, for example, the distance from the projector to the table 140a, the view angle of the projector, and the number of pixels of the projector.
Subsequently, in step S4306, the setting information generation unit 4305 calculates a projection magnification. Specifically, the setting information generation unit 4305 calculates the projection magnification by applying the environment information referred to in the foregoing step S4302 and the environment information acquired in the foregoing step S4304 to the foregoing formula 4.
In step S4308, the output control unit 122 controls the projector such that the captured image is projected. Specifically, the output control unit 122 controls the projector using the projection magnification calculated in the foregoing step S4306 such that the captured image is expanded or reduced to be projected.
The example of the operation process by the information processing system 100 according to the present example has been described above.
(Conclusion)
As described above, according to the present example, the subject can be reproduced with the original size by storing the captured image in association with the environment information at the time of storage and changing the projection magnification according to the difference from the environment information at the time of projection. Additionally, a technology of imaging a subject along with a comparison object such as a cigarette pack or a coin can also be considered. However, with this technology, it is difficult to project a captured image with the original size when the environment at the time of projection is different from the environment at the time of imaging, and it is necessary, for example, to execute a manipulation of comparing the real comparison object with the projected comparison object and adjusting the projection magnification. In contrast, according to the present example, the subject can be reproduced with the original size even when the environment at the time of projection is different from the environment at the time of imaging.
The present example is an embodiment of the above-described specific example 4. In this example, details of an internal process according to the specific example 4 are described with reference to
First, a characteristic configuration of the information processing system 100 according to the present example will be described.
(1) Input Unit
The input unit 110 according to the present example has a function of detecting an object on the table 140a. The input unit 110 can be realized by, for example, a camera acquiring an image (a still image or a moving image) of the table 140a, a stereo camera acquiring depth information, or a touch sensor provided on the table 140a. For example, the input unit 110 detects a touch on the table 140a by a finger, or a hand floating above the table 140a with the fingers separated from the table 140a. Then, the input unit 110 outputs the detected input information to the control unit 120.
(2) Output Unit 130
The output unit 130 according to the present example has a function as a display unit that displays an image. For example, as illustrated in
(3) Detection Unit
The detection unit 121 according to the present example has a function of detecting a finger of the user based on the input information output from the input unit 110. The detection unit 121 has a manipulable number decision function and a finger detection function.
Manipulable Number Decision Function
The detection unit 121 has a function of deciding a manipulable number. Hereinafter, a process of deciding a manipulable number by the detection unit 121 will be described in detail.
The detection unit 121 first decides a system recognition limit number N. The system recognition limit number means an upper limit number which can be recognized by the information processing system 100 and corresponds to the above-described computational recognizable upper limit. For example, the detection unit 121 may dynamically calculate the system recognition limit number N from the processing load of the application which is being used, or may decide the system recognition limit number N as a fixed value based on the hardware requirements of the system.
Subsequently, the detection unit 121 decides a manipulation limit number M for each application. Here, M≤N is set. The detection unit 121 may decide M according to the content of the application. For example, the detection unit 121 sets M=2 in a hockey game and sets M=4 in a mah-jong game. The detection unit 121 may also set M=N.
Next, the detection unit 121 decides a surplus number P. Normally, the detection unit 121 sets P=1. Additionally, the detection unit 121 may set P to any number satisfying N>P≥N−M.
Then, the detection unit 121 decides the manipulable number as (M−P). The manipulable number corresponds to the above-described recognizable upper limit based on specification. Since the manipulable number is smaller than the system recognition limit number by at least the surplus number, the information processing system 100 can feed back the fact that a finger exceeding the manipulable number is not recognizable.
Finger Detection Function
The detection unit 121 detects a finger of the user by classifying fingers into two types of fingers, fingers having a manipulation authority and fingers having no manipulation authority. The detection unit 121 detects fingers detected until arrival of the manipulable number as the fingers having the manipulation authority. The detection unit 121 detects fingers detected after arrival of the manipulable number as the fingers having no manipulation authority.
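A minimal sketch of this classification is shown below; the class and identifiers are hypothetical, and only the rule that fingers detected up to the manipulable number (M−P) receive the manipulation authority comes from the description above.

class FingerAuthority:
    def __init__(self, system_limit_n: int, app_limit_m: int, surplus_p: int = 1):
        assert app_limit_m <= system_limit_n
        self.n = system_limit_n
        self.manipulable = app_limit_m - surplus_p   # manipulable number (M - P)
        self.authorized = set()                      # fingers having the authority

    def observe(self, finger_id: int) -> str:
        """Classify a detected finger into the two types described above."""
        if finger_id in self.authorized:
            return "has manipulation authority"
        if len(self.authorized) < self.manipulable:
            self.authorized.add(finger_id)           # detected before the limit
            return "has manipulation authority"
        # Detected after the manipulable number is reached: it can still be
        # recognized up to N, so a warning can be fed back to this finger.
        return "no manipulation authority (feed back warning)"

fa = FingerAuthority(system_limit_n=10, app_limit_m=4, surplus_p=1)
for fid in range(5):
    print(fid, fa.observe(fid))   # fingers 0-2 get authority, 3-4 do not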
When a finger having the manipulation authority continues to touch the table 140a, the detection unit 121 continues to give the manipulation authority to the finger. Conversely, when the finger having the manipulation authority stops touching the table 140a and stops the manipulation, the manipulation authority is lost and the manipulation authority transitions to another finger. However, even when a finger having the manipulation authority stops touching the table 140a and a manipulation stops, the detection unit 121 continues to give the manipulation authority to the finger when a manipulation is expected to be executed again. Hereinafter, an example in which a manipulation is expected to be executed again will be described specifically with reference to
The manipulation authority can be given not only to a finger which executes a touch and starts a manipulation but also to a finger which is expected to start a manipulation. Hereinafter, an example in which a manipulation is expected to start will be described specifically with reference to
(4) Output Control Unit 122
The output control unit 122 according to the present example has a function of controlling the output unit 130 such that whether a manipulation is possible is fed back to the user. For example, when a manipulation by a finger having the manipulation authority is detected, the output control unit 122 executes feedback indicating that the manipulation is possible. Conversely, when a manipulation by a finger having no manipulation authority is detected, for example, when fingers exceeding the manipulable number (M−P) are newly detected, the output control unit 122 executes feedback indicating that the manipulation is not possible. Even when the number of fingers exceeds the manipulable number (M−P), fingers can still be detected up to the system recognition limit number N. Therefore, the output control unit 122 can feed back a warning indicating that a manipulation is not possible to fingers until the number of fingers reaches N.
Various conditions under which the output control unit 122 feeds back the warning are considered. For example, when fingers having no manipulation authority are detected a predetermined number of times or more, or when fingers having no manipulation authority are continuously detected for a predetermined period or more, the output control unit 122 may feed back the warning. The output control unit 122 may combine a plurality of feedbacks or may set a different condition for each feedback. For example, the output control unit 122 may execute feedback of changing the color of a pointer meaning drag start from the instant at which a finger is detected for the first time, and may start feedback of emitting a beep sound 5 seconds after that instant.
Hereinafter, variations of the feedback by the output control unit 122 will be described with reference to
In the example illustrated in
In the examples illustrated in
In the examples illustrated in
In the examples illustrated in
In the example illustrated in
In the example illustrated in
In the example illustrated in
In the example illustrated in
In the example illustrated in
The example of the characteristic configuration of the information processing system 100 according to the present example has been described above. Next, an operation process of the information processing system 100 according to the present example will be described with reference to
(Operation Process)
As illustrated in
Subsequently, in step S4404, the detection unit 121 decides the manipulation limit number M.
Subsequently, in step S4406, the detection unit 121 decides the surplus number P.
Then, in step S4408, the detection unit 121 decides the manipulable number (M−P).
The preliminary process of deciding the manipulable number has been described above.
As illustrated in
Subsequently, in step S4414, the detection unit 121 executes a finger recognition process on the fingers having the manipulation authority in the previous frame. At this time, the detection unit 121 detects the fingers having the manipulation authority, including not only fingers for which a touch manipulation is detected but also fingers for which the touch manipulation described above with reference to
Next, in step S4416, the detection unit 121 determines whether the number of recognized fingers reaches the system recognition limit number N.
When the number of recognized fingers reaches the recognition limit number N (YES in S4416), the process ends.
Conversely, when the number of recognized fingers does not reach the system recognition limit number N (NO in S4416), the detection unit 121 executes, in step S4418, the finger recognition process on the fingers having no manipulation authority in the previous frame. Accordingly, the detection unit 121 recognizes a newly appearing finger in the current frame.
When the new finger is not recognized (NO in S4420), the process ends.
Conversely, when the new finger is recognized (YES in S4420), the detection unit 121 determines in step S4422 whether the newly recognized finger falls in the manipulable number.
When it is determined that the newly recognized finger does not fall in the manipulable number (NO in step S4422), the output control unit 122 executes feedback indicating the manipulation is not possible in step S4424. The output control unit 122 may combine the feedback examples illustrated in
Conversely, when it is determined that the newly recognized finger falls in the manipulable number (YES in step S4422), the detection unit 121 determines whether a manipulation by the recognized finger is an effective manipulation in step S4426.
When the detection unit 121 determines that the manipulation is the effective manipulation (YES in S4426), the detection unit 121 issues a manipulation event in step S4428. At this time, for example, the detection unit 121 gives the manipulation authority to the newly recognized finger. When it is determined that the manipulation is not the effective manipulation (NO in S4426), the manipulation event is not issued. Even when the manipulation event is not issued, the detection unit 121 may give the manipulation authority to the newly recognized finger.
Subsequently, in step S4430, the detection unit 121 determines whether the finger recognition process has been executed up to the final target data. For example, the detection unit 121 executes the determination depending on whether any region remains unscanned in the finger recognition process in step S4418.
When it is determined that the finger recognition process is not executed up to the final target data (NO in S4430), the process returns to step S4416 again. Conversely, when it is determined that the finger recognition process is executed up to the final target data (YES in S4430), the process in the current frame ends.
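A compact sketch of this per-frame flow (steps S4414 to S4428) follows; the data structures are illustrative assumptions, and the effective-manipulation check of step S4426 is simplified to an unconditional grant for brevity.

from typing import Iterable, Set

def process_frame(frame_fingers: Iterable[int], authorized: Set[int],
                  n_limit: int, manipulable: int) -> None:
    """One frame of finger recognition: re-recognize authorized fingers first,
    then new fingers up to the system recognition limit number N."""
    fingers = list(frame_fingers)
    recognized = [f for f in fingers if f in authorized]          # S4414
    for finger in fingers:
        if len(recognized) >= n_limit:                            # S4416: limit reached
            break
        if finger in authorized:
            continue
        recognized.append(finger)                                 # S4418/S4420: new finger
        if len(authorized) >= manipulable:                        # S4422: over the limit
            print(f"finger {finger}: feedback that manipulation is not possible")  # S4424
        else:
            authorized.add(finger)                                # S4428: grant authority
            print(f"finger {finger}: manipulation event issued")

authorized: Set[int] = set()
process_frame([1, 2, 3, 4], authorized, n_limit=10, manipulable=3)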
The operation process of the information processing system 100 according to the present example has been described above.
Next, a hardware configuration of the information processing system 100 according to an embodiment of the present disclosure will be described with reference to
As illustrated in
The CPU 901 serves as an operation processor and a control device, and controls all or some operations in the information processing system 100 in accordance with various programs recorded in the ROM 903, the RAM 905, the storage device 919, or a removable recording medium 927. The ROM 903 stores programs and operation parameters which are used by the CPU 901. The RAM 905 temporarily stores programs which are used in the execution of the CPU 901 and parameters which are appropriately modified in the execution. The CPU 901, the ROM 903, and the RAM 905 are connected to each other by the host bus 907 configured to include an internal bus such as a CPU bus. In addition, the host bus 907 is connected to the external bus 911 such as a PCI (Peripheral Component Interconnect/Interface) bus via the bridge 909.
The input device 915 is a device which is operated by a user, such as a mouse, a keyboard, a touch panel, buttons, switches and a lever. The input device 915 may include a mic that detects a sound of a user. The input device 915 may be, for example, a remote control unit using infrared light or other radio waves, or may be an external connection device 929 such as a portable phone operable in response to the operation of the information processing system 100. Furthermore, the input device 915 includes an input control circuit which generates an input signal on the basis of the information which is input by a user and outputs the input signal to the CPU 901. By operating the input device 915, a user can input various types of data to the information processing system 100 or issue instructions for causing the information processing system 100 to perform a processing operation. The imaging device 933 to be described below can function as an input device by imaging a motion or the like of a hand of the user.
The output device 917 includes a device capable of visually or audibly notifying the user of acquired information. The output device 917 may include a display device such as an LCD (Liquid Crystal Display), a PDP (Plasma Display Panel), an organic EL (Electro-Luminescence) display, or a projector, a hologram display device, an audio output device such as a speaker or a headphone, and a peripheral device such as a printer. The output device 917 may output the results obtained from the process of the information processing system 100 in the form of video such as text or an image, and audio such as voice or sound. The output device 917 may include a light or the like to brighten the surroundings.
The storage device 919 is a device for data storage which is configured as an example of a storage unit of the information processing system 100. The storage device 919 includes, for example, a magnetic storage device such as a HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device. The storage device 919 stores programs to be executed by the CPU 901, various data, and data obtained from the outside.
The drive 921 is a reader/writer for the removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and is embedded in the information processing system 100 or attached externally thereto. The drive 921 reads information recorded in the removable recording medium 927 attached thereto, and outputs the read information to the RAM 905. Further, the drive 921 writes in the removable recording medium 927 attached thereto.
The connection port 923 is a port used to directly connect devices to the information processing system 100. The connection port 923 may include a USB (Universal Serial Bus) port, an IEEE1394 port, and a SCSI (Small Computer System Interface) port. The connection port 923 may further include an RS-232C port, an optical audio terminal, an HDMI (High-Definition Multimedia Interface) port, and so on. The connection of the external connection device 929 to the connection port 923 makes it possible to exchange various data between the information processing system 100 and the external connection device 929.
The communication device 925 is, for example, a communication interface including a communication device or the like for connection to a communication network 931. The communication device 925 may be, for example, a communication card for a wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), WUSB (Wireless USB) or the like. In addition, the communication device 925 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various kinds of communications, or the like. The communication device 925 can transmit and receive signals to and from, for example, the Internet or other communication devices based on a predetermined protocol such as TCP/IP. In addition, the communication network 931 connected to the communication device 925 may be a network or the like connected in a wired or wireless manner, and may be, for example, the Internet, a home LAN, infrared communication, radio wave communication, satellite communication, or the like.
The imaging device 933 is a device that generates an image by imaging a real space using an image sensor such as a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) sensor, as well as various members such as one or more lenses for controlling the formation of a subject image on the image sensor, for example. The imaging device 933 may be a device that takes still images, and may also be a device that takes moving images.
The sensor 935 is any of various sensors such as an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, or a sound sensor, for example. The sensor 935 acquires information regarding the state of the information processing system 100, such as the orientation of the case of the information processing system 100, as well as information regarding the environment surrounding the information processing system 100, such as the brightness or noise surrounding the information processing system 100, for example. The sensor 935 may also include a Global Positioning System (GPS) sensor that receives GPS signals and measures the latitude, longitude, and altitude of the apparatus.
The foregoing thus illustrates an exemplary hardware configuration of the information processing system 100. Each of the above components may be realized using general-purpose members, but may also be realized in hardware specialized in the function of each component. Such a configuration may also be modified as appropriate according to the technological level at the time of the implementation.
According to an embodiment of the present disclosure, as described above, there is provided the information processing system 100 capable of displaying information more appropriately and efficiently according to an environment in which information is displayed or a situation of displayed information.
Steps in processes executed by devices in this specification are not necessarily executed chronologically in the order described in a sequence chart or a flow chart. For example, steps in processes executed by devices may be executed in a different order from the order described in a flow chart or may be executed in parallel.
Further, a computer program can be created which causes hardware such as a CPU, ROM, or RAM, incorporated in each of the devices, to function in a manner similar to that of structures in the above-described devices. Furthermore, it is possible to provide a recording medium having the computer program recorded thereon. Moreover, by configuring respective functional blocks shown in a functional block diagram as hardware, the hardware can achieve a series of processes.
The preferred embodiment(s) of the present disclosure has/have been described above with reference to the accompanying drawings, whilst the present disclosure is not limited to the above examples. A person skilled in the art may find various alterations and modifications within the scope of the appended claims, and it should be understood that they will naturally come under the technical scope of the present disclosure.
Note that software that realizes a user interface or an application shown in the above-described embodiments may be realized as a web application that is used via a network such as the Internet. Such a web application may be realized with a markup language, for example, HyperText Markup Language (HTML), Standard Generalized Markup Language (SGML), Extensible Markup Language (XML), or the like.
In addition, the effects described in the present specification are merely illustrative and demonstrative, and not limitative. In other words, the technology according to the present disclosure can exhibit other effects that are evident to those skilled in the art along with or instead of the effects based on the present specification.
Additionally, the present technology may also be configured as below.
(1)
A display control device including:
a display control unit configured to decide a display region of a display object to be displayed on a display surface according to information regarding a real object on the display surface.
(2)
The display control device according to (1),
wherein the information regarding the real object includes attribute information regarding the real object.
(3)
The display control device according to (2),
wherein the attribute information includes information regarding difficulty in moving the real object.
(4)
The display control device according to (2) or (3),
wherein the attribute information includes information regarding a relation with the display object.
(5)
The display control device according to any one of (1) to (4),
wherein the display control unit decides the display region of the display object from at least one of a position, a size, and an angle.
(6)
The display control device according to any one of (1) to (5),
wherein the display control unit decides the display region of the display object so that the display region does not overlap the real object.
(7)
The display control device according to any one of (1) to (6),
wherein the display control unit decides the display region using an evaluation function of evaluating a candidate of the display region of the display object.
(8)
The display control device according to (7),
wherein the display control unit sets a parameter of the evaluation function according to a user manipulation.
(9)
The display control device according to any one of (1) to (8),
wherein the display control unit decides the display region of the display object further according to information regarding another display object already displayed on the display surface.
(10)
The display control device according to (9),
wherein the display control unit decides the display region of the display object in front of the other display object.
(11)
The display control device according to (9) or (10),
wherein the display control unit decides the display region of the display object so that the display region does not overlap a display region of the other display object.
(12)
The display control device according to any one of (9) to (11),
wherein the display control unit changes a display region of the other display object so that the display region does not overlap the display region of the display object.
(13)
The display control device according to any one of (9) to (12),
wherein the information regarding the other display object includes information regarding a relation with the display object.
(14)
The display control device according to any one of (1) to (13),
wherein the display control unit decides the display region of the display object further according to information regarding a user to whom the display object is to be displayed.
(15)
The display control device according to (14),
wherein the information regarding the user includes information indicating at least one of a position and a direction of the user.
(16)
A display control method including:
deciding, by a processor, a display region of a display object to be displayed on a display surface according to information regarding a real object on the display surface.
(17)
A program causing a computer to function as:
a display control unit configured to decide a display region of a display object to be displayed on a display surface according to information regarding a real object on the display surface.