The present disclosure relates to an information processing device, an information processing method, and a program.
Techniques for remotely controlling devices such as home electric appliances or information devices have been proposed. For example, techniques for a user to remotely control such devices using a gesture of a hand, body, or the like have been developed.
A technique of identifying a finger of a user located in a certain environment, recognizing a direction in which the finger is directed, and pointing to a device on the basis of a recognition result is disclosed in Patent Literature 1.
Patent Literature 1: JP 2013-205983A
However, in the technique disclosed in Patent Literature 1, it is necessary to move the finger in the air and direct a fingertip toward a device which is an operation target. In this case, since the direction in which the finger is directed is unstable and the pointing recognition error is large, it is difficult for the user to select an operation target.
In this regard, the present disclosure proposes an information processing device, an information processing method, and a program which are novel and improved and capable of selecting an operation target more easily.
According to the present disclosure, there is provided an information processing device, including: a first selecting unit configured to control selection of a selection target including an operation target on the basis of information related to a first operation on the selection target by an operating entity; a setting unit configured to set an operation region corresponding to the selected selection target in a first object; and a second selecting unit configured to control selection of the operation target on the basis of information related to a second operation on the first object in which the operation region is set by the operating entity.
In addition, according to the present disclosure, there is provided an information processing method, including: controlling, by a processor, selection of a selection target including an operation target on the basis of information related to a first operation on the selection target by an operating entity; setting, by the processor, an operation region corresponding to the selected selection target in a first object; and controlling, by the processor, selection of the operation target on the basis of information related to a second operation on the operation region set in the first object by the operating entity.
In addition, according to the present disclosure, there is provided a program causing a computer to function as: a first selecting unit configured to control selection of a selection target including an operation target on the basis of information related to a first operation on the selection target by an operating entity; a setting unit configured to set an operation region corresponding to the selected selection target in a first object; and a second selecting unit configured to control selection of the operation target on the basis of information related to a second operation on the first object in which the operation region is set by the operating entity.
As described above, according to the present disclosure, it is possible to select an operation target more easily.
Note that the effects described above are not necessarily limitative. With or in the place of the above effects, there may be achieved any one of the effects described in this specification or other effects that may be grasped from this specification.
Hereinafter, (a) preferred embodiment(s) of the present disclosure will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
Further, the description will proceed in the following order.
The information processing system 1A according to the present embodiment is applied to an arbitrary space (a space 2 in
Further, such an operation target may be a physical object such as a button, a switch, a lever, or a knob installed on a wall surface or the like of the space 2, or may be the wall surface of the space 2 itself. Further, in a case in which the operation target is a virtual object, a shape, a size, a position, a type, and the like of the operation target are not particularly limited as long as it has a display form that accepts an input on the basis of an operation of the user. The respective devices will be described below.
The information processing device 10 is a device having an information processing function for acquiring detection information obtained from the object detecting device 20 and performing a predetermined control process based on the detection information. The information processing device 10 may include a processing circuit, a storage device, a communication device, and the like. The information processing device 10 can be realized by any device such as a personal computer (PC), a tablet, or a smartphone. Further, as illustrated in
As illustrated in
The control unit 100 controls overall operation of the information processing device 10 according to the present embodiment. The function of the control unit 100 is realized by a processing circuit such as a central processing unit (CPU) included in the information processing device 10. Further, the control unit 100 has functions realized by respective functional units illustrated in
The communication unit 110 is a communication device included in the information processing device 10, and carries out various types of communications with an external device via a network (or directly) in a wireless or wired manner. The function of the communication unit 110 is realized by a communication device included in the information processing device 10. Specifically, the communication unit 110 is realized by a communication device such as a communication antenna and a radio frequency (RF) circuit (wireless communication), an IEEE 802.15.1 port and a transmission/reception circuit (wireless communication), an IEEE 802.11b port and a transmission/reception circuit (wireless communication), or a local area network (LAN) terminal and a transmission/reception circuit (wired communication). For example, as illustrated in
The storage unit 120 is a storage device included in the information processing device 10, and stores information acquired by the communication unit 110, information obtained by processes of the respective functional units of the control unit 100, and the like. The storage unit 120 is realized by, for example, a magnetic recording medium such as a hard disk, a non-volatile memory such as a flash memory, or the like. Further, the storage unit 120 appropriately outputs stored information in response to a request from each functional unit included in the control unit 100 or from the communication unit 110. For example, the storage unit 120 may store a pattern such as a form or the like of a hand related to a gesture by the user in advance and may output the pattern such as the form or the like of the hand to the control unit 100 when the control unit 100 recognizes the detected gesture.
The object detecting device 20 is an example of a detecting device that detects a form of the body of the user. The object detecting device 20 according to the present embodiment detects an object (detected body) such as the body of the user and generates three-dimensional position information of the detected body as detection information. The generated three-dimensional position information is output to the information processing device 10 via the network NW (or directly). Further, as illustrated in
The object detecting device 20 according to the present embodiment can be realized by a depth sensor. Further, the object detecting device 20 may be realized by, for example, a stereo camera or the like. Further, the object detecting device 20 may be realized by a sensor capable of performing distance measurement using an infrared sensor, a time of flight (TOF) type sensor, an ultrasonic sensor, or the like or may be realized by a device which projects an IR laser pattern. In other words, the object detecting device 20 is not particularly limited as long as it can detect the position of the body of the user U1 in the space 2.
Further, in another embodiment, the information processing system 1A may be equipped with a device capable of detecting a line of sight or the like of the user instead of the object detecting device 20. Such a device may be realized, for example, by an image recognition sensor or the like capable of identifying the line of sight or the like of the user by image recognition for an image obtained by imaging a face of the user.
Further, in the example illustrated in
Further, in another embodiment, the object detecting device 20 may be a wearable device worn on the head, the arm, or the like of the user. Various types of inertial sensors or mechanical sensors, such as a gyro sensor, a potentiometer, and an encoder, may be installed in such a wearable device, and the form or the like of the body of the user may be detected by these sensors. Further, in another embodiment, a marker may be installed on the head or the upper limb of the user, and the object detecting device 20 may detect the form or the like of the body of the user by recognizing such a marker.
The display device 30 is a device which is arranged in the space 2 to which the information processing system 1A is applied, and displays a predetermined screen and information output from the information processing device 10 via the network NW (or directly) in a display region 31. Although the details will be described later, the display in the display region of the display device 30 according to the present embodiment is controlled, for example, by the information processing device 10.
For example, as illustrated in
In the information processing system 1A, a virtual object which is an operation target displayed in the display region 31 of the display device 30 is specified by a gesture, a pointing operation, or the like using the hand of the user U1. Accordingly, the operation target is selected by the user U1.
However, in a case in which it is desired to select an operation target which is located remotely by a form or a motion of the hand or the like such as a gesture or a pointing operation, it may be difficult to select the operation target accurately. For example, it is difficult to accurately select the operation target due to an error caused by recognition accuracy of a sensor that detects the form or the motion of the hand, a drift phenomenon in a case in which a gyro sensor or the like is used, or the like. In other words, in order to select the operation target accurately, a strict operation of the form or the motion of the hand by the user is required. However, in this case, since the user performs an operation in a state in which the hand floats in the air, a physical burden on the user increases.
In this regard, the present disclosure proposes technology that enables the user to more easily select the operation target. Specifically, in the present disclosure, a selection target including an operation target is selected by a first operation, and an operation region corresponding to the selected selection target is set in an object such as the hand. Further, in the present disclosure, the operation target is selected by a second operation on the object in which the operation region is set. With the technology, it is possible to perform the operation on the operation target with an object (for example, a part of the body of the user) which is more easily operated. Therefore, even when the operation target is located remotely, the user can more easily select the operation target. Therefore, the selection accuracy of the operation target can be improved, and the operation burden on the user can be reduced.
The information processing system 1A according to the present embodiment will be described below in detail. Further, in the present embodiment, as will be described later, the description will proceed under the following assumption: the user selects the selection target including the operation target displayed in the display region through an operation (a first operation) of holding the left hand, which is an example of the first object, over the display region; the operation region corresponding to the selection target is set in the left hand of the user; and the user performs a selection operation on the operation region set in the left hand through an operation (a second operation) by the right hand of the user, which is an example of the second object.
Here, in the present embodiment, the selection target is constituted by a region including at least one operation target and can be a selection target of the first operation to be described later. In the present embodiment, an example of selecting one operation target from an operation target group including a plurality of operation targets will be described.
Further, the left hand according to the present embodiment is an example of the first object, but the first object is not limited to this example. For example, the first object may be a part of the body of the user which is an operating entity or an object which is attached to or worn on a part of the body of the user (for example, clothing, a wristband, a glove, or the like) in addition to the hand. Further, the first object may be a portable object (a sheet of paper, a board, miscellaneous goods, or the like) owned or used by the user, or a fixed object (for example, a desk, a wall, or the like) which can be touched by the user.
An example of a configuration and a function of the control unit 100 according to the first embodiment of the present disclosure will be described.
The operated position estimating unit 101 has a function of estimating an operated position (a pointing position in the present embodiment) in the display region 31 on the basis of the detection information generated by detecting the object by the object detecting device 20. The pointing position here means the position of the part of the display region 31 over which the user holds the left hand. The operation of holding the left hand over the display region is an example of the first operation, and the pointing position is estimated from the first operation. When the user stretches out his/her arm and holds the left hand over the display region, the intersection point between the extension line of the arm and the left hand and the display region 31 is the pointing position.
The operated position estimating unit 101 according to the present embodiment recognizes the position of the left hand of the user and a direction of the left hand on the basis of three-dimensional position information of the left hand generated by a depth sensor which is an example of the object detecting device 20. Further, the operated position estimating unit 101 estimates the pointing position on the basis of the recognized position and the direction of the left hand and the position of the display region 31. A known technique can be used to estimate the pointing position. For example, a technique disclosed in JP 2013-205983A may be used for estimating the pointing position. Such a pointing position can be estimated as, for example, coordinates in a plane coordinate system of the display region 31. Information related to the estimated pointing position is output to the first selecting unit 103.
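For illustration only, the geometric core of this estimation can be pictured as a ray-plane intersection: the hand position and direction recognized from the depth sensor define an extension line, and the pointing position is where that line meets the plane of the display region 31. The following is a minimal sketch under that assumption; the variable names and the representation of the display region as a plane are assumptions introduced here for illustration, not the disclosed implementation.

```python
import numpy as np

def estimate_pointing_position(hand_pos, hand_dir, plane_point, plane_normal):
    """Illustrative ray-plane intersection for estimating the pointing position.

    hand_pos     : three-dimensional position of the left hand (e.g., from the depth sensor)
    hand_dir     : unit vector along the extension line of the arm and the left hand
    plane_point  : any point on the plane containing the display region 31
    plane_normal : unit normal vector of that plane
    Returns the intersection point in space coordinates, or None if the extension
    line is parallel to the display plane or points away from it.
    """
    hand_pos = np.asarray(hand_pos, dtype=float)
    hand_dir = np.asarray(hand_dir, dtype=float)
    plane_point = np.asarray(plane_point, dtype=float)
    plane_normal = np.asarray(plane_normal, dtype=float)

    denom = np.dot(hand_dir, plane_normal)
    if abs(denom) < 1e-6:      # extension line parallel to the display plane
        return None
    t = np.dot(plane_point - hand_pos, plane_normal) / denom
    if t < 0:                  # the hand points away from the display region
        return None
    return hand_pos + t * hand_dir
```

The returned point can then be converted into the plane coordinate system of the display region 31 to obtain the coordinates described above.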
Further, the pointing position according to the present embodiment is estimated on the basis of the position of the left hand held over the display region 31, but the present technology is not limited to this example. For example, the pointing position may be estimated on the basis of the form of the body of the user. The form of the body of the user includes, for example, pointing by the left hand (or the right hand), an attitude of the body of the user, the line of sight of the user, or the like in addition to the form of holding the left hand over something.
Further, in a case in which the pointing device described above is used as the object detecting device 20, a position pointed at by the pointing device may be estimated. The pointing using the pointing device is an example of the first operation. Further, in a case in which the pointing position is estimated by the pointing device, the operated position estimating unit 101 need not necessarily be installed.
Further, although the pointing position according to the present embodiment is an example of the operated position, the present technology is not limited to such an example. For example, the operated position may be a position specified by a gesture or the like using the hand, in addition to the pointing position obtained by the pointing operation using the hand. More specifically, a position at which at least one operation target group is located (or can be displayed) may be stored in advance as a candidate of the operated position, and the operated position estimating unit 101 may estimate the operated position on the basis of a recognition result of a gesture or the like using the hand (that is, may switch the operation target group desired to be selected). Such a gesture is an example of the first operation and can be recognized on the basis of the detection information related to the form or the motion of the hand detected by the object detecting device 20 or the like.
The operation recognizing unit 102 has a function of recognizing an operation by the user on the basis of the detection information generated by detecting the object by the object detecting device 20. Here, the recognition of the operation means recognition of a form and a motion of a part of the body of the user. For example, the operation recognizing unit 102 according to the present embodiment recognizes the operation by the user on the basis of the detection information related to the form and the motion of the hand of the user detected by the object detecting device 20. A known technique can be used to recognize such an operation. The operation by the user will be described in detail later but may include a touch operation or a proximity operation of the right hand with respect to the left hand. Further, the operation by the user may include a gesture by the left hand or the right hand. A specific example of the operation by the user will be described later.
Information related to the recognition result of the gesture operation of the user obtained by the operation recognizing unit 102 is output to the first selecting unit 103 and the second selecting unit 105.
The first selecting unit 103 has a function of controlling selection of the selection target on the basis of information related to the first operation on the selection target. The first operation according to the present embodiment corresponds to the operation of holding the left hand over something (the operation using the first object), and the information related to the first operation may include, for example, the information related to the pointing position described above. The first selecting unit 103 according to the present embodiment may select the operation target group, for example, on the basis of the information related to the pointing position. More specifically, the first selecting unit 103 may select the operation target group corresponding to the pointing position in a case in which there is a pointing position within a region constituting the operation target group.
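As a minimal sketch of this selection control, assume that each selection target (operation target group) is registered as a rectangular region in the plane coordinate system of the display region 31; the registry and its layout below are illustrative assumptions only.

```python
from typing import Dict, Optional, Tuple

# Hypothetical registry of selection targets: group identifier -> (x_min, y_min, x_max, y_max)
# expressed in the plane coordinate system of the display region 31.
TARGET_GROUPS: Dict[str, Tuple[float, float, float, float]] = {
    "group_A": (0.0, 0.0, 0.5, 1.0),
    "group_B": (0.5, 0.0, 1.0, 1.0),
}

def select_target_group(pointing_position: Tuple[float, float]) -> Optional[str]:
    """Return the operation target group whose region contains the pointing position."""
    x, y = pointing_position
    for group, (x0, y0, x1, y1) in TARGET_GROUPS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return group
    return None  # no operation target group corresponds to the pointing position
```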
Information related to the selection control by the first selecting unit 103, such as the selected operation target group, is output to the setting unit 104. Further, the information can be output to the display control unit 106.
Further, the first selecting unit 103 may perform control for deciding the selection of the operation target group corresponding to the pointing position on the basis of the recognition result of the operation output from the operation recognizing unit 102. In this case, information related to the operation target group which is decided to be selected can be output to the setting unit 104.
Further, as will be described in detail later, the first selecting unit 103 performs control for holding a selection state of the operation target group corresponding to the pointing position (so-called lock control) on the basis of the recognition result of the operation output from the operation recognizing unit 102. Accordingly, it is possible to prevent frequent switching of the selection of the operation target group due to shaking of the left hand. Therefore, the burden on the user in the operation can be reduced, and the operability can be further improved.
Further, in a case in which only one operation target is included in the operation target group, the first selecting unit 103 may perform control for directly selecting the operation target at this time point. Accordingly, processes by the setting unit 104 and the second selecting unit 105 to be described later can be omitted, and thus the burden related to the operation of the user can be reduced.
Further, the information related to the first operation according to the present embodiment is the information related to the pointing position, but the present technology is not limited to such an example. The information related to the first operation is not particularly limited as long as it is information related to the operated position obtained by the operated position estimating unit 101.
The setting unit 104 has a function of setting the operation region corresponding to the selection target selected by the first selecting unit 103 in the first object (the left hand in the present embodiment). Here, the setting of the operation region means assigning (setting) a region corresponding to at least one operation target included in the operation target group which is an example of the selection target to (in) a part of the left hand. In other words, the operation region means a region including each of regions corresponding to the operation targets allocated to respective parts of the left hand. Further, a method of setting the operation region will be described later.
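A minimal sketch of such an assignment is shown below; the subdivision of the left hand into named parts and the form of the operation target list are assumptions introduced here for illustration.

```python
from typing import Dict, Iterable

# Hypothetical parts of the left hand to which regions can be assigned.
HAND_PARTS = ["index_finger", "middle_finger", "ring_finger", "little_finger", "back_of_hand"]

def set_operation_region(operation_targets: Iterable[str]) -> Dict[str, str]:
    """Assign each operation target of the selected group to a part of the left hand.

    Returns a mapping such as {"index_finger": "button_1", ...}. In this sketch,
    targets beyond the number of available parts are simply left unassigned.
    """
    return dict(zip(HAND_PARTS, operation_targets))

# Example: a group consisting of three buttons occupies the first three fingers.
operation_region = set_operation_region(["button_1", "button_2", "button_3"])
```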
The information related to the operation region set in the first object is output to the second selecting unit 105. Further, the information can be output to the display control unit 106.
The second selecting unit 105 has a function of controlling the selection of the operation target on the basis of the information related to the second operation for the first object (left hand in the present embodiment) in which the operation region is set. Here, the second operation is an operation using the right hand which is an example of the second object in the present embodiment. In other words, the information related to the second operation may include information such as a gesture recognized by the operation recognizing unit 102 on the basis of the detection information obtained by detecting a form or a motion of the right hand. Further, the second object is not particularly limited as long as it is an object different from the first object. Further, the second operation may include, for example, a touch operation or a proximity operation on the left hand which is an example of the first object. A specific example of the second operation will be described later.
The second selecting unit 105 according to the present embodiment specifies the operation on the region corresponding to the operation target assigned to the left hand on the basis of the information related to the operation using the right hand on the left hand (that is, the recognition result of the operation by the user), and selects the operation target corresponding to such a region. In other words, the second selecting unit 105 specifies the operation (for example, touch or proximity) of the right hand on any one region assigned to the left hand on the basis of the above-described recognition result, and selects the operation target corresponding to the specified region.
Accordingly, it is possible to more reliably select an operation target which is difficult to select by the pointing operation. Further, since the selection control is based on an operation using the body of the user, such as the operation using the right hand on the left hand, the user can receive tactile feedback related to the selection operation. Accordingly, it is possible to realize an operation using a somatic sensation of the human body, and for example, even when the user does not look at the hand, it is possible to perform the operation by feel. Therefore, the operability can be improved.
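Resolving the second operation then reduces to looking up the touched (or approached) part of the left hand in the operation region set by the setting unit 104. The sketch below uses the same illustrative assumptions as above.

```python
from typing import Dict, Optional

def select_operation_target(operation_region: Dict[str, str], touched_part: str) -> Optional[str]:
    """Return the operation target assigned to the touched part of the left hand.

    operation_region : mapping from a part of the left hand to an operation target
    touched_part     : the part identified from the recognition result of the
                       touch or proximity operation by the right hand
    """
    return operation_region.get(touched_part)  # None if no target is assigned to that part

# Example: with buttons assigned to the first three fingers, tapping the middle finger
# selects "button_2".
region = {"index_finger": "button_1", "middle_finger": "button_2", "ring_finger": "button_3"}
selected = select_operation_target(region, "middle_finger")  # -> "button_2"
```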
The information related to the control of the selection by the second selecting unit 105 may be output to the display control unit 106.
The display control unit 106 has a function of controlling display of selection by at least one of the first selecting unit 103 and the second selecting unit 105. The display control unit 106 according to the present embodiment controls the display of the selection for the display region 31 of the display device 30. For example, in a case in which the selection of the operation target group is decided by the first selecting unit 103, the display control unit 106 may control display of an object for presenting the operation target group which is decided to be selected. Similarly, in a case in which the operation target is selected by the second selecting unit 105, the display control unit 106 may control display of an object for presenting the selected operation target.
Further, the display control unit 106 may control display of the operation region set by the setting unit 104. For example, the display control unit 106 may control display of each region which is assigned to the left hand in association with each operation target. Accordingly, the user can check which region is allocated to which part of the left hand. Further, the display of the operation region may be performed in the display region 31 in the present embodiment but may be performed on the left hand (that is, the first object) of the user.
An example of display control by the display control unit 106 will be described in detail in “Process example” and “Display control example.”
The configuration example of the control unit 100 according to the present embodiment has been described above.
Next, a flow of a process by the information processing system 1A according to the present embodiment will be described with reference to
The process by the information processing system 1A according to the present embodiment is carried out in two stages. The first stage is a process of selecting the selection target. The second stage is a process of setting the operation region corresponding to the selection target and selecting the operation target from the operation on the left hand in which the operation region is set. First, the process of the first stage by the information processing system 1A according to the present embodiment will be described with reference to
Referring to
Then, the operated position estimating unit 101 estimates the pointing position which is the operated position on the basis of the three-dimensional position information of the left hand included in the detection information and the position information of the display region 31 (step S105). Then, the first selecting unit 103 specifies the operation target group which is the selection target from the pointing position (step S107).
Then, the display control unit 106 performs display related to the specified operation target group in the display region 31 (step S109). Here, an example of the display related to the operation target group by the display control unit 106 will be described with reference to
Referring to
Further, as illustrated in
The example of the display control by the display control unit 106 is not limited to the examples illustrated in
The process example of the first stage of the information processing system 1A will be described with reference back to
Here, the predetermined gesture may be, for example, a gesture by the left hand, a gesture by the right hand, or a gesture using another part of the body. Specifically, the predetermined gesture may be a motion of closing and clenching the left hand which is held over, or a motion of tapping, with the right hand, the left hand which is held over. Further, in the present embodiment, the selection of the selection target is decided by detecting a predetermined gesture, but the present technology is not limited to this example. For example, the selection of the selection target may be decided on the basis of an operation using a physical switch such as a toggle switch of a remote controller connected to the network NW illustrated in
If the selection of the selection target is decided, the process of the first stage of the information processing system 1A according to the present embodiment ends. In the process of the first stage, since the selection target including the operation target can be constituted by a region larger than the operation target, pointing by an operation such as a gesture of the user is easy. Even when the operation target is not selected directly in the first stage, the operation region corresponding to the selection target including the operation target is set in the left hand in the second stage. Therefore, it is possible to reliably select the operation target in the second stage to be described below.
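Putting the first stage together, the flow described above can be summarized in the sketch below. The detector, estimator, selector, display, and recognizer objects and their method names are hypothetical stand-ins for the object detecting device 20 and the functional units of the control unit 100, not the disclosed interfaces.

```python
def first_stage(detector, estimator, selector, display, recognizer):
    """Sketch of the first stage: repeat until a decision gesture fixes the selection target."""
    while True:
        info = detector.detect()                               # detection information of the left hand
        if not info.left_hand_held_over:
            continue
        position = estimator.estimate_pointing_position(info)  # step S105
        group = selector.select_group(position)                # step S107: specify the target group
        display.show_selected_group(group)                     # step S109: display for the group
        if group is not None and recognizer.decision_gesture_detected(info):
            return group   # selection of the selection target decided; proceed to the second stage
```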
Next, the process of the second stage by the information processing system 1A according to the present embodiment will be described with reference to
Referring to
Further, the setting of the operation region by the setting unit 104 is not limited to the example illustrated in
The process example of the second stage of the information processing system 1A will be described with reference back to
Then, the display control unit 106 performs display related to the selected operation target in the display region 31 (step S205). Here, an example of the display related to the selected operation target by the display control unit 106 will be described with reference to
The user U1 is assumed to perform the tap operation on the region 1101B allocated to the left hand H1 using a right hand H2. In this case, the operation target object 1001B corresponding to the region 1101B is selected by the second selecting unit 105 on the basis of information related to the tap operation. At this time, the display control unit 106 may highlight the operation target object 1001B with a highlight 1021 to indicate that the operation target object 1001B is selected. Accordingly, the user U1 can recognize that the operation target object 1001B is selected.
Further, similarly to the display control in the process example of the first stage illustrated in
The process example of the second stage of the information processing system 1A according to the present embodiment has been described above. In the information processing system 1A according to the present embodiment, in the first stage, the selection target including the operation target can be selected, and in the second stage, the operation target can be selected by the operation on the left hand in which the operation region corresponding to the selection target is set. As described above, since the operation target which is located remotely is selected by an operation on the hand or the like, it is possible to more reliably select such an operation target and to reduce the physical burden related to the remote control on the user. Therefore, the user can more easily select the operation target. Further, the operation target and the selection target according to the present embodiment are assumed to be located remotely from the user, but the present technology is not limited to such an example. For example, the present system is also applicable to an operation target and a selection target which the user can touch directly. More specifically, in a case in which the selection target is installed on a wall, when the user touches the selection target with the left hand, the operation region corresponding to the selection target may be set in the left hand by the setting unit 104. Accordingly, the user can move away from the selection target while the operation region remains set in the left hand.
Next, a modified example of the process in the information processing system 1A according to the present embodiment will be described.
In the process of the first stage, the first selecting unit 103 performs control (so-called lock control) for holding the selection state of the operation target group corresponding to the pointing position on the basis of a recognition result of an operation (third operation) output from the operation recognizing unit 102. Accordingly, it is possible to prevent the frequent switching of the selection of the operation target group by shaking of the left hand. Therefore, the burden on the user in the operation can be reduced. The flow of the lock process will be specifically described below.
Referring to
Then, the operated position estimating unit 101 estimates the pointing position which is the operated position on the basis of the three-dimensional position information of the left hand included in the detection information and the position information of the display region 31 (step S305). Then, the first selecting unit 103 specifies the operation target group which is the selection target from the pointing position (step S307). Then, the display control unit 106 performs the display related to the specified operation target group in the display region 31 (step S309).
Then, in the state in which the operation target group is selected, it is determined whether or not a predetermined operation (third operation) related to the lock control is detected (step S311). In a case in which a predetermined operation is detected (YES in step S311), the first selecting unit 103 performs the lock control for holding the state in which the operation target group is selected (step S313). Accordingly, for example, even in a case in which the pointing position deviates from the region constituting the operation target group, the operation target group is continuously in the selected state. Therefore, since the user need not maintain the pointing position continuously, the burden on the user decreases.
Then, in a case in which the state in which the operation target group is selected is held, it is determined whether or not a release operation (fourth operation) of the lock control is detected (step S315). In a case in which the release operation of the lock control is detected (YES in step S315), the first selecting unit 103 releases the lock control, and the process of step S303 is performed again. On the other hand, in a case in which the release operation of the lock control is not detected (NO in step S315), the first selecting unit 103 decides the selection of the operation target group (step S317).
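The lock control amounts to holding a flag on top of the first-stage loop sketched earlier; the sketch below follows steps S303 to S317 as described, again with hypothetical objects and method names standing in for the functional units.

```python
def first_stage_with_lock(detector, estimator, selector, display, recognizer):
    """Sketch of the lock control: hold the selected group while the lock is active."""
    locked_group = None
    while True:
        info = detector.detect()                                    # step S303
        if locked_group is None:
            position = estimator.estimate_pointing_position(info)   # step S305
            group = selector.select_group(position)                 # step S307
            display.show_selected_group(group)                      # step S309
            if group is not None and recognizer.lock_operation_detected(info):   # step S311 (third operation)
                locked_group = group                                # step S313: hold the selection state
        elif recognizer.release_operation_detected(info):           # step S315 (fourth operation)
            locked_group = None                                     # lock released; back to step S303
        else:
            return locked_group                                     # step S317: selection decided
```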
Further, for example, the operations illustrated in Table 1 below are included in the third operation, which is the operation related to the lock control, and the fourth operation, which is the release operation of the lock control. Examples include an operation of pinching with the fingers of the right hand in a case in which the left hand is held over the display region 31, and an operation using a physical switch such as a toggle switch of a remote controller connected to the network NW illustrated in
Further, the operation related to the lock control and the release operation corresponding thereto are described in each row of Table 1, but the operations need not necessarily be associated in this manner. For example, the operation related to the lock control may be an operation using the right hand, and the operation of releasing the lock control may be an operation using the head. Further, the lock control is not limited to control which is performed on the basis of a recognition result of a gesture or the like by the operation recognizing unit 102, but may be performed on the basis of a detection result of a gesture or the like detected by another sensor (not illustrated). Further, each operation illustrated in Table 1 is merely an example, and the operation is not particularly limited as long as it can be detected by a known sensing technique.
As illustrated in
Then, the setting unit 104 sets an operation region 1102 corresponding to the selection target object 1003 in the back of the left hand H1 of the user U1. In other words, the setting unit 104 assigns regions corresponding to the selector switch object 1003A and the slider object 1003B to the back of the left hand H1. In this case, as illustrated in
In this regard, the display control unit 106 may control the display related to the setting state of the operation region 1102 in the left hand H1. Specifically, as illustrated in
Further, although the selector switch and the slider have been described as the modified example of the operation target, the operation target is not limited to this example. As long as it is a display form in which an input based on the operation of the user is accepted, a shape, a size, a position, a type, and the like of the operation target are not particularly limited. Further, in the present modified example, the display of the selector switch object 1003A and the slider object 1003B which are the operation targets is different from the display of the selector switch object 1022A and the slider object 1022B corresponding to the operation region, but these displays may be identical to each other. In other words, the display of the selector switch object 1003A and the slider object 1003B which are the operation targets may be directly controlled as the display of the selector switch object 1022A and the slider object 1022B corresponding to the operation region.
As illustrated in
In this regard, the display control unit 106 may control the display related to the setting state of the operation region 1103 in the left hand H1. Specifically, as illustrated in
Further, in the second modified example and the third modified example, it is assumed that the object corresponding to the left hand H1 can be displayed in the display region 31, but the present technology is not limited to this example. In other words, at the time of the second operation, control for displaying the object corresponding to the left hand H1 (the first object) in the display region 31 need not necessarily be performed.
In the process of the second stage of the information processing system 1A described above, the second selecting unit 105 controls the selection of the operation target on the basis of the information related to the touch operation such as the tap operation of the right hand on the left hand, but the present technology is not limited to such an example. For example, the second selecting unit 105 may control the selection of the operation target on the basis of the information related to a proximity operation such as a hover operation. For example, information related to the proximity operation may be obtained on the basis of a recognition result by the operation recognizing unit 102.
First, the setting unit 104 sets the operation region corresponding to the operation target group in the left hand (step S401). Then, it is determined whether or not the approach of the right hand (that is, the hover operation) to the region corresponding to the operation target assigned to the left hand is detected (step S403). In a case in which the approach of the right hand to the region is detected (YES in step S403), the second selecting unit 105 selects the operation target corresponding to the approached region. Then, the display control unit 106 controls display related to the operation target corresponding to the approached region (step S405).
As illustrated in
The user U1 causes a finger of the right hand H2 to approach the region 1101B assigned to the left hand H1 (hover operation). In this case, the second selecting unit 105 specifies the operation target object 1001B corresponding to the region 1101B on the basis of information related to the hover operation. At this time, the display control unit 106 can cause a highlight 1026 to be displayed on the operation target object 1001B in order to indicate that the operation target object 1001B is specified by the hover operation. Accordingly, the user U1 can recognize that the operation target object 1001B is specified by the hover operation.
Then, it is determined whether or not it is detected that the right hand H2 has moved away from the region which it has approached once (step S407). In a case in which it is not detected that the right hand H2 has moved away from the region (NO in step S407), the second selecting unit 105 controls the selection of the operation target on the basis of information related to the tap operation of the right hand H2 on the left hand H1 based on the recognition result by the operation recognizing unit 102 (step S409). In a case in which there is a tap operation of the right hand H2 on the left hand H1 (YES in step S409), the second selecting unit 105 selects the operation target corresponding to the region including the tapped part of the left hand H1. Then, the display control unit 106 performs display related to the selected operation target in the display region 31 (step S411).
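The hover-then-tap flow above can be sketched as follows; the event stream and its fields are hypothetical abstractions of the recognition result by the operation recognizing unit 102, introduced here for illustration.

```python
def second_stage_with_hover(operation_region, recognizer, display, selector):
    """Sketch of steps S403-S411 (the operation region is assumed to be set in step S401)."""
    hovered_part = None
    while True:
        event = recognizer.next_event()          # hypothetical stream of right-hand events
        if event.kind == "hover":                # step S403: the right hand approaches a region
            hovered_part = event.hand_part
            display.highlight(operation_region.get(hovered_part))   # step S405: hover feedback
        elif event.kind == "leave":              # step S407: the right hand moves away again
            hovered_part = None
        elif event.kind == "tap" and hovered_part is not None:      # step S409: tap on the region
            target = operation_region.get(event.hand_part)
            selector.select(target)
            display.show_selected_target(target)                    # step S411: display for the target
            return target
```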
As described above, since the feedback of the display in the case of approaching the region corresponding to the operation target by the hover operation is given, it is possible to more reliably perform an operation of selecting a desired operation target.
Further, the second operation according to the present embodiment may include the touch operation and the proximity operation. The touch operation may include, for example, a turning operation, a slide operation, a flick operation, a drag operation, a pinch operation, a knob operation, and the like. Further, the proximity operation may include a gesture using a finger near the first object such as the left hand in addition to the hover operation. Further, in the present embodiment, the second operation is the operation by the right hand, but the present technology is not limited to such an example. For example, the second operation may be an operation by a part of the body other than the right hand or an operation by another device such as a remote controller. As the second operation is performed by the right hand, it is possible to perform an operation using a somatic sensation of the body. Accordingly, the operability can be improved.
The first embodiment of the present disclosure has been described above.
Next, a second embodiment of the present disclosure will be described.
The operating body detecting device 40 is an example of a detecting device used for detecting the operating body. The operating body detecting device 40 according to the present embodiment generates operating body detection information related to a hand H1 of the user U1 which is an example of the operating body. The generated operating body detection information is output to the information processing device 10 via the network NW (or directly). Further, as illustrated in
The operating body detection information includes, for example, information related to a position of the detected operating body in (a local coordinate system or a global coordinate system of) a three-dimensional space. In the present embodiment, the operating body detection information includes information related to the position of the operating body in a coordinate system of the space 2. Further, the operating body detection information may include a model or the like generated on the basis of a shape of the operating body. As described above, the operating body detecting device 40 acquires information related to an operation which the user U1 performs on the operating body detecting device 40 as the operating body detection information.
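Read as a data structure, the operating body detection information can be pictured as a small record like the following; the field names are illustrative assumptions only.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class OperatingBodyDetectionInfo:
    """Illustrative layout of the operating body detection information."""
    position: Tuple[float, float, float]   # position of the hand H1 in the coordinate system of the space 2
    hand_model: Optional[object] = None    # optional model generated from the shape of the operating body
```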
The operating body detecting device 40 according to the present embodiment can be realized by an infrared irradiation light source, an infrared camera, and the like. Further, the operating body detecting device 40 may be realized by any of various types of sensors such as a depth sensor, a camera, a magnetic sensor, and a microphone. In other words, the operating body detecting device 40 is not particularly limited as long as it can acquire a position, a form, and/or the like of the operating body.
Further, in the example illustrated in
The operating body detection information generated by the operating body detecting device 40 is transmitted to the information processing device 10. The control unit 100 acquires the operating body detection information via the communication unit 110. The operated position estimating unit 101 according to the present embodiment estimates the operated position on the basis of the operating body detection information. Since the details of the estimation processing are similar to those of the first embodiment, description thereof is omitted.
As described above, since the operated position such as the pointing position is detected by the operating body detecting device 40 according to the present embodiment, it is possible to specify the operated position more reliably. Further, since the operating body detecting device 40 according to the present embodiment is installed on the workbench, an operation can be performed without the hand H1 floating in the air. Therefore, the burden of the operation of the hand H1 on the user U1 can be reduced.
The second embodiment of the present disclosure has been described above.
Next, a third embodiment of the present disclosure will be described.
The operated device 50 is a device controlled by an operation on the operation target selected by the user U1. In
As illustrated in
The device control unit 107 has a function of controlling the operated device 50 corresponding to the operation target selected by the second selecting unit 105. For example, in a case in which the operation target is a switch, and the switch is selected by the second selecting unit 105, the device control unit 107 performs control related to switching of the switch for the operated device 50 corresponding to the switch. In a case in which the operated device 50 is a lighting device illustrated in
Further, in a case in which the state of the operation target can be changed at the same time as the selection of the operation target by the second operation like the selector switch object 1003A or the slider object 1003B illustrated in
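As a sketch, the device control unit 107 can be pictured as dispatching from the selected operation target to the corresponding operated device 50; the device interface below (a toggle and a brightness setter) and the target naming convention are purely assumptions for illustration.

```python
from typing import Dict

class DeviceControlUnitSketch:
    """Illustrative dispatch from selected operation targets to operated devices 50."""

    def __init__(self, devices: Dict[str, object]):
        # Hypothetical mapping: operation target identifier -> operated device object.
        self.devices = devices

    def on_target_selected(self, target_id: str) -> None:
        """Called when a switch-like operation target is selected by the second selecting unit."""
        device = self.devices.get(target_id)
        if device is not None and target_id.endswith("_switch"):
            device.toggle()                    # e.g., turn the lighting device on or off

    def on_target_changed(self, target_id: str, value: float) -> None:
        """Called when a slider-like operation target changes state together with its selection."""
        device = self.devices.get(target_id)
        if device is not None and target_id.endswith("_slider"):
            device.set_brightness(value)       # continuous control such as brightness or volume
```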
Further, in the present embodiment, display of the selection can be controlled by the display control unit 106, but the present technology is not limited to this example. For example, the function related to the display control unit 106 need not necessarily be provided in a case in which feedback by display of the selection for the user is unnecessary.
Next, the hardware configuration of an information processing apparatus 900 according to an embodiment of the present disclosure is described with reference to
The information processing apparatus 900 includes a central processing unit (CPU) 901, read-only memory (ROM) 903, and random-access memory (RAM) 905. In addition, the information processing apparatus 900 may include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input apparatus 915, an output apparatus 917, a storage apparatus 919, a drive 921, a connection port 925, and a communication apparatus 929. In conjunction with, or in place of, the CPU 901, the information processing apparatus 900 may have a processing circuit called a digital signal processor (DSP) or application specific integrated circuit (ASIC).
The CPU 901 functions as an arithmetic processing unit and a control unit, and controls the whole operation in the information processing apparatus 900 or a part thereof in accordance with various programs recorded in the ROM 903, the RAM 905, the storage apparatus 919, or a removable recording medium 923. The ROM 903 stores programs, operation parameters, or the like used by the CPU 901. The RAM 905 temporarily stores programs used in the execution by the CPU 901, parameters that vary as appropriate in the execution, or the like. For example, the CPU 901, the ROM 903, and the RAM 905 may realize the functions of the control unit 100 in the foregoing embodiment. The CPU 901, the ROM 903, and the RAM 905 are connected with each other via the host bus 907 that includes an internal bus such as a CPU bus. Furthermore, the host bus 907 is connected to the external bus 911 such as peripheral component interconnect/interface (PCI) bus via the bridge 909.
The input apparatus 915 is, in one example, an apparatus operated by a user, such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever. The input apparatus 915 may be, in one example, a remote control apparatus using infrared rays or other radio waves, or may be externally connected equipment 927 such as a cellular phone that supports the operation of the information processing apparatus 900. The input apparatus 915 includes an input control circuit that generates an input signal on the basis of the information input by the user and outputs it to the CPU 901. The user operates the input apparatus 915 to input various data to the information processing apparatus 900 and to instruct the information processing apparatus 900 to perform a processing operation.
The output apparatus 917 includes an apparatus capable of notifying visually or audibly the user of the acquired information. The output apparatus 917 may be a display apparatus such as a liquid crystal display (LCD), a plasma display panel (PDP), and an organic electro-luminescence display (OELD), an audio output apparatus such as a speaker and a headphone, as well as printer apparatus or the like. The output apparatus 917 outputs the result obtained by the processing of the information processing apparatus 900 as a video such as a text or an image, or outputs it as audio such as a speech or sound.
The storage apparatus 919 is a data storage apparatus configured as an example of a storage portion of the information processing apparatus 900. The storage apparatus 919 includes, in one example, a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, and a magneto-optical storage device. The storage apparatus 919 stores programs executed by the CPU 901, various data, various types of data obtained from the outside, and the like.
The drive 921 is a reader-writer for a removable recording medium 923 such as a magnetic disk, an optical disk, a magneto-optical disk, and a semiconductor memory, and is incorporated in the information processing apparatus 900 or externally attached thereto. The drive 921 reads the information recorded on the loaded removable recording medium 923 and outputs it to the RAM 905. In addition, the drive 921 writes a record in the loaded removable recording medium 923. At least one of the storage apparatus 919, or the drive 921 and the removable recording medium 923 may realize the functions of the storage unit 120 in the foregoing embodiment.
The connection port 925 is a port for directly connecting equipment to the information processing apparatus 900. The connection port 925 may be, in one example, a Universal Serial Bus (USB) port, an IEEE 1394 port, or a Small Computer System Interface (SCSI) port. In addition, the connection port 925 may be, in one example, an RS-232C port, an optical audio terminal, or High-Definition Multimedia Interface (HDMI, registered trademark) port. The connection of the externally connected equipment 927 to the connection port 925 makes it possible to exchange various kinds of data between the information processing apparatus 900 and the externally connected equipment 927.
The communication apparatus 929 is, in one example, a communication interface including a communication device or the like, which is used to be connected to the communication network NW. The communication apparatus 929 may be, in one example, a communication card for wired or wireless local area network (LAN), Bluetooth (registered trademark), or wireless USB (WUSB). In addition, the communication apparatus 929 may be, in one example, a router for optical communication, a router for asymmetric digital subscriber line (ADSL), or a modem for various communications. The communication apparatus 929 transmits and receives signals or the like using a predetermined protocol such as TCP/IP, in one example, with the Internet or other communication equipment. In addition, the communication network NW connected to the communication apparatus 929 is a network connected by wire or wireless, and is, in one example, the Internet, home LAN, infrared communication, radio wave communication, satellite communication, or the like. Note that at least one of the connection port 925 or the communication apparatus 929 may realize the functions of the communication unit 110 in the foregoing embodiment.
The above illustrates one example of a hardware configuration of the information processing apparatus 900.
The preferred embodiment(s) of the present disclosure has/have been described above with reference to the accompanying drawings, whilst the present disclosure is not limited to the above examples. A person skilled in the art may find various alterations and modifications within the scope of the appended claims, and it should be understood that they will naturally come under the technical scope of the present disclosure.
Note that each of the steps in the processes of the information processing device in this specification is not necessarily required to be processed in a time series following the sequence described as a flowchart. For example, each of the steps in the processes of the information processing device may be processed in a sequence that differs from the sequence described herein as a flowchart, and furthermore may be processed in parallel.
Additionally, it is possible to create a computer program for causing hardware such as a CPU, ROM, and RAM built into an information processing device to exhibit functions similar to each component of the information processing device described above. In addition, a readable recording medium storing the computer program is also provided.
Further, the effects described in this specification are merely illustrative or exemplified effects, and are not limitative. That is, with or in the place of the above effects, the technology according to the present disclosure may achieve other effects that are clear to those skilled in the art from the description of this specification.
Additionally, the present technology may also be configured as below.
(1)
An information processing device, including:
a first selecting unit configured to control selection of a selection target including an operation target on the basis of information related to a first operation on the selection target by an operating entity;
a setting unit configured to set an operation region corresponding to the selected selection target in a first object; and
a second selecting unit configured to control selection of the operation target on the basis of information related to a second operation on the first object in which the operation region is set by the operating entity.
(2)
The information processing device according to (1), in which the information related to the first operation includes information obtained by estimating a form of a body of the operating entity facing the selection target.
(3)
The information processing device according to (1) or (2), in which the first operation includes an operation using the first object.
(4)
The information processing device according to (3), in which the first operation includes an operation on a sensor configured to acquire a position, a form, and/or the like of the first object.
(5)
The information processing device according to (3) or (4), in which the first object is a part of a body of the operating entity.
(6)
The information processing device according to any one of (1) to (5), in which the first selecting unit holds a state in which the selection target is selected on the basis of information related to a third operation.
(7)
The information processing device according to (6), in which the first selecting unit releases the holding of the state in which the selection target is selected on the basis of information related to a fourth operation.
(8)
The information processing device according to any one of (1) to (7), in which the information related to the first operation includes information related to an operated position estimated by the first operation.
(9)
The information processing device according to any one of (1) to (8), in which the second operation includes an operation using a second object different from the first object.
(10)
The information processing device according to (9), in which the second operation includes an operation performed in a state in which the second object is caused to approach the first object.
(11)
The information processing device according to (9) or (10), in which the second operation includes an operation performed in a state in which the first object and the second object are in contact with each other.
(12)
The information processing device according to any one of (9) to (11), in which the second object is a part of a body of the operating entity.
(13)
The information processing device according to any one of (1) to (12), further including:
a display control unit configured to control display related to the selection by at least one of the first selecting unit or the second selecting unit.
(14)
The information processing device according to (13), in which the display control unit controls display related to a setting state of the operation region in the first object.
(15)
The information processing device according to any one of (1) to (14), further including:
a device control unit configured to control an operated device corresponding to the selected operation target.
(16)
The information processing device according to any one of (1) to (15), in which the second operation includes an operation for changing a state of the operation target.
(17)
The information processing device according to any one of (1) to (16), in which, in a case in which only one operation target is included in the selection target,
the first selecting unit selects the operation target.
(18)
The information processing device according to any one of (1) to (17), in which the operation target includes a virtual object displayed in a display region.
(19)
An information processing method, including:
controlling, by a processor, selection of a selection target including an operation target on the basis of information related to a first operation on the selection target by an operating entity;
setting, by the processor, an operation region corresponding to the selected selection target in a first object; and
controlling, by the processor, selection of the operation target on the basis of information related to a second operation on the operation region set in the first object by the operating entity.
(20)
A program causing a computer to function as:
a first selecting unit configured to control selection of a selection target including an operation target on the basis of information related to a first operation on the selection target by an operating entity;
a setting unit configured to set an operation region corresponding to the selected selection target in a first object; and
a second selecting unit configured to control selection of the operation target on the basis of information related to a second operation on the first object in which the operation region is set by the operating entity.
Priority Application: JP 2016-204952, filed October 2016.
International Filing: PCT/JP2017/030147 (WO), filed August 23, 2017.