This application relates to and claims priority from Japanese Patent Application No. 2010-162417 filed on Jul. 20, 2010, the entire disclosure of which is incorporated herein by reference.
The present invention relates to an input apparatus and an input method thereof, and in particular to an input apparatus that detects, by means of a sensor, the distance between a user's operating hand and the screen being operated, and issues an operation instruction depending on the result of that detection.
Conventionally, a user commonly operates channels or the display through a remote controller, or inputs commands and data through an input device, such as a keyboard, a mouse or a touch panel, to a video apparatus, such as a TV or a recorder, or to information processing equipment, such as a PC.
Also, in recent years, owing to improvements in sensor technology, in particular in the field of game machines and portable equipment, methods have been applied for recognizing a movement of the user with a sensor and determining the intention of the user from the result thereof, thereby operating the equipment.
Patent Document 1 discloses a video recognizing apparatus that recognizes the shape or movement of a hand or finger, thereby determining an operation.
In the video recognizing apparatus disclosed in Patent Document 1, an operation surface is produced depending on the position of the user's body, and the user gives an instruction to the apparatus through the position or movement of hands or fingers with respect to that operation surface. The operation surface mentioned above is a virtual one, wherein an operator 102 can make an input operation easily, by pushing out her/his hand(s) while assuming the virtual operation surface from a marker 101, or by moving her/his hand(s) to touch it, treating a part of the screen and the operation surface as a touch panel, in conjunction with a monitor 111 (paragraph number 0033).
However, in Patent Document 1, although consideration is given to operations upon the operation surface in parallel with the screen, no consideration is given to operations in the direction perpendicular to that screen.
The present invention has been accomplished in view of such circumstances, and an object thereof is to provide an input method and an input apparatus that enable a user to operate much more intuitively, by taking into account movement in the direction perpendicular to the display screen when the user makes an operation.
For accomplishing the object mentioned above, according to the present invention, there are provided an input method and an input apparatus as described in the claims, which will be mentioned later, for example. In such structures, in the input apparatus for executing an input operation in a contactless or contact-free manner, the distance between the user's hand and the display screen to be operated is detected, and an input operation is executed depending on that distance.
According to the present invention, the user can intuitively grasp her/his operation, such as a change of the operation target, which is executed depending on the distance between the user's operating hand and the display screen, and therefore it is possible to input an operation as the user intends.
Those and other objects, features and advantages of the present invention will become more readily apparent from the following detailed description when taken in conjunction with the accompanying drawings wherein:
Hereinafter, embodiments according to the present invention will be fully explained by referring to the attached drawings.
Hereinafter, explanation will be made on a first embodiment of the present invention, by referring to
First of all, explanation will be given on the input apparatus, according to an embodiment of the present invention, by referring to
The display screen 101 is a device for displaying video information to the user, upon the basis of an operation input signal given from outside the display screen, and is an apparatus having a display device, for example, an LCD (Liquid Crystal Display), a PDP (Plasma Display Panel), a liquid crystal projector, a laser projector or a rear projector, etc., as well as a calculation processor device and a memory, which are necessary for display processing of video contents, a GUI (Graphical User Interface), etc.
The sensing unit 102 is a unit for detecting the distance between the hand of the user and the sensor, and is built up with a sensor, such as an infrared distance sensor, a laser distance sensor, an ultrasonic distance sensor, a distance video sensor or an electric field sensor, etc., a micro-computer for processing data, and software operating on that micro-computer. The sensor to be applied in the sensing unit 102 is not particularly limited, so long as it has a function for converting a signal obtained by detecting the distance to the hand of the user into distance data.
The user 103 is a user who makes an operation on the input apparatus 100.
The input apparatus 100, as is shown in
The system controller unit 200 has a distance detector unit 202 and an up/down operation detector unit 203.
The distance detector unit 202 extracts or classifies the distance to be detected as an operation, from the distance data obtained from the sensing unit 102. The up/down operation detector unit 203 detects an upward or downward hand movement made by the user 103.
The system controller unit 200 detects the distance of the hand of the user 103, and executes data processing for detecting the operation of moving the hand up/down. The system controller unit 200 may be achieved by a CPU executing a software module stored in a memory, or may be achieved by a dedicated hardware circuit.
The signal output unit 201 receives an instruction and data from the system controller unit 200, and outputs an operation input signal indicating or instructing an operation to the display screen 101.
Next, explanation will be made on an operating method with using the input apparatus according to the first embodiment of the present invention, by referring to
As an operation image of the input apparatus 100, according to the first embodiment of the present invention, as is shown in
Next, explanation will be given about steps of a process for detecting an input operation made by the input apparatus 100, according to the first embodiment of the present invention, by referring to a flowchart shown in
The detecting process of the input operations is executed by the system controller unit 200 shown in
First of all, the system controller unit 200, starting detection of the position of the hand in response to a predetermined user operation (step 500), executes a process in the distance detector unit 202 for extracting or classifying the distance to be detected as an operation, from the distance data obtained from the sensing unit 102, thereby detecting the distance of the hand. When the distance of the hand is detected (step 501), the operating area corresponding to the detected distance is obtained (step 502).
In case the operating area where the hand is located is the home position (Yes: step 503), detection of the distance of the hand is continued. On the other hand, if the operating area where the hand is located is not the home position (No: step 503), it is first confirmed that the operating area detected in the previous detection was the home position (Yes: step 504), and then an operation in the upward or downward direction is detected in the up/down operation detector unit (step 505). In this instance, if the operating area detected in the previous detection was not the home position (No: step 504), detection of the distance of the hand is continued, as described from step 501 onward. Namely, an operation is detected only when the operating area where the hand is located moves from the home position to another operating area.
When an operation in the upward or downward direction is detected, an operation input signal indicating an operation corresponding to the detected operation is output to the display screen 101, through the signal output unit 201.
When the user 103 shows an intention of ending the operation through a predetermined operation (step 507), the process is ended; if not, detection of the distance of the hand is continued, as described from step 501 onward.
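The detection flow described above can be sketched as follows. This is only an illustrative model, not the disclosed implementation: the area boundary values, the mapping of near/far distances to the upper/lower areas, and the function names are all assumptions made for the example.

```python
# Illustrative sketch of the first-embodiment detection flow (steps 501-505).
# Boundary values and the near/far-to-upper/lower mapping are assumptions.

HOME = "home"
UPPER = "upper"
LOWER = "lower"

def classify_area(distance_cm, near=30.0, far=60.0):
    """Map the detected hand distance to an operating area (step 502).

    Hands closer than `near` fall in the upper area, farther than `far`
    in the lower area, and anything in between is the home position.
    """
    if distance_cm < near:
        return UPPER
    if distance_cm > far:
        return LOWER
    return HOME

def detect_operation(prev_area, curr_area):
    """Steps 503-505: an operation is detected only on a transition
    out of the home position; otherwise detection simply continues."""
    if curr_area == HOME or prev_area != HOME:
        return None  # keep detecting the hand distance (back to step 501)
    return "up" if curr_area == UPPER else "down"
```

Feeding successive distance samples through `classify_area` and comparing the previous and current areas with `detect_operation` reproduces the rule that only a move from the home position to another operating area is treated as an operation.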
In this manner, the input apparatus 100 detects an operation in response to a change of the distance to the hand, which is held up by the user 103 towards the display screen 101, and gives an instruction of the operation to the display screen 101. With this, the user 103 can intuitively grasp the correspondence between the distance of the hand and the operation, from the distance relationship between the physical apparatus and the hand, and is thereby able to smoothly input the operation that she/he intends.
Hereinafter, explanation will be given about a second embodiment according to the present invention, by referring to
The display controlling method of the input apparatus 100 according to the first embodiment provides an interface for executing the operation depending on a change of the operating area where the hand is located. According to the present embodiment, in addition to the operation method of the first embodiment, there is further provided an interface for executing the operation depending on a change of the relative distance between the hand and the input apparatus 100.
Also, the input apparatus 100 according to the present embodiment, as was shown in
First of all, explanation will be given about the operating method of the input apparatus 100 according to the second embodiment of the present invention, by referring to
As is shown in
As an operation image of the input apparatus 100, according to the second embodiment of the present invention, as is shown in
Next, explanation will be made about steps of a process for detecting an input operation made by the input apparatus 100, according to the second embodiment of the present invention, by referring to a flowchart shown in
The detecting process of the input operations is executed by the system controller unit 200 shown in
First of all, the system controller unit 200, starting detection of the position of the hand in response to a predetermined user operation (step 800), executes a process in the distance detector unit 202 for extracting or classifying the distance to be detected as an operation, from the distance data obtained from the sensing unit 102, thereby detecting the distance of the hand. When the distance of the hand is detected (step 801), the position with respect to the operation criterion is obtained (step 802).
Next, in the signal output unit 201, the scale of the map is calculated from the relative position of the detected hand with respect to the operation criterion 600, and an operation input signal indicating an operation to change the scale of the map is output to the display screen 101.
When the user 103 shows an intention of ending the operation through a predetermined operation (step 804), the process is ended; if not, detection of the distance of the hand is continued, as described from step 801 onward.
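The calculation of the map scale from the hand's position relative to the operation criterion 600 can be sketched as follows. The gain, the clamping bounds, and the convention that a hand nearer than the criterion zooms in are all assumptions for illustration; the disclosure only states that the scale is derived from the relative position.

```python
# Illustrative sketch of the second-embodiment scale operation:
# the map scale follows the hand's offset from the operation
# criterion 600. Gain, bounds, and sign convention are assumed.

def scale_from_hand(distance_cm, criterion_cm=45.0,
                    gain=0.05, min_scale=0.25, max_scale=4.0):
    """Convert the hand's offset from the operation criterion into a
    map scale factor: a hand nearer than the criterion enlarges the
    map, a hand farther away shrinks it, clamped to sane bounds."""
    offset = criterion_cm - distance_cm  # positive when hand is nearer
    scale = 1.0 + gain * offset
    return max(min_scale, min(max_scale, scale))
```

With these assumed values, a hand held exactly at the criterion leaves the scale at 1.0, and moving the hand continuously toward or away from the screen changes the scale continuously, which matches the intuition the embodiment aims for.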
In this manner, the input apparatus 100, according to the second embodiment of the present invention, detects the position of the hand with respect to the operation criterion, depending on the change of the distance to the hand, which is held up by the user 103 towards the input apparatus 100, and a magnitude, quantity or length, etc., can be defined by the position of the hand with respect to the operation criterion 600. With this, the user 103 is able to intuitively grasp the correspondence between the distance of her/his hand and a quantity, such as a magnitude, length, depth, scale, etc., from the relationship between the physical apparatus and the hand, and is thereby able to smoothly input the operation that she/he intends.
Also, the inputting operation mentioned above is effective for operations on a menu made up of a plural number of layers or hierarchies. As shown in
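One way to apply the distance operation to such a layered menu can be sketched as follows: each band of distances selects one layer of the hierarchy. The band boundaries and the ordering of layers by distance are assumptions for illustration only.

```python
# Illustrative sketch of navigating a layered menu by hand distance:
# each distance band selects one layer of the hierarchy. Band width,
# offset, and the near-to-deep ordering are assumed values.

def select_layer(distance_cm, layers, near=20.0, band=15.0):
    """Map the hand distance to one entry of `layers`, where the
    nearest band selects the first (deepest) layer and each further
    band of width `band` selects the next layer, clamped at the ends."""
    idx = int((distance_cm - near) // band)
    idx = max(0, min(len(layers) - 1, idx))
    return layers[idx]
```

For example, with `layers = ["item", "submenu", "top"]`, pushing the hand toward the screen steps the focus down through the hierarchy, and pulling it back steps the focus up, one layer per distance band.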
Hereinafter, explanation will be made of a third embodiment of the present invention, by referring to
The display controlling method of the input apparatus 100 according to the first embodiment is achieved for the purpose of providing an interface for executing the operation depending on a change of the operating area where the hand is located. According to the present embodiment, in addition to the operation method of the first embodiment, there is further provided determination of a detection criterion, used when detecting the distance between the hand and the input apparatus 100, depending upon the shape of the hand.
Also, the input apparatus 100 according to the present embodiment, as will be shown in
The camera unit 1000 is a device for picking up an image of the hand of the user, and may be made up with, for example, an infrared camera having a TOF (Time Of Flight) sensor function, a stereo camera, an RGB camera, etc. The camera to be applied in the camera unit 1000 is not particularly limited, so long as it has a function of obtaining the picked-up image or picture and converting the picture into digital data.
The shape detector unit 1100 is a portion for detecting a predetermined shape of the hand, from the picked-up image or picture obtained from the camera unit 1000, wherein, for example, an image analyzing method such as pattern matching may be applied. The image analyzing method to be applied in the shape detector unit 1100 is not particularly restricted, so long as it has a function of determining whether the predetermined shape of the hand is present within the obtained image, and also a function of detecting the distance and the position of the hand.
First of all, explanation will be given about the detecting method of an operation on the input apparatus 100 according to the third embodiment of the present invention, by referring to
As is shown in
Next, explanation will be given about steps of a process for detecting an input operation by the input apparatus according to the third embodiment of the present invention, by referring to
The process for detecting the input operation is executed by the system controller unit 200 shown in
First of all, the system controller unit 200, starting detection of the position of the hand in response to the predetermined user operation (step 500), detects the hand from the images obtained from the camera unit 1000 within the distance detector unit 202, and, passing through the processes for extracting or classifying the distance to be detected as the operation, detects the distance of the hand. When the distance of the hand is detected (step 501), a process is executed in the shape detector unit 1100 for detecting the predetermined shape 1200 of the hand (step 1300). If the predetermined shape of the hand is detected (Yes: step 1300), the detection criterion to be applied when detecting the distance of the hand is determined, and thereafter the processes following step 502 are executed. On the other hand, if the predetermined shape of the hand is not detected (No: step 1300), the detection criterion is not determined, and the processes following step 502 are executed. The processes following step 502 are the same as those of the flowchart shown in
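The shape-dependent determination of the detection criterion can be sketched as follows. The tracker class, the default criterion value, and the boolean shape flag are assumptions for illustration; the disclosure leaves the concrete shape test to the shape detector unit 1100.

```python
# Illustrative sketch of the third-embodiment criterion handling
# (step 1300): when the predetermined hand shape is recognized, the
# current hand distance is captured as the new detection criterion;
# otherwise the existing criterion stays in force. Names and the
# default value are assumptions for illustration.

class CriterionTracker:
    """Holds the detection criterion used when converting the raw
    hand distance into a position within the operating areas."""

    def __init__(self, default_criterion=45.0):
        self.criterion = default_criterion

    def update(self, distance_cm, shape_detected):
        """Step 1300: only a detected predetermined shape re-anchors
        the criterion. Returns the hand position relative to it."""
        if shape_detected:
            self.criterion = distance_cm
        return distance_cm - self.criterion
```

Holding up the hand in the predetermined shape thus re-zeroes the criterion at the hand's current distance, which is what lets the user shift the operating areas to a comfortable position at the intended timing.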
In this manner, the input apparatus 100 according to the third embodiment of the present invention determines the detection criterion 1201 depending on the shape of the hand, which is held up by the user 103 towards the input apparatus. With this, the user 103 can change the relative position between the hand and the operating area at the intended timing, and therefore it is possible for the user 103 to input the operation at an arbitrary position with much more certainty.
Hereinafter, explanation will be made on a fourth embodiment according to the present invention, by referring to
The display controlling method of the input apparatus 100 according to the third embodiment is achieved for the purpose of enabling the user to change the relative distance between the hand and the operating area at the intended timing, within the operation explained in the first embodiment, by determining the detection criterion 1201 depending on the shape of the hand. According to the present embodiment, in addition to the operating method of the third embodiment, there is further provided a means for enabling a change of the relative position between the hand and the operation criterion 600 at the intended timing, within the operation explained in the second embodiment.
In the input apparatus 100 according to the present embodiment, too, as is shown in
First of all, explanation will be given about the method for detecting the operation of the input apparatus 100 according to the fourth embodiment of the present invention, by referring to
As shown in
Next, explanation will be given about steps of a method for detecting the input operation by means of the input apparatus 100 according to the fourth embodiment of the present invention, by referring to
The process for detecting the input operation is executed by the system controller unit 200 shown in
First of all, the system controller unit 200, starting detection of the position of the hand in response to the predetermined user operation (step 800), detects the hand from the images obtained from the camera unit 1000, within the distance detector unit 202, and, after passing through the processes for extracting or classifying the distance to be detected as the operation, detects the distance of the hand. When the distance of the hand is detected (step 801), a process is executed in the shape detector unit 1100 for detecting the predetermined shape 1200 of the hand (step 1500). In this instance, if the predetermined shape of the hand is not detected (No: step 1500), the process does not advance to the following steps, and only the detection of the hand continues. Namely, the operation becomes effective only in the case when the predetermined shape 1200 of the hand is detected. On the other hand, if the predetermined shape 1200 of the hand is detected (Yes: step 1500), confirmation is made on whether the hand had the predetermined shape at the time of the previous detection (step 1501); if it is determined that the hand did not have the predetermined shape at the previous detection (No: step 1501), the detection criterion to be applied when detecting the distance of the hand is determined, and the processes following step 802 are executed. Also, if it is determined that the hand did have the predetermined shape at the previous detection (Yes: step 1501), the detection criterion 1201 is not determined, and the processes following step 802 are executed. The processes following step 802 are the same as those of the flowchart shown in
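The gating logic of steps 1500 and 1501 can be sketched as follows: the operation is active only while the predetermined shape is held, and the criterion is re-determined only on the frame where the shape first appears. The class, its state variables, and the boolean shape flag are assumptions for illustration.

```python
# Illustrative sketch of the fourth-embodiment gating (steps 1500-1501):
# the operation is effective only while the predetermined hand shape is
# held, and the detection criterion is re-determined only when the shape
# first appears. Names and state layout are assumptions for illustration.

class GatedOperation:
    def __init__(self):
        self.criterion = None
        self.prev_shape = False

    def step(self, distance_cm, shape_detected):
        """Returns the hand position relative to the criterion while
        the shape is held, or None when the operation is inactive."""
        if not shape_detected:
            self.prev_shape = False
            return None                   # No at step 1500: keep detecting only
        if not self.prev_shape:           # No at step 1501: shape just appeared
            self.criterion = distance_cm  # determine the detection criterion
        self.prev_shape = True
        return distance_cm - self.criterion
```

Each time the user re-forms the predetermined shape, the criterion snaps to the hand's current distance, so the relative operation always starts from zero at the moment the user chooses to engage.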
In this manner, the input apparatus 100 according to the fourth embodiment of the present invention determines the detection criterion 1201 depending on the shape of the hand, which is held up by the user 103 towards the input apparatus. Also, the operation becomes effective only when the user holds up her/his hand in the predetermined shape. With this, the user 103 can change the relative position between the hand and the operating area at the intended timing, and further can make the operation only at the timing intended by means of the shape of the hand; therefore, it is possible for the user 103 to input the operation at an arbitrary position with much more certainty.
As was fully explained in the first through fourth embodiments above, with the input apparatus and the input method according to the present invention, it is possible for the user to intuitively grasp the operation corresponding to the distance between the hand and the display screen, and to improve operability.
Also, with the input apparatus and the input method according to the present invention, since the distance serving as the criterion is changed dynamically depending on the shape of the user's hand when detecting the distance between the hand and the display screen, there is no necessity of determining a timing for calibration along the way, and an improvement in operability can thereby be achieved.
The present invention may be embodied in other specific forms without departing from the spirit or essential features or characteristics thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.
Number | Date | Country | Kind |
---|---|---|---|
2010-162417 | Jul 2010 | JP | national |