The present invention relates to an input device.
Personal computers and televisions, which accept a user's (operator's) operation via a graphical user interface (which will be referred to as a GUI, hereinafter) and simultaneously provide feedback on the result of the user's operation to the user, have come into widespread use.
Patent Document 1 discloses a portable terminal which shows operation guidance information for assisting the user's operation. The user can execute a target function by moving her or his finger in an up, down, left or right direction according to the guidance.
Patent Document 2 discloses an interface device which displays a gesture image to visually indicate a recognition target of a gesture as user's operation. The user can conduct device operation while confirming the gesture image.
Patent Document 3 discloses a vehicle-mounted device which displays user's allowable operations and icons indicative of gestures corresponding to those allowable operations. The user can easily know the gesture to be conducted.
Patent Document 4 discloses an operation input device for vehicles which displays selection guide information indicative of the state of a user's hand placed on a steering wheel and an operation target device. The user can select a target operation device by referring to the guide and moving her or his hand to the target device.
Each of the aforementioned Patent Documents discloses that a motion and/or pose for user's operation is displayed and that the user can conduct an operation (action) relating to a predetermined device according to the motion or pose.
When the user tries to conduct a predetermined motion or take a predetermined pose for her or his intended operation, however, the user may erroneously conduct another, unintended motion or take another, unintended pose on the way to the predetermined motion or pose. The unintended motion or pose may erroneously be recognized as an action for user's operation. As a result, an operation which the user did not intend may undesirably be carried out.
For example, when the user tries to move her or his hand in the right direction for the purpose of moving displayed content to the right, the action of returning the hand located at the right side back to the left may be erroneously recognized as an action for moving the content to the left and may be erroneously carried out.
In other words, none of the above Patent Documents gives any consideration to a device with which, when the user conducts a gesture for user's operation, the user can intuitively understand whether or not each of the user's motions is recognized.
In view of such circumstances, it is therefore an object of the present invention to provide an input device having good handleability which lets a user know how the user's motion is recognized by the input device, so as to avoid execution of an unintended operation.
In accordance with the present invention, the above object is attained by providing an input device which includes an input unit for inputting operator's motion as an image signal, a motion detector for detecting the motion on the basis of the image signal inputted into the input unit, a display unit for displaying a graphical user interface, and a controller for causing the display unit to change a display of the graphical user interface according to the motion detected by the motion detector. The controller causes the display unit to display a motion synchronized with the motion detected by the motion detector and also to change the display of the graphical user interface.
In this case, the display of the graphical user interface is intended to move display positions of a plurality of selection items for the purpose of selecting a desired one of the plurality of selection items, and the controller causes the display unit to display a motion synchronized with the motion detected by the motion detector and also to move the display positions of the plurality of selection items according to the motion.
In the input device, the controller causes the display unit to display an operator's motion which is to be desirably detected by the motion detector, for the purpose of explaining to the operator, before the operator actually conducts a motion, what kind of motion results in what type of operation.
In this connection, the display of the graphical user interface is used for the operator to move display positions of a plurality of selection items for the purpose of selecting a desired one of the plurality of selection items, and the controller causes the display unit to display a motion which is necessary to move the display positions of the plurality of selection items and which is to be desirably detected by the motion detector.
According to the present invention, for example, when the user tries to operate a device such as a television with a gesture, the user can know how a gesture currently being conducted by the user is recognized and can thereby conduct such a gesture as to carry out the intended operation.
Other objects, features and advantages of the invention will become apparent from the following description of the embodiments of the invention taken in conjunction with the accompanying drawings.
Explanation will be made in connection with embodiments to which the present invention is applied.
An input device 100 in accordance with the embodiment of the present invention detects an action of user's hand from an imaged dynamic picture image of the user and changes a display of a GUI according to the action.
The input device 100 includes a display unit 101, an image pickup 102, an action or motion detector 200, a controller 201, and a video signal processor 202, as shown, e.g., in
The display unit 101, which forms a display section of the input device 100, is a display such as a liquid crystal display or a plasma display. The display unit 101 has a display panel, a panel control circuit, and a panel control driver; and acts to display video data supplied from the video signal processor 202 on the display panel. The image pickup 102 is used to input a dynamic picture image into the input device 100, and a camera, as an example, can be used as the image pickup. The user 103 conducts an operation for the input device 100. The operational guide 104 is a GUI displayed on the display unit 101 and has a letter or a diagram for explaining to the user 103 how to operate the input device 100.
The action or motion detector 200 receives a dynamic picture image signal from the image pickup 102 and detects operator's action such as operator's extended hand, waving hand or rotating hand on the basis of the received dynamic picture image signal. The motion detector 200 further outputs a predetermined command corresponding to the detected action. The controller 201, which is, for example, a microprocessor, controls the operation of the video signal processor 202 according to the command received from the motion detector 200. The video signal processor 202 can be a processor such as ASIC, FPGA or MPU. The video signal processor 202 converts video data of GUI to data having a format processable by the display unit 101 under control of the controller 201 and then outputs the converted data.
Explanation will next be made as to the operation of the input device 100 by using
The input device 100 detects an action of the hand of the user 103 on the basis of a signal of an imaged dynamic picture image of the user 103, and changes the display of the GUI according to the detected action.
Explanation will be first made as to a flow of operations of the input device 100 after the input device 100 detects an action of the hand of the user 103 until the input device displays a GUI according to the detected action, by referring to
The controller 201, when responding to the start of the operation, instructs the image pickup 102 to start imaging a dynamic picture image. In response to the instruction, the image pickup 102 starts imaging the dynamic picture image, and outputs data on the imaged dynamic picture image to the motion detector 200. The motion detector 200 detects the user's hand motion on the basis of the received dynamic picture image data by a method such as feature point extraction. When the motion detector determines that the hand's action follows a predetermined pattern, the motion detector outputs a command corresponding to the pattern to the controller 201. The controller 201 instructs the video signal processor 202 to display the GUI or to change the display according to the command. In response to the instruction, the video signal processor 202 changes the video signal as an output, so that the display of the GUI on the display unit 101 is changed.
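The flow above can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the class names, the use of a hand x-coordinate as the detected feature, and the movement threshold of 10 pixels are all assumptions made for the example.

```python
from dataclasses import dataclass, field

@dataclass
class MotionDetector:
    """Matches a stream of detected hand positions against predetermined
    patterns and emits a command when a pattern is recognized."""
    history: list = field(default_factory=list)

    def feed(self, hand_x):
        """Record a hand x-coordinate from one frame; return a command
        when the recent motion follows a predetermined pattern."""
        self.history.append(hand_x)
        if len(self.history) < 2:
            return None
        delta = self.history[-1] - self.history[-2]
        if delta > 10:           # hand moved right past a threshold
            return "MOVE_RIGHT"
        if delta < -10:          # hand moved left past a threshold
            return "MOVE_LEFT"
        return None              # small motions are not recognized

@dataclass
class Controller:
    """Changes the GUI display according to commands from the detector."""
    selected_index: int = 0

    def handle(self, command, num_items=5):
        if command == "MOVE_RIGHT":
            self.selected_index = min(self.selected_index + 1, num_items - 1)
        elif command == "MOVE_LEFT":
            self.selected_index = max(self.selected_index - 1, 0)

detector = MotionDetector()
controller = Controller()
for x in [0, 20, 40, 45, 25]:    # hand sweeps right, pauses, returns left
    controller.handle(detector.feed(x))
print(controller.selected_index)  # prints 1
```

Note that the final leftward return of the hand is itself recognized as a `MOVE_LEFT`, which is exactly the unintended-operation problem the invention addresses by showing the user how each motion is being recognized.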
Explanation will then be made as to how to display the operational guide 104 and the animation operational guide 400 when the user 103 operates the input device 100 with use of
First of all, the user 103 starts the operation of the input device 100 according to a predetermined procedure. This causes the controller 201 of the input device 100 to start recognition of operator hand's motion (gesture) according to the aforementioned procedure (step 500). An example of the state of the user 103 and the input device 100 is shown by “Normal State” in
As shown by “Start Operation Guide” in
Next, as shown by “Starting Operation” in
It is next assumed that the user 103 conducts the hand rotating action 303 according to the displayed animation operational guide 400 as shown by “Operational State” in
In this manner, the animation operational guide 400 is used as an asynchronized operational guide to cause the animation to be varied independently of user's action prior to the operation of the user in the step 505. In the step 507, when the user starts the operation, the animation operational guide 400 acts as a synchronized operational guide to cause the animation to be varied in synchronism with the operation of the user 103 (refer to
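The switch between the two guide modes can be sketched as below. The class name, the frame count of 12, and the use of a hand rotation angle in degrees are assumptions for illustration only; before user motion is detected the animation advances on its own (asynchronized guide), and once motion is detected the frame follows the user's detected hand angle (synchronized guide).

```python
class AnimationGuide:
    FRAMES = 12  # assumed number of frames in one rotation of the guide

    def __init__(self):
        self.frame = 0

    def tick(self, detected_angle=None):
        """Advance the guide by one display cycle.

        detected_angle: hand rotation angle in degrees if the motion
        detector recognized user motion this cycle, else None."""
        if detected_angle is None:
            # asynchronized guide: animation varies independently of the user
            self.frame = (self.frame + 1) % self.FRAMES
        else:
            # synchronized guide: animation follows the user's own motion
            self.frame = int(detected_angle / 360 * self.FRAMES) % self.FRAMES
        return self.frame

guide = AnimationGuide()
guide.tick()                       # no user motion yet: frame advances to 1
guide.tick()                       # still asynchronized: frame 2
frame = guide.tick(detected_angle=180)  # user rotates hand halfway
print(frame)                       # prints 6
```

The same display object thus serves both roles, which matches the behavior described for the steps 505 and 507.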
In this way, the input device 100 recognizes a hand moving action of the user 103, and displays the operational guide 104 or the animation operational guide 400 according to the recognized action to present an action (gesture) valid for the operation to the user 103. Thus, since the user 103 can know and confirm what action (gesture) at what timing results in a desired operation, the user can smoothly conduct such an operation as to display the menu or to select the menu items.
The asynchronized operational guide, by varying the animation, is useful for the user to know or understand, before operating the device, what action enables what operation. In the present embodiment, the operating wheel 301 and the selection items 302 are illustrated in the display screen. The wheel and items are useful especially for a user who does not understand what action is required for a picture, figure or image which can be moved by an operation.
Since the above synchronized operational guide displays the action conducted by the user in the form of an animation after the action, and also causes either an actual operation or no operation, the synchronized operational guide can make the user intuitively know whether or not each of the user's actions was recognized. Thus, the user can modify her or his action so as to avoid an unintended operation.
An embodiment 2 will next be explained. The present embodiment is different from the embodiment 1 in how to operate the input device 100. Thus, explanation will be made as to how to operate the input device 100 and how to display an operational guide when an action of the user with both hands enables the user to operate the input device 100. The input device 100 of the present embodiment 2 has substantially the same arrangement as that of the embodiment 1, but differs therefrom in the user operating manner.
The present embodiment will be explained by referring to the attached drawings, in which constituent elements having functions similar to those of the aforementioned embodiment are denoted by the same reference numerals, and explanation thereof is omitted.
Explanation will then be made as to the operation of the input device 100 and how to display an operational guide in the present embodiment by referring to
An illustration shown by “Menu Display” in
In this way, the input device 100 provides the user 103 with an image of an operational method in the form of the shape or appearance of the operational menu 700. Further, when the user 103 tries to actually conduct an operation based on the above image, the animation operational guide 800 explaining the detailed operating method is presented to the user.
As a result, the user can receive a specific guide relating to the operation at a timing of her or his action. Accordingly, the user can smoothly operate the menu items while understanding the validity of an action being conducted by the user or how to operate the input device.
In this connection, the operational menu 700 may be a combination with the operational menu having the operating wheel 301 and the selection items 302 explained in the embodiment 1, for example, as shown in
Explanation will next be made as to an embodiment 3. An input device of the embodiment 3 can attain an operating method similar to that of the embodiment 2 by displaying, instead of the animation operational guide, a user's operational image at the location on which an operation is applied in the operational menu of the input device 100 of the embodiment 2.
The present embodiment will be explained by referring to the accompanying drawings. In the drawings, constituent elements having functions equivalent to those in the aforementioned embodiments are denoted by the same reference numerals, and explanation thereof is omitted.
Operations of the input device 100 and a displaying method of an operational image of the present embodiment will be explained by using
An operational image 1000 of
In this way, the input device 100 displays the operational image 1000, through a simple operation, at the actual location on which an operation is applied on the operational menu, independently of variations in the operation caused when the user 103 takes different action positions. As a result, the user, when operating the input device, can know that the input device is actually in an operable state, and simultaneously can confirm the location on which an operation is applied on the operational menu and can operate the input device through an action that does not require accuracy.
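The position-independence described above can be sketched as follows. The function name, the use of horizontal displacement only, and the sensitivity of 50 pixels per menu item are assumptions for illustration: only the relative motion of the hand is used, so users whose hands rest at different absolute positions obtain the same result.

```python
def menu_position(start_x, current_x, menu_items=4, sensitivity=50):
    """Map horizontal hand displacement (in pixels) to a menu item index.

    Only the displacement (current_x - start_x) matters, so the absolute
    position of the user's hand does not affect which item is indicated."""
    offset = (current_x - start_x) // sensitivity
    # Clamp to the valid range of menu item indices.
    return max(0, min(menu_items - 1, offset))

# Two users with different resting hand positions, same displacement:
print(menu_position(100, 220))  # prints 2
print(menu_position(400, 520))  # prints 2
```

Because only displacement is clamped into the menu range, the user's action need not be accurate: any sufficiently large motion lands on a valid menu location.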
It should be further understood by those skilled in the art that although the foregoing description has been made on embodiments of the invention, the invention is not limited thereto and various changes and modifications may be made without departing from the spirit of the invention and the scope of the appended claims.
Number | Date | Country | Kind |
---|---|---|---|
2009-104641 | Apr 2009 | JP | national |
Number | Name | Date | Kind |
---|---|---|---|
5594469 | Freeman et al. | Jan 1997 | A |
7844921 | Ike et al. | Nov 2010 | B2 |
20040189720 | Wilson et al. | Sep 2004 | A1 |
20040193413 | Wilson et al. | Sep 2004 | A1 |
20050086611 | Takabe et al. | Apr 2005 | A1 |
20060209021 | Yoo et al. | Sep 2006 | A1 |
20070150842 | Chaudhri et al. | Jun 2007 | A1 |
20080052643 | Ike et al. | Feb 2008 | A1 |
20080062125 | Kitaura | Mar 2008 | A1 |
20080141181 | Ishigaki et al. | Jun 2008 | A1 |
20080178126 | Beeck et al. | Jul 2008 | A1 |
20080215975 | Harrison et al. | Sep 2008 | A1 |
20090027337 | Hildreth | Jan 2009 | A1 |
20090073117 | Tsurumi et al. | Mar 2009 | A1 |
20090079813 | Hildreth | Mar 2009 | A1 |
20090187824 | Hinckley et al. | Jul 2009 | A1 |
20090217211 | Hildreth et al. | Aug 2009 | A1 |
20090228841 | Hildreth | Sep 2009 | A1 |
20100031202 | Morris et al. | Feb 2010 | A1 |
20100050133 | Nishihara et al. | Feb 2010 | A1 |
20100058252 | Ko | Mar 2010 | A1 |
20140223299 | Han | Aug 2014 | A1 |
Number | Date | Country |
---|---|---|
101131609 | Feb 2008 | CN |
2 040 156 | Mar 2009 | EP |
8-315154 | Nov 1996 | JP |
2001-216069 | Aug 2001 | JP |
2004-326189 | Nov 2004 | JP |
2004-356819 | Dec 2004 | JP |
2005-250785 | Sep 2005 | JP |
2006-79281 | Mar 2006 | JP |
2007-122136 | May 2007 | JP |
2007-213245 | Aug 2007 | JP |
2008-052590 | Mar 2008 | JP |
2009-75685 | Apr 2009 | JP |
10-2006-0101071 | Sep 2006 | KR |
Entry |
---|
Go Diego Go! Safari Rescue Nintendo Wii game. 2K Play. Official release date Feb. 11, 2008. |
European Search Report issued in European Patent Application No. 10250809.0 on Aug. 1, 2013. |
Office Action issued in Japanese Patent Application No. 2009-104641 on Aug. 21, 2012. |
Number | Date | Country | |
---|---|---|---|
20100275159 A1 | Oct 2010 | US |