The invention relates to a method and apparatus of user interface display, and more particularly, to a method and apparatus of virtual user interface interaction based on gesture recognition.
Most virtual reality (VR) systems can track a user's movement via human interface devices carried by the user. A human interface device (e.g. a joystick, controller, or touchpad) allows the user to interact with a software system, for example a VR game, executed by a computing device. In addition, a head-mounted display (HMD) worn by the user displays the interactive images generated by the computing device to the user for the VR experience.
In order to increase the user's sense of VR immersion, virtual user interfaces have been proposed to create new user experiences. However, there is no specification for virtual keyboard interaction based on gesture recognition. Thus, conventional user interfaces are still operated with a remote control or joystick.
It is therefore an objective of the present invention to provide a method and apparatus of virtual user interface interaction based on gesture recognition for an interaction system.
The present invention discloses a method of virtual user interface interaction based on gesture recognition for an interaction system. The method comprises detecting two hands in a plurality of images, recognizing each hand's gesture when the two hands are detected, projecting a virtual user interface object on an open gesture hand when one hand is recognized with a point gesture and the other hand is recognized with an open gesture, tracking an index fingertip of the point gesture hand to obtain a relative position of the index fingertip of the point gesture hand and the open gesture hand, determining whether the index fingertip of the point gesture hand is close to the open gesture hand within a predefined rule, interpreting a movement of the index fingertip of the point gesture hand as a click command when the index fingertip of the point gesture hand is close to the open gesture hand within the predefined rule, and in response to the click command, generating image data with a character object of the virtual user interface object based on the relative position.
The present invention discloses an electronic device of an interaction system for virtual user interface interaction based on gesture recognition. The electronic device comprises a processing device for executing a program, and a memory device coupled to the processing device for storing the program; wherein the program instructs the processing device to perform the following steps: detecting two hands in a plurality of images, recognizing each hand's gesture when the two hands are detected, projecting a virtual user interface object on an open gesture hand when one hand is recognized with a point gesture and the other hand is recognized with an open gesture, tracking an index fingertip of the point gesture hand to obtain a relative position of the index fingertip of the point gesture hand and the open gesture hand, determining whether the index fingertip of the point gesture hand is close to the open gesture hand within a predefined rule, interpreting a movement of the index fingertip of the point gesture hand as a click command when the index fingertip of the point gesture hand is close to the open gesture hand within the predefined rule, and in response to the click command, generating image data with a character object of the virtual user interface object based on the relative position.
These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
Reference is made to the interaction process 20, which comprises the following steps (a sketch of the process is given after the step list):
Step 201: Detect two hands in a plurality of images.
Step 202: Recognize each hand's gesture when the two hands are detected.
Step 203: Project a virtual user interface object on an open gesture hand when one hand is recognized with a point gesture and the other hand is recognized with an open gesture.
Step 204: Track an index fingertip of the point gesture hand, for obtaining a relative position of the index fingertip of the point gesture hand and the open gesture hand.
Step 205: Determine whether the index fingertip of the point gesture hand is close to the open gesture hand within a predefined rule.
Step 206: Interpret a movement of the index fingertip of the point gesture hand as a click command when the index fingertip of the point gesture hand is close to the open gesture hand within the predefined rule.
Step 207: In response to the click command, generate image data with a character object of the virtual user interface object based on the relative position.
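The control flow of Steps 201-207 can be summarized as in the minimal sketch below. Every callable passed into the function (detect_hands, recognize_gesture, track_index_fingertip, distance_to_palm, character_at, render) is a hypothetical placeholder for illustration only and is not defined by the present disclosure; detect_hands is assumed to return a list of two hand descriptors, and the distance threshold is likewise an assumed value.

```python
CLICK_DISTANCE_MM = 20.0  # assumed proximity threshold for the "predefined rule"

def interaction_process(images, detect_hands, recognize_gesture,
                        track_index_fingertip, distance_to_palm,
                        character_at, render):
    """Control-flow sketch of Steps 201-207; all callables are placeholders."""
    for image in images:
        hands = detect_hands(image)                          # Step 201
        if len(hands) != 2:
            continue                                         # keep detecting
        gestures = [recognize_gesture(h) for h in hands]     # Step 202
        if "point" not in gestures or "open" not in gestures:
            continue
        point_hand = hands[gestures.index("point")]
        open_hand = hands[gestures.index("open")]
        # Step 203: the virtual UI object is projected onto the open gesture
        # hand (drawn by the renderer at the open hand's location).
        fingertip = track_index_fingertip(point_hand)        # Step 204
        rel_pos = (fingertip[0] - open_hand["center"][0],    # relative position
                   fingertip[1] - open_hand["center"][1])    # to the open hand
        # Step 205: predefined rule, assumed here to be a distance threshold.
        if distance_to_palm(fingertip, open_hand) < CLICK_DISTANCE_MM:
            # Step 206: interpret the fingertip movement as a click command.
            # Step 207: generate image data with the selected character object.
            yield render(image, open_hand, character_at(rel_pos), rel_pos)
```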
According to the interaction process 20, the electronic device 10 detects two hands in the images IMG0-IMGn.
In detail, the hand detection is realized by the following steps (a sketch of these steps is given after the list):
1. Extract depth pixels of the images IMG0-IMGn with a working distance;
2. Use Random Decision Forest (RDF) to classify all above pixels into possible left hand group and right hand group;
3. Take a set of depth context surrounding the examined pixel as input reference with RDF, and output the possible group of the examined pixel;
4. Match left/right hand groups into connected objects in a frame;
5. Calculate left/right hand contour radius;
6. Crop left/right hand silhouette; and
7. Extract left/right hand depth information according to the left/right hand silhouette.
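A minimal sketch of steps 1 to 7 above is given below. It assumes a per-pixel Random Decision Forest has already been trained (scikit-learn's RandomForestClassifier stands in for the RDF), that the "depth context" of step 3 is a set of depth differences at fixed pixel offsets, and that the working distance, offsets, and class labels shown are illustrative values rather than values taken from the disclosure.

```python
import numpy as np
from scipy import ndimage
from sklearn.ensemble import RandomForestClassifier  # stands in for the RDF

# Illustrative assumptions (not from the disclosure):
WORKING_DISTANCE_MM = (200.0, 1000.0)            # depth range kept in step 1
OFFSETS = [(-8, 0), (8, 0), (0, -8), (0, 8)]     # "depth context" sample offsets
BACKGROUND, LEFT, RIGHT = 0, 1, 2                # per-pixel classes

def depth_context_features(depth, ys, xs):
    """Step 3: depth differences around each examined pixel as RDF input."""
    h, w = depth.shape
    feats = []
    for dy, dx in OFFSETS:
        ny = np.clip(ys + dy, 0, h - 1)
        nx = np.clip(xs + dx, 0, w - 1)
        feats.append(depth[ny, nx] - depth[ys, xs])
    return np.stack(feats, axis=1)

def detect_hands(depth, rdf: RandomForestClassifier):
    """depth: 2-D array of depth values in millimetres (float)."""
    # Step 1: keep only pixels inside the working distance.
    near, far = WORKING_DISTANCE_MM
    ys, xs = np.nonzero((depth > near) & (depth < far))
    if ys.size == 0:
        return {}
    # Steps 2-3: classify every candidate pixel into left/right hand groups.
    labels = rdf.predict(depth_context_features(depth, ys, xs))
    hands = {}
    for side, cls in (("left", LEFT), ("right", RIGHT)):
        group = np.zeros(depth.shape, bool)
        group[ys[labels == cls], xs[labels == cls]] = True
        # Step 4: keep the largest connected object of the group.
        blobs, n = ndimage.label(group)
        if n == 0:
            continue
        largest = np.argmax(ndimage.sum(group, blobs, range(1, n + 1))) + 1
        silhouette = blobs == largest            # step 6: hand silhouette mask
        cy, cx = ndimage.center_of_mass(silhouette)
        radius = np.sqrt(silhouette.sum() / np.pi)  # step 5: equal-area radius
        hands[side] = {
            "center": (cy, cx),
            "radius": radius,
            "silhouette": silhouette,
            "depth": depth[silhouette],          # step 7: hand depth values
        }
    return hands
```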
If no hand or only one hand is detected, as shown in the image IMG0, the electronic device 10 keeps detecting hands in the subsequent images. When the two hands are detected, the gesture recognition is realized by the following steps (a sketch is given after the list):
1. Extract hand subframes;
2. Use RDF to classify subframe pixels into gesture groups; and
3. Set the majority group as gesture result, and use the secondary group as gesture reference.
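A corresponding sketch of the gesture-recognition steps follows, reusing the hand dictionary produced by the detection sketch above. The gesture classes, the subframe margin, and the deliberately oversimplified per-pixel feature (the raw depth value) are assumptions for illustration; the disclosure itself only states that an RDF classifies the subframe pixels into gesture groups.

```python
import numpy as np
from collections import Counter

GESTURES = ("open", "point", "fist", "other")   # assumed gesture classes

def recognize_gesture(depth, hand, gesture_rdf, margin=10):
    """Steps 1-3 above: crop the hand subframe, classify its pixels into
    gesture groups with an RDF, and take the majority group as the result."""
    ys, xs = np.nonzero(hand["silhouette"])
    top, left = max(ys.min() - margin, 0), max(xs.min() - margin, 0)
    sub = depth[top:ys.max() + margin, left:xs.max() + margin]   # step 1
    pixels = sub[sub > 0].reshape(-1, 1)   # trivially simple per-pixel feature
    if pixels.size == 0:
        return None, None
    votes = Counter(gesture_rdf.predict(pixels))                 # step 2
    ranked = [int(cls) for cls, _ in votes.most_common()]
    result = GESTURES[ranked[0]]                                 # step 3: majority
    reference = GESTURES[ranked[1]] if len(ranked) > 1 else None # secondary group
    return result, reference
```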
Note that the gesture recognition may be performed by machine learning, which is trained with data sets of images; this is well known to those skilled in the art, so the details are omitted herein. Moreover, after the gestures of the two hands are recognized, the electronic device 10 determines whether one hand (e.g. the right hand) is in a point gesture and the other hand (e.g. the left hand) is in an open gesture. If the point gesture hand and the open gesture hand are detected, as shown in the image IMGx, the electronic device 10 tracks the index fingertip of the point gesture hand and determines whether it is close to the open gesture hand within the predefined rule.
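The predefined rule is not spelled out here beyond the fingertip being close to the open gesture hand; one plausible reading is a depth-difference threshold, as in the sketch below. The threshold value and the field names are assumptions, not part of the disclosure.

```python
import numpy as np

CLICK_THRESHOLD_MM = 20.0   # assumed proximity threshold

def within_predefined_rule(fingertip_depth_mm, open_hand,
                           threshold=CLICK_THRESHOLD_MM):
    """One plausible reading of the predefined rule: the index fingertip counts
    as close to the open gesture hand when its depth is within a small
    threshold of the open hand's (median) surface depth."""
    palm_depth = float(np.median(open_hand["depth"]))  # from the detection step
    return abs(fingertip_depth_mm - palm_depth) < threshold
```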
Considering the open gesture hand as an unmovable object, the electronic device 10 generates image data with a virtual user interface (UI) object projected at the location of the open gesture hand, which is displayed by the display device 2000 for the user, as shown in the images IMGm′ and IMGn′.
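One way to generate the image data with a character object based on the relative position is to anchor a character grid to the open gesture hand and look up the cell under the fingertip. The keyboard layout and key size below are purely illustrative assumptions; the disclosure only requires that a character object of the virtual UI object be selected from the relative position.

```python
# Illustrative assumptions: the virtual UI object is a simple character grid
# anchored at the open gesture hand; the key size is arbitrary.
KEY_SIZE_MM = 25.0
LAYOUT = ["1234567890",
          "qwertyuiop",
          "asdfghjkl",
          "zxcvbnm"]

def character_at(rel_pos):
    """Map the fingertip position relative to the open hand centre (x rightward,
    y downward, in millimetres) to a character object, or None if outside."""
    dx, dy = rel_pos
    col = int(dx // KEY_SIZE_MM + len(LAYOUT[0]) / 2)
    row = int(dy // KEY_SIZE_MM + len(LAYOUT) / 2)
    if 0 <= row < len(LAYOUT) and 0 <= col < len(LAYOUT[row]):
        return LAYOUT[row][col]
    return None
```

With these assumed values, for example, a fingertip 10 mm to the left of and 5 mm below the hand centre maps to the character 'g'.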
Note that, if the electronic device 10 detects that the open gesture hand moves, the virtual UI object projection is cancelled. That is, the electronic device 10 generates image data without the virtual UI object based on the image received from the image sensor 1000, for the display device 2000 to display.
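The cancellation condition described above can be expressed as a simple displacement check between the hand position recorded at projection time and the current frame; the tolerance below is an assumed value.

```python
MOVE_TOLERANCE_MM = 15.0   # assumed tolerance before the projection is cancelled

def should_cancel_projection(prev_center, curr_center,
                             tolerance=MOVE_TOLERANCE_MM):
    """Cancel the virtual UI projection once the open gesture hand has moved
    farther than the tolerance from where the UI object was projected."""
    dx = curr_center[0] - prev_center[0]
    dy = curr_center[1] - prev_center[1]
    return (dx * dx + dy * dy) ** 0.5 > tolerance
```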
The abovementioned steps of the processes, including the suggested steps, can be realized by means of hardware, firmware (known as a combination of a hardware device and computer instructions and data that reside as read-only software on the hardware device), or an electronic system. Examples of hardware can include analog, digital and mixed circuits known as a microcircuit, microchip, or silicon chip. Examples of the electronic system can include a system on chip (SoC), a system in package (SiP), a computer on module (COM), and the electronic device 10.
To sum up, the present invention proposes an interaction process for the user to interact with the interaction system via the virtual UI object projection, which is realized by the gesture recognition and gesture movement detection.
Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.