Claims
- 1. A method of gesture recognition, comprising the steps of:
imaging a gesture-making target; deriving the start position of the target, the end position of the target, and the velocity between the start and end positions; comparing the velocity of the target to a threshold value; and identifying the gesture as a static gesture if the velocity is below the threshold value, otherwise, identifying the gesture as a dynamic gesture.
- 2. The method of claim 1, wherein the target is a human hand.
- 3. The method of claim 2, further including the step of generating a bounding box around the hand.
- 4. The method of claim 1, further including the step of using an operator to find the edges of the target.
- 5. The method of claim 1, further including the steps of:
receiving a file of recognized gestures along with their vector descriptions; and comparing static gestures to the vector descriptions.
- 6. The method of claim 1, further including the step of treating a dynamic gesture as one or more one-dimensional oscillations.
- 7. The method of claim 6, further including the step of treating a circular motion as a combination of repeating motions in two dimensions having the same magnitude and frequency of oscillation.
- 8. The method of claim 6, further including the step of deriving complex dynamic gestures by varying phase relationships.
- 9. The method of claim 6, further including the step of deriving a multi-gesture lexicon based upon clockwise and counter-clockwise large and small circles and one-dimensional lines.
- 10. The method of claim 6, further including the step of comparing the next position and velocity of each gesture to one or more predictor bins to determine a gesture's future position and velocity.
- 11. The method of claim 10, further including the use of a linear-with-offset-component model to discriminate among simple dynamic gestures.
- 12. The method of claim 10, further including the use of a velocity damping model to discriminate among non-circular dynamic gestures.
- 13. A gesture-controlled interface for self-service machines and other applications, comprising:
a sensor module for visually analyzing a gesture made by a human or machine, and outputting image data including position and velocity information associated with the gesture; an identification module operative to identify the gesture based upon the image data output by the sensor module; and a transformation module operative to generate a command based upon the gesture identified by the identification module.
- 14. The interface of claim 13, further including a system response module operative to apply the command from the transformation module to a device to be controlled.
- 15. The interface of claim 14, wherein the device is a virtual-reality simulator.
- 16. The interface of claim 14, wherein the device is a self-service machine.
- 17. The interface of claim 14, wherein the device forms part of a robot.
- 18. The interface of claim 14, wherein the device forms part of a commercial appliance.
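The sketches below illustrate several of the claimed techniques. First, claim 1's velocity-threshold test for distinguishing static from dynamic gestures can be rendered in a few lines of Python; this is a minimal sketch, and the centroid-tracking inputs and the threshold value are assumptions, not taken from the claims.

```python
import numpy as np

def classify_gesture(positions, timestamps, velocity_threshold=5.0):
    """Classify a tracked target's gesture as static or dynamic (claim 1).

    positions: sequence of (x, y) target centroids, one per image frame
    timestamps: capture time of each frame, in seconds
    """
    start = np.asarray(positions[0], dtype=float)
    end = np.asarray(positions[-1], dtype=float)
    elapsed = timestamps[-1] - timestamps[0]
    # Velocity between the start and end positions of the target.
    velocity = np.linalg.norm(end - start) / elapsed
    # Below the threshold the gesture is static; otherwise it is dynamic.
    return "static" if velocity < velocity_threshold else "dynamic"

# Example: a hand drifting about two pixels per second reads as static.
print(classify_gesture([(100, 100), (101, 100), (102, 101)], [0.0, 0.5, 1.0]))
```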
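Claims 6 through 9 treat a dynamic gesture as one or more one-dimensional oscillations. Assuming sinusoidal motion (the parameterization below is illustrative, not the patent's formulation): two oscillations of equal magnitude and frequency trace a circle at a 90-degree phase offset (claim 7), collapse to a one-dimensional line at zero offset, and yield other complex gestures at intermediate phases (claim 8); amplitude and direction of rotation select among the large and small, clockwise and counter-clockwise circles of the claim 9 lexicon.

```python
import numpy as np

def oscillatory_gesture(t, amplitude=1.0, freq_hz=0.5, phase=np.pi / 2):
    """x and y as two 1-D oscillations of equal magnitude and frequency.

    phase = pi/2 -> circle (claim 7); phase = 0 -> 1-D line;
    negating freq_hz flips clockwise/counter-clockwise (claim 9).
    """
    x = amplitude * np.cos(2 * np.pi * freq_hz * t)
    y = amplitude * np.cos(2 * np.pi * freq_hz * t - phase)
    return x, y

t = np.linspace(0.0, 2.0, 200)  # one full 0.5 Hz period
big_circle = oscillatory_gesture(t, amplitude=2.0)
small_line = oscillatory_gesture(t, amplitude=0.5, phase=0.0)
```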
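Claims 10 through 12 compare each gesture's position and velocity against a bank of predictor bins, selecting the model that best forecasts the next state. In the sketch below, the linear-with-offset and velocity-damping equations are plausible stand-ins inferred from the claim language, not the patent's actual formulations, and the bin parameters are hypothetical.

```python
def linear_with_offset(x, v, k, c, dt):
    # Claim 11: restoring force linear in position plus a constant offset,
    # used to discriminate among simple dynamic gestures.
    a = -k * x + c
    return x + v * dt, v + a * dt

def velocity_damping(x, v, k, d, dt):
    # Claim 12: an added damping term in velocity, for non-circular gestures.
    a = -k * x - d * v
    return x + v * dt, v + a * dt

def best_bin(x, v, x_next, v_next, bins, dt):
    """Return the predictor bin whose forecast of the next position and
    velocity best matches the observation (claim 10)."""
    def residual(predict):
        px, pv = predict(x, v, dt)
        return (px - x_next) ** 2 + (pv - v_next) ** 2
    return min(bins, key=lambda name: residual(bins[name]))

# Hypothetical bins; each corresponds to one candidate gesture model.
bins = {
    "circle": lambda x, v, dt: linear_with_offset(x, v, k=25.0, c=0.0, dt=dt),
    "damped_line": lambda x, v, dt: velocity_damping(x, v, k=25.0, d=2.0, dt=dt),
}
```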
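Finally, the modular architecture of claims 13 and 14 maps onto four cooperating components. This is a structural sketch only: the class and method names mirror the claim language, but the gesture labels, command table, and device interface are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ImageData:
    position: tuple  # (x, y) of the tracked target
    velocity: tuple  # (vx, vy) between successive frames

class SensorModule:
    """Visually analyzes a gesture; outputs position and velocity data."""
    def analyze(self, prev_pos, cur_pos, dt):
        vx = (cur_pos[0] - prev_pos[0]) / dt
        vy = (cur_pos[1] - prev_pos[1]) / dt
        return ImageData(position=cur_pos, velocity=(vx, vy))

class IdentificationModule:
    """Identifies the gesture from the sensor module's image data."""
    def identify(self, data, threshold=5.0):
        speed = (data.velocity[0] ** 2 + data.velocity[1] ** 2) ** 0.5
        return "static_point" if speed < threshold else "dynamic_line"

class TransformationModule:
    """Generates a command from the identified gesture."""
    COMMANDS = {"static_point": "HOLD", "dynamic_line": "MOVE"}
    def to_command(self, gesture):
        return self.COMMANDS.get(gesture, "NO_OP")

class SystemResponseModule:
    """Claim 14: applies the command to the device to be controlled
    (simulator, self-service machine, robot, or commercial appliance)."""
    def apply(self, command, device):
        device.execute(command)
```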
REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority of U.S. provisional patent application Serial No. 60/096,126, filed Aug. 10, 1998, the entire contents of which are incorporated herein by reference.
Provisional Applications (1)

| Number | Date | Country |
| --- | --- | --- |
| 60/096,126 | Aug. 10, 1998 | US |