Claims
- 1. A method for enabling a person to interact with an electronic device, the method comprising:
obtaining position information for a plurality of discrete regions on a body part of the person, wherein the position information indicates a depth of each discrete region on the body part relative to a reference; and
using the position information to classify a gesture formed by the body part as an input for interacting with the electronic device.
- 2. The method of claim 1, wherein the step of using the position information to classify a gesture formed by the body part is performed substantially without using light intensity reflected off of the body part.
- 3. The method of claim 1, further comprising segmenting the body part from a remainder of a scene in a region where a sensor for obtaining the position information is operable.
- 4. The method of claim 3, wherein segmenting the body part from a remainder of a scene includes capturing an image of the scene on a panel of pixels, wherein each pixel corresponds to a discrete region of the scene, and wherein segmenting the body part from a remainder of a scene includes using the position information to cluster pixels representing portions of the body part together.
- 5. The method of claim 4, wherein using the position information to cluster pixels representing portions of the body part together includes clustering pixels together based at least partially on a depth of the discrete region from the reference.
- 6. The method of claim 1, further comprising identifying a shape of the body part.
- 7. The method of claim 6, wherein the step of identifying a shape of the body part includes:
capturing an image of a scene where a sensor for obtaining the position information is operable, the image being captured on a panel of pixels so that each pixel corresponds to a discrete region of the scene; and
performing a statistical analysis on the image in order to identify the shape of the body part.
- 8. The method of claim 7, wherein performing a statistical analysis includes creating a histogram using coordinates that identify pixels in the panel.
- 9. The method of claim 8, wherein creating the histogram includes using coordinates that identify pixels on a boundary of the image captured on the panel.
- 10. The method of claim 7, further comprising segmenting the body part from the scene in the image, and wherein performing a statistical analysis on the image includes performing the statistical analysis on an image of the body part segmented from the scene.
- 11. The method of claim 6, further comprising identifying a pose of the body part.
- 12. The method of claim 11, wherein identifying the pose of the body part includes identifying an orientation of the body part.
- 13. The method of claim 11, further comprising classifying the gesture based on the shape and the pose of the body part.
- 14. The method of claim 1, wherein using the position information to classify a gesture formed by the body part includes using the position information to classify a series of body postures that occur at multiple instances during a duration, wherein the combination of body postures forms the gesture.
- 15. The method of claim 1, wherein the body part includes the person's hands, and wherein the step of using the position information to classify a gesture includes using the position information to classify the gesture created by the person's hands.
- 16. The method of claim 1, wherein the step of using the position information to classify a gesture formed by the body part includes using the position information to classify the gesture formed by at least one of an arm, a leg, a shoulder, a foot, a finger, and a head.
- 17. The method of claim 1, wherein the step of using the position information to classify a gesture formed by the body part includes using the position information to classify the gesture formed by at least one of an eyelid, an eyeball, and a mouth.
- 18. The method of claim 1, further comprising classifying the gesture as the input for the electronic device selected from a set consisting of a portable computer, a television system, an audio system, a game console, a mobile phone, a robot, and an appliance.
- 19. The method of claim 1, wherein the step of obtaining position information for a plurality of discrete regions on a body part of the person includes obtaining the position information at a plurality of instances during a given duration of time.
- 20. The method of claim 19, wherein using the position information to classify a gesture includes using the position information obtained at each one of the plurality of instances to classify a dynamic gesture formed by the body part being moved during the given duration of time.
- 21. The method of claim 20, further comprising detecting when the dynamic gesture starts.
- 22. The method of claim 21, further comprising detecting when the dynamic gesture ends, wherein the given duration corresponds to a duration between when the dynamic gesture is detected as starting and stopping, and wherein the method further comprises classifying the dynamic gesture using the position information obtained during the given duration.
- 23. The method of claim 21, wherein detecting when the dynamic gesture starts includes detecting an occurrence of a first delimiter action designated to signify when the dynamic gesture starts.
- 24. The method of claim 21, wherein detecting when the dynamic gesture starts includes detecting an occurrence of a first delimiter action designated to signify when the dynamic gesture starts, and detecting when the dynamic gesture ends includes detecting an occurrence of a second delimiter action designated to signify when the dynamic gesture ends.
- 25. The method of claim 24, wherein at least one of the first delimiter action and the second delimiter action corresponds to an action that creates a designated audible.
- 26. The method of claim 25, wherein at least one of the first delimiter action and the second delimiter action corresponds to a formation of a specific posture of the body part of the person.
- 27. The method of claim 25, wherein at least one of the first delimiter action and the second delimiter action corresponds to a formation of a specific posture of another body part of the person.
- 28. The method of claim 1, further comprising indicating how the gesture is classified to the person before classifying the gesture as the input.
- 29. The method of claim 28, further comprising receiving confirmation from the person that the input is what the person intended to enter.
- 30. The method of claim 29, further comprising detecting the confirmation.
- 31. The method of claim 30, wherein detecting the confirmation includes detecting one of another gesture, an audible created by the person, or a manual entry by the person into the electronic device.
- 32. The method of claim 1, wherein at least one of the steps of obtaining position information and using the position information is provided as instructions on a computer-readable medium, wherein the instructions are executable by one or more processors to perform the at least one of the steps of obtaining position information and using the position information.
- 33. A system for enabling a person to interact with an electronic device, the system comprising:
means for obtaining position information for a plurality of discrete regions on a body part of the person, wherein the position information indicates a depth of each discrete region on the body part relative to a reference; and
means for using the position information to classify a gesture formed by the body part as an input for interacting with the electronic device.
- 34. A system for enabling a person to interact with an electronic device, the system comprising:
a sensor system for obtaining position information for a plurality of discrete regions on a body part of the person, wherein the position information indicates a depth of each discrete region on the body part relative to a reference; and
a gesture recognition system that uses the position information to classify a gesture formed by the body part as an input for interacting with the electronic device.
- 35. The system of claim 34, wherein the gesture recognition system includes a segementation module that is configured to segment an image of the body part from an environment that the sensor system views.
- 36. The system of claim 35, wherein the gesture recognition system includes a body posture module to identify a posture of the body part.
- 37. The system of claim 35, wherein the gesture recognition system includes a classification module that classifies the gesture as the input.
- 38. The system of claim 35, wherein the gesture recognition system includes a gesture representation module that can identify the gesture when the gesture is formed by the body part moving.
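Claims 3 through 5 recite segmenting the body part from the rest of the scene by clustering pixels on their depth from the reference. Below is a minimal sketch of one such depth-based clustering, assuming the position information arrives as a NumPy depth image and that the gesturing body part is the surface nearest the sensor; the function name, the 0.15 m band, and the toy data are illustrative and are not taken from the patent.

```python
import numpy as np

def segment_nearest_region(depth_map, band=0.15, invalid=0.0):
    """Cluster pixels on depth: keep those within `band` metres of the
    nearest valid reading, on the assumption that the gesturing body
    part is the object closest to the sensor."""
    valid = depth_map > invalid                   # discard pixels with no depth reading
    nearest = depth_map[valid].min()              # depth of the closest surface
    return valid & (depth_map <= nearest + band)  # boolean mask: True = body part

# Toy 5x5 depth map (metres): a "hand" at 0.5 m in front of a wall at 2 m.
depth = np.full((5, 5), 2.0)
depth[1:4, 1:4] = 0.5
print(segment_nearest_region(depth).astype(int))
```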
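Claims 7 through 10 recite identifying the shape of the segmented body part by a statistical analysis built from histograms of pixel coordinates. One plausible instance of such a descriptor is a histogram of distances between randomly sampled pixel pairs on the shape; the sketch below assumes that reading, and all names and parameters are illustrative.

```python
import numpy as np

def shape_histogram(mask, n_pairs=2000, n_bins=16, seed=0):
    """Normalised histogram of distances between randomly sampled pixel
    pairs on the segmented shape -- a coordinate-based statistical shape
    descriptor that is invariant to translation and rotation."""
    rng = np.random.default_rng(seed)
    ys, xs = np.nonzero(mask)                     # coordinates of shape pixels
    pts = np.column_stack((xs, ys)).astype(float)
    i = rng.integers(0, len(pts), size=n_pairs)   # random pixel pairs
    j = rng.integers(0, len(pts), size=n_pairs)
    d = np.linalg.norm(pts[i] - pts[j], axis=1)
    hist, _ = np.histogram(d, bins=n_bins, range=(0.0, d.max() + 1e-9))
    return hist / hist.sum()
```

Two shapes could then be compared by a histogram distance such as `np.abs(h1 - h2).sum()`; restricting `mask` to boundary pixels gives the variant recited in claim 9.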
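Claims 19 through 27 recite obtaining position information at multiple instances and bracketing a dynamic gesture between start and end delimiter actions. A small state machine along those lines is sketched below, assuming per-frame posture descriptors and externally supplied delimiter detectors; the class and method names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional

@dataclass
class GestureRecorder:
    """Accumulates per-frame postures between a start delimiter and an
    end delimiter, then hands the whole sequence to a classifier."""
    classify: Callable[[List], object]
    recording: bool = False
    frames: List = field(default_factory=list)

    def on_frame(self, posture, is_start: bool, is_end: bool) -> Optional[object]:
        if not self.recording and is_start:   # first delimiter: begin capture
            self.recording, self.frames = True, []
        elif self.recording and is_end:       # second delimiter: classify capture
            self.recording = False
            return self.classify(self.frames)
        elif self.recording:                  # in between: record the posture
            self.frames.append(posture)
        return None                           # no completed gesture yet
```

A caller would feed it one posture per frame, e.g. `recorder.on_frame(posture, is_start=start_detector(frame), is_end=end_detector(frame))`, and receives a classification only once the end delimiter is seen.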
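Claims 34 through 38 recite a sensor system feeding a gesture recognition system built from segmentation, body posture, gesture representation, and classification modules. The sketch below shows one way those modules might be wired together; the call signatures are assumptions, not the patented interfaces.

```python
class GestureRecognitionSystem:
    """Pipeline mirroring the modules recited in claims 34-38."""

    def __init__(self, segmentation, posture, representation, classifier):
        self.segmentation = segmentation      # isolates the body part from the scene
        self.posture = posture                # identifies shape and orientation per frame
        self.representation = representation  # accumulates postures into a gesture
        self.classifier = classifier          # maps a completed gesture to a device input

    def process(self, depth_frame):
        mask = self.segmentation(depth_frame)
        pose = self.posture(depth_frame, mask)
        gesture = self.representation(pose)   # returns None until a gesture completes
        return None if gesture is None else self.classifier(gesture)
```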
PRIORITY INFORMATION
[0001] This application claims benefit of priority to the following applications:
[0002] Provisional U.S. Patent Application No. 60/357,730, entitled “Natural Touch and Gesture Based Interaction for Electronic Systems,” filed Feb. 15, 2002, naming Jim Spare, Cyrus Bamji and Abbas Rafii as inventors;
[0003] Provisional U.S. Patent Application No. 60/394,068, entitled “Shape Representation and Recognition by Random Histograms,” filed Jul. 2, 2002, naming Salih Burak Gokturk as inventor; and
[0004] Provisional U.S. Patent Application No. 60/410,415, entitled “Gesture Recognition System with 3D Input,” filed on Sep. 13, 2002, naming Salih Burak Gokturk, Fahri Surucu, and Carlo Tomasi as inventors.
[0005] All of the aforementioned priority applications are hereby incorporated by reference in their respective entireties for all purposes.
Provisional Applications (3)
| Number | Date | Country |
| --- | --- | --- |
| 60/357,730 | Feb 2002 | US |
| 60/394,068 | Jul 2002 | US |
| 60/410,415 | Sep 2002 | US |