Claims
- 1. A method for recognizing one or more objects, the method comprising:
obtaining a depth distance between each region in a plurality of regions on a surface of an object in the one or more objects and a reference; and
determining an identification feature of at least a portion of the object using the depth distance between each of the plurality of regions and the reference.
- 2. The method of claim 1, wherein the step of obtaining a depth distance includes using a sensor system that is configured to measure range information for an object being viewed by the sensor system.
- 3. The method of claim 1, wherein the step of obtaining a depth distance between each region in a plurality of regions on a surface of the object and a reference includes obtaining a depth image for a scene that includes the object.
- 4. The method of claim 3, further comprising obtaining a depth image of the scene without the object, and then obtaining a difference image of the object using the depth image of the scene without the object.
- 5. The method of claim 3, further comprising isolating a depth image of the object from the depth image of the scene using a prior depth expectation of the object.
- 6. The method of claim 3, further comprising isolating a depth image of the object from the depth image of the scene using a segmentation algorithm.
- 7. The method of claim 4, wherein determining an identification feature of at least a portion of the object using the depth distance includes using a depth image that isolates the object from the rest of the scene.
- 8. The method of claim 1, wherein the step of determining an identification feature of at least a portion of the object includes classifying the object as belonging to one or more categories in a plurality of designated categories.
- 9. The method of claim 8, wherein the step of classifying the object as belonging to one or more categories includes determining that the object is a person.
- 10. The method of claim 8, wherein the step of classifying the object as belonging to one or more categories includes determining that the object is a child or infant.
- 11. The method of claim 8, wherein the step of classifying the object as belonging to one or more categories includes determining that the object is a pet.
- 12. The method of claim 8, wherein the step of classifying the object as belonging to one or more categories includes determining that the object belongs to an undetermined category.
- 13. The method of claim 8, wherein the step of classifying the object as belonging to one or more categories includes determining whether the object is a child seat.
- 14. The method of claim 1, wherein obtaining a depth distance includes obtaining the depth distance between each region in a plurality of regions on a surface of multiple objects, and wherein determining an identification feature of at least a portion of the object includes determining the identification feature of at least a portion of each of the multiple objects.
- 15. The method of claim 1, wherein the step of determining an identification feature of at least a portion of the object includes identifying the portion of the object from a remainder of the object.
- 16. The method of claim 1, wherein the step of determining an identification feature of at least a portion of the object includes detecting that the object is a person.
- 17. The method of claim 1, further comprising detecting that a person is in a scene that is being viewed by a sensor for obtaining the depth distances, and wherein the step of obtaining a depth distance includes using the sensor to obtain the depth distance of the plurality of regions on at least a portion of the person.
- 18. The method of claim 1, wherein obtaining the depth distance between each region in a plurality of regions includes obtaining at least some of the depth distances from a face of a person.
- 19. The method of claim 1, wherein obtaining the depth distance between each region in a plurality of regions includes obtaining at least some of the depth distances from a head of a person.
- 20. The method of claim 1, wherein the step of determining an identification feature of at least a portion of the object includes identifying that a face is in a scene that is being viewed by a sensor for measuring the depth distances.
- 21. The method of claim 1, wherein the step of determining an identification feature of at least a portion of the object includes recognizing at least a portion of a face of a person in order to be able to uniquely identify the person.
- 22. The method of claim 21, wherein the step of determining an identification feature of at least a portion of the object includes determining an identity of the person by recognizing at least a portion of the face of the person.
- 23. The method of claim 21, wherein recognizing at least a portion of a face of a person includes classifying the object as the person, and detecting the face from other body parts of the person.
- 24. The method of claim 1, wherein the step of obtaining a depth distance between each region in a plurality of regions includes passively measuring the depth distances using a depth perceptive sensor.
- 25. The method of claim 24, wherein the step of determining an identification feature of at least a portion of the object includes authenticating a person.
- 26. The method of claim 25, further comprising authorizing the person to perform an action after authenticating the person.
- 27. The method of claim 25, further comprising permitting the person to enter a secured area after authenticating the person.
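The depth-acquisition and feature-determination steps of claims 1, 4, and 7 can be sketched as follows. This is a minimal illustration assuming dense depth images stored as NumPy arrays; the function names, the grid-based choice of regions, and the change threshold are assumptions for illustration, not taken from the claims.

```python
import numpy as np

def difference_image(scene_depth, background_depth, threshold=0.05):
    """Isolate the object by subtracting a depth image of the scene
    captured without the object (claim 4); only regions whose depth
    changed by more than `threshold` are retained."""
    mask = np.abs(scene_depth - background_depth) > threshold
    return np.where(mask, scene_depth, 0.0)

def region_depth_features(object_depth, grid=(4, 4)):
    """Divide the isolated object into a grid of regions and take each
    region's mean depth distance from the sensor reference plane as an
    identification feature (claim 1)."""
    h, w = object_depth.shape
    rh, rw = h // grid[0], w // grid[1]
    return np.array([object_depth[i * rh:(i + 1) * rh,
                                  j * rw:(j + 1) * rw].mean()
                     for i in range(grid[0]) for j in range(grid[1])])
```

The resulting feature vector would then feed whatever classifier or recognizer implements the determining step of claim 1.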
- 28. A method for classifying one or more objects that are present in a monitored area, the method comprising:
measuring a depth distance between each region in a plurality of regions on a surface of each of the one or more objects and a reference;
identifying one or more features of each of the one or more objects using the depth distance of one or more regions in the plurality of regions; and
classifying each of the one or more objects individually based on the one or more identified features.
- 29. The method of claim 28, wherein measuring a depth distance between each region in a plurality of regions on a surface of the one or more objects and a reference includes capturing a depth image of each of the one or more objects.
- 30. The method of claim 28, wherein classifying each of the one or more objects based on the one or more identified features includes classifying each of the one or more objects as at least one of a person, an infant, a pet, or a child seat.
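Claims 28 through 30 describe classifying each detected object individually into designated categories. A minimal sketch, assuming each object has already been isolated and reduced to a single depth-derived height estimate in meters; the thresholds and category rules below are illustrative placeholders for a trained classifier, not taken from the claims:

```python
def classify_object(height_m):
    """Assign one of the designated categories of claim 30 from a
    depth-derived height estimate. The thresholds are illustrative
    stand-ins for a classifier trained on depth features."""
    if height_m >= 1.2:
        return "person"
    if height_m >= 0.6:
        return "child seat"
    if height_m >= 0.3:
        return "pet"
    return "infant"

def classify_objects(heights_by_object):
    """Classify each of the one or more objects individually (claim 28)."""
    return {name: classify_object(h) for name, h in heights_by_object.items()}
```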
- 31. A method for controlling access to a secured area, the method comprising:
detecting a person positioned within a scene associated with the secured area;
capturing one or more depth images of at least a portion of the person; and
determining whether to grant the person access to the secured area based on the one or more depth images.
- 32. The method of claim 31, wherein the steps of detecting a person and capturing one or more depth images are performed passively, such that no action other than the person's presence in the scene is necessary to perform the steps.
- 33. The method of claim 31, wherein the step of capturing one or more depth images includes capturing a depth image of at least a portion of the person's face.
- 34. The method of claim 31, wherein determining whether to grant the person access includes recognizing the person from the depth image.
- 35. The method of claim 34, wherein recognizing the person includes recognizing at least a portion of a face of the person.
- 36. The method of claim 34, wherein recognizing the person from the depth image includes determining an identity of the person.
- 37. The method of claim 34, wherein recognizing the person from the depth image includes recognizing that the person is of a class of people that is granted access to the secured area.
- 38. The method of claim 34, wherein recognizing the person from the depth image includes determining an identity of the person, and using the identity to perform the step of determining whether to grant the person access to the secured area.
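The access-control method of claim 31 and its dependents can be sketched as a small decision routine. Here `recognize` stands in for a hypothetical depth-based face recognizer that returns an identity string, or `None` when no match is found; that function and the authorization list are assumptions, not defined by the claims:

```python
def grant_access(depth_image, recognize, authorized_ids):
    """Decide whether to admit a detected person to the secured area:
    recognize the person from the captured depth image, then check the
    recovered identity against an authorization list."""
    identity = recognize(depth_image)
    if identity is None:
        return False  # unrecognized person: deny access
    return identity in authorized_ids
```

In the passive variant of claim 32, such a routine would run automatically whenever a person is detected in the scene, with no deliberate action required of the person.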
- 39. A method for recognizing an object, the method comprising:
obtaining a depth distance between each region in a plurality of regions on a surface of the object and a reference; and
using the depth distances to identify a set of features from the object that are sufficient to uniquely identify the object from a class that includes a plurality of members.
- 40. The method of claim 39, further comprising determining an identity of the object based on the depth distances between the regions on the surface.
- 41. The method of claim 39, wherein the object corresponds to a person, and wherein the step of using the depth distances to identify a set of features includes using the depth distances to recognize features on the person's body.
- 42. The method of claim 41, wherein the step of using the depth distances to identify a set of features includes using the depth distances to recognize one or more identifying features on the person's face.
- 43. The method of claim 42, further comprising matching the one or more identifying features of the person's face to one or more corresponding features of a known face associated with a person having a particular identity.
- 44. The method of claim 42, further comprising authenticating the person based on the one or more identifying features of the person's face.
- 45. The method of claim 42, further comprising authorizing the person to perform a given action based on the one or more identifying features of the person's face.
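Claims 39 through 45 turn the depth-derived feature set into a unique identification by matching it against known members of a class. A minimal nearest-neighbor sketch, assuming faces are summarized as fixed-length NumPy feature vectors; the gallery structure and the distance threshold are assumptions for illustration:

```python
import numpy as np

def identify(face_features, gallery, max_distance=1.0):
    """Return the identity whose stored feature template is nearest to
    the observed depth-derived features, or None when no template is
    close enough to uniquely identify the person (claims 39-43)."""
    best_id, best_dist = None, float("inf")
    for identity, template in gallery.items():
        dist = float(np.linalg.norm(face_features - template))
        if dist < best_dist:
            best_id, best_dist = identity, dist
    return best_id if best_dist <= max_distance else None
```

Authentication (claim 44) then reduces to checking that `identify` returns the expected identity for the captured face.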
RELATED APPLICATION AND PRIORITY INFORMATION
[0001] This application claims the benefit of priority to:
[0002] Provisional U.S. Patent Application No. 60/360,137, entitled “Passive, Low-Impact, Keyless Entry System,” naming James Spare as inventor, filed on Feb. 26, 2002;
[0003] Provisional U.S. Patent Application No. 60/382,550, entitled “Detection of faces from Depth Images,” naming Salih Burak Gokturk and Abbas Rafii as inventors, filed on May 22, 2002; and
[0004] Provisional U.S. Patent Application No. 60/424,662, entitled “Provisional: Methods For Occupant Classification,” naming Salih Burak Gokturk as inventor, filed on Nov. 7, 2002.
[0005] All of the aforementioned priority applications are hereby incorporated by reference in their entirety for all purposes.
Provisional Applications (3)
| Number | Date | Country |
| --- | --- | --- |
| 60360137 | Feb 2002 | US |
| 60382550 | May 2002 | US |
| 60424662 | Nov 2002 | US |