Claims
- 1. A method for tracking objects entering and exiting an environment, the method comprising:
obtaining image data for one or more fields of view covering one or more paths by way of which objects can enter or exit the environment;
identifying objects of interest represented in the image data;
for each of the objects, determining from the image data whether the object is entering or exiting the environment and generating a signature for the object based, at least in part, on the image data; and
after generating one or more entry signatures for objects determined to be entering the environment and at least one exit signature for objects determined to be exiting the environment, matching the at least one exit signature to the one or more entry signatures, such that each matched exit signature corresponds with one matched entry signature.
- 2. A method according to claim 1, wherein determining from the image data whether the object is entering or exiting the environment comprises tracking a position of the object over a plurality of image frames to determine a trajectory of the object.
- 3. A method according to claim 2, wherein determining from the image data whether the object is entering or exiting the environment is based, at least in part, on the position of the object when it leaves the one or more fields of view.
- 4. A method according to claim 1, wherein generating a signature for the object comprises assigning one or more confidence measures to features of the image data, each such confidence measure being based on the uniformity of an associated feature of the image data over a plurality of image frames.
- 5. A method according to claim 4, wherein each of the one or more confidence measures is representative of a variance of the associated feature of the image data.
- 6. A method according to claim 5, wherein generating a signature for the object comprises including one or more confidence measures as components of the signature for the object.
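Claims 4-6 describe confidence measures derived from how uniform a feature stays across frames, with the measures themselves included as signature components. A minimal sketch of that idea, assuming variance-based confidence and an array of per-frame feature vectors (all names and the particular variance-to-confidence mapping are illustrative, not taken from the patent):

```python
import numpy as np

def build_signature(feature_frames):
    """Build an object signature from per-frame feature measurements.

    feature_frames: (n_frames, n_features) array of features (e.g. height,
    shoulder width, color-histogram bins) observed while the object crossed
    the field of view.

    Returns the mean feature values concatenated with per-feature
    confidence measures: features that stay uniform across frames (low
    variance, per claim 5) get confidence near 1, noisy ones near 0.
    """
    frames = np.asarray(feature_frames, dtype=float)
    mean = frames.mean(axis=0)
    var = frames.var(axis=0)
    confidence = 1.0 / (1.0 + var)  # monotone map: low variance -> high confidence
    # Claim 6: the confidence measures are themselves components of the signature.
    return np.concatenate([mean, confidence])
```

A downstream matcher can then down-weight signature components whose recorded confidence is low.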
- 7. A method according to claim 1, wherein matching the at least one exit signature to the one or more entry signatures comprises, after generating each of the at least one exit signatures for objects determined to be exiting the environment, matching the exit signature to the one or more entry signatures to identify a corresponding entry signature that provides a best match to the exit signature.
- 8. A method according to claim 7, wherein matching the exit signature to the one or more entry signatures comprises removing the exit signature and the corresponding entry signature from any subsequent matching.
- 9. A method according to claim 7 comprising, after identifying the corresponding entry signature, checking a plurality of previously matched exit signatures and corresponding previously matched entry signatures to determine whether any of the previously matched entry signatures would be a better match to the exit signature than the corresponding entry signature.
- 10. A method according to claim 7 comprising, after identifying the corresponding entry signature, checking a plurality of previously matched exit signatures and corresponding previously matched entry signatures to determine whether rematching one or more of the previously matched exit signatures would provide better overall matching between a plurality of available exit signatures and the one or more entry signatures.
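Claims 7-8 describe a greedy scheme: when an exit signature becomes available, find its best-matching entry signature and remove the pair from subsequent matching. A minimal sketch of that scheme, assuming Euclidean distance as the match metric (the function and variable names are illustrative, not the patented implementation):

```python
import numpy as np

def match_exit(exit_sig, entry_pool):
    """Greedy matching in the style of claims 7-8.

    entry_pool: dict mapping entry id -> signature vector for entries that
    have not yet been matched.

    Returns the id of the entry signature closest to exit_sig and removes
    that entry from the pool (claim 8: matched pairs are excluded from any
    subsequent matching).
    """
    best_id = min(entry_pool,
                  key=lambda i: np.linalg.norm(exit_sig - entry_pool[i]))
    del entry_pool[best_id]
    return best_id
```

Claims 9-10 extend this by revisiting already-matched pairs to see whether a rematch improves the result, which a fuller implementation would layer on top of this pool.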
- 11. A method according to claim 1, wherein matching the at least one exit signature to the one or more entry signatures comprises matching a plurality of exit signatures to the one or more entry signatures to determine a plurality of matched entry and exit signatures, each matched exit signature corresponding to one matched entry signature.
- 12. A method according to claim 11, wherein matching a plurality of exit signatures to the one or more entry signatures comprises optimizing an overall match over the plurality of exit signatures, as opposed to a match over any one individual exit signature.
- 13. A method according to claim 11, wherein a number of exit signatures in the plurality of exit signatures is equal to a number of the one or more entry signatures.
- 14. A method according to claim 11, wherein a number of exit signatures in the plurality of exit signatures is different from a number of the one or more entry signatures.
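Claims 11-14 cover matching a plurality of exit signatures at once, optimizing the overall match rather than each exit individually, with the entry and exit counts possibly unequal. A brute-force sketch of that global optimization for small counts (a production system would likely use an assignment algorithm such as the Hungarian method; all names here are illustrative):

```python
from itertools import permutations
import math

def dist(a, b):
    """Euclidean distance between two signature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_all(exit_sigs, entry_sigs):
    """Choose the assignment of exit signatures to entry signatures that
    minimizes the *total* distance (claim 12), rather than greedily
    optimizing each exit on its own. Handles fewer exits than entries
    (claim 14) by leaving surplus entries unmatched.

    Returns a list where element i is the entry index matched to exit i.
    """
    n = len(exit_sigs)
    best = min(permutations(range(len(entry_sigs)), n),
               key=lambda p: sum(dist(exit_sigs[i], entry_sigs[p[i]])
                                 for i in range(n)))
    return list(best)
```

The exhaustive search is exponential in the number of exits, so it only illustrates the objective; the claims do not prescribe a particular optimizer.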
- 15. A method according to claim 1, wherein matching the at least one exit signature to the one or more entry signatures comprises weighting each of a plurality of components of the entry and exit signatures.
- 16. A method according to claim 15, wherein weighting each of a plurality of components of the entry and exit signatures comprises non-uniformly weighting components of the entry and exit signatures.
- 17. A method according to claim 1, wherein matching the at least one exit signature to the one or more entry signatures comprises minimizing a metric representative of a Euclidean distance between the at least one exit signature and the one or more entry signatures.
- 18. A method according to claim 17, wherein minimizing a metric representative of a Euclidean distance between the at least one exit signature and the one or more entry signatures comprises applying non-uniform weighting to components of the metric.
- 19. A method according to claim 17, wherein minimizing a metric representative of a Euclidean distance between the at least one exit signature and the one or more entry signatures comprises applying non-uniform weighting to components of the entry and exit signatures.
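Claims 15-19 describe matching by minimizing a Euclidean-distance metric whose components are weighted non-uniformly. A minimal sketch of such a metric; the weight values are assumptions for illustration (e.g. emphasizing stable features like height over volatile ones), not values from the patent:

```python
import math

def weighted_distance(exit_sig, entry_sig, weights):
    """Weighted Euclidean metric in the style of claims 17-19: each pair of
    signature components contributes its squared difference scaled by a
    per-component weight, so discriminative features count more."""
    return math.sqrt(sum(w * (x - y) ** 2
                         for w, x, y in zip(weights, exit_sig, entry_sig)))
```

With uniform weights this reduces to the ordinary Euclidean distance; the per-component confidence measures of claims 4-6 are one natural source of non-uniform weights.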
- 20. A method according to claim 1 comprising:
recording an entry time for each of the objects determined to be entering the environment and associating the entry time with the entry signature for that object;
recording an exit time for each of the objects determined to be exiting the environment and associating the exit time with the exit signature for that object; and
after matching the at least one exit signature to the one or more entry signatures, determining a dwell time for one or more objects corresponding to matched pairs of entry and exit signatures.
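The dwell-time computation of claim 20 (and claim 58) reduces to subtracting the recorded entry time from the recorded exit time for each matched pair. A minimal sketch, assuming timestamps keyed by signature id (all names are illustrative):

```python
from datetime import datetime

def dwell_times(entry_times, exit_times, matches):
    """Given entry/exit timestamps keyed by signature id and a list of
    (exit_id, entry_id) pairs produced by the matcher, return each matched
    object's dwell time as a timedelta: exit time minus entry time."""
    return {exit_id: exit_times[exit_id] - entry_times[entry_id]
            for exit_id, entry_id in matches}
```

Usage: after the matcher pairs exit signature "x1" with entry signature "e1", the object's dwell time is simply `exit_times["x1"] - entry_times["e1"]`.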
- 21. A method according to claim 1, wherein the image data comprises stereoscopic image data and the fields of view are stereo vision fields.
- 22. A method according to claim 2, wherein the image data comprises stereoscopic image data and the fields of view are stereo vision fields.
- 23. A method according to claim 4, wherein the image data comprises stereoscopic image data and the fields of view are stereo vision fields.
- 24. A method according to claim 7, wherein the image data comprises stereoscopic image data and the fields of view are stereo vision fields.
- 25. A method according to claim 11, wherein the image data comprises stereoscopic image data and the fields of view are stereo vision fields.
- 26. A method according to claim 15, wherein the image data comprises stereoscopic image data and the fields of view are stereo vision fields.
- 27. A method according to claim 17, wherein the image data comprises stereoscopic image data and the fields of view are stereo vision fields.
- 28. A method according to claim 20, wherein the image data comprises stereoscopic image data and the fields of view are stereo vision fields.
- 29. A method according to claim 21, wherein generating a signature for the object is based, at least in part, on features of the image data, the features of the image data comprising one or more of: a shape of the object; a location of the object; a transverse dimension of the object; a trajectory of the object; a color histogram of the object and a two-dimensional template of the object.
- 30. A method according to claim 1, wherein the objects of interest are persons.
- 31. A method according to claim 2, wherein the objects of interest are persons.
- 32. A method according to claim 4, wherein the objects of interest are persons.
- 33. A method according to claim 7, wherein the objects of interest are persons.
- 34. A method according to claim 11, wherein the objects of interest are persons.
- 35. A method according to claim 15, wherein the objects of interest are persons.
- 36. A method according to claim 17, wherein the objects of interest are persons.
- 37. A method according to claim 20, wherein the objects of interest are persons.
- 38. A method according to claim 29, wherein the objects of interest are persons.
- 39. A method according to claim 21, wherein the objects of interest are persons.
- 40. A method according to claim 39, wherein generating a signature for the object is based, at least in part, on features of the image data, the features of the image data comprising one or more of: a height of the person; a shoulder width of the person; a shape of the person; a trajectory of the person; a location of the person; a color histogram of the person; a two-dimensional image template of the person; hair color of the person; a sex of the person; iris characteristics of the person; and facial characteristics of the person.
- 41. A method according to claim 1, wherein generating a signature for the object is based, at least in part, on data obtained from one or more non-image data producing sensors, which are positioned to measure characteristics of objects entering or exiting the environment by way of the paths.
- 42. A method according to claim 21, wherein generating a signature for the object is based, at least in part, on data obtained from one or more non-image data producing sensors, which are positioned to measure characteristics of objects entering or exiting the environment by way of the paths.
- 43. A method according to claim 30, wherein generating a signature for the object is based, at least in part, on data obtained from one or more non-image data producing sensors, which are positioned to measure characteristics of objects entering or exiting the environment by way of the paths.
- 44. A method according to claim 42, wherein the one or more non-image data producing sensors comprise at least one of: a laser range finder; an ultrasonic distance sensor; and a weight sensor.
- 45. A method according to claim 39, wherein generating a signature for the object is based, at least in part, on data obtained from one or more non-image data producing sensors, which are positioned to measure characteristics of objects entering or exiting the environment by way of the paths.
- 46. A method according to claim 1, wherein the objects of interest are one or more of: animals, vehicles and shopping carts.
- 47. A method according to claim 21, wherein the objects of interest are one or more of: animals, vehicles and shopping carts.
- 48. A method according to claim 30, wherein the objects of interest are one or more of: animals, vehicles and shopping carts.
- 49. A method according to claim 41, wherein the objects of interest are one or more of: animals, vehicles and shopping carts.
- 50. A method of tracking objects entering and exiting an environment, the method comprising:
providing one or more cameras positioned in such a manner that paths of the environment are substantially covered by fields of view associated with the one or more cameras;
determining entry signatures for objects entering the environment, the entry signatures based, at least in part, on image data obtained by the one or more cameras, each entry signature associated with a particular object;
determining exit signatures for objects exiting the environment, the exit signatures based, at least in part, on image data obtained by the one or more cameras, each exit signature associated with a particular object; and
at any time after determination of at least one exit signature, matching the at least one exit signature to a set of available entry signatures.
- 51. A method according to claim 50, wherein the one or more cameras are stereo vision cameras and the fields of view associated with the one or more cameras are stereo vision fields.
- 52. A method of tracking objects entering and exiting an environment, the method comprising:
monitoring paths to enter the environment using stereo vision cameras, the stereo vision cameras having stereo vision fields that substantially cover the paths;
tracking moving objects in the stereo vision fields;
deciding whether the moving objects are entering or exiting the environment and:
if a particular moving object is entering the environment, determining an entry signature associated with that particular object; and
if a particular moving object is exiting the environment, determining an exit signature associated with that particular object; and
at any time after determining at least one exit signature, matching the at least one exit signature to a set of available entry signatures, such that each matched exit signature corresponds with one matched entry signature.
- 53. A method according to claim 52, wherein matching the at least one exit signature to a set of available entry signatures comprises matching a first plurality of exit signatures with a second plurality of entry signatures.
- 54. A method according to claim 52, wherein the objects are people.
- 55. An apparatus for tracking objects entering and exiting an environment by way of one or more paths, the apparatus comprising:
means for obtaining three-dimensional digital image data from regions proximate the paths;
means for determining entry signatures for objects entering the environment and for determining exit signatures for objects exiting the environment; and
means for matching one or more exit signatures to a set of available entry signatures.
- 56. An apparatus according to claim 55, comprising means for determining a dwell time of the objects in the environment.
- 57. A machine readable medium carrying a set of instructions, which when executed by a data processor cause the data processor to perform a method of tracking objects entering and exiting an environment, the method comprising:
receiving image data from one or more stereo vision cameras which are positioned in such a manner that paths to enter the environment are substantially covered by stereo vision fields associated with the one or more stereo vision cameras;
determining entry signatures for objects entering the environment using the image data obtained by the one or more stereo vision cameras, each entry signature associated with a particular object;
determining exit signatures for objects exiting the environment using image data obtained by the one or more stereo vision cameras, each exit signature associated with a particular object; and
at any time after determining at least one exit signature, matching at least one exit signature to a set of available entry signatures.
- 58. A machine readable medium according to claim 57, wherein the method comprises determining a dwell time of a person by subtracting, from an exit time associated with an exit signature for that person, an entry time associated with a matching entry signature for that person.
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims the benefit of the filing date of U.S. Application No. 60/301,828 filed Jul. 2, 2001, which is hereby incorporated by reference.
Provisional Applications (1)

| Number   | Date     | Country |
|----------|----------|---------|
| 60301828 | Jul 2001 | US      |