Claims
- 1. A method of controlling one or more sensors used to capture data associated with an event, the method comprising the steps of:
processing sensor data captured in accordance with the event; and automatically controlling the one or more sensors based on information pertaining to the continual activity of at least one of one or more objects and one or more persons associated with the event, which information is obtained in real time using at least a portion of the processed data captured in accordance with the event.
- 2. The method of claim 1, wherein the step of automatically controlling the one or more sensors further comprises obtaining one or more user preferences.
- 3. The method of claim 2, wherein the step of automatically controlling the one or more sensors further comprises utilizing the one or more user preferences in conjunction with at least a portion of the activity information to generate one or more signals for controlling the one or more sensors.
- 4. The method of claim 2, wherein the one or more user preferences comprise at least one of an object or person preference, a view preference, and an object or person behavior preference.
- 5. The method of claim 4, wherein the step of automatically controlling the one or more sensors further comprises identifying a two-dimensional display screen coordinate corresponding to the object or person specified by the object or person preference.
- 6. The method of claim 4, wherein the step of automatically controlling the one or more sensors further comprises specifying an identifier corresponding to the object or person specified by the object or person preference.
- 7. The method of claim 4, wherein a reasoning subsystem is used to identify the behavior specified by the object or person behavior preference.
- 8. The method of claim 7, wherein the step of automatically controlling the one or more sensors further comprises analyzing a spatial behavior corresponding to the object or person specified by the behavior preference.
- 9. The method of claim 7, wherein the step of automatically controlling the one or more sensors further comprises analyzing a spatial behavior relating to the surrounding three-dimensional environment for the object or person specified by the behavior preference.
- 10. The method of claim 7, wherein the step of automatically controlling the one or more sensors further comprises analyzing a spatial behavior relating to one or more surrounding objects in the environment for the object or person specified by the behavior preference.
- 11. The method of claim 7, wherein the step of automatically controlling the one or more sensors further comprises analyzing a temporal behavior corresponding to the object or person specified by the behavior preference.
- 12. The method of claim 7, wherein the step of automatically controlling the one or more sensors further comprises specifying a temporal behavior relating to historical data for the object or person specified by the behavior preference.
- 13. The method of claim 7, wherein the step of automatically controlling the one or more sensors further comprises specifying a temporal behavior relating to at least one of the speed, acceleration, and direction of the object or person specified by the behavior preference.
- 14. The method of claim 7, wherein the step of automatically controlling the one or more sensors further comprises specifying a temporal behavior relating to the time of actions of the object or person specified by the behavior preference.
- 15. The method of claim 7, wherein the step of automatically controlling the one or more sensors further comprises specifying a temporal behavior relating to prediction of location of the object or person specified by the behavior preference.
- 16. The method of claim 4, wherein the step of automatically controlling the one or more sensors further comprises obtaining a motion trajectory corresponding to the object or person specified by the object or person preference.
- 17. The method of claim 16, wherein the step of automatically controlling the one or more sensors further comprises finding one or more objects or persons in a neighborhood of the object or person specified by the object or person preference.
- 18. The method of claim 17, wherein the step of automatically controlling the one or more sensors further comprises predicting the next locations of the object or person specified by the object or person preference and of the one or more neighboring objects or persons, using respective motion trajectories.
- 19. The method of claim 18, wherein the step of automatically controlling the one or more sensors further comprises selecting at least one sensor for capturing data associated with the object or person specified by the object or person preference at its predicted next location, based on the view preference and at least a portion of the processed, captured data.
- 20. The method of claim 19, wherein the step of automatically controlling the one or more sensors further comprises determining whether any of the neighboring objects or persons block the view of the at least one selected sensor.
- 21. The method of claim 20, wherein the step of automatically controlling the one or more sensors further comprises directing the at least one selected sensor to the predicted next location of the object or person specified by the object or person preference, when not blocked or only partially blocked by any of the neighboring objects or persons.
- 22. The method of claim 21, wherein the step of automatically controlling the one or more sensors further comprises determining the actual position of the object or person specified by the object or person preference.
- 23. The method of claim 1, wherein the one or more sensors are associated with a multimedia database system.
- 24. Apparatus for controlling one or more sensors used to capture data associated with an event, the apparatus comprising:
a memory; and at least one processor coupled to the memory and operative to: (i) obtain processed sensor data captured in accordance with the event; and (ii) automatically control the one or more sensors based on information pertaining to the continual activity of at least one of one or more objects and one or more persons associated with the event, which information is obtained in real time using at least a portion of the processed data captured in accordance with the event.
- 25. The apparatus of claim 24, wherein the operation of automatically controlling the one or more sensors further comprises obtaining one or more user preferences.
- 26. The apparatus of claim 25, wherein the operation of automatically controlling the one or more sensors further comprises utilizing the one or more user preferences in conjunction with at least a portion of the activity information to generate one or more signals for controlling the one or more sensors.
- 27. The apparatus of claim 25, wherein the one or more user preferences comprise at least one of an object or person preference, a view preference, and an object or person behavior preference.
- 28. The apparatus of claim 27, wherein the operation of automatically controlling the one or more sensors further comprises identifying a two-dimensional display screen coordinate corresponding to the object or person specified by the object or person preference.
- 29. The apparatus of claim 27, wherein the operation of automatically controlling the one or more sensors further comprises specifying an identifier corresponding to the object or person specified by the object or person preference.
- 30. The apparatus of claim 27, wherein a reasoning subsystem is used to identify the behavior specified by the object or person behavior preference.
- 31. The apparatus of claim 27, wherein the operation of automatically controlling the one or more sensors further comprises obtaining a motion trajectory corresponding to the object or person specified by the object or person preference.
- 32. The apparatus of claim 25, wherein the one or more sensors are associated with a multimedia database system.
- 33. An article of manufacture for controlling one or more sensors used to capture data associated with an event, comprising a machine-readable medium containing one or more programs which, when executed, implement the steps of:
processing sensor data captured in accordance with the event; and automatically controlling the one or more sensors based on information pertaining to the continual activity of at least one of one or more objects and one or more persons associated with the event, which information is obtained in real time using at least a portion of the processed data captured in accordance with the event.
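
To make the preference and behavior limitations concrete, the following is a minimal sketch, in Python, of how the user preferences of claims 4-6 and a simple temporal-behavior classification of claims 11-15 might be represented. All names, fields, and thresholds here (UserPreferences, TrackSample, classify_temporal_behavior, the 5 m/s speed cutoff) are illustrative assumptions, not terms drawn from the claims or the specification.

```python
# Minimal sketch (assumed names throughout) of the user preferences of
# claims 4-6 and a temporal-behavior label of claims 11-15.
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class UserPreferences:
    object_id: Optional[str] = None                       # claim 6: target chosen by identifier
    screen_coordinate: Optional[Tuple[int, int]] = None   # claim 5: or by a 2-D display screen coordinate
    view: str = "front"                                   # claim 4: view preference
    behavior: Optional[str] = None                        # claim 4: behavior preference


@dataclass
class TrackSample:
    t: float                                # timestamp, seconds
    position: Tuple[float, float, float]    # 3-D world coordinates


def classify_temporal_behavior(history: List[TrackSample], fast_mps: float = 5.0) -> str:
    """Claims 11-15: derive a temporal-behavior label from historical track data,
    here simply from the speed over the last two samples (an assumed heuristic)."""
    if len(history) < 2:
        return "stationary"
    a, b = history[-2], history[-1]
    dt = max(b.t - a.t, 1e-6)
    speed = sum((p - q) ** 2 for p, q in zip(b.position, a.position)) ** 0.5 / dt
    return "running" if speed > fast_mps else "walking"


if __name__ == "__main__":
    prefs = UserPreferences(object_id="player-7", view="close-up", behavior="running")
    history = [TrackSample(0.0, (0.0, 0.0, 0.0)), TrackSample(0.5, (4.0, 0.0, 0.0))]
    print(prefs.behavior, classify_temporal_behavior(history))  # running running (8 m/s > 5 m/s)
```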
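
Likewise, the selection-and-aiming steps of claims 16-22 can be read as a control loop: obtain the target's trajectory, find its neighbors, predict next locations, pick a sensor consistent with the view preference, check for occlusion, and direct the sensor. The sketch below illustrates one possible reading under simple assumptions (2-D positions, constant-velocity prediction, a bearing-based occlusion test); every class, helper, and threshold is hypothetical and stands in for the tracking and camera-control subsystems of the specification.

```python
# Minimal sketch (assumed names throughout) of the sensor-control loop of
# claims 16-22, using 2-D positions and a constant-velocity predictor.
import math
from dataclasses import dataclass
from typing import Dict, List, Optional, Tuple

Vec = Tuple[float, float]


@dataclass
class Camera:
    name: str
    position: Vec
    view: str                        # e.g. "front", "overhead", "close-up"
    target: Optional[Vec] = None

    def aim(self, point: Vec) -> None:
        self.target = point          # claim 21: direct the sensor to the predicted location


def predict_next(track: List[Vec]) -> Vec:
    """Claim 18: constant-velocity prediction from the last two track points."""
    if len(track) < 2:
        return track[-1]
    (x0, y0), (x1, y1) = track[-2], track[-1]
    return (2 * x1 - x0, 2 * y1 - y0)


def find_neighbors(tracks: Dict[str, List[Vec]], target_id: str, radius: float = 5.0) -> List[str]:
    """Claim 17: objects or persons within `radius` of the target's last position."""
    tx, ty = tracks[target_id][-1]
    return [oid for oid, trk in tracks.items()
            if oid != target_id and math.hypot(trk[-1][0] - tx, trk[-1][1] - ty) <= radius]


def is_blocked(camera: Camera, target: Vec, neighbor: Vec, max_angle_deg: float = 5.0) -> bool:
    """Claim 20: a neighbor blocks the view if it lies closer to the camera than
    the target and on nearly the same bearing (an assumed occlusion test)."""
    cx, cy = camera.position
    ang_t = math.atan2(target[1] - cy, target[0] - cx)
    ang_n = math.atan2(neighbor[1] - cy, neighbor[0] - cx)
    diff = abs((ang_t - ang_n + math.pi) % (2 * math.pi) - math.pi)
    closer = math.hypot(neighbor[0] - cx, neighbor[1] - cy) < math.hypot(target[0] - cx, target[1] - cy)
    return closer and math.degrees(diff) < max_angle_deg


def control_sensors(target_id: str, view_pref: str,
                    tracks: Dict[str, List[Vec]], cameras: List[Camera]) -> Optional[Camera]:
    target_next = predict_next(tracks[target_id])                            # claims 16, 18
    neighbor_next = [predict_next(tracks[n])                                 # claims 17-18
                     for n in find_neighbors(tracks, target_id)]
    for cam in cameras:
        if cam.view != view_pref:                                            # claim 19
            continue
        if any(is_blocked(cam, target_next, n) for n in neighbor_next):      # claim 20
            continue
        cam.aim(target_next)                                                 # claim 21
        return cam
    return None


if __name__ == "__main__":
    tracks = {"player-7": [(0.0, 0.0), (1.0, 0.0)],
              "player-3": [(3.0, 0.5), (2.5, 0.4)]}
    cams = [Camera("c1", (10.0, 0.0), "front"), Camera("c2", (0.0, 10.0), "overhead")]
    chosen = control_sensors("player-7", "front", tracks, cams)
    print(chosen.name if chosen else "no unobstructed front camera")   # -> c1
```

In a full system, the measured actual position of the target (claim 22) would be appended to its track before the next iteration of this loop.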
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application relates to U.S. patent applications identified as Ser. No. 10/167,539 (attorney docket No. Carlbom 8-1-8) entitled “Method and Apparatus for Retrieving Multimedia Data Through Spatio-Temporal Activity Maps;” Ser. No. 10/167,534 (attorney docket No. Carlbom 9-6-2-9) entitled “Instantly Indexed Databases for Multimedia Content Analysis and Retrieval;” and Ser. No. 10/167,533 (attorney docket No. Carlbom 10-7-3-10) entitled “Performance Data Mining Based on Real Time Analysis of Sensor Data,” each filed on Jun. 12, 2002, the disclosures of which are incorporated by reference herein.