Claims
- 1. A method of using sensory data corresponding with content data comprising:
a. recording the content data through a recording device; b. simultaneously capturing the sensory data through a sensor while recording the content data; and c. relating a portion of the sensory data to a corresponding portion of the content data.
- 2. The method according to claim 1 further comprising storing a user preference.
- 3. The method according to claim 2 further comprising searching the sensory data in response to the user preference.
- 4. The method according to claim 2 further comprising storing the portion of the content data in response to the user preference.
- 5. The method according to claim 1 further comprising tagging the portion of the content data in response to the portion of the sensory data.
- 6. The method according to claim 1 further comprising generating the sensory data via the sensor.
- 7. The method according to claim 1 wherein the sensory data includes positional data.
- 8. The method according to claim 1 wherein the sensory data includes force data.
- 9. The method according to claim 1 wherein the content data includes audio/visual data.
- 10. The method according to claim 1 wherein the recording device includes an audio/visual camera.
- 11. The method according to claim 1 wherein the sensor is an accelerometer.
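The claims above deliberately leave the implementation open. Purely as a non-limiting illustration of claims 5, 8 and 11 (tagging portions of content data in response to accelerometer-derived force data), a sketch might pair each sensor sample with a recording timestamp and tag the content segments where force exceeds a threshold. All names and the threshold value here are hypothetical, not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    timestamp: float  # seconds from start of the recording
    force: float      # accelerometer magnitude, arbitrary units

def tag_highlights(samples, threshold):
    """Return (start, end) time ranges of the content data whose
    corresponding sensory data meets or exceeds the threshold."""
    tags = []
    start = None
    for s in samples:
        if s.force >= threshold and start is None:
            start = s.timestamp          # a tagged portion begins
        elif s.force < threshold and start is not None:
            tags.append((start, s.timestamp))  # portion ends
            start = None
    if start is not None:                # portion still open at end
        tags.append((start, samples[-1].timestamp))
    return tags

# sensor sampled every 0.5 s; a force spike marks the highlight
samples = [SensorSample(t * 0.5, f)
           for t, f in enumerate([0.1, 0.2, 3.5, 4.0, 0.3, 0.1])]
print(tag_highlights(samples, 1.0))  # → [(1.0, 2.0)]
```

Because sensor timestamps are captured simultaneously with recording (claim 1, step b), the returned time ranges index directly into the content data.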
- 12. A method of recording an event comprising:
a. capturing an audio/visual data stream of the event through a recording device; b. capturing a sensory data stream of the event through a sensing device; and c. synchronizing the audio/visual data stream and the sensory data stream such that a portion of the sensory data stream corresponds with a portion of the audio/visual data stream.
- 13. The method according to claim 12 further comprising storing a user preference describing a viewing desire of a user.
- 14. The method according to claim 13 further comprising highlighting a portion of the audio/visual data stream based on the user preference.
- 15. The method according to claim 12 further comprising analyzing the sensory data stream for specific parameters.
- 16. The method according to claim 15 further comprising highlighting the portion of the audio/visual data stream based on analyzing the sensory data stream.
- 17. The method according to claim 12 wherein the sensory data stream describes the event using location data of subjects within the event.
- 18. The method according to claim 12 wherein the sensory data stream describes the event using force data of subjects within the event.
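The synchronization step of claim 12 (step c) admits many embodiments. One illustrative sketch, assuming the two streams are sampled at different rates and carry comparable timestamps, pairs each frame with the sensor sample nearest in time; the function names and sample rates are hypothetical:

```python
import bisect

def synchronize(frame_times, sensor_times, sensor_values):
    """Pair each audio/visual frame timestamp with the sensory sample
    nearest in time, so each portion of the sensory data stream
    corresponds with a portion of the audio/visual data stream."""
    correlated = []
    for ft in frame_times:
        i = bisect.bisect_left(sensor_times, ft)
        # consider the sensor samples on either side of the frame time
        candidates = [j for j in (i - 1, i) if 0 <= j < len(sensor_times)]
        j = min(candidates, key=lambda k: abs(sensor_times[k] - ft))
        correlated.append((ft, sensor_values[j]))
    return correlated

frames = [0.00, 0.04, 0.08]   # e.g. frame times of 25 fps video
sensors = [0.00, 0.05, 0.10]  # e.g. a 20 Hz accelerometer
values = [0.1, 2.7, 0.4]
print(synchronize(frames, sensors, values))
# → [(0.0, 0.1), (0.04, 2.7), (0.08, 0.4)]
```

Nearest-neighbor pairing is only one choice; interpolation or windowed aggregation of the sensory stream would satisfy the same correspondence requirement.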
- 19. A system for recording an event comprising:
a. a recording device for capturing a sequence of images of the event; b. a sensing device for capturing a sequence of sensory data of the event; and c. a synchronizer device connected to the recording device and the sensing device for formatting the sequence of images and the sequence of sensory data into a correlated data stream wherein a portion of the sequence of images corresponds to a portion of the sequence of sensory data.
- 20. The system according to claim 19 further comprising a storage device connected to the recording device and the sensing device for storing the sequence of images and the sequence of sensory data.
- 21. The system according to claim 19 further comprising a storage device connected to the synchronizer device for storing the correlated data stream.
- 22. The system according to claim 19 wherein the sensing device is an accelerometer.
- 23. The system according to claim 19 wherein the sensing device is a location transponder.
- 24. The system according to claim 19 wherein the sensing device is a force sensor.
- 25. The system according to claim 19 wherein the recording device is a video camera.
- 26. The system according to claim 19 wherein the sequence of sensory data includes positional data.
- 27. The system according to claim 19 wherein the sequence of sensory data includes force data.
- 28. A computer-readable medium having computer executable instructions for performing a method comprising:
a. recording content data through a recording device; b. simultaneously capturing sensory data through a sensor while recording the content data; and c. relating a portion of the sensory data to a corresponding portion of the content data.
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] The present application claims the benefit of U.S. Provisional Patent Application No. 60/311,071, filed on Aug. 8, 2001, entitled “Automatic Tagging and Caching of Highlights,” listing the same inventors, the disclosure of which is hereby incorporated by reference.
Provisional Applications (1)

| Number   | Date     | Country |
| -------- | -------- | ------- |
| 60311071 | Aug 2001 | US      |