Claims
- 1. A system comprising:
  a. a sensing device for generating an instrumentation data stream;
  b. a camera device for capturing a real video stream; and
  c. a rendering module configured for receiving the instrumentation data stream, rendering a virtual image, and allocating a portion of the virtual image for displaying the real video stream wherein the real video stream is oriented within the virtual image in response to the instrumentation data stream.
- 2. The system according to claim 1 wherein the sensing device is a sensor for measuring a physical parameter associated with a real event.
- 3. The system according to claim 2 wherein the physical parameter includes a location.
- 4. The system according to claim 1 wherein the sensing device is incorporated within the camera device for measuring a parameter of the camera device.
- 5. The system according to claim 1 further comprising a compositing module configured for integrating the real video stream within the virtual image wherein the virtual image and the real video stream are simultaneously displayed.
- 6. The system according to claim 1 wherein the portion of the virtual image for displaying the real video stream is a surface plane.
- 7. The system according to claim 6 wherein the surface plane is positioned within the virtual image in response to the instrumentation data stream.
- 8. The system according to claim 6 wherein the surface plane is rotated relative to the virtual image in response to the instrumentation data stream.
- 9. The system according to claim 1 wherein the instrumentation data stream includes one of a camera position, a camera zoom, a camera pan, a camera tilt, a camera field-of-view, and an object location.
- 10. A method comprising:
  a. sensing an instrumentation data stream from a sensor;
  b. capturing a video stream of a real event from a camera; and
  c. rendering a virtual image including a display area for displaying the video stream wherein the display area is positioned in response to the instrumentation data stream.
- 11. The method according to claim 10 wherein the instrumentation data stream includes one of a camera position, a camera zoom, a camera pan, a camera tilt, a camera field-of-view, and an object location.
- 12. The method according to claim 10 further comprising integrating the virtual image and the video stream into a display image wherein the display image includes a simultaneous display of the virtual image and the video stream.
- 13. The method according to claim 10 further comprising designating a surface plane as the display area for the video stream.
- 14. The method according to claim 13 further comprising tilting the surface plane in response to the instrumentation data stream wherein an angle of the surface plane represents a perspective of the camera capturing the video stream.
- 15. The method according to claim 10 further comprising matching a virtual position of a virtual object in the virtual image with a real position of a real object within the video stream wherein the real position corresponds with the instrumentation data stream.
- 16. The method according to claim 15 further comprising positioning the display area relative to the virtual image in response to matching the virtual position with the real position.
- 17. A computer-readable medium having computer executable instructions for performing a method comprising:
  a. sensing an instrumentation data stream from a sensor;
  b. capturing a video stream of a real event from a camera; and
  c. rendering a virtual image including a display area for displaying the video stream wherein the display area is positioned in response to the instrumentation data stream.
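For orientation only, the sketch below illustrates one way the positioning steps recited in claims 10 and 13-16 might be realized in code. It is not taken from the application: the names (`InstrumentationSample`, `DisplayPlane`, `place_display_plane`), the identity mapping between real and virtual coordinates, and the pan/tilt/field-of-view arithmetic are all assumptions made for illustration.

```python
# Hypothetical sketch of claims 10 and 13-16: position and tilt a planar
# "display area" for a live video feed inside a rendered virtual scene,
# driven by camera instrumentation data (pan, tilt, zoom, location).
import math
from dataclasses import dataclass


@dataclass
class InstrumentationSample:      # one sample of the instrumentation data stream
    camera_position: tuple        # (x, y, z) of the real camera
    pan_deg: float                # horizontal camera angle
    tilt_deg: float               # vertical camera angle
    field_of_view_deg: float      # zoom expressed as a field of view
    object_position: tuple        # (x, y, z) of a tracked real object


@dataclass
class DisplayPlane:               # the surface plane that carries the video stream
    center: tuple                 # plane position in virtual-world coordinates
    yaw_deg: float                # rotation tracking the camera pan
    pitch_deg: float              # rotation tracking the camera tilt
    width: float                  # plane size scaled from the field of view
    height: float


def place_display_plane(sample: InstrumentationSample,
                        distance: float = 10.0,
                        aspect: float = 16 / 9) -> DisplayPlane:
    """Match the plane's virtual position to the tracked object's real position
    (claim 15) and orient the plane to reflect the camera's perspective
    (claims 8 and 14).  The real-to-virtual mapping is assumed to be identity."""
    # Place the plane at the tracked object's location in the virtual scene.
    center = sample.object_position

    # Rotate/tilt the plane so it faces back along the camera's line of sight.
    yaw = sample.pan_deg + 180.0
    pitch = -sample.tilt_deg

    # Scale the plane from the field of view so a zoomed-in shot fills a
    # similarly sized region of the virtual image.
    width = 2.0 * distance * math.tan(math.radians(sample.field_of_view_deg) / 2.0)
    return DisplayPlane(center=center, yaw_deg=yaw, pitch_deg=pitch,
                        width=width, height=width / aspect)


if __name__ == "__main__":
    sample = InstrumentationSample(camera_position=(0.0, 2.0, 0.0),
                                   pan_deg=30.0, tilt_deg=-5.0,
                                   field_of_view_deg=40.0,
                                   object_position=(8.0, 1.0, 3.0))
    print(place_display_plane(sample))
```

In this reading, the matching step of claim 15 reduces to placing the plane at the tracked object's coordinates, and the tilt of claim 14 is the negated camera tilt; a real implementation would calibrate the real-to-virtual transform rather than assume identity.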
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of the U.S. provisional application entitled “Method and Apparatus for Mixed Reality Broadcast,” filed on Aug. 10, 2001, and assigned Ser. No. 60/311,477, which is herein incorporated by reference.
Provisional Applications (1)

| Number     | Date          | Country |
| ---------- | ------------- | ------- |
| 60/311,477 | Aug. 10, 2001 | US      |