Claims
- 1. A system comprising:
  a. a real camera device for capturing a real image and corresponding instrumentation data associated with the real image; and
  b. a rendering module for generating a virtual image, wherein the virtual image is generated from a virtual perspective based upon the instrumentation data for blending the virtual image and the real image into a resultant image.
- 2. The system according to claim 1 further comprising a compositing module configured for receiving the virtual image and the real image and displaying the resultant image.
- 3. The system according to claim 1 wherein the resultant image includes the real image and the virtual image.
- 4. The system according to claim 1 wherein the instrumentation data includes one of a camera position, a camera zoom, a camera pan, a camera tilt, a camera field-of-view, and an object location.
- 5. The system according to claim 1 wherein the rendering module further comprises a depth map module configured for ordering a blending priority for an object within the real image based on the instrumentation data.
- 6. The system according to claim 5 wherein the blending priority is based on a perspective angle change of the object.
- 7. The system according to claim 1 wherein the rendering module further comprises an alignment module configured for aligning the virtual image with a viewpoint of the real camera device, wherein the alignment module utilizes the instrumentation data.
- 8. The system according to claim 1 further comprising a sensor for generating the instrumentation data for measuring a physical parameter of a real event.
- 9. The system according to claim 8 wherein the physical parameter is one of a force exerted on an object and a location of the object.
- 10. A method comprising:
  a. capturing a real image of a real event from a real camera;
  b. receiving instrumentation data based on the real event; and
  c. rendering a virtual image, wherein the virtual image is based on the instrumentation data and the real image.
- 11. The method according to claim 10 wherein the instrumentation data includes one of a camera position, a camera zoom, a camera pan, a camera tilt, a camera field-of-view, and an object location.
- 12. The method according to claim 10 further comprising tracking a parameter of the real camera via the instrumentation data.
- 13. The method according to claim 10 further comprising aligning a viewpoint of the real camera with a viewpoint of the virtual image via the instrumentation data.
- 14. The method according to claim 10 further comprising blending the real image and the virtual image into a resultant image.
- 15. The method according to claim 10 further comprising calculating a perspective angle change of an object within the real image via the instrumentation data.
- 16. The method according to claim 10 further comprising positioning the real image relative to the virtual image in response to the instrumentation data.
- 17. A computer-readable medium having computer-executable instructions for performing a method comprising:
  a. capturing a real image of a real event from a real camera;
  b. receiving instrumentation data based on the real event; and
  c. rendering a virtual image, wherein the virtual image is based on the instrumentation data and the real image.
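Claims 1-4, 10, and 14 recite a pipeline in which camera instrumentation data drives the rendering of a virtual image that is then blended with the real image into a resultant image. The claims leave the implementation open; the following is a minimal Python sketch, with hypothetical names (`render_virtual_image`, `composite`) and a flat-colored overlay standing in for a real 3-D renderer, of how such rendering and compositing modules might fit together.

```python
import numpy as np

def render_virtual_image(instrumentation, shape):
    """Hypothetical rendering module: place a virtual overlay at a screen
    position derived from the camera's pan/tilt instrumentation, so the
    virtual content tracks the real camera's viewpoint."""
    h, w = shape
    virtual = np.zeros((h, w, 3), dtype=np.uint8)
    # Shift the overlay a few pixels per degree of pan/tilt; a real system
    # would instead re-render a 3-D scene from the measured perspective.
    x0 = w // 2 + int(instrumentation["pan_deg"] * 4)
    y0 = h // 2 + int(instrumentation["tilt_deg"] * 4)
    virtual[max(y0 - 20, 0):y0 + 20, max(x0 - 20, 0):x0 + 20] = (0, 255, 0)
    return virtual

def composite(real, virtual, alpha=0.5):
    """Compositing module: blend the real and virtual images into the
    resultant image (the blending step of claim 14)."""
    return (alpha * virtual + (1 - alpha) * real).astype(np.uint8)

# Stand-in camera frame and telemetry (position/zoom shown but unused here).
real = np.full((480, 640, 3), 128, dtype=np.uint8)
inst = {"pan_deg": 5.0, "tilt_deg": -2.0, "zoom": 1.0}
resultant = composite(real, render_virtual_image(inst, (480, 640)))
```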
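Claims 5 and 6 describe a depth map module that orders blending priority for objects in the real image. One conventional way to realize such ordering, sketched below under the assumption that aligned per-pixel depth maps exist for both images (e.g., estimated from object-location instrumentation), is z-buffer compositing: at each pixel the nearer surface wins, so a virtual object can pass behind or in front of a real one.

```python
import numpy as np

def depth_composite(real, real_depth, virtual, virtual_depth):
    """Per-pixel blending priority from depth maps: keep the real pixel
    wherever the real surface is nearer to the camera, else the virtual."""
    nearer_real = real_depth < virtual_depth      # boolean priority mask
    return np.where(nearer_real[..., None], real, virtual)

# Hypothetical aligned inputs: a uniform gray frame and two flat depth maps.
h, w = 480, 640
real = np.full((h, w, 3), 128, dtype=np.uint8)
virtual = np.zeros((h, w, 3), dtype=np.uint8)
real_depth = np.full((h, w), 10.0)     # e.g. derived from object location
virtual_depth = np.full((h, w), 12.0)  # virtual object behind the real one
resultant = depth_composite(real, real_depth, virtual, virtual_depth)
```

As the perspective angle to an object changes (claim 6), its depth map, and hence its blending priority, would be recomputed from the updated instrumentation.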
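Claims 7 and 13 cover aligning the virtual image's viewpoint with the real camera's viewpoint using the instrumentation data. A standard approach, sketched below assuming the pan/tilt axis conventions shown and that position, pan, tilt, and field-of-view are among the instrumentation data of claim 4, is to build a pinhole projection matrix for the virtual camera directly from the measured parameters.

```python
import numpy as np

def aligned_projection(position, pan_deg, tilt_deg, fov_deg, width, height):
    """Alignment module sketch: derive the virtual camera's intrinsics and
    extrinsics from the real camera's instrumentation so both viewpoints
    coincide. Returns a 3x4 pinhole projection matrix."""
    pan, tilt = np.radians([pan_deg, tilt_deg])
    # Pan rotates about the vertical (y) axis, tilt about the lateral (x) axis.
    r_pan = np.array([[ np.cos(pan), 0.0, np.sin(pan)],
                      [ 0.0,         1.0, 0.0        ],
                      [-np.sin(pan), 0.0, np.cos(pan)]])
    r_tilt = np.array([[1.0, 0.0,            0.0          ],
                       [0.0, np.cos(tilt), -np.sin(tilt)],
                       [0.0, np.sin(tilt),  np.cos(tilt)]])
    rotation = r_tilt @ r_pan
    # Focal length in pixels from the measured horizontal field of view.
    f = (width / 2.0) / np.tan(np.radians(fov_deg) / 2.0)
    intrinsics = np.array([[f, 0.0, width / 2.0],
                           [0.0, f, height / 2.0],
                           [0.0, 0.0, 1.0]])
    # World-to-camera extrinsics [R | -R t] for a camera located at `position`.
    t = -rotation @ np.asarray(position, dtype=float)
    return intrinsics @ np.hstack([rotation, t[:, None]])

# Example: project a world point using the pose reported by the telemetry.
P = aligned_projection(position=[0.0, 2.0, -10.0], pan_deg=5.0,
                       tilt_deg=-2.0, fov_deg=60.0, width=640, height=480)
uvw = P @ np.array([1.0, 1.5, 4.0, 1.0])   # homogeneous world point
u, v = uvw[:2] / uvw[2]                    # pixel coordinates in the frame
```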
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority from the U.S. provisional application entitled “Method and Apparatus for Mixed Reality Broadcast,” filed on Aug. 10, 2001, with Ser. No. 60/311,477, which is herein incorporated by reference.
Provisional Applications (1)

| Number     | Date         | Country |
|------------|--------------|---------|
| 60/311,477 | Aug 10, 2001 | US      |