Claims
- 1. A single-computer real-time augmented reality system comprising:
a computer having a processor and a bus, the processor in signal communication with the bus;
a head-mounted display in signal communication with the computer;
a first frame acquisition unit disposed relative to the computer and in signal communication with the processor, the first frame acquisition unit having a direct video output;
a left video camera disposed relative to the head-mounted display and in signal communication with the first frame acquisition unit;
a left video display disposed relative to the head-mounted display and in signal communication with the direct video output of the first frame acquisition unit;
a second frame acquisition unit disposed relative to the computer and in signal communication with the processor, the second frame acquisition unit having a direct video output;
a right video camera disposed relative to the head-mounted display and in signal communication with the second frame acquisition unit; and
a right video display disposed relative to the head-mounted display and in signal communication with the direct video output of the second frame acquisition unit.
- 2. An augmented reality system as defined in claim 1, further comprising:
a third frame acquisition unit disposed relative to the computer and in signal communication with the bus; and
a tracking camera disposed relative to the head-mounted display and in signal communication with the third frame acquisition unit.
- 3. An augmented reality system as defined in claim 1 wherein the computer is a personal computer.
- 4. An augmented reality system as defined in claim 1 wherein the processor operates at or above about 400 MHz.
- 5. An augmented reality system as defined in claim 1, further comprising at least about 256 MB of random access memory disposed relative to the computer and in signal communication with the processor.
- 6. An augmented reality system as defined in claim 1 wherein the bus is a Peripheral Component Interconnect bus.
- 7. An augmented reality system as defined in claim 1 wherein the direct video outputs are VGA outputs.
- 8. An augmented reality system as defined in claim 1 wherein the left and right video displays are VGA displays with at least about 24-bit color and at least about 640×480 pixel resolution.
- 9. An augmented reality system as defined in claim 1 wherein real-time comprises about 30 frames per second per video display.
- 10. An augmented reality system as defined in claim 1 wherein each of the first and second frame acquisition units comprises a Matrox Corona-II® frame grabber card.
- 11. An augmented reality system as defined in claim 2 wherein the tracking camera comprises an infrared video camera.
- 12. A method for providing augmented reality in real-time using a single computer, the method comprising:
capturing tracking video data;
passing the tracking video data through a bus to a computer memory for motion tracking;
computing pose estimation results for the motion tracking and passing the results to left and right frame acquisition units;
capturing left and right video data to the left and right frame acquisition units, respectively;
passing the acquired left and right video data through the on-board display buffers of the respective frame acquisition units and out to their direct video outputs;
applying the pose estimation results to the rendering of virtual objects on each of the left and right frame acquisition units for an augmented reality overlay; and
displaying the left and right video data with augmented reality overlays in real-time.
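The per-frame dataflow recited above can be sketched in a few lines of host-side Python. This is an illustrative emulation only: the function names and stub bodies are hypothetical, and in the described system the capture, overlay, and display steps run on the frame grabbers' on-board buffers and accelerators rather than in host code.

```python
import numpy as np

def estimate_pose(ir_frame):
    """Hypothetical stand-in for the host-side pose estimator: returns a
    rotation matrix and translation vector for the tracked marker."""
    return np.eye(3), np.zeros(3)

def render_overlay(shape, pose):
    """Hypothetical stand-in for the on-board renderer: returns an overlay
    image whose background would be a transparent key color."""
    return np.zeros(shape, dtype=np.uint8)

def process_frame(ir_frame, left_frame, right_frame):
    # The tracking video crosses the bus into main memory, where the pose
    # estimation results are computed ...
    pose = estimate_pose(ir_frame)
    # ... and are applied to render virtual objects for each eye. The live
    # stereo video itself never crosses the bus: it passes through each
    # grabber's on-board display buffer straight to its direct video output.
    left_overlay = render_overlay(left_frame.shape, pose)
    right_overlay = render_overlay(right_frame.shape, pose)
    return left_overlay, right_overlay
```

The key architectural point the sketch captures is that only the (low-bandwidth) tracking stream and the (small) pose results touch the bus; the full-rate stereo video stays on the acquisition hardware.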
- 13. A method as defined in claim 12 wherein the tracking video data is reflected from a tracking marker.
- 14. A method as defined in claim 12 wherein the tracking video data is captured to a tracking frame acquisition unit.
- 15. A method as defined in claim 12 wherein the tracking video data is infrared.
- 16. A method as defined in claim 12 wherein the tracking video data is captured by at least one of a left video camera, a right video camera, and a tracking camera.
- 17. A method as defined in claim 12, further comprising:
using a marker made of infrared reflectors for motion tracking;
pre-calibrating a tracking camera for its internal parameters;
computing the pose of the tracking camera relative to the infrared marker using the homography between the infrared marker and its image correspondences; and
obtaining the poses of the left and right video cameras with the known system calibration results.
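The pose-from-homography step can be sketched with plain NumPy. Shown here is the standard planar-target decomposition H = K·[r1 r2 t] (known only up to scale), not necessarily the exact algorithm of the specification; K is the pre-calibrated internal parameter matrix.

```python
import numpy as np

def pose_from_homography(H, K):
    """Recover the rotation R and translation t of a planar marker from a
    homography H (marker plane -> image, defined up to scale) and the
    pre-calibrated internal parameter matrix K."""
    A = np.linalg.inv(K) @ H
    # For a true pose, the first two columns of K^-1 H are equally scaled
    # rotation columns; normalize by their average length.
    s = 2.0 / (np.linalg.norm(A[:, 0]) + np.linalg.norm(A[:, 1]))
    if A[2, 2] < 0:          # keep the marker in front of the camera
        s = -s
    r1, r2, t = s * A[:, 0], s * A[:, 1], s * A[:, 2]
    r3 = np.cross(r1, r2)
    R = np.column_stack([r1, r2, r3])
    # Project onto the nearest orthonormal matrix to absorb noise.
    U, _, Vt = np.linalg.svd(R)
    return U @ Vt, t
```

With exact correspondences the decomposition recovers R and t exactly; with noisy image points the SVD projection keeps R a valid rotation.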
- 18. A method as defined in claim 17, further comprising:
calibrating the internal parameters of the left and right video cameras using an infrared marker together with coded visual markers;
computing the transformations from the tracking camera coordinate system to the left and right video camera coordinate systems for system calibration; and
computing the poses of the left and right video cameras from the homography of the feature points of the visual markers and their image correspondences.
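Going from the tracked pose of the tracking camera to the pose of a rigidly mounted video camera is a product of homogeneous transforms. A minimal sketch, with hypothetical names and purely illustrative rig values:

```python
import numpy as np

def to_hom(R, t):
    """Pack a 3x3 rotation and a translation into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# T_left_from_track: fixed rig transform found once during system calibration,
# mapping tracking-camera coordinates to left-video-camera coordinates.
# (The 6 cm baseline here is an illustrative value, not from the source.)
T_left_from_track = to_hom(np.eye(3), np.array([0.06, 0.0, 0.0]))

def left_camera_pose(T_track_from_marker):
    """At run time, chain the tracked pose with the calibrated rig transform
    to obtain the left video camera's pose relative to the marker."""
    return T_left_from_track @ T_track_from_marker
```

Because the rig transform is fixed, only the tracking camera needs to see the marker at run time; the right camera's pose follows the same way from its own calibrated transform.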
- 19. A method as defined in claim 18, further comprising:
performing virtual object overlays within a non-destructive overlay buffer on each of the left and right frame acquisition units; and
rendering the virtual objects in the overlay buffers, with a background set to be a transparent key-color, to achieve the augmented reality overlays.
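The key-color mechanism can be emulated in software: pixels of the overlay buffer still holding the transparent key color let the live video through, and every other pixel is virtual content. The magenta key value below is an assumption for illustration, not taken from the specification.

```python
import numpy as np

KEY = np.array([255, 0, 255], dtype=np.uint8)  # assumed magenta key color

def composite(video, overlay, key=KEY):
    """Software emulation of the grabber's non-destructive overlay: wherever
    the overlay buffer still holds the key color, the live video shows
    through; rendered virtual pixels replace it everywhere else."""
    transparent = np.all(overlay == key, axis=-1, keepdims=True)
    return np.where(transparent, video, overlay)
```

On the hardware itself this mixing happens in the on-board display path, which is what lets the composited output leave through the direct video output without a round trip over the bus.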
- 20. A method as defined in claim 19, further comprising:
obtaining the addresses of the on-board overlay buffers and corresponding rendering surfaces; and
directly rendering at least one of text and 2D objects in the overlay buffers.
- 21. A method as defined in claim 19, further comprising:
obtaining the addresses of the on-board overlay buffers and corresponding rendering surfaces; and
programming an on-board graphics accelerator to render 3D objects in the overlay buffers in real-time.
- 22. A method as defined in claim 12, further comprising:
pre-calibrating a plurality of cameras for their internal parameters;
using markers to calibrate a system for the transformation between an infrared tracking camera and a plurality of video cameras; and
tracking an infrared reflecting marker for a virtual object overlay.
- 23. A single-computer real-time augmented reality system comprising:
bus means for passing tracking video data through a bus to a computer memory for motion tracking;
processor means for computing pose estimation results for the motion tracking and passing the results to left and right frame acquisition units;
left and right video camera means for capturing left and right video data to the left and right frame acquisition units, respectively;
overlay means for applying the pose estimation results to the rendering of virtual objects on each of the left and right frame acquisition units for an augmented reality overlay;
direct video output means for passing the acquired left and right video data through the on-board display buffers of the respective frame acquisition units and out to their direct video outputs; and
head-mounted display means for displaying the left and right video data with augmented reality overlays in real-time.
- 24. A program storage device readable by machine, tangibly embodying a program of instructions executable by the machine to perform program steps for single-computer real-time augmented reality, the program steps comprising:
passing tracking video data through a bus to a computer memory for motion tracking;
computing pose estimation results for the motion tracking and passing the results to left and right frame acquisition units;
capturing left and right video data to the left and right frame acquisition units, respectively;
passing the acquired left and right video data through the on-board display buffers of the respective frame acquisition units and out to their direct video outputs;
applying the pose estimation results to the rendering of virtual objects on each of the left and right frame acquisition units for an augmented reality overlay; and
displaying the left and right video data with augmented reality overlays in real-time.
- 25. A program storage device as defined in claim 24, the program steps further comprising:
pre-calibrating a plurality of cameras for their internal parameters;
using markers to calibrate a system for the transformation between a tracking camera and at least one video camera; and
tracking a reflecting marker for a virtual object overlay.
Parent Case Info
[0001] This application claims priority to U.S. Provisional Application Serial No. 60/343,008, filed Dec. 20, 2001, by Xiang Zhang.
Provisional Applications (1)
| Number   | Date     | Country |
|----------|----------|---------|
| 60343008 | Dec 2001 | US      |