Claims
- 1. A method of visualization of hazards, comprising:
  providing a display unit for the user;
  providing motion tracking hardware;
  using the motion tracking hardware to determine the location and direction of the viewpoint to which the computer-generated three-dimensional graphical elements are being rendered;
  providing an image or view of the real world;
  using a computer to generate three-dimensional graphical elements as representations of hazards;
  rendering the computer-generated graphical elements to correspond to the user's viewpoint;
  creating for the user a mixed view comprised of an actual view of the real world as it appears in front of the user, where graphical elements can be placed anywhere in the real world and remain anchored to that place in the real world regardless of the direction in which the user is looking, wherein the rendered graphical elements are superimposed on the actual view, to accomplish an augmented reality view of representations of hazards in the real world; and
  presenting the augmented reality view, via the display unit, to the user.
- 2. The method of claim 1 in which the display unit is selected from the group of display units consisting of a heads-up display, a Head Mounted Display (HMD), a see-through HMD, and a non-see-through HMD.
- 3. The method of claim 1 in which the display unit is selected from the group of display units consisting of a heads-down-display, a display unit that is moveable, but not held, by the user, a fixed computer monitor, a display unit that is used in a vehicle, and a display unit that is used in an aircraft.
- 4. The method of claim 1 in which the display unit is selected from the group of display units consisting of a handheld display device, a handheld see-through device, a handheld binocular type of display, a handheld monocular type of display, a handheld non-see-through device, and a display unit that is carried by a user.
- 5. The method of claim 1 in which providing an image or view of the real world comprises capturing an image with a video camera that is mounted to the display unit.
- 6. The method of claim 1 in which the image of the real world is a static image.
- 7. The method of claim 1 in which the image of the real world is from a ground-based stationary imaging sensor from a known viewpoint.
- 8. The method of claim 1 in which the image of the real world has been modified to appear approximately like a thermal view of the real world would appear.
- 9. The method of claim 1 in which the motion tracking hardware is selected from the group of motion tracking hardware consisting of a motorized camera mount, an external tracking system, and a Global Positioning System.
- 10. The method of claim 1 in which the representations are designed to be reproductions to mimic the appearance and actions of actual hazards.
- 11. The method of claim 1 in which the representations are designed to be indicators of actual hazards, and to convey their type and positions.
- 12. The method of claim 1 in which the representations are used to indicate a safe region in the vicinity of a hazard.
- 13. The method of claim 1 in which the representations are entered into the computer interactively by a user.
- 14. The method of claim 1 in which the representations are automatically placed using a database of locations.
- 15. The method of claim 1 in which the representations are automatically placed using input from sensors.
- 16. The method of claim 1 in which the representations are static 3D objects.
- 17. The method of claim 1 in which the representations are animated textures mapped onto 3D objects.
- 18. The method of claim 1 in which the representations are objects that appear to be emanating out of the ground.
- 19. The method of claim 1 in which the representations blink or have a blinking component.
- 20. The method of claim 1 in which the representations represent at least the location of a hazard selected from the group of hazards consisting of visible fire, visible water, visible smoke, poison gas, heat, chemicals and radiation.
- 21. The method of claim 1 in which the representations are created to appear and act to mimic how a hazard selected from the group of hazards consisting of fire in that location would appear and act, water in that location would appear and act, smoke in that location would appear and act, unseen poison gas in that location would act, unseen heat in that location would act, and unseen radiation in that location would act.
- 22. The method of claim 1 in which the rendered computer-generated three-dimensional graphical elements are representations displaying an image property selected from the group of properties consisting of fuzziness, fading, transparency, and blending, to represent the intensity, spatial extent, and edges of at least one hazard.
- 23. The method of claim 1 in which the rendered computer-generated three-dimensional graphical elements are icons which represent hazards.
- 24. The method of claim 1 in which information about the hazard is displayed to the user via text overlaid onto a view of a real background.
- 25. The method of claim 1 further comprising generating for the user an audio warning component appropriate to at least one hazard being represented.
- 26. The method of claim 1 in which the representations are used in operations.
- 27. The method of claim 1 in which the representations are used in training.
- 28. The method of claim 1 in which the representations are displayed without a view of the real world.
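
The method of claim 1 amounts to a viewpoint-tracked rendering and compositing loop: the tracker reports where the user is and where the user is looking, world-anchored hazard representations are projected into that viewpoint, and the result is superimposed on the real-world view. Below is a minimal illustrative sketch of that loop, assuming a tracker that reports position, yaw, and pitch, and a simple pinhole camera model; the names (HazardMarker, view_matrix, project, composite) and all parameters are assumptions for illustration, not the patent's implementation.

```python
# Illustrative sketch only (cf. claim 1): world-anchored hazard markers projected
# into the user's tracked viewpoint and composited over a camera frame.
# The pinhole-camera model and all names here are assumptions, not the patent's code.
import numpy as np

class HazardMarker:
    """A hazard representation anchored at a fixed position in the real world."""
    def __init__(self, world_pos, label):
        self.world_pos = np.asarray(world_pos, dtype=float)  # x, y, z in meters
        self.label = label

def view_matrix(eye_pos, yaw_rad, pitch_rad):
    """Build a world-to-camera transform from the tracked position and orientation."""
    cy, sy = np.cos(yaw_rad), np.sin(yaw_rad)
    cp, sp = np.cos(pitch_rad), np.sin(pitch_rad)
    R_yaw = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])    # rotation about up axis
    R_pitch = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])  # rotation about right axis
    R = R_pitch @ R_yaw
    t = -R @ np.asarray(eye_pos, dtype=float)
    return R, t

def project(marker, R, t, focal_px, cx, cy):
    """Project a world point to pixel coordinates with a pinhole camera model."""
    p_cam = R @ marker.world_pos + t
    if p_cam[2] <= 0:          # behind the viewer: not visible this frame
        return None
    u = focal_px * p_cam[0] / p_cam[2] + cx
    v = focal_px * p_cam[1] / p_cam[2] + cy
    return u, v

def composite(frame, markers, eye_pos, yaw, pitch, focal_px=800.0):
    """Superimpose projected hazard markers on a camera frame (H x W x 3 array)."""
    h, w = frame.shape[:2]
    R, t = view_matrix(eye_pos, yaw, pitch)
    out = frame.copy()
    for m in markers:
        uv = project(m, R, t, focal_px, w / 2, h / 2)
        if uv is None:
            continue
        u, v = int(round(uv[0])), int(round(uv[1]))
        if 0 <= u < w and 0 <= v < h:
            out[max(0, v - 2):v + 3, max(0, u - 2):u + 3] = (255, 0, 0)  # simple red dot
    return out

if __name__ == "__main__":
    frame = np.zeros((480, 640, 3), dtype=np.uint8)           # stand-in camera image
    hazards = [HazardMarker((2.0, 0.0, 10.0), "gas leak")]
    view = composite(frame, hazards, eye_pos=(0, 0, 0), yaw=0.0, pitch=0.0)
    print("marker pixels drawn:", int((view[..., 0] == 255).sum()))
```

Because the markers are stored in world coordinates and the view transform is rebuilt from the tracker each frame, the representations stay anchored to their real-world locations no matter which way the user turns, which is the behavior claim 1 recites.
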
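Claim 22 recites conveying a hazard's intensity, spatial extent, and edges through fuzziness, fading, transparency, and blending. The sketch below shows one plausible treatment, assuming a Gaussian opacity falloff with distance from the hazard center followed by ordinary alpha blending over the real-world pixel; the falloff shape and parameter names are illustrative assumptions rather than the claimed method itself.

```python
# Illustrative sketch only (cf. claim 22): use transparency with a soft falloff to
# convey a hazard's intensity, spatial extent, and fuzzy edges when blending the
# rendered element over the real-world image. The Gaussian falloff and parameter
# names are assumptions for illustration, not the patent's method.
import numpy as np

def hazard_alpha(distance_m, extent_m, peak_intensity=1.0):
    """Opacity of the rendered element at a given distance from the hazard center.

    Nearly opaque at the center (high intensity), fading smoothly to transparent
    past the hazard's spatial extent, so the edge appears fuzzy rather than hard.
    """
    falloff = np.exp(-0.5 * (distance_m / max(extent_m, 1e-6)) ** 2)
    return float(np.clip(peak_intensity * falloff, 0.0, 1.0))

def blend(background_rgb, hazard_rgb, alpha):
    """Standard alpha blend of the hazard color over the real-world pixel."""
    bg = np.asarray(background_rgb, dtype=float)
    hz = np.asarray(hazard_rgb, dtype=float)
    return tuple((alpha * hz + (1.0 - alpha) * bg).astype(int))

if __name__ == "__main__":
    for d in (0.0, 5.0, 10.0, 20.0):
        a = hazard_alpha(d, extent_m=10.0)
        print(f"distance {d:5.1f} m -> alpha {a:.2f} ->",
              blend((40, 40, 40), (255, 80, 0), a))
```
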
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to Provisional Patent Application No. 60/349,029, filed Jan. 15, 2002. This application is a Continuation-in-Part of “Augmented Reality Navigation Aid,” Ser. No. 09/634,203, filed Aug. 9, 2000.
Provisional Applications (1)
| Number | Date | Country |
| --- | --- | --- |
| 60349029 | Jan 2002 | US |
Continuation in Parts (1)
|  | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | 09634203 | Aug 2000 | US |
| Child | 10215567 | Aug 2002 | US |