Claims
- 1. An internetworked augmented reality (AR) system, comprising:
a. At least one Local Station, at least one of which must be a Local AR Station,
b. At least one Remote Station, and
c. A network connecting these Stations.
- 2. The system of claim 1 wherein an AR Station comprises at least:
a. A computing system,
b. An AR display system, and
c. A tracking system.
- 3. The system of claim 1 wherein a Non-AR Station comprises at least:
a. A computing system
- 4. The system of claim 1 wherein the network is selected from the group of networks consisting of a local area network (LAN), a wide area network (WAN), a wireless network, and the Internet.
- 5. The system of claim 3 wherein a Non-AR Station computing system is selected from the group of computing systems consisting of a PC, web server, database server, and high-performance computer (HPC).
- 6. The system of claim 3 further including equipment allowing a human to use at least one Station in addition to the required Local AR Station.
- 7. The system of claim 5 wherein an AR Station user can remotely interact with an HPC that performs computationally intensive calculations.
- 8. The system of claim 5 wherein an AR Station user can perform shopping online by downloading items from a web server for placement, evaluation, and interaction in the user's own environment.
- 9. The system of claim 5 wherein an AR Station user is aided in maintenance tasks by accessing information from a remote database server.
- 10. The system of claim 5 wherein an AR Station user is aided in design tasks by accessing information from a remote database server.
- 11. The system of claim 1 further including means to capture video from an AR Station and transmit it over a network to another Station.
- 12. The system of claim 6 wherein an AR Station user is a trainee/student and another Station user is an instructor/teacher.
- 13. The system of claim 6 wherein an AR Station user can collaborate with another user.
- 14. The system of claim 6 wherein a user at another Station can control the experience at an AR Station via an input device.
- 15. The system of claim 6 wherein a user at another Station can observe the experience at an AR Station via a live video feed.
- 16. The system of claim 6 wherein a user at another Station can communicate with a person at an AR Station by voice via audio feed(s).
- 17. The system of claim 6 wherein a user at another Station can visually communicate with an AR Station user via graphical overlays in the field of view of the AR Station user.
- 18. The system of claim 5 wherein an AR Station user is aided in navigation by accessing frequently updated information over a network.
- 19. The system of claim 6 wherein a user at another Station controls a testing program at an AR Station.
- 20. The system of claim 5 wherein an AR Station user is aided in situational awareness (SA) by accessing frequently updated information over a network.
- 21. The system of claim 6 wherein an AR Station user can play a game with at least one other user at another Station.
- 22. The system of claim 15 wherein at least one live video feed is from the first-person perspective as seen by an AR Station user.
- 23. The system of claim 15 wherein at least one live video feed is from a non-first-person perspective camera.
- 24. The system of claim 23 wherein a live video feed is from at least one movable camera controllable remotely by a Station user.
- 25. The system of claim 6 wherein a user at a Station can view from any viewpoint a virtual representation of an AR scenario, which includes virtual representations of an AR Station user or users.
- 26. The system of claim 25 wherein a user at a Station can select a virtual representation of an AR Station user to read information about that particular user.
- 27. The system of claim 6 wherein a user at a Station can observe the effects of a stimulus which results in an AR Station user perceiving sounds from objects in AR.
- 28. The system of claim 6 wherein a user at a Station can observe the effects of a stimulus which results in an AR Station user perceiving forces or surface textures (haptic feedback) from objects in AR.
- 29. The system of claim 6 wherein a user at a Station can observe the effects of a stimulus which results in an AR Station user perceiving smell from objects in AR.
- 30. The system of claim 6 wherein a user at a Station can observe the effects of a stimulus which results in an AR Station user perceiving heat and cold from objects in AR.
- 31. The system of claim 6 wherein a user at a Station can observe the effects of a stimulus which results in an AR Station user perceiving electrical shock from objects in AR.
- 32. The system of claim 2 wherein an AR Station user sees the effects, onto and from real objects, of reflections, shadows, and light emissions from virtual objects downloaded from a web server.
- 33. The system of claim 3 wherein an AR Station user can augment telepresence imagery with virtual imagery by adding a video camera and image capture capability to a Non-AR Station to capture and send video back to an AR Station for viewing by the user.
- 34. The system of claim 33 wherein a motion tracking system at an AR Station controls a mechanized camera mount at a Non-AR Station.
- 35. The system of claim 33 wherein a video camera is stationary and aimed at a reflective curved surface, and the video image received at the AR Station is mapped to the inside of a virtual curved surface for undistorted viewing of the camera scene.
- 36. The system of claim 2 further including at least one video camera.
- 37. The system of claim 2 further including at least one input device.
- 38. The system of claim 3 further including at least one input device.
- 39. The system of claim 5 wherein an AR Station user is aided in design tasks by accessing information from a remote HPC.
- 40. The system of claim 6 wherein a user at a Station can visually communicate with an AR Station user via text overlays in the field of view of the AR Station user.
- 41. The system of claim 25 wherein a user at a Station can select a virtual representation of an AR Station user to send information to that particular user.
- 42. The system of claim 6 wherein a user at a Station can control a stimulus which results in an AR Station user perceiving sounds from objects in AR.
- 43. The system of claim 6 wherein a user at a Station can control a stimulus which results in an AR Station user perceiving forces or surface textures (haptic feedback) from objects in AR.
- 44. The system of claim 6 wherein a user at a Station can control a stimulus which results in an AR Station user perceiving smell from objects in AR.
- 45. The system of claim 6 wherein a user at a Station can control a stimulus which results in an AR Station user perceiving heat and cold from objects in AR.
- 46. The system of claim 6 wherein a user at a Station can control a stimulus which results in an AR Station user perceiving electrical shock from objects in AR.
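By way of illustration only, the sketch below suggests how the internetworked system of claims 1 through 3 might be realized in software: a Local AR Station (computing system, AR display system, and tracking system) exchanges messages over the connecting network with a Remote Station, covering the captured-video transmission of claim 11 and the text and graphical overlays of claims 17 and 40. All class names, message fields, and values are assumptions chosen for the sketch, not elements of the claims.

```python
import json
from dataclasses import dataclass, asdict
from typing import Tuple


@dataclass
class Pose:
    """6-DOF pose reported by the AR Station's tracking system."""
    position: Tuple[float, float, float]
    orientation: Tuple[float, float, float, float]  # quaternion (w, x, y, z)


@dataclass
class VideoFrameMessage:
    """Captured video sent from a Local AR Station to another Station (claim 11)."""
    station_id: str
    frame_number: int
    timestamp_ms: int
    jpeg_b64: str                # frame payload, base64-encoded for JSON transport
    pose: Pose                   # first-person viewpoint at capture time (claim 22)


@dataclass
class OverlayMessage:
    """Text or graphical annotation pushed from another Station into the AR
    Station user's field of view (claims 17 and 40)."""
    station_id: str
    kind: str                              # "text" or "graphic"
    content: str
    anchor: Tuple[float, float, float]     # world position where the overlay is drawn


def encode(msg) -> bytes:
    """Serialize a message for transmission over the connecting network."""
    return json.dumps({"type": type(msg).__name__, "body": asdict(msg)}).encode()


def decode(raw: bytes) -> dict:
    """Decode a received message back into a plain dictionary."""
    return json.loads(raw.decode())


if __name__ == "__main__":
    # Round-trip demo standing in for the network of claim 1.
    frame = VideoFrameMessage(station_id="local-ar-1", frame_number=42,
                              timestamp_ms=1000, jpeg_b64="<frame bytes>",
                              pose=Pose((0.0, 1.6, 0.0), (1.0, 0.0, 0.0, 0.0)))
    overlay = OverlayMessage(station_id="remote-1", kind="text",
                             content="Check the left valve", anchor=(0.4, 1.2, -0.8))
    for msg in (frame, overlay):
        print(decode(encode(msg))["type"])
```

JSON over plain bytes is used only to keep the sketch self-contained; claim 4 leaves the network type open, so any transport over a LAN, WAN, wireless network, or the Internet could carry these messages.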
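In the same illustrative spirit, a minimal sketch of the coupling recited in claim 34, where the motion tracking system at an AR Station drives a mechanized camera mount at a Non-AR Station. It assumes the tracker reports head yaw and pitch in degrees and that the mount accepts absolute pan/tilt targets with limited mechanical travel; the command type and the limit values are assumptions made for the sketch.

```python
from dataclasses import dataclass


@dataclass
class PanTiltCommand:
    """Absolute target sent over the network to the mechanized camera mount."""
    pan_deg: float     # rotation about the vertical axis
    tilt_deg: float    # rotation about the horizontal axis


def head_pose_to_mount_command(yaw_deg: float, pitch_deg: float,
                               pan_limit: float = 170.0,
                               tilt_limit: float = 45.0) -> PanTiltCommand:
    """Map tracked head yaw/pitch onto the mount's travel, clamping at the
    mechanical limits so the mount is never driven past its stops."""
    pan = max(-pan_limit, min(pan_limit, yaw_deg))
    tilt = max(-tilt_limit, min(tilt_limit, pitch_deg))
    return PanTiltCommand(pan_deg=pan, tilt_deg=tilt)


if __name__ == "__main__":
    print(head_pose_to_mount_command(30.0, -10.0))    # within range: passed through
    print(head_pose_to_mount_command(200.0, 60.0))    # clamped to the assumed limits
```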
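Finally, a numerical sketch of the mapping described in claim 35, assuming the reflective curved surface is a mirrored sphere viewed approximately orthographically by a stationary camera looking along -Z. For any direction the AR Station user looks in, it finds the pixel of the mirror-ball image whose reflection points that way; the result can serve as a texture coordinate when the image is mapped onto the inside of a virtual sphere. The function name and image conventions are assumptions, not taken from the claims.

```python
import math
from typing import Tuple

Vec3 = Tuple[float, float, float]


def mirror_ball_pixel(direction: Vec3, width: int, height: int) -> Tuple[int, int]:
    """Return the (column, row) of the mirror-ball image whose reflection
    points along `direction`, a unit vector in camera coordinates."""
    dx, dy, dz = direction
    # The sphere normal that reflects the camera's view ray (0, 0, -1) into
    # `direction` is the unit half-vector between `direction` and +Z
    # (the direction back toward the camera).
    nx, ny, nz = dx, dy, dz + 1.0
    norm = math.sqrt(nx * nx + ny * ny + nz * nz)
    if norm == 0.0:
        # Directly behind the ball is the single blind-spot direction.
        raise ValueError("direction falls in the mirror ball's blind spot")
    nx, ny = nx / norm, ny / norm
    # Orthographic projection of the sphere point: n_x, n_y lie in [-1, 1].
    col = int((nx + 1.0) / 2.0 * (width - 1))
    row = int((1.0 - (ny + 1.0) / 2.0) * (height - 1))
    return col, row


if __name__ == "__main__":
    print(mirror_ball_pixel((0.0, 0.0, 1.0), 512, 512))   # toward the camera -> image centre
    print(mirror_ball_pixel((1.0, 0.0, 0.0), 512, 512))   # toward +X -> right of centre
```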
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to pending provisional patent applications No. 60/180,001, filed Feb. 3, 2000; No. 60/184,578, filed Feb. 24, 2000; and No. 60/192,730, filed Mar. 27, 2000.
Provisional Applications (3)
| Number | Date | Country |
| --- | --- | --- |
| 60/180,001 | Feb. 3, 2000 | US |
| 60/184,578 | Feb. 24, 2000 | US |
| 60/192,730 | Mar. 27, 2000 | US |