Claims
- 1. An augmented reality system, comprising:
a video source, the video source residing at a location and producing an image; at least one encoded marker residing within the image; a marker detector, coupled to the video source, adapted to derive encoded data from the marker residing within the image; a database; and a localization processor adapted to receive data from the marker detector and to generate data regarding location and orientation of the marker, the localization processor retrieving information from the database that is related to the location and orientation of the marker, the localization processor making the information retrieved from the database available to a user.
- 2. The augmented reality system of claim 1 wherein the encoded marker is formed to store data within a matrix.
- 3. The augmented reality system of claim 2 wherein the matrix is formed to include at least one cell reserved for marker orientation data.
- 4. The augmented reality system of claim 2 wherein at least some cells within the matrix contain an indicator of a portion of a numerical value.
- 5. The augmented reality system of claim 4 wherein the indicator within each cell is a binary indicator.
- 6. The augmented reality system of claim 5 wherein the binary indicator comprises:
a first state indicated by a vacant cell; and a second state indicated by a nonvacant cell.
- 7. The augmented reality system of claim 6 wherein a nonvacant cell is occupied by a substantially opaque circle.
- 8. The augmented reality system of claim 2 wherein the matrix is surrounded by a substantially opaque frame.
- 9. The augmented reality system of claim 8 wherein the matrix and the frame are substantially rectangular.
- 10. The augmented reality system of claim 8 wherein the frame is shaped so as to reveal corners when the frame is viewed from a plane not containing the frame.
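Claims 2 through 7 describe a marker whose matrix cells carry binary digits (a vacant cell for one state, an occupied cell for the other), with at least one cell reserved for orientation. A minimal decoding sketch follows; the cell layout (top-left cell reserved for orientation, remaining cells read row by row, most significant bit first) is an illustrative assumption, not taken from the claims.

```python
# Hypothetical sketch of decoding the marker of claims 2-7: each cell
# holds a binary digit (vacant = 0, nonvacant = 1), and one cell is
# reserved for orientation. The layout below (orientation in the
# top-left cell, row-major read order) is an assumption for
# illustration only.

def decode_marker(cells):
    """cells: 2-D list of 0/1 values read row by row from the marker.

    Returns (orientation_bit, marker_id), where marker_id is the
    remaining cells concatenated as a binary number.
    """
    flat = [bit for row in cells for bit in row]
    orientation_bit, data_bits = flat[0], flat[1:]
    marker_id = 0
    for bit in data_bits:
        marker_id = (marker_id << 1) | bit  # most significant bit first
    return orientation_bit, marker_id

# Example: a 3x3 matrix with the orientation indicator in cell (0, 0)
orient, marker_id = decode_marker([[1, 0, 1],
                                   [1, 0, 0],
                                   [1, 1, 0]])
```

In this sketch a larger matrix simply yields more identifier bits; the substantially opaque circles of claim 7 would be read as the nonvacant (1) state by the marker detector.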
- 11. A computer assisted localization system, comprising:
a computer; a database; a camera; at least one marker containing encoded data within a matrix; a tracker, receiving images from the camera, and detecting an image of the marker within the image received by the camera; and a localization processor, using data from the computer and the tracker to calculate a position of the camera within an environment, and retrieving from the database data associated with the position for display to a user of the system.
- 12. The system of claim 11, wherein the tracker further comprises:
a closed string detector, for identifying a series of points within an image that form an enclosed region; and a calculator, for determining a characteristic interior location for each enclosed region and identifying those regions having similarly located characteristic interior locations that correspond to marker geometry.
- 13. The system of claim 12, wherein the marker further comprises:
a substantially opaque frame; a matrix of rows and columns forming cells within the frame; and a series of substantially opaque indicators occupying at least some of the cells so as to indicate orientation and identification of the marker.
- 14. The system of claim 13 wherein the frame of each marker includes a plurality of corners, the tracker detecting the corners in order to establish the presence of a marker within an image.
- 15. The system of claim 13 wherein the frame of each marker is formed so as to have an exterior edge string and an interior edge string, the tracker calculating relative lengths of exterior and interior edge strings to establish the presence of a marker within an image.
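Claims 12 and 15 describe a tracker that finds closed "edge strings," computes a characteristic interior location for each enclosed region, and pairs regions whose interior locations coincide, as the exterior and interior edges of an opaque frame do, using relative string lengths to confirm frame geometry. A rough sketch, assuming the characteristic interior location is the centroid and using an illustrative perimeter-ratio window:

```python
# Hypothetical sketch of the tracker of claims 12 and 15. Closed
# contours ("edge strings") are reduced to centroids; contours whose
# centroids nearly coincide (the exterior and interior edges of an
# opaque frame are concentric) and whose perimeter ratio fits frame
# geometry are paired as marker candidates. The tolerance and ratio
# window are assumptions for illustration.

def centroid(points):
    xs, ys = zip(*points)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def perimeter(points):
    total = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:] + points[:1]):
        total += ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return total

def find_marker_candidates(contours, tol=2.0, ratio_range=(1.1, 2.0)):
    """Pair closed contours with coinciding centroids (within tol
    pixels) whose exterior/interior length ratio matches a frame."""
    candidates = []
    for i, outer in enumerate(contours):
        for inner in contours[i + 1:]:
            cx0, cy0 = centroid(outer)
            cx1, cy1 = centroid(inner)
            if abs(cx0 - cx1) > tol or abs(cy0 - cy1) > tol:
                continue
            ratio = perimeter(outer) / max(perimeter(inner), 1e-9)
            if ratio < 1.0:
                ratio = 1.0 / ratio  # order-independent comparison
            if ratio_range[0] <= ratio <= ratio_range[1]:
                candidates.append((outer, inner))
    return candidates
```

For example, two concentric squares, an outer frame edge of side 10 and an inner one of side 6, share a centroid and have a length ratio of 40/24, so they would be paired as one marker candidate.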
- 16. A method of selecting database images and data for use in an augmented reality system, comprising the steps of:
marking items within an environment with respective markers that indicate marker orientation and item identification; viewing the items with a camera interconnected to a computer which accesses the database; detecting the markers viewed by the camera; and associating particular images and data with each marker for display to a user when a particular marker is viewed by the camera.
- 17. The method of claim 16 further comprising the step of calculating a camera location based on information derived from a marker viewed by the camera, the marker identifying a best available image residing in the database corresponding to the camera location.
- 18. The method of claim 16 further comprising the steps of:
calculating a camera location based on information derived from a marker viewed by the camera; calculating a distance between the camera location and respective virtual camera locations for which images exist in the database; and selecting an image from the database corresponding to a virtual camera location that is closest to the calculated camera location.
- 19. The method of claim 16, further comprising the steps of:
calculating camera orientation; and selecting an image from the database corresponding to a virtual camera having a substantially similar orientation.
- 20. The method of claim 16, further comprising the steps of:
determining an optical axis of the camera; calculating an optical center of the camera; and selecting an image from the database produced from a viewpoint and in a direction substantially similar to the optical axis and the optical center of the camera.
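Claims 18 and 19 describe selecting a database image by computing the distance from the calculated camera location to each virtual camera location and preferring a virtual camera with substantially similar orientation. A minimal sketch of that selection step follows; the record format and the angular threshold are illustrative assumptions, not taken from the claims.

```python
# Hypothetical sketch of claims 18-19: given a camera pose computed
# from a marker, choose the database image whose virtual camera is
# closest in position and roughly aligned in viewing direction. The
# record layout and 30-degree orientation cutoff are assumptions.
import math

def best_database_image(cam_pos, cam_dir, records, max_angle_deg=30.0):
    """records: list of (image_name, position, view_direction) tuples,
    with positions as (x, y, z) and view directions as unit vectors."""
    best_name, best_dist = None, float("inf")
    for name, pos, direction in records:
        # Angle between viewing directions (orientation test, claim 19)
        dot = sum(a * b for a, b in zip(cam_dir, direction))
        angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
        if angle > max_angle_deg:
            continue
        # Euclidean distance to the virtual camera (claim 18)
        dist = math.dist(cam_pos, pos)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name
```

Claim 20's optical-axis/optical-center variant would extend the same comparison: the viewpoint corresponds to the optical center and the viewing direction to the optical axis.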
Parent Case Info
[0001] This patent application claims priority to U.S. Provisional Patent Application Serial No. 60/326,961, entitled “Technologies For Computer Assisted Localization, Site Navigation, And Data Navigation” by Nassir Navab et al., filed Oct. 4, 2001 (Atty Dkt No. 2001P18439US).
Provisional Applications (1)
| Number | Date | Country |
| --- | --- | --- |
| 60/326,961 | Oct. 4, 2001 | US |