Claims
- 1. An intuitive user interface for representing three-dimensional (“3D”) data from a user viewpoint, the interface comprising:
a computer having a graphics rendering engine; a stereoscopic display in signal communication with the computer for displaying the 3D data as a rendered virtual object; a physical base disposed relative to the stereoscopic display for defining a location of the virtual object; an instrument in signal communication with the computer for interacting with the virtual object; and a tracking device in signal communication with the computer for tracking the relative poses of the physical base, instrument and user viewpoint.
- 2. An intuitive user interface as defined in claim 1 wherein the stereoscopic display comprises a binocular display.
- 3. An intuitive user interface as defined in claim 2 wherein the binocular display comprises a head-mounted stereoscopic display.
- 4. An intuitive user interface as defined in claim 3 wherein the head-mounted stereoscopic display is of the video-see-through variety.
- 5. An intuitive user interface as defined in claim 1 wherein the tracking device comprises an optical tracking device in signal communication with the computer.
- 6. An intuitive user interface as defined in claim 5 wherein the optical tracking device comprises a head-mounted tracking camera.
- 7. An intuitive user interface as defined in claim 1 wherein the instrument comprises a switch in signal communication with the computer.
- 8. An intuitive user interface as defined in claim 1 wherein the instrument comprises at least one of a trackball and a thumbwheel in signal communication with the computer.
- 9. An intuitive user interface as defined in claim 8 wherein the signal communication is wireless.
- 10. An intuitive user interface as defined in claim 1 wherein the physical base comprises a switch in signal communication with the computer.
- 11. An intuitive user interface as defined in claim 1 wherein the physical base comprises at least one of a trackball and a thumbwheel in signal communication with the computer.
- 12. An intuitive user interface as defined in claim 11 wherein the signal communication is wireless.
- 13. An intuitive user interface as defined in claim 1 wherein the 3D data comprises 3D medical images.
- 14. A method for representing a virtual object from a user viewpoint, the method comprising:
providing a user viewpoint; defining a pose of a virtual object relative to a physical base; providing an instrument for interacting with the virtual object; tracking the relative poses of the physical base, instrument and user viewpoint; rendering three-dimensional (“3D”) data indicative of the virtual object and the instrument in accordance with the defined and tracked poses; and stereoscopically displaying the rendered virtual object.
- 15. A method as defined in claim 14, further comprising placing and moving multiplanar reconstruction (“MPR”) planes for interacting with the virtual object.
- 16. A method as defined in claim 14, further comprising outlining structures in the 3D data for interaction with the virtual object.
- 17. A method as defined in claim 14, further comprising switching between different functionalities of the instrument.
- 18. A method as defined in claim 17, further comprising visualizing the instrument as a virtual instrument in a pose linked to its tracked pose.
- 19. A method as defined in claim 18, further comprising rendering the virtual instrument with a different appearance in accordance with its selected functionality.
- 20. A program storage device readable by machine, tangibly embodying a program of instructions executable by the machine to perform program steps for representing a virtual object from a user viewpoint, the program steps comprising:
providing a user viewpoint; defining a pose of a virtual object relative to a physical base; providing an instrument for interacting with the virtual object; tracking the relative poses of the physical base, instrument and user viewpoint; rendering three-dimensional (“3D”) data indicative of the virtual object and the instrument in accordance with the defined and tracked poses; and stereoscopically displaying the rendered virtual object.
- 21. A virtual camera interface for intuitively selecting the orientation of a three-dimensional (“3D”) data set to be rendered as an image, the interface comprising:
a computer having a graphics engine for rendering an image from a 3D data set; a display device in signal communication with the computer for displaying the rendered image from the 3D data set; a handheld instrument in signal communication with the computer for selecting an orientation; and a tracking device in signal communication with the computer for tracking the position of the instrument to determine the orientation.
- 22. A virtual camera interface as defined in claim 21 wherein the 3D data set comprises a 3D image of a real object.
- 23. A virtual camera interface as defined in claim 22 wherein the real object is a person.
- 24. A virtual camera interface as defined in claim 22 wherein the 3D data set comprises a 3D medical image.
- 25. A virtual camera interface as defined in claim 22 wherein the 3D image is approximately registered to the real object.
- 26. A virtual camera interface as defined in claim 25 wherein the orientation for rendering the 3D image is approximately equal to the orientation of the handheld instrument with respect to the real object.
- 27. A virtual camera interface as defined in claim 21 wherein the handheld instrument comprises a switch in signal communication with the computer for updating the rendering of the image according to the orientation by means of triggering the switch.
- 28. A virtual camera interface as defined in claim 21 wherein the tracking device comprises a tracking camera.
- 29. A virtual camera interface as defined in claim 21 wherein the handheld instrument comprises at least one optical marker.
- 30. A method for intuitively selecting the orientation of a three-dimensional (“3D”) data set to be rendered as an image, the method comprising:
selecting an orientation for an image from a 3D data set in correspondence with a handheld instrument; rendering the image from the 3D data set in accordance with the selected orientation; displaying the rendered image from the 3D data set on a display device; and tracking the position of the handheld instrument to maintain the orientation.
- 31. A method as defined in claim 30 wherein the 3D data set comprises a 3D image of a real object.
- 32. A method as defined in claim 31 wherein the real object is a person.
- 33. A method as defined in claim 31 wherein the 3D data set comprises a 3D medical image.
- 34. A method as defined in claim 31 wherein the 3D image is approximately registered to the real object.
- 35. A method as defined in claim 34, further comprising rendering the image from an orientation approximately equal to the orientation of the handheld instrument with respect to the real object.
- 36. A method as defined in claim 30, further comprising updating the rendering of the image in accordance with the orientation by detecting a triggering event of the handheld instrument.
- 37. A method as defined in claim 30, further comprising tracking the position of the handheld instrument with a tracking camera.
- 38. A method as defined in claim 30, further comprising tracking the handheld instrument by means of at least one optical marker.
- 39. A program storage device readable by machine, tangibly embodying a program of instructions executable by the machine to perform program steps for intuitively selecting the orientation of a three-dimensional (“3D”) data set to be rendered as an image, the program steps comprising:
selecting an orientation for an image from a 3D data set in correspondence with a handheld instrument; rendering the image from the 3D data set in accordance with the selected orientation; displaying the rendered image from the 3D data set on a display device; and tracking the position of the handheld instrument to determine the orientation.
- 40. A virtual camera interface for intuitively selecting the orientation of a three-dimensional (“3D”) data set to be rendered as an image, the interface comprising:
instrument means for selecting an orientation for an image from a 3D data set; computing means for rendering the image from the 3D data set in accordance with the selected orientation; display means for displaying the rendered image from the 3D data set; and tracking means for tracking the position of the instrument means to determine the orientation.
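The pose bookkeeping behind the method of claims 14 and 20 can be illustrated with a short sketch. This is not the claimed implementation, only a minimal illustration under assumed conventions: the virtual object is anchored by a fixed offset to the tracked physical base, the tracker reports base and viewpoint poses in a common world frame as 4×4 row-major rigid transforms, and all function and variable names are hypothetical.

```python
# Hypothetical sketch: the renderer needs the virtual object's pose in the
# user-viewpoint (eye) frame, composed from the tracked base pose and the
# fixed base-to-object offset. Poses are 4x4 row-major rigid transforms.

def mat_mul(a, b):
    # 4x4 matrix product.
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rigid_inverse(m):
    # Inverse of a rigid transform: transpose the rotation,
    # rotate and negate the translation.
    r = [[m[j][i] for j in range(3)] for i in range(3)]
    t = [-sum(r[i][j] * m[j][3] for j in range(3)) for i in range(3)]
    return [r[0] + [t[0]], r[1] + [t[1]], r[2] + [t[2]], [0.0, 0.0, 0.0, 1.0]]

def object_pose_in_eye(world_T_eye, world_T_base, base_T_object):
    # eye_T_object = inv(world_T_eye) @ world_T_base @ base_T_object
    return mat_mul(rigid_inverse(world_T_eye),
                   mat_mul(world_T_base, base_T_object))

# Example: base translated 1 unit along x, eye at the world origin, object
# coincident with the base; the object then sits 1 unit along x in eye space.
I = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
base = [row[:] for row in I]
base[0][3] = 1.0
pose = object_pose_in_eye(I, base, I)
```

Because every pose is expressed relative to the tracked base, moving the base moves the rendered object with it, which is what lets the physical base define the virtual object's location.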
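The orientation selection of claims 26, 30, and 35 can likewise be sketched: when the 3D image is registered to the real object, the rendering orientation is approximately the tracked orientation of the handheld instrument expressed in the object's frame. The sketch below assumes 3×3 row-major rotation matrices and a tracker reporting both orientations in a shared world frame; the names are illustrative, not from the specification.

```python
# Hypothetical sketch: express the handheld instrument's tracked orientation
# in the real object's frame to obtain the rendering orientation.

def mat3_mul(a, b):
    # 3x3 matrix product.
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose3(m):
    # For a rotation matrix, the transpose is the inverse.
    return [[m[j][i] for j in range(3)] for i in range(3)]

def rendering_orientation(world_R_object, world_R_instrument):
    # object_R_instrument = world_R_object^T @ world_R_instrument
    return mat3_mul(transpose3(world_R_object), world_R_instrument)

# Example: instrument rotated 90 degrees about z, object unrotated; the
# selected rendering orientation is that same 90-degree rotation.
I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
rz90 = [[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]]
orient = rendering_orientation(I3, rz90)
```

On a triggering event such as the switch of claim 27, a renderer would recompute the image from this orientation, so pointing the instrument at the real object behaves like aiming a camera at the registered 3D data set.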
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. Provisional Application Serial No. 60/356,191 (Attorney Docket No. 2002P02426US), filed Feb. 12, 2002 and entitled “Virtual Reality/Augmented Reality User Interface for Studying and Interacting with 3D Data Sets”, which is incorporated herein by reference in its entirety. This application further claims the benefit of U.S. Provisional Application Serial No. 60/356,190 (Attorney Docket No. 2002P02429US), filed Feb. 12, 2002 and entitled “Virtual Camera as Intuitive User Interface to 3D Data Display”, which is incorporated herein by reference in its entirety.
Provisional Applications (2)
| Number | Date | Country |
| --- | --- | --- |
| 60356191 | Feb 2002 | US |
| 60356190 | Feb 2002 | US |