The present invention relates to systems and methods for displaying images and specifically to the display of endoscopic images for medical and industrial applications.
Endoscopes are elongated devices used to visualize the insides of cavities. Originally, endoscopes were equipped with eyepieces for direct observation. Today many endoscopes are equipped with electronic cameras, such as CCD or CMOS image sensors. These sensors are used to capture images from the area viewed with the endoscope. Endoscopic imaging is the process of capturing images from internal structures and transmitting them to an external viewer.
Generally, a viewing situation involves a three-dimensional surface. However, the captured image is only a two-dimensional entity in an image plane that is generally orthogonal to the viewing direction of the endoscope. The image generated at the image plane is typically displayed to the user as it would be seen looking along the viewing direction of the endoscope from the endoscopic viewing point. For comparison, stereo-viewing endoscopes capture two slightly offset images which are used to provide the user with a three-dimensional image; however, they still only provide a view from the viewing point and in the viewing direction of the endoscope. Because the endoscope, the user, and the internal structure being examined exist in an actual three-dimensional world, tying the user to the viewing set of the endoscope (its viewing point, viewing direction, and viewing orientation) limits the way information about the internal structure can be conveyed. The user will often desire to change the viewing set, but with existing technology the only option is to move the endoscope itself, which is not always convenient or even possible. In these and other instances it would be useful for the user to be able to change the viewing set without changing the actual position of the endoscope. An alternative viewing set could provide a better perspective of the physical surface and give the user a better sense of the relative locations of viewed features. For example, it would be advantageous to be able to use a viewing set aligned with the user's physical position instead of the physical position of the endoscope. Other viewing sets might also be desired, as determined by the preferences of the user.
Two methods of volumetric image navigation are described in U.S. Pat. No. 6,167,296 to Shahidi and in U.S. Pat. No. 6,442,417 to Shahidi et al. Both methods utilize a volumetric data set obtained in a preoperative X-ray or MRI scan to construct a three-dimensional anatomical model in a computer. This model is used to generate two-dimensional perspective projection views of the simulated anatomy, which may then be compared with an actual endoscopic image. However, although these systems provide a three-dimensional model with a variable viewing set, they can only display the endoscopic image itself from the viewing set of the endoscope.
Because of this limitation, which is common for all existing endoscopic display systems, the true nature of the viewed area is often not conveyed adequately to the user. It would therefore be desirable to display the endoscopic image in a way that more accurately represents the actual three-dimensional surface from which the image was taken and permits the user to achieve a wide variety of different views of this surface.
Accordingly, the primary object of the present invention is to provide a versatile method of displaying endoscopic images that supports three-dimensional surfaces and variable viewing points, directions, and orientations. It is a further object of this invention to have this method applicable to all endoscopes regardless of type, viewing direction, or image sensor format.
In accordance with the present invention, a method for displaying an endoscopic image comprises receiving an endoscopic image of a viewed surface, providing a virtual surface with said endoscopic image mapped onto said virtual surface, rendering a rendered image of said virtual surface, and providing said rendered image to a user.
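The four claimed steps can be illustrated with a minimal sketch, assuming a pinhole camera model for the endoscope and a planar virtual surface at a fixed depth; all function and parameter names here are hypothetical and chosen for illustration only, not taken from the invention:

```python
def unproject_to_plane(u, v, focal, plane_z):
    """Cast a ray through pixel (u, v) of the endoscopic image and
    intersect it with a planar virtual surface at depth plane_z."""
    # Pinhole model: ray direction (u/f, v/f, 1), endoscope camera at origin.
    t = plane_z  # the ray's z component is 1, so it reaches the plane at t = plane_z
    return (u / focal * t, v / focal * t, plane_z)

def project(point, cam_pos, focal):
    """Project a 3D surface point into a virtual camera translated to
    cam_pos (axes kept aligned with the endoscope camera, for brevity)."""
    x, y, z = (p - c for p, c in zip(point, cam_pos))
    return (focal * x / z, focal * y / z)

def render(image, focal, plane_z, cam_pos):
    """Map each pixel of the endoscopic image onto the virtual surface
    (texture mapping), then re-project it from an alternative viewing
    point (rendering) to produce the displayed image."""
    out = {}
    for (u, v), color in image.items():
        surface_point = unproject_to_plane(u, v, focal, plane_z)
        out[project(surface_point, cam_pos, focal)] = color
    return out
```

With `cam_pos` at the origin the rendered image reproduces the original endoscopic view; any other position yields an alternative view of the same mapped surface without moving the endoscope.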
The following detailed description illustrates the invention by way of example, not by way of limitation of the principles of the invention. This description will enable one skilled in the art to make and use the invention, and describes several embodiments, adaptations, variations, alternatives and uses of the invention, including what we presently believe is the best mode of carrying out the invention.
The preferred embodiment of the invention is a software program running on a computer. The computer communicates electronically with an endoscope and a display device such as a monitor. The computer includes a graphics processing unit such as those manufactured by NVIDIA Corporation. The graphics processing unit is specifically designed to quickly perform the types of graphics-related calculations required by the present invention. Other devices may be connected to the computer as appropriate for a given application.
The program uses the graphical display programming library OpenGL. This library offers a powerful set of programming tools optimized for displaying textured shapes in three dimensions. It allows a collection of virtual shapes and a virtual viewing set to be defined as data within the computer memory. The collection of virtual shapes is then rendered based on the viewing parameters and displayed on a monitor. An alternative library, such as DirectX, could be used without departing from the scope of this invention.
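The virtual viewing set maps directly onto the camera basis that libraries such as OpenGL consume (cf. `gluLookAt`). A sketch of that construction in plain Python, under the assumption that the viewing orientation is encoded as an "up" vector; the function names are hypothetical:

```python
def normalize(v):
    """Scale a vector to unit length."""
    n = sum(c * c for c in v) ** 0.5
    return tuple(c / n for c in v)

def cross(a, b):
    """Cross product of two 3D vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def view_basis(view_dir, up):
    """Build an orthonormal camera basis from a viewing direction and
    an 'up' vector encoding the viewing orientation, in the manner of
    OpenGL's gluLookAt: forward, side (right), and corrected up."""
    f = normalize(view_dir)        # forward axis
    s = normalize(cross(f, up))    # side axis, perpendicular to forward and up
    u = cross(s, f)                # recomputed up, guaranteeing orthogonality
    return s, u, f
```

Combined with the virtual viewing point as a translation, these three axes define the view transformation applied to the virtual surface before rendering.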
In certain cases the anatomy may be shaped in such a way that parts of it are obscured from the endoscopic viewing point. Because such regions do not appear in the captured image, no image data is available to map onto the corresponding portions of the virtual surface.
The configuration parameters required by the computer may be obtained in a variety of ways. For example the virtual surface can be constructed from scan data obtained through techniques such as MRI. Alternatively, the virtual surface could be selected from a collection of standard surfaces. The virtual viewing point, viewing direction, and viewing orientation may be specified using any standard data input technique. Specialized pointers for specifying the viewing set can also be used. The relationship between the endoscopic viewing set and the actual viewing surface can be input by the user or obtained from stereotactic systems or other sensors.
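The configuration parameters enumerated above can be grouped into a simple data structure. The following sketch is purely illustrative, with hypothetical field names; it only shows one plausible way of organizing the inputs the paragraph describes:

```python
from dataclasses import dataclass, field
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class ViewingSet:
    """Virtual viewing point, viewing direction, and orientation ('up')."""
    point: Vec3 = (0.0, 0.0, 0.0)
    direction: Vec3 = (0.0, 0.0, 1.0)
    up: Vec3 = (0.0, 1.0, 0.0)

@dataclass
class DisplayConfig:
    """Configuration inputs: where the virtual surface comes from
    (e.g. MRI scan data or a standard surface) and how the virtual
    camera is placed."""
    surface_source: str = "standard"   # hypothetical: "standard" or "scan"
    viewing: ViewingSet = field(default_factory=ViewingSet)
```

Each field could equally be populated from a user interface, a specialized pointer, or a stereotactic sensor, as the paragraph notes.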
Accordingly, the present invention provides a new method for viewing endoscopic images which affords the user an enhanced, versatile, and more realistic representation of structures viewed by an endoscope.
The present invention has been described above in terms of a presently preferred embodiment so that an understanding of the present invention can be conveyed. However, there are many alternative modes of operation not specifically described herein but with which the present invention is applicable. For example, although specific surface approximation schemes were given, any surface approximation technique known from fields such as computer graphics and machine vision would fall under the scope of this invention. Also, there are many different ways to implement a user interface. In addition, while the examples were given with respect to endoscopes for use in surgical procedures, the present invention would be equally applicable with respect to borescopes or the like for use in non-medical situations. The scope of the present invention should therefore not be limited by the embodiments illustrated, but rather it should be understood that the present invention has wide applicability with respect to viewing instruments and procedures generally. All modifications, variations, or equivalent elements and implementations that are within the scope of the appended claims should therefore be considered within the scope of the invention.
Number | Name | Date | Kind |
---|---|---|---|
3572325 | Bazell et al. | Mar 1971 | A |
3880148 | Kanehira et al. | Apr 1975 | A |
4697577 | Forkner | Oct 1987 | A |
5230623 | Guthrie et al. | Jul 1993 | A |
5307804 | Bonnet | May 1994 | A |
5313306 | Kuban et al. | May 1994 | A |
5515160 | Schulz et al. | May 1996 | A |
5531227 | Schneider | Jul 1996 | A |
5617857 | Chader et al. | Apr 1997 | A |
5623560 | Nakajima et al. | Apr 1997 | A |
5638819 | Manwaring et al. | Jun 1997 | A |
5661519 | Franetzki | Aug 1997 | A |
5677763 | Redmond | Oct 1997 | A |
5704897 | Truppe | Jan 1998 | A |
5776050 | Chen et al. | Jul 1998 | A |
5899851 | Koninckx | May 1999 | A |
5920395 | Schulz | Jul 1999 | A |
5954634 | Igarashi | Sep 1999 | A |
5976076 | Kolff et al. | Nov 1999 | A |
5995108 | Isobe et al. | Nov 1999 | A |
6007484 | Thompson | Dec 1999 | A |
6097423 | Mattsson-Boze et al. | Aug 2000 | A |
6135946 | Konen et al. | Oct 2000 | A |
6139499 | Wilk | Oct 2000 | A |
6167296 | Shahidi | Dec 2000 | A |
6283918 | Kanda et al. | Sep 2001 | B1 |
6371909 | Hoeg et al. | Apr 2002 | B1 |
6442417 | Shahidi et al. | Aug 2002 | B1 |
6443894 | Sumanaweera et al. | Sep 2002 | B1 |
6464631 | Girke et al. | Oct 2002 | B1 |
6471637 | Green et al. | Oct 2002 | B1 |
6500115 | Krattiger et al. | Dec 2002 | B2 |
6505065 | Yanof et al. | Jan 2003 | B1 |
6648817 | Schara et al. | Nov 2003 | B2 |
6663559 | Hale et al. | Dec 2003 | B2 |
6695774 | Hale et al. | Feb 2004 | B2 |
20020045855 | Frassica | Apr 2002 | A1 |
20020099263 | Hale et al. | Jul 2002 | A1 |
20020161280 | Chatenever et al. | Oct 2002 | A1 |
20030016883 | Baron | Jan 2003 | A1 |
20040127769 | Hale et al. | Jul 2004 | A1 |
20040210105 | Hale et al. | Oct 2004 | A1 |
20050015005 | Kockro | Jan 2005 | A1 |
20050020883 | Chatenever et al. | Jan 2005 | A1 |
20050027167 | Chatenever et al. | Feb 2005 | A1 |
20050054895 | Hoeg et al. | Mar 2005 | A1 |
20050085718 | Shahidi | Apr 2005 | A1 |
20050228250 | Bitter et al. | Oct 2005 | A1 |
Number | Date | Country |
---|---|---|
6269403 | Sep 1994 | JP |
WO 9501749 | Jan 1995 | WO |
WO 0122865 | Apr 2001 | WO |
Number | Date | Country |
---|---|---|
20050113643 A1 | May 2005 | US |