The present invention relates to endoscopic surgical navigation by the combination of variable direction of view endoscopy and image guided surgery techniques, especially as it relates to neurosurgery.
Image guided surgical navigation is the process of planning minimally invasive surgical approaches and guiding surgical tools towards targets inside a patient's body with the help of anatomical imaging information obtained with techniques such as ultrasound, magnetic resonance, and various radiographic techniques. Such anatomical imaging information is useful because during a minimally invasive procedure, the surgical tools and the subcutaneous anatomy are not directly visible to the surgeon. With early image guided surgical techniques, the surgeon had to rely on her ability to accurately correlate two-dimensional slice-plane data with the three-dimensionality of the patient in order to safely guide tools in the surgical field. The main drawbacks of this method were that it required abstract visualization by the surgeon in an attempt to develop an accurate mental picture of the interior anatomy, and that it did not provide feedback to the surgeon about the position of the surgical instruments during a procedure. These problems were addressed with the advent of frameless stereotactic systems, as disclosed in U.S. Pat. No. 5,230,623 to Guthrie (1993), U.S. Pat. No. 5,531,227 to Schneider (1996), U.S. Pat. No. 5,617,857 to Chader (1997), and U.S. Pat. No. 5,920,395 to Schulz (1999), which could locate and display the real-time global position of a surgical instrument relative to reconstructed computer graphical models of diagnostic imaging data obtained through newer techniques such as computed tomography, magnetic resonance imaging, positron emission tomography, ultrasound scans, and other techniques. Frameless stereotaxy was further improved by methods which could provide real-time virtual anatomical views from the viewpoint of the surgical instrument as it was positioned inside the patient, as disclosed in U.S. Pat. No. 6,167,296 (2000) and U.S. Pat. No. 6,442,417 (2002) to Shahidi.
The backbone of minimally invasive surgical procedures is the endoscope, which affords surgeons an actual view of the internal anatomy. The combination of endoscopy and image guided surgery is valuable because it brings together the interior view of the endoscope and the exterior perspective of the image guided surgical system, much like local visual information such as landmarks or street signs is correlated with a map or a global positioning system to accurately determine position in a landscape. This combination is suggested by Shahidi, who teaches correlating and overlaying real endoscopic images with virtual images of the same view reconstructed from global imaging data, affording advantages such as graphical image enhancement. Shahidi deals exclusively with images generated from the viewpoint of an endoscope or surgical instrument looking along its longitudinal axis, tying the disclosure to fixed-axis instruments. U.S. Pat. No. 6,442,417 specifically teaches the use of virtual perspective images of regions outside the field of view of a fixed-angle endoscope as substitutes for obtaining live endoscopic views of such regions. Variable direction of view endoscopes can provide real images of such areas without the need for much shaft movement or reinsertion of the endoscope from an alternate direction. Variable direction of view endoscopes, which can be either rigid or flexible, as disclosed in U.S. Pat. No. 3,880,148 to Kanehira (1975), U.S. Pat. No. 4,697,577 to Forkner (1987), U.S. Pat. No. 6,371,909 to Hoeg (2002), WIPO publication WO 01/22865A1 to Ramsbottom (2001), DE 29907430 to Schich (1999), U.S. Pat. No. 3,572,325 to Bazell et al. (1971), and U.S. Pat. No. 6,007,484 to Thompson (1999), typically have a mechanism at the tip allowing the user to change the viewing direction without moving the endoscope shaft. Electronic endoscopes, as disclosed in U.S. Pat. No. 5,954,634 to Igarashi (1999) and U.S. Pat. No. 5,313,306 to Kuban et al. (1994), with extreme wide-angle lenses that allow the user to selectively look at portions of the optical field also belong to the class of variable direction of view endoscopes.
The value of using an image guidance system in conjunction with variable direction of view endoscopy is potentially much greater than for standard fixed-angle endoscopy. Firstly, such a combination would allow real and virtual image correlation over a much greater viewing range, which would mean improved approach planning, improved guidance capabilities, and improved procedures overall. Secondly, it would provide a significant improvement in viewing navigation with variable direction of view endoscopes. A problem introduced by variable direction of view endoscopes is that it is difficult for the surgeon to estimate the changing endoscopic line of sight, which has a variable relationship to the shaft axis, because the tip of the instrument is concealed during use. Getting an external estimate of where the endoscope is “looking” during a procedure is important as the surgeon tries to integrate preexisting knowledge of the anatomy with the viewing process. Even with indicator knobs and dials (as in United States patent application 20020099263), or markers along the imaging axis (U.S. Pat. No. 6,500,115 to Krattiger et al.), it can be difficult to estimate which part of the anatomy is being seen through the endoscope because the user does not know the location of the endoscope tip, which is the point of origin for the variable view vector. Fixed-angle endoscopes do not suffer from this problem to the same degree because the viewing direction has a fixed relationship to the endoscope shaft and can often be mentally extrapolated by the surgeon during a procedure.
The solution to this problem is to use an image guided system to provide the surgeon with a global perspective of the endoscope's viewing direction. In order to achieve this, it is not sufficient to simply monitor the position of the shaft of the endoscope as described in the prior art and done in current practice. The endoscopic viewing direction has to be monitored as well. One way to do this is to equip the view changing mechanism with an emitter/transponder which can be sensed through the patient's skin by external sensors. A better way to monitor the viewing direction is to sense its orientation relative to the endoscope shaft, whose position can be found by current image guided systems. This requires a variable direction of view endoscope instrumented with means to monitor its internal configuration. By combining the instrument's internal configuration data with its global position data as determined by the image guided surgical system, its viewing direction can then be determined. The variable direction of view endoscopes disclosed in the prior art listed above are not equipped with means of monitoring their internal configuration. Apparently the only system currently capable of such internal configuration monitoring is that of U.S. Pat. No. 6,663,559 to Hale et al., which discloses a novel system and method for precision control of variable direction of view endoscopes, making it ideal for integration with an image guided surgical system.
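Purely as an illustrative sketch, and not as the disclosed system, the combination of internal configuration data with global position data can be pictured as a simple composition of transforms: the tracked shaft pose expressed in scan-data coordinates is combined with the internal pan/tilt angles of the view-changing mechanism to yield the line of sight in the same coordinate frame as the volumetric scan. All names and the pan/tilt parameterization below are hypothetical simplifications.

```python
import numpy as np

def rotation_about_axis(axis, angle):
    """Rodrigues' formula: rotation matrix for `angle` radians about `axis`."""
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def global_view_vector(R_shaft, t_tip, pan, tilt):
    """
    R_shaft : 3x3 rotation of the endoscope shaft in scan-data coordinates,
              as reported by the image guided tracking system.
    t_tip   : 3-vector tip position in scan-data coordinates.
    pan,tilt: hypothetical internal view-changing mechanism angles (radians),
              reported by the instrumented endoscope.
    Returns the tip position and the unit line-of-sight vector in scan coordinates.
    """
    z_axis = np.array([0.0, 0.0, 1.0])   # longitudinal (shaft) axis in the endoscope frame
    x_axis = np.array([1.0, 0.0, 0.0])   # reference axis for the tilt rotation
    # Internal configuration: tilt away from the shaft axis, then pan about it.
    R_internal = rotation_about_axis(z_axis, pan) @ rotation_about_axis(x_axis, tilt)
    view_dir_local = R_internal @ z_axis           # line of sight in the endoscope frame
    view_dir_global = R_shaft @ view_dir_local     # line of sight in scan coordinates
    return t_tip, view_dir_global
```

The resulting tip position and view vector can then be displayed against the reconstructed scan data, giving the surgeon the global perspective of the viewing direction described above.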
With proper integration, the extended viewing capabilities of an appropriately instrumented variable direction of view endoscope, such as the one disclosed by Hale, combined with the features of an image guided surgical system, could simplify and improve surgical planning and procedures. Global view vector monitoring would solve many of the endoscopic orientation problems surgeons face during variable direction of view endoscopy. Further, such an omnidirectional viewing navigation system could greatly expand the graphical image enhancement techniques disclosed by Shahidi.
From the discussion above, it should be apparent that there is a need for a method which provides improved endoscopic orientation, global monitoring of endoscopic position and viewing direction, and improved surgical approach and procedure planning.
In accordance with the present invention, a method is provided for combining a variable direction of view endoscopy system with image guided surgery techniques, yielding significantly improved viewing capabilities and novel surgical planning features. A method for improving a diagnostic or surgical procedure involving a variable direction of view endoscope with a variable line of sight comprises: acquiring volumetric scan data of a subsurface structure; positioning said endoscope relative to said subsurface structure; establishing the position of said endoscope relative to said subsurface structure; acquiring internal endoscope configuration data; and displaying representations of said subsurface structure and said line of sight in their correct relative spatial relationship based on said volumetric scan data, said endoscope position data, and said internal endoscope configuration data.
The following detailed description illustrates the invention by way of example, not by way of limitation of the principles of the invention. This description will clearly enable one skilled in the art to make and use the invention, and describes several embodiments, adaptations, variations, alternatives and uses of the invention, including what we presently believe is the best mode of carrying out the invention.
Prior Art
Referring now to the drawings, in which like reference numbers represent similar or identical structures throughout,
Preferred Embodiment
The relative positions of the endoscope, its viewing direction, the anatomy, and additional relevant information are presented to the user as shown in
One of the important features of the present embodiment is its surgical approach planning capability. Because a variable direction of view endoscope can change its line of sight once it is positioned in a cavity, its entry angle can be chosen from a large range. This makes it easier to avoid critical or delicate anatomy when positioning the endoscope.
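As a minimal sketch only, assuming a simplified straight-line insertion model and hypothetical names (this is not the disclosed planning algorithm), approach planning of this kind could be approximated by screening candidate entry points against critical structures labeled in the scan data, deferring the choice of viewing direction until after placement.

```python
import numpy as np

def path_is_clear(entry_point, tip_target, critical_points, clearance_mm=5.0):
    """True if the straight segment from entry_point to tip_target stays at least
    `clearance_mm` away from every labeled critical-structure point."""
    d = tip_target - entry_point
    length = np.linalg.norm(d)
    d = d / length
    v = critical_points - entry_point               # (N, 3) vectors to critical points
    t = np.clip(v @ d, 0.0, length)                 # projections onto the segment
    closest = entry_point + np.outer(t, d)          # closest point on the segment to each
    dist = np.linalg.norm(critical_points - closest, axis=1)
    return bool(np.all(dist >= clearance_mm))

def candidate_entries(entry_points, tip_target, critical_points):
    """Keep only entry points with a clear straight path to the desired tip position;
    the line of sight itself can be redirected after the endoscope is placed."""
    return [p for p in entry_points if path_is_clear(p, tip_target, critical_points)]
```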
Another valuable diagnostic and surgical planning feature is displaying the subset of points of a scan data set corresponding to the parts of the anatomy which an endoscope is capable of seeing from a given position. This is illustrated in the 2-dimensional example of
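As a simplified, hypothetical sketch of this feature (not the actual implementation, and ignoring occlusion by intervening tissue), the subset of scan data points reachable by the endoscope's line of sight from a given tip position could be approximated by limiting the angle between each viewing ray and the shaft axis to the instrument's maximum view deflection.

```python
import numpy as np

def viewable_points(scan_points, tip, shaft_axis, max_view_angle_deg=120.0):
    """Return the subset of scan data points whose direction from the endoscope tip
    lies within the instrument's reachable viewing cone."""
    shaft_axis = shaft_axis / np.linalg.norm(shaft_axis)
    rays = scan_points - tip                                    # rays from tip to each point
    rays = rays / np.linalg.norm(rays, axis=1, keepdims=True)
    cos_angle = rays @ shaft_axis                               # cosine of angle to shaft axis
    limit = np.cos(np.radians(max_view_angle_deg))
    return scan_points[cos_angle >= limit]                      # points within the viewing range
```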
The integrated omnidirectional endoscopic image guided stereotactic system can also provide the endoscope itself with safe entry and retraction procedures. One of the biggest advantages of an omnidirectional endoscope is its ability to look forward during insertion, and then change its line of sight once the tip is inside the surgical cavity. Fixed viewing scopes with an off-angle view can be dangerous because they are not “looking” in the direction they are being plunged. This can be likened to driving a car without watching the road. If the omnidirectional endoscope is plunged manually, it can be programmed to do intermediate reconnaissance scans on its way into the cavity. For example, at certain preset depths determined from stereotactic information, the plunging procedure would temporarily stop and allow the endoscope to scan or look in prescribed directions to verify its location and also look for any obstacles in its path. If the endoscope is fully automated, it plunges itself a certain distance before stopping and stepping the surgeon through a predetermined scan. If the scan is satisfactory, the surgeon instructs the endoscope to return to its forward-looking configuration and plunge another incremental distance, and so on. A similar procedure could be performed as the endoscope is retracted.
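The stepwise insertion procedure described above could, purely as an illustrative sketch with a hypothetical controller interface (not the disclosed control system), be structured as a loop that pauses at preset stereotactically derived depths for a reconnaissance scan before advancing further.

```python
def guided_insertion(endoscope, scan_depths_mm, scan_directions, confirm):
    """Advance the endoscope in increments, pausing at each preset depth (derived
    from stereotactic information) to sweep the view in prescribed directions.
    `endoscope` is a hypothetical controller; `confirm` is a callback asking the
    surgeon whether the reconnaissance scan at the given depth is satisfactory."""
    previous_depth = 0.0
    for depth in scan_depths_mm:
        endoscope.set_view_forward()               # look where the scope is going
        endoscope.advance(depth - previous_depth)  # plunge one increment
        for direction in scan_directions:          # reconnaissance scan at this depth
            endoscope.set_view(direction)
        endoscope.set_view_forward()
        if not confirm(depth):
            return False                           # halt insertion if the scan is unsatisfactory
        previous_depth = depth
    return True
```

A similar loop, run in reverse order over the preset depths, would serve for safe retraction.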
It is possible to establish the global position of an endoscope with respect to a set of volumetric scan data without the use of external sensors, such as the cameras 64 (
Accordingly, the present invention provides new endoscopic and surgical orientation capabilities, global monitoring of the endoscopic position and viewing direction, and improved surgical approach and procedure planning.
The present invention has been described above in terms of a presently preferred embodiment so that an understanding of the present invention can be conveyed. However, there are many configurations for a variable direction of view endoscope and method for viewing not specifically described herein but with which the present invention is applicable. Many structural and material variations are possible, as are variations in application. For example, while the examples were given with respect to an endoscope for use in surgical procedures, the present invention would be equally applicable with respect to a borescope for use within various mechanical structures, or to other types of variable direction probes which use wavelengths other than visible light. The scope of the present invention should therefore not be limited by the embodiments illustrated; rather, it should be understood that the present invention has wide applicability with respect to viewing or sensing instruments and procedures generally. All modifications, variations, or equivalent elements and implementations that are within the scope of the appended claims should therefore be considered within the scope of the invention.
This patent application is a divisional of U.S. patent application Ser. No. 11/058,311, filed Feb. 14, 2005, now U.S. Pat. No. 7,967,742, the content of which is incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
3572325 | Bazell et al. | Mar 1971 | A |
3880148 | Kanehira et al. | Apr 1975 | A |
4697577 | Forkner | Oct 1987 | A |
5230623 | Guthrie et al. | Jul 1993 | A |
5313306 | Kuban et al. | May 1994 | A |
5531227 | Schneider | Jul 1996 | A |
5617857 | Chader et al. | Apr 1997 | A |
5623560 | Nakajima et al. | Apr 1997 | A |
5638819 | Manwaring et al. | Jun 1997 | A |
5661519 | Franetzki | Aug 1997 | A |
5677763 | Redmond | Oct 1997 | A |
5704897 | Truppe | Jan 1998 | A |
5899851 | Koninckx | May 1999 | A |
5920395 | Schulz | Jul 1999 | A |
5954634 | Igarashi | Sep 1999 | A |
5976076 | Kolff et al. | Nov 1999 | A |
6007484 | Thompson | Dec 1999 | A |
6097423 | Mattsson-Boze et al. | Aug 2000 | A |
6167296 | Shahidi | Dec 2000 | A |
6241657 | Chen et al. | Jun 2001 | B1 |
6371909 | Hoeg et al. | Apr 2002 | B1 |
6442417 | Shahidi et al. | Aug 2002 | B1 |
6464631 | Girke et al. | Oct 2002 | B1 |
6471637 | Green et al. | Oct 2002 | B1 |
6500115 | Krattiger et al. | Dec 2002 | B2 |
6505065 | Yanof et al. | Jan 2003 | B1 |
6648817 | Schara et al. | Nov 2003 | B2 |
6663559 | Hale et al. | Dec 2003 | B2 |
6695774 | Hale et al. | Feb 2004 | B2 |
6801643 | Pieper | Oct 2004 | B2 |
7660623 | Hunter et al. | Feb 2010 | B2 |
20020007108 | Chen et al. | Jan 2002 | A1 |
20020010384 | Shahidi et al. | Jan 2002 | A1 |
20020045855 | Frassica | Apr 2002 | A1 |
20020099263 | Hale et al. | Jul 2002 | A1 |
20020161280 | Chatnever et al. | Oct 2002 | A1 |
20030016883 | Baron | Jan 2003 | A1 |
20030216639 | Gilboa et al. | Nov 2003 | A1 |
20040127769 | Hale et al. | Jul 2004 | A1 |
20040210105 | Hale et al. | Oct 2004 | A1 |
20050020878 | Ohnishi et al. | Jan 2005 | A1 |
20050020883 | Chatenever et al. | Jan 2005 | A1 |
20050027167 | Chatenever et al. | Feb 2005 | A1 |
20050033117 | Ozaki et al. | Feb 2005 | A1 |
20050054895 | Hoeg et al. | Mar 2005 | A1 |
20050113643 | Hale et al. | May 2005 | A1 |
20050154260 | Schara et al. | Jul 2005 | A1 |
20050187432 | Hale et al. | Aug 2005 | A1 |
20050228230 | Schara et al. | Oct 2005 | A1 |
Number | Date | Country |
---|---|---|
29907430 | Oct 1999 | DE |
6269403 | Sep 1994 | JP |
9501749 | Jan 1995 | WO |
0122865 | Apr 2001 | WO |
Number | Date | Country | |
---|---|---|---|
20110230710 A1 | Sep 2011 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 11058311 | Feb 2005 | US |
Child | 13117875 | US |