Method and apparatus for perspective inversion

Abstract
A surgical instrument navigation system is disclosed that allows a surgeon to invert the three-dimensional perspective of the instrument to match their perspective of the actual instrument. A data processor is operable to generate a three-dimensional representation of a surgical instrument as it would visually appear from either of at least two different perspectives and to overlay the representation of the surgical instrument onto image data of the patient. The image data and the representations can be displayed on a display.
Description
FIELD

The present teachings relate generally to surgical instrument navigation systems and, more particularly, to a navigation system that provides perspective inversion of the surgical instrument.


BACKGROUND

Modern diagnostic medicine has benefited significantly from radiology. Radiation, such as x-rays, may be used to generate images of internal body structures. In general, radiation is emitted toward a patient's body and absorbed in varying amounts by tissues in the body. An x-ray image is then created based on the relative differences of detected radiation passing through the patient's body.


Surgical navigation guidance can provide a tool for helping the physician perform surgery. One known technique involves tracking, in real time, the position of a surgical instrument within the patient's anatomy as represented by an x-ray image. The virtual representation of the surgical instrument is a three-dimensional object superimposed onto the two-dimensional image of the patient. Thus, the three-dimensional representation appears to be directed into or out of the two-dimensional image of the patient. An exemplary surgical navigation guidance system is disclosed in U.S. application Ser. No. 09/274,972 filed on Mar. 23, 1999, which is assigned to the assignee of the present teachings and incorporated herein by reference.


When an image is acquired, it is acquired from a certain perspective or point-of-view. In the case of a C-arm imaging device, the perspective is determined by the orientation of the C-arm around the patient. Specifically, the perspective is along the line connecting the image source and the image receiver. If the surgeon navigates the surgical instrument from the position of the image receiver, the perspective of the virtual representation of the instrument will match the surgeon's perspective of the actual instrument. However, if the surgeon navigates from the position of the radiation source, the perspective of the virtual representation of the instrument will appear “flipped” from the surgeon's perspective of the actual instrument.


Therefore, it is desirable to provide a surgical navigation system that allows the surgeon to invert or “flip” the three-dimensional perspective of the instrument to match their perspective of the actual instrument.


SUMMARY

In accordance with the present teachings, a surgical instrument navigation system is provided that allows a surgeon to invert the three-dimensional perspective of the instrument to match their perspective of the actual instrument. The surgical instrument navigation system includes: a surgical instrument; an imaging device that is operable to capture image data representative of a patient; a tracking subsystem that is operable to capture in real-time position data indicative of the position of the surgical instrument; and a data processor adapted to receive the image data from the imaging device and the position data from the tracking subsystem. The data processor is operable to generate a three-dimensional representation of the surgical instrument as it would visually appear from either of at least two different perspectives and to overlay the representation of the surgical instrument onto the image data of the patient. The navigation system further includes a display that is operable to display the representation of the surgical instrument superimposed onto the image data of the patient.


For a more complete understanding of the teachings, reference may be made to the following specification and to the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a diagram of a surgical instrument navigation system in accordance with the present teachings;



FIG. 2 is a diagram of an ideal and distorted image that may be captured by the surgical navigation system;



FIGS. 3A and 3B illustrate the projective transformation process employed by the surgical navigation system;



FIG. 4 is a flowchart depicting the operation of the surgical navigation system;



FIGS. 5A and 5B illustrate graphical representations of the surgical instrument superimposed onto a two-dimensional image of the patient;



FIG. 6 illustrates an exemplary graphical user interface of the surgical instrument navigation system; and



FIG. 7 is a flowchart depicting how perspective inversion is incorporated into the operation of the surgical instrument navigation system in accordance with the present teachings.





DETAILED DESCRIPTION OF THE VARIOUS EMBODIMENTS


FIG. 1 is a diagram of an exemplary surgical instrument navigation system. The primary component of the surgical instrument navigation system is a fluoroscopic imaging device 100. The fluoroscopic imaging device 100 generally includes a C-arm 103, x-ray source 104, x-ray receiving section 105, a calibration and tracking target 106, and radiation sensors 107. Calibration and tracking target 106 includes infrared reflectors (or alternatively infrared emitters) 109 and calibration markers 111. C-arm control computer 115 allows a physician to control the operation of imaging device 100, such as setting imaging parameters. One appropriate implementation of imaging device 100 is the “Series 9600 Mobile Digital Imaging System,” from OEC Medical Systems, Inc., of Salt Lake City, Utah. It should be noted that calibration and tracking target 106 and radiation sensors 107 are typically not included in the Series 9600 Mobile Digital Imaging System; otherwise the “Series 9600 Mobile Digital Imaging System” is similar to imaging system 100.


In operation, x-ray source 104 generates x-rays that propagate through patient 110 and calibration target 106, and into x-ray receiving section 105. Receiving section 105 generates an image representing the intensities of the received x-rays. Typically, receiving section 105 comprises an image intensifier that converts the x-rays to visible light and a charge coupled device (CCD) video camera that converts the visible light to digital images. Receiving section 105 may also be a device that converts x-rays directly to digital images, thus potentially avoiding distortion introduced by first converting to visible light.


Fluoroscopic images taken by imaging device 100 are transmitted to computer 115, where they may further be forwarded to computer 120. Computer 120 provides facilities for displaying (on monitor 121), saving, digitally manipulating, or printing a hard copy of the received images. Three-dimensional images, such as pre-acquired patient specific CT/MR data set 124 or a three-dimensional atlas data set 126, may also be manipulated by computer 120 and displayed by monitor 121. Images, instead of or in addition to being displayed on monitor 121, may also be displayed to the physician through a heads-up display.


Although computers 115 and 120 are shown as two separate computers, they alternatively could be variously implemented as multiple computers or as a single computer that performs the functions performed by computers 115 and 120. In this case, the single computer would receive input from both C-arm imager 100 and tracking sensor 130.


Radiation sensors 107 sense the presence of radiation, which is used to determine whether or not imaging device 100 is actively imaging. The result of their detection is transmitted to processing computer 120. Alternatively, a person may manually indicate when device 100 is actively imaging or this function can be built into x-ray source 104, x-ray receiving section 105, or control computer 115.


In operation, the patient is positioned between the x-ray source 104 and the x-ray receiving section 105. In response to an operator's command input at control computer 115, x-rays emanate from source 104 and pass through patient 110, calibration target 106, and into receiving section 105 which generates a two-dimensional image of the patient.


C-arm 103 is capable of rotating relative to patient 110, thereby allowing images of patient 110 to be taken from multiple directions. For example, the physician may rotate C-arm 103 in the direction of arrows 108 or about the long axis of the patient. Each of these directions of movement involves rotation about a mechanical axis of the C-arm. In this example, the long axis of the patient is aligned with the mechanical axis of the C-arm.


Raw images generated by receiving section 105 tend to suffer from undesirable distortion caused by a number of factors, including inherent image distortion in the image intensifier and external electromagnetic fields. Drawings representing ideal and distorted images are shown in FIG. 2. Checkerboard 202 represents the ideal image of a checkerboard shaped object. The image taken by receiving section 105, however, can suffer significant distortion, as illustrated by distorted image 204.


The image formation process in a system such as fluoroscopic C-arm imager 100 is governed by a geometric projective transformation which maps lines in the fluoroscope's field of view to points in the image (i.e., within the x-ray receiving section 105). This concept is illustrated in FIGS. 3A and 3B. Image 300 (and any image generated by the fluoroscope) is composed of discrete picture elements (pixels), an example of which is labeled as 302. Every pixel within image 300 has a corresponding three-dimensional line in the fluoroscope's field of view. For example, the line corresponding to pixel 302 is labeled as 304. The complete mapping between image pixels and corresponding lines governs projection of objects within the field of view into the image. The intensity value at pixel 302 is determined by the densities of the object elements (i.e., portions of a patient's anatomy, operating room table, etc.) intersected by the line 304. For the purposes of computer assisted navigational guidance, it is necessary to estimate the projective transformation which maps lines in the field of view to pixels in the image, and vice versa. Geometric projective transformation is well known in the art.
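The pixel-to-line mapping can be sketched with a simple pinhole model of the fluoroscope, in which each pixel's line runs from the x-ray source through that pixel's physical location on the receiving section. The geometric parameters below (source position, detector origin, pixel-step vectors) are illustrative assumptions, not the calibration model of any particular imager:

```python
import numpy as np

def pixel_to_ray(u, v, source_pos, detector_origin, du, dv):
    """Back-project pixel (u, v) to its three-dimensional line under a
    simple pinhole model: the line runs from the x-ray source through
    the physical location of the pixel on the receiving section.
    source_pos: 3-D source position; detector_origin: 3-D position of
    pixel (0, 0); du, dv: 3-D vectors for one pixel step along each
    image axis. All geometric parameters here are hypothetical."""
    pixel_pos = np.asarray(detector_origin) + u * np.asarray(du) + v * np.asarray(dv)
    direction = pixel_pos - np.asarray(source_pos)
    direction = direction / np.linalg.norm(direction)
    return np.asarray(source_pos, dtype=float), direction
```

The intensity recorded at the pixel is then determined by the object densities intersected along the returned line, as described above.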


Intrinsic calibration, which is the process of correcting image distortion in a received image and establishing the projective transformation for that image, involves placing “calibration markers” in the path of the x-ray, where a calibration marker is an object opaque or semi-opaque to x-rays. Calibration markers 111 are rigidly arranged in predetermined patterns in one or more planes in the path of the x-rays and are visible in the recorded images. Tracking targets, such as emitters or reflectors 109, are fixed in a known position relative to calibration markers 111.


Because the true relative positions of the calibration markers 111 in the recorded images are known, computer 120 is able to calculate an amount of distortion at each pixel in the image (where a pixel is a single point in the image). Accordingly, computer 120 can digitally compensate for the distortion in the image and generate a distortion-free, or at least a distortion-improved, image. Alternatively, distortion may be left in the image, and subsequent operations on the image, such as superimposing an iconic representation of a surgical instrument on the image (described in more detail below), may be distorted to match the image distortion determined by the calibration markers. The calibration markers can also be used to estimate the geometric perspective transformation, since the positions of these markers are known with respect to the tracking target emitters or reflectors 109 and ultimately with respect to tracking sensor 130. A more detailed explanation of methods for performing intrinsic calibration is described in the references B. Schuele et al., “Correction of Image Intensifier Distortion for Three-Dimensional Reconstruction,” presented at SPIE Medical Imaging 1995, San Diego, Calif., 1995; G. Champleboux et al., “Accurate Calibration of Cameras and Range Imaging Sensors: the NPBS Method,” Proceedings of the 1992 IEEE International Conference on Robotics and Automation, Nice, France, May 1992; and U.S. application Ser. No. 09/106,109, filed on Jun. 29, 1998 by the present assignee, the contents of which are hereby incorporated by reference.
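As an illustrative sketch of the dewarping step, one common approach is a least-squares polynomial fit from the distorted marker positions to their known ideal positions, which can then be applied to any pixel. The quadratic basis below is an assumption for illustration, not the specific correction method of the references cited above:

```python
import numpy as np

def fit_dewarp(distorted, ideal):
    """Least-squares fit of a per-axis quadratic polynomial mapping
    distorted marker pixel coordinates (N, 2) to their known ideal
    positions (N, 2)."""
    x, y = distorted[:, 0], distorted[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, ideal, rcond=None)
    return coeffs

def dewarp(points, coeffs):
    """Apply the fitted correction to arbitrary pixel coordinates."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    return A @ coeffs
```

With the markers arranged in a known grid, the fit can be computed once per image and applied either to the image itself or, as noted above, inversely to the superimposed instrument graphics.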


Calibration and tracking target 106 may be attached to x-ray receiving section 105 of the C-arm. Alternately, the target 106 can be mechanically independent of the C-arm, in which case it should be positioned such that the included calibration markers 111 are visible in each fluoroscopic image to be used in navigational guidance. Element 106 serves two functions. The first, as described above, is holding calibration markers 111 used in intrinsic calibration. The second function, which is described in more detail below, is holding infrared emitters or reflectors 109, which act as a tracking target for tracking sensor 130.


Tracking sensor 130 is a real-time infrared tracking sensor linked to computer 120. Specially constructed surgical instruments and other markers in the field of tracking sensor 130 can be detected and located in three-dimensional space. For example, a surgical instrument 140, such as a drill, is embedded with infrared emitters or reflectors 141 on its handle. Tracking sensor 130 detects the presence and location of infrared emitters or reflectors 141. Because the relative spatial locations of the emitters or reflectors in instrument 140 are known a priori, tracking sensor 130 and computer 120 are able to locate instrument 140 in three-dimensional space using well known mathematical transformations. Instead of using infrared tracking sensor 130 and corresponding infrared emitters or reflectors, other types of positional location devices which are known in the art may be used. For example, positional location devices based on magnetic fields, sonic emissions, or radio waves are also within the scope of the present teachings.
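The “well known mathematical transformations” can be sketched as a rigid-body fit: given the a priori emitter geometry and the measured emitter positions, a least-squares rotation and translation locate the instrument. The Kabsch/SVD method below is one standard choice, not necessarily the one used by the system:

```python
import numpy as np

def locate_rigid_body(model_pts, measured_pts):
    """Estimate the rotation R and translation t mapping emitter
    positions known a priori in the instrument's own frame (model_pts,
    shape (N, 3)) onto their measured 3-D positions (measured_pts),
    via the Kabsch / SVD least-squares fit."""
    mc = model_pts.mean(axis=0)
    sc = measured_pts.mean(axis=0)
    H = (model_pts - mc).T @ (measured_pts - sc)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = sc - R @ mc
    return R, t
```

At least three non-collinear emitters are needed for a unique pose; additional emitters make the fit robust to measurement noise.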


Reference frame marker 150, like surgical instrument 140, is embedded with infrared emitters or reflectors, labeled 151. As with instrument 140, tracking sensor 130 similarly detects the spatial location of emitters/reflectors 151, through which tracking sensor 130 and computer 120 determine the three-dimensional position of dynamic reference frame marker 150. The determination of the three-dimensional position of an object relative to a patient is known in the art, and is discussed, for example, in the following references, each of which is hereby incorporated by reference: PCT Publication WO 96/11624 to Bucholz et al., published Apr. 25, 1996; U.S. Pat. No. 5,384,454 to Bucholz; U.S. Pat. No. 5,851,183 to Bucholz; and U.S. Pat. No. 5,871,445 to Bucholz.


During an operation, dynamic reference frame marker 150 is attached in a fixed position relative to the portion of the patient to be operated on. For example, when inserting a screw into the spine of patient 110, dynamic reference frame marker 150 may be physically attached to a portion of the spine of the patient. Because dynamic reference frame 150 is in a fixed position relative to the patient anatomy, and instrument 140 can be accurately located in three dimensional space relative to dynamic reference frame 150, instrument 140 can also be located relative to the patient's anatomy.
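This chain of reasoning is a composition of rigid transforms. A minimal sketch, with hypothetical frame names (`T_sensor_instr` for the instrument pose as seen by the tracking sensor, `T_sensor_drf` for the dynamic reference frame pose):

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def instrument_in_patient_frame(T_sensor_instr, T_sensor_drf):
    """Instrument pose relative to the dynamic reference frame (and
    hence the patient anatomy it is fixed to):
    T_drf_instr = inv(T_sensor_drf) @ T_sensor_instr."""
    return np.linalg.inv(T_sensor_drf) @ T_sensor_instr
```

Because the result is expressed relative to the reference frame, it remains valid even if the sensor or the patient moves, so long as marker 150 stays fixed to the anatomy.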


As discussed above, calibration and tracking target 106 also includes infrared emitters or reflectors 109 similar to those in instrument 140 or dynamic reference frame 150. Accordingly, tracking sensor 130 and computer 120 may determine the three-dimensional position of calibration target 106 relative to instrument 140 and/or dynamic reference frame 150 and thus the patient position.


In general, the imaging system assists physicians performing surgery by displaying real-time or pre-acquired images, such as fluoroscopic x-ray images, of the patient 110 on display 121. Representations of surgical instruments 140 are overlaid on pre-acquired fluoroscopic images of patient 110 based on the position of the instruments determined by tracking sensor 130. In this manner, the physician is able to see the location of the instrument relative to the patient's anatomy, without the need to acquire real-time fluoroscopic images, thereby greatly reducing radiation exposure to the patient and to the surgical team. “Pre-acquired,” as used herein, is not intended to imply any required minimum duration between receipt of the x-ray signals and displaying the corresponding image. Momentarily storing the corresponding digital signal in computer memory while displaying the fluoroscopic image constitutes pre-acquiring the image.



FIG. 4 is a flowchart depicting the operation of the surgical navigation system. The physician begins by acquiring one or more fluoroscopic x-ray images of patient 110 using imager 100 (step 400). As previously mentioned, acquiring an x-ray image triggers radiation sensors 107, which inform computer 120 of the beginning and end of the radiation cycle used to generate the image. For a fluoroscopic x-ray image acquired with imager 100 to be useable for navigational guidance, imager 100, when acquiring the image, should be stationary with respect to patient 110. If C-arm 103 or patient 110 is moving during image acquisition, the position of the fluoroscope will not be accurately determined relative to the patient's reference frame. Thus, it is important that the recorded position of imager 100 reflects the true position of the imager at the time of image acquisition. If imager 100 moves during the image acquisition process, or if imager 100 moves after image acquisition but before its position is recorded, the calibration will be erroneous, thereby resulting in incorrect graphical overlays. To prevent this type of erroneous image, computer 120 may examine the position information from tracking sensor 130 while radiation sensors 107 are signaling radiation detection. If the calibration and tracking target 106 moves relative to dynamic reference frame 150 during image acquisition, this image is marked as erroneous (steps 401 and 402).
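The motion check can be sketched as a drift test on positions sampled while the radiation sensors signal detection. The 1 mm tolerance below is a placeholder, not a value specified by the present teachings:

```python
import numpy as np

def image_is_valid(target_positions, drf_positions, tol_mm=1.0):
    """Return False (mark the image erroneous) if the calibration and
    tracking target moved relative to the dynamic reference frame while
    the radiation sensors signaled active imaging. Inputs are (N, 3)
    position samples taken over the radiation cycle; tol_mm is a
    hypothetical tolerance."""
    rel = np.asarray(target_positions, float) - np.asarray(drf_positions, float)
    drift = np.linalg.norm(rel - rel[0], axis=1)
    return bool(drift.max() <= tol_mm)
```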


At the end of the radiation cycle, computer 120 retrieves the acquired image from C-arm control computer 115 and retrieves the location information of target marker 106 and dynamic reference frame 150 from tracking sensor 130. Computer 120 calibrates the acquired image, as described above, to learn its projective transformation and optionally to correct distortion in the image (step 403), and then stores the image along with its positional information (step 404). The process of steps 400-404 is repeated for each image that is to be acquired (step 405).


Because the acquired images are stored with the positional information of the calibration and tracking target 106 and dynamic reference frame 150, the position of C-arm 103, x-ray source 104, and receiving section 105 for each image, relative to patient 110, can be computed based upon the projective transformation identified in the calibration process. During surgery, tracking sensor 130 and computer 120 detect the position of instrument 140 relative to dynamic reference frame 150, and hence relative to patient 110. With this information, computer 120 dynamically calculates, in real-time, the projection of instrument 140 into each fluoroscopic image as the instrument is moved by the physician. A graphical representation of instrument 140 may then be overlaid on the fluoroscopic images (step 406). The graphical representation of instrument 140 is an iconic representation of where the actual surgical instrument would appear within the acquired fluoroscopic x-ray image if imager 100 was continuously acquiring new images from the same view as the original image. There is no theoretical limit to the number of fluoroscopic images on which the graphical representations of instrument 140 may be simultaneously overlaid.
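The real-time projection of the instrument into each stored image amounts to applying that image's projective transformation to points on the instrument. A minimal sketch, where the 3x4 matrix `P` is a hypothetical stand-in for the transformation identified during calibration:

```python
import numpy as np

def project_point(P, point3d):
    """Project a 3-D point (e.g. the tracked instrument tip, expressed
    in the image's coordinate frame) into pixel coordinates using a
    3x4 projective transformation P."""
    ph = P @ np.append(np.asarray(point3d, float), 1.0)
    return ph[:2] / ph[2]
```

Projecting the tip and several points along the instrument shaft, at each tracking update, yields the overlay of step 406 for every stored image simultaneously.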


The graphical representation of the surgical instrument is a three-dimensional object superimposed onto a two-dimensional image of the patient. The three-dimensional representation of the instrument may appear to be directed into or out of the two-dimensional image as shown in FIGS. 5A and 5B. In FIG. 5A, the tip 502 of the instrument 504 and the projected length appear to be directed into the image. Conversely, in FIG. 5B, the tip 502 of the instrument 504 and the projected length appear to be coming out of the image.


When an image is acquired, it is acquired from a certain perspective or point-of-view. In the case of a C-arm imaging device 100, the perspective is determined by the orientation of the C-arm 103 around the patient 110. Specifically, the perspective is along the line connecting the image source 104 and the image receiver section 105. If the surgeon navigates the surgical instrument from the position of the image receiver section 105, the perspective of the virtual representation of the instrument will match the surgeon's perspective of the actual instrument. However, if the surgeon navigates from the position of the image source 104, the perspective of the virtual representation of the instrument will appear “flipped” from the surgeon's perspective of the actual instrument.


In accordance with the present teachings, the surgical instrument navigation system described above has been enhanced to allow a surgeon to invert the graphical representation of the instrument to match their perspective of the actual instrument. In various embodiments, the navigation system provides two possible perspectives: positive (+) or negative (−). The positive state renders the instrument from the perspective of the image receiver section 105; whereas the negative state renders the instrument from the perspective of the image source 104. It is envisioned that either state may be designated the default state. It is further envisioned that more than two perspectives may be available for selection by the surgeon.
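The two-state toggle can be sketched as a sign flip of the depth cue along the source-to-receiver axis. The encoding below ('+' for the receiver-side view, '-' for the source-side view) and the sign convention are illustrative assumptions:

```python
def apparent_direction(tip_depth_delta, perspective):
    """Depth cue under the two-state perspective setting.
    tip_depth_delta: signed depth change from the instrument's hind end
    to its tip along the source-to-receiver axis. perspective: '+'
    renders from the image receiver section, '-' from the image source.
    Only this into/out-of cue flips between states; the instrument's
    2-D profile and location are unchanged."""
    delta = tip_depth_delta if perspective == '+' else -tip_depth_delta
    return 'into' if delta > 0 else 'out of'
```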


Referring to FIG. 6, the perspective of the instrument is selectable using a touch screen operable button 601 provided on the graphical user interface of the navigation system. One skilled in the art will readily recognize that rendering a particular perspective of the instrument does not affect the profile of the instrument or the location of the instrument on the image. The perspective selection only affects the internal contours that give the instrument the appearance into or out of the image as shown in FIGS. 5A and 5B. Although a touch screen operable button is possible, it is envisioned that other techniques for selecting the perspective of the instrument, such as a foot pedal or other switching device in close proximity to the surgeon, are also within the scope of the present teachings.


A more detailed description of how perspective inversion is incorporated into the operation of the surgical instrument navigation system is provided in conjunction with FIG. 7. As noted above, the projection of the instrument into the fluoroscopic image is calculated in real-time as the instrument is moved by the surgeon.


To do so, the tracking sensor 130, in conjunction with the computer 120, detects the position of the instrument 140 at step 702 relative to the dynamic reference frame 150, and thus relative to the patient 110. The tracking sensor 130, in conjunction with the computer 120, also determines the position of the tracking target 106 at step 704 relative to the dynamic reference frame 150. Based on this position data, the computer 120 can determine the position of the instrument 140 relative to the tracking target 106 at step 706, and calibrate the position of the instrument relative to the image plane of the fluoroscopic images at step 708.
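The steps above can be sketched as one chain of transforms ending in pixel coordinates. Here `P_target`, a 3x4 projection from the tracking-target frame to the image plane, is a hypothetical stand-in for the calibration result of step 708:

```python
import numpy as np

def tip_pixel(T_drf_instr, T_drf_target, P_target, tip_local):
    """Chain steps 702-708: combine the instrument pose (step 702) and
    the tracking-target pose (step 704), both relative to the dynamic
    reference frame, into the instrument pose relative to the target
    (step 706); then project the instrument tip into the image plane
    (step 708). All frame names and P_target are illustrative."""
    T_target_instr = np.linalg.inv(T_drf_target) @ T_drf_instr
    tip_target = T_target_instr @ np.append(np.asarray(tip_local, float), 1.0)
    ph = P_target @ tip_target
    return ph[:2] / ph[2]
```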


Prior to rendering the image, the navigation system accounts for the various user settings 714, including instrument perspective. The selected perspective setting 714 is input into the computer 120 at step 710, which in turn provides corresponding input to the graphic rendering software. One skilled in the art will readily recognize that other user settings (e.g., zoom, rotate, etc.) may be accounted for by the navigation system.


Lastly, the fluoroscopic image is rendered by the navigation system at step 712. Specifically, the three-dimensional representation of the surgical instrument is rendered from the perspective input by an operator of the navigation system. The representation of the instrument is then superimposed over the previously calibrated image data for the patient. In this way, the perspective of the displayed instrument matches the surgeon's perspective of the actual instrument. As noted above, the representation of the surgical instrument is tracked in real-time as it is moved by the surgeon.


While the teachings have been described according to various embodiments, it will be understood that the teachings are capable of modification without departing from the spirit of the teachings as set forth in the appended claims.

Claims
  • 1. A method for orienting a displayed representation of a surgical instrument while using a surgical instrument navigation system, comprising: operating the surgical instrument navigation system to display a two-dimensional image of a patient that is based on an x-ray projection through the patient; operating the surgical instrument navigation system to track an instrument location of the surgical instrument relative to the patient; selecting a three-dimensional graphic rendering to represent the surgical instrument to be superimposed on the displayed two-dimensional image, wherein the three-dimensional graphic rendering is selected from one of a rendered first three-dimensional representation of the surgical instrument as the surgical instrument would visually appear from a first perspective at a position relative to the patient or a rendered second three-dimensional representation of the surgical instrument as the surgical instrument would visually appear from a second perspective that is different from the first perspective at the position relative to the patient, wherein the selection is based on an actual operator perspective of the surgical instrument relative to the patient, wherein a profile of the graphic rendering is the same between the rendered first three-dimensional representation of the surgical instrument and the rendered second three-dimensional representation of the surgical instrument; and viewing the selected three-dimensional rendering that represents the surgical instrument superimposed on the displayed two-dimensional image, wherein the selected three-dimensional rendering that represents the surgical instrument is one of the rendered first three-dimensional representation or the rendered second three-dimensional representation; wherein the first perspective represents a first orientation of two possible orientations of the surgical instrument relative to the patient and the second perspective represents a second orientation of the two possible orientations of the surgical instrument relative to the patient.
  • 2. The method of claim 1, further comprising: inputting a selection into the surgical instrument navigation system by an actual operator.
  • 3. The method of claim 1, wherein the first perspective is a perspective from an imager source.
  • 4. The method of claim 3, wherein the second perspective is from an imager receiver.
  • 5. The method of claim 1, further comprising: selecting by an actual operator user settings of the surgical instrument navigation system, including a zoom and a rotation.
  • 6. The method of claim 1, further comprising: detecting the position of the surgical instrument relative to a dynamic reference frame fixed to the patient to determine the position of the surgical instrument relative to the patient; determining a display position of the surgical instrument for display relative to the displayed image data representative of the patient based upon the detected position of the surgical instrument relative to the dynamic reference frame; wherein displaying the superimposed graphic on the displayed image includes displaying the superimposed graphic of the surgical instrument at the determined position of the surgical instrument relative to the patient, wherein the determined position relative to the patient is unchanged between the first perspective and the second perspective.
  • 7. The method of claim 1, further comprising: determining a position of an imaging device including the image source that acquired the image data of the patient relative to the dynamic reference frame; determining a position of the surgical instrument relative to the imaging device based upon the determined position of the surgical instrument relative to the dynamic reference frame; and calibrating the determined position of the surgical instrument relative to an image plane of the image data representative of the patient.
  • 8. The method of claim 1, wherein the selected graphic superimposed on the displayed image of the selected rendering of the first three-dimensional representation or the second three-dimensional representation only affects contours that give the respective rendered first three-dimensional representation or the rendered second three-dimensional representation of the surgical instrument an appearance into or out of the displayed image.
  • 9. A method for orienting a displayed representation of a surgical instrument while using a surgical instrument navigation system, comprising: operating the surgical instrument navigation system to display a two-dimensional image of a patient acquired from a first perspective of a x-ray image receiver; operating the surgical instrument navigation system to track an instrument location of the surgical instrument relative to the patient; selecting a three-dimensional graphical representation from one of a first rendered three-dimensional graphical representation of the surgical instrument at the instrument location at a first instrument perspective relative to the patient and the x-ray image receiver or a second rendered three-dimensional graphical representation of the surgical instrument at the instrument location at a second instrument perspective relative to a x-ray radiation source that is opposite the x-ray image receiver; and viewing on a display the selected three-dimensional graphical representation of the first rendered three-dimensional graphical representation of the first instrument perspective or the second rendered three-dimensional graphical representation of the second instrument perspective superimposed on the two-dimensional image at the tracked instrument location superimposed on the two-dimensional image of the patient; wherein the profile is the same and only contours are affected of the selected three-dimensional graphic representation superimposed on the displayed two-dimensional image between the first rendered three-dimensional graphical representation and second rendered three-dimensional graphical representation, wherein the contours give the respective first rendered three-dimensional graphical representation or the second rendered three-dimensional graphical representation of the surgical instrument a first appearance that is inverted relative to a second appearance relative to the displayed two-dimensional image.
  • 10. The method of claim 9, wherein selecting the three-dimensional graphical representation is to match a user's perspective of the surgical instrument relative to the patient in space.
  • 11. The method of claim 10, wherein selecting the three-dimensional graphical representation is to allow the user to invert the graphical representation of the surgical instrument.
  • 12. The method of claim 9, wherein the first instrument perspective is from the first perspective.
  • 13. The method of claim 9, further comprising: selecting the surgical instrument.
  • 14. The method of claim 13, wherein selecting the surgical instrument includes selecting a drill.
  • 15. The method of claim 9, further comprising: selecting at least one characteristic for displaying the three-dimensional graphical representation.
  • 16. The method of claim 15, wherein selecting the at least one characteristic includes selecting at least one of an orientation, a zoom amount, a rotation, or combinations thereof.
  • 17. The method of claim 9, further comprising: calibrating acquired image data to the patient, wherein the image data is used to generate the two-dimensional image.
  • 18. The method of claim 17, wherein calibrating the image data to the patient includes transforming the image data to correct for at least one interference.
  • 19. The method of claim 9, further comprising: placing a dynamic reference frame on the patient; and wherein operating the surgical instrument navigation system to track an instrument location of the surgical instrument relative to the patient includes tracking the surgical instrument relative to the placed dynamic reference frame.
  • 20. The method of claim 9, wherein selecting the three-dimensional graphical representation of the surgical instrument includes manually inputting into the surgical instrument navigation system the selected perspective.
  • 21. A method for orienting a displayed representation of a surgical instrument while using a surgical instrument navigation system, comprising: viewing, by a user, a displayed image including a two-dimensional image of a patient received from an x-ray imaging device; operating the surgical instrument navigation system to track a first instrument position of the surgical instrument relative to the patient; inputting, by the user, into the surgical instrument navigation system either a first selected perspective of the surgical instrument, wherein the first selected perspective is an actual perspective of the surgical instrument as the surgical instrument visually appears from a perspective of the user relative to the patient when viewing the surgical instrument at a location of the user at the tracked first instrument position, or a second selected perspective of the surgical instrument, wherein the second selected perspective is different than the first selected perspective; viewing on a display device a superimposed three-dimensional graphical rendering of the surgical instrument at the inputted first selected perspective or the second selected perspective of the surgical instrument on the viewed two-dimensional image of the patient; and maintaining a profile of the three-dimensional graphical rendering of the surgical instrument superimposed on the displayed image to be the same between (i) the inputted first selected perspective and (ii) the inputted second selected perspective; wherein only contours are affected between (i) the three-dimensional graphical rendering of the surgical instrument superimposed on the displayed image at the inputted first selected perspective and (ii) the three-dimensional graphical rendering of the surgical instrument superimposed on the displayed image at the inputted second selected perspective, to give the respective (i) three-dimensional graphical rendering at the inputted first selected perspective or (ii) three-dimensional graphical rendering at the inputted second selected perspective an appearance of extending into or out of the displayed image.
  • 22. The method of claim 21, further comprising: operating the surgical navigation system to operate graphic rendering software to render the selected perspective.
  • 23. The method of claim 22, wherein inputting, by the user, the first selected perspective or the second selected perspective of the surgical instrument into the surgical instrument navigation system includes operating the surgical navigation system to operate the graphic rendering software to render at least a first three-dimensional representation of the surgical instrument as the surgical instrument would appear going into the viewed two-dimensional image and to render a second three-dimensional representation of the surgical instrument as the surgical instrument would appear coming out of the viewed two-dimensional image.
  • 24. The method of claim 22, further comprising: operating the surgical instrument navigation system to track a first position of the imaging device and to track a second instrument position of the surgical instrument.
  • 25. The method of claim 24, further comprising: placing a dynamic reference frame on the patient; and wherein tracking the second instrument position of the surgical instrument includes detecting the second instrument position of the surgical instrument relative to the dynamic reference frame; wherein viewing on the display device the superimposed three-dimensional graphical rendering of the surgical instrument includes displaying the three-dimensional rendering of the surgical instrument at the detected second instrument position of the surgical instrument relative to the patient displayed on the display device with the image data.
  • 26. The method of claim 25, further comprising: operating the surgical instrument navigation system to calibrate the position of the surgical instrument relative to an image plane of the displayed two-dimensional image based on image data acquired with the imaging device and the tracked first position of the imaging device and the tracked second instrument position of the surgical instrument.
  • 27. A method for orienting a displayed representation of a surgical instrument while using a surgical instrument navigation system, comprising: viewing, by a user, a displayed image including a two-dimensional image of a patient received from an x-ray imaging device; operating the surgical instrument navigation system to track a first instrument position of the surgical instrument relative to the patient; inputting, by the user, a selected perspective of the surgical instrument into the surgical instrument navigation system, wherein the selected perspective is an actual perspective of the surgical instrument as the surgical instrument visually appears from a perspective of the user relative to the patient when viewing the surgical instrument at a location of the user at the tracked first instrument position; viewing on a display device a superimposed three-dimensional graphical rendering of the surgical instrument at the inputted selected perspective of the surgical instrument on the viewed two-dimensional image of the patient; and maintaining a profile of the three-dimensional graphical rendering to be the same between a first selected perspective rendered three-dimensional representation of the surgical instrument and a second selected perspective rendered three-dimensional representation of the surgical instrument; wherein, between the first selected perspective rendered three-dimensional representation of the surgical instrument and the second selected perspective rendered three-dimensional representation of the surgical instrument, the three-dimensional graphical rendering superimposed on the displayed image only affects the contours, which give the respective first selected perspective rendered three-dimensional representation or second selected perspective rendered three-dimensional representation an appearance of extending into or out of the displayed image.
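The claims above hinge on one geometric fact: mirroring the instrument model across the image plane (negating depth) leaves the projected two-dimensional profile unchanged while flipping the depth-dependent cues that make the instrument appear to go into or out of the image. The following is a minimal illustrative sketch of that idea, not an implementation from the patent; all names and the simple orthographic projection are assumptions chosen for clarity.

```python
# Illustrative sketch of perspective inversion. The instrument is a list of
# 3D points in an image-aligned frame: x/y lie in the image plane, z runs
# along the line from the x-ray source to the image receiver.

def project(points):
    """Orthographic projection onto the image plane: the 2D profile."""
    return [(x, y) for (x, y, z) in points]

def invert_perspective(points):
    """Mirror the model across the image plane by negating depth.

    The projected profile is unchanged; only depth-dependent cues
    (shading/contours) flip, so the rendering appears to point out of
    the image rather than into it.
    """
    return [(x, y, -z) for (x, y, z) in points]

def depth_cue(points):
    """Toy 'contour' cue: does the tip sit deeper than the hub?"""
    tip, hub = points[0], points[-1]
    return "into image" if tip[2] > hub[2] else "out of image"

# A toy instrument: tip at depth +5.0, hub at the image plane.
instrument = [(0.0, 0.0, 5.0), (0.0, 1.0, 2.5), (0.0, 2.0, 0.0)]
flipped = invert_perspective(instrument)

# Same profile on screen, opposite apparent direction.
print(project(instrument) == project(flipped))  # True
print(depth_cue(instrument))                    # into image
print(depth_cue(flipped))                       # out of image
```

This is why the claims can require that the profile be "maintained" between the two perspectives: the inversion is a reflection through the image plane, which is invisible to the projection itself and shows up only in the rendered contours.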
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a divisional of U.S. patent application Ser. No. 11/188,972, filed on Jul. 25, 2005, now U.S. Pat. No. 7,630,753, which is a continuation of U.S. patent application Ser. No. 10/087,288 filed on Feb. 28, 2002, now U.S. Pat. No. 6,947,786. The disclosures of the above applications are incorporated herein by reference.

US Referenced Citations (563)
Number Name Date Kind
1576781 Phillips Mar 1926 A
1735726 Bornhardt Nov 1929 A
2407845 Nemeyer Sep 1946 A
2650588 Drew Sep 1953 A
2697433 Sehnder Dec 1954 A
3016899 Stenvall Jan 1962 A
3017887 Heyer Jan 1962 A
3061936 Dobbeleer Nov 1962 A
3073310 Mocarski Jan 1963 A
3109588 Polhemus et al. Nov 1963 A
3294083 Alderson Dec 1966 A
3367326 Frazier Feb 1968 A
3439256 Kahne Apr 1969 A
3577160 White May 1971 A
3614950 Rabey Oct 1971 A
3644825 Davis, Jr. et al. Feb 1972 A
3674014 Tillander Jul 1972 A
3702935 Carey et al. Nov 1972 A
3704707 Halloran Dec 1972 A
3821469 Whetstone et al. Jun 1974 A
3868565 Kuipers Feb 1975 A
3941127 Froning Mar 1976 A
3983474 Kuipers Sep 1976 A
4017858 Kuipers Apr 1977 A
4037592 Kronner Jul 1977 A
4052620 Brunnett Oct 1977 A
4054881 Raab Oct 1977 A
4117337 Staats Sep 1978 A
4173228 Van Steenwyk et al. Nov 1979 A
4182312 Mushabac Jan 1980 A
4202349 Jones May 1980 A
4228799 Anichkov et al. Oct 1980 A
4256112 Kopf et al. Mar 1981 A
4262306 Renner Apr 1981 A
4287809 Egli et al. Sep 1981 A
4298874 Kuipers Nov 1981 A
4314251 Raab Feb 1982 A
4317078 Weed et al. Feb 1982 A
4319136 Jinkins Mar 1982 A
4328548 Crow et al. May 1982 A
4328813 Ray May 1982 A
4339953 Iwasaki Jul 1982 A
4341220 Perry Jul 1982 A
4346384 Raab Aug 1982 A
4358856 Stivender et al. Nov 1982 A
4368536 Pfeiler Jan 1983 A
4396885 Constant Aug 1983 A
4396945 DiMatteo et al. Aug 1983 A
4418422 Richter et al. Nov 1983 A
4419012 Stephenson et al. Dec 1983 A
4422041 Lienau Dec 1983 A
4431005 McCormick Feb 1984 A
4485815 Amplatz et al. Dec 1984 A
4506676 Duska Mar 1985 A
4543959 Sepponen Oct 1985 A
4548208 Niemi Oct 1985 A
4571834 Fraser et al. Feb 1986 A
4572198 Codrington Feb 1986 A
4582995 Lim et al. Apr 1986 A
4583538 Onik et al. Apr 1986 A
4584577 Temple Apr 1986 A
4608977 Brown Sep 1986 A
4613866 Blood Sep 1986 A
4617925 Laitinen Oct 1986 A
4618978 Cosman Oct 1986 A
4621257 Brown Nov 1986 A
4621628 Brudermann Nov 1986 A
4625718 Olerud et al. Dec 1986 A
4638798 Shelden et al. Jan 1987 A
4642786 Hansen Feb 1987 A
4645343 Stockdale et al. Feb 1987 A
4649504 Krouglicof et al. Mar 1987 A
4651732 Frederick Mar 1987 A
4653509 Oloff et al. Mar 1987 A
4659971 Suzuki et al. Apr 1987 A
4660970 Ferrano Apr 1987 A
4673352 Hansen Jun 1987 A
4688037 Krieg Aug 1987 A
4701049 Beckman et al. Oct 1987 A
4705395 Hageniers Nov 1987 A
4705401 Addleman et al. Nov 1987 A
4706665 Gouda Nov 1987 A
4709156 Murphy et al. Nov 1987 A
4710708 Rorden et al. Dec 1987 A
4719419 Dawley Jan 1988 A
4722056 Roberts et al. Jan 1988 A
4722336 Kim et al. Feb 1988 A
4723544 Moore et al. Feb 1988 A
4727565 Ericson Feb 1988 A
RE32619 Damadian Mar 1988 E
4733969 Case et al. Mar 1988 A
4737032 Addleman et al. Apr 1988 A
4737794 Jones Apr 1988 A
4737921 Goldwasser et al. Apr 1988 A
4742356 Kuipers May 1988 A
4742815 Ninan et al. May 1988 A
4743770 Lee May 1988 A
4743771 Sacks et al. May 1988 A
4745290 Frankel et al. May 1988 A
4750487 Zanetti Jun 1988 A
4753528 Hines et al. Jun 1988 A
4761072 Pryor Aug 1988 A
4764016 Johansson Aug 1988 A
4771787 Wurster et al. Sep 1988 A
4779212 Levy Oct 1988 A
4782239 Hirose et al. Nov 1988 A
4788481 Niwa Nov 1988 A
4791934 Brunnett Dec 1988 A
4793355 Crum et al. Dec 1988 A
4794262 Sato et al. Dec 1988 A
4797907 Anderton Jan 1989 A
4803976 Frigg et al. Feb 1989 A
4804261 Kirschen Feb 1989 A
4805615 Carol Feb 1989 A
4809694 Ferrara Mar 1989 A
4821200 Oberg Apr 1989 A
4821206 Arora Apr 1989 A
4821731 Martinelli et al. Apr 1989 A
4822163 Schmidt Apr 1989 A
4825091 Breyer et al. Apr 1989 A
4829373 Leberl et al. May 1989 A
4836778 Baumrind et al. Jun 1989 A
4838265 Cosman et al. Jun 1989 A
4841967 Chang et al. Jun 1989 A
4845771 Wislocki et al. Jul 1989 A
4849692 Blood Jul 1989 A
4860331 Williams et al. Aug 1989 A
4862893 Martinelli Sep 1989 A
4869247 Howard, III et al. Sep 1989 A
4875165 Fencil et al. Oct 1989 A
4875478 Chen Oct 1989 A
4884566 Mountz et al. Dec 1989 A
4889526 Rauscher et al. Dec 1989 A
4896673 Rose et al. Jan 1990 A
4905698 Strohl, Jr. et al. Mar 1990 A
4923459 Nambu May 1990 A
4931056 Ghajar et al. Jun 1990 A
4945305 Blood Jul 1990 A
4945914 Allen Aug 1990 A
4951653 Fry et al. Aug 1990 A
4955891 Carol Sep 1990 A
4961422 Marchosky et al. Oct 1990 A
4977655 Martinelli Dec 1990 A
4989608 Ratner Feb 1991 A
4991579 Allen Feb 1991 A
5002058 Martinelli Mar 1991 A
5005592 Cartmell Apr 1991 A
5013317 Cole et al. May 1991 A
5016639 Allen May 1991 A
5017139 Mushabac May 1991 A
5027818 Bova et al. Jul 1991 A
5030196 Inoue Jul 1991 A
5030222 Calandruccio et al. Jul 1991 A
5031203 Trecha Jul 1991 A
5042486 Pfeiler et al. Aug 1991 A
5047036 Koutrouvelis Sep 1991 A
5050608 Watanabe et al. Sep 1991 A
5054492 Scribner et al. Oct 1991 A
5057095 Fabian Oct 1991 A
5059789 Salcudean Oct 1991 A
5078140 Kwoh Jan 1992 A
5079699 Tuy et al. Jan 1992 A
5086401 Glassman et al. Feb 1992 A
5094241 Allen Mar 1992 A
5097839 Allen Mar 1992 A
5098426 Sklar et al. Mar 1992 A
5099845 Besz et al. Mar 1992 A
5099846 Hardy Mar 1992 A
5105829 Fabian et al. Apr 1992 A
5107839 Houdek et al. Apr 1992 A
5107843 Aarnio et al. Apr 1992 A
5107862 Fabian et al. Apr 1992 A
5109194 Cantaloube Apr 1992 A
5119817 Allen Jun 1992 A
5142930 Allen et al. Sep 1992 A
5143076 Hardy et al. Sep 1992 A
5152288 Hoenig et al. Oct 1992 A
5160337 Cosman Nov 1992 A
5161536 Vilkomerson et al. Nov 1992 A
5178164 Allen Jan 1993 A
5178621 Cook et al. Jan 1993 A
5186174 Schlondorff et al. Feb 1993 A
5187475 Wagener et al. Feb 1993 A
5188126 Fabian et al. Feb 1993 A
5190059 Fabian et al. Mar 1993 A
5193106 DeSena Mar 1993 A
5197476 Nowacki et al. Mar 1993 A
5197965 Cherry et al. Mar 1993 A
5198768 Keren Mar 1993 A
5198877 Schulz Mar 1993 A
5202670 Oha Apr 1993 A
5207688 Carol May 1993 A
5211164 Allen May 1993 A
5211165 Dumoulin et al. May 1993 A
5211176 Ishiguro et al. May 1993 A
5212720 Landi et al. May 1993 A
5214615 Bauer May 1993 A
5219351 Teubner et al. Jun 1993 A
5222499 Allen et al. Jun 1993 A
5224049 Mushabac Jun 1993 A
5228442 Imran Jul 1993 A
5230338 Allen et al. Jul 1993 A
5230623 Guthrie et al. Jul 1993 A
5233990 Barnea Aug 1993 A
5237996 Waldman et al. Aug 1993 A
5249581 Horbal et al. Oct 1993 A
5251127 Raab Oct 1993 A
5251635 Dumoulin et al. Oct 1993 A
5253647 Takahashi et al. Oct 1993 A
5255680 Darrow et al. Oct 1993 A
5257636 White Nov 1993 A
5257998 Ota et al. Nov 1993 A
5261404 Mick et al. Nov 1993 A
5265610 Darrow et al. Nov 1993 A
5265611 Hoenig et al. Nov 1993 A
5269759 Hernandez et al. Dec 1993 A
5271400 Dumoulin et al. Dec 1993 A
5273025 Sakiyama et al. Dec 1993 A
5274551 Corby, Jr. Dec 1993 A
5279309 Taylor et al. Jan 1994 A
5285787 Machida Feb 1994 A
5291199 Overman et al. Mar 1994 A
5291889 Kenet et al. Mar 1994 A
5295483 Nowacki et al. Mar 1994 A
5297549 Beatty et al. Mar 1994 A
5299253 Wessels Mar 1994 A
5299254 Dancer et al. Mar 1994 A
5299288 Glassman et al. Mar 1994 A
5300080 Clayman et al. Apr 1994 A
5305091 Gelbart et al. Apr 1994 A
5305203 Raab Apr 1994 A
5306271 Zinreich et al. Apr 1994 A
5307072 Jones, Jr. Apr 1994 A
5309913 Kormos et al. May 1994 A
5315630 Sturm et al. May 1994 A
5316024 Hirschi et al. May 1994 A
5318025 Dumoulin et al. Jun 1994 A
5320111 Livingston Jun 1994 A
5325728 Zimmerman et al. Jul 1994 A
5325873 Hirschi et al. Jul 1994 A
5329944 Fabian et al. Jul 1994 A
5330485 Clayman et al. Jul 1994 A
5333168 Fernandes et al. Jul 1994 A
5353795 Souza et al. Oct 1994 A
5353800 Pohndorf et al. Oct 1994 A
5353807 DeMarco Oct 1994 A
5359417 Muller et al. Oct 1994 A
5368030 Zinreich et al. Nov 1994 A
5371778 Yanof et al. Dec 1994 A
5375596 Twiss et al. Dec 1994 A
5377678 Dumoulin et al. Jan 1995 A
5383454 Bucholz Jan 1995 A
5384454 Iijima Jan 1995 A
5385146 Goldreyer Jan 1995 A
5385148 Lesh et al. Jan 1995 A
5386828 Owens et al. Feb 1995 A
5389101 Heilbrun et al. Feb 1995 A
5391199 Ben-Haim Feb 1995 A
5394457 Leibinger et al. Feb 1995 A
5394875 Lewis et al. Mar 1995 A
5397329 Allen Mar 1995 A
5398684 Hardy Mar 1995 A
5399146 Nowacki et al. Mar 1995 A
5400384 Fernandes et al. Mar 1995 A
5402801 Taylor Apr 1995 A
5408409 Glassman et al. Apr 1995 A
5413573 Koivukangas May 1995 A
5417210 Funda et al. May 1995 A
5419325 Dumoulin et al. May 1995 A
5423334 Jordan Jun 1995 A
5425367 Shapiro et al. Jun 1995 A
5425382 Golden et al. Jun 1995 A
5426683 O'Farrell, Jr. et al. Jun 1995 A
5426687 Goodall et al. Jun 1995 A
5427097 Depp Jun 1995 A
5429132 Guy et al. Jul 1995 A
5433198 Desai Jul 1995 A
RE35025 Anderton Aug 1995 E
5437277 Dumoulin et al. Aug 1995 A
5443066 Dumoulin et al. Aug 1995 A
5443489 Ben-Haim Aug 1995 A
5444756 Pai et al. Aug 1995 A
5445144 Wodicka et al. Aug 1995 A
5445150 Dumoulin et al. Aug 1995 A
5445166 Taylor Aug 1995 A
5446548 Gerig et al. Aug 1995 A
5446799 Tuy Aug 1995 A
5447154 Cinquin et al. Sep 1995 A
5448610 Yamamoto et al. Sep 1995 A
5453686 Anderson Sep 1995 A
5456718 Szymaitis Oct 1995 A
5457641 Zimmer et al. Oct 1995 A
5458718 Venkitachalam Oct 1995 A
5464446 Dreessen et al. Nov 1995 A
5469847 Zinreich et al. Nov 1995 A
5478341 Cook et al. Dec 1995 A
5478343 Ritter Dec 1995 A
5480422 Ben-Haim Jan 1996 A
5480439 Bisek et al. Jan 1996 A
5483961 Kelly et al. Jan 1996 A
5484437 Michelson Jan 1996 A
5485849 Panescu et al. Jan 1996 A
5487391 Panescu Jan 1996 A
5487729 Avellanet et al. Jan 1996 A
5487757 Truckai et al. Jan 1996 A
5490196 Rudich et al. Feb 1996 A
5494034 Schlondorff et al. Feb 1996 A
5503416 Aoki et al. Apr 1996 A
5513637 Twiss et al. May 1996 A
5514146 Lam et al. May 1996 A
5515160 Schulz et al. May 1996 A
5517990 Kalfas et al. May 1996 A
5531227 Schneider Jul 1996 A
5531520 Grimson et al. Jul 1996 A
5542938 Avellanet et al. Aug 1996 A
5543951 Moehrmann Aug 1996 A
5546940 Panescu et al. Aug 1996 A
5546949 Frazin et al. Aug 1996 A
5546951 Ben-Haim Aug 1996 A
5551429 Fitzpatrick et al. Sep 1996 A
5558091 Acker et al. Sep 1996 A
5566681 Manwaring et al. Oct 1996 A
5568384 Robb et al. Oct 1996 A
5568809 Ben-haim Oct 1996 A
5571109 Bertagnoli et al. Nov 1996 A
5572999 Funda et al. Nov 1996 A
5573533 Strul Nov 1996 A
5575794 Walus et al. Nov 1996 A
5575798 Koutrouvelis Nov 1996 A
5583909 Hanover Dec 1996 A
5588430 Bova et al. Dec 1996 A
5590215 Allen Dec 1996 A
5592939 Martinelli Jan 1997 A
5595193 Walus et al. Jan 1997 A
5596228 Anderton et al. Jan 1997 A
5600330 Blood Feb 1997 A
5603318 Heilbrun et al. Feb 1997 A
5611025 Lorensen et al. Mar 1997 A
5617462 Spratt Apr 1997 A
5617857 Chader et al. Apr 1997 A
5619261 Anderton Apr 1997 A
5622169 Golden et al. Apr 1997 A
5622170 Schulz Apr 1997 A
5627873 Hanover et al. May 1997 A
5628315 Vilsmeier et al. May 1997 A
5630431 Taylor May 1997 A
5636644 Hart et al. Jun 1997 A
5638819 Manwaring et al. Jun 1997 A
5640170 Anderson Jun 1997 A
5642395 Anderton et al. Jun 1997 A
5643268 Vilsmeier et al. Jul 1997 A
5645065 Shapiro et al. Jul 1997 A
5646524 Gilboa Jul 1997 A
5647361 Damadian Jul 1997 A
5662111 Cosman Sep 1997 A
5664001 Tachibana et al. Sep 1997 A
5674296 Bryan et al. Oct 1997 A
5676673 Ferre et al. Oct 1997 A
5681260 Ueda et al. Oct 1997 A
5682886 Delp et al. Nov 1997 A
5682890 Kormos et al. Nov 1997 A
5690108 Chakeres Nov 1997 A
5694945 Ben-Haim Dec 1997 A
5695500 Taylor et al. Dec 1997 A
5695501 Carol et al. Dec 1997 A
5696500 Diem Dec 1997 A
5697377 Wittkampf Dec 1997 A
5702406 Vilsmeier et al. Dec 1997 A
5711299 Manwaring et al. Jan 1998 A
5713946 Ben-Haim Feb 1998 A
5715822 Watkins et al. Feb 1998 A
5715836 Kliegis et al. Feb 1998 A
5718241 Ben-Haim et al. Feb 1998 A
5727552 Ryan Mar 1998 A
5727553 Saad Mar 1998 A
5729129 Acker Mar 1998 A
5730129 Darrow et al. Mar 1998 A
5730130 Fitzpatrick et al. Mar 1998 A
5732703 Kalfas et al. Mar 1998 A
5735278 Hoult et al. Apr 1998 A
5738096 Ben-Haim Apr 1998 A
5740802 Nafis et al. Apr 1998 A
5741214 Ouchi et al. Apr 1998 A
5742394 Hansen Apr 1998 A
5744953 Hansen Apr 1998 A
5748767 Raab May 1998 A
5749362 Funda et al. May 1998 A
5749835 Glantz May 1998 A
5752513 Acker et al. May 1998 A
5755725 Druais May 1998 A
RE35816 Schulz Jun 1998 E
5758667 Slettenmark Jun 1998 A
5762064 Polvani Jun 1998 A
5767669 Hansen et al. Jun 1998 A
5767699 Bosnyak et al. Jun 1998 A
5767960 Orman Jun 1998 A
5769789 Wang et al. Jun 1998 A
5769843 Abela et al. Jun 1998 A
5769861 Vilsmeier Jun 1998 A
5772594 Barrick Jun 1998 A
5772661 Michelson Jun 1998 A
5775322 Silverstein et al. Jul 1998 A
5776064 Kalfas et al. Jul 1998 A
5782765 Jonkman Jul 1998 A
5787886 Kelly et al. Aug 1998 A
5792055 McKinnon Aug 1998 A
5795294 Luber et al. Aug 1998 A
5797849 Vesely et al. Aug 1998 A
5799055 Peshkin et al. Aug 1998 A
5799099 Wang et al. Aug 1998 A
5800352 Ferre et al. Sep 1998 A
5800535 Howard, III Sep 1998 A
5802719 O'Farrell, Jr. et al. Sep 1998 A
5803089 Ferre et al. Sep 1998 A
5807252 Hassfeld et al. Sep 1998 A
5810008 Dekel et al. Sep 1998 A
5810728 Kuhn Sep 1998 A
5810735 Halperin et al. Sep 1998 A
5820553 Hughes Oct 1998 A
5823192 Kalend et al. Oct 1998 A
5823958 Truppe Oct 1998 A
5828725 Levinson Oct 1998 A
5828770 Leis et al. Oct 1998 A
5829444 Ferre et al. Nov 1998 A
5831260 Hansen Nov 1998 A
5833608 Acker Nov 1998 A
5834759 Glossop Nov 1998 A
5836954 Heilbrun et al. Nov 1998 A
5840024 Taniguchi et al. Nov 1998 A
5840025 Ben-Haim Nov 1998 A
5843076 Webster, Jr. et al. Dec 1998 A
5848967 Cosman Dec 1998 A
5851183 Bucholz Dec 1998 A
5865846 Bryan et al. Feb 1999 A
5868674 Glowinski et al. Feb 1999 A
5868675 Henrion et al. Feb 1999 A
5871445 Bucholz Feb 1999 A
5871455 Ueno Feb 1999 A
5871487 Warner et al. Feb 1999 A
5873822 Ferre et al. Feb 1999 A
5882304 Ehnholm et al. Mar 1999 A
5884410 Prinz Mar 1999 A
5889834 Vilsmeier et al. Mar 1999 A
5891034 Bucholz Apr 1999 A
5891157 Day et al. Apr 1999 A
5904691 Barnett et al. May 1999 A
5907395 Schulz et al. May 1999 A
5913820 Bladen et al. Jun 1999 A
5920395 Schulz Jul 1999 A
5921992 Costales et al. Jul 1999 A
5923727 Navab Jul 1999 A
5928248 Acker Jul 1999 A
5938603 Ponzi Aug 1999 A
5938694 Jaraczewski et al. Aug 1999 A
5947980 Jensen et al. Sep 1999 A
5947981 Cosman Sep 1999 A
5950629 Taylor et al. Sep 1999 A
5951475 Gueziec et al. Sep 1999 A
5951571 Audette Sep 1999 A
5954647 Bova et al. Sep 1999 A
5954796 McCarty et al. Sep 1999 A
5957844 Dekel et al. Sep 1999 A
5964796 Imran Oct 1999 A
5967980 Ferre et al. Oct 1999 A
5967982 Barnett Oct 1999 A
5968047 Reed Oct 1999 A
5971997 Guthrie et al. Oct 1999 A
5976156 Taylor et al. Nov 1999 A
5980535 Barnett et al. Nov 1999 A
5983126 Wittkampf Nov 1999 A
5986670 Dries et al. Nov 1999 A
5987349 Schulz Nov 1999 A
5987960 Messner et al. Nov 1999 A
5999837 Messner et al. Dec 1999 A
5999840 Grimson et al. Dec 1999 A
6001130 Bryan et al. Dec 1999 A
6006126 Cosman Dec 1999 A
6006127 Van Der Brug et al. Dec 1999 A
6013087 Adams et al. Jan 2000 A
6014580 Blume et al. Jan 2000 A
6016439 Acker Jan 2000 A
6019725 Vesely et al. Feb 2000 A
6024695 Taylor et al. Feb 2000 A
6050724 Schmitz et al. Apr 2000 A
6059718 Taniguchi et al. May 2000 A
6063022 Ben-Haim May 2000 A
6071288 Carol et al. Jun 2000 A
6073043 Schneider Jun 2000 A
6076008 Bucholz Jun 2000 A
6096050 Audette Aug 2000 A
6104944 Martinelli Aug 2000 A
6118845 Simon et al. Sep 2000 A
6122538 Sliwa, Jr. et al. Sep 2000 A
6122541 Cosman et al. Sep 2000 A
6131396 Duerr et al. Oct 2000 A
6139183 Graumann Oct 2000 A
6147480 Osadchy et al. Nov 2000 A
6149592 Yanof et al. Nov 2000 A
6156067 Bryan et al. Dec 2000 A
6161032 Acker Dec 2000 A
6165181 Heilbrun et al. Dec 2000 A
6167296 Shahidi Dec 2000 A
6172499 Ashe Jan 2001 B1
6175756 Ferre et al. Jan 2001 B1
6178345 Vilsmeier et al. Jan 2001 B1
6194639 Botella et al. Feb 2001 B1
6201387 Govari Mar 2001 B1
6203497 Dekel et al. Mar 2001 B1
6211666 Acker Apr 2001 B1
6223067 Vilsmeier et al. Apr 2001 B1
6233476 Strommer et al. May 2001 B1
6246231 Ashe Jun 2001 B1
6259942 Westermann et al. Jul 2001 B1
6273896 Franck et al. Aug 2001 B1
6285902 Kienzle, III et al. Sep 2001 B1
6298262 Franck et al. Oct 2001 B1
6314310 Ben-Haim et al. Nov 2001 B1
6332089 Acker et al. Dec 2001 B1
6341231 Ferre et al. Jan 2002 B1
6346072 Cooper Feb 2002 B1
6348058 Melkent et al. Feb 2002 B1
6351659 Vilsmeier Feb 2002 B1
6381485 Hunter et al. Apr 2002 B1
6424856 Vilsmeier et al. Jul 2002 B1
6427314 Acker Aug 2002 B1
6428547 Vilsmeier et al. Aug 2002 B1
6434415 Foley et al. Aug 2002 B1
6437567 Schenck et al. Aug 2002 B1
6445943 Ferre et al. Sep 2002 B1
6450978 Brosseau et al. Sep 2002 B1
6470207 Simon et al. Oct 2002 B1
6474341 Hunter et al. Nov 2002 B1
6477226 Lehmann et al. Nov 2002 B1
6477228 Spahn Nov 2002 B2
6478802 Kienzle, III et al. Nov 2002 B2
6484049 Seeley et al. Nov 2002 B1
6490475 Seeley et al. Dec 2002 B1
6493573 Martinelli et al. Dec 2002 B1
6493575 Kesten et al. Dec 2002 B1
6498944 Ben-Haim et al. Dec 2002 B1
6499488 Hunter et al. Dec 2002 B1
6516046 Frohlich et al. Feb 2003 B1
6527443 Vilsmeier et al. Mar 2003 B1
6529758 Shahidi Mar 2003 B2
6551325 Neubauer et al. Apr 2003 B2
6567690 Giller et al. May 2003 B2
6584174 Schubert et al. Jun 2003 B2
6609022 Vilsmeier et al. Aug 2003 B2
6611700 Vilsmeier et al. Aug 2003 B1
6640128 Vilsmeier et al. Oct 2003 B2
6694162 Hartlep Feb 2004 B2
6695786 Wang et al. Feb 2004 B2
6701179 Martinelli et al. Mar 2004 B1
6947786 Simon et al. Sep 2005 B2
7006085 Acosta et al. Feb 2006 B1
7302288 Schellenberg Nov 2007 B1
7630753 Simon et al. Dec 2009 B2
20010007918 Vilsmeier et al. Jul 2001 A1
20020085681 Jensen Jul 2002 A1
20020095081 Vilsmeier et al. Jul 2002 A1
20030098881 Nolte et al. May 2003 A1
20040024309 Ferre et al. Feb 2004 A1
20050273004 Simon et al. Dec 2005 A1
Foreign Referenced Citations (70)
Number Date Country
964149 Mar 1975 CA
3042343 Jun 1982 DE
3508730 Sep 1986 DE
3717871 Dec 1988 DE
3831278 Mar 1989 DE
3838011 Jul 1989 DE
4213426 Oct 1992 DE
4225112 Dec 1993 DE
4233978 Apr 1994 DE
19715202 Oct 1998 DE
19751761 Oct 1998 DE
19832296 Feb 1999 DE
19747427 May 1999 DE
10085137 Nov 2002 DE
0062941 Oct 1982 EP
0119660 Sep 1984 EP
0155857 Sep 1985 EP
0319844 Jun 1989 EP
0326768 Aug 1989 EP
350996 Jan 1990 EP
0419729 Apr 1991 EP
0427358 May 1991 EP
0456103 Nov 1991 EP
0469966 Feb 1992 EP
0581704 Feb 1994 EP
0651968 May 1995 EP
0655138 May 1995 EP
0894473 Feb 1999 EP
0908146 Apr 1999 EP
0930046 Jul 1999 EP
2417970 Sep 1979 FR
2618211 Jan 1989 FR
2094590 Sep 1982 GB
2164856 Apr 1986 GB
62327 Jan 1983 JP
2765738 Jun 1988 JP
63240851 Oct 1988 JP
3267054 Nov 1991 JP
6194639 Jul 1994 JP
WO-8809151 Dec 1988 WO
WO-8905123 Jun 1989 WO
WO-9005494 May 1990 WO
WO-9103982 Apr 1991 WO
WO-9104711 Apr 1991 WO
WO-9107726 May 1991 WO
WO-9203090 Mar 1992 WO
WO-9206645 Apr 1992 WO
WO-9404938 Mar 1994 WO
WO-9423647 Oct 1994 WO
WO-9424933 Nov 1994 WO
WO-9507055 Mar 1995 WO
WO-9611624 Apr 1996 WO
WO-9632059 Oct 1996 WO
WO-9736192 Oct 1997 WO
WO-9749453 Dec 1997 WO
WO-9808554 Mar 1998 WO
WO-9838908 Sep 1998 WO
WO-9915097 Apr 1999 WO
WO-9921498 May 1999 WO
WO-9923956 May 1999 WO
WO-9926549 Jun 1999 WO
WO-9927839 Jun 1999 WO
WO-9929253 Jun 1999 WO
WO-9933406 Jul 1999 WO
WO-9937208 Jul 1999 WO
WO-9938449 Aug 1999 WO
WO-9952094 Oct 1999 WO
WO-9960939 Dec 1999 WO
WO-0056215 Sep 2000 WO
WO-0130437 May 2001 WO
Non-Patent Literature Citations (127)
Entry
Adams et al., Computer-Assisted Surgery, IEEE Computer Graphics & Applications, pp. 43-51, (May 1990).
Adams, L., Knepper, A., Kyrbus, W., Meyer-Ebrecht, D., Pfeifer, G., Ruger, R., Witte, M., Aide au Reperage Tridimensionnel pour la Chirurgie de la Base du Crane, Innov. Tech. Biol. Med., vol. 13, No. 4, 1992, pp. 409-424.
Ali Hamadeh et al., “Automated 3-Dimensional Computer Tomographic and Fluoroscopic Image Registration,” Computer Aided Surgery (1998), 3:11-19.
Ali Hamadeh et al., “Towards Automatic Registration Between CT and X-ray Images: Cooperation Between 3D/2D Registration and 2D Edge Detection,” MRCAS '95, pp. 39-46.
Andre P. Gueziec et al., “Registration of Computer Tomography Data to a Surgical Robot Using Fluoroscopy: A Feasibility Study,” Computer Science/Mathematics, Sep. 27, 1996, 6 pages.
Barrick, Frederick E., et al., “Prophylactic Intramedullary Fixation of the Tibia for Stress Fracture in a Professional Athlete,” Journal of Orthopaedic Trauma, vol. 6, No. 2, pp. 241-244 (1992).
Barrick, Frederick E., et al., “Technical Difficulties with the Brooker-Wills Nail in Acute Fractures of the Femur,” Journal of Orthopaedic Trauma, vol. 6, No. 2, pp. 144-150 (1990).
Batnitzky, S., Price, H.I., Lee, K.R., Cook, P.N., Cook, L.T., Fritz, S.L., Dwyer, S.J., Watts, C., Three-Dimensional Computer Reconstructions of Brain Lesions from Surface Contours Provided by Computed Tomography: A Prospectus, Neurosurgery, vol. 11, No. 1, Part 1, 1982, pp. 73-84.
Benzel et al., “Magnetic Source Imaging: a Review of the Magnes System of Biomagnetic Technologies Incorporated,” Neurosurgery, vol. 33, No. 2 (Aug. 1993), pp. 252-259.
Bergstrom et al. Stereotaxic Computed Tomography, Am. J. Roentgenol, vol. 127 pp. 167-170 (1976).
Bouazza-Marouf et al.; “Robotic-Assisted Internal Fixation of Femoral Fractures”, IMECHE, pp. 51-58 (1995).
Brown, R., M.D., A Stereotactic Head Frame for Use with CT Body Scanners, Investigative Radiology © J.B. Lippincott Company, pp. 300-304 (Jul.-Aug. 1979).
Bryan, “Bryan Cervical Disc System Single Level Surgical Technique”, Spinal Dynamics, 2002, pp. 1-33.
Bucholz et al., “Variables affecting the accuracy of stereotactic localization using computerized tomography,” Journal of Neurosurgery, vol. 79, Nov. 1993, pp. 667-673.
Bucholz, R.D., et al. Image-guided surgical techniques for infections and trauma of the central nervous system, Neurosurg. Clinics of N.A., vol. 7, No. 2, pp. 187-200 (1996).
Bucholz, R.D., et al., A Comparison of Sonic Digitizers Versus Light Emitting Diode-Based Localization, Interactive Image-Guided Neurosurgery, Chapter 16, pp. 179-200 (1993).
Bucholz, R.D., et al., Intraoperative localization using a three dimensional optical digitizer, SPIE—The Intl. Soc. for Opt. Eng., vol. 1894, pp. 312-322 (Jan. 17-19, 1993).
Bucholz, R.D., et al., The Correction of Stereotactic Inaccuracy Caused by Brain Shift Using an Intraoperative Ultrasound Device, First Joint Conference, Computer Vision, Virtual Reality and Robotics in Medicine and Medical Robotics and Computer-Assisted Surgery, Grenoble, France, pp. 459-466 (Mar. 19-22, 1997).
C. Brack et al., “Accurate X-ray Based Navigation in Computer-Assisted Orthopedic Surgery,” CAR '98, pp. 716-722.
Champleboux et al., “Accurate Calibration of Cameras and Range Imaging Sensors: the NPBS Method,” IEEE International Conference on Robotics and Automation, Nice, France, May 1992.
Champleboux, G., Utilisation de Fonctions Splines pour la Mise au Point D'un Capteur Tridimensionnel sans Contact, Quelques Applications Medicales, Jul. 1991.
Cinquin, P., Lavallee, S., Demongeot, J., Computer Assisted Medical Interventions, International Advanced Robotics Programme, Sep. 1989, pp. 63-65.
Clarysse, P., Gibon, D., Rousseau, J., Blond, S., Vasseur, C., Marchandise, X., A Computer-Assisted System for 3-D Frameless Localization in Stereotaxic MRI, IEEE Transactions on Medical Imaging, vol. 10, No. 4, Dec. 1991, pp. 523-529.
Colchester, A.C.F., Hawkes, D.J., Information Processing in Medical Imaging, Lecture Notes in Computer Science, 12th International Conference, IPMI, Jul. 1991, pp. 136-141.
Cutting M.D. et al., Optical Tracking of Bone Fragments During Craniofacial Surgery, Second Annual International Symposium on Medical Robotics and Computer Assisted Surgery, pp. 221-225, (Nov. 1995).
E. Frederick Barrick, “Journal of Orthopaedic Trauma: Distal Locking Screw Insertion Using a Cannulated Drill Bit: Technical Note,” Raven Press, vol. 7, No. 3, 1993, pp. 248-251.
Feldmar et al., “3D-2D Projective Registration of Free-Form Curves and Surfaces,” Rapport de recherche (Inria Sophia Antipolis), 1994, pp. 1-44.
Foley, J.D., Van Dam, A., Fundamentals of Interactive Computer Graphics, The Systems Programming Series, Chapter 7, Jul. 1984, pp. 245-266.
Foley, K.T., Smith, K.R., Bucholz, R.D., Image-guided intraoperative Spinal Localization, Intraoperative Neuroprotection, Chapter 19, 1996, pp. 325-340.
Foley, K.T., The StealthStation: Three-Dimensional Image-Interactive Guidance for the Spine Surgeon, Spinal Frontiers, Apr. 1996, pp. 7-9.
Friets, E.M., et al. A Frameless Stereotaxic Operating Microscope for Neurosurgery, IEEE Trans. on Biomed. Eng., vol. 36, No. 6, pp. 608-617 (Jul. 1989).
G. Selvik, et al., “A Roentgen Stereophotogrammetric System,” Acta Radiologica Diagnosis, 1983, pp. 343-352.
Gallen, C.C., et al., Intracranial Neurosurgery Guided by Functional Imaging, Surg. Neurol., vol. 42, pp. 523-530 (1994).
Galloway, R.L., et al., Interactive Image-Guided Neurosurgery, IEEE Trans. on Biomed. Eng., vol. 89, No. 12, pp. 1226-1231 (1992).
Galloway, R.L., Jr. et al, Optical localization for interactive, image-guided neurosurgery, SPIE, vol. 2164, pp. 137-145 (May 1, 1994).
Germano, “Instrumentation, Technique and Technology”, Neurosurgery, vol. 37, No. 2, Aug. 1995, pp. 348-350.
Gildenberg, P.L., Kaufman, H.H. Murthy, K.S., Calculation of Stereotactic Coordinates from the Computed Tomographic Scan, Neurosurgery, vol. 10, No. 5, May 1982, pp. 580-586.
Gomez, C.R., et al., Transcranial Doppler Ultrasound Following Closed Head Injury: Vasospasm or Vasoparalysis?, Surg. Neurol., vol. 35, pp. 30-35 (1991).
Gonzalez, R.C., Digital Image Fundamentals, Digital Image Processing, Second Edition, 1987, pp. 52-54.
Grimson, W.E.L., An Automatic Registration Method for Frameless Stereotaxy, Image Guided Surgery, and enhanced Reality Visualization, IEEE, pp. 430-436 (1994).
Grimson, W.E.L., et al., Virtual-reality technology is giving surgeons the equivalent of x-ray vision helping them to remove tumors more effectively, to minimize surgical wounds and to avoid damaging critical tissues, Sci. Amer., vol. 280, No. 6, pp. 62-69 (Jun. 1999).
Guthrie, B.L., Graphic-Interactive Cranial Surgery: The Operating Arm System, Handbook of Stereotaxy Using the CRW Apparatus, Chapter 13, pp. 193-211 (1994).
Hamadeh et al, “Kinematic Study of Lumbar Spine Using Functional Radiographies and 3D/2D Registration,” TIMC UMR 5525—IMAG (1997).
Hardy, T., M.D., et al., CASS: A Program for Computer Assisted Stereotaxic Surgery, The Fifth Annual Symposium on Computer Applications in Medical Care, Proceedings, Nov. 1-4, 1981, IEEE, pp. 1116-1126, (1981).
Hatch, et al., “Reference-Display System for the Integration of CT Scanning and the Operating Microscope”, Proceedings of the Eleventh Annual Northeast Bioengineering Conference, May 1985, pp. 252-254.
Hatch, J.F., Reference-Display System for the Integration of CT Scanning and the Operating Microscope, Thesis, Thayer School of Engineering, Oct. 1984, pp. 1-189.
Heilbrun et al., “Preliminary experience with Brown-Roberts-Wells (BRW) computerized tomography stereotaxic guidance system,” Journal of Neurosurgery, vol. 59, Aug. 1983, pp. 217-222.
Heilbrun, M.D., Progressive Technology Applications, Neurosurgery for the Third Millennium, Chapter 15, J. Whitaker & Sons, Ltd., Amer. Assoc. of Neurol. Surgeons, pp. 191-198 (1992).
Heilbrun, M.P., Computed Tomography—Guided Stereotactic Systems, Clinical Neurosurgery, Chapter 31, pp. 564-581 (1983).
Heilbrun, M.P., et al., Stereotactic Localization and Guidance Using a Machine Vision Technique, Stereotact. & Funct. Neurosurg., Proceed. of the Mtg. of the Amer. Soc. for Stereot. and Funct. Neurosurg. (Pittsburgh, PA) vol. 58, pp. 94-98 (1992).
Henderson, J.M., Smith, K.R. Bucholz, R.D., An Accurate and Ergonomic Method of Registration for Image-guided Neurosurgery, Computerized Medical Imaging and Graphics, vol. 18, No. 4, Jul.-Aug. 1994, pp. 273-277.
Hoerenz, P., The Operating Microscope I. Optical Principles, Illumination Systems, and Support Systems, Journal of Microsurgery, vol. 1, 1980, pp. 364-369.
Hofstetter et al., “Fluoroscopy Based Surgical Navigation—Concept and Clinical Applications,” Computer Assisted Radiology and Surgery, 1997, pp. 956-960.
Horner et al., "A Comparison of CT-Stereotaxic Brain Biopsy Techniques," Investigative Radiology, Sep.-Oct. 1984, pp. 367-373.
Hounsfield, G.N., Computerized transverse axial scanning (tomography): Part 1. Description of system, British Journal of Radiology, vol. 46, No. 552, Dec. 1973, pp. 1016-1022.
Jacques, S., Sheldon, C.H., McCann, G.D., A Computerized Microstereotactic Method to Approach, 3-Dimensionally Reconstruct, Remove and Adjuvantly Treat Small CNS Lesions, Applied Neurophysiology, vol. 43, 1980, pp. 176-182.
Jacques, S., Sheldon, C.H., McCann, G.D., Freshwater, D.B., Rand, R., Computerized three-dimensional stereotaxic removal of small central nervous system lesion in patients, J. Neurosurg., vol. 53, Dec. 1980, pp. 816-820.
Jurgen Weese, et al., "An Approach to 2D/3D Registration of a Vertebra in 2D X-ray Fluoroscopies with 3D CT Images," (1997), pp. 119-128.
Kall, B., The Impact of Computer and Imaging Technology on Stereotactic Surgery, Proceedings of the Meeting of the American Society for Stereotactic and Functional Neurosurgery, pp. 10-22 (1987).
Kato, A., et al., A frameless, armless navigational system for computer-assisted neurosurgery, J. Neurosurg., vol. 74, pp. 845-849 (May 1991).
Kelly et al., “Computer-assisted stereotaxic laser resection of intra-axial brain neoplasms,” Journal of Neurosurgery, vol. 64, Mar. 1986, pp. 427-439.
Kelly, P.J., Computer Assisted Stereotactic Biopsy and Volumetric Resection of Pediatric Brain Tumors, Brain Tumors in Children, Neurologic Clinics, vol. 9, No. 2, pp. 317-336 (May 1991).
Kelly, P.J., Computer-Directed Stereotactic Resection of Brain Tumors, Neurologica Operative Atlas, vol. 1, No. 4, pp. 299-313 (1991).
Kelly, P.J., et al., Results of Computed Tomography-based Computer-assisted Stereotactic Resection of Metastatic Intracranial Tumors, Neurosurgery, vol. 22, No. 1, Part 1, 1988, pp. 7-17 (Jan. 1988).
Kelly, P.J., Kall, B., Goerss, S., Alker, G.J., Jr., Precision Resection of Intra-Axial CNS Lesions by CT-Based Stereotactic Craniotomy and Computer Monitored CO Laser, Acta Neurochirurgica, vol. 68, 1983, pp. 1-9.
Kelly, P.J., Stereotactic Imaging, Surgical Planning and Computer-Assisted Resection of Intracranial Lesions: Methods and Results, Advances and Technical Standards in Neurosurgery, vol. 17, pp. 78-118, (1990).
Kim, W.S. et al., A Helmet Mounted Display for Telerobotics, IEEE, pp. 543-547 (1988).
Klimek, L., et al., Long-Term Experience with Different Types of Localization Systems in Skull-Base Surgery, Ear, Nose & Throat Surgery, Chapter 51, pp. 635-638 (1996).
Kosugi, Y., et al., An Articulated Neurosurgical Navigation System Using MRI and CT Images, IEEE Trans. on Biomed. Eng., vol. 35, No. 2, pp. 147-152 (Feb. 1988).
Krybus, W., et al., Navigation Support for Surgery by Means of Optical Position Detection, Computer Assisted Radiology Proceed. of the Intl. Symp. CAR '91 Computed Assisted Radiology, pp. 362-366 (Jul. 3-6, 1991).
Kwoh, Y.S., Ph.D., et al., A New Computerized Tomographic-Aided Robotic Stereotaxis System, Robotics Age, vol. 7, No. 6, pp. 17-22 (Jun. 1985).
L. Lemieux et al., “A Patient-to-Computer-Tomography Image Registration Method Based on Digitally Reconstructed Radiographs,” Med. Phys. 21 (11), Nov. 1994, pp. 1749-1760.
Laitinen et al., "An Adapter for Computed Tomography-Guided Stereotaxis," Surg. Neurol., 1985, pp. 559-566.
Laitinen, “Noninvasive multipurpose stereoadapter,” Neurological Research, Jun. 1987, pp. 137-141.
Lavallee et al., “Computer Assisted Spine Surgery: A Technique for Accurate Transpedicular Screw Fixation Using CT Data and a 3-D Optical Localizer,” TIMC, Faculte de Medecine de Grenoble. (1995).
Lavallee, S., A New System for Computer Assisted Neurosurgery, IEEE Engineering in Medicine & Biology Society 11th Annual International Conference, 1989, pp. 0926-0927.
Lavallee, S., Brunie, L., Mazier, B., Cinquin, P., Matching of Medical Images for Computed and Robot Assisted Surgery, IEEE EMBS, Orlando, 1991.
Lavallee, S., Cinquin, P., Dermongeot, J., Benabid, A.L., Marque, I., Djaid, M., Computer Assisted Interventionist Imaging: The Instance of Stereotactic Brain Surgery, North-Holland MEDINFO 89, Part 1, 1989, pp. 613-617.
Lavallee, S., Cinquin, P., Dermongeot, J., Benabid, A.L., Marque, I., Djaid, M., Computer Assisted Driving of a Needle into the Brain, Proceedings of the International Symposium CAR '89, Computer Assisted Radiology, 1989, pp. 416-420.
Lavallee, S., et al., Computer Assisted Knee Anterior Cruciate Ligament Reconstruction First Clinical Tests, Proceedings of the First International Symposium on Medical Robotics and Computer Assisted Surgery, pp. 11-16 (Sep. 1994).
Lavallee, S., et al., Computer Assisted Medical Interventions, NATO ASI Series, vol. F 60, 3d Imaging in Medic., pp. 301-312 (1990).
Lavallee, S., Zseliski, R., Brunie, L., Matching 3-D Smooth Surfaces with Their 2-D Projections using 3-D Distance Maps, SPIE, vol. 1570, Geometric Methods in Computer Vision, 1991, pp. 322-336.
Leavitt, D.D., et al., Dynamic Field Shaping to Optimize Stereotactic Radiosurgery, I.J. Rad. Onc. Biol. Phys., vol. 21, pp. 1247-1255 (1991).
Leksell, L., Jemberg, B. Stereotaxis and Tomography—A Technical Note, ACTA Neurochirurgica, vol. 52, 1980, pp. 1-7.
Leo Joskowicz et al., “Computer-Aided Image-Guided Bone Fracture Surgery: Concept and Implementation,” CAR '98, pp. 710-715.
Levin, D.N., Hu, X., Tan, K.K., Galhotra, S., Pelizzari, C.A., Chen, G.T.Y., Beck, R.N., Chen. C., Cooper, M.D., Mullan, J.F., Hekmatpanah, J., Spire, J., The Brain: Integrated Three-dimensional Display of MR and PET Images, Radiology, vol. 172, No. 3, Sep. 1989, pp. 783-789.
Lisa M. Gottesfeld Brown et al., "Registration of Planar Film Radiographs with Computer Tomography," Proceedings of MMBIA, Jun. 1996, pp. 42-51.
Maurer, Jr., et al., Registration of Head CT Images to Physical Space Using a Weighted Combination of Points and Surfaces, IEEE Trans. on Med. Imaging, vol. 17, No. 5, pp. 753-761 (Oct. 1998).
Mazier, B., Lavallee, S., Cinquin, P., Chirurgie de la Colonne Vertebrale Assistee par Ordinateur: Application au Vissage Pediculaire, Innov. Tech. Biol. Med., vol. 11, No. 5, 1990, pp. 559-566.
Mazier, B., Lavallee, S., Cinquin, P., Computer-Assisted Interventionist Imaging: Application to the Vertebral Column Surgery, Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 12, No. 1, 1990, pp. 0430-0431.
McGirr, S., M.D., et al., Stereotactic Resection of Juvenile Pilocytic Astrocytomas of the Thalamus and Basal Ganglia, Neurosurgery, vol. 20, No. 3, pp. 447-452, (1987).
Merloz, et al., “Computer Assisted Spine Surgery”, Clinical Assisted Spine Surgery, No. 337, pp. 86-96 (1997).
Ng, W.S. et al., Robotic Surgery—A First-Hand Experience in Transurethral Resection of the Prostate Surgery, IEEE Eng. in Med. and Biology, pp. 120-125 (Mar. 1993).
P. Cinquin, et al., “Computer Assisted Medical Interventions,” IEEE Engineering in Medicine and Biology, May/Jun. 1995, pp. 254-263.
P. Potamianos, et al., "Intra-Operative Imaging Guidance for Keyhole Surgery Methodology and Calibration," First International Symposium on Medical Robotics and Computer Assisted Surgery, Sep. 22-24, 1994, pp. 98-104.
Pascal Phillipe Sautot, “Vissage Pediculaire Assiste Par Ordinateur,” Sep. 20, 1994.
Pelizzari, C.A., Chen, G.T.Y., Halpern, H., Chen, C.T., Cooper, M.D., No. 528—Three Dimensional Correlation of PET, CT and MRI Images, The Journal of Nuclear Medicine, vol. 28, No. 4, Apr. 1987, p. 682.
Pelizzari, C.A., Chen, G.T.Y., Spelbring, D.R., Weichselbaum, R.R., Chen, C., Accurate Three-Dimensional Registration of CT, PET, and/or MR Images of the Brain, Journal of Computer Assisted Tomography, Jan./Feb. 1989, pp. 20-26.
Penn, R.D., et al., Stereotactic Surgery with Image Processing of Computerized Tomographic Scans, Neurosurgery, vol. 3, No. 2, pp. 157-163 (Sep.-Oct. 1978).
R. Phillips et al., “Image Guided Orthopaedic Surgery Design and Analysis,” Trans Inst. MC, vol. 17, No. 5, 1995, pp. 251-264.
Reinhardt, H., et al., A Computer-Assisted Device for Intraoperative CT-Correlated Localization of Brain Tumors, pp. 51-58 (1988).
Reinhardt, H.F. et al., Sonic Stereometry in Microsurgical Procedures for Deep-Seated Brain Tumors and Vascular Malformations, Neurosurgery, vol. 32, No. 1, pp. 51-57 (Jan. 1993).
Reinhardt, H.F., et al., Mikrochirurgische Entfernung tiefliegender Gefäßmißbildungen mit Hilfe der Sonar-Stereometrie (Microsurgical Removal of Deep-Seated Vascular Malformations Using Sonar Stereometry), Ultraschall in Med. 12, pp. 80-83 (1991).
Reinhardt, H.F., Landolt, H., CT-Guided “Real Time” Stereotaxy, ACTA Neurochirurgica, 1989.
Reinhardt, Hans. F., Neuronavigation: A Ten-Year Review, Neurosurgery, pp. 329-341 (1996).
Roberts, D.W., Strohbehn, J.W., Hatch, J.F. Murray, W. Kettenberger, H., A frameless stereotaxic integration of computerized tomographic imaging and the operating microscope, J. Neurosurg., vol. 65, Oct. 1986, pp. 545-549.
Rosenbaum, A.E., Lunsford, L.D., Perry, J.H., Computerized Tomography Guided Stereotaxis: A New Approach, Applied Neurophysiology, vol. 43, No. 3-5, 1980, pp. 172-173.
Schueler et al., “Correction of Image Intensifier Distortion for Three-Dimensional X-Ray Angiography,” SPIE Medical Imaging 1995, vol. 2432, pp. 272-279.
Sheldon, C.H., McCann, G., Jacques, S., Lutes, H.R., Frazier, R.E., Katz, R., Kuki, R., Development of a computerized microsteroetaxic method for localization and removal of minute CNS lesions under direct 3-D vision, J. Neurosurg., vol. 52, 1980, pp. 21-27.
Simon, D.A., Accuracy Validation in Image-Guided Orthopaedic Surgery, Second Annual Intl. Symp. on Med. Rob. and Comp-Assisted surgery, MRCAS, pp. 185-192 (1995).
Smith et al., "The Neurostation™—A Highly Accurate, Minimally Invasive Solution to Frameless Stereotactic Neurosurgery," Computerized Medical Imaging and Graphics, vol. 18, Jul.-Aug. 1994, pp. 247-256.
Smith, K.R., Bucholz, R.D., Computer Methods for Improved Diagnostic Image Display Applied to Stereotactic Neurosurgery, Automedical, vol. 14, 1992, pp. 371-382.
Smith, K.R., et al. Multimodality Image Analysis and Display Methods for Improved Tumor Localization in Stereotactic Neurosurgery, Annul Intl. Conf. of the IEEE Eng. in Med. and Biol. Soc., vol. 13, No. 1, p. 210 (1991).
Stephane Lavallee, et al., "Image guided operating robot: a clinical application in stereotactic neurosurgery," Proceedings of the 1992 IEEE International Conference on Robotics and Automation, May 1992, pp. 618-624.
Tan, K., Ph.D., et al., A frameless stereotactic approach to neurosurgical planning based on retrospective patient-image registration, J. Neurosurg., vol. 79, pp. 296-303 (Aug. 1993).
Thompson, et al., A System for Anatomical and Functional Mapping of the Human Thalamus, Computers and Biomedical Research, vol. 10, pp. 9-24 (1977).
Trobraugh, J.W., et al., Frameless Stereotactic Ultrasonography: Method and Applications, Computerized Medical Imaging and Graphics, vol. 18, No. 4, pp. 235-246 (1994).
Von Hanwhr et al., Foreword, Computerized Medical Imaging and Graphics, vol. 18, No. 4, pp. 225-228, (Jul.-Aug. 1994).
W.J. Viant et al., “A Computer Assisted Orthopaedic System for Distal Locking of Intramedullary Nails,” Proc. MediMEC '95, Bristol, 1995, pp. 86-91.
Wang, M.Y., et al., An Automatic Technique for Finding and Localizing Externally Attached Markers in CT and MR Volume Images of the Head, IEEE Trans. on Biomed. Eng., vol. 43, No. 6, pp. 627-637 (Jun. 1996).
Watanabe, E., M.D., et al., Open Surgery Assisted by the Neuronavigator, a Stereotactic, Articulated, Sensitive Arm, Neurosurgery, vol. 28, No. 6, pp. 792-800 (1991).
Watanabe, E., Watanabe, T., Manaka, S., Mayanagi, Y., Takakura, K., Three-Dimensional Digitizer (Neuronavigator): New Equipment for Computed Tomography-Guided Stereotaxic Surgery, Surgical Neurology, vol. 27, No. 6, Jun. 1987, pp. 543-547.
Watanabe, H., Neuronavigator, Igaku-no-Ayumi, vol. 137, No. 6, May 10, 1986, pp. 1-4.
Related Publications (1)
Number Date Country
20090262111 A1 Oct 2009 US
Divisions (1)
Number Date Country
Parent 11188972 Jul 2005 US
Child 12493670 US
Continuations (1)
Number Date Country
Parent 10087288 Feb 2002 US
Child 11188972 US