The present invention is directed generally to image guided surgery, and more particularly, to systems and methods for using one or more fluoroscopic X-ray images to assist in instrument navigation during surgery.
Modern diagnostic medicine has benefitted significantly from radiology, which is the use of radiation, such as x-rays, to generate images of internal body structures. In general, to create an x-ray image, x-ray beams are passed through the body and absorbed, in varying amounts, by tissues in the body. An x-ray image is created based on the relative differences in the transmitted x-ray intensities.
Techniques are known through which x-ray images are used to locate the real-time position of surgical instruments in the patient anatomy represented by the x-ray image without requiring x-rays to be continually taken. In one such system, as disclosed in U.S. Pat. No. 5,772,594 to Barrick, light emitting diodes (LEDs) are placed on a C-arm fluoroscope x-ray imager, on a drill, and on a reference bar positioned on the bone to be studied. A three-dimensional optical digitizer senses the position of the LEDs, and hence the position of the drill, the C-arm fluoroscope, and the object bone. Based on this information, the real-time position of the drill in anatomy represented by the x-ray image is determined, and a corresponding representation of the drill in the x-ray image is displayed. This allows the surgeon to continually observe the progress of the surgery without necessitating additional x-ray images.
Surgical navigational guidance, as discussed above, can provide a tool for helping the physician perform surgery. It is an object of the present invention to provide several enhancements to traditional surgical navigational guidance techniques.
Objects and advantages of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objects and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the appended claims.
One aspect of the present invention is directed to an x-ray imaging device comprising a plurality of elements. In particular, the x-ray imaging device includes an x-ray source for generating cycles of x-ray radiation corresponding to an image acquisition cycle; an x-ray receiving section positioned so that x-rays emanating from the x-ray source enter the x-ray receiving section, the x-ray receiving section generating an image representing intensities of the x-rays entering the x-ray receiving section. Additionally, a computer is coupled to the x-ray receiving section and radiation sensors are located in a path of the x-rays emitted from the x-ray source. The radiation sensors detect the beginning and end of a radiation cycle and transmit the detected beginning and end of the radiation cycle to the computer.
Another imaging device consistent with the present invention includes a rotatable C-arm support having first and second ends. The first end includes an x-ray source for initiating an imaging cycle and the second end includes an x-ray receiving section positioned so that x-rays emanating from the x-ray source enter the x-ray receiving section. The x-ray receiving section generates an image representing the intensities of the x-rays entering the x-ray receiving section. Further, a calibration and tracking target is included and a tracking sensor detects the position, in three-dimensional space, of the calibration and tracking target; and a computer is coupled to the x-ray receiving section and the tracking sensor. The computer detects motion of the C-arm based on changes in the position detected by the tracking sensor.
Another aspect consistent with the present invention is directed to a surgical instrument navigation system. The system comprises a computer processor; a tracking sensor for sensing three-dimensional position information of a surgical instrument and transmitting the position information to the computer processor; a memory coupled to the computer processor, the memory including computer instructions that when executed by the computer processor cause the processor to generate an icon representing the surgical instrument and to overlay the icon on a pre-acquired x-ray image, the icon of the surgical instrument representing the real-time position of the surgical instrument projected into the pre-acquired x-ray image and the icon being generated as a first representation when the surgical instrument is positioned such that it is substantially viewable in the plane of the pre-acquired image and the icon being generated as a second representation when the surgical instrument is positioned such that it is substantially perpendicular to the plane of the pre-acquired image. Finally, a display is coupled to the processor for displaying the generated icon superimposed on the pre-acquired image.
Yet another system consistent with the present invention comprises a computer processor and a tracking sensor for sensing three-dimensional position information of a surgical instrument and transmitting the position information to the computer processor. A memory is coupled to the computer processor, the memory including computer instructions that when executed by the computer processor cause the processor to generate an icon representing the surgical instrument positioned in a pre-acquired image of a patient's anatomy, the icon of the surgical instrument including a first portion corresponding to an actual position of the surgical instrument and a second portion corresponding to a projection of the surgical instrument along a line given by a current trajectory of the surgical instrument. A display is coupled to the processor for displaying the generated icon superimposed on the pre-acquired image.
Still further, another surgical instrument navigation system consistent with the present invention comprises a rotatable C-arm including an x-ray source and an x-ray receiving section for acquiring x-ray images of a patient, the C-arm being rotatable about one of a plurality of mechanical axes. A computer processor is coupled to the rotatable C-arm and a memory is coupled to the computer processor. The memory stores the x-ray images acquired by the rotatable C-arm and computer instructions that when executed by the computer processor cause the computer processor to generate a line representing a projection of a plane parallel to one of the plurality of the mechanical axes of the C-arm into the x-ray image, the line enabling visual alignment of the one of the plurality of mechanical axes of the C-arm with an axis relating complementary image views. A display is coupled to the processor for displaying the generated line superimposed on the x-ray image.
Yet another system consistent with the present invention is for defining a surgical plan and comprises an x-ray imaging device; a surgical instrument; a tracking sensor for detecting the position, in three-dimensional space, of the surgical instrument; a computer processor in communication with the tracking sensor for defining a point in a virtual x-ray imaging path as the three-dimensional location of the surgical instrument, the point being outside of a true x-ray imaging path of the x-ray imaging device, the computer processor translating position of the surgical instrument within the virtual x-ray imaging path to a corresponding position in the true x-ray imaging path; and a display coupled to the processor for displaying a pre-acquired x-ray image overlaid with an iconic representation of the surgical instrument, the position of the iconic representation of the surgical instrument in the pre-acquired x-ray image corresponding to the translated position of the surgical instrument.
Yet another system consistent with the present invention for defining a surgical plan comprises a combination of elements. The elements include an x-ray imaging device; a surgical instrument; a tracking sensor for detecting the position, in three-dimensional space, of the surgical instrument; a computer processor in communication with the tracking sensor for calculating a projection of the trajectory of the surgical instrument a distance ahead of the actual location of the surgical instrument; and a display coupled to the processor for displaying a pre-acquired x-ray image overlaid with an iconic representation of the surgical instrument and the calculated projection of the trajectory of the surgical instrument.
Yet another system consistent with the present invention is for aligning a first bone segment with a second bone segment in a patient. The system comprises a first tracking marker attached to the first bone segment and a second tracking marker attached to the second bone segment. A tracking sensor detects the relative position, in three-dimensional space, of the first and second tracking markers. A computer delineates boundaries of images of the first and second bone segments in a pre-acquired x-ray image and when the second bone segment is moved in the patient, the computer correspondingly moves the delineated boundary of the second bone segment in the x-ray image. A display is coupled to the computer and displays the pre-acquired x-ray image overlaid with representations of the delineated boundaries of the first and second bone segments.
Yet another system consistent with the present invention is directed to a system for placing a surgical implant into a patient. The system comprises a computer processor; means for entering dimensions of the implant; a tracking sensor for sensing three-dimensional position information of a surgical instrument on which the surgical implant is attached, the tracking sensor transmitting the position information to the computer processor; and a memory coupled to the computer processor, the memory including computer instructions that when executed by the computer processor cause the processor to generate an icon representing the surgical instrument and the attached surgical implant, and to overlay the icon on a pre-acquired two-dimensional x-ray image, the icon of the surgical instrument representing the real-time position of the surgical instrument relative to the pre-acquired two-dimensional x-ray image.
In addition to the above-mentioned devices and systems, the concepts of the present invention may be practiced as a number of related methods.
An additional method consistent with the present invention is a method of acquiring a two-dimensional x-ray image of patient anatomy from a desired view direction. The method comprises generating the two-dimensional image using an x-ray imager; specifying a view direction in a three-dimensional image representing the patient anatomy; generating a two-dimensional digitally reconstructed radiograph (DRR) image based on the three-dimensional image and the specified view direction; and
determining that the two-dimensional x-ray image corresponds to the desired view direction by matching the DRR image to the x-ray image.
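The DRR-matching step can be sketched with a simple intensity-based similarity test. The following is an illustrative sketch, not the specific matching method of the invention; normalized cross-correlation is one of several similarity measures that could be used, and the function names and the 0.9 threshold are assumptions.

```python
import numpy as np

def normalized_cross_correlation(drr, xray):
    """Similarity score in [-1, 1] between a DRR and an x-ray image.

    Both images are 2-D arrays of matching shape; 1.0 means a perfect
    (up to linear intensity scaling) match."""
    a = drr.astype(float) - drr.mean()
    b = xray.astype(float) - xray.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0.0:
        return 0.0
    return float((a * b).sum() / denom)

def view_matches(drr, xray, threshold=0.9):
    """Declare the x-ray to be from the desired view direction when the
    DRR rendered along that direction correlates strongly with it."""
    return normalized_cross_correlation(drr, xray) >= threshold
```

Because normalized cross-correlation is invariant to linear intensity changes, it tolerates the brightness and contrast differences one would expect between a simulated radiograph and a real fluoroscopic image.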
Another method consistent with the present invention is a method of calculating an angle between a surgical instrument and a plane selected in an x-ray image. The method comprises a number of steps, including: defining at least two points in the x-ray image; defining a plane passing through the x-ray image as the plane including the two points and linear projections of the two points as dictated by a calibration transformation used to calibrate the x-ray image for its particular imaging device; sensing a position of the surgical instrument in three-dimensional space; and calculating the angle between a projection of the surgical instrument in three-dimensional space and the plane.
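The angle computation above can be sketched in a few lines, assuming the plane is represented by a unit normal built from the two defined points and the projection of one of them (all expressed as 3-D points), and the instrument by a direction vector; the function names are illustrative.

```python
import numpy as np

def plane_normal(p1, p2, q1):
    """Unit normal of the plane containing points p1, p2 and q1, where
    q1 lies on the linear projection of p1 (all points are 3-D)."""
    v1 = np.asarray(p2, float) - np.asarray(p1, float)
    v2 = np.asarray(q1, float) - np.asarray(p1, float)
    n = np.cross(v1, v2)
    return n / np.linalg.norm(n)

def line_plane_angle_deg(direction, normal):
    """Angle between a line (the instrument axis) and the plane, in
    degrees: 0 when the line lies in the plane, 90 when perpendicular."""
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    s = abs(float(np.dot(d, normal)))
    return float(np.degrees(np.arcsin(np.clip(s, 0.0, 1.0))))
```

The `arcsin` of the dot product with the normal gives the complement of the usual line-to-line angle, which is exactly the line-to-plane angle the method calls for.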
Yet another method consistent with the present invention is a method for aligning a fluoroscopic imager with a view direction of the medial axis of a patient's pedicle. The method comprises displaying a three-dimensional image of an axial cross-section of a vertebra of the patient; extracting an angle from the three-dimensional image corresponding to the angle separating an anterior/posterior axis and the medial axis of the pedicle; aligning the fluoroscopic imager with a long axis of the patient; and rotating the fluoroscopic imager about the long axis of the patient through the extracted angle.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate several embodiments consistent with this invention and, together with the description, help explain the principles of the invention. In the drawings,
As described herein, novel methods and systems improve surgical navigational guidance using one or more fluoroscopic x-ray images. The methods and systems may be used for either navigational guidance using only two-dimensional fluoroscopic images or for navigational guidance using a combination of two-dimensional fluoroscopic images and three-dimensional volumetric images, such as CT or MRI images.
Reference will now be made in detail to embodiments of the invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
One appropriate implementation of imaging device 100 is the “Series9600 Mobile Digital Imaging System,” from OEC Medical Systems, Inc., of Salt Lake City, Utah, although calibration and tracking target 106 and radiation sensors 107 are typically not included in the Series9600 Mobile Digital Imaging System and may have to be added. The “Series9600 Mobile Digital Imaging System” is otherwise structurally similar to imaging device 100.
In operation, x-ray source 104 generates x-rays that propagate through patient 110 and calibration target 106, and into x-ray receiving section 105. Receiving section 105 generates an image representing the intensities of the received x-rays. Typically, receiving section 105 comprises an image intensifier that converts the x-rays to visible light and a charge coupled device (CCD) video camera that converts the visible light to digital images. Receiving section 105 may also be a device that converts x-rays directly to digital images, thus potentially avoiding distortion introduced by first converting to visible light.
Fluoroscopic images taken by imaging device 100 are transmitted to computer 115, where they may further be forwarded to computer 120. Computer 120 provides facilities for displaying (on monitor 121), saving, digitally manipulating, or printing a hard copy of the received images. Three-dimensional images, such as pre-acquired patient specific CT/MR data set 124 or a three-dimensional atlas data set 126 (described in more detail below) may also be manipulated by computer 120 and displayed by monitor 121. Images, instead of or in addition to being displayed on monitor 121, may also be displayed to the physician through a heads-up-display.
Although computers 115 and 120 are shown as two separate computers, they alternatively could be variously implemented as multiple computers or as a single computer that performs the functions performed by computers 115 and 120. In this case, the single computer would receive input from both C-arm imager 100 and tracking sensor 130.
Radiation sensors 107 sense the presence of radiation, which is used to determine whether or not imaging device 100 is actively imaging. The result of their detection is transmitted to processing computer 120. Alternatively, a person may manually indicate when device 100 is actively imaging or this function can be built into x-ray source 104, x-ray receiving section 105, or control computer 115.
In operation, the patient is positioned between the x-ray source 104 and the x-ray receiving section 105. In response to an operator's command input at control computer 115, x-rays emanate from source 104 and pass through patient 110, calibration target 106, and into receiving section 105, which generates a two-dimensional image of the patient.
C-arm 103 is capable of rotating relative to patient 110, allowing images of patient 110 to be taken from multiple directions. For example, the physician may rotate C-arm 103 in the direction of arrows 108 or about the long axis of the patient. Each of these directions of movement involves rotation about a mechanical axis of the C-arm. In this example, the long axis of the patient is aligned with the mechanical axis of the C-arm.
Raw images generated by receiving section 105 tend to suffer from undesirable distortion caused by a number of factors, including inherent image distortion in the image intensifier and external electromagnetic fields. Drawings representing ideal and distorted images are shown in
The image formation process in a system such as fluoroscopic C-arm imager 100 is governed by a geometric projective transformation which maps lines in the fluoroscope's field of view to points in the image (i.e., within the x-ray receiving section 105). This concept is illustrated in
Intrinsic calibration, which is the process of correcting image distortion in a received image and establishing the projective transformation for that image, involves placing “calibration markers” in the path of the x-ray, where a calibration marker is an object opaque or semi-opaque to x-rays. Calibration markers 111 are rigidly arranged in predetermined patterns in one or more planes in the path of the x-rays and are visible in the recorded images. Tracking targets, such as emitters or reflectors 109, are fixed in a rigid and known position relative to calibration markers 111.
Because the true relative positions of the calibration markers 111 in the recorded images are known, computer 120 is able to calculate an amount of distortion at each pixel in the image (where a pixel is a single point in the image). Accordingly, computer 120 can digitally compensate for the distortion in the image and generate a distortion-free, or at least a distortion-improved, image. Alternatively, distortion may be left in the image, and subsequent operations on the image, such as superimposing an iconic representation of a surgical instrument on the image (described in more detail below), may be distorted to match the image distortion determined by the calibration markers. The same calibration markers can also be used to estimate the geometric projective transformation, since the positions of these markers are known with respect to the tracking target emitters or reflectors 109 and ultimately with respect to tracking sensor 130. A more detailed explanation of methods for performing intrinsic calibration is described in the references: B. Schuele et al., “Correction of Image Intensifier Distortion for Three-Dimensional Reconstruction,” presented at SPIE Medical Imaging 1995, San Diego, Calif., 1995; G. Champleboux et al., “Accurate Calibration of Cameras and Range Imaging Sensors: the NPBS Method,” Proceedings of the 1992 IEEE International Conference on Robotics and Automation, Nice, France, May 1992; and U.S. application Ser. No. 09/106,109, filed on Jun. 29, 1998 by the present assignee, the contents of which are hereby incorporated by reference.
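One simple way to realize this kind of compensation is a least-squares fit of a smooth warp that maps the observed (distorted) marker positions to their known true positions. This is a sketch under the assumption of a low-order polynomial distortion field, not the specific method of the incorporated references; the quadratic basis and function names are illustrative.

```python
import numpy as np

def fit_distortion_correction(observed, true):
    """Least-squares fit of a quadratic 2-D warp mapping distorted
    marker pixel positions to their known true positions.

    observed, true: (N, 2) arrays of pixel coordinates, N >= 6."""
    x, y = observed[:, 0], observed[:, 1]
    # quadratic basis evaluated at the distorted coordinates
    A = np.column_stack([np.ones_like(x), x, y, x * y, x * x, y * y])
    coeffs, *_ = np.linalg.lstsq(A, true, rcond=None)
    return coeffs  # shape (6, 2): one column per output coordinate

def undistort(points, coeffs):
    """Apply the fitted correction to (N, 2) pixel coordinates."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x * x, y * y])
    return A @ coeffs
```

With six basis functions, at least six markers are needed; in practice many more are used, and the overdetermined fit averages out marker-detection noise.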
Calibration and tracking target 106 may be attached to x-ray receiving section 105 of the C-arm. Alternatively, the target 106 can be mechanically independent of the C-arm, in which case it should be positioned such that the included calibration markers 111 are visible in each fluoroscopic image to be used in navigational guidance. Element 106 serves two functions. The first, as described above, is holding calibration markers 111 used in intrinsic calibration. The second function, which is described in more detail below, is holding infrared emitters or reflectors 109, which act as a tracking target for tracking sensor 130.
Tracking sensor 130 is a real-time infrared tracking sensor linked to computer 120. Specially constructed surgical instruments and other markers in the field of tracking sensor 130 can be detected and located in three-dimensional space. For example, a surgical instrument 140, such as a drill, is embedded with infrared emitters or reflectors 141 on its handle. Tracking sensor 130 detects the presence and location of infrared emitters or reflectors 141. Because the relative spatial locations of the emitters or reflectors in instrument 140 are known a priori, tracking sensor 130 and computer 120 are able to locate instrument 140 in three-dimensional space using well known mathematical transformations. Instead of using infrared tracking sensor 130 and corresponding infrared emitters or reflectors, other types of positional location devices are known in the art, and may be used. For example, a positional location device may also be based on magnetic fields, sonic emissions, or radio waves.
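The “well known mathematical transformations” for locating a rigid set of emitters can be illustrated with the SVD-based Kabsch/Umeyama point registration. This is a minimal sketch, assuming the sensor reports the emitter positions in corresponding order with the a priori model; it is not tied to any particular tracking product.

```python
import numpy as np

def locate_rigid_body(model_pts, measured_pts):
    """Best-fit rotation R and translation t (in the least-squares
    sense) such that measured ≈ R @ model + t.

    model_pts: (N, 3) emitter positions known a priori in the
    instrument's own frame; measured_pts: (N, 3) positions reported by
    the tracking sensor, in corresponding order."""
    P = np.asarray(model_pts, float)
    Q = np.asarray(measured_pts, float)
    Pc, Qc = P - P.mean(0), Q - Q.mean(0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    # guard against a reflection in the least-squares solution
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = Q.mean(0) - R @ P.mean(0)
    return R, t
```

The recovered (R, t) pair fully determines the pose of the instrument, including its tip, since the tip's offset from the emitters is also known a priori.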
Reference frame marker 150, like surgical instrument 140, is embedded with infrared emitters or reflectors, labeled 151. As with instrument 140, tracking sensor 130 similarly detects the spatial location of emitters/reflectors 151, through which tracking sensor 130 and computer 120 determine the three-dimensional position of dynamic reference frame marker 150. The determination of the three-dimensional position of an object relative to a patient is known in the art, and is discussed, for example, in the following references, each of which is hereby incorporated by reference: PCT Publication WO 96/11624 to Bucholz et al., published Apr. 25, 1996; U.S. Pat. No. 5,384,454 to Bucholz; U.S. Pat. No. 5,851,183 to Bucholz; and U.S. Pat. No. 5,871,445 to Bucholz.
During an operation, dynamic reference frame marker 150 is attached in a fixed position relative to the portion of the patient to be operated on. For example, when inserting a screw into the spine of patient 110, dynamic reference frame marker 150 may be physically attached to a portion of the spine of the patient. Because dynamic reference frame 150 is in a fixed position relative to the patient anatomy, and instrument 140 can be accurately located in three dimensional space relative to dynamic reference frame 150, instrument 140 can also be located relative to the patient's anatomy.
As discussed above, calibration and tracking target 106 also includes infrared emitters or reflectors 109 similar to those in instrument 140 or dynamic reference frame 150. Accordingly, tracking sensor 130 and computer 120 may determine the three-dimensional position of calibration target 106 relative to instrument 140 and/or dynamic reference frame 150 and thus the patient position.
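Relating the tracked objects to one another amounts to composing rigid transforms. The sketch below, with hypothetical frame names, shows the standard construction: because the sensor frame cancels out, the instrument pose expressed in the dynamic reference frame is unaffected by common motion of the patient and sensor.

```python
import numpy as np

def homogeneous(R, t):
    """Build a 4x4 rigid transform from rotation R (3x3) and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def instrument_in_reference_frame(T_sensor_instr, T_sensor_ref):
    """Express the instrument pose relative to the dynamic reference
    frame: inv(sensor->ref) composed with sensor->instrument."""
    return np.linalg.inv(T_sensor_ref) @ T_sensor_instr
```

This invariance is what makes the dynamic reference frame useful: the patient (with marker 150 attached) may shift on the table without invalidating the navigation.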
In general, the imaging system shown in
At the end of the radiation cycle, computer 120 retrieves the acquired image from C-arm control computer 115 and retrieves the location information of target marker 106 and dynamic reference frame 150 from tracking sensor 130. Computer 120 calibrates the acquired image, as described above, to learn its projective transformation and optionally to correct distortion in the image (step 403), and then stores the image along with its positional information (step 404). The process of steps 400-404 is repeated for each image that is to be acquired (step 405).
Because the acquired images are stored with the positional information of the calibration and tracking target 106 and dynamic reference frame 150, the position of C-arm 103, x-ray source 104, and receiving section 105 for each image, relative to patient 110, can be computed based upon the projective transformation identified in the calibration process. During surgery, tracking sensor 130 and computer 120 detect the position of instrument 140 relative to dynamic reference frame 150, and hence relative to patient 110. With this information, computer 120 dynamically calculates, in real-time, the projection of instrument 140 into each fluoroscopic image as the instrument is moved by the physician. A graphical representation of instrument 140 may then be overlaid on the fluoroscopic images (step 406). The graphical representation of instrument 140 is an iconic representation of where the actual surgical instrument would appear within the acquired fluoroscopic x-ray image if imager 100 were continuously acquiring new images from the same view as the original image. There is no theoretical limit to the number of fluoroscopic images on which the graphical representations of instrument 140 may be simultaneously overlaid.
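Once the projective transformation for an image is known, projecting the instrument into that image is a direct application of the 3x4 projection matrix to the instrument's endpoints. The sketch below assumes the calibration yields such a matrix P (the form used in standard projective geometry); function names are illustrative.

```python
import numpy as np

def project_to_image(P, points3d):
    """Project (N, 3) world points into pixel coordinates using the
    3x4 projective transformation P recovered during calibration."""
    X = np.hstack([np.asarray(points3d, float),
                   np.ones((len(points3d), 1))])
    uvw = X @ P.T                      # (N, 3) homogeneous pixel coords
    return uvw[:, :2] / uvw[:, 2:3]    # divide out the projective depth

def instrument_overlay(P, tip, tail):
    """Pixel endpoints of the iconic instrument representation."""
    return project_to_image(P, np.array([tip, tail]))
```

Drawing a line between the two projected endpoints reproduces, in each stored image, what a live fluoroscopic view of the instrument would show.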
In certain situations, the physician may wish to know where the tip of the instrument would be if the instrument were projected along a line given by the instrument's current trajectory. Consistent with an aspect of the present invention, at the physician's command, computer 120 may calculate and display this projection. Area 505 in
Although the “look-ahead” technique described above projected the graphical representation of the instrument into the image, there is no requirement that the instrument's graphical representation be in the space of the image for look-ahead trajectory 505 to be projected into the image. For example, the physician may be holding the instrument above the patient and outside the space of the image, so that the representation of the instrument does not appear in the image. However, it may still be desirable to project look-ahead portion 505 into the image to facilitate planning of a surgical procedure.
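The look-ahead point itself is straightforward to compute: extend the tip along the unit vector of the instrument's current trajectory. A minimal sketch, with an assumed tail-to-tip direction convention:

```python
import numpy as np

def look_ahead_tip(tip, tail, distance):
    """Point the instrument tip would reach if advanced `distance`
    units along its current trajectory (tail -> tip direction)."""
    tip = np.asarray(tip, float)
    d = tip - np.asarray(tail, float)
    return tip + distance * d / np.linalg.norm(d)
```

Projecting both the actual tip and this look-ahead point into the image (with the calibrated projective transformation) yields the two portions of the icon: the solid actual-position segment and the look-ahead extension.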
When surgical instrument 140 is perpendicular to the plane of the fluoroscopic image, the graphical overlay of the surgical instrument essentially collapses to a point, making it difficult to view. To alleviate this problem, computer 120 may optionally use a different graphical representation of instrument 140 when the distance in the image plane between the tip and the tail of instrument 140 becomes smaller than a fixed distance (e.g., 15 pixels).
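The switch between the two representations reduces to a threshold test on the projected tip-to-tail span. A sketch, with the 15-pixel figure taken from the text and the style labels as illustrative assumptions:

```python
import numpy as np

def icon_style(tip_px, tail_px, threshold_px=15.0):
    """Choose the overlay style: the normal in-plane icon, or an
    alternate 'end-on' icon when the instrument is nearly perpendicular
    to the image plane and its projection collapses toward a point."""
    span = np.linalg.norm(np.asarray(tip_px, float) -
                          np.asarray(tail_px, float))
    return "in-plane" if span >= threshold_px else "end-on"
```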
Frequently, the physician would like to acquire two complementary fluoroscopic images of the patient, such as images from an anterior/posterior view and a lateral view of the vertebral discs. The complementary views are related to one another by a rotation about an axis by a particular amount. For example, an anterior/posterior view is related to a lateral view by a 90 degree rotation around the axis running along the length of the patient. When the mechanical axis of rotation of C-arm 103 is aligned with the axis relating the complementary views (e.g., when the mechanical axis is aligned with the axis running through the length of the patient), the physician can accurately and quickly switch between the complementary views by simply rotating C-arm 103 through the separation of the complementary views (usually 90 degrees). Generally, however, the axis of rotation of C-arm 103 is not inherently aligned with the axis that relates the complementary views, requiring the physician to perform a series of time-consuming trial-and-error based adjustments of the fluoroscope's position through two or more axes of rotation.
Consistent with an aspect of the present invention, software on computer 120 allows the surgeon to easily adjust the fluoroscope's position so that one of its mechanical rotation axes, such as the axis of rotation shown by arrows 108 in
Images of complementary views and the axis that relates them are illustrated in
Although the alignment of lines 802 and 804, as discussed above, was illustrated using both lines 802 and 804 drawn on the fluoroscopic image, in practice, it may only be necessary to display line 802 in the image. In this case, line 804 is mentally visualized by the physician. Additionally, although the relation of complementary views was discussed using the example of the spine, complementary fluoroscopic images of other anatomical regions, such as, for example, the pelvis, femur, or cranium, may similarly be obtained by application of the above discussed concepts.
Before, or during, surgery, the physician may find it desirable to input an operation “plan” to computer 120. The plan may, for example, specify a desired trajectory of a surgical instrument superimposed on a fluoroscopic image. During the surgical navigation process, the goal of the surgeon would be to align the graphical icon representing the real-time location of the surgical instrument with the graphical overlay representing the planned trajectory.
Yet another method consistent with the present invention for specifying a planned trajectory of a surgical instrument, which, unlike the method discussed above, does not require positioning the surgical instrument on or near the patient's bone, is illustrated in
As shown in
To define the correspondence between actual and virtual cones, it is necessary for the physician to define the position of the virtual cone relative to the tracking sensor. In general, there are many ways to define a cone in space. For example, the position and orientation of a cone can be defined by three points, one corresponding to its apex, one corresponding to a second point along its central axis, and a third corresponding to the rotation of the cone about the central axis. Therefore, one way to define the cone would be to use the tip of the surgical instrument to define these three points in space relative to the tracking sensor. Another way to define this correspondence is to use a single measurement of a surgical instrument. Using this method, the axis of the instrument corresponds to the axis of the cone, the tip of the instrument corresponds to a fixed point along the axis of the cone (which could be the apex, but could also be another point along the central axis), and the orientation of the instrument about its axis corresponds to the orientation of the cone about its axis. In general, any set of measurements which define the position and orientation of a given cone can be used to establish the correspondence between the actual and virtual cones.
The operations illustrated in
It is also consistent with this invention to provide automated planning using computer analysis techniques to define an “optimal” trajectory in the C-arm images. Once the optimal trajectory is determined, computer 120 overlays the optimal trajectory in the fluoroscopic image. For example, automated plans can be generated using computational techniques to reduce a specified amount of lordosis in spine surgery.
A common clinical problem, especially in orthopaedic trauma, is the realignment (reduction) of broken or misaligned bone fragments.
To begin the alignment procedure, the physician places a tracking sensor marker on each of bone fragments 1201 and 1202 (step 1301) and acquires the fluoroscopic images (step 1302), such as the image shown in
After acquisition of the fluoroscopic image(s), computer 120 uses image detection and extraction techniques to delineate the boundaries of the bone fragments in the images (step 1304). Suitable edge detection algorithms for generating the contours are well known in the art, and may be, for example, the Canny edge detector, the Shen-Castan edge detector, or the Sobel edge detector. An edge-detected version of
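As a rough illustration of the simplest of the named detectors, the Sobel operator can be implemented directly with array slicing: compute horizontal and vertical gradient responses and threshold the gradient magnitude. This sketch omits the smoothing and hysteresis steps a production detector (e.g., Canny) would add; the threshold is an assumed tuning parameter.

```python
import numpy as np

def sobel_edges(image, threshold):
    """Binary edge map from the Sobel gradient magnitude; only interior
    pixels are evaluated (a 1-pixel border is left unmarked)."""
    img = np.asarray(image, float)
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    # Sobel responses at interior pixels (kernel [-1 0 1; -2 0 2; -1 0 1])
    gx[1:-1, 1:-1] = (img[:-2, 2:] + 2 * img[1:-1, 2:] + img[2:, 2:]
                      - img[:-2, :-2] - 2 * img[1:-1, :-2] - img[2:, :-2])
    gy[1:-1, 1:-1] = (img[2:, :-2] + 2 * img[2:, 1:-1] + img[2:, 2:]
                      - img[:-2, :-2] - 2 * img[:-2, 1:-1] - img[:-2, 2:])
    magnitude = np.hypot(gx, gy)
    return magnitude > threshold
```

On a fluoroscopic image, the thresholded edge pixels would then be linked into the closed contours (1203, 1204) that delineate each bone fragment.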
Overlaying the detected image contours on the fluoroscopic image allows the physician to easily identify the correspondence between image contours 1203-1204 and bone fragments 1201-1202. The physician inputs this correspondence into computer 120 (step 1305). Alternatively, computer 120 may automatically identify the correspondence between the image contours and the bone fragments. Once the correspondence is established, the physician specifies which contour is to remain fixed and which is to be repositioned. The tracking sensor marker attached to the fragment to be repositioned is referred to as the dynamic reference marker and the tracking sensor marker attached to the fixed fragment is referred to as the fixed reference frame marker, although physically the dynamic reference marker and the fixed reference frame marker may be identical.
During surgical navigation, the physician moves the bone fragment having the dynamic reference marker (step 1306). Tracking sensor 130 detects the position of the dynamic reference frame marker and the fixed frame marker. With this information and the previously generated positional location information, computer 120 calculates and displays the new position of the dynamic reference frame, and hence its corresponding bone fragment, in the fluoroscopic image (step 1307).
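Updating the overlay amounts to applying the tracked fragment's motion, projected into the image plane, to its contour points. A simplified 2-D sketch, assuming the projected motion has already been reduced to an in-plane rotation about the contour centroid plus a translation (names illustrative):

```python
import numpy as np

def move_contour(contour, angle_rad, translation):
    """Redraw a bone-fragment contour after the tracked fragment moves:
    rotate the (N, 2) contour points by angle_rad about their centroid,
    then translate, all in image coordinates."""
    pts = np.asarray(contour, float)
    c = pts.mean(0)
    ca, sa = np.cos(angle_rad), np.sin(angle_rad)
    R = np.array([[ca, -sa], [sa, ca]])
    return (pts - c) @ R.T + c + np.asarray(translation, float)
```

In the full system the motion is measured in three dimensions by the tracking sensor and mapped through the image's projective transformation before being applied to the contour.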
Methods described above for aligning bone fragments may also be applied to the proper alignment of multiple vertebral bodies, for example in the reduction of scoliosis.
The navigational guidance system consistent with the present invention is not limited to providing surgical navigational guidance with two-dimensional fluoroscopic images. Three-dimensional volumetric data sets may also be overlaid with graphical representations of a surgical instrument. Three-dimensional data sets (such as CT or MRI) may be either pre-acquired or acquired during the operation.
Two types of three-dimensional data sets are typically used in surgical navigation: patient-specific image data and non-patient-specific or atlas data. Patient-specific three-dimensional images are typically acquired prior to surgery using computed tomography (CT), magnetic resonance (MR), or other known three-dimensional imaging modalities, although intra-operative acquisition is also possible. Atlas data is non-patient-specific three-dimensional data describing a “generic” patient. Atlas data may be acquired using CT, MR, or other imaging modalities from a particular patient, and may even comprise images from several modalities which are spatially registered (e.g., CT and MR together in a common coordinate system). Atlas data may be annotated with supplemental information describing anatomy, physiology, pathology, or “optimal” planning information (for example, screw placements, lordosis angles, or scoliotic correction plans).
A three-dimensional patient CT or MR data set is shown in
Before overlaying a three-dimensional image with graphical representations of surgical instruments, the correspondence between points in the three-dimensional image and points in the patient's reference frame must be determined. This procedure is known as registration of the image. One method for performing image registration is described in the previously mentioned publications to Bucholz. Three-dimensional patient-specific images can be registered to a patient on the operating room table (surgical space) using multiple two-dimensional image projections. This process, often referred to as 2D/3D registration, relies on establishing two spatial transformations. The first transformation is between the acquired fluoroscopic images and the three-dimensional image data set (e.g., CT or MR) corresponding to the same patient. The second transformation is between the coordinate system of the fluoroscopic images and an externally measurable reference system attached to the fluoroscopic imager. Once these transformations have been established, it is possible to directly relate surgical space to three-dimensional image space.
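The chaining of the two transformations can be sketched with homogeneous 4x4 matrices. The pure translations below are placeholders (actual 2D/3D registration transforms include rotation and perspective projection), but the composition step relating surgical space to image space is the same.

```python
import numpy as np

def translation(dx, dy, dz):
    """Build a 4x4 homogeneous translation matrix."""
    t = np.eye(4)
    t[:3, 3] = [dx, dy, dz]
    return t

# Hypothetical transforms: tracker (surgical) space -> fluoroscope image
# space, and fluoroscope image space -> 3D CT volume space.
T_fluoro_from_tracker = translation(10.0, 0.0, 0.0)
T_ct_from_fluoro = translation(0.0, -5.0, 0.0)

# Composing the two relates surgical space directly to 3D image space.
T_ct_from_tracker = T_ct_from_fluoro @ T_fluoro_from_tracker

p_tracker = np.array([1.0, 2.0, 3.0, 1.0])  # a point in homogeneous coords
p_ct = T_ct_from_tracker @ p_tracker
```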
When performing three-dimensional registration, as with two-dimensional registration, imager 100 should be stationary with respect to patient 110 while acquiring the image. If C-arm 103 or patient 110 moves during image acquisition, the position of the fluoroscope relative to the patient's reference frame will not be accurately determined. Accordingly, the previously described technique for detecting movement of imager 100 during the image acquisition process can be used when acquiring fluoroscopic images that are to be used in 2D/3D registration. That is, as described, computer 120 may examine the position information from tracking sensor 130 while radiation sensors 107 are signaling radiation detection. If the calibration and tracking target 106 moves relative to dynamic reference frame 150 during image acquisition, the image is marked as erroneous.
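A minimal sketch of this motion check, under assumed names (`image_is_erroneous`, a made-up `tolerance`): the target's position relative to the reference frame should stay essentially constant across all tracking samples taken during the exposure.

```python
def image_is_erroneous(target_positions, reference_positions, tolerance):
    """Flag the acquisition if the calibration target moved relative to the
    dynamic reference frame by more than `tolerance` during exposure."""
    # Position of the target relative to the reference frame at each sample.
    relative = [
        (tx - rx, ty - ry, tz - rz)
        for (tx, ty, tz), (rx, ry, rz) in zip(target_positions, reference_positions)
    ]
    first = relative[0]
    for x, y, z in relative[1:]:
        drift = ((x - first[0]) ** 2 + (y - first[1]) ** 2 + (z - first[2]) ** 2) ** 0.5
        if drift > tolerance:
            return True
    return False

# Stationary case: target and reference frame both fixed during exposure.
still = image_is_erroneous([(0, 0, 0)] * 3, [(100, 0, 0)] * 3, tolerance=0.5)
# Moving case: the C-arm (and its target) drifts 2 mm mid-exposure.
moved = image_is_erroneous(
    [(0, 0, 0), (2, 0, 0), (2, 0, 0)], [(100, 0, 0)] * 3, tolerance=0.5)
```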
It may be necessary to acquire complementary fluoroscopic views (e.g., lateral and anterior/posterior) to facilitate 2D/3D registration. The techniques previously discussed in reference to
Once registered, computer 120 may use positional information of instrument 140 to overlay graphical representations of the instrument in the three-dimensional image as well as the two-dimensional fluoroscopic images.
The two-dimensional images generated by imager 100 are not always able to adequately represent the patient's bone structure. For example, fluoroscopic x-ray images are not effective when taken through the length of the patient (i.e., from the point of view looking down at the patient's head or up from the patient's feet) because the large number of bones that the x-rays pass through occlude one another in the final image. However, information required for planning a surgical procedure which is not otherwise available based on two-dimensional image data alone may be extracted from a three-dimensional image data set such as a CT or MR image data set. The extracted information may then be transferred to the two-dimensional x-ray images generated by imager 100 and used in surgical navigation. The following examples describe additional methods for using three-dimensional and two-dimensional data in surgical navigation.
Rectangle 1401 represents the projection of the cylindrical inter-vertebral cage into the image. While the long axis of the cylinder appears to be completely within the bone in this image, this may not be the case due to curvature of the anterior aspect of vertebrae 1402. FIG. 14B is an axial cross section from the three-dimensional CT data set of the vertebrae. Corner 1403 of rectangle 1401 protrudes from the bone, a highly undesirable situation that cannot be reliably detected in x-ray images such as that of
Once the cage length has been determined by the physician and entered into computer 120, the length value can then be used by computer 120 in properly displaying the graphical overlay in the associated two-dimensional image. The position of the surgical instrument used to hold the cage during the insertion process, as detected by tracking sensor 130, is used to calculate the position of the cage in
Although the above example was described in the context of a cylindrical spinal implant, in general, the described concepts can be applied to any surgical implant.
In certain clinical procedures, it may be desirable to acquire a fluoroscopic x-ray image view looking substantially straight down the medial axis of a vertebral pedicle. For the purposes of this example, a vertebral pedicle can be thought of as a cylinder, and the medial axis corresponds to the central axis of the cylinder.
Given an anterior/posterior fluoroscopic image view of the spine, such as the one shown in
With conventional fluoroscopic x-ray image acquisition, radiation passes through a physical medium to create a projection image on a radiation-sensitive film or an electronic image intensifier. Given a 3D CT data set, a simulated x-ray image can also be generated using a technique known as digitally reconstructed radiography (DRR). DRR is well known in the art, and is described, for example, by L. Lemieux et al., “A Patient-to-Computed-Tomography Image Registration Method Based on Digitally Reconstructed Radiographs,” Medical Physics 21(11), pp. 1749-1760, November 1994.
To create a DRR image, a simulated fluoroscopic image is formed by computationally projecting volume elements (voxels) of the 3D CT data set onto a selected image plane. Using a 3D CT data set of a given patient, it is possible to create a DRR image that appears very similar to a corresponding x-ray image of the same patient. A requirement for this similarity is that the “computational x-ray imager” and the actual x-ray imager use similar intrinsic imaging parameters (e.g., projection transformations, distortion correction) and extrinsic imaging parameters (e.g., view direction). The intrinsic imaging parameters can be derived from the calibration process.
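In the simplest parallel-beam approximation, the voxel projection reduces to summing attenuation values along the view axis. The sketch below uses that approximation with a made-up toy volume; an actual DRR traces diverging rays through the volume using the imager's calibrated perspective (intrinsic) parameters.

```python
import numpy as np

def drr_parallel(volume, axis=0):
    """Synthesize a DRR by summing voxel attenuation along a view axis
    (a parallel-beam approximation of the per-pixel ray integrals)."""
    return volume.sum(axis=axis)

# Hypothetical 4x4x4 "CT" volume: a dense 2x2 column of bone inside air.
vol = np.zeros((4, 4, 4))
vol[1:3, 1:3, :] = 1.0
drr = drr_parallel(vol, axis=2)  # view looking down the z axis
```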
A DRR image may be used to provide guidance to the surgeon in the problem discussed in Example 1 of appropriately placing an inter-vertebral cage in the patient. Given a 3D CT data set of two adjacent vertebrae, the physician, interacting with computer 120, may manually position a 3D CAD model of an inter-vertebral cage in a clinically desired position in the three-dimensional view of the vertebrae. The physician may then use the DRR technique to synthesize an anterior/posterior, lateral, or other x-ray view of the vertebrae showing the three-dimensional CAD model of the inter-vertebral cage. Thus, a synthetic fluoroscopic x-ray image can be created which simulates what a properly placed cage would look like after implantation.
The simulated x-ray images may be compared to the actual images taken by imager 100 during surgery. The goal of the surgeon is to position the implant such that the intra-operative images match the DRR images. Two types of intra-operative images may be used for this comparison. First, conventional fluoroscopy could be used to acquire an image after the inter-vertebral cage has been implanted. Second, images acquired prior to cage placement could be supplemented with superimposed graphical icons representing the measured cage position. In either case, the synthetic fluoroscopic image can be used as a template to help guide the surgeon in properly placing the inter-vertebral cage.
Although the above example was described in the context of implanting an inter-vertebral cage, implants other than the inter-vertebral cage could also be used.
The DRR technique can be used to provide guidance to the physician when acquiring an owl's eye view of a vertebral pedicle. Given a three-dimensional CT data set containing a vertebra and associated pedicle, the physician may use computer 120 to manually locate a three-dimensional representation of the pedicle's medial axis relative to the three-dimensional images of the vertebrae. Once this placement has been achieved, it is possible to synthesize an owl's eye view of the vertebrae based upon the view direction specified by the physician's selection of the three-dimensional medial axis. This synthetic image can then be displayed to the surgeon during surgery and used to guide the acquisition of an actual owl's eye view using the fluoroscope. By visually comparing fluoroscopic images taken while positioning the fluoroscope to the synthetic owl's eye view, the physician can acquire a true fluoroscopic image with a view direction approximately equal to the manually selected medial axis. In this manner, a high quality owl's eye view can be acquired.
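Deriving the synthetic view direction from the selected medial axis is straightforward: normalize the axis vector and build an orthonormal image basis around it. A sketch with a hypothetical `view_basis` helper, assuming the axis is given by two three-dimensional points:

```python
import math

def view_basis(axis_start, axis_end):
    """Return the unit view direction along a medial axis plus two
    orthonormal in-plane axes (a minimal look-down-the-axis basis)."""
    d = [e - s for s, e in zip(axis_start, axis_end)]
    norm = math.sqrt(sum(c * c for c in d))
    w = [c / norm for c in d]                      # view direction
    # Pick any vector not parallel to w, then Gram-Schmidt it.
    up = [0.0, 0.0, 1.0] if abs(w[2]) < 0.9 else [1.0, 0.0, 0.0]
    dot = sum(a * b for a, b in zip(up, w))
    u = [a - dot * b for a, b in zip(up, w)]
    un = math.sqrt(sum(c * c for c in u))
    u = [c / un for c in u]                        # image "up"
    v = [w[1] * u[2] - w[2] * u[1],
         w[2] * u[0] - w[0] * u[2],
         w[0] * u[1] - w[1] * u[0]]                # image "right" = w x u
    return w, u, v

# Hypothetical medial axis running along +y.
w, u, v = view_basis((0.0, 0.0, 0.0), (0.0, 2.0, 0.0))
```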
Although the above example was described in the context of synthesizing a two-dimensional owl's eye view, in general, any three-dimensional view direction can be selected and a corresponding two-dimensional image synthesized and used to acquire a fluoroscopic two-dimensional image.
It may be desirable to measure the angle between the trajectory of a surgical instrument and the plane of a fluoroscopic image (such as a plane aligned with the mid-line of the spine 1502) during surgery using a pre-acquired fluoroscopic image. This is useful, as it is often desirable to position or implant a surgical instrument at a certain angle relative to the plane of the fluoroscopic image. For example, the surgical instrument may need to be implanted in the direction aligned with the medial axis of the pedicle 1503.
Consider the vertebral cross section shown as an axial CT image in
Plane 1603 defines the midline of the spine in three-dimensional space. During navigational guidance, the equation of this plane can be expressed in the coordinate system of either the dynamic reference frame 150 or the tracking sensor 130.
Using the tracking sensor 130 to measure the position and orientation (i.e., the trajectory) of the instrument 140, computer 120 then mathematically projects this trajectory onto the plane 1603. This projection defines a line lying in plane 1603. The angle between this line and the instrument trajectory corresponds to the angle to be measured. In other words, the angle to be measured corresponds to the minimum angle between the trajectory of the instrument and the plane 1603. This angle can be calculated by computer 120 and displayed to the physician in either a textual or graphical format.
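The minimum angle between a trajectory and a plane is the complement of the angle between the trajectory and the plane's normal, so it can be computed directly from the normal vector and the tracked direction. A sketch with a hypothetical `angle_to_plane` helper and made-up vectors:

```python
import math

def angle_to_plane(direction, plane_normal):
    """Minimum angle (degrees) between a trajectory and a plane, computed
    as the complement of the trajectory/normal angle via arcsine."""
    dot = abs(sum(a * b for a, b in zip(direction, plane_normal)))
    nd = math.sqrt(sum(c * c for c in direction))
    nn = math.sqrt(sum(c * c for c in plane_normal))
    return math.degrees(math.asin(dot / (nd * nn)))

# Hypothetical midline plane with normal along x; trajectory tilted
# 45 degrees out of the plane.
angle = angle_to_plane((1.0, 1.0, 0.0), (1.0, 0.0, 0.0))
```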
In summary, as described in this example, a single fluoroscopic image can be used during surgery to position a surgical instrument at a desired trajectory relative to the plane of the fluoroscopic image. More generally, the methods described in this example relate to measuring the angle between the trajectory of a surgical instrument 140 and a plane (e.g. 1603) defined by two or more points (e.g., 1601) which have been manually or automatically selected in a fluoroscopic image. While the explanation uses a CT for clarity of the example, the measurement and display of the angle can be achieved without the use of any 3D image data.
Although the above five examples used three-dimensional patient specific data and not atlas data, in certain situations, it may be possible to use a 2D/3D registration scheme that registers non-patient specific atlas data to patient specific fluoroscopic images using deformable registration methods that do not preserve the rigidity of anatomical structure during the registration process. In this manner, the patient specific fluoroscopic images may be used to deform the atlas data to better correspond to the patient and thereby transfer atlased knowledge to the patient specific fluoroscopic images.
The above described systems and methods significantly extend the conventional techniques for acquiring and using x-ray images for surgical navigational guidance. It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the scope or spirit of the invention. For example, although certain of the examples were described in relation to the spine, many other regions of the body could be operated on.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. In particular, an alternative embodiment of the calibration and tracking target may allow the calibration component to be detached from the C-arm and introduced into the C-arm view for calibrations only, and then removed. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the invention being indicated by the following claims.
This disclosure is related to U.S. patent application Ser. No. 09/106,109, entitled “System and Methods for the Reduction and Elimination of Image Artifacts in the Calibration of X-Ray Imagers,” filed on Jun. 29, 1998.
Number | Name | Date | Kind |
---|---|---|---|
1576781 | Philips | Mar 1926 | A |
1735726 | Bornhardt | Nov 1929 | A |
2407845 | Nemeyer | Sep 1946 | A |
2650588 | Drew | Sep 1953 | A |
2697433 | Sehnder | Dec 1954 | A |
3016899 | Stenvall | Jan 1962 | A |
3017887 | Heyer | Jan 1962 | A |
3061936 | Dobbeleer | Nov 1962 | A |
3073310 | Mocarski | Jan 1963 | A |
3109588 | Polhemus et al. | Nov 1963 | A |
3294083 | Alderson | Dec 1966 | A |
3367326 | Frazier | Feb 1968 | A |
3439256 | Kähne et al. | Apr 1969 | A |
3577160 | White | May 1971 | A |
3614950 | Rabey | Oct 1971 | A |
3644825 | Davis, Jr. et al. | Feb 1972 | A |
3674014 | Tillander | Jul 1972 | A |
3702935 | Carey et al. | Nov 1972 | A |
3704707 | Halloran | Dec 1972 | A |
3821469 | Whetstone et al. | Jun 1974 | A |
3868565 | Kuipers | Feb 1975 | A |
3941127 | Froning | Mar 1976 | A |
3983474 | Kuipers | Sep 1976 | A |
4017858 | Kuipers | Apr 1977 | A |
4037592 | Kronner | Jul 1977 | A |
4052620 | Brunnett | Oct 1977 | A |
4054881 | Raab | Oct 1977 | A |
4117337 | Staats | Sep 1978 | A |
4173228 | Van Steenwyk et al. | Nov 1979 | A |
4182312 | Mushabac | Jan 1980 | A |
4202349 | Jones | May 1980 | A |
4228799 | Anichkov et al. | Oct 1980 | A |
4256112 | Kopf et al. | Mar 1981 | A |
4262306 | Renner | Apr 1981 | A |
4287809 | Egli et al. | Sep 1981 | A |
4298874 | Kuipers | Nov 1981 | A |
4314251 | Raab | Feb 1982 | A |
4317078 | Weed et al. | Feb 1982 | A |
4319136 | Jinkins | Mar 1982 | A |
4328548 | Crow et al. | May 1982 | A |
4328813 | Ray | May 1982 | A |
4339953 | Iwasaki | Jul 1982 | A |
4341220 | Perry | Jul 1982 | A |
4346384 | Raab | Aug 1982 | A |
4358856 | Stivender et al. | Nov 1982 | A |
4368536 | Pfeiler | Jan 1983 | A |
4396885 | Constant | Aug 1983 | A |
4396945 | DiMatteo et al. | Aug 1983 | A |
4403321 | DiMarco | Sep 1983 | A |
4418422 | Richter et al. | Nov 1983 | A |
4419012 | Stephenson et al. | Dec 1983 | A |
4422041 | Lienau | Dec 1983 | A |
4431005 | McCormick | Feb 1984 | A |
4485815 | Amplatz | Dec 1984 | A |
4506676 | Duska | Mar 1985 | A |
4543959 | Sepponen | Oct 1985 | A |
4548208 | Niemi | Oct 1985 | A |
4571834 | Fraser et al. | Feb 1986 | A |
4572198 | Codrington | Feb 1986 | A |
4583538 | Onik et al. | Apr 1986 | A |
4584577 | Temple | Apr 1986 | A |
4608977 | Brown | Sep 1986 | A |
4613866 | Blood | Sep 1986 | A |
4617925 | Laitinen | Oct 1986 | A |
4618978 | Cosman | Oct 1986 | A |
4621628 | Bludermann | Nov 1986 | A |
4625718 | Olerud et al. | Dec 1986 | A |
4638798 | Shelden et al. | Jan 1987 | A |
4642786 | Hansen | Feb 1987 | A |
4645343 | Stockdale et al. | Feb 1987 | A |
4649504 | Krouglicof et al. | Mar 1987 | A |
4651732 | Frederick | Mar 1987 | A |
4653509 | Oloff et al. | Mar 1987 | A |
4659971 | Suzuki et al. | Apr 1987 | A |
4660970 | Ferrano | Apr 1987 | A |
4673352 | Hansen | Jun 1987 | A |
4688037 | Krieg | Aug 1987 | A |
4701049 | Beckman et al. | Oct 1987 | A |
4705395 | Hageniers | Nov 1987 | A |
4705401 | Addleman et al. | Nov 1987 | A |
4706665 | Gouda | Nov 1987 | A |
4709156 | Murphy et al. | Nov 1987 | A |
4710708 | Rorden et al. | Dec 1987 | A |
4719419 | Dawley | Jan 1988 | A |
4722056 | Roberts et al. | Jan 1988 | A |
4722336 | Kim et al. | Feb 1988 | A |
4723544 | Moore et al. | Feb 1988 | A |
4727565 | Ericson | Feb 1988 | A |
RE32619 | Damadian | Mar 1988 | E |
4733969 | Case et al. | Mar 1988 | A |
4737032 | Addleman et al. | Apr 1988 | A |
4737794 | Jones | Apr 1988 | A |
4737921 | Goldwasser et al. | Apr 1988 | A |
4742356 | Kuipers | May 1988 | A |
4742815 | Ninan et al. | May 1988 | A |
4743770 | Lee | May 1988 | A |
4743771 | Sacks et al. | May 1988 | A |
4745290 | Frankel et al. | May 1988 | A |
4750487 | Zanetti | Jun 1988 | A |
4753528 | Hines et al. | Jun 1988 | A |
4761072 | Pryor | Aug 1988 | A |
4764016 | Johansson | Aug 1988 | A |
4771787 | Wurster et al. | Sep 1988 | A |
4779212 | Levy | Oct 1988 | A |
4782239 | Hirose et al. | Nov 1988 | A |
4788481 | Niwa | Nov 1988 | A |
4791934 | Brunnett | Dec 1988 | A |
4793355 | Crum et al. | Dec 1988 | A |
4794262 | Sato et al. | Dec 1988 | A |
4797907 | Anderton | Jan 1989 | A |
4803976 | Frigg et al. | Feb 1989 | A |
4804261 | Kirschen | Feb 1989 | A |
4805615 | Carol | Feb 1989 | A |
4809694 | Ferrara | Mar 1989 | A |
4821200 | Öberg | Apr 1989 | A |
4821206 | Arora | Apr 1989 | A |
4821731 | Martinelli et al. | Apr 1989 | A |
4822163 | Schmidt | Apr 1989 | A |
4825091 | Breyer et al. | Apr 1989 | A |
4829373 | Leberl et al. | May 1989 | A |
4836778 | Baumrind et al. | Jun 1989 | A |
4838265 | Cosman et al. | Jun 1989 | A |
4841967 | Chang et al. | Jun 1989 | A |
4845771 | Wislocki et al. | Jul 1989 | A |
4849692 | Blood | Jul 1989 | A |
4860331 | Williams et al. | Aug 1989 | A |
4862893 | Martinelli | Sep 1989 | A |
4869247 | Howard, III et al. | Sep 1989 | A |
4875165 | Fencil et al. | Oct 1989 | A |
4875478 | Chen | Oct 1989 | A |
4884566 | Mountz et al. | Dec 1989 | A |
4889526 | Rauscher et al. | Dec 1989 | A |
4896673 | Rose et al. | Jan 1990 | A |
4905698 | Strohl, Jr. et al. | Mar 1990 | A |
4923459 | Nambu | May 1990 | A |
4931056 | Ghajar et al. | Jun 1990 | A |
4945305 | Blood | Jul 1990 | A |
4945914 | Allen | Aug 1990 | A |
4951653 | Fry et al. | Aug 1990 | A |
4955891 | Carol | Sep 1990 | A |
4961422 | Marchosky et al. | Oct 1990 | A |
4977655 | Martinelli | Dec 1990 | A |
4989608 | Ratner | Feb 1991 | A |
4991579 | Allen | Feb 1991 | A |
5002058 | Martinelli | Mar 1991 | A |
5005592 | Cartmell | Apr 1991 | A |
5013317 | Cole et al. | May 1991 | A |
5016639 | Allen | May 1991 | A |
5017139 | Mushabac | May 1991 | A |
5027818 | Bova et al. | Jul 1991 | A |
5030196 | Inoue | Jul 1991 | A |
5030222 | Calandruccio et al. | Jul 1991 | A |
5031203 | Trecha | Jul 1991 | A |
5042486 | Pfeiler et al. | Aug 1991 | A |
5047036 | Koutrouvelis | Sep 1991 | A |
5050608 | Watanabe et al. | Sep 1991 | A |
5054492 | Scribner et al. | Oct 1991 | A |
5057095 | Fabian | Oct 1991 | A |
5059789 | Salcudean | Oct 1991 | A |
5078140 | Kwoh | Jan 1992 | A |
5079699 | Tuy et al. | Jan 1992 | A |
5086401 | Glassman et al. | Feb 1992 | A |
5094241 | Allen | Mar 1992 | A |
5097839 | Allen | Mar 1992 | A |
5098426 | Sklar et al. | Mar 1992 | A |
5099845 | Besz et al. | Mar 1992 | A |
5099846 | Hardy | Mar 1992 | A |
5105829 | Fabian et al. | Apr 1992 | A |
5107839 | Houdek et al. | Apr 1992 | A |
5107843 | Aarnio et al. | Apr 1992 | A |
5107862 | Fabian et al. | Apr 1992 | A |
5109194 | Cantaloube | Apr 1992 | A |
5119817 | Allen | Jun 1992 | A |
5142930 | Allen et al. | Sep 1992 | A |
5143076 | Hardy et al. | Sep 1992 | A |
5152288 | Hoenig et al. | Oct 1992 | A |
5160337 | Cosman | Nov 1992 | A |
5161536 | Vikomerson et al. | Nov 1992 | A |
5178164 | Allen | Jan 1993 | A |
5178621 | Cook et al. | Jan 1993 | A |
5186174 | Schlondorff et al. | Feb 1993 | A |
5187475 | Wagener et al. | Feb 1993 | A |
5188126 | Fabian et al. | Feb 1993 | A |
5190059 | Fabian et al. | Mar 1993 | A |
5193106 | DeSena | Mar 1993 | A |
5197476 | Nowacki et al. | Mar 1993 | A |
5197965 | Cherry et al. | Mar 1993 | A |
5198768 | Keren | Mar 1993 | A |
5198877 | Schulz | Mar 1993 | A |
5207688 | Carol | May 1993 | A |
5211164 | Allen | May 1993 | A |
5211165 | Dumoulin et al. | May 1993 | A |
5211176 | Ishiguro et al. | May 1993 | A |
5212720 | Landi et al. | May 1993 | A |
5214615 | Bauer | May 1993 | A |
5219351 | Teubner et al. | Jun 1993 | A |
5222499 | Allen et al. | Jun 1993 | A |
5224049 | Mushabac | Jun 1993 | A |
5228442 | Imran | Jul 1993 | A |
5230338 | Allen et al. | Jul 1993 | A |
5230623 | Guthrie et al. | Jul 1993 | A |
5233990 | Barnea | Aug 1993 | A |
5237996 | Waldman et al. | Aug 1993 | A |
5249581 | Horbal et al. | Oct 1993 | A |
5251127 | Raab | Oct 1993 | A |
5251635 | Dumoulin et al. | Oct 1993 | A |
5253647 | Takahashi et al. | Oct 1993 | A |
5255680 | Darrow et al. | Oct 1993 | A |
5257636 | White | Nov 1993 | A |
5257998 | Ota et al. | Nov 1993 | A |
5261404 | Mick et al. | Nov 1993 | A |
5265610 | Darrow et al. | Nov 1993 | A |
5265611 | Hoenig et al. | Nov 1993 | A |
5269759 | Hernandez et al. | Dec 1993 | A |
5271400 | Dumoulin et al. | Dec 1993 | A |
5273025 | Sakiyama et al. | Dec 1993 | A |
5274551 | Corby, Jr. | Dec 1993 | A |
5279309 | Taylor et al. | Jan 1994 | A |
5285787 | Machida | Feb 1994 | A |
5291199 | Overman et al. | Mar 1994 | A |
5291889 | Kenet et al. | Mar 1994 | A |
5295483 | Nowacki et al. | Mar 1994 | A |
5297549 | Beatty et al. | Mar 1994 | A |
5299253 | Wessels | Mar 1994 | A |
5299254 | Dancer et al. | Mar 1994 | A |
5299288 | Glassman et al. | Mar 1994 | A |
5300080 | Clayman et al. | Apr 1994 | A |
5305091 | Gelbart et al. | Apr 1994 | A |
5305203 | Raab | Apr 1994 | A |
5306271 | Zinreich et al. | Apr 1994 | A |
5307072 | Jones, Jr. | Apr 1994 | A |
5309913 | Kormos et al. | May 1994 | A |
5315630 | Sturm et al. | May 1994 | A |
5316024 | Hirschi et al. | May 1994 | A |
5318025 | Dumoulin et al. | Jun 1994 | A |
5320111 | Livingston | Jun 1994 | A |
5325728 | Zimmerman et al. | Jul 1994 | A |
5325873 | Hirschi et al. | Jul 1994 | A |
5329944 | Fabian et al. | Jul 1994 | A |
5330485 | Clayman et al. | Jul 1994 | A |
5333168 | Fernandes et al. | Jul 1994 | A |
5353795 | Souza et al. | Oct 1994 | A |
5353800 | Pohndorf et al. | Oct 1994 | A |
5353807 | DeMarco | Oct 1994 | A |
5359417 | Müller et al. | Oct 1994 | A |
5368030 | Zinreich et al. | Nov 1994 | A |
5371778 | Yanof et al. | Dec 1994 | A |
5375596 | Twiss et al. | Dec 1994 | A |
5377678 | Dumoulin et al. | Jan 1995 | A |
5383454 | Bucholz | Jan 1995 | A |
5385146 | Goldreyer | Jan 1995 | A |
5385148 | Lesh et al. | Jan 1995 | A |
5386828 | Owens et al. | Feb 1995 | A |
5389101 | Heilbrun et al. | Feb 1995 | A |
5391199 | Ben-Haim | Feb 1995 | A |
5394457 | Leibinger et al. | Feb 1995 | A |
5394875 | Lewis et al. | Mar 1995 | A |
5397329 | Allen | Mar 1995 | A |
5398684 | Hardy | Mar 1995 | A |
5399146 | Nowacki et al. | Mar 1995 | A |
5400384 | Fernandes et al. | Mar 1995 | A |
5402801 | Taylor | Apr 1995 | A |
5408409 | Glassman et al. | Apr 1995 | A |
5413573 | Koivukangas | May 1995 | A |
5417210 | Funda et al. | May 1995 | A |
5419325 | Dumoulin et al. | May 1995 | A |
5423334 | Jordan | Jun 1995 | A |
5425367 | Shapiro et al. | Jun 1995 | A |
5425382 | Golden et al. | Jun 1995 | A |
5426683 | O'Farrell, Jr. et al. | Jun 1995 | A |
5426687 | Goodall et al. | Jun 1995 | A |
5427097 | Depp | Jun 1995 | A |
5429132 | Guy et al. | Jul 1995 | A |
5433198 | Desai | Jul 1995 | A |
RE35025 | Anderton | Aug 1995 | E |
5437277 | Dumoulin et al. | Aug 1995 | A |
5443066 | Dumoulin et al. | Aug 1995 | A |
5443489 | Ben-Haim | Aug 1995 | A |
5444756 | Pai et al. | Aug 1995 | A |
5445144 | Wodicka et al. | Aug 1995 | A |
5445150 | Dumoulin et al. | Aug 1995 | A |
5445166 | Taylor | Aug 1995 | A |
5446548 | Gerig et al. | Aug 1995 | A |
5447154 | Cinquin et al. | Sep 1995 | A |
5448610 | Yamamoto et al. | Sep 1995 | A |
5453686 | Anderson | Sep 1995 | A |
5456718 | Szymaitis | Oct 1995 | A |
5457641 | Zimmer et al. | Oct 1995 | A |
5458718 | Venkitachalam | Oct 1995 | A |
5464446 | Dreessen et al. | Nov 1995 | A |
5469847 | Zinreich et al. | Nov 1995 | A |
5478341 | Cook et al. | Dec 1995 | A |
5478343 | Ritter | Dec 1995 | A |
5480422 | Ben-Haim | Jan 1996 | A |
5480439 | Bisek et al. | Jan 1996 | A |
5483961 | Kelly et al. | Jan 1996 | A |
5484437 | Michelson | Jan 1996 | A |
5485849 | Panescu et al. | Jan 1996 | A |
5487391 | Panescu | Jan 1996 | A |
5487729 | Avellanet et al. | Jan 1996 | A |
5487757 | Truckai et al. | Jan 1996 | A |
5490196 | Rudich et al. | Feb 1996 | A |
5494034 | Schlondorff et al. | Feb 1996 | A |
5503416 | Aoki et al. | Apr 1996 | A |
5513637 | Twiss et al. | May 1996 | A |
5514146 | Lam et al. | May 1996 | A |
5515160 | Schulz et al. | May 1996 | A |
5517990 | Kalfas et al. | May 1996 | A |
5531227 | Schneider | Jul 1996 | A |
5531520 | Grimson et al. | Jul 1996 | A |
5542938 | Avellanet et al. | Aug 1996 | A |
5543951 | Moehrmann | Aug 1996 | A |
5546940 | Panescu et al. | Aug 1996 | A |
5546949 | Frazin et al. | Aug 1996 | A |
5546951 | Ben-Haim | Aug 1996 | A |
5551429 | Fitzpatrick et al. | Sep 1996 | A |
5558091 | Acker et al. | Sep 1996 | A |
5566681 | Manwaring et al. | Oct 1996 | A |
5568384 | Robb et al. | Oct 1996 | A |
5568809 | Ben-haim | Oct 1996 | A |
5571109 | Bertagnoli | Nov 1996 | A |
5572999 | Funda et al. | Nov 1996 | A |
5573533 | Strul | Nov 1996 | A |
5575794 | Walus et al. | Nov 1996 | A |
5575798 | Koutrouvelis | Nov 1996 | A |
5583909 | Hanover | Dec 1996 | A |
5588430 | Bova et al. | Dec 1996 | A |
5590215 | Allen | Dec 1996 | A |
5592939 | Martinelli | Jan 1997 | A |
5595193 | Walus et al. | Jan 1997 | A |
5596228 | Anderton et al. | Jan 1997 | A |
5600330 | Blood | Feb 1997 | A |
5603318 | Heilbrun et al. | Feb 1997 | A |
5611025 | Lorensen et al. | Mar 1997 | A |
5617462 | Spratt | Apr 1997 | A |
5617857 | Chader et al. | Apr 1997 | A |
5619261 | Anderton | Apr 1997 | A |
5622169 | Golden et al. | Apr 1997 | A |
5622170 | Schulz | Apr 1997 | A |
5627873 | Hanover et al. | May 1997 | A |
5628315 | Vilsmeier et al. | May 1997 | A |
5630431 | Taylor | May 1997 | A |
5636644 | Hart et al. | Jun 1997 | A |
5638819 | Manwaring | Jun 1997 | A |
5640170 | Anderson | Jun 1997 | A |
5642395 | Anderton et al. | Jun 1997 | A |
5643268 | Vilsmeier et al. | Jul 1997 | A |
5645065 | Shapiro et al. | Jul 1997 | A |
5646524 | Gilboa | Jul 1997 | A |
5647361 | Damadian | Jul 1997 | A |
5662111 | Cosman | Sep 1997 | A |
5664001 | Tachibana et al. | Sep 1997 | A |
5674296 | Bryan et al. | Oct 1997 | A |
5676673 | Ferre et al. | Oct 1997 | A |
5681260 | Ueda et al. | Oct 1997 | A |
5682886 | Delp et al. | Nov 1997 | A |
5682890 | Kormos et al. | Nov 1997 | A |
5690108 | Chakeres | Nov 1997 | A |
5694945 | Ben-Haim | Dec 1997 | A |
5695500 | Taylor et al. | Dec 1997 | A |
5695501 | Carol et al. | Dec 1997 | A |
5697377 | Wittkampf | Dec 1997 | A |
5702406 | Vilsmeier et al. | Dec 1997 | A |
5711299 | Manwaring et al. | Jan 1998 | A |
5713946 | Ben-Haim | Feb 1998 | A |
5715822 | Watkins et al. | Feb 1998 | A |
5715836 | Kliegis et al. | Feb 1998 | A |
5718241 | Ben-Haim et al. | Feb 1998 | A |
5727552 | Ryan | Mar 1998 | A |
5727553 | Saad | Mar 1998 | A |
5729129 | Acker | Mar 1998 | A |
5730129 | Darrow et al. | Mar 1998 | A |
5730130 | Fitzpatrick et al. | Mar 1998 | A |
5732703 | Kalfas et al. | Mar 1998 | A |
5735278 | Hoult et al. | Apr 1998 | A |
5738096 | Ben-Haim | Apr 1998 | A |
5740802 | Nafis et al. | Apr 1998 | A |
5741214 | Ouchi et al. | Apr 1998 | A |
5742394 | Hansen | Apr 1998 | A |
5744953 | Hansen | Apr 1998 | A |
5748767 | Raab | May 1998 | A |
5749362 | Funda et al. | May 1998 | A |
5749835 | Glantz | May 1998 | A |
5752513 | Acker et al. | May 1998 | A |
5755725 | Druais | May 1998 | A |
RE35816 | Schulz | Jun 1998 | E |
5758667 | Slettenmark | Jun 1998 | A |
5762064 | Polyani | Jun 1998 | A |
5767669 | Hansen et al. | Jun 1998 | A |
5767699 | Bosnyak et al. | Jun 1998 | A |
5767960 | Orman | Jun 1998 | A |
5769789 | Wang et al. | Jun 1998 | A |
5769843 | Abela et al. | Jun 1998 | A |
5769861 | Vilsmeier | Jun 1998 | A |
5772594 | Barrick | Jun 1998 | A |
5772661 | Michelson | Jun 1998 | A |
5775322 | Silverstein et al. | Jul 1998 | A |
5776064 | Kalfas et al. | Jul 1998 | A |
5782765 | Jonkman | Jul 1998 | A |
5787886 | Kelly et al. | Aug 1998 | A |
5792055 | McKinnon | Aug 1998 | A |
5795294 | Luber et al. | Aug 1998 | A |
5797849 | Vesely et al. | Aug 1998 | A |
5799055 | Peshkin et al. | Aug 1998 | A |
5799099 | Wang et al. | Aug 1998 | A |
5800352 | Ferre et al. | Sep 1998 | A |
5800535 | Howard, III | Sep 1998 | A |
5802719 | O'Farrell, Jr. et al. | Sep 1998 | A |
5803089 | Ferre et al. | Sep 1998 | A |
5807252 | Hassfeld et al. | Sep 1998 | A |
5810008 | Dekel et al. | Sep 1998 | A |
5810728 | Kuhn | Sep 1998 | A |
5810735 | Halperin et al. | Sep 1998 | A |
5820553 | Hughes | Oct 1998 | A |
5823192 | Kalend et al. | Oct 1998 | A |
5823958 | Truppe | Oct 1998 | A |
5828725 | Levinson | Oct 1998 | A |
5828770 | Leis et al. | Oct 1998 | A |
5829444 | Ferre et al. | Nov 1998 | A |
5831260 | Hansen | Nov 1998 | A |
5833608 | Acker | Nov 1998 | A |
5834759 | Glossop | Nov 1998 | A |
5836954 | Heilbrun et al. | Nov 1998 | A |
5840024 | Taniguchi et al. | Nov 1998 | A |
5840025 | Ben-Haim | Nov 1998 | A |
5843076 | Webster, Jr. et al. | Dec 1998 | A |
5848967 | Cosman | Dec 1998 | A |
5851183 | Bucholz | Dec 1998 | A |
5865846 | Bryan et al. | Feb 1999 | A |
5868674 | Glowinski et al. | Feb 1999 | A |
5868675 | Henrion et al. | Feb 1999 | A |
5871445 | Bucholz | Feb 1999 | A |
5871455 | Ueno | Feb 1999 | A |
5871487 | Warner et al. | Feb 1999 | A |
5873822 | Ferre et al. | Feb 1999 | A |
5882304 | Ehnholm et al. | Mar 1999 | A |
5884410 | Prinz | Mar 1999 | A |
5889834 | Vilsmeier et al. | Mar 1999 | A |
5891034 | Bucholz | Apr 1999 | A |
5891157 | Day et al. | Apr 1999 | A |
5904691 | Barnett et al. | May 1999 | A |
5907395 | Schultz et al. | May 1999 | A |
5913820 | Bladen et al. | Jun 1999 | A |
5920395 | Schulz | Jul 1999 | A |
5921992 | Costales et al. | Jul 1999 | A |
5923727 | Navab | Jul 1999 | A |
5928248 | Acker | Jul 1999 | A |
5938603 | Ponzi | Aug 1999 | A |
5938694 | Jaraczewski et al. | Aug 1999 | A |
5947980 | Jensen et al. | Sep 1999 | A |
5947981 | Cosman | Sep 1999 | A |
5950629 | Taylor et al. | Sep 1999 | A |
5951475 | Gueziec et al. | Sep 1999 | A |
5951571 | Audette | Sep 1999 | A |
5954647 | Bova et al. | Sep 1999 | A |
5957844 | Dekel et al. | Sep 1999 | A |
5964796 | Imran | Oct 1999 | A |
5967980 | Ferre et al. | Oct 1999 | A |
5967982 | Barnett | Oct 1999 | A |
5968047 | Reed | Oct 1999 | A |
5971997 | Guthrie et al. | Oct 1999 | A |
5974165 | Giger et al. | Oct 1999 | A |
5976156 | Taylor et al. | Nov 1999 | A |
5980535 | Barnett et al. | Nov 1999 | A |
5983126 | Wittkampf | Nov 1999 | A |
5987349 | Schulz | Nov 1999 | A |
5987960 | Messner et al. | Nov 1999 | A |
5999837 | Messner et al. | Dec 1999 | A |
5999840 | Grimson et al. | Dec 1999 | A |
6001130 | Bryan et al. | Dec 1999 | A |
6006126 | Cosman | Dec 1999 | A |
6006127 | Van Der Brug et al. | Dec 1999 | A |
6013087 | Adams et al. | Jan 2000 | A |
6014580 | Blume et al. | Jan 2000 | A |
6016439 | Acker | Jan 2000 | A |
6019725 | Vesely et al. | Feb 2000 | A |
6024695 | Taylor et al. | Feb 2000 | A |
6050724 | Schmitz et al. | Apr 2000 | A |
6059718 | Taniguchi et al. | May 2000 | A |
6063022 | Ben-Haim | May 2000 | A |
6067371 | Gouge et al. | May 2000 | A |
6071288 | Carol et al. | Jun 2000 | A |
6073043 | Schneider | Jun 2000 | A |
6076008 | Bucholz | Jun 2000 | A |
6096050 | Audette | Aug 2000 | A |
6104944 | Martinelli | Aug 2000 | A |
6118845 | Simon et al. | Sep 2000 | A |
6122538 | Sliwa, Jr. et al. | Sep 2000 | A |
6122541 | Cosman et al. | Sep 2000 | A |
6131396 | Duerr et al. | Oct 2000 | A |
6139183 | Graumann | Oct 2000 | A |
6147480 | Osadchy et al. | Nov 2000 | A |
6149592 | Yanof et al. | Nov 2000 | A |
6156067 | Bryan et al. | Dec 2000 | A |
6161032 | Acker | Dec 2000 | A |
6165181 | Heilbrun et al. | Dec 2000 | A |
6167296 | Shahidi | Dec 2000 | A |
6172499 | Ashe | Jan 2001 | B1 |
6175756 | Ferre et al. | Jan 2001 | B1 |
6178345 | Vilsmeier et al. | Jan 2001 | B1 |
6194639 | Botella et al. | Feb 2001 | B1 |
6201387 | Govari | Mar 2001 | B1 |
6203497 | Dekel et al. | Mar 2001 | B1 |
6211666 | Acker | Apr 2001 | B1 |
6223067 | Vilsmeier et al. | Apr 2001 | B1 |
6233476 | Strommer et al. | May 2001 | B1 |
6241735 | Marmulla | Jun 2001 | B1 |
6246231 | Ashe | Jun 2001 | B1 |
6259942 | Westermann et al. | Jul 2001 | B1 |
6273896 | Franck et al. | Aug 2001 | B1 |
6285902 | Kienzle, III et al. | Sep 2001 | B1 |
6298262 | Franck et al. | Oct 2001 | B1 |
6314310 | Ben-Haim et al. | Nov 2001 | B1 |
6332089 | Acker et al. | Dec 2001 | B1 |
6341231 | Ferre et al. | Jan 2002 | B1 |
6348058 | Melkent et al. | Feb 2002 | B1 |
6351659 | Vilsmeier | Feb 2002 | B1 |
6381485 | Hunter et al. | Apr 2002 | B1 |
6424856 | Vilsmeier et al. | Jul 2002 | B1 |
6427314 | Acker | Aug 2002 | B1 |
6428547 | Vilsmeier et al. | Aug 2002 | B1 |
6434415 | Foley et al. | Aug 2002 | B1 |
6437567 | Schenck et al. | Aug 2002 | B1 |
6445943 | Ferre et al. | Sep 2002 | B1 |
6470207 | Simon et al. | Oct 2002 | B1 |
6474341 | Hunter et al. | Nov 2002 | B1 |
6478802 | Kienzle, III et al. | Nov 2002 | B2 |
6484049 | Seeley et al. | Nov 2002 | B1 |
6490475 | Seeley et al. | Dec 2002 | B1 |
6493573 | Martinelli et al. | Dec 2002 | B1 |
6498944 | Ben-Haim et al. | Dec 2002 | B1 |
6499488 | Hunter et al. | Dec 2002 | B1 |
6516046 | Fröhlich et al. | Feb 2003 | B1 |
6527443 | Vilsmeier et al. | Mar 2003 | B1 |
6551325 | Neubauer et al. | Apr 2003 | B2 |
6584174 | Schubert et al. | Jun 2003 | B2 |
6609022 | Vilsmeier et al. | Aug 2003 | B2 |
6611700 | Vilsmeier et al. | Aug 2003 | B1 |
6640128 | Vilsmeier et al. | Oct 2003 | B2 |
6694162 | Hartlep | Feb 2004 | B2 |
6701179 | Martinelli et al. | Mar 2004 | B1 |
6801801 | Sati | Oct 2004 | B1 |
6996431 | Ben-Haim et al. | Feb 2006 | B2 |
20010007918 | Vilsmeier et al. | Jul 2001 | A1 |
20020095081 | Vilsmeier | Jul 2002 | A1 |
20040024309 | Ferre et al. | Feb 2004 | A1 |
Number | Date | Country |
---|---|---|
964149 | Mar 1975 | CA |
3042343 | Jun 1982 | DE |
35 08730 | Mar 1985 | DE |
37 17 871 | May 1987 | DE |
38 38011 | Nov 1988 | DE |
3831278 | Mar 1989 | DE |
42 13 426 | Apr 1992 | DE |
42 25 112 | Jul 1992 | DE |
4233978 | Apr 1994 | DE |
197 15 202 | Apr 1997 | DE |
197 47 427 | Oct 1997 | DE |
197 51 761 | Nov 1997 | DE |
198 32 296 | Jul 1998 | DE |
10085137 | Nov 2002 | DE |
0 062 941 | Mar 1982 | EP |
0 119 660 | Sep 1984 | EP |
0 155 857 | Jan 1985 | EP |
0 319 844 | Jan 1988 | EP |
0 326 768 | Dec 1988 | EP |
0419729 | Sep 1989 | EP |
350996 | Jan 1990 | EP |
0 651 968 | Aug 1990 | EP |
0 427 358 | Oct 1990 | EP |
0 456 103 | May 1991 | EP |
0531081 | Mar 1993 | EP |
0 581 704 | Jul 1993 | EP |
0655138 | Aug 1993 | EP |
0894473 | Jan 1995 | EP |
0 469 966 | Aug 1995 | EP |
0 908 146 | Oct 1998 | EP |
0 930 046 | Oct 1998 | EP |
79 04241 | Feb 1979 | FR |
2417970 | Feb 1979 | FR |
2 094 590 | Feb 1982 | GB |
2 164 856 | Oct 1984 | GB |
61-94639 | Oct 1984 | JP |
62-327 | Jun 1985 | JP |
63-240851 | Mar 1987 | JP |
3-267054 | Mar 1990 | JP |
2765738 | Apr 1998 | JP |
FR 2 618 211 | Jul 1987 | FR |
WO 8809151 | Dec 1988 | WO |
WO 8905123 | Jun 1989 | WO |
WO 9005494 | Nov 1989 | WO |
WO 9103982 | Apr 1991 | WO |
WO 9104711 | Apr 1991 | WO |
WO 9107726 | May 1991 | WO |
WO 9203090 | Mar 1992 | WO |
WO 9206645 | Apr 1992 | WO |
WO 9404938 | Mar 1994 | WO |
WO 9507055 | Sep 1994 | WO |
WO 9423647 | Oct 1994 | WO |
WO 9424933 | Nov 1994 | WO |
WO 9632059 | Nov 1995 | WO |
WO 9611624 | Apr 1996 | WO |
WO 9749453 | Jun 1997 | WO |
WO 9736192 | Oct 1997 | WO |
WO 9923956 | Nov 1997 | WO |
WO 9808554 | Mar 1998 | WO |
WO 9838908 | Sep 1998 | WO |
WO 9915097 | Sep 1998 | WO |
WO 9921498 | Oct 1998 | WO |
WO 9927839 | Dec 1998 | WO |
WO 9933406 | Dec 1998 | WO |
WO 9938449 | Jan 1999 | WO |
WO 9952094 | Apr 1999 | WO |
WO 9926549 | Jun 1999 | WO |
WO 9929253 | Jun 1999 | WO |
WO 9937208 | Jul 1999 | WO |
WO 9960939 | Dec 1999 | WO |
WO 0130437 | May 2001 | WO |
Number | Date | Country | |
---|---|---|---|
20030073901 A1 | Apr 2003 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 09274972 | Mar 1999 | US |
Child | 10236013 | US |