Method and apparatus for virtual endoscopy

Information

  • Patent Grant
  • Patent Number
    6,892,090
  • Date Filed
    Monday, August 19, 2002
  • Date Issued
    Tuesday, May 10, 2005
Abstract
A surgical instrument navigation system is provided that visually simulates a virtual volumetric scene of a body cavity of a patient from a point of view of a surgical instrument residing in the cavity of the patient. The surgical instrument navigation system includes: a surgical instrument; an imaging device which is operable to capture scan data representative of an internal region of interest within a given patient; a tracking subsystem that employs electro-magnetic sensing to capture in real-time position data indicative of the position of the surgical instrument; a data processor which is operable to render a volumetric, perspective image of the internal region of interest from a point of view of the surgical instrument; and a display which is operable to display the volumetric perspective image of the patient.
Description
FIELD OF THE INVENTION

The present invention relates generally to surgical instrument navigation systems and, more particularly, to a system that visually simulates a virtual volumetric scene of a body cavity from a point of view of a surgical instrument residing in a patient.


BACKGROUND OF THE INVENTION

Precise imaging of portions of the anatomy is an increasingly important technique in the medical and surgical fields. In order to lessen the trauma to a patient caused by invasive surgery, techniques have been developed for performing surgical procedures within the body through small incisions with minimal invasion. These procedures generally require the surgeon to operate on portions of the anatomy that are not directly visible, or can be seen only with difficulty. Furthermore, some parts of the body contain extremely complex or small structures, and it is necessary to enhance the visibility of these structures to enable the surgeon to perform more delicate procedures. In addition, planning such procedures requires evaluating the location and orientation of these structures within the body in order to determine the optimal surgical trajectory.


Endoscopy is one commonly employed technique for visualizing internal regions of interest within a patient. Flexible endoscopes enable surgeons to visually inspect a region prior to or during surgery. However, flexible endoscopes are relatively expensive, limited in flexibility by their construction, and prone to having their view obscured by blood and other biological materials.


Therefore, it is desirable to provide a cost-effective alternative technique for visualizing internal regions of interest within a patient.


SUMMARY OF THE INVENTION

In accordance with the present invention, a surgical instrument navigation system is provided that visually simulates a virtual volumetric scene of a body cavity of a patient from a point of view of a surgical instrument residing in the patient. The surgical instrument navigation system generally includes: a surgical instrument, such as a guide wire or catheter; a tracking subsystem that captures real-time position data indicative of the position (location and/or orientation) of the surgical instrument; a data processor which is operable to render a volumetric image of the internal region of interest from a point of view of the surgical instrument; and a display which is operable to display the volumetric image of the patient. The surgical instrument navigation system may also include an imaging device which is operable to capture 2D and/or 3D volumetric scan data representative of an internal region of interest within a given patient.


For a more complete understanding of the invention, reference may be made to the following specification and to the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a diagram of an exemplary surgical instrument navigation system in accordance with the present invention;



FIG. 2 is a flowchart that depicts a technique for simulating a virtual volumetric scene of a body cavity from a point of view of a surgical instrument positioned within the patient in accordance with the present invention;



FIG. 3 is an exemplary display from the surgical instrument navigation system of the present invention;



FIG. 4 is a flowchart that depicts a technique for synchronizing the display of an indicia or graphical representation of the surgical instrument with the cardiac or respiratory cycle of the patient in accordance with the present invention; and



FIG. 5 is a flowchart that depicts a technique for generating four-dimensional image data that is synchronized with the patient in accordance with the present invention.





DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS


FIG. 1 is a diagram of an exemplary surgical instrument navigation system 10. In accordance with one aspect of the present invention, the surgical instrument navigation system 10 is operable to visually simulate a virtual volumetric scene within the body of a patient, such as an internal body cavity, from a point of view of a surgical instrument 12 residing in the cavity of a patient 13. To do so, the surgical instrument navigation system 10 is primarily comprised of a surgical instrument 12, a data processor 16 having a display 18, and a tracking subsystem 20. The surgical instrument navigation system 10 may further include (or be accompanied by) an imaging device 14 that is operable to provide image data to the system.


The surgical instrument 12 is preferably a relatively inexpensive, flexible and/or steerable catheter that may be of a disposable type. The surgical instrument 12 is modified to include one or more tracking sensors that are detectable by the tracking subsystem 20. It is readily understood that other types of surgical instruments (e.g., a guide wire, a pointer probe, a stent, a seed, an implant, an endoscope, etc.) are also within the scope of the present invention. It is also envisioned that at least some of these surgical instruments may be wireless or have wireless communications links. It is also envisioned that the surgical instruments may encompass medical devices which are used for exploratory purposes, testing purposes or other types of medical procedures.


Referring to FIG. 2, the imaging device 14 is used to capture volumetric scan data 32 representative of an internal region of interest within the patient 13. The three-dimensional scan data is preferably obtained prior to surgery on the patient 13. In this case, the captured volumetric scan data may be stored in a data store associated with the data processor 16 for subsequent processing. However, one skilled in the art will readily recognize that the principles of the present invention may also extend to scan data acquired during surgery. It is readily understood that volumetric scan data may be acquired using various known medical imaging devices 14, including but not limited to a magnetic resonance imaging (MRI) device, a computed tomography (CT) imaging device, a positron emission tomography (PET) imaging device, a 2D or 3D fluoroscopic imaging device, and 2D, 3D or 4D ultrasound imaging devices. In the case of a two-dimensional ultrasound imaging device or other two-dimensional image acquisition device, a series of two-dimensional data sets may be acquired and then assembled into volumetric data as is well known in the art using a two-dimensional to three-dimensional conversion.


A dynamic reference frame 19 is attached to the patient proximate to the region of interest within the patient 13. To the extent that the region of interest is a vessel or a cavity within the patient, it is readily understood that the dynamic reference frame 19 may be placed within the patient 13. To determine its location, the dynamic reference frame 19 is also modified to include tracking sensors detectable by the tracking subsystem 20. The tracking subsystem 20 is operable to determine position data for the dynamic reference frame 19 as further described below.


The volumetric scan data is then registered as shown at 34. Registration of the dynamic reference frame 19 generally relates information in the volumetric scan data to the region of interest associated with the patient. This process is referred to as registering image space to patient space. Often, the image space must also be registered to another image space. Registration is accomplished through knowledge of the coordinate vectors of at least three non-collinear points in the image space and the patient space.
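The patent leaves the actual computation of the registration transform to known techniques. Purely for illustration (not from the patent), the rigid transform relating image space to patient space can be recovered from three or more matched non-collinear points with the SVD-based Kabsch method; the Python/NumPy code and the name `register_rigid` below are illustrative assumptions:

```python
import numpy as np

def register_rigid(image_pts, patient_pts):
    """Estimate rotation R and translation t mapping image space to
    patient space from matched point pairs (Kabsch/SVD method).
    At least three non-collinear point pairs are required."""
    P = np.asarray(image_pts, dtype=float)
    Q = np.asarray(patient_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)               # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

# Example: patient space is image space rotated 90 degrees about z, then shifted.
img = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], dtype=float)
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
pat = [Rz @ p + np.array([5.0, 2.0, 1.0]) for p in img]
R, t = register_rigid(img, pat)
print(np.allclose(R, Rz), np.allclose(t, [5, 2, 1]))  # -> True True
```

With exact correspondences the transform is recovered exactly; with noisy fiducial touches the same code returns the least-squares best fit, which is why surface registration iterates toward a surface match.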


Registration for image guided surgery can be completed by different known techniques. First, point-to-point registration is accomplished by identifying points in an image space and then touching the same points in patient space. These points are generally anatomical landmarks that are easily identifiable on the patient. Second, surface registration involves the user's generation of a surface in patient space by either selecting multiple points or scanning, and then accepting the best fit to that surface in image space by iteratively calculating with the data processor until a surface match is identified. Third, repeat fixation devices entail the user repeatedly removing and replacing a device (i.e., dynamic reference frame, etc.) in known relation to the patient or image fiducials of the patient. Fourth, automatic registration is accomplished by attaching the dynamic reference frame to the patient prior to acquiring the image data. It is envisioned that other known registration procedures are also within the scope of the present invention, such as that disclosed in U.S. Ser. No. 09/274,972, filed on Mar. 23, 1999, entitled “NAVIGATIONAL GUIDANCE VIA COMPUTER-ASSISTED FLUOROSCOPIC IMAGING”, which is hereby incorporated by reference.


During surgery, the surgical instrument 12 is directed by the surgeon to the region of interest within the patient 13. The tracking subsystem 20 preferably employs electro-magnetic sensing to capture position data 37 indicative of the location and/or orientation of the surgical instrument 12 within the patient. The tracking subsystem 20 may be defined as a localizing device 22 together with one or more electro-magnetic sensors 24 integrated into the items of interest, such as the surgical instrument 12. In one embodiment, the localizing device 22 is comprised of three or more field generators (transmitters) mounted at known locations on a plane surface, and the electro-magnetic sensor (receiver) 24 is further defined as a single coil of wire. The positioning of the field generators (transmitters) and the sensors (receivers) may also be reversed, such that the generators are associated with the surgical instrument 12 and the receivers are positioned elsewhere. Although not limited thereto, the localizing device 22 may be affixed to the underside of the operating table that supports the patient.


In operation, the field generators generate magnetic fields which are detected by the sensor. By measuring the magnetic fields generated by each field generator at the sensor, the location and orientation of the sensor may be computed, thereby determining position data for the surgical instrument 12. Although not limited thereto, exemplary electro-magnetic tracking subsystems are further described in U.S. Pat. Nos. 5,913,820; 5,592,939; and 6,374,134, which are incorporated herein by reference. In addition, it is envisioned that other types of position tracking devices are also within the scope of the present invention. For instance, a non-line-of-sight tracking subsystem 20 may be based on sonic emissions or radio frequency emissions. In another instance, a rigid surgical instrument, such as a rigid endoscope, may be tracked using a line-of-sight optical-based tracking subsystem (e.g., LEDs, passive markers, reflective markers, etc.).
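Real electro-magnetic localizers solve a calibrated vector-field model for the sensor pose; the sketch below is a deliberately idealized stand-in, assuming a point-source model in which field magnitude falls off as 1/r³, so each generator yields a range and the sensor location follows by linearized least-squares trilateration. The generator layout, constant `K`, and function names are all hypothetical, not from the patent:

```python
import numpy as np

# Hypothetical generator layout: four transmitters on the table plane (z = 0).
GENS = np.array([[0.0, 0.0, 0.0], [30.0, 0.0, 0.0],
                 [0.0, 30.0, 0.0], [30.0, 30.0, 0.0]])
K = 1.0e6  # assumed source-strength constant

def ranges_from_field(mags):
    """Idealized point-source model: |B| ~ K / r**3, so each measured
    magnitude yields a range to the corresponding generator."""
    return (K / np.asarray(mags, float)) ** (1.0 / 3.0)

def trilaterate(ranges):
    """Linearized least-squares trilateration: subtracting the first
    sphere equation from the others gives a linear system.  The
    generators are coplanar, so it fixes only (x, y); z is recovered
    from the first range, taking the side above the table (z >= 0)."""
    g0, r0 = GENS[0], ranges[0]
    A = 2.0 * (GENS[1:, :2] - g0[:2])
    b = (r0**2 - ranges[1:]**2
         + np.sum(GENS[1:]**2, axis=1) - np.sum(g0**2))
    xy, *_ = np.linalg.lstsq(A, b, rcond=None)
    z2 = r0**2 - np.sum((xy - g0[:2])**2)
    return np.array([xy[0], xy[1], np.sqrt(max(z2, 0.0))])

# Simulate a sensor at a known location and recover it from field magnitudes.
true_pos = np.array([12.0, 8.0, 15.0])
mags = K / np.linalg.norm(GENS - true_pos, axis=1)**3
print(trilaterate(ranges_from_field(mags)))  # approximately [12, 8, 15]
```

Orientation is not observable from magnitudes alone, which is consistent with the text's note that two or more embedded sensors (or, in practice, vector measurements) are needed to determine instrument orientation.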


Position data such as location and/or orientation data from the tracking subsystem 20 is in turn relayed to the data processor 16. The data processor 16 is adapted to receive position/orientation data from the tracking subsystem 20 and operable to render a volumetric perspective image and/or a surface rendered image of the region of interest. The volumetric perspective and/or surface image is rendered 36 from the scan data 32 using rendering techniques well known in the art. The image data may be further manipulated 38 based on the position/orientation data for the surgical instrument 12 received from tracking subsystem 20. Specifically, the volumetric perspective or surface rendered image is rendered from a point of view which relates to position of the surgical instrument 12. For instance, at least one electro-magnetic sensor 24 may be positioned at the tip of the surgical instrument 12, such that the image is rendered from a leading point on the surgical instrument. In this way, the surgical instrument navigation system 10 of the present invention is able, for example, to visually simulate a virtual volumetric scene of an internal cavity from the point of view of the surgical instrument 12 residing in the cavity without the use of an endoscope. It is readily understood that tracking two or more electro-magnetic sensors 24 which are embedded in the surgical instrument 12 enables orientation of the surgical instrument 12 to be determined by the system 10.
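Rendering from "a point of view which relates to position of the surgical instrument" amounts to placing the renderer's virtual camera at the tracked tip and aiming it along the instrument axis. A minimal sketch, assuming NumPy and an OpenGL-style convention (camera looks down its own -z axis); the function name is illustrative:

```python
import numpy as np

def view_matrix(tip_pos, tip_dir, up_hint=(0.0, 0.0, 1.0)):
    """Build a 4x4 world-to-camera matrix placing the virtual camera at
    the tracked tip, looking along the instrument axis.  Degenerate if
    tip_dir is parallel to up_hint."""
    f = np.asarray(tip_dir, float)
    f = f / np.linalg.norm(f)                    # camera forward
    r = np.cross(f, np.asarray(up_hint, float))
    r = r / np.linalg.norm(r)                    # camera right
    u = np.cross(r, f)                           # camera up
    M = np.eye(4)
    M[0, :3], M[1, :3], M[2, :3] = r, u, -f      # rows: right, up, -forward
    M[:3, 3] = M[:3, :3] @ -np.asarray(tip_pos, float)
    return M

# A point one unit ahead of the tip lands on the camera's -z axis.
M = view_matrix(tip_pos=[1, 2, 3], tip_dir=[1, 0, 0])
print(M @ np.array([2.0, 2.0, 3.0, 1.0]))  # -> [ 0.  0. -1.  1.]
```

Each new position/orientation report simply rebuilds this matrix, so the volumetric rendering updates as the tip advances.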


As the surgical instrument 12 is moved by the surgeon within the region of interest, its position and orientation are tracked and reported on a real-time basis by the tracking subsystem 20. The volumetric perspective image may then be updated by manipulating 38 the rendered image data 36 based on the position of the surgical instrument 12. The manipulated volumetric perspective image is displayed 40 on a display device 18 associated with the data processor 16. The display 18 is preferably located such that it can be easily viewed by the surgeon during the medical procedure. In one embodiment, the display 18 may be further defined as a heads-up display or any other appropriate display. The image may also be stored by data processor 16 for later playback, should this be desired.


It is envisioned that the primary perspective image 38 of the region of interest may be supplemented by other secondary images. For instance, known image processing techniques may be employed to generate various multi-planar images of the region of interest. Alternatively, images may be generated from different view points as specified by a user 39, including views from outside of the vessel or cavity or views that enable the user to see through the walls of the vessel using different shading or opacity. In another instance, the location data of the surgical instrument may be saved and played back in a movie format. It is envisioned that these various secondary images may be displayed simultaneously with or in place of the primary perspective image.


In addition, the surgical instrument 12 may be used to generate real-time maps corresponding to an internal path traveled by the surgical instrument or an external boundary of an internal cavity. Real-time maps are generated by continuously recording the position of the instrument's localized tip and its full extent. A real-time map is generated by the outermost extent of the instrument's position and minimum extrapolated curvature as is known in the art. The map may be continuously updated as the instrument is moved within the patient, thereby creating a path or a volume representing the internal boundary of the cavity. It is envisioned that the map may be displayed in a wire frame form, as a shaded surface or other three-dimensional computer display modality independent from or superimposed on the volumetric perspective image 38 of the region of interest. It is further envisioned that the map may include data collected from a sensor embedded into the surgical instrument, such as pressure data, temperature data or electro-physiological data. In this case, the map may be color coded to represent the collected data.
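As a toy illustration of such a real-time map (not the patent's minimum-extrapolated-curvature method, which builds an actual surface), recorded tip positions can be accumulated and their outermost extent reported, here as a simple axis-aligned bounding box; all names are hypothetical:

```python
import numpy as np

recorded = []  # localized tip positions appended as the instrument moves

def record(pos):
    """Append a localized tip position to the real-time map."""
    recorded.append(np.asarray(pos, float))

def map_extent():
    """Outermost extent of all recorded positions; an axis-aligned
    bounding box is the simplest stand-in for the cavity boundary."""
    pts = np.asarray(recorded)
    return pts.min(axis=0), pts.max(axis=0)

for p in [[0, 0, 0], [1, 2, 0], [3, 1, 4], [-1, 0, 2]]:
    record(p)
print(map_extent())  # bounding box: min [-1, 0, 0], max [3, 2, 4]
```

In the same spirit, per-sample sensor readings (pressure, temperature, electro-physiological data) could be stored alongside each position and mapped to a color scale at display time.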



FIG. 3 illustrates another type of secondary image 28 which may be displayed in conjunction with the primary perspective image 38. In this instance, the primary perspective image is an interior view of an air passage within the patient 13. The secondary image 28 is an exterior view of the air passage which includes an indicia or graphical representation 29 that corresponds to the location of the surgical instrument 12 within the air passage. In FIG. 3, the indicia 29 is shown as crosshairs. It is envisioned that other indicia may be used to signify the location of the surgical instrument in the secondary image. As further described below, the secondary image 28 is constructed by superimposing the indicia 29 of the surgical instrument 12 onto the manipulated image data 38.


Referring to FIG. 4, the display of an indicia of the surgical instrument 12 on the secondary image may be synchronized with an anatomical function, such as the cardiac or respiratory cycle, of the patient. In certain instances, the cardiac or respiratory cycle of the patient may cause the surgical instrument 12 to flutter or jitter within the patient. For instance, a surgical instrument 12 positioned in or near a chamber of the heart will move in relation to the patient's heart beat. In these instances, the indicia of the surgical instrument 12 will likewise flutter or jitter on the displayed image 40. It is envisioned that other anatomical functions which may affect the position of the surgical instrument 12 within the patient are also within the scope of the present invention.


To eliminate the flutter of the indicia on the displayed image 40, position data for the surgical instrument 12 is acquired at a repetitive point within each cycle of either the cardiac cycle or the respiratory cycle of the patient. As described above, the imaging device 14 is used to capture volumetric scan data 42 representative of an internal region of interest within a given patient. A secondary image may then be rendered 44 from the volumetric scan data by the data processor 16.


In order to synchronize the acquisition of position data for the surgical instrument 12, the surgical instrument navigation system 10 may further include a timing signal generator 26. The timing signal generator 26 is operable to generate and transmit a timing signal 46 that correlates to at least one of (or both) the cardiac cycle or the respiratory cycle of the patient 13. For a patient having a consistent rhythmic cycle, the timing signal might be in the form of a periodic clock signal. Alternatively, the timing signal may be derived from an electrocardiogram signal from the patient 13. One skilled in the art will readily recognize other techniques for deriving a timing signal that correlates to at least one of the cardiac cycle, the respiratory cycle, or another anatomical cycle of the patient.


As described above, the indicia of the surgical instrument 12 tracks the movement of the surgical instrument 12 as it is moved by the surgeon within the patient 13. Rather than display the indicia of the surgical instrument 12 on a real-time basis, the display of the indicia of the surgical instrument 12 is periodically updated 48 based on the timing signal from the timing signal generator 26. In one exemplary embodiment, the timing generator 26 is electrically connected to the tracking subsystem 20. The tracking subsystem 20 is in turn operable to report position data for the surgical instrument 12 in response to a timing signal received from the timing signal generator 26. The position of the indicia of the surgical instrument 12 is then updated 50 on the display of the image data. It is readily understood that other techniques for synchronizing the display of an indicia of the surgical instrument 12 based on the timing signal are within the scope of the present invention, thereby eliminating any flutter or jitter which may appear on the displayed image 52. It is also envisioned that a path (or projected path) of the surgical instrument 12 may also be illustrated on the displayed image data 52.
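The gating described above can be sketched as follows: tracked samples are buffered, and the indicia is refreshed only with the most recent sample at each timing pulse, so every update corresponds to the same phase of the cycle and the cyclic flutter disappears from the display. A Python sketch under those assumptions (all names illustrative):

```python
import numpy as np

def gated_positions(samples, pulse_times):
    """For each timing pulse, keep only the most recent tracked
    position, so the indicia is refreshed once per cycle at a fixed
    phase.  `samples` is a time-ordered list of (t, position) pairs."""
    out, i = [], 0
    for tp in pulse_times:
        last = None
        while i < len(samples) and samples[i][0] <= tp:
            last = samples[i][1]
            i += 1
        if last is not None:
            out.append((tp, last))
    return out

# A tip fluttering at the heart rate: raw samples swing by +/- 2 units,
# but sampling at the same phase of each cycle yields a steady value.
t = np.arange(0.0, 3.0, 0.05)
pos = 10.0 + 2.0 * np.sin(2 * np.pi * t)      # 1 Hz flutter
samples = list(zip(t, pos))
pulses = [0.27, 1.27, 2.27]                   # one pulse per cycle
print([round(float(p), 2) for _, p in gated_positions(samples, pulses)])
# -> [12.0, 12.0, 12.0]
```

The raw stream still exists, so a projected path of the instrument can be drawn from it even while the indicia itself updates only on pulses.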


In another aspect of the present invention, the surgical instrument navigation system 10 may be further adapted to display four-dimensional image data for a region of interest as shown in FIG. 5. In this case, the imaging device 14 is operable to capture volumetric scan data 62 for an internal region of interest over a period of time, such that the region of interest includes motion that is caused by either the cardiac cycle or the respiratory cycle of the patient 13. A volumetric perspective view of the region may be rendered 64 from the volumetric scan data 62 by the data processor 16 as described above. The four-dimensional image data may be further supplemented with other patient data, such as temperature or blood pressure, using color coding techniques.


In order to synchronize the display of the volumetric perspective view in real-time with the cardiac or respiratory cycle of the patient, the data processor 16 is adapted to receive a timing signal from the timing signal generator 26. As described above, the timing signal generator 26 is operable to generate and transmit a timing signal that correlates to either the cardiac cycle or the respiratory cycle of the patient 13. In this way, the volumetric perspective image may be synchronized 66 with the cardiac or respiratory cycle of the patient 13. The synchronized image 66 is then displayed 68 on the display 18 of the system. The four-dimensional synchronized image may be either (or both of) the primary image rendered from the point of view of the surgical instrument or the secondary image depicting the indicia of the position of the surgical instrument 12 within the patient 13. It is readily understood that the synchronization process is also applicable to two-dimensional image data acquired over time.


To enhance visualization and refine the accuracy of the displayed image data, the surgical navigation system can use prior knowledge, such as the segmented vessel structure, to compensate for error in the tracking subsystem or for inaccuracies caused by an anatomical shift occurring since acquisition of the scan data. For instance, it is known that the surgical instrument 12 being localized is located within a given vessel and, therefore, should be displayed within the vessel. Statistical methods can be used to determine the most likely location within the vessel with respect to the reported location and then compensate, so the display accurately represents the instrument 12 within the center of the vessel. The center of the vessel can be found by segmenting the vessels from the three-dimensional datasets and using commonly known imaging techniques to define the centerline of the vessel tree. Statistical methods may also be used to determine if the surgical instrument 12 has potentially punctured the vessel. This can be done by determining that the reported location is too far from the centerline or that the trajectory of the path traveled is greater than a certain angle (worst case, 90 degrees) with respect to the vessel. Reporting this type of trajectory error is very important to clinicians. The tracking along the center of the vessel may also be further refined by correcting for motion of the respiratory or cardiac cycle, as described above.
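A simple stand-in for the statistical compensation described above (not the patent's method) is to project the reported tip position onto the segmented centerline, modeled as a polyline, and to flag locations implausibly far from the vessel as possible punctures. The distance threshold is an arbitrary illustrative value, and the trajectory-angle check mentioned in the text is omitted for brevity:

```python
import numpy as np

def snap_to_centerline(reported, centerline, max_dist=5.0):
    """Project a reported tip position onto the vessel centerline
    (a polyline of 3D points).  Returns the snapped point and a flag
    raised when the reported location is farther than max_dist from
    the centerline, which may indicate a puncture."""
    cl = np.asarray(centerline, float)
    p = np.asarray(reported, float)
    best, best_d = None, np.inf
    for a, b in zip(cl[:-1], cl[1:]):
        ab = b - a
        t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
        q = a + t * ab                       # closest point on this segment
        d = np.linalg.norm(p - q)
        if d < best_d:
            best, best_d = q, d
    return best, bool(best_d > max_dist)

centerline = [[0, 0, 0], [10, 0, 0], [20, 5, 0]]
# A point just off the vessel snaps to the centerline; a distant point
# trips the puncture flag.
print(snap_to_centerline([5.0, 1.0, 0.5], centerline))
print(snap_to_centerline([5.0, 20.0, 0.0], centerline))
```

A fuller implementation would weight the snap by tracking-error statistics rather than always displaying the exact centerline point, and would add the angle-versus-vessel test for trajectory errors.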


The surgical instrument navigation system of the present invention may also incorporate atlas maps. It is envisioned that three-dimensional or four-dimensional atlas maps may be registered with patient specific scan data or generic anatomical models. Atlas maps may contain kinematic information (e.g., heart models) that can be synchronized with four-dimensional image data, thereby supplementing the real-time information. In addition, the kinematic information may be combined with localization information from several instruments to provide a complete four-dimensional model of organ motion. The atlas maps may also be used to localize bones or soft tissue which can assist in determining placement and location of implants.


While the invention has been described in its presently preferred form, it will be understood that the invention is capable of modification without departing from the spirit of the invention as set forth in the appended claims.

Claims
  • 1. A surgical instrument navigation system, comprising: a surgical instrument; a tracking subsystem operable to capture in real-time position data indicative of the position of the surgical instrument; a data processor adapted to receive scan data representative of a region of interest of a given patient and the position data from the tracking subsystem, the data processor being operable to render an image of the region of interest from a point of view which relates to position of the surgical instrument, the image being derived from the scan data; and a display in data communication with the data processor, the display being operable to display the image of the patient.
  • 2. The surgical navigation system of claim 1 wherein a volumetric perspective image is rendered from a point of view of the surgical instrument.
  • 3. The surgical navigation system of claim 1 wherein the surgical instrument is further defined as at least one of a catheter, a guide wire, a pointer probe, a stent, a seed, an implant, or an endoscope.
  • 4. The surgical navigation system of claim 1 further comprising an imaging device operable to capture and provide the scan data to the data processor.
  • 5. The surgical navigation system of claim 4 wherein the imaging device is operable to capture volumetric scan data or surface data representative of the region of interest.
  • 6. The surgical navigation system of claim 4 wherein the imaging device is further defined as at least one of a magnetic resonance imaging device, a computed tomography imaging device, a positron emission tomography imaging device, a fluoroscopic imaging device, or an ultrasound imaging device.
  • 7. The surgical navigation system of claim 1 wherein the tracking subsystem is further defined as an electro-magnetic localizing device having one or more electro-magnetic sensors attached to the surgical instrument.
  • 8. The surgical navigation system of claim 7 wherein a volumetric perspective image is rendered from a point of view which correlates to one of the electro-magnetic sensors attached to the surgical instrument.
  • 9. The surgical navigation system of claim 1 wherein the data processor is operable to render a second image of the region of interest based on the scan data, and to superimpose an indicia of the surgical instrument onto the second image of the patient.
  • 10. The surgical navigation system of claim 9 wherein the data processor is further operable to track in real-time the position of the surgical instrument as it is moved within the region of interest and update the corresponding position of the indicia of the surgical instrument on the second image of the patient.
  • 11. The surgical navigation system of claim 9 wherein the data processor is further operable to track in real-time the location and orientation of the surgical instrument as it is moved within the region of interest and the display is further operable to display the location and orientation of the surgical instrument.
  • 12. A surgical instrument navigation system, comprising: a surgical instrument; a timing signal generator operable to generate and transmit a timing signal that correlates to at least one anatomical function of the patient; a tracking subsystem operable to receive the timing signal from the timing signal generator, the tracking subsystem operable to capture position data indicative of the position of the surgical instrument and to report the position data in response to the timing signal received from the timing signal generator; a data processor adapted to receive scan image data representative of an internal region of interest within a given patient and the position data from the tracking subsystem, the data processor being operable to render a volumetric perspective image of the internal region of interest from the scan image data and to superimpose an indicia of the surgical instrument onto the volumetric perspective image based on the position data received from the tracking subsystem; and a display in data communication with the data processor, the display being operable to display the volumetric perspective image of the patient.
  • 13. The surgical instrument navigation system of claim 12 wherein the timing signal correlates to at least one of the cardiac cycle or the respiratory cycle of the patient.
  • 14. The surgical instrument navigation system of claim 13 wherein the timing signal is generated at a repetitive point within each cycle of either the cardiac cycle or the respiratory cycle of the patient, thereby minimizing any jitter of the surgical instrument in the volumetric perspective image which may be caused by the cardiac cycle or the respiratory cycle of the patient.
  • 15. The surgical instrument navigation system of claim 13 wherein the timing signal is derived from, or is, an electrocardiogram signal from the patient.
  • 16. The surgical instrument navigation system of claim 12 wherein the data processor is further operable to track position of the surgical instrument as it is moved within the region of interest and to update the corresponding position of the indicia of the surgical instrument in the volumetric perspective image of the patient.
  • 17. The surgical navigation system of claim 12 wherein the data processor is further operable to track in real-time the location and orientation of the surgical instrument as it is moved within the region of interest and the display is further operable to display the location and orientation of the surgical instrument.
  • 18. The surgical navigation system of claim 12 wherein the surgical instrument is further defined as at least one of a catheter, a guide wire, a pointer probe, a stent, a seed, an implant, or an endoscope.
  • 19. The surgical navigation system of claim 12 further comprises an imaging device operable to capture and provide the scan image data to the data processor.
  • 20. The surgical navigation system of claim 19 wherein the imaging device is operable to capture volumetric scan data representative of the internal region of interest.
  • 21. The surgical navigation system of claim 19 wherein the imaging device is further defined as at least one of a magnetic resonance imaging device, a computed tomography imaging device, a positron emission tomography imaging device, a fluoroscopic imaging device, or an ultrasound imaging device.
  • 22. The surgical navigation system of claim 12 wherein the tracking subsystem is further defined as an electro-magnetic localizing device having one or more electro-magnetic sensors attached to the surgical instrument.
  • 23. A surgical instrument navigation system, comprising: a surgical instrument; an imaging device operable to capture volumetric scan data over time, the volumetric scan data representative of an internal region of interest within a patient and the internal region of interest having motion that is caused by at least one anatomical function of the patient; a timing signal generator operable to generate and transmit a timing signal that correlates to the at least one anatomical function of the patient; a data processor adapted to receive the volumetric image data from the imaging device and the timing signal from the timing signal generator, the data processor being operable to render a volumetric perspective image from the viewpoint of the surgical instrument of the internal region of interest over time, the volumetric perspective image being derived from the volumetric scan data and synchronized with the timing signal; and a display in data communication with the data processor, the display being operable to display the volumetric perspective image of the patient.
  • 24. The surgical instrument navigation system of claim 23 wherein the timing signal correlates to at least one of a cardiac cycle or a respiratory cycle of the patient.
  • 25. The surgical instrument navigation system of claim 23, further comprising: a tracking subsystem operable to receive the timing signal from the timing signal generator, the tracking subsystem operable to capture position data indicative of a position of the surgical instrument and to report the position data in response to the timing signal received from the timing signal generator; a display in data communication with the data processor, the display being operable to display the volumetric perspective image of the patient based upon the position of the surgical instrument.
  • 26. A surgical instrument navigation system, comprising: a non-imaging surgical instrument; a tracking subsystem operable to capture in real-time position data indicative of the position of the non-imaging surgical instrument; a data processor adapted to receive scan data representative of a region of interest of a given patient and the position data from the tracking subsystem, the data processor being operable to render an image of the region of interest from a point of view which relates to the position of the non-imaging surgical instrument; and a display in data communication with the data processor, the display being operable to display the image of the patient; wherein the rendered image is derived from the scan data.
US Referenced Citations (446)
Number Name Date Kind
1576781 Phillips Mar 1926 A
1735726 Bornhardt Nov 1929 A
2407845 Nemeyer Sep 1946 A
2650588 Drew Sep 1953 A
2697433 Sehnder Dec 1954 A
3016899 Stenvall Jan 1962 A
3017887 Heyer Jan 1962 A
3061936 Dobbeleer Nov 1962 A
3073310 Mocarski Jan 1963 A
3294083 Alderson Dec 1966 A
3367326 Frazier Feb 1968 A
3439256 Kähne et al. Apr 1969 A
3577160 White May 1971 A
3674014 Tillander Jul 1972 A
3702935 Carey et al. Nov 1972 A
3704707 Halloran Dec 1972 A
3868565 Kuipers Feb 1975 A
3941127 Froning Mar 1976 A
4037592 Kronner Jul 1977 A
4052620 Brunnett Oct 1977 A
4054881 Raab Oct 1977 A
4117337 Staats Sep 1978 A
4173228 Van Steenwyk et al. Nov 1979 A
4202349 Jones May 1980 A
4262306 Renner Apr 1981 A
4287809 Egli et al. Sep 1981 A
4314251 Raab Feb 1982 A
4317078 Weed et al. Feb 1982 A
4328813 Ray May 1982 A
4339953 Iwasaki Jul 1982 A
4358856 Stivender et al. Nov 1982 A
4368536 Pfeiler Jan 1983 A
4396885 Constant Aug 1983 A
4418422 Richter et al. Nov 1983 A
4422041 Lienau Dec 1983 A
4431005 McCormick Feb 1984 A
4485815 Amplatz Dec 1984 A
4543959 Sepponen Oct 1985 A
4548208 Niemi Oct 1985 A
4572198 Codrington Feb 1986 A
4584577 Temple Apr 1986 A
4613866 Blood Sep 1986 A
4618978 Cosman Oct 1986 A
4621628 Brudermann Nov 1986 A
4625718 Olerud et al. Dec 1986 A
4642786 Hansen Feb 1987 A
4645343 Stockdale et al. Feb 1987 A
4649504 Krouglicof et al. Mar 1987 A
4651732 Frederick Mar 1987 A
4653509 Oloff et al. Mar 1987 A
4673352 Hansen Jun 1987 A
4706665 Gouda Nov 1987 A
4719419 Dawley Jan 1988 A
4722056 Roberts et al. Jan 1988 A
4722336 Kim et al. Feb 1988 A
4727565 Ericson Feb 1988 A
4737794 Jones Apr 1988 A
4737921 Goldwasser et al. Apr 1988 A
4750487 Zanetti Jun 1988 A
4771787 Wurster et al. Sep 1988 A
4791934 Brunnett Dec 1988 A
4793355 Crum et al. Dec 1988 A
4797907 Anderton Jan 1989 A
4803976 Frigg et al. Feb 1989 A
4821206 Arora Apr 1989 A
4821731 Martinelli et al. Apr 1989 A
4836778 Baumrind et al. Jun 1989 A
4845771 Wislocki et al. Jul 1989 A
4849692 Blood Jul 1989 A
4862893 Martinelli Sep 1989 A
4889526 Rauscher et al. Dec 1989 A
4905698 Strohl, Jr. et al. Mar 1990 A
4923459 Nambu May 1990 A
4931056 Ghajar et al. Jun 1990 A
4945305 Blood Jul 1990 A
4945914 Allen Aug 1990 A
4951653 Fry et al. Aug 1990 A
4977655 Martinelli Dec 1990 A
4989608 Ratner Feb 1991 A
4991579 Allen Feb 1991 A
5002058 Martinelli Mar 1991 A
5005592 Cartmell Apr 1991 A
5013317 Cole et al. May 1991 A
5016639 Allen May 1991 A
5027818 Bova et al. Jul 1991 A
5030196 Inoue Jul 1991 A
5030222 Calandruccio et al. Jul 1991 A
5031203 Trecha Jul 1991 A
5042486 Pfeiler et al. Aug 1991 A
5050608 Watanabe et al. Sep 1991 A
5054492 Scribner et al. Oct 1991 A
5057095 Fabian Oct 1991 A
5059789 Salcudean Oct 1991 A
5079699 Tuy et al. Jan 1992 A
5086401 Glassman et al. Feb 1992 A
5094241 Allen Mar 1992 A
5097839 Allen Mar 1992 A
5099845 Besz et al. Mar 1992 A
5105829 Fabian et al. Apr 1992 A
5107839 Houdek et al. Apr 1992 A
5107843 Aarnio et al. Apr 1992 A
5107862 Fabian et al. Apr 1992 A
5109194 Cantaloube Apr 1992 A
5119817 Allen Jun 1992 A
5142930 Allen et al. Sep 1992 A
5152288 Hoenig et al. Oct 1992 A
5160337 Cosman Nov 1992 A
5161536 Vilkomerson et al. Nov 1992 A
5178164 Allen Jan 1993 A
5178621 Cook et al. Jan 1993 A
5186174 Schlondorff et al. Feb 1993 A
5187475 Wagener et al. Feb 1993 A
5188126 Fabian et al. Feb 1993 A
5190059 Fabian et al. Mar 1993 A
5197476 Nowacki et al. Mar 1993 A
5197965 Cherry et al. Mar 1993 A
5198768 Keren Mar 1993 A
5198877 Schulz Mar 1993 A
5211164 Allen May 1993 A
5211165 Dumoulin et al. May 1993 A
5211176 Ishiguro et al. May 1993 A
5212720 Landi et al. May 1993 A
5214615 Bauer May 1993 A
5219351 Teubner et al. Jun 1993 A
5222499 Allen et al. Jun 1993 A
5228442 Imran Jul 1993 A
5233990 Barnea Aug 1993 A
5237996 Waldman et al. Aug 1993 A
5249581 Horbal et al. Oct 1993 A
5251127 Raab Oct 1993 A
5251635 Dumoulin et al. Oct 1993 A
5253647 Takahashi et al. Oct 1993 A
5255680 Darrow et al. Oct 1993 A
5257636 White Nov 1993 A
5265610 Darrow et al. Nov 1993 A
5265611 Hoenig et al. Nov 1993 A
5269759 Hernandez et al. Dec 1993 A
5271400 Dumoulin et al. Dec 1993 A
5273025 Sakiyama et al. Dec 1993 A
5274551 Corby, Jr. Dec 1993 A
5279309 Taylor et al. Jan 1994 A
5291199 Overman et al. Mar 1994 A
5295483 Nowacki et al. Mar 1994 A
5297549 Beatty et al. Mar 1994 A
5299254 Dancer et al. Mar 1994 A
5299288 Glassman et al. Mar 1994 A
5305091 Gelbart et al. Apr 1994 A
5305203 Raab Apr 1994 A
5309913 Kormos et al. May 1994 A
5315630 Sturm et al. May 1994 A
5316024 Hirschi et al. May 1994 A
5318025 Dumoulin et al. Jun 1994 A
5320111 Livingston Jun 1994 A
5325728 Zimmerman et al. Jul 1994 A
5325873 Hirschi et al. Jul 1994 A
5329944 Fabian et al. Jul 1994 A
5333168 Fernandes et al. Jul 1994 A
5353795 Souza et al. Oct 1994 A
5353800 Pohndorf et al. Oct 1994 A
5353807 DeMarco Oct 1994 A
5368030 Zinreich et al. Nov 1994 A
5375596 Twiss et al. Dec 1994 A
5377678 Dumoulin et al. Jan 1995 A
5383454 Bucholz Jan 1995 A
5385146 Goldreyer Jan 1995 A
5385148 Lesh et al. Jan 1995 A
5386828 Owens et al. Feb 1995 A
5389101 Heilbrun et al. Feb 1995 A
5391199 Ben-Haim Feb 1995 A
5394457 Leibinger et al. Feb 1995 A
5397329 Allen Mar 1995 A
5399146 Nowacki et al. Mar 1995 A
5400384 Fernandes et al. Mar 1995 A
5402801 Taylor Apr 1995 A
5403321 DiMarco Apr 1995 A
5408409 Glassman et al. Apr 1995 A
5417210 Funda et al. May 1995 A
5419325 Dumoulin et al. May 1995 A
5423334 Jordan Jun 1995 A
5425367 Shapiro et al. Jun 1995 A
5425382 Golden et al. Jun 1995 A
5426683 O'Farrell, Jr. et al. Jun 1995 A
5426687 Goodall et al. Jun 1995 A
5427097 Depp Jun 1995 A
5429132 Guy et al. Jul 1995 A
5433198 Desai Jul 1995 A
RE35025 Anderton Aug 1995 E
5437277 Dumoulin et al. Aug 1995 A
5443066 Dumoulin et al. Aug 1995 A
5443489 Ben-Haim Aug 1995 A
5444756 Pai et al. Aug 1995 A
5445144 Wodicka et al. Aug 1995 A
5445150 Dumoulin et al. Aug 1995 A
5445166 Taylor Aug 1995 A
5446548 Gerig et al. Aug 1995 A
5447154 Cinquin et al. Sep 1995 A
5448610 Yamamoto et al. Sep 1995 A
5453686 Anderson Sep 1995 A
5456718 Szymaitis Oct 1995 A
5458718 Venkitachalam Oct 1995 A
5464446 Dreessen et al. Nov 1995 A
5478341 Cook et al. Dec 1995 A
5478343 Ritter Dec 1995 A
5480422 Ben-Haim Jan 1996 A
5483961 Kelly et al. Jan 1996 A
5485849 Panescu et al. Jan 1996 A
5487391 Panescu Jan 1996 A
5487729 Avellanet et al. Jan 1996 A
5487757 Truckai et al. Jan 1996 A
5490196 Rudich et al. Feb 1996 A
5494034 Schlondorff et al. Feb 1996 A
5503416 Aoki et al. Apr 1996 A
5513637 Twiss et al. May 1996 A
5515160 Schulz et al. May 1996 A
5517990 Kalfas et al. May 1996 A
5531227 Schneider Jul 1996 A
5531520 Grimson et al. Jul 1996 A
5542938 Avellanet et al. Aug 1996 A
5543951 Moehrmann Aug 1996 A
5546940 Panescu et al. Aug 1996 A
5546949 Frazin et al. Aug 1996 A
5546951 Ben-Haim Aug 1996 A
5551429 Fitzpatrick et al. Sep 1996 A
5558091 Acker et al. Sep 1996 A
5568809 Ben-haim Oct 1996 A
5572999 Funda et al. Nov 1996 A
5573533 Strul Nov 1996 A
5575794 Walus et al. Nov 1996 A
5583909 Hanover Dec 1996 A
5588430 Bova et al. Dec 1996 A
5592939 Martinelli Jan 1997 A
5595193 Walus et al. Jan 1997 A
5596228 Anderton et al. Jan 1997 A
5600330 Blood Feb 1997 A
5603318 Heilbrun et al. Feb 1997 A
5617462 Spratt Apr 1997 A
5617857 Chader et al. Apr 1997 A
5619261 Anderton Apr 1997 A
5622169 Golden et al. Apr 1997 A
5622170 Schulz Apr 1997 A
5627873 Hanover et al. May 1997 A
5628315 Vilsmeier et al. May 1997 A
5630431 Taylor May 1997 A
5636644 Hart et al. Jun 1997 A
5638819 Manwaring et al. Jun 1997 A
5640170 Anderson Jun 1997 A
5642395 Anderton et al. Jun 1997 A
5643268 Vilsmeier et al. Jul 1997 A
5645065 Shapiro et al. Jul 1997 A
5647361 Damadian Jul 1997 A
5662111 Cosman Sep 1997 A
5664001 Tachibana et al. Sep 1997 A
5674296 Bryan et al. Oct 1997 A
5676673 Ferre et al. Oct 1997 A
5681260 Ueda et al. Oct 1997 A
5682886 Delp et al. Nov 1997 A
5690108 Chakeres Nov 1997 A
5694945 Ben-Haim Dec 1997 A
5695500 Taylor et al. Dec 1997 A
5695501 Carol et al. Dec 1997 A
5697377 Wittkampf Dec 1997 A
5702406 Vilsmeier et al. Dec 1997 A
5704897 Truppe Jan 1998 A
5711299 Manwaring et al. Jan 1998 A
5713946 Ben-Haim Feb 1998 A
5715822 Watkins Feb 1998 A
5715836 Kliegis et al. Feb 1998 A
5718241 Ben-Haim et al. Feb 1998 A
5727552 Ryan Mar 1998 A
5727553 Saad Mar 1998 A
5729129 Acker Mar 1998 A
5730129 Darrow et al. Mar 1998 A
5730130 Fitzpatrick et al. Mar 1998 A
5732703 Kalfas et al. Mar 1998 A
5735278 Hoult et al. Apr 1998 A
5738096 Ben-Haim Apr 1998 A
5740267 Echerer et al. Apr 1998 A
5741214 Ouchi et al. Apr 1998 A
5742394 Hansen Apr 1998 A
5744953 Hansen Apr 1998 A
5748767 Raab May 1998 A
5749362 Funda et al. May 1998 A
5749835 Glantz May 1998 A
5752513 Acker et al. May 1998 A
5755725 Druais May 1998 A
RE35816 Schulz Jun 1998 E
5758667 Slettenmark Jun 1998 A
5762064 Polyani Jun 1998 A
5767669 Hansen et al. Jun 1998 A
5769789 Wang et al. Jun 1998 A
5769861 Vilsmeier Jun 1998 A
5772594 Barrick Jun 1998 A
5775322 Silverstein et al. Jul 1998 A
5776050 Chen et al. Jul 1998 A
5776064 Kalfas et al. Jul 1998 A
5782765 Jonkman Jul 1998 A
5787886 Kelly et al. Aug 1998 A
5792055 McKinnon Aug 1998 A
5795294 Luber et al. Aug 1998 A
5797849 Vesely et al. Aug 1998 A
5799055 Peshkin et al. Aug 1998 A
5799099 Wang et al. Aug 1998 A
5800352 Ferre et al. Sep 1998 A
5800535 Howard, III Sep 1998 A
5802719 O'Farrell, Jr. et al. Sep 1998 A
5803089 Ferre et al. Sep 1998 A
5807252 Hassfeld et al. Sep 1998 A
5810728 Kuhn Sep 1998 A
5810735 Halperin et al. Sep 1998 A
5823192 Kalend et al. Oct 1998 A
5823958 Truppe Oct 1998 A
5828725 Levinson Oct 1998 A
5829444 Ferre et al. Nov 1998 A
5831260 Hansen Nov 1998 A
5833608 Acker Nov 1998 A
5834759 Glossop Nov 1998 A
5836954 Heilbrun et al. Nov 1998 A
5840024 Taniguchi et al. Nov 1998 A
5840025 Ben-Haim Nov 1998 A
5843076 Webster, Jr. et al. Dec 1998 A
5848967 Cosman Dec 1998 A
5851183 Bucholz Dec 1998 A
5865846 Bryan et al. Feb 1999 A
5868674 Glowinski et al. Feb 1999 A
5868675 Henrion et al. Feb 1999 A
5871445 Bucholz Feb 1999 A
5871455 Ueno Feb 1999 A
5871487 Warner et al. Feb 1999 A
5873822 Ferre et al. Feb 1999 A
5884410 Prinz Mar 1999 A
5891034 Bucholz Apr 1999 A
5891157 Day et al. Apr 1999 A
5904691 Barnett et al. May 1999 A
5907395 Schulz et al. May 1999 A
5913820 Bladen et al. Jun 1999 A
5920395 Schulz Jul 1999 A
5921992 Costales et al. Jul 1999 A
5923727 Navab Jul 1999 A
5928248 Acker Jul 1999 A
5938603 Ponzi Aug 1999 A
5938694 Jaraczewski et al. Aug 1999 A
5947981 Cosman Sep 1999 A
5950629 Taylor et al. Sep 1999 A
5951475 Gueziec et al. Sep 1999 A
5954647 Bova et al. Sep 1999 A
5964796 Imran Oct 1999 A
5967980 Ferre et al. Oct 1999 A
5968047 Reed Oct 1999 A
5971997 Guthrie et al. Oct 1999 A
5976156 Taylor et al. Nov 1999 A
5980535 Barnett et al. Nov 1999 A
5983126 Wittkampf Nov 1999 A
5987349 Schulz Nov 1999 A
5987960 Messner et al. Nov 1999 A
5999837 Messner et al. Dec 1999 A
5999840 Grimson et al. Dec 1999 A
6001130 Bryan et al. Dec 1999 A
6006126 Cosman Dec 1999 A
6016439 Acker Jan 2000 A
6019725 Vesely et al. Feb 2000 A
6024695 Taylor et al. Feb 2000 A
6050724 Schmitz et al. Apr 2000 A
6059718 Taniguchi et al. May 2000 A
6063022 Ben-Haim May 2000 A
6073043 Schneider Jun 2000 A
6104944 Martinelli Aug 2000 A
6112113 Van Der Brug et al. Aug 2000 A
6118845 Simon et al. Sep 2000 A
6122538 Sliwa, Jr. et al. Sep 2000 A
6131396 Duerr et al. Oct 2000 A
6135946 Konen et al. Oct 2000 A
6139183 Graumann Oct 2000 A
6144874 Du Nov 2000 A
6149592 Yanof et al. Nov 2000 A
6156067 Bryan et al. Dec 2000 A
6161032 Acker Dec 2000 A
6167296 Shahidi Dec 2000 A
6172499 Ashe Jan 2001 B1
6175756 Ferre et al. Jan 2001 B1
6216026 Kuhn et al. Apr 2001 B1
6223067 Vilsmeier et al. Apr 2001 B1
6226543 Gilboa et al. May 2001 B1
6233476 Strommer et al. May 2001 B1
6236205 Ludeke et al. May 2001 B1
6246231 Ashe Jun 2001 B1
6252599 Natsuko et al. Jun 2001 B1
6273896 Franck et al. Aug 2001 B1
6298262 Franck et al. Oct 2001 B1
6332089 Acker et al. Dec 2001 B1
6341231 Ferre et al. Jan 2002 B1
6346940 Fukunaga Feb 2002 B1
6351659 Vilsmeier Feb 2002 B1
6369812 Iyriboz et al. Apr 2002 B1
6374134 Bladen et al. Apr 2002 B1
6380958 Guendel et al. Apr 2002 B1
6381485 Hunter et al. Apr 2002 B1
6424856 Vilsmeier et al. Jul 2002 B1
6428547 Vilsmeier et al. Aug 2002 B1
6434415 Foley et al. Aug 2002 B1
6437567 Schenck et al. Aug 2002 B1
6442417 Shahidi et al. Aug 2002 B1
6445943 Ferre et al. Sep 2002 B1
6456735 Sato et al. Sep 2002 B1
6470207 Simon et al. Oct 2002 B1
6471653 Jordfald et al. Oct 2002 B1
6474341 Hunter et al. Nov 2002 B1
6478743 Jordfald et al. Nov 2002 B1
6484049 Seeley et al. Nov 2002 B1
6490475 Seeley et al. Dec 2002 B1
6493573 Martinelli et al. Dec 2002 B1
6494843 Edwardsen et al. Dec 2002 B2
6496188 Deschamps et al. Dec 2002 B1
6498944 Ben-Haim et al. Dec 2002 B1
6499488 Hunter et al. Dec 2002 B1
6511417 Taniguchi et al. Jan 2003 B1
6511418 Shahidi et al. Jan 2003 B2
6517478 Khadem Feb 2003 B2
6522324 Bosma et al. Feb 2003 B1
6522907 Bladen et al. Feb 2003 B1
6527443 Vilsmeier et al. Mar 2003 B1
6529758 Shahidi Mar 2003 B2
6546278 Walsh Apr 2003 B2
6547739 Jordfald et al. Apr 2003 B2
6551325 Neubauer et al. Apr 2003 B2
6556695 Packer et al. Apr 2003 B1
6584174 Schubert et al. Jun 2003 B2
6609022 Vilsmeier et al. Aug 2003 B2
6611700 Vilsmeier et al. Aug 2003 B1
6636575 Ott Oct 2003 B1
6640128 Vilsmeier et al. Oct 2003 B2
6694162 Hartlep Feb 2004 B2
6701179 Martinelli et al. Mar 2004 B1
20010007919 Shahidi Jul 2001 A1
20010016684 Shahidi Aug 2001 A1
20010025183 Shahidi Sep 2001 A1
20010029333 Shahidi Oct 2001 A1
20010031919 Stommer et al. Oct 2001 A1
20010037064 Shahidi Nov 2001 A1
20020010384 Shahidi et al. Jan 2002 A1
20020049375 Strommer et al. Apr 2002 A1
20020055674 Fenster et al. May 2002 A1
20020075994 Shahidi et al. Jun 2002 A1
20020077543 Grzeszczuk et al. Jun 2002 A1
20020077544 Shahidi Jun 2002 A1
20030032878 Shahidi Feb 2003 A1
20030083567 Deschamps May 2003 A1
Foreign Referenced Citations (32)
Number Date Country
964149 Mar 1975 CA
3042343 Jun 1982 DE
3831278 Mar 1989 DE
4233978 Apr 1994 DE
10085137 Nov 2002 DE
0319844 Jan 1988 EP
0419729 Sep 1989 EP
0350996 Jan 1990 EP
0651968 Aug 1990 EP
0581704 Jul 1993 EP
0655138 Aug 1993 EP
0894473 Jan 1995 EP
2417970 Feb 1979 FR
2765738 Jun 1998 JP
WO 8809151 Dec 1988 WO
WO 9103982 Apr 1991 WO
WO 9104711 Apr 1991 WO
WO 9107726 May 1991 WO
WO 9203090 Mar 1992 WO
WO 9206645 Apr 1992 WO
WO 9404938 Mar 1994 WO
WO 9423647 Oct 1994 WO
WO 9424933 Nov 1994 WO
WO 9611624 Apr 1996 WO
WO 9808554 Mar 1998 WO
WO 9838908 Sep 1998 WO
WO 9900052 Jan 1999 WO
WO 8905123 Jun 1999 WO
WO 9938449 Aug 1999 WO
WO 9960939 Dec 1999 WO
WO 0130437 May 2001 WO
WO 0137748 May 2001 WO
Related Publications (1)
Number Date Country
20040034300 A1 Feb 2004 US