System and method for image based sensor calibration

Information

  • Patent Grant
  • Patent Number
    7,085,400
  • Date Filed
    Wednesday, June 14, 2000
  • Date Issued
    Tuesday, August 1, 2006
Abstract
Apparatus and methods are disclosed for the calibration of a tracked imaging probe for use in image-guided surgical systems. The invention uses actual image data collected from an easily constructed calibration jig to provide data for the calibration algorithm. The calibration algorithm analytically develops a geometric relationship between the probe and the image so objects appearing in the collected image can be accurately described with reference to the probe. The invention can be used with either two or three dimensional image data-sets. The invention also has the ability to automatically determine the image scale factor when two dimensional data-sets are used.
Description
BACKGROUND OF THE INVENTION

1. Field of Invention


The present invention is directed generally to image guided medical systems, and more particularly, to systems and methods for utilizing data collected from imaging sensors to calibrate a tracking device.


2. Description of the Related Art


Image guided surgical techniques have been used with success in aiding physicians in performing a wide variety of delicate surgical procedures. These systems are typically used when a surgical tool could be obscured by a patient's anatomy, or when the surgical tool is visible but the patient's anatomy is difficult to visualize.


In order for these systems to be effective, points of the patient's anatomy appearing in the image must be accurately correlated to the instrument being tracked by the surgical navigation system. Accomplishing this correlation requires the accurate calibration of the tracking device. Ultimately, the calibration problem involves determining the position and orientation (POSE) of a set of points displayed by the image plane in the space defined by the tracking markers of the tracked instrument. It can be assumed that the structure associated with the tracking markers and image plane is a rigid body, so once the POSE is determined it remains constant. However, it is not possible to physically measure the POSE of the points in the image plane.


In addition to being robust and accurate, a preferred calibration scheme must be an uncomplicated procedure which can be performed quickly in the field by minimally trained personnel.


SUMMARY OF THE INVENTION

The present invention is directed generally to image guided medical systems and, more particularly, to systems which correlate tracked instrument positions to image data obtained from a patient. More specifically, the present invention is directed to a device and method for registering tracking device outputs with image data.


To achieve these objects and other advantages and in accordance with the purpose of the invention, as embodied and broadly described herein, the invention is directed to an apparatus and method for determining the calibration of a tracking device using image data collected by a tracked probe.


In one aspect of the invention, a surgical navigation system performs the tracking of a position of a calibration jig and a position of a probe. An imaging system performs the collection of a plurality of images of at least one calibration pattern contained in the calibration jig. After receiving the images from the imaging system, the navigation system locates the centers of intersection points in image space associated with each calibration pattern for each of the plurality of images, and extracts a calibration point therefrom. The navigation system then determines the three-dimensional position for at least one calibration point in probe space for each of the plurality of images. Furthermore, the navigation system then relates the positions of the calibration points in image space and the positions of the calibration points in probe space with a coordinate transform. Using this coordinate transform, the navigation system computes a reference position of the image in probe space and stores this reference position. Once the coordinate transform and image reference position in probe space are obtained, the calibration process is complete.


In another aspect of the invention, a surgical navigation system performs the tracking of a position of a calibration jig and a position of a probe. A three-dimensional imaging system performs the collection of a volumetric image of at least one calibration pattern contained in the calibration jig. After receiving the volumetric image from the imaging system, the navigation system extracts two-dimensional slices from the volumetric image and locates centers of intersection points in slice space associated with each calibration pattern for each of the plurality of slices, and extracts a calibration point therefrom. The navigation system then determines the three-dimensional position for at least one calibration point in probe space for each of the plurality of slices. Furthermore, the navigation system then relates the positions of the calibration points in slice space and the positions of the calibration points in probe space with a coordinate transform. Using this coordinate transform, the navigation system computes and stores a reference position of the volumetric image in probe space. Once the coordinate transform and volumetric image reference position in probe space are obtained, the calibration process is complete.


The invention allows for accurate, free-hand calibration of a tracked instrument which can be performed by minimally trained personnel. The calibration jig may be a sterile object and used within an operating room prior to the performance of a surgical procedure. Furthermore, problems associated with the calibration jig or the imaging system can easily be detected by inspection of the images as the calibration is being performed.


It is to be understood that both the foregoing general description and the following detailed description are exemplary only and are not intended to be restrictive of the invention.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.



FIG. 1 is a simplified side view of an embodiment of a system for the calibration of a probe in accordance with the present invention.



FIG. 2a is a perspective view of an embodiment of a calibration jig in accordance with the present invention.



FIG. 2b depicts a top view of the calibration jig shown in FIG. 2a.



FIG. 3a is a perspective view of an ultrasound probe imaging a jig containing a point target.



FIG. 3b is a top view of the jig shown in FIG. 3a illustrating the elevation imaging problem due to a non-ideal imaging plane.



FIG. 4 represents an image of the calibration jig in FIGS. 2a,b formed by an ultrasonic imaging system.



FIG. 5 is a top view showing how the imaging plane intersects an embodiment of a calibration pattern.



FIG. 6 illustrates an image of the calibration pattern of FIG. 5 showing the intersection points in the image.



FIG. 7 is a perspective view of an embodiment of a calibration jig showing different planes intersecting a pattern at differing geometries, and the resulting images corresponding to each geometry.



FIG. 8 is a flow chart illustrating methods and systems in accordance with the present invention.



FIG. 9 shows a set of collected images, each containing calibration points lying in a different part of the image plane, and their relationship to the probe space reference.



FIG. 10 is a flow chart illustrating the steps of determining a relationship between image space and probe space using two dimensional images.



FIG. 11 illustrates an exemplary three-dimensional image of an embodiment of a calibration pattern and its corresponding two dimensional slices of the volumetric image.



FIG. 12 is a flow chart illustrating methods consistent with the steps of calibrating a probe using a three dimensional image.



FIG. 13 is a block diagram of an exemplary computer system for use with the present invention.





DESCRIPTION OF THE PREFERRED EMBODIMENTS

Reference will now be made in detail to the present preferred embodiments of the invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.



FIG. 1 illustrates an exemplary calibration system and method 100 consistent with the present invention. Stationary calibration jig 110, supported by platform 115, is scanned by technician 105, who manipulates probe 130. Probe 130 comprises a transducer which transmits signals in the direction of jig 110. Signals reflected from jig 110 can then be received by probe 130 and fed by cable to imaging system 120. Imaging system 120 processes the received signals and forms images which can be displayed on monitor 140. In the preferred embodiment, imaging system 120 is a standard ultrasonic imaging system; however, it should be appreciated that other types of imaging systems, such as microwave, X-ray, or optical, could also be used with the present invention. Probe 130 typically is tracked and, for example, has a plurality of tracking markers 160 or other trackable features attached at its distal end. Additionally, calibration jig 110 can also have a plurality of tracking markers 125 attached to its surface. In the preferred embodiment, markers 125 can be attached to the periphery of the upper surface of calibration jig 110.


Tracking markers 125 and 160 can include, by way of example only, reflectors/emitters operating in the optical, infrared, electromagnetic, and/or acoustic domains, and/or other suitable devices known in the art. For example, tracking markers such as those supplied by Northern Digital Incorporated may be used in conjunction with the present invention. It should be noted that the jig calibration markers 125 do not have to be identical to the probe calibration markers 160.


The tracking markers can be used to localize probe 130 and calibration jig 110. Localization is the process of determining the position and orientation of an object of interest and tracking movement of the object over some period of observation.


The manner in which sensor 170 tracks the positions of calibration jig 110 and probe 130 is well known in the art and is therefore only described generally. Sensor 170 comprises a detector array 175 which can be used to detect energy from tracking markers 125 and 160. In the preferred embodiment, the array is a set of CCD cameras which sense infrared energy. However, other sensors may be used which operate at acoustic, electromagnetic, optical, radiological, and/or other frequencies. For example, sensor array 175 is located and suspended by a mount in such a manner as to provide a line of sight between the array and tracking markers 125 and 160. Signals from sensor 170 are coupled into computer 150, which processes the received data to determine the positions of the markers and, consequently, the position of the object attached thereto. Based on the relative positions of the markers as sensed by detector array 175, the positions of objects can be determined and representations of the objects can be displayed on monitor 180. The tracking technology employed in the present invention may be the same as that used in the STEALTH STATION® Treatment Guidance Platform available from Medtronic Sofamor Danek, Inc.


Surgical navigation systems which perform localization functions to assist in medical procedures are well established. Such systems are disclosed, for example, in PCT Application No. PCT/US95/12894 (Publication No. WO 96/11624) to Bucholz, the entire disclosure of which is incorporated by reference.


Referring further to FIG. 1, imaging machine 120 is coupled to computer 150 through a suitable link or connection 190. Connection 190 may be, for example, a device specific digital interface or a generic video output signal. Image data from this connection may be used by computer 150 to perform the calculations required to calibrate the tracking of probe 130. Images taken by probe 130 have specific characteristics due to the construction of calibration jig 110. These characteristics, which are geometric in nature and will be described in detail later, can be used to determine the position of points appearing in the image in a coordinate system referenced to calibration jig 110. The space defined by the coordinate system referenced to jig 110 is termed jig space for purposes of this document. Since the surgical navigation system is tracking the calibration jig, navigation system computer 150 can also determine the positions of the calibration points in a coordinate system referenced to the probe, defined as probe space for purposes of this document. Positions of these same calibration points are also measured in the image coordinate system, or image space, typically in pixels. By utilizing the positions of the same calibration points described in both image space and probe space, a relationship between the spaces can be derived. Once this relationship is determined, any pixel in the image can be accurately described in probe space, and thus the calibration will be complete.
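
By way of illustration only, the following minimal sketch (in Python with numpy; the function name, the 4×4 homogeneous transform, and the scale parameter are assumptions for this example, not elements recited in the disclosure) shows how a completed calibration carries a pixel location into probe space:

```python
import numpy as np

def pixel_to_probe(pixel_uv, scale_mm_per_px, T_image_to_probe):
    """Map a 2-D pixel location into 3-D probe space.

    pixel_uv         : (u, v) pixel coordinates in image space
    scale_mm_per_px  : image scale factor recovered by the calibration
    T_image_to_probe : 4x4 homogeneous transform from the metric image
                       plane (taken as z = 0) to probe space
    """
    u, v = pixel_uv
    # Scale pixels to millimeters; the image lies in its own z = 0 plane.
    p_image = np.array([u * scale_mm_per_px, v * scale_mm_per_px, 0.0, 1.0])
    p_probe = T_image_to_probe @ p_image
    return p_probe[:3]
```

The scale factor converts pixels to physical units, and the homogeneous transform encodes the position and orientation of the image plane in probe space; recovering both quantities is the object of the procedure described below.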


Furthermore, although FIG. 1 shows a single computer 150 performing the localization and calibration functions, the functions performed by computer 150 may instead be distributed across multiple computers. Although a STEALTH STATION® image guided system manufactured by Medtronic Sofamor Danek has been identified, it will be appreciated that the present invention may be utilized with other types of computer systems. In addition, even though FIG. 1 shows only one display 180 coupled to computer 150, multiple displays of various types known in the art may be coupled to computer 150 in order to provide information to the user of the system.



FIG. 2a shows a perspective view of calibration jig 110. In the preferred embodiment, a plurality of tracking markers 125 can be attached around the perimeter of the jig's upper surface so the jig may be localized by the surgical navigation system. The calibration jig contains a structure which includes a calibration pattern. In general, the calibration pattern may include a plurality of parallel members joined by a common member. The geometric relationship between the common member and the parallel members is preferably known and well controlled. The calibration pattern in its most basic form can include a set of wires 220 with a diameter comparable to the wavelength of the radiation from the imaging device. The set of wires may be arranged in a “Z” pattern as shown in FIG. 2a. The mounting points of the two parallel wires within the “Z” pattern are preferably known in calibration jig space at least to the precision desired by the calibration process. While only one “Z” pattern is shown in FIG. 2a and in subsequent figures for purposes of clarification, it should be appreciated that a plurality of “Z” patterns may be contained within calibration jig 110. It should be understood that other suitable patterns can also be used.


For the preferred embodiment, wires 220 may be constructed of nylon and are submerged in a suitable homogeneous imaging medium 210 having a known value for the velocity of propagation of the ultrasound wave emitted by the probe. Such a medium, for example, may be water, ultrasound gel, or some other substance which approximates wave propagation through a biological organism. Images of the “Z” pattern are preferably collected with the imager held perpendicular to the plane formed by the “Z” pattern.



FIG. 2b is a view as shown from line A′–A of the perspective drawing of FIG. 2a. The wires of “Z” pattern 220, submerged in imaging medium 210, are mounted in accurately known locations on the inner wall of jig 110. Tracking markers 125 are shown in this embodiment as surrounding the upper surface of jig 110.


Performing accurate image-based calibration using small point-target like structures, such as a needle tip, can be difficult due to elevation imaging effects. FIG. 3a depicts such a situation, where an ultrasound probe 330 is imaging a point target 320. Ideally, no image would be collected from point target 320 until true imaging plane 340 intersects it. The true imaging plane has a negligible thickness and is used as the reference plane for the calibration process. In practice, elevation imaging effects can create an image of point target 320 even though the true plane of the ultrasound image may be some distance from the target. FIG. 3b, taken from the perspective of line A′–A, shows this situation. True imaging plane 340 is located at some distance away from point target 320; however, due to the elevation effect, point 320 lies within collection plane 350. Collection plane 350 is centered on true plane 340 and has some finite thickness. As shown in FIG. 4, the resulting image 470 collected from jig 300 will display a representation 460 of point target 320, even though ideal plane 340 is at a distance greater than a resolution cell away from point target 320. The elevation effect results in some level of uncertainty when attempting a precise operation such as calibration. To compensate for this uncertainty, a realistic model of ultrasound imaging preferably considers the image plane to have a non-negligible thickness. Calibration pattern 220 compensates for the elevation imaging problem by providing a line target to produce points which can be used to calibrate the probe.



FIG. 5 shows a top view of “Z” pattern 220 being intersected by both the true imaging plane 500 and the collection plane 510 of the imaging system. True imaging plane 500, represented in FIG. 5 as line B′–B, has a negligible thickness. Objects captured within this plane will have no elevation errors associated with their location in the image. Collection plane 510, however, has a finite thickness and objects captured within this plane can appear as if they were imaged at the true imaging plane 500, thus creating a position error for these objects in the output image. The “Z” jig compensates for these errors by providing a line target for the calibration points. The line targets of the “Z” pattern 220 pass through the entire thickness of collection plane 510. The resulting image will show an integration of the energy reflected by the wire as it passed through the entire elevation dimension of the collection plane. Shaded portions 530a–b and 532, which can be interpreted as projections of the wire onto the true imaging plane 500, represent the image of the wire in the output image.



FIG. 6 depicts an example of an output image 600 collected from the “Z” pattern 220. The image is taken from the viewing perspective B′–B shown in FIG. 5. Each oval 630a–b and 632 in image 600 represents an intersection of the wire, 530a–b and 532, respectively, with the entire width of collection plane 510. For the calibration problem, the points of interest are where the wires actually intersect the true imaging plane 500; these intersections lie at the centers 635a–b and 637 of the ovals 630a–b and 632, respectively. The vertical coordinates of these centers correspond to the centers of the wires 220 in the vertical dimension of image 600. The centers may either be selected manually by the user or automatically by the computer. One advantage of utilizing the “Z” pattern is that in producing centers 635a–b and 637 in image 600, the intersection of the wire with true imaging plane 500 is accurately determined, and therefore errors due to the elevation effect are avoided. In order to properly perform the calibration, the coordinates of identical points are preferably found in image space and in probe space. The image point which is used to perform the calibration is center 637 of middle oval 632. These positions, taken over many different images, form a set of points defined as calibration points.
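
Where the centers are located automatically, one possible approach is sketched below (by way of example only, in Python with numpy and scipy; the disclosure does not prescribe a particular detection algorithm, so the threshold heuristic and function name are assumptions):

```python
import numpy as np
from scipy import ndimage

def find_wire_centers(image, threshold=None):
    """Locate the centers of the bright wire cross-sections (the ovals)
    in one ultrasound image of the "Z" pattern.

    Returns the three (row, col) centroids sorted left to right, so the
    middle entry corresponds to the diagonal wire.
    """
    if threshold is None:
        # Simple bright-blob gate; a real system would tune or learn this.
        threshold = image.mean() + 3.0 * image.std()
    mask = image > threshold
    labels, n = ndimage.label(mask)
    # Intensity-weighted centroids approximate the wire/true-plane
    # intersections at the centers of the imaged ovals.
    centers = ndimage.center_of_mass(image, labels, range(1, n + 1))
    return sorted(centers, key=lambda rc: rc[1])  # sort by column
```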



FIG. 7 illustrates the technique of determining the three-dimensional coordinates of calibration points in the calibration jig coordinate system, or jig space, based upon the two outside centers and the middle center. Coordinate system 720 is the reference used to localize points in jig space. The X-Y plane of the coordinate system is the inside wall 702 of jig 110 and the Z-axis lies along the lower inside edge, as shown. First image 730 is the result of placing true imaging plane 700 close to the origin of coordinate system 720. Due to the geometry of the Z pattern, imaging plane 700 intersects diagonal wire 707 of the Z pattern closer to the left wire at point 706. As a result, the center 734 lies closer to the left side of image 730. Second image 740 is the result of placing true imaging plane 710 further from the origin of coordinate system 720. In this instance, imaging plane 710 intersects the diagonal wire 707 of the “Z” pattern closer to the right wire at point 708. The resulting image 740 shows the middle center 744 lying closer to the right side of the image. These two examples shown in FIG. 7 illustrate that the lateral position of the middle center in the output image can be directly correlated to the Z coordinate in jig space of the diagonal wire 707 where it intersects the true imaging plane. The X and Y coordinates of the calibration point in jig space follow from the locations of the two parallel wires, which are precisely known and remain constant for any value of Z. The technique of determining the Z coordinate is preferably based on the Brown-Roberts-Wells (BRW) method used in some Computed Tomography applications. Essentially, it computes the ratio of the distances from one of the outside points to the center point and to the remaining outside point in order to compute the location in jig space of the calibration point. This technique is well known in the art and is described in the paper “Coordinate Transformation and Calculation of the Angular Depth Parameters for a Stereotactic System,” Medical Physics, Vol. 14, No. 6, November/December 1987 by Chen B. Saw, et al., which is incorporated by reference.
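
By way of example only, a minimal numpy sketch of this ratio computation follows; the helper name and the assumption that the diagonal wire runs linearly from one parallel wire to the other are illustrative, not limitations of the disclosure:

```python
import numpy as np

def calibration_point_in_jig_space(c_left, c_mid, c_right,
                                   p_left, p_right, z_span):
    """Recover the jig-space position of the diagonal-wire intersection
    (the calibration point) from the three oval centers of one image.

    c_left, c_mid, c_right : (row, col) image centers, left to right
    p_left, p_right        : known (x, y) jig-space mounts of the two
                             parallel wires (constant for any z)
    z_span                 : (z0, z1) jig-space z where the diagonal
                             meets the left and right parallel wires
    """
    c_left, c_mid, c_right = map(np.asarray, (c_left, c_mid, c_right))
    # BRW-style distance ratio: the fractional position of the middle
    # center between the outer centers equals the fractional position
    # of the imaging plane along the diagonal wire.
    ratio = np.linalg.norm(c_mid - c_left) / np.linalg.norm(c_right - c_left)
    z = z_span[0] + ratio * (z_span[1] - z_span[0])
    # For a straight diagonal, x and y interpolate between the two
    # parallel-wire mounts at the same ratio.
    x, y = (1.0 - ratio) * np.asarray(p_left) + ratio * np.asarray(p_right)
    return np.array([x, y, z])
```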


Another advantage of employing “Z” pattern 220 over other calibration techniques is the elimination of one axis of movement necessary to position images for calibration. Motion of the imager along the long axis of the “Z” results in the middle center moving laterally in the output image. Thus an automated calibration system would only need to move the imager along the long axis of a “Z” pattern while remaining perpendicular to it. Since the calibration point is computed based on the image, preferably, there are no predefined positions for placing the imaging plane within the calibration jig.


Referring to FIG. 8, the processes or steps associated with the calibration procedure are illustrated at 800. Initially, calibration jig 110 and probe 130 are tracked using the surgical tracking system throughout the entire calibration procedure in step 810. An initial image of calibration jig 110 is collected (step 820). From this initial image, the orientation of the intersection points is determined. This could be an automated process carried out by computer 150. However, it is also possible for the user to determine this orientation and provide this information to the computer manually via a keyboard or through a graphical user interface (step 830). In step 840, the center points of each intersection in the initial image are determined, and their pixel locations are recorded as image space coordinates. Again, this step may be performed manually by the user or could be carried out automatically by computer 150. In step 850, the Z coordinate of the calibration point is computed in jig space by determining the ratio of the distance between the centers of the two outside intersection points and the distance between the center of one outside intersection point and the center of the middle intersection point. From analysis of these ratios, the position of the calibration (center) point in jig space can be calculated. In step 860, the calibration point is transformed from jig space to probe space. This transformation is readily calculated by computer 150 since both jig 110 and probe 130 positions are known in detector space. Once coordinates of the calibration point are known in both image space and probe space, an initial estimate of the transform relating the two spaces and the scale factor between them is made (step 870). Afterwards, points associated with several hundred more images, for example, are calculated, and steps 840–870 are repeated on a large set of calibration points to refine the transform and scale factor which relate image space to probe space (step 880). After the transform and scale factors are accurately known, the origin of the image in probe space coordinates is defined, which completes the calibration procedure (step 890).
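
For step 860, the coordinate chaining involved is illustrated by the following sketch (Python with numpy; the 4×4 pose names are assumptions for this example, with the tracking system reporting both the jig pose and the probe pose in detector space):

```python
import numpy as np

def jig_to_probe(p_jig, T_detector_from_jig, T_detector_from_probe):
    """Re-express a jig-space calibration point in probe space.

    Both 4x4 poses come from the tracking system.  Probe space is
    reached by chaining jig -> detector -> probe.
    """
    p = np.append(np.asarray(p_jig, dtype=float), 1.0)  # homogeneous point
    p_detector = T_detector_from_jig @ p
    p_probe = np.linalg.inv(T_detector_from_probe) @ p_detector
    return p_probe[:3]
```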


The goal of the calibration process is to be able to relate points described by pixel locations in image space to positions described in three-dimensional probe space. In order to accomplish this, a mathematical transform between the two spaces, or coordinate systems, is preferably determined. FIG. 9 depicts an exemplary three-dimensional coordinate system, 920, which is referenced to probe 130 and used to describe points in probe space. Coordinate system 930 is a two dimensional coordinate system which is used to describe the location of points within the images. During the calibration process, a set of images 910 is collected. Each image within the set represents calibration pattern 220 as sensed by probe 130 as the probe is moved along the calibration jig's longitudinal dimension (Z axis of coordinate system 720 shown in FIG. 7). The number of images, N, within the set can vary, but typically several hundred are used to perform the calibration. Each of the images within set 910 contains at least one calibration point; however, for purposes of clarity, only one calibration point per image is shown in FIG. 9. As described earlier, the centers of calibration points 900a–n are identified in each image, resulting in at least several hundred image points whose locations in jig space can be calculated. Once this is accomplished, calibration points 900a–n can readily be transformed to probe space referenced by coordinate system 920. The final stage in the calibration process is to utilize the points 900a–n described in both image space and probe space to derive a coordinate transform and scale factor between the two spaces. The computation of this transform is described in detail below.
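
Pulling these stages together, the following driver is offered as an illustrative sketch only; it ties together the hypothetical helpers sketched in this description, including the absolute-orientation solver sketched after the next paragraph, and shows one possible arrangement rather than a required implementation:

```python
import numpy as np

def calibrate(images, jig_geometry, poses):
    """Assemble matched point sets from N images and solve for the
    image-space -> probe-space transform.

    images       : iterable of 2-D ultrasound frames of the "Z" pattern
    jig_geometry : (p_left, p_right, z_span) wire mounts in jig space
    poses        : per-image (T_detector_from_jig, T_detector_from_probe)
                   pairs reported by the tracking system
    """
    img_pts, probe_pts = [], []
    for image, (T_dj, T_dp) in zip(images, poses):
        # Assumes exactly three ovals are visible in each frame.
        c_left, c_mid, c_right = find_wire_centers(image)
        # Calibration point in image space: the middle center, as a
        # (u, v, 0) pixel coordinate in the z = 0 image plane.
        img_pts.append([c_mid[1], c_mid[0], 0.0])
        # The same physical point located in jig space, then probe space.
        p_jig = calibration_point_in_jig_space(c_left, c_mid, c_right,
                                               *jig_geometry)
        probe_pts.append(jig_to_probe(p_jig, T_dj, T_dp))
    return absolute_orientation(np.array(img_pts), np.array(probe_pts))
```

Because the image points are left in pixel units here, the single scale returned by the solver plays the role of the image scale factor that the invention determines automatically.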


A technique to derive a transform between two coordinate systems given a set of identical points described in each system is provided in “Closed-form Solution of Absolute Orientation using Unit Quaternions,” Journal of the Optical Society of America, Vol. 4, No. 4, April 1987 by Horn, which is incorporated by reference. FIG. 10 presents a brief description of this method 1000 as it applies to the present invention. Initially, identical points described in both image space and probe space are collected in step 1010. Image space points are then rotated so that they align with points given in probe space (step 1020). After rotation, the centroids of all the points in both probe and image spaces are computed. These values are then used to find the translation between the two systems (step 1030). The image space points are then translated in order to align with probe space points (step 1040). Finally, in step 1050, the scale of the image space points is adjusted to minimize the point-to-point error with the probe space points.
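
A compact numpy sketch of this closed-form solution is given below (by way of example only; variable names are illustrative, and the quaternion construction follows the Horn paper cited above). It returns a scale s, rotation R, and translation t such that probe ≈ s·R·image + t:

```python
import numpy as np

def absolute_orientation(image_pts, probe_pts):
    """Closed-form absolute orientation (Horn, J. Opt. Soc. Am. A, 1987).

    image_pts, probe_pts : (N, 3) arrays of matched calibration points
    Returns (s, R, t) such that probe ~= s * R @ image + t.
    """
    P = np.asarray(image_pts, dtype=float)
    Q = np.asarray(probe_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    Pc, Qc = P - cp, Q - cq                       # centered point sets

    # Cross-covariance S[a, b] = sum_i Pc[i, a] * Qc[i, b]
    S = Pc.T @ Qc
    Sxx, Sxy, Sxz = S[0]
    Syx, Syy, Syz = S[1]
    Szx, Szy, Szz = S[2]
    N = np.array([
        [Sxx + Syy + Szz, Syz - Szy,        Szx - Sxz,        Sxy - Syx],
        [Syz - Szy,       Sxx - Syy - Szz,  Sxy + Syx,        Szx + Sxz],
        [Szx - Sxz,       Sxy + Syx,       -Sxx + Syy - Szz,  Syz + Szy],
        [Sxy - Syx,       Szx + Sxz,        Syz + Szy,       -Sxx - Syy + Szz]])

    # The optimal rotation is the unit quaternion given by the
    # eigenvector of N with the largest eigenvalue.
    w, v = np.linalg.eigh(N)
    qw, qx, qy, qz = v[:, np.argmax(w)]
    R = np.array([
        [1 - 2*(qy*qy + qz*qz), 2*(qx*qy - qw*qz),     2*(qx*qz + qw*qy)],
        [2*(qx*qy + qw*qz),     1 - 2*(qx*qx + qz*qz), 2*(qy*qz - qw*qx)],
        [2*(qx*qz - qw*qy),     2*(qy*qz + qw*qx),     1 - 2*(qx*qx + qy*qy)]])

    # Least-squares scale, then the translation between centroids.
    s = np.sum(Qc * (Pc @ R.T)) / np.sum(Pc * Pc)
    t = cq - s * R @ cp
    return s, R, t
```

The rotation, translation, and scale returned correspond directly to steps 1020 through 1050 of FIG. 10.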


Referring to FIG. 11, another embodiment of the invention allows the calibration of image volumes produced by three-dimensional imaging machines, as well as reconstructed volumes from two-dimensional imaging machines. Instead of an image by image collection of intersection points for the calibration, the entire volume 1100 is processed. By collecting slices of voxels (volume elements, the 3-D counterpart to pixels) associated with the “Z” pattern 1120, the corresponding intersection points 1130 can be computed. Then, by applying the same techniques as previously disclosed herein for the two dimensional calibration, a calibration transform for the volume is computed. The preferred embodiment for a three-dimensional sensor is an ultrasound device; however, other methods of producing three dimensional imagery could also be used within the scope of the present invention.
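
Slice extraction itself reduces to indexing the voxel array; a minimal sketch follows (numpy; the choice of slicing axis, taken here as the long axis of the “Z” pattern, is an assumption of this example):

```python
import numpy as np

def slices_along_pattern(volume, axis=2):
    """Yield 2-D slices of voxels from a volumetric image so that each
    slice can be processed exactly like a 2-D calibration image.

    volume : 3-D voxel array; `axis` is assumed to run along the long
             axis of the "Z" pattern.
    """
    for k in range(volume.shape[axis]):
        yield np.take(volume, k, axis=axis)
```

Each extracted slice can then be fed through the same center-finding and BRW steps described above for two-dimensional images.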


Referring to FIG. 12, steps consistent with the present invention for calibrating three-dimensional images are shown (1200). Initially, the calibration jig 110 and probe 130 are tracked with a surgical navigation system (step 1210). A three-dimensional volumetric image is collected and processed with an imaging system (step 1220). A first slice is extracted from the volumetric image and the orientation of the points (1130) in the slice is determined (steps 1230 and 1240). Steps 1230 and 1240 may be performed by the operator. However, they may also be performed automatically by the computer. In step 1250, the centers of the intersections of the “Z” pattern are determined for the first slice. Afterwards, the coordinates for the middle intersection point, or the calibration point, are determined in jig space using the BRW method described earlier (step 1260). In step 1270, an initial transform is computed which relates probe space and image space. Unlike the two-dimensional case, the scale factor is not computed, since the scale factors are already known. After an initial estimate for the transform is made, the computer will extract several hundred more slices (step 1280), for example, and repeat steps 1250–1270 in order to refine the initial estimate of the transform. After the transform is determined, the origin of the volumetric image is described in probe space, completing the calibration procedure (step 1290).


Referring to FIG. 13, components and modules of a computer system 150 used to perform various processes of the present invention are described. Although a STEALTH STATION® image guided system manufactured by Medtronic Sofamor Danek has been identified, it will be appreciated that the present invention may be utilized with other types of computer systems. One aspect of the computer system 150 includes a graphical user interface system operating in conjunction with a display screen of a display monitor 180. The graphical user interface system is preferably implemented in conjunction with operating system 1315 running on computer 150 for displaying and managing the display objects of the system. The graphical user interface is implemented as part of the computer system 150 to receive input data and commands from a conventional keyboard 1320 and mouse 1325. For simplicity of the drawings and explanation, many components of a conventional computer system have not been illustrated, such as address buffers, memory buffers, and other standard control circuits, because these elements are well known in the art and a detailed description thereof is not necessary for understanding the present invention.


A computer program used to implement the various steps of the present invention is generally located in memory unit 1300, and the processes of the present invention are carried out through the use of a central processing unit (CPU) 1305. Those skilled in the art will appreciate that the memory unit 1300 is representative of both read-only memory and random access memory. The memory unit also contains a database 1350 that stores data, for example, image data and tables, including information regarding the probe, and geometric transform parameters, used in conjunction with the present invention. CPU 1305, in combination with the computer software comprising operating system 1315, scanning software module 1330, tracking software module 1335, calibration software module 1340, and display software module 1345, controls the operations and processes of computer system 150. The processes implemented by CPU 1305 may be communicated as electrical signals along bus 1360 to an I/O interface 1370 and a video interface 1375.


Scanning software module 1330 performs the processes associated with creating a coordinate reference system and reference images for use in connection with the present invention, which are known to those skilled in the art. Tracking software module 1335 performs the processes for tracking objects in an image guided system as described herein, which are likewise known to those skilled in the art. Calibration software module 1340 computes the coordinates of the calibration points in jig space and implements the method to determine the transform between image space and probe space.


Display software module 1345 formats the image data for display on monitor 180 and can identify the center positions of the intersection points in displayed images with icons. Typically, these icons are represented as cross-hairs. The display software module also works in conjunction with the graphical user interface and allows the user to determine the orientation of the initial image.


Image data 1355 can be fed directly into computer 150 as a video signal through video interface 1375. Alternatively, the data could also be supplied digitally through I/O interface 1370. In addition, items shown as stored in memory can also be stored, at least partially, on hard disk 1380 if memory resources are limited. Furthermore, while not explicitly shown, image data may also be supplied over a network, through a mass storage device such as a hard drive, optical disks, tape drives, or any other type of data transfer and storage devices which are known in the art.


The foregoing description is presented for purposes of illustration and explanation. It is not intended to be exhaustive or to limit the invention to the precise form disclosed, and modifications and variations are possible in light of the above teachings or may be acquired from practice of the invention. The principles of the invention and its practical application enable one skilled in the art to utilize the invention in various embodiments and with various modifications as are suited to the particular use contemplated.

Claims
  • 1. A method of calibrating a tracking device to an image, comprising: tracking a position of a calibration jig and a position of a probe; collecting a plurality of images of at least one calibration pattern of the calibration jig with the probe; locating centers of intersection points in image space associated with each calibration pattern for each of the plurality of images and extracting a calibration point therefrom; determining the position for at least one calibration point in probe space for each of the plurality of images; relating the positions of the calibration points in image space and the positions of said calibration points in probe space; computing an image space reference position in probe space; and storing the image space reference position.
  • 2. The method of claim 1 further including: identifying an orientation of the intersection points in a first image of the plurality of images collected; computing a position component of the calibration point in calibration jig space by comparing the distances from the center of one of the intersection points near each image edge to the centers of two other intersection points for each image; and transforming the positions of calibration points described in calibration jig space to positions described in probe space.
  • 3. The method of claim 2 wherein the identification of orientation is performed automatically, specified manually, or combinations thereof.
  • 4. The method of claim 2 further including: receiving positions of identical calibration points described in image space and described in probe space; rotating the calibration points described in image space to align with the calibration points described in probe space; computing centroids of the calibration points described in the rotated image space and the calibration points described in the probe space; translating the calibration points described in the rotated image space to the calibration points described in probe space; and adjusting the scale of the calibration points described in the rotated and translated image space to minimize the point to point error with the calibration points described in probe space.
  • 5. The method of claim 1 wherein the plurality of images are collected utilizing a probe comprising an ultrasonic transducer generating and receiving ultrasonic signals and further utilizing a processing system which forms two dimensional images from the ultrasonic signals received by the ultrasonic transducer.
  • 6. The method of claim 1, wherein collecting a plurality of images of at least one calibration pattern includes positioning an ultrasound probe relative to the calibration jig to image the calibration jig with the ultrasound probe; directing ultrasound waves with the ultrasound probe into a jig space of the calibration jig.
  • 7. A method of calibrating a tracking device to a three-dimensional volumetric image, comprising: tracking a position of a calibration jig and a position of a probe; collecting a volumetric image of at least one calibration pattern contained in the calibration jig; extracting two-dimensional slices from the volumetric image; locating centers of intersection points in slice space associated with each calibration pattern for each of the plurality of slices and extracting a calibration point therefrom; determining the position for at least one calibration point in probe space for each of the plurality of slices; relating the positions of the calibration points in slice space and the positions of said calibration points in probe space; computing a reference position of the volumetric image in probe space; and storing the reference position of the volumetric image.
  • 8. The method of claim 7 further including: extracting a first slice from the three-dimensional image; identifying an orientation of the intersection points in the first slice; computing a position component of the calibration point in calibration jig space by comparing the distances from the center of one of the intersection points near each slice edge to the centers of two other intersection points for each slice; and transforming the positions of calibration points described in calibration jig space to positions described in probe space.
  • 9. The method of claim 7 further including: receiving positions of identical calibration points described in slice space and described in probe space; rotating the calibration points described in slice space to align with the calibration points described in probe space; computing centroids of the calibration points described in the rotated slice space and the calibration points described in the probe space; and translating the calibration points described in the rotated slice space to the calibration points described in probe space.
  • 10. The method of claim 7 wherein the three-dimensional volumetric image is collected utilizing a probe which is an ultrasonic transducer which generates and receives ultrasonic signals and a processing system which forms the three-dimensional image from the ultrasonic signals received by the ultrasonic transducer.
  • 11. A system for registering a tracking device to an image, comprising: a means for tracking a three-dimensional position of a calibration jig and a position of a probe; a means for collecting a plurality of images of at least one calibration pattern on the calibration jig with the probe; a means for providing the plurality of images to the tracking means; a means for locating centers of intersection points in image space associated with each calibration pattern for each of the plurality of images and extracting a calibration point therefrom; a means for determining the position for at least one calibration point in probe space for each of the plurality of images; a means for relating the positions of the calibration points in image space and the positions of said calibration points in probe space; a means for computing an image space reference position in probe space; and a means for storing the image space reference position.
  • 12. The system of claim 11 further including: a means for identifying an orientation of the intersection points in a first image of the plurality of images collected; a means for computing a position component of the calibration point in calibration jig space by comparing the distances from the center of one of the intersection points near each image edge to the centers of two other intersection points for each image; and a means for transforming the positions of calibration points described in calibration jig space to positions described in probe space.
  • 13. The system of claim 12 wherein the identification means is performed automatically, specified manually, or combinations thereof.
  • 14. The system of claim 12 further including: a means for receiving positions of identical calibration points described in image space and described in probe space; a means for rotating the calibration points described in image space to align with the calibration points described in probe space; a means for computing centroids of the calibration points described in the rotated image space and the calibration points described in the probe space; a means for translating the calibration points described in the rotated image space to the calibration points described in probe space; and a means for adjusting the scale of the calibration points described in the rotated and translated image space to minimize the point to point error with the calibration points described in probe space.
  • 15. The system of claim 11 wherein the means for collecting images further comprises: an ultrasonic transducer means for generating and receiving ultrasonic signals; and a processing means for forming two dimensional images from the ultrasonic signals received by the transducer means.
  • 16. A system for calibrating a tracking device to a three-dimensional volumetric image, comprising: a means for tracking a position of a calibration jig and a position of a probe; a means for collecting a volumetric image of at least one calibration pattern contained in the calibration jig, coupled to the tracking means; a means for extracting two-dimensional slices from the volumetric image; a means for locating centers of intersection points in slice space associated with each calibration pattern for each of the plurality of slices and extracting a calibration point therefrom; a means for determining the position for at least one calibration point in probe space for each of the plurality of slices; a means for relating the positions of the calibration points in slice space and the positions of said calibration points in probe space; a means for computing a reference position of the volumetric image in probe space; and a means for storing the reference position of the volumetric image.
  • 17. The system of claim 16 further including: a means for extracting a first slice from the three-dimensional image; a means for identifying an orientation of the intersection points in the first slice; a means for computing a position component of the calibration point in calibration jig space by comparing the distances from the center of one of the intersection points near each slice edge to the centers of two other intersection points for each slice; and a means for transforming the positions of calibration points described in calibration jig space to positions described in probe space.
  • 18. The system of claim 16 further including: a means for receiving positions of identical calibration points described in slice space and described in probe space; a means for rotating the calibration points described in slice space to align with the calibration points described in probe space; a means for computing centroids of the calibration points described in the rotated slice space and the calibration points described in probe space; and a means for translating the calibration points described in the rotated slice space to the calibration points described in probe space.
  • 19. The system of claim 16 wherein the means for collecting images further comprises: an ultrasonic transducer means for generating and receiving ultrasonic signals; and a processing means for forming three dimensional images from the ultrasonic signals received by the transducer means.
Foreign Referenced Citations (61)
Number Date Country
964149 Mar 1975 CA
3042343 Jun 1982 DE
3717871 Dec 1988 DE
3831278 Mar 1989 DE
4225112 Dec 1993 DE
4233978 Apr 1994 DE
19715202 Oct 1998 DE
19751761 Oct 1998 DE
19832296 Feb 1999 DE
19747427 May 1999 DE
10085137 Nov 2002 DE
0155857 Sep 1985 EP
0319844 Jan 1988 EP
0326768 Aug 1989 EP
0419729 Sep 1989 EP
0350996 Jan 1990 EP
0359773 Mar 1990 EP
0651968 Aug 1990 EP
0456103 Nov 1991 EP
0469966 Feb 1992 EP
0581704 Jul 1993 EP
0894473 Jan 1995 EP
0655138 May 1995 EP
0904735 Mar 1999 EP
0908146 Apr 1999 EP
0950380 Oct 1999 EP
2417970 Feb 1979 FR
2765738 Jun 1998 JP
WO 8809151 Dec 1988 WO
WO 8905123 Jun 1989 WO
WO 9103982 Apr 1991 WO
WO 9104711 Apr 1991 WO
WO 9107726 May 1991 WO
WO 9203090 Mar 1992 WO
WO 9206645 Apr 1992 WO
WO 9404938 Mar 1994 WO
WO 9423647 Oct 1994 WO
WO 9424933 Nov 1994 WO
WO 9507055 Mar 1995 WO
WO 9608209 Mar 1996 WO
WO 9611624 Apr 1996 WO
WO 9632059 Oct 1996 WO
WO 9736192 Oct 1997 WO
WO 9749453 Dec 1997 WO
WO 9808554 Mar 1998 WO
WO 9838908 Sep 1998 WO
WO 9915097 Apr 1999 WO
WO 9921498 May 1999 WO
WO 9923956 May 1999 WO
WO 9926549 Jun 1999 WO
WO 9927839 Jun 1999 WO
WO 9929253 Jun 1999 WO
WO 9933406 Jul 1999 WO
WO 9938449 Aug 1999 WO
WO 9952094 Oct 1999 WO
WO 9956654 Nov 1999 WO
WO 9960939 Dec 1999 WO
WO 0010456 Mar 2000 WO
WO 0130437 May 2001 WO