Registration of human anatomy integrated for electromagnetic localization

Abstract
A method for use during a procedure on a body. The method generates a display representing relative positions of two structures during the procedure. The method comprises the steps of storing an image data set in memory, the image data set representing the position of the body based on scans taken of the body prior to the procedure; reading the image data set stored in the memory, the image data set having a plurality of data points in known relation to a plurality of reference points for at least one of the two structures; placing one or more magnetic field sensors in known relation to the reference points of the two structures; generating a magnetic field; detecting the magnetic field with the magnetic field sensors; ascertaining the locations of the sensors based upon the magnetic field detected by the sensors and processing the locations of the sensors to generate a displaced image data set representing the relative position of the two structures during the procedure; and generating a display based on the displaced image data set illustrating the relative position of the two structures during the procedure.
Description
BACKGROUND OF THE INVENTION

The present invention relates to localization of a position during neurosurgery. The present invention relates more specifically to electromagnetic localization of a position during stereotactic neurosurgery, such as brain surgery and spinal surgery.


Precise localization of a position is important to stereotactic neurosurgery. In addition, minimizing invasiveness of surgery is important to reduce health risks for a patient. Stereotactic surgery minimizes invasiveness of surgical procedures by allowing a device to be guided through tissue that has been localized by preoperative scanning techniques, such as, for example, MR, CT, ultrasound, fluoro and PET. Recent developments in stereotactic surgery have increased localization precision and helped minimize invasiveness of surgery.


Stereotactic neurosurgery is now commonly used in neurosurgery of the brain. Such methods typically involve acquiring image data by placing fiducial markers on the patient's head, scanning the patient's head, attaching a headring to the patient's head, and determining the spatial relation of the image data to the headring by, for example, registration of the fiducial markers. Registration of the fiducial markers relates the information in the scanned image data for the patient's brain to the brain itself, and involves one-to-one mapping between the fiducial markers as identified in the image data and the fiducial markers that remain on the patient's head after scanning and throughout surgery. This is referred to as registering image space to patient space. Often, the image space must also be registered to another image space. Registration is accomplished through knowledge of the coordinate vectors of at least three non-collinear points in the image space and the patient space.
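The registration described above, from at least three non-collinear points known in both spaces, can be sketched in code. The following Python sketch is purely illustrative and not part of the disclosure; the frame-based construction and all function names are assumptions. It builds an orthonormal frame from three paired points in each space and derives the rotation and translation mapping image space onto patient space.

```python
import math

def _sub(a, b):
    return [a[i] - b[i] for i in range(3)]

def _cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def _norm(a):
    n = math.sqrt(sum(x*x for x in a))
    return [x / n for x in a]

def frame_from_points(p0, p1, p2):
    """Build an orthonormal frame from three non-collinear points."""
    x = _norm(_sub(p1, p0))
    z = _norm(_cross(x, _sub(p2, p0)))
    y = _cross(z, x)
    return [x, y, z]  # rows are the basis vectors

def register(image_pts, patient_pts):
    """Return (R, t) mapping image space to patient space so that
    patient ~= R @ image + t, from three paired non-collinear points."""
    Fi = frame_from_points(*image_pts)
    Fp = frame_from_points(*patient_pts)
    # R = Fp^T @ Fi (valid because each frame's rows are orthonormal)
    R = [[sum(Fp[k][r] * Fi[k][c] for k in range(3)) for c in range(3)]
         for r in range(3)]
    p0, q0 = image_pts[0], patient_pts[0]
    t = [q0[r] - sum(R[r][c] * p0[c] for c in range(3)) for r in range(3)]
    return R, t

def apply(R, t, p):
    """Map one image-space point into patient space."""
    return [sum(R[r][c] * p[c] for c in range(3)) + t[r] for r in range(3)]
```

Given three image-space points and the corresponding patient-space points (e.g., touched fiducial locations), `register` recovers the rigid transform, and `apply` maps any further image-space point into patient space.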


Currently, registration for image guided surgery can be completed by different methods. First, point-to-point registration is accomplished by identifying points in image space and then touching the same points in patient space. Second, surface registration involves the user's generation of a surface (e.g., the patient's forehead) in patient space by either selecting multiple points or scanning, and then accepting or rejecting the best fit to that surface in image space, as chosen by the processor. Third, repeat fixation devices entail the user repeatedly removing and replacing a device in known relation to the fiducial markers. Such registration methods require additional steps during the procedure, and therefore increase the complexity of the system and the opportunities for introduction of human error.


It is known to adhere the fiducial markers to a patient's skin or alternatively to implant the fiducial markers into a patient's bone for use during stereotactic surgery. For example, U.S. Pat. No. 5,595,193 discloses an apparatus and method for creating a hole that does not penetrate the entire thickness of a segment of bone and is sized to accommodate a fiducial marker. A fiducial marker may then be inserted into the hole and image data may be acquired.


Through the image data, quantitative coordinates of targets within the patient's body can be specified relative to the fiducial markers. Once a guide probe or other instrument has been registered to the fiducial markers on the patient's body, the instrument can be navigated through the patient's body using image data.


It is also known to display large, three-dimensional data sets of image data in an operating room or in the direct field of view of a surgical microscope. Accordingly, a graphical representation of instrument navigation through the patient's body is displayed on a computer screen based on reconstructed images of scanned image data.


Although scanners provide valuable information for stereotactic surgery, improved accuracy in defining the position of the target with respect to an accessible reference location can be desirable. Inaccuracies in defining the target position can create inaccuracies in placing a therapeutic probe. One method for attempting to limit inaccuracies in defining the target position involves fixing the patient's head to the scanner to preserve the reference. Such a requirement is uncomfortable for the patient and creates other inconveniences, particularly if surgical procedures are involved. Consequently, a need exists for a system utilizing a scanner to accurately locate positions of targets, which allows the patient to be removed from the scanner.


Stereotactic neurosurgery utilizing a three-dimensional digitizer allows a patient to be removed from the scanner while still maintaining accuracy for locating the position of targets. The three-dimensional digitizer is used as a localizer to determine the intra-procedural relative positions of the target. Three-dimensional digitizers may employ optical, acoustic, electromagnetic, conductive or other known three-dimensional navigation technology for navigation through the patient space.


Stereotactic surgery techniques are also utilized for spinal surgery in order to increase accuracy of the surgery and minimize invasiveness. Achieving accuracy is particularly difficult in spinal surgery and must be accommodated by the registration and localization techniques utilized in the surgery. Prior to spinal surgery, the vertebrae are scanned to determine their alignment and positioning. During imaging, scans are taken at intervals through the vertebrae to create a three-dimensional pre-procedural data set for the vertebrae. After scanning, the patient is moved to the operating table, which can cause repositioning of the vertebrae. In addition, the respective positions of the vertebrae may shift once the patient has been immobilized on the operating table because, unlike the brain, the spine is not held relatively still by a skull-like enveloping structure. Even normal patient respiration may cause relative movement of the vertebrae.


Computer processes discriminate the image data retrieved by scanning the spine so that each vertebra remains in memory as a discrete body. Once each vertebra is defined as a single rigid body, the vertebrae can be repositioned with software algorithms that define a displaced image data set. Each rigid body element has at least three fiducial markers that are visible on the pre-procedural images and accurately detectable during the procedure. It is preferable to select reference points on the spinous process that are routinely exposed during such surgery. See also, for example, U.S. Pat. No. 5,871,445, WO 96/11624, and U.S. Pat. Nos. 5,592,939 and 5,697,377, the disclosures of which are incorporated herein by reference.
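Generating a displaced image data set from per-vertebra rigid transforms can be sketched as follows. This Python sketch is illustrative only (the data-set layout and function name are assumptions, not disclosed structures): each vertebra's data points are mapped through the rigid transform determined for that vertebra during the procedure.

```python
def displace(image_data_set, transforms):
    """Reposition each rigid body of the image data set.

    image_data_set: {body_id: [3-D data points]} from the pre-procedural scan.
    transforms: {body_id: (R, t)} rigid transform found for that body
    during the procedure (R is a 3x3 rotation, t a translation).
    Returns the displaced image data set."""
    out = {}
    for body_id, pts in image_data_set.items():
        R, t = transforms[body_id]
        out[body_id] = [[sum(R[r][c] * p[c] for c in range(3)) + t[r]
                         for r in range(3)]
                        for p in pts]
    return out
```

Because each vertebra is treated as its own rigid body, relative motion between vertebrae (from repositioning or respiration) is captured by giving each body its own transform.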


SUMMARY OF INVENTION

To enhance the prior art, and in accordance with the purposes of the invention, as embodied and broadly described herein, there is provided a system for displaying relative positions of two structures during a procedure on a body. The system comprises memory for storing an image data set representing the position of the body based on scans of the body, the image data set having a plurality of data points in known relation to a plurality of reference points for the body; a magnetic field generator for generating a magnetic field to be sensed by one or more magnetic field sensors placed in known relation to the reference points of the body for detecting the magnetic field and for generating positional signals in response to the detected magnetic field; a processor for receiving the positional signals and for ascertaining a location of the magnetic field sensors based upon the positional signals, the processor for generating a displaced image data set representing the relative positions of the body elements during the procedure; and a display utilizing the displaced image data set generated by the processor to display the relative position of the body elements during the procedure.


The present invention also provides a method for use during a procedure on a body. The method generates a display representing relative positions of two structures during the procedure. The method comprises the steps of storing an image data set in memory, the image data set representing the position of the body based on scans taken of the body prior to the procedure; reading the image data set stored in the memory, the image data set having a plurality of data points in known relation to a plurality of reference points for at least one of the two structures; placing one or more magnetic field sensors in known relation to the reference points of the two structures; generating a magnetic field; detecting the magnetic field with the magnetic field sensors; ascertaining the locations of the sensors based upon the magnetic field detected by the sensors and processing the locations of the sensors to generate a displaced image data set representing the relative position of the two structures during the procedure; and generating a display based on the displaced image data set illustrating the relative position of the two structures during the procedure.


The present invention further includes a device for use in a system for displaying relative positions of two structures during a procedure on a body. The device comprises a base adapted for attachment to the body, a fiducial marker mounted to the base, and a sensor having a known location and orientation with respect to the fiducial marker.


Additional features and advantages of the invention will be set forth in the description which follows, and in part will be apparent from the description, or may be learned from practice of the invention. The objectives and other advantages of the invention will be realized and attained by the apparatus particularly pointed out in the written description and claims herein as well as the appended drawings.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated in and constitute part of the specification, illustrate a presently preferred embodiment of the invention and together with the general description given above and detailed description of the preferred embodiment given below, serve to explain the principles of the invention.



FIG. 1 is a schematic diagram illustrating an embodiment of the registration system of the present invention;



FIG. 2 illustrates a top view of a first embodiment of a fiducial marker-sensor device;



FIG. 3 illustrates a cross-sectional view of the first embodiment of the fiducial marker-sensor device of the present invention, taken along line 3-3 of FIG. 2;



FIG. 4 illustrates a top view of a second embodiment of a fiducial marker-sensor device;



FIG. 5 illustrates a cross-sectional view of the second embodiment of the fiducial marker-sensor device of the present invention, taken along line 5-5 of FIG. 4;



FIG. 6 illustrates a top view of a third embodiment of a fiducial marker-sensor device;



FIG. 7 illustrates a cross-sectional view of the third embodiment of the fiducial marker-sensor device of the present invention, taken along line 7-7 of the FIG. 6;



FIG. 8 illustrates a side view of a fourth embodiment of a fiducial marker-sensor device of the present invention, indicating a placement of an attachable sensor ring in phantom;



FIG. 9 illustrates a top view of an attachable sensor ring for placement according to the fourth embodiment of the fiducial marker-sensor device as illustrated in FIG. 8;



FIG. 10 illustrates a side view of a fifth embodiment of a fiducial marker-sensor device of the present invention;



FIG. 11 illustrates a side view of a fiducial marker according to the fifth embodiment of the fiducial marker-sensor device of the present invention;



FIG. 12 illustrates a side view of a sensor ring according to the fifth embodiment of the fiducial marker-sensor device of the present invention;



FIG. 13 illustrates a schematic view of a sixth embodiment of a fiducial marker-sensor device of the present invention;



FIG. 14 illustrates a schematic view of a seventh embodiment of a fiducial marker-sensor device of the present invention;



FIG. 15 illustrates a medical instrument for use in the registration system of the present invention; and



FIG. 16 schematically illustrates the registration system for use in spinal procedures.





DETAILED DESCRIPTION OF DRAWINGS

Reference will now be made in detail to the present preferred exemplary embodiments of the invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.


In accordance with the present invention, a method for use during a procedure on a body generates a display representing relative positions of two structures during the procedure. The method comprises the steps of (i) storing an image data set in memory, the image data set representing the position of the body based on scans taken of the body prior to the procedure; (ii) reading the image data set stored in the memory, the image data set having a plurality of data points in known relation to a plurality of reference points for at least one of the two structures; (iii) placing one or more magnetic field sensors in known relation to the reference points of the two structures; (iv) generating a magnetic field; (v) detecting the magnetic field with the magnetic field sensors; (vi) ascertaining the locations of the sensors based upon the magnetic field detected by the sensors and processing the locations of the sensors to generate a displaced image data set representing the relative position of the two structures during the procedure; and (vii) generating a display based on the displaced image data set illustrating the relative position of the two structures during the procedure. The relation of the plurality of data points to the plurality of reference points is determined by the user or by standard image processing of shape detection.


The two structures can be body elements (e.g., vertebrae of the spine) or a body element (e.g., a brain or a vertebra) and a medical instrument such as a probe.



FIG. 1 schematically illustrates an exemplary embodiment of the registration system 10 of the present invention. For illustrative purposes, the registration system of the present invention will be described for a brain surgery procedure. However, the registration system may alternatively be used for a number of different procedures on the body, including spinal surgery (described hereinafter).


Initially, at least one fiducial marker 20 is placed on the patient's head 30. A pre-operative scan is taken of the patient's head 30, preferably using at least one of MR, CT, ultrasound, fluoro and PET. The scan generates an image data set that is placed into the memory of a computer system 40. The image data set represents the position of the patient's head 30 based on the pre-operative scans of the head. The image data set includes a plurality of data points.


During the procedure, at least one magnetic field sensor 50 is placed in known relation to the at least one fiducial marker 20 on the patient's head 30. For example, the magnetic field sensor can be integrated with the fiducial marker, attached to the fiducial marker, or interchanged with the fiducial marker. Another magnetic field sensor 50 can be placed, for example, in a medical instrument 60. The medical instrument 60 does not need a fiducial marker because it is not present in the scan taken to create the image data set.


During the procedure, a magnetic field generator (not shown) generates a magnetic field in the area of the patient. For example, coils (not shown) can be embedded into an operating table 42 on which the patient is placed. The magnetic field sensors 50 on the patient's head 30 and in the medical instrument 60 detect the generated magnetic field and send appropriate signals to the processor 45 so that the processor 45 can determine the positions of the magnetic field sensors 50 during the procedure. Once the processor 45 determines the positions of the magnetic field sensors 50 on the patient's head 30, the position of the magnetic field sensors 50 on the patient's head is registered to the position of the fiducial markers 20 as represented in the scan.
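The physical principle underlying such magnetic localization can be illustrated with a simple sketch. The following Python snippet is an assumption-laden illustration only, not the disclosed method: it uses the far-field dipole approximation, in which field magnitude falls off as 1/r³, to estimate a sensor's distance from a dipole-like source. Practical systems solve for the full sensor pose from multiple coils and known generator geometry; only the magnitude-falloff idea is shown here.

```python
def dipole_range(B_measured, B_ref, r_ref):
    """Estimate sensor distance from a dipole-like field source.

    Assumes the far-field dipole approximation |B| proportional to 1/r^3.
    B_ref is the field magnitude measured at a known distance r_ref;
    B_measured is the magnitude at the unknown distance."""
    return r_ref * (B_ref / B_measured) ** (1.0 / 3.0)
```

For example, a sensor reading one eighth of the reference magnitude lies at roughly twice the reference distance, since 8^(1/3) = 2.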


After the position of the magnetic field sensors 50 has been determined and the sensors on the patient's head 30 are registered, a displaced image data set is created and displayed on a monitor 48. The display includes the relative position of the medical device 60 to the patient's head 30.


A variety of fiducial markers 20 and magnetic field sensors 50 (combined to create “fiducial marker-sensor devices”) are illustrated in FIGS. 2 through 14. In FIGS. 2 and 3, an interchangeable fiducial marker-sensor device 150 is illustrated. The device 150 includes a base 152 that is attached to the patient. The base 152 is preferably adhesively attached to the patient along its bottom surface 154, but may also be implanted in the patient, clamped or stapled to the patient, or otherwise suitably attached to the patient. The base 152 has a raised ring portion 156 and a central circular depression 158. A fiducial (not shown) having a shape complementary to the base 152 is placed into the base for scanning, and then a sensor 160 having a shape complementary to the base 152 is placed in the base for electromagnetic tracking of the patient space. One or more coils 162 are placed in the sensor 160, preferably perpendicular to each other. The coils 162 are placed in communication with the processor 45, for example using wires 164 or similarly suitable communication links such as radio waves. Alternatively, optical, acoustic or inertial elements could be interchanged for the sensor if an optical, acoustic or inertial navigation system is employed.


In FIGS. 4 and 5, a preferred embodiment of an integrated fiducial marker-sensor 250 is illustrated. The illustrated fiducial marker 256 is spherical, but provides only location data and no orientation data. The device 250 includes a base 252 that is attached to the patient. The base 252 is preferably adhesively attached to the patient along its bottom surface 254, but may also be implanted in the patient, clamped or stapled to the patient, or otherwise suitably attached to the patient. The fiducial marker 256 is attached to the base 252, for example using an epoxy or plastic layer 258. The base is also a sensor for electromagnetic tracking of the patient space. One or more coils 262 are placed in the base 252, preferably perpendicular to each other. The coils 262 are placed in communication with the processor 45, for example using wires 264 or other suitable communication links such as radio waves. Alternatively, optical, acoustic or inertial elements known in the art could be interchanged for the sensor if an optical, acoustic or inertial navigation system is employed.


The preferred size of the spherical fiducial marker is dependent upon scan slice thickness. For example, with 1 mm slices, a 3 mm sphere is preferred, and for 3 mm slices an 8 mm sphere is preferred. As can be seen in FIGS. 4 and 5, the spherical fiducial marker 256 is spaced from the base. It is preferable (but not necessary) that the space between the fiducial marker and the patient is greater than the slice thickness to provide a “barrier.” By barrier, the present invention contemplates that the fiducial is preferably spaced from the patient's skin by a large enough distance that the fiducial and the skin do not blend together in the scan image and appear as one object.
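The sizing and barrier rules above can be expressed as small helpers. This Python sketch is illustrative only; the linear interpolation between the two stated preferences (1 mm slices to a 3 mm sphere, 3 mm slices to an 8 mm sphere) is an assumption, as the text gives only those two data points, and both function names are hypothetical.

```python
def check_barrier(space_mm, slice_thickness_mm):
    """True if the fiducial-to-skin spacing exceeds the slice thickness,
    so that marker and skin do not blend into one object in the scan."""
    return space_mm > slice_thickness_mm

def preferred_sphere_mm(slice_thickness_mm):
    """Preferred sphere diameter, linearly interpolated (an assumption)
    between the two stated points: 1 mm slices -> 3 mm, 3 mm -> 8 mm."""
    return 3.0 + (slice_thickness_mm - 1.0) * (8.0 - 3.0) / (3.0 - 1.0)
```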


In FIGS. 6 and 7, another preferred embodiment of an integrated fiducial marker-sensor 350 is illustrated. The illustrated fiducial marker 356 has a spherical shape. The device 350 includes a base 352 that is attached to the patient either adhesively along its bottom surface 354, implanted in the patient, clamped or stapled to the patient, or otherwise suitably attached to the patient. The fiducial marker 356 is attached to the base 352, for example using an epoxy or plastic casing 358. The base is also a sensor for electromagnetic tracking of the patient space. One or more coils 362 are placed in the base 352, preferably perpendicular to each other. The coils 362 are placed in communication with the processor 45, for example using wires 364. Alternatively, optical, acoustic or inertial elements could be interchanged for the sensor if an optical, acoustic or inertial navigation system is employed.


As stated above, the preferred size of the spherical fiducial marker is dependent upon scan slice thickness, and the spherical fiducial marker 356 is preferably (but not necessarily) spaced from the base a distance greater than the slice thickness to provide a barrier.



FIGS. 8 and 9 illustrate a fiducial marker-sensor device 450 similar to the fiducial marker-sensor device illustrated in FIGS. 6 and 7, except that the sensor is in an attachable ring 460 instead of being in the base 452. This embodiment allows attachment of the sensor in known relation to the fiducial after scanning has taken place. As with the above-described embodiments, the sensor includes at least one coil 462, and preferably includes two perpendicularly oriented coils 462.



FIGS. 10 through 12 illustrate an interchangeable fiducial marker-sensor device 550 including a base 552 having a protrusion 554 that is threaded. A fiducial marker 570 has a complementary threaded recess 572 for engagement with the protrusion 554 on the base 552. FIG. 11 illustrates the fiducial marker 570. FIG. 12 illustrates a sensor ring 560 with an aperture 562 that is threaded so that it can be interchanged with the fiducial marker 570 on the base 552. Alternatively, this embodiment could also employ a recess in the base and a complementary protrusion on the interchangeable fiducial marker and sensor ring.


The present invention contemplates use of a fiducial marker having a unique geometrical shape in any of the embodiments of the fiducial marker-sensor device described hereinabove. In addition, the present invention contemplates placement of multiple fiducial markers on the patient and attachment of sensors to a subset of the fiducial markers that the user finds are most clearly and helpfully represented in the scan. Placement of additional sensors helps ensure that a proper number of sensors can be placed on the patient even if one or more fiducial markers are not clearly identifiable in the scan.


One exemplary embodiment of the method of the present invention utilizes at least one fiducial marker-sensor device. The user places at least one fiducial marker with a unique geometric shape on the patient's head 30. One example of the unique geometrical shape contemplated by the present invention includes at least three distinct non-collinear points, and may include more points to increase the accuracy of the system in correlating patient space to image space. Examples of presently preferred unique geometric shapes including more than three non-collinear points are illustrated in FIGS. 13 and 14. A unique geometrical shape allows determination of both the location and the orientation of the fiducial marker from the image slices and with a six degree of freedom (DOF) sensor. The image slices represent the location and orientation of the at least one fiducial marker in image space and the six DOF sensor determines the corresponding location and orientation of the at least one fiducial marker in patient space to accomplish auto-registration. The six DOF sensor is preferably electromagnetic, but may also be acoustic, optical or inertial. Other uniquely identifiable shapes can be used, for example a T-shape or a tack.


Alternatively, the user may place at least two fiducial markers with predetermined geometrical shapes (see FIGS. 13 and 14) on the patient's head 30. The location and orientation of the fiducial markers can be determined from the image slices and with a five DOF sensor. A six DOF sensor is not needed, but can be used, when at least two fiducial markers with unique geometries are used. The image slices represent the location and orientation of the fiducial markers in image space and the five DOF sensor determines the corresponding location and orientation of the fiducial markers in patient space to accomplish auto-registration. The five DOF sensor is preferably electromagnetic, but may also be acoustic, optical or inertial.


As another alternative, the user may place at least three fiducial markers on the patient's head 30. The location of the fiducial markers can be determined from the image slices and with a combination of sensors to define six DOF (e.g., two five DOF sensors). The image slices represent at least the location of the fiducial markers in image space and the sensor determines at least the corresponding location of the fiducial markers in patient space to accomplish auto-registration. The sensors are preferably electromagnetic.


In yet another alternative, the user may place at least three fiducial markers on the patient's head 30. In this embodiment including at least three fiducial markers, the fiducial markers need not have a unique geometrical shape. Exemplary embodiments of fiducial markers that do not have a unique geometrical shape are illustrated in FIGS. 4 through 9. The exemplary fiducial marker-sensor devices illustrated in FIGS. 4 through 9 include a spherical fiducial marker. The location of the fiducial markers can be determined from the image slices and with a three DOF sensor. A three DOF sensor is commonly used in acoustic, optical and inertial navigation systems. The image slices represent the location of the fiducial markers in image space and the three DOF sensor determines the corresponding location of the fiducial markers in patient space to accomplish auto-registration.


As stated above, once fiducial markers 20 have been placed on the patient's head, image slices or a three-dimensional scan (e.g., MR, CT, ultrasound, fluoro and PET) are taken of the patient's head to create a three-dimensional data set having data points corresponding to reference points on the fiducial marker(s) 20. The relation of the plurality of data points to the plurality of reference points is determined by the user or by standard image processing of shape detection. The scan is preferably taken prior to or during the procedure. An image data set is created by the scan and placed in computer memory, and the processor 45 identifies the fiducial marker(s) in image space (in the image data set) using image algorithms. Each fiducial marker is represented by at least one data point in the image data set.


Preferably, the image data set is created prior to placing the patient on the operating table. Once the patient is ready for surgery, the processor 45 can identify the fiducial marker(s) 20 in patient space using signals received from the sensors 50 on the patient's head 30. Each fiducial marker includes at least one reference point 70 in patient space (see exemplary fiducial markers illustrated in FIGS. 13 and 14). The reference points need not be attached to a defined triangle as illustrated in FIG. 13, but instead may be as simple as three suspended BBs. The reference points in patient space correlate to the data points in the image data set. The signals sent by the sensors to the processor 45 to identify the fiducial marker(s) in patient space are called “localization information” and allow the processor to “auto-register” the patient by correlating the reference points to the data points. The relation of the plurality of data points to the plurality of reference points is determined by the user or by standard image processing of shape detection. This is done by determining a translation matrix between image space and patient space.
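Once the transformation between image space and patient space has been determined, its quality can be checked by measuring how closely the mapped data points land on the reference points. The disclosure does not specify an error metric; the RMS residual below is a common choice and is offered only as an illustrative Python sketch with assumed names.

```python
import math

def rms_registration_error(R, t, data_points, reference_points):
    """RMS distance between image-space data points mapped through the
    rigid transform (R, t) and the corresponding patient-space
    reference points. A small value indicates a good registration."""
    total = 0.0
    for p, q in zip(data_points, reference_points):
        m = [sum(R[r][c] * p[c] for c in range(3)) + t[r] for r in range(3)]
        total += sum((m[i] - q[i]) ** 2 for i in range(3))
    return math.sqrt(total / len(data_points))
```

A residual well below the scan slice thickness would suggest the correlation of reference points to data points succeeded; a large residual could indicate a mislabeled fiducial.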


Auto-registering the patient provides a simplified and more user-friendly system because the user need not select the data points in the data set and thereafter touch fiducial markers, or create a surface in patient space by selecting multiple points or scanning and then accept or reject the best fit in image space as determined by the processor, or repeatedly remove and replace a localizing device. In addition, accuracy can be enhanced because opportunities for human error during user registration are eliminated.


During the procedure, at least one sensor 50 is placed in known relation to the fiducial marker(s) 20 on the patient's head to create a dynamic reference frame for the procedure. Preferably, the at least one sensor is integrated with the fiducial marker(s), removably attached to the fiducial marker(s), permanently affixed to the fiducial marker(s) after the patient is scanned, or interchanged with the fiducial marker(s) during the procedure. In a preferred embodiment of the invention in which a single uniquely shaped fiducial marker with ascertainable location and orientation is utilized (see FIGS. 13 and 14), the location and orientation of the sensor with respect to the fiducial marker is determined prior to placement of the fiducial marker-sensor onto the patient and remains constant throughout the procedure. For example, factory calibration may be used.


During the procedure, the computer system dynamically tracks movement of the sensors 50 on the patient's head 30 and on the medical instrument 60. Thus, the system tracks movement of the medical instrument 60 relative to the patient's head 30. In addition, the system can “learn the geometry” of sensors placed on the patient's head to perform geometry checks that help maintain system accuracy. To learn the geometry of the sensors 50 on the patient's head, the processor 45 determines the relative locations of all of the sensors 50 on the patient's head. The relative locations of the sensors on the patient's head should not change. If the processor determines that the relative location of sensors on the patient's head has changed, the system indicates to the user that an error may have occurred. By using the magnetic field sensors as a dynamic reference frame, the system need not employ additional navigational devices in the surgical field.
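The "learn the geometry" check described above can be sketched as follows. This Python sketch is illustrative only: the pairwise-distance representation, tolerance value, and function names are assumptions, not disclosed details. Since the sensors on the head are rigidly related, their inter-sensor distances should remain constant; any drift beyond tolerance flags a possible error.

```python
import math

def _dist(a, b):
    return math.sqrt(sum((a[i] - b[i]) ** 2 for i in range(3)))

def learn_geometry(sensor_positions):
    """Record the pairwise distances between the head sensors once,
    after registration, as the learned geometry."""
    n = len(sensor_positions)
    return {(i, j): _dist(sensor_positions[i], sensor_positions[j])
            for i in range(n) for j in range(i + 1, n)}

def geometry_ok(baseline, sensor_positions, tol_mm=1.0):
    """False (i.e., indicate a possible error to the user) if any
    inter-sensor distance has drifted more than tol_mm from the
    learned value; the tolerance is an assumed parameter."""
    return all(
        abs(_dist(sensor_positions[i], sensor_positions[j]) - d) <= tol_mm
        for (i, j), d in baseline.items())
```

The check could run on every localization update, so that a detached or displaced sensor is reported before it corrupts the displayed registration.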


As the system tracks relative movement of two structures such as the patient's head and the medical instrument, a graphical representation of instrument navigation through the patient's brain is displayed on a monitor 48 of the computer system 40 based on reconstructed images of scanned image data.


An exemplary embodiment of a medical instrument for use in the present invention is illustrated in FIG. 15. The medical instrument 60 includes a handle 62 and a probe 64 having a tip portion 66. The tip portion 66 of the medical instrument 60 includes at least one coil 68 that makes up the sensor 50. In a preferred embodiment of the invention, two coils 68 are placed in the tip portion 66 in order to allow the computer system of the present invention to track movement of the instrument in six degrees of freedom. The coils 68 are preferably located perpendicular to each other within the tip portion 66.


When using the registration system of the present invention during spinal surgery, the system's ability to track relative movement of multiple structures is particularly important for at least the following reason. Prior to spinal surgery, the vertebrae are scanned to determine their alignment and positioning. During imaging, scans are taken at intervals through the vertebrae to create a three-dimensional pre-procedural data set for the vertebrae. However, after scanning, the patient must be moved to the operating table, causing repositioning of the vertebrae. In addition, the respective positions of the vertebrae may shift once the patient has been immobilized on the operating table because, unlike the brain, the spine is not held relatively still by a skull-like enveloping structure. Even normal patient respiration may cause relative movement of the vertebrae.



FIG. 16 schematically illustrates elements of spinal surgery needed to explain the procedures of the present invention. At least one fiducial marker 20 is placed on each vertebra 610 of concern during the procedure. A vertebra “of concern” is a vertebra whose position the user is concerned with during the spinal procedure. Once at least one fiducial marker 20 has been placed on each vertebra of concern, image slices or a three-dimensional scan (e.g., MR, CT, ultrasound, fluoro and PET) are taken of the patient's spine to create a three-dimensional data set having data points corresponding to reference points on each fiducial marker 20. The relation of the plurality of data points to the plurality of reference points is determined by the user or by standard image-processing shape detection. The scan is preferably taken prior to or during the procedure. An image data set is created by the scan and placed in computer memory, and the processor 45 (see FIG. 1) identifies each fiducial marker 20 in image space (in the image data set) using image algorithms. Each fiducial marker 20 is represented by at least one data point in the image data set.
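Identifying a fiducial marker in image space can be as simple as thresholding the volume and taking an intensity-weighted centroid. The sketch below is a hypothetical stand-in for the "image algorithms" mentioned above; the function name, threshold, and toy volume are all assumptions.

```python
import numpy as np

def find_fiducial(volume, threshold):
    """Locate a bright fiducial marker in a scanned volume as the intensity-
    weighted centroid of the voxels above threshold."""
    mask = volume > threshold
    idx = np.argwhere(mask)                  # (n, 3) voxel coordinates
    weights = volume[mask]
    return (idx * weights[:, None]).sum(axis=0) / weights.sum()

vol = np.zeros((8, 8, 8))
vol[2:4, 5, 5] = 100.0                       # a small bright marker
center = find_fiducial(vol, 50.0)
# center is (2.5, 5.0, 5.0) in voxel coordinates.
```

A production system would also separate multiple markers (e.g., by connected-component labeling) and match each to its expected shape, but the centroid conveys the idea of reducing a marker to data points.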


Preferably, the image data set is created prior to placing the patient on the operating table. Once the patient is ready for surgery, the processor 45 can identify the fiducial marker 20 in patient space using signals received from at least one sensor 50 placed in known relation to the fiducial marker(s) 20 on the patient's vertebra 610. As described above, the system then auto-registers the patient by correlating the reference points to the data points. According to the present invention, the fiducial marker-sensor devices illustrated with respect to brain surgery are equally acceptable for spinal surgery.
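Auto-registration by correlating reference points to data points is, in its simplest rigid form, a least-squares point-set alignment. The sketch below uses the Kabsch algorithm as one standard way to compute such a transform; the patent does not specify this particular method, and all names here are illustrative.

```python
import numpy as np

def register(data_points, reference_points):
    """Least-squares rigid transform (R, t) mapping image-space data points
    onto their corresponding patient-space reference points (Kabsch)."""
    A = np.asarray(data_points, dtype=float)
    B = np.asarray(reference_points, dtype=float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)                # cross-covariance of the sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t

# Image-space fiducial points, and the same points measured in patient space
# (here rotated 90 degrees about z and translated).
img = np.array([[0.0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]])
rot = np.array([[0.0, -1, 0], [1, 0, 0], [0, 0, 1]])
pat = img @ rot.T + np.array([5.0, 2.0, 0.0])
R, t = register(img, pat)
assert np.allclose(img @ R.T + t, pat)       # registration recovers the pose
```

With the transform in hand, every data point in the image data set can be carried into patient space, which is what makes the registration "automatic" once the sensor-to-fiducial relations are known.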


During the procedure, the computer system dynamically tracks movement of each sensor 50 on the patient's vertebrae and on the medical instrument 60. Thus, the system tracks alignment and positioning of the vertebrae 610 (e.g., relative movement of the vertebrae) as well as movement of the medical instrument 60 relative to the vertebrae. In addition, the system can “learn the geometry” of sensors placed on a single vertebra to perform geometry checks that help maintain system accuracy, as described above.


As the system tracks relative movement of vertebra 610 and the medical instrument 60, a graphical representation of instrument navigation through the patient's spinous process is displayed on a monitor 48 of the computer system 40 based on reconstructed images of scanned image data.


An exemplary embodiment of a medical instrument for use in the present invention is illustrated in FIG. 15. The medical instrument 60 includes a handle 62 and a probe 64 having a tip portion 66. The tip portion 66 of the medical instrument 60 includes a sensor having at least one coil 68. In a preferred embodiment of the invention, two coils 68 are placed in the tip portion 66 in order to allow the computer system of the present invention to track movement of the instrument in six degrees of freedom. The coils 68 are preferably oriented perpendicular to each other within the tip portion 66.


It will be apparent to those skilled in the art that various modifications and variations can be made in the registration system of the present invention and in construction of this registration system without departing from the scope or spirit of the invention. As an example, a variety of other embodiments of the fiducial marker-sensor device could be employed, including fiducial markers of an endless variety of shapes and sizes. The magnetic field generator and sensor roles could be reversed, such that the operating table 42 could include a sensor, and field generators could be placed on the patient and in the medical device. In addition, an optical, acoustic, or inertial system could be used to track the locations of the sensors and fiducial markers instead of electromagnetics.


Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.

Claims
  • 1. A system for displaying relative positions of two structures during a procedure on a body, the system comprising: a memory for storing an image data set representing the position of the body based on scans of the body, the image data set having a plurality of data points in known relation to a plurality of reference points for the body; a magnetic field generator for generating a magnetic field; a plurality of magnetic field sensors placed in known relation to at least one of the plurality of reference points for the body, the plurality of magnetic field sensors operable to sense the magnetic field and to generate a position signal in response to the sensed magnetic field; a processor configured to execute instructions to: receive the position signal for ascertaining a location of at least a first sub-plurality of the plurality of magnetic field sensors based upon the position signal; register the ascertained location of at least the first sub-plurality of the plurality of magnetic field sensors and the image data set based on the known relation of at least the first sub-plurality of the plurality of magnetic field sensors relative to the at least one of the plurality of reference points and the received position signal from at least the first sub-plurality of the plurality of magnetic field sensors; generate a displaced image data set representing relative positions of the plurality of reference points for the body during the procedure based on the registered image data set; and learn a geometry of relative locations of at least a second sub-plurality of the plurality of magnetic field sensors placed on the body at a first time to perform an accuracy geometry check at a second time to ensure accuracy after the registration, wherein the second sub-plurality of the plurality of magnetic field sensors placed on the body are configured to remain substantially fixed in the learned geometry between the first time and the second time; and a display to display the displaced image data set generated by the processor to display the relative position of the plurality of reference points for the body during the procedure.
  • 2. The system of claim 1, wherein the plurality of reference points are identifiable as points on body elements of the body to which the image data set relates.
  • 3. The system of claim 1, further comprising: a fiducial marker, wherein the fiducial marker defines the plurality of reference points for the body in the image data set; wherein the fiducial marker is placed on the body prior to the image data set being acquired of the body; wherein the processor is further configured to identify the fiducial marker in the image data set using image algorithms and wherein the fiducial marker is represented by at least the plurality of reference points in the image data set.
  • 4. The system of claim 3, wherein the image data set is a three-dimensional data set that has data points corresponding to the plurality of reference points defined by the fiducial marker.
  • 5. The system of claim 4, wherein the first sub-plurality of the plurality of magnetic field sensors is placed on a first vertebra on the body and at least the second sub-plurality of the plurality of magnetic field sensors is placed on a second vertebra on the body; wherein the processor is configured to dynamically track movement of each of the first sub-plurality of magnetic field sensors and the second sub-plurality of magnetic field sensors to track alignment and positioning of both the first vertebra and the second vertebra.
  • 6. The system of claim 5, wherein the processor is configured to track relative movement of both the first vertebra and the second vertebra.
  • 7. The system of claim 1, further comprising: an instrument operable to be moved relative to the body; and an instrument magnetic field sensor configured to sense the magnetic field and to generate an instrument position signal in response to the sensed magnetic field; wherein the processor is configured to determine the relative location of the instrument relative to the body based on the instrument position signal and the position signal and generate for display with the display a representation of the instrument relative to the body.
  • 8. A system for displaying relative positions of two structures during a procedure on a body, the system comprising: a first fiducial marker that defines at least a first reference point and a second fiducial marker that defines at least a second reference point, both operable to be connected to the body during an acquisition of an image data set representing the body; a memory for storing the image data set representing the body, the image data set having a plurality of body data points in known relation to the first reference point and the second reference point; a magnetic field generator for generating a magnetic field; a first magnetic field sensor placed in known relation to the first reference point on a first body portion of the body and a second magnetic field sensor placed in known relation to the second reference point on a second body portion of the body, the first magnetic field sensor operable to sense the magnetic field and to generate a first position signal in response to the sensed magnetic field and the second magnetic field sensor operable to sense the magnetic field and to generate a second position signal in response to the sensed magnetic field; a processor configured to: receive the first position signal and the second position signal; ascertain a location of both the first magnetic field sensor and the second magnetic field sensor based upon the received first position signal and the second position signal; register the first magnetic field sensor and the second magnetic field sensor and the image data set based on both the known relation of the first magnetic field sensor relative to the first reference point and the ascertained location of the first magnetic field sensor and the known relation of the second magnetic field sensor relative to the second reference point and the ascertained location of the second magnetic field sensor; generate a displaced image data set representing relative positions of the first reference point and the second reference point for the body during the procedure based on the registered image data set; determine a fixed geometry between the first magnetic field sensor and the second magnetic field sensor, wherein the first magnetic field sensor is positioned at the fixed geometry relative to the second magnetic field sensor; and increase accuracy of the registered image data set by performing an accuracy check based on the determined fixed geometry and tracking the determined fixed geometry during the procedure; and a display to display the displaced image data set generated by the processor to display the relative position of the first reference point and the second reference point for the body during the procedure.
  • 9. The system of claim 8, wherein the first fiducial marker and the first magnetic field sensor are integrated into a single unit.
  • 10. The system of claim 9, wherein the first reference point defined by the first fiducial marker is positioned a distance away from a connection portion of the first fiducial marker relative to the body.
  • 11. The system of claim 8, wherein the first fiducial marker and the first magnetic field sensor are configured to be connected; wherein the image data set is configured to be acquired with the first fiducial marker connected to the first body portion and the first magnetic field sensor is configured to be connected after the acquisition of the image data set.
  • 12. The system of claim 11, wherein the processor is configured to register the image data set automatically based on the known relation of the first magnetic sensor relative to the first fiducial marker.
  • 13. The system of claim 11, wherein the first reference point defined by the first fiducial marker is positioned a distance away from a connection portion of the first fiducial marker relative to the body.
  • 14. A method for displaying relative positions of two structures during a procedure on a body, the method comprising: accessing an image data set representing the position of the body based on scans of the body, the image data set having a plurality of data points in known relation to a plurality of reference points for the body; generating a magnetic field with a magnetic field generator; generating a first position signal with a first magnetic field sensor placed in known relation to at least one of the plurality of reference points for the body on a first body portion, wherein the first position signal from the first magnetic field sensor is in response to the sensed magnetic field; generating a second position signal with a second magnetic field sensor placed in known relation to at least one of the plurality of reference points for the body on a second body portion, wherein the second position signal from the second magnetic field sensor is in response to the sensed magnetic field; receiving the first position signal and the second position signal with a processor to ascertain a location of both the first magnetic field sensor and the second magnetic field sensor based upon the received first position signal and the second position signal; registering, with the processor configured to register, the first magnetic field sensor and the second magnetic field sensor and the image data set based on the known relation of the first magnetic field sensor relative to the at least one of the plurality of reference points for the body on the first body portion and the second magnetic field sensor relative to the at least one of the plurality of reference points on the second body portion and the ascertained location of both the first magnetic field sensor and the second magnetic field sensor based upon the received first position signal and the received second position signal; generating a displaced image data set with the processor representing relative positions of the plurality of reference points for the body during the procedure based on the registered image data set; displaying the displaced image data set generated by the processor to display the relative position of the plurality of reference points for the body during the procedure; determining a fixed geometry between the first magnetic field sensor and the second magnetic field sensor, wherein the first magnetic field sensor is positioned at the fixed geometry relative to the second magnetic field sensor; and performing an accuracy check with the determined fixed geometry and tracking the geometry during the procedure to increase accuracy of the registered image data set.
  • 15. The method of claim 14, wherein the first magnetic field sensor is connected to a first vertebra and the second magnetic field sensor is attached to a second vertebra separate from the first vertebra.
  • 16. The method of claim 15, further comprising: sensing the magnetic field with an instrument magnetic field sensor affixed to an instrument; generating an instrument position signal in response to the sensed magnetic field with the instrument magnetic field sensor; determining the relative location of the instrument relative to the body with the processor based on the instrument position signal and the ascertained location of both the first magnetic field sensor and the second magnetic field sensor based upon the received first position signal and the second position signal; and generating for display with the display a representation of the instrument relative to the body.
  • 17. The method of claim 16, wherein the instrument magnetic field sensor includes a first coil and a second coil positioned near a tip portion of the instrument.
  • 18. The method of claim 17, wherein the first coil is positioned substantially orthogonal to the second coil to ascertain six-degrees-of-freedom movement information regarding the instrument.
  • 19. The method of claim 18, further comprising: identifying a fiducial marker in the image data set using image algorithms, wherein the fiducial marker is represented by at least the plurality of reference points in the image data set.
  • 20. The method of claim 19, wherein identifying the fiducial marker includes identifying at least a first fiducial marker and a second fiducial marker; wherein the first magnetic field sensor is positioned on the first vertebra relative to the first fiducial marker and the second magnetic field sensor is positioned on the second vertebra relative to the second fiducial marker; wherein registering is automatically registering the image data with the processor based upon the known relative location of both the first magnetic field sensor relative to the first fiducial marker and the second magnetic field sensor relative to the second fiducial marker.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 10/103,685, filed Mar. 21, 2002, now U.S. Pat. No. 7,657,300, issued on Feb. 2, 2010, which is a continuation of U.S. Pat. No. 6,381,485, issued on Apr. 30, 2002, both of which are hereby incorporated by reference herein. The following United States patent applications, which were concurrently filed with this one on Oct. 28, 1999, are fully incorporated herein by reference: Method and System for Navigating a Catheter Probe in the Presence of Field-influencing Objects, by Michael Martinelli, Paul Kessman and Brad Jascob; Patient-shielding and Coil System, by Michael Martinelli, Paul Kessman and Brad Jascob; Navigation Information Overlay onto Ultrasound Imagery, by Paul Kessman, Troy Holsing and Jason Trobaugh; Coil Structures and Methods for Generating Magnetic Fields, by Brad Jascob, Paul Kessman and Michael Martinelli; Registration of Human Anatomy Integrated for Electromagnetic Localization, by Mark W. Hunter and Paul Kessman; System for Translation of Electromagnetic and Optical Localization Systems, by Mark W. Hunter and Paul Kessman; Surgical Communication and Power System, by Mark W. Hunter, Paul Kessman and Brad Jascob; and Surgical Sensor, by Mark W. Hunter, Sheri McCoid and Paul Kessman.

6059718 Taniguchi et al. May 2000 A
6063022 Ben-Haim May 2000 A
6071288 Carol et al. Jun 2000 A
6073043 Schneider Jun 2000 A
6076008 Bucholz Jun 2000 A
6096050 Audette Aug 2000 A
6104944 Martinelli Aug 2000 A
6118845 Simon et al. Sep 2000 A
6122538 Sliwa, Jr. et al. Sep 2000 A
6122541 Cosman et al. Sep 2000 A
6131396 Duerr et al. Oct 2000 A
6139183 Graumann Oct 2000 A
6147480 Osadchy et al. Nov 2000 A
6149592 Yanof et al. Nov 2000 A
6156067 Bryan et al. Dec 2000 A
6161032 Acker Dec 2000 A
6165181 Heilbrun et al. Dec 2000 A
6167296 Shahidi Dec 2000 A
6172499 Ashe Jan 2001 B1
6175756 Ferre et al. Jan 2001 B1
6178345 Vilsmeier et al. Jan 2001 B1
6194639 Botella et al. Feb 2001 B1
6201387 Govari Mar 2001 B1
6203497 Dekel et al. Mar 2001 B1
6211666 Acker Apr 2001 B1
6223067 Vilsmeier et al. Apr 2001 B1
6233476 Strommer et al. May 2001 B1
6235038 Hunter et al. May 2001 B1
6246231 Ashe Jun 2001 B1
6259942 Westermann et al. Jul 2001 B1
6259943 Cosman et al. Jul 2001 B1
6273896 Franck et al. Aug 2001 B1
6282437 Franck et al. Aug 2001 B1
6285902 Kienzle, III et al. Sep 2001 B1
6298262 Franck et al. Oct 2001 B1
6314310 Ben-Haim et al. Nov 2001 B1
6332089 Acker et al. Dec 2001 B1
6333971 McCrory et al. Dec 2001 B2
6341231 Ferre et al. Jan 2002 B1
6351659 Vilsmeier Feb 2002 B1
6379302 Kessman et al. Apr 2002 B1
6381485 Hunter et al. Apr 2002 B1
6402762 Hunter et al. Jun 2002 B2
6405072 Cosman Jun 2002 B1
6424856 Vilsmeier et al. Jul 2002 B1
6427314 Acker Aug 2002 B1
6428547 Vilsmeier et al. Aug 2002 B1
6434415 Foley et al. Aug 2002 B1
6437567 Schenck et al. Aug 2002 B1
6445943 Ferre et al. Sep 2002 B1
6470207 Simon et al. Oct 2002 B1
6474341 Hunter et al. Nov 2002 B1
6478802 Kienzle, III et al. Nov 2002 B2
6484049 Seeley et al. Nov 2002 B1
6490475 Seeley et al. Dec 2002 B1
6493573 Martinelli et al. Dec 2002 B1
6498944 Ben-Haim et al. Dec 2002 B1
6499488 Hunter et al. Dec 2002 B1
6516046 Frohlich et al. Feb 2003 B1
6527443 Vilsmeier et al. Mar 2003 B1
6551325 Neubauer et al. Apr 2003 B2
6584174 Schubert et al. Jun 2003 B2
6609022 Vilsmeier et al. Aug 2003 B2
6611700 Vilsmeier et al. Aug 2003 B1
6640128 Vilsmeier et al. Oct 2003 B2
6669635 Kessman et al. Dec 2003 B2
6675040 Cosman Jan 2004 B1
6694162 Hartlep Feb 2004 B2
6701179 Martinelli et al. Mar 2004 B1
6747539 Martinelli Jun 2004 B1
6968224 Kessman et al. Nov 2005 B2
7007699 Martinelli et al. Mar 2006 B2
7152608 Hunter et al. Dec 2006 B2
7657300 Hunter et al. Feb 2010 B2
7797032 Martinelli et al. Sep 2010 B2
20010007918 Vilsmeier et al. Jul 2001 A1
20020095081 Vilsmeier et al. Jul 2002 A1
20040024309 Ferre et al. Feb 2004 A1
20060036189 Martinelli et al. Feb 2006 A1
20100210939 Hartmann et al. Aug 2010 A1
20100331671 Martinelli et al. Dec 2010 A1
Foreign Referenced Citations (71)
Number Date Country
964149 Mar 1975 CA
3042343 Jun 1982 DE
3508730 Sep 1986 DE
3717871 Dec 1988 DE
3831278 Mar 1989 DE
3838011 Jul 1989 DE
4213426 Oct 1992 DE
4225112 Dec 1993 DE
4233978 Apr 1994 DE
19715202 Oct 1998 DE
19751761 Oct 1998 DE
19832296 Feb 1999 DE
19747427 May 1999 DE
10085137 Nov 2002 DE
0062941 Oct 1982 EP
0119660 Sep 1984 EP
0155857 Sep 1985 EP
0319844 Jun 1989 EP
0326768 Aug 1989 EP
350996 Jan 1990 EP
0359773 Mar 1990 EP
0419729 Apr 1991 EP
0427358 May 1991 EP
0456103 Nov 1991 EP
0469966 Feb 1992 EP
0501993 Sep 1992 EP
0581704 Feb 1994 EP
0651968 May 1995 EP
0655138 May 1995 EP
0894473 Feb 1999 EP
0908146 Apr 1999 EP
0930046 Jul 1999 EP
2417970 Sep 1979 FR
2618211 Jan 1989 FR
2094590 Sep 1982 GB
2164856 Apr 1986 GB
62327 Jan 1983 JP
2765738 Jun 1988 JP
63240851 Oct 1988 JP
3267054 Nov 1991 JP
6194639 Jul 1994 JP
WO-8809151 Dec 1988 WO
WO-8905123 Jun 1989 WO
WO-9005494 May 1990 WO
WO-9103982 Apr 1991 WO
WO-9104711 Apr 1991 WO
WO-9107726 May 1991 WO
WO-9203090 Mar 1992 WO
WO-9206645 Apr 1992 WO
WO-9404938 Mar 1994 WO
WO-9423647 Oct 1994 WO
WO-9424933 Nov 1994 WO
WO-9507055 Mar 1995 WO
WO-9611624 Apr 1996 WO
WO-9632059 Oct 1996 WO
WO-9736192 Oct 1997 WO
WO-9749453 Dec 1997 WO
WO-9808554 Mar 1998 WO
WO-9838908 Sep 1998 WO
WO-9915097 Apr 1999 WO
WO-9921498 May 1999 WO
WO-9923956 May 1999 WO
WO-9926549 Jun 1999 WO
WO-9927839 Jun 1999 WO
WO-9929253 Jun 1999 WO
WO-9933406 Jul 1999 WO
WO-9937208 Jul 1999 WO
WO-9938449 Aug 1999 WO
WO-9952094 Oct 1999 WO
WO-9960939 Dec 1999 WO
WO-0130437 May 2001 WO
Non-Patent Literature Citations (132)
Entry
“Prestige Cervical Disc System Surgical Technique”, 12 pgs.
Adams et al., “Orientation Aid for Head and Neck Surgeons,” Innov. Tech. Biol. Med., vol. 13, No. 4, 1992, pp. 409-424.
Adams et al., Computer-Assisted Surgery, IEEE Computer Graphics & Applications, pp. 43-51, (May 1990).
Barrick et al., “Prophylactic Intramedullary Fixation of the Tibia for Stress Fracture in a Professional Athlete,” Journal of Orthopaedic Trauma, vol. 6, No. 2, pp. 241-244 (1992).
Barrick et al., “Technical Difficulties with the Brooker-Wills Nail in Acute Fractures of the Femur,” Journal of Orthopaedic Trauma, vol. 6, No. 2, pp. 144-150 (1990).
Barrick, “Distal Locking Screw Insertion Using a Cannulated Drill Bit: Technical Note,” Journal of Orthopaedic Trauma, vol. 7, No. 3, 1993, pp. 248-251.
Batnitzky et al., “Three-Dimensional Computer Reconstructions of Brain Lesions from Surface Contours Provided by Computed Tomography: A Prospectus,” Neurosurgery, vol. 11, No. 1, Part 1, 1982, pp. 73-84.
Benzel, E. et al., Magnetic Source Imaging: A Review of the Magnes System of Biomagnetic Technologies Incorporated, Neurosurgery, vol. 33, No. 2 (Aug. 1993).
Bergstrom et al. Stereotaxic Computed Tomography, Am. J. Roentgenol, vol. 127 pp. 167-170 (1976).
Bouazza-Marouf et al.; “Robotic-Assisted Internal Fixation of Femoral Fractures”, IMECHE., pp. 51-58 (1995).
Brack et al., “Accurate X-ray Based Navigation in Computer-Assisted Orthopedic Surgery,” CAR '98, pp. 716-722.
BrainLab marketing brochures for Vector Vision (undated) (26 pages).
Brown, R., M.D., A Stereotactic Head Frame for Use with CT Body Scanners, Investigative Radiology © J.B. Lippincott Company, pp. 300-304 (Jul.-Aug. 1979).
Bryan, “Bryan Cervical Disc System Single Level Surgical Technique”, Spinal Dynamics, 2002, pp. 1-33.
Bucholz, R.D., et al. Image-guided surgical techniques for infections and trauma of the central nervous system, Neurosurg. Clinics of N.A., vol. 7, No. 2, pp. 187-200 (1996).
Bucholz, R.D., et al., A Comparison of Sonic Digitizers Versus Light Emitting Diode-Based Localization, Interactive Image-Guided Neurosurgery, Chapter 16, pp. 179-200 (1993).
Bucholz, R.D., et al., Intraoperative localization using a three dimensional optical digitizer, SPIE—The Intl. Soc. for Opt. Eng., vol. 1894, pp. 312-322 (Jan. 17-19, 1993).
Bucholz, R.D., et al., Intraoperative Ultrasonic Brain Shift Monitor and Analysis, Stealth Station Marketing Brochure (2 pages) (undated).
Bucholz, R.D., et al., The Correction of Stereotactic Inaccuracy Caused by Brain Shift Using an Intraoperative Ultrasound Device, First Joint Conference, Computer Vision, Virtual Reality and Robotics in Medicine and Medical Robotics and Computer-Assisted Surgery, Grenoble, France, pp. 459-466 (Mar. 19-22, 1997).
Bucholz, R.D., et al., Variables affecting the accuracy of stereotactic localization using computerized tomography, J. Neurosurg., vol. 79, pp. 667-673 (1993).
Champleboux et al., “Accurate Calibration of Cameras and Range Imaging Sensors: the NPBS Method,” IEEE International Conference on Robotics and Automation, Nice, France, May 1992.
Champleboux, “Utilisation de Fonctions Splines pour la Mise au Point D'un Capteur Tridimensionnel sans Contact,” Quelques Applications Medicales, Jul. 1991.
Cinquin et al., “Computer Assisted Medical Interventions,” International Advanced Robotics Programme, Sep. 1989, pp. 63-65.
Cinquin, P. et al., Computer Assisted Medical Interventions, IEEE, pp. 254-263 (May/Jun. 1995).
Clarysse, P. et al., A Computer-Assisted System for 3-D Frameless Localization in Stereotaxic MRI, IEEE Trans. on Med. Imaging, vol. 10, No. 4, pp. 523-529 (Dec. 1991).
Cutting M.D. et al., Optical Tracking of Bone Fragments During Craniofacial Surgery, Second Annual International Symposium on Medical Robotics and Computer Assisted Surgery, pp. 221-225, (Nov. 1995).
Feldmar et al., “3D-2D Projective Registration of Free-Form Curves and Surfaces,” Rapport de recherche (Inria Sophia Antipolis), 1994, pp. 1-44.
Foley et al., “Fundamentals of Interactive Computer Graphics,” The Systems Programming Series, Chapter 7, Jul. 1984, pp. 245-266.
Foley, K.T., et al., Image-guided Intraoperative Spinal Localization, Intraoperative Neuroprotection, Chapter 19, pp. 325-340 (1996).
Foley, K.T., et al., The StealthStation: Three-Dimensional Image-Interactive Guidance for the Spine Surgeon, Spinal Frontiers, pp. 7-9 (Apr. 1996).
Friets, E.M., et al. A Frameless Stereotaxic Operating Microscope for Neurosurgery, IEEE Trans. On Biomed. Eng., vol. 36, No. 6, pp. 608-617 (Jul. 1989).
Gallen, C.C., et al., Intracranial Neurosurgery Guided by Functional Imaging, Surg. Neurol., vol. 42, pp. 523-530 (1994).
Galloway, R.L., et al., Interactive Image-Guided Neurosurgery, IEEE Trans. on Biomed. Eng., vol. 39, No. 12, pp. 1226-1231 (1992).
Galloway, R.L., Jr. et al, Optical localization for interactive, image-guided neurosurgery, SPIE, vol. 2164, (May 1, 1994) pp. 137-145.
Germano, “Instrumentation, Technique and Technology”, Neurosurgery, vol. 37, No. 2, Aug. 1995, pp. 348-350.
Gildenberg et al., “Calculation of Stereotactic Coordinates from the Computed Tomographic Scan,” Neurosurgery, vol. 10, No. 5, May 1982, pp. 580-586.
Gomez, C.R., et al., Transcranial Doppler Ultrasound Following Closed Head Injury: Vasospasm or Vasoparalysis?, Surg. Neurol., vol. 35, pp. 30-35 (1991).
Gonzalez, “Digital Image Fundamentals,” Digital Image processing, Second Edition, 1987, pp. 52-54.
Gottesfeld Brown et al., “Registration of Planar Film Radiographs with Computer Tomography,” Proceedings of MMBIA, Jun. '96, pp. 42-51.
Grimson, W.E.L., An Automatic Registration Method for Frameless Stereotaxy, Image Guided Surgery, and enhanced Reality Visualization, IEEE, pp. 430-436 (1994).
Grimson, W.E.L., et al., Virtual-reality technology is giving surgeons the equivalent of x-ray vision helping them to remove tumors more effectively, to minimize surgical wounds and to avoid damaging critical tissues, Sci. Amer., vol. 280, No. 6, pp. 62-69 (Jun. 1999).
Gueziec et al., “Registration of Computed Tomography Data to a Surgical Robot Using Fluoroscopy: A Feasibility Study,” Computer Science/Mathematics, Sep. 27, 1996, 6 pages.
Guthrie, B.L., Graphic-Interactive Cranial Surgery: the Operating Arm System, Handbook of Stereotaxy Using the CRW Apparatus, Chapter 13, (1994) pp. 193-211.
Hamadeh et al, “Kinematic Study of Lumbar Spine Using Functional Radiographies and 3D/2D Registration,” TIMC UMR 5525—IMAG (1997).
Hamadeh et al., “Automated 3-Dimensional Computed Tomographic and Fluoroscopic Image Registration,” Computer Aided Surgery (1998), 3:11-19.
Hamadeh, A., et al., Toward automatic registration between CT and X-ray images: cooperation between 3D/2D registration and 2D edge detection, TIMC-IMAG, Faculte de Medecine de Grenoble, France, pp. 39-46 (1995) (Second Annual Intl. Symposium on Medical Robotics and Computer-Assisted Surgery, MRCAS '95, Nov. 4-7, 1995).
Hardy, T., M.D., et al., CASS: A Program for Computer Assisted Stereotaxic Surgery, The Fifth Annual Symposium on Computer Applications in Medical Care, Proceedings, Nov. 1-4, 1981, IEEE, pp. 1116-1126 (1981).
Hatch, “Reference-Display System for the Integration of CT Scanning and the Operating Microscope,” Thesis, Thayer School of Engineering, Oct. 1984, pp. 1-189.
Hatch, J.F., Reference-Display System for the Integration of CT Scanning and the Operating Microscope, IEEE, vol. 8, pp. 252-254, Proceedings of the Eleventh Annual Northeast Bioengineering Conference (Worcester, Massachusetts) (Mar. 14-15, 1985).
Heilbrun, M.D., Progressive Technology Applications, Neurosurgery for the Third Millennium, Chapter 15, J. Whitaker & Sons, Ltd., Amer. Assoc. of Neurol. Surgeons, pp. 191-198 (1992).
Heilbrun, M.P., Computed Tomography—Guided Stereotactic Systems, Clinical Neurosurgery, Chapter 31, pp. 564-581 (1983).
Heilbrun, M.P., et al., Preliminary Experience with Brown-Roberts-Wells (BRW) computerized tomography stereotaxic guidance system, J. Neurosurg., vol. 59, pp. 217-222 (1983).
Heilbrun, M.P., et al., Stereotactic Localization and Guidance Using a Machine Vision Technique, Stereotact. & Funct. Neurosurg., Proceed. of the Mtg. of the Amer. Soc. for Stereot. and Funct. Neurosurg. (Pittsburgh, PA) vol. 58, pp. 94-98 (1992).
Henderson, J.M., et al., An Accurate and Ergonomic Method of Registration for Image-guided Neurosurgery, Computerized Medical Imaging and Graphics, vol. 18, No. 4, pp. 273-277 (1994).
Hoerenz, “The Operating Microscope I. Optical Principles, Illumination Systems, and Support Systems,” Journal of Microsurgery, vol. 1, 1980, pp. 364-369.
Hofstetter, R., et al., Fluoroscopy Based Surgical Navigation - Concept and Clinical Applications, Computer Assisted Radiology and Surgery, CAR '97, Proceed. of the 11th Intl. Symp. and Exh., Berlin, pp. 956-960 (Jun. 25-28, 1997).
Horner, N.B. et al., A Comparison of CT-Stereotaxic Brain Biopsy Techniques, Investig. Radiol., vol. 19, pp. 367-373 (Sep.-Oct. 1984).
Hounsfield, “Computerized transverse axial scanning (tomography): Part 1. Description of system,” British Journal of Radiology, vol. 46, No. 552, Dec. 1973, pp. 1016-1022.
Jacques et al., “A Computerized Microstereotactic Method to Approach, 3-Dimensionally Reconstruct, Remove and Adjuvantly Treat Small CNS Lesions,” Applied Neurophysiology, vol. 43, 1980, pp. 176-182.
Jacques et al., “Computerized three-dimensional stereotaxic removal of small central nervous system lesion in patients,” J. Neurosurg., vol. 53, Dec. 1980, pp. 816-820.
Joskowicz et al., “Computer-Aided Image-Guided Bone Fracture Surgery: Concept and Implementation,” Car '98, pp. 710-715.
Kall, B., The Impact of Computer and Imaging Technology on Stereotactic Surgery, Proceedings of the Meeting of the American Society for Stereotactic and Functional Neurosurgery, pp. 10-22 (1987).
Kato, A., et al., A frameless, armless navigational system for computer-assisted neurosurgery, J. Neurosurg., vol. 74, pp. 845-849 (May 1991).
Kelly et al., “Computer-assisted stereotaxic laser resection of intra-axial brain neoplasms,” Journal of Neurosurgery, vol. 64, Mar. 1986, pp. 427-439.
Kelly et al., “Precision Resection of Intra-Axial CNS Lesions by CT-Based Stereotactic Craniotomy and Computer Monitored CO2 Laser,” Acta Neurochirurgica, vol. 68, 1983, pp. 1-9.
Kelly, P.J., Computer Assisted Stereotactic Biopsy and Volumetric Resection of Pediatric Brain Tumors, Brain Tumors in Children, Neurologic Clinics, vol. 9, No. 2, pp. 317-336 (May 1991).
Kelly, P.J., Computer-Directed Stereotactic Resection of Brain Tumors, Neurologica Operative Atlas, vol. 1, No. 4, pp. 299-313 (1991).
Kelly, P.J., et al., Results of Computed Tomography-based Computer-assisted Stereotactic Resection of Metastatic Intracranial Tumors, Neurosurgery, vol. 22, No. 1, Part 1, pp. 7-17 (Jan. 1988).
Kelly, P.J., Stereotactic Imaging, Surgical Planning and Computer-Assisted Resection of Intracranial Lesions: Methods and Results, Advances and Technical Standards in Neurosurgery, vol. 17, pp. 78-118, (1990).
Kim, W.S. et al., A Helmet Mounted Display for Telerobotics, IEEE, pp. 543-547 (1988).
Klimek, L., et al., Long-Term Experience with Different Types of Localization Systems in Skull-Base Surgery, Ear, Nose & Throat Surgery, Chapter 51, (1996) pp. 635-638.
Kosugi, Y., et al., An Articulated Neurosurgical Navigation System Using MRI and CT Images, IEEE Trans. on Biomed. Eng., vol. 35, No. 2, pp. 147-152 (Feb. 1988).
Krybus, W., et al., Navigation Support for Surgery by Means of Optical Position Detection, Computer Assisted Radiology Proceed. of the Intl. Symp. CAR '91 Computed Assisted Radiology, pp. 362-366 (Jul. 3-6, 1991).
Kwoh, Y.S., Ph.D., et al., A New Computerized Tomographic-Aided Robotic Stereotaxis System, Robotics Age, vol. 7, No. 6, pp. 17-22 (Jun. 1985).
Laitinen et al., “An Adapter for Computed Tomography-Guided, Stereotaxis,” Surg. Neurol., 1985, pp. 559-566.
Laitinen, “Noninvasive multipurpose stereoadapter,” Neurological Research, Jun. 1987, pp. 137-141.
Lavallee et al, “Matching 3-D Smooth Surfaces with their 2-D Projections using 3-D Distance Maps,” SPIE, vol. 1570, Geometric Methods in Computer Vision, 1991, pp. 322-336.
Lavallee et al., “Computer Assisted Driving of a Needle into the Brain,” Proceedings of the International Symposium CAR '89, Computer Assisted Radiology, 1989, pp. 416-420.
Lavallee et al., “Matching of Medical Images for Computed and Robot Assisted Surgery,” IEEE EMBS, Orlando, 1991.
Lavallee, “VI Adaption de la Methodologie a Quelques Applications Cliniques,” Chapitre VI, pp. 133-148.
Lavallee, S., et al., A new system for computer assisted neurosurgery, IEEE EMBS, 11th Annual Intl. Conf., pp. 926-927 (1989).
Lavallee, S., et al., Computer Assisted Interventionist Imaging: the Instance of Stereotactic Brain Surgery, MEDINFO 89, pp. 613-617 (1989).
Lavallee, S., et al., Computer Assisted Knee Anterior Cruciate Ligament Reconstruction First Clinical Tests, Proceedings of the First International Symposium on Medical Robotics and Computer Assisted Surgery, pp. 11-16 (Sep. 1994).
Lavallee, S., et al., Computer Assisted Medical Interventions, NATO Asi Series, vol. F 60, 3d Imaging in Medic., pp. 301-312 (1990).
Lavallee, S., et al., Computer Assisted Spine Surgery: a technique for accurate transpedicular screw fixation using CT data and a 3-D optical localizer, TIMC, Faculte de Medecine de Grenoble, J. of Image Guided Surg., vol. 1, No. 1, pp. 65-73 (1995).
Lavallee, S., et al., Image guided operating robot: a clinical application in stereotactic neurosurgery, IEEE Rob. and Autom. Society, Proc. of the 1992 Intl. Conf. on Rob. and Autom., May 1992, pp. 618-624, First Intl. Symp. on Med. Rob. and Comp. Assisted Surg. (Pittsburgh, PA) (Sep. 22-24, 1994).
Leavitt, D.D., et al., Dynamic Field Shaping to Optimize Stereotactic Radiosurgery, I.J. Rad. Onc. Biol. Physc., vol. 21, pp. 1247-1255 (1991).
Leksell et al., “Stereotaxis and Tomography—A Technical Note,” ACTA Neurochirurgica, vol. 52, 1980, pp. 1-7.
Lemieux, L., et al., A patient-to-computed-tomography image registration method based on digitally reconstructed radiographs, Med. Phys., vol. 21, No. 11, pp. 1749-1760 (1994).
Levin et al., “The Brain: Integrated Three-dimensional Display of MR and PET Images,” Radiology, vol. 172, No. 3, Sep. 1989, pp. 783-789.
Maurer, Jr., et al., Registration of Head CT Images to Physical Space Using a Weighted Combination of Points and Surfaces, IEEE Trans. on Med. Imaging, vol. 17, No. 5, pp. 753-761 (Oct. 1998).
Mazier et al., Chirurgie de la Colonne Vertebrale Assistee par Ordinateur: Application au Vissage Pediculaire, Innov. Tech. Biol. Med., vol. 11, No. 5, 1990, pp. 559-566.
Mazier, B., et al., Computer assisted interventionist imaging: application to the vertebral column surgery, Annual Intl. Conf. of the IEEE in Medic. and Biol. Soc., vol. 12, No. 1, pp. 430-431 (1990).
McGirr, S., M.D., et al., Stereotactic Resection of Juvenile Pilocytic Astrocytomas of the Thalamus and Basal Ganglia, Neurosurgery, vol. 20, No. 3, pp. 447-452, (1987).
Merloz, P., et al., Computer assisted Spine Surgery, Clinical Orthop. and Related Research, No. 337, pp. 86-96 (1997).
Ng, W.S. et al., Robotic Surgery—A First-Hand Experience in Transurethral Resection of the Prostate Surgery, IEEE Eng. in Med. and Biology, pp. 120-125 (Mar. 1993).
Pelizzari, C.A., et al., Accurate Three-Dimensional Registration of CT, PET, and/or MR Images of the Brain, Journal of Computer Assisted Tomography, vol. 13, No. 1, pp. 20-26 (Jan./Feb. 1989).
Pelizzari et al., “Interactive 3D Patient-Image Registration,” Information Processing in Medical Imaging, 12th International Conference, IPMI '91, Jul. 7-12, 136-141 (A.C.F. Colchester et al. eds. 1991).
Pelizzari et al., No. 528—“Three Dimensional Correlation of PET, CT and MRI Images,” The Journal of Nuclear Medicine, vol. 28, No. 4, Apr. 1987, p. 682.
Penn, R.D., et al., Stereotactic Surgery with Image Processing of Computerized Tomographic Scans, Neurosurgery, vol. 3, No. 2, pp. 157-163 (Sep.-Oct. 1978).
Phillips et al., “Image Guided Orthopaedic Surgery Design and Analysis,” Trans Inst. MC, vol. 17, No. 5, 1995, pp. 251-264.
Pixsys 3-D Digitizing Accessories, by Pixsys (marketing brochure)(undated) (2 pages).
Potamianos et al., “Intra-Operative Imaging Guidance for Keyhole Surgery Methodology and Calibration,” First International Symposium on Medical Robotics and Computer Assisted Surgery, Sep. 22-24, 1994, pp. 98-104.
Reinhardt et al., “CT-Guided ‘Real Time’ Stereotaxy,” ACTA Neurochirurgica, 1989.
Reinhardt, H., et al., A Computer-Assisted Device for Intraoperative CT-Correlated Localization of Brain Tumors, pp. 51-58 (1988).
Reinhardt, H.F. et al., Sonic Stereometry in Microsurgical Procedures for Deep-Seated Brain Tumors and Vascular Malformations, Neurosurgery, vol. 32, No. 1, pp. 51-57 (Jan. 1993).
Reinhardt, H.F., et al., Mikrochirurgische Entfernung tiefliegender Gefäßmißbildungen mit Hilfe der Sonar-Stereometrie (Microsurgical Removal of Deep-Seated Vascular Malformations Using Sonar Stereometry), Ultraschall in Med. 12, pp. 80-83 (1991).
Reinhardt, Hans. F., Neuronavigation: A Ten-Year Review, Neurosurgery, (1996) pp. 329-341.
Roberts, D.W., et al., A frameless stereotaxic integration of computerized tomographic imaging and the operating microscope. J. Neurosurg., vol. 65, pp. 545-549 (Oct. 1986).
Rosenbaum et al., “Computerized Tomography Guided Stereotaxis: A New Approach,” Applied Neurophysiology, vol. 43, No. 3-5, 1980, pp. 172-173.
Sautot, “Vissage Pediculaire Assiste Par Ordinateur,” Sep. 20, 1994.
Schueler et al., “Correction of Image Intensifier Distortion for Three-Dimensional X-Ray Angiography,” SPIE Medical Imaging 1995, vol. 2432, pp. 272-279.
Selvik et al., “A Roentgen Stereophotogrammetric System,” Acta Radiologica Diagnosis, 1983, pp. 343-352.
Shelden et al., “Development of a computerized microstereotaxic method for localization and removal of minute CNS lesions under direct 3-D vision,” J. Neurosurg., vol. 52, 1980, pp. 21-27.
Simon, D.A., Accuracy Validation in Image-Guided Orthopaedic Surgery, Second Annual Intl. Symp. on Med. Rob. and Comp.-Assisted Surgery, MRCAS (1995), pp. 185-192.
Smith et al., “Computer Methods for Improved Diagnostic Image Display Applied to Stereotactic Neurosurgery,” Automedical, vol. 14, 1992, pp. 371-382 (4 unnumbered pages).
Smith, K.R., et al., Multimodality Image Analysis and Display Methods for Improved Tumor Localization in Stereotactic Neurosurgery, Annual Intl. Conf. of the IEEE Eng. in Med. and Biol. Soc., vol. 13, No. 1, p. 210 (1991).
Smith, K.R., et al., The Neurostation: a highly accurate, minimally invasive solution to frameless stereotactic neurosurgery, Comput. Med. Imag. and Graph., vol. 18, No. 4, pp. 247-256 (1994).
Stereotactic One, Affordable PC Based Graphics for Stereotactic Surgery, Stereotactic Image Systems, Inc. (SLC, Utah) (marketing brochure, undated).
Tan, K., Ph.D., et al., A frameless stereotactic approach to neurosurgical planning based on retrospective patient-image registration, J. Neurosurg., vol. 79, pp. 296-303 (Aug. 1993).
The Laitinen Stereoadapter 500, Instructions for use. By Surgical Navigation Technologies, FDA-NS-001A Rev. 0 (undated).
The Laitinen Stereotactic System, E2-E6.
Thompson, et al., A System for Anatomical and Functional Mapping of the Human Thalamus, Computers and Biomedical Research, vol. 10, pp. 9-24 (1977).
Trobraugh, J.W., et al., Frameless Stereotactic Ultrasonography: Method and Applications, Computerized Medical Imaging and Graphics, vol. 18, No. 4, pp. 235-246 (1994).
Viant et al., “A Computer Assisted Orthopaedic System for Distal Locking of Intramedullary Nails,” Proc. of MediMEC '95, Bristol, 1995, pp. 86-91.
Von Hanwehr et al., Foreword, Computerized Medical Imaging and Graphics, vol. 18, No. 4, pp. 225-228 (Jul.-Aug. 1994).
Wang, M.Y., et al., An Automatic Technique for Finding and Localizing Externally Attached Markers in CT and MR Volume Images of the Head, IEEE Trans. on Biomed. Eng., vol. 43, No. 6, pp. 627-637 (Jun. 1996).
Watanabe et al., “Three-Dimensional Digitizer (Neuronavigator): New Equipment for Computed Tomography-Guided Stereotaxic Surgery,” Surgical Neurology, vol. 27, No. 6, Jun. 1987, pp. 543-547.
Watanabe, “Neuronavigator,” Igaku-no-Ayumi, vol. 137, No. 6, May 10, 1986, pp. 1-4.
Watanabe, E., M.D., et al., Open Surgery Assisted by the Neuronavigator, a Stereotactic, Articulated, Sensitive Arm, Neurosurgery, vol. 28, No. 6, pp. 792-800 (1991).
Weese et al., “An Approach to 2D/3D Registration of a Vertebra in 2D X-ray Fluoroscopies with 3D CT Images,” (1997) pp. 119-128.
Related Publications (1)
Number Date Country
20100137707 A1 Jun 2010 US
Continuations (2)
Number Date Country
Parent 10103685 Mar 2002 US
Child 12697841 US
Parent 09429569 Oct 1999 US
Child 10103685 US