Method and apparatus for positioning a reference frame

Information

  • Patent Grant
  • Patent Number
    8,467,851
  • Date Filed
    Monday, November 15, 2010
  • Date Issued
    Tuesday, June 18, 2013
Abstract
A method and apparatus for performing a procedure, which can include a processor-assisted surgical procedure. During the procedure, patient space and image space can be registered to allow for tracking of various tracking sensors. A dynamic reference frame can be used to maintain localization of the patient space with the image space. The dynamic reference frame can be fixedly interconnected with a bone portion of the anatomy.
Description
FIELD

The present disclosure relates generally to navigated surgery, and more specifically, to a method and apparatus for performing a surgical procedure to repair, localize, and/or replace a selected portion of an anatomy.


BACKGROUND

Image guided medical and surgical procedures utilize patient images obtained prior to or during a medical procedure to guide a physician performing the procedure. Such procedures can be referred to as computer assisted procedures. Recent advances in imaging technology, especially in technologies that produce highly detailed two-, three-, and four-dimensional images, such as computed tomography (CT), magnetic resonance imaging (MRI), fluoroscopic imaging (such as with a C-arm device), positron emission tomography (PET), and ultrasound imaging (US), have increased the interest in image guided medical procedures.


Typical image guided navigation systems generally require a dynamic reference frame to track the position of the patient should patient movement occur during the assisted procedure. The dynamic reference frame is generally affixed to the patient in a generally permanent or immovable fashion. The dynamic reference frame may also be used as a fiducial marker and may, therefore, be attached to the patient during the acquisition of pre-operative images. This enables the image space to be aligned with patient space during the navigated procedure.


Various instruments that are desired to be tracked may be used during an operative procedure. Image data is generally acquired, either intra-operatively or pre-operatively, and the instrument is generally illustrated and superimposed on the captured image data to identify the position of the instrument relative to the patient space. Therefore, the instrument may include tracking sensors, such as electromagnetic coils or optical detection points, such as LEDs or reflectors, that may be detected by a suitable tracking system. Also, a dynamic reference frame (DRF) can be used by the tracking system to maintain a registration or localization of the patient space to the image space. The DRF can also be any appropriate tracking sensor that is fixed to a portion of the patient that allows the system to determine whether the patient has moved relative to the image space.


Other types of navigation systems operate as an image-less system, where an image of the body is not captured by an imaging device prior to the medical procedure, such as the device disclosed in U.S. patent application Ser. No. 10/687,539, entitled Method And Apparatus For Surgical Navigation Of A Multiple Piece Construct For Implantation, filed Oct. 16, 2003, incorporated herein by reference. With this type of procedure, the system may use a probe to contact certain landmarks in the body, such as landmarks on bone, where the system generates either a two-dimensional or three-dimensional model of the area of interest based upon these contacts. This way, when the surgical instrument or other object is tracked relative to this area, it can be superimposed on this model.


Generally, regardless of whether the system is image-based or imageless, a dynamic reference frame is used to maintain registration of the patient space with the navigated or image space. The position of the patient can be determined in real time relative to the images, implant, instruments, etc. with the use of a dynamic reference frame.


Most types of orthopedic medical procedures, such as procedures on the spine, hip, knee, shoulder, a synovial joint, or a facet joint, are performed using conventional surgical techniques. These techniques generally involve opening the patient in a manner to provide adequate viewing by the surgeon during the medical procedure. Use of a navigated procedure may enable more precise and accurate placement of an implant within the patient and may also enable surgery with diminished visualization.


Although a dynamic reference frame can be attached to an external or skin portion of a patient, it may be desirable to attach the dynamic reference frame to a bone portion. In either case, it is desirable to allow the dynamic reference frame to be easily yet fixedly attached to the patient. It may also be desirable to fix the dynamic reference frame to the patient with a single member in an easy or simple procedure.


SUMMARY

According to various embodiments, a surgical navigation system to allow a processor assisted surgical procedure on an anatomy including a bone is disclosed. The system can include a tracking system operable to track a tracking sensor. A dynamic reference frame can include a tracking sensor to be tracked by the tracking system. Also, a dynamic reference frame positioning member can engage the bone in the anatomy to selectively fix the dynamic reference frame relative to the anatomy. The dynamic reference frame positioning member can be driven into the bone to hold the dynamic reference frame in a selected position.


According to various embodiments a dynamic reference frame positioning member can position a dynamic reference frame in a selected position relative to a bone. The positioning member can include a member extending between a first end and a second end. The member can define a dynamic reference frame engaging portion defined nearer the first end than the second end. Also a bone engaging portion can extend from near the second end. The bone engaging portion can include a first member and a second member extending at an angle relative to one another. Also, the bone engaging portion can be driven into the bone.


According to various embodiments a method of using a dynamic reference frame positioning member to position a dynamic reference frame relative to a selected portion of an anatomy including a bone is disclosed. The method can include positioning a cannula through a soft tissue portion of the anatomy relative to the bone portion. A dynamic reference frame positioning member can be positioned through the positioned cannula and positioned into engagement with the bone. The dynamic reference frame positioning member can be fixed relative to the bone in at least one of a rotational motion, an axial motion, a translational motion, or a combination thereof. Also, the cannula can be removed.





BRIEF DESCRIPTION OF THE DRAWINGS

The present teachings will become more fully understood from the detailed description and the accompanying drawings, wherein:



FIG. 1 is a diagram of a navigation system according to various teachings;



FIGS. 2A and 2B are diagrams representing undistorted and distorted views from a fluoroscopic imaging device;



FIG. 3 is a positioning member according to various embodiments;



FIG. 4 is a positioning member according to various embodiments;



FIG. 5 is a positioning member according to various embodiments;



FIG. 6 is a kit including various instruments to perform a procedure according to various embodiments;



FIG. 7 is an environmental view of a dilator and cannula;



FIG. 8 is an environmental view of positioning a positioning member according to various embodiments;



FIG. 9 is an environmental view of the removal of a positioning member according to various embodiments;



FIG. 10 is a positioning member and tap cap according to various embodiments; and



FIG. 11 is a modular tracking sensor connected to a positioning member according to various embodiments.





DETAILED DESCRIPTION OF VARIOUS EMBODIMENTS

The following description of various embodiments is merely exemplary in nature and is in no way intended to limit the teachings, their application, or uses. A method and apparatus are disclosed to perform a procedure that can include a processor-assisted surgical procedure. During the procedure, patient space and image space can be registered to allow for tracking of various tracking sensors. A dynamic reference frame can be selectively interconnected with a portion of the anatomy to maintain localization of the patient space with the image space. Although the following description describes the use of a dynamic reference frame positioning member in relation to a pelvis, it will be understood that the dynamic reference frame may be positioned in any portion of the anatomy. Further, the dynamic reference frame can be used for an orthopedic procedure, a spinal procedure, a cardiac procedure, or any other surgical or medical procedure.



FIG. 1 is a diagram illustrating an overview of an image-guided navigation system 10 that can be used for various procedures. The navigation system 10 can be used to track the location of an implant, such as a spinal implant or orthopedic implant, relative to a patient 14. Also the navigation system 10 can track the position and orientation of various instruments. It should further be noted that the navigation system 10 may be used to navigate any type of instrument, implant, or delivery system, including: guide wires, arthroscopic systems, orthopedic implants, spinal implants, etc. Moreover, these instruments may be used to navigate or map any region of the body. The navigation system 10 and the various instruments may be used in any appropriate procedure, such as one that is generally minimally invasive or an open procedure.


The navigation system 10 may include an optional imaging device 12 that is used to acquire pre-, intra-, or post-operative or real-time image data of a patient 14. Alternatively various imageless systems can be used or images from atlas models can be used to produce patient images, such as those disclosed in U.S. patent application Ser. No. 10/687,539, filed Oct. 16, 2003, entitled “METHOD AND APPARATUS FOR SURGICAL NAVIGATION OF A MULTIPLE PIECE CONSTRUCT FOR IMPLANTATION”, incorporated herein by reference. The optional imaging device 12 is, for example, a fluoroscopic x-ray imaging device that may be configured as a C-arm 16 having an x-ray source 18, an x-ray receiving section 20, an optional calibration and tracking target 22 and optional radiation sensors 24.


Image data may also be acquired using other imaging devices, such as those discussed above and herein. The calibration and tracking target 22 includes calibration markers 26 (see FIGS. 2A-2B), further discussed herein. An optional imaging device controller 28, which may control the C-arm 16, can capture the x-ray images received at the receiving section 20 and store the images for later use. The controller 28 may also be separate from the C-arm 16 and/or control the rotation of the C-arm 16. For example, the C-arm 16 can move in the direction of arrow 30 or rotate about a longitudinal axis 14a of the patient 14, allowing anterior or lateral views of the patient 14 to be imaged. Each of these movements involves rotation about a mechanical axis 32 of the C-arm 16.


In the example of FIG. 1, the longitudinal axis 14a of the patient 14 is substantially in line with the mechanical axis 32 of the C-arm 16. This enables the C-arm 16 to be rotated relative to the patient 14, allowing images of the patient 14 to be taken from multiple directions or about multiple planes. An example of a fluoroscopic C-arm x-ray device that may be used as the optional imaging device 12 is the “Series 9600 Mobile Digital Imaging System,” from OEC Medical Systems, Inc., of Salt Lake City, Utah. Other exemplary fluoroscopes include bi-plane fluoroscopic systems, ceiling fluoroscopic systems, cath-lab fluoroscopic systems, fixed C-arm fluoroscopic systems, isocentric C-arm fluoroscopic systems, 3D fluoroscopic systems, etc.


In operation, the imaging device 12 generates x-rays from the x-ray source 18 that propagate through the patient 14 and calibration and/or tracking target 22, into the x-ray receiving section 20. It will be understood that the tracking target need not include a calibration portion. The receiving section 20 generates image data representing the intensities of the received x-rays. Typically, the receiving section 20 includes an image intensifier that first converts the x-rays to visible light and a charge coupled device (CCD) video camera that converts the visible light into digital image data. Receiving section 20 may also be a digital device that converts x-rays directly to digital image data for forming images, thus potentially avoiding distortion introduced by first converting to visible light. With this type of digital C-arm, which is generally a flat panel device, the optional calibration and/or tracking target 22 and the calibration process discussed below may be eliminated. Also, the calibration process may be eliminated or not used at all for various procedures. Alternatively, the imaging device 12 may only take a single image with the calibration and tracking target 22 in place. Thereafter, the calibration and tracking target 22 may be removed from the line-of-sight of the imaging device 12.


Two dimensional fluoroscopic images that may be taken by the imaging device 12 are captured and stored in the C-arm controller 28. Multiple two-dimensional images taken by the imaging device 12 may also be captured and assembled to provide a larger view or image of a whole region of a patient, as opposed to being directed to only a portion of a region of the patient. For example, multiple image data of a patient's leg may be appended together to provide a full view or complete set of image data of the leg that can later be used to follow a contrast agent, such as for bolus tracking.
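As an illustration of appending multiple acquisitions into one larger view, the sketch below stacks equally sized frames along the patient's long axis. It assumes, for simplicity, that consecutive frames overlap by a known fixed number of rows; in a real system the offset would come from the tracked C-arm pose. The function name `append_images` and the frame dimensions are hypothetical.

```python
import numpy as np

def append_images(frames, overlap_px):
    """Append equally sized 2D fluoroscopic frames along the
    patient's long axis, discarding a fixed per-frame overlap
    (a simplifying assumption stated in the lead-in)."""
    stitched = frames[0]
    for frame in frames[1:]:
        # Drop the overlapping rows of each subsequent frame.
        stitched = np.vstack([stitched, frame[overlap_px:]])
    return stitched

# Three 100-row frames with a 20-row overlap: 100 + 2 * 80 = 260 rows.
frames = [np.random.rand(100, 128) for _ in range(3)]
print(append_images(frames, 20).shape)  # (260, 128)
```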


The image data is then forwarded from the C-arm controller 28 to a navigation computer and/or processor controller or work station 34 having a display 36 and a user interface 38. It will also be understood that the image data is not necessarily first retained in the controller 28, but may also be directly transmitted to the navigation computer 34. The work station 34 provides facilities for displaying the image data as an image on the display 36, saving, digitally manipulating, or printing a hard copy image of the received image data. The user interface 38, which may be a keyboard, mouse, touch pen, touch screen or other suitable device, allows a physician or user to provide inputs to control the imaging device 12, via the C-arm controller 28, or adjust the display settings of the display 36. The work station 34 may also direct the C-arm controller 28 to adjust the rotational axis 32 of the C-arm 16 to obtain various two-dimensional images along different planes in order to generate representative two-dimensional and three-dimensional images.


When the x-ray source 18 generates the x-rays that propagate to the x-ray receiving section 20, the radiation sensors 24 sense the presence of radiation, which is forwarded to the C-arm controller 28, to identify whether or not the imaging device 12 is actively imaging. This information is also transmitted to a coil array controller 48, further discussed herein. Alternatively, a person or physician may manually indicate when the imaging device 12 is actively imaging or this function can be built into the x-ray source 18, x-ray receiving section 20, or the control computer 28.


Optional imaging devices 12, such as the fluoroscopic C-arm 16, that do not include a digital receiving section 20 generally require the optional calibration and/or tracking target 22. This is because the raw images generated by the receiving section 20 tend to suffer from undesirable distortion caused by a number of factors, including inherent image distortion in the image intensifier and external electromagnetic fields. An empty undistorted or ideal image and an empty distorted image are shown in FIGS. 2A and 2B, respectively. The checkerboard shape, shown in FIG. 2A, represents the ideal image 40 of the checkerboard-arranged calibration markers 26. The image taken by the receiving section 20, however, can suffer from distortion, as illustrated by the distorted calibration marker image 42, shown in FIG. 2B.


Intrinsic calibration, which is the process of correcting image distortion in a received image and establishing the projective transformation for that image, involves placing the calibration markers 26 in the path of the x-ray, where the calibration markers 26 are opaque or semi-opaque to the x-rays. The calibration markers 26 are rigidly arranged in pre-determined patterns in one or more planes in the path of the x-rays and are visible in the recorded images. Because the true relative positions of the calibration markers 26 in the recorded images are known, the C-arm controller 28 or the work station or computer 34 is able to calculate an amount of distortion at each pixel in the image (where a pixel is a single point in the image). Accordingly, the computer or work station 34 can digitally compensate for the distortion in the image and generate a distortion-free or at least a distortion-improved image 40 (see FIG. 2A). More detailed explanations of exemplary methods for performing intrinsic calibration are described in the references: B. Schuele, et al., "Correction of Image Intensifier Distortion for Three-Dimensional Reconstruction," presented at SPIE Medical Imaging, San Diego, Calif., 1995; G. Champleboux, et al., "Accurate Calibration of Cameras and Range Imaging Sensors: the NPBS Method," Proceedings of the IEEE International Conference on Robotics and Automation, Nice, France, May, 1992; and U.S. Pat. No. 6,118,845, entitled "System And Methods For The Reduction And Elimination Of Image Artifacts In The Calibration Of X-Ray Imagers," issued Sep. 12, 2000, the contents of which are each hereby incorporated by reference.
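The distortion estimate described above can be sketched as a least-squares polynomial fit from observed (distorted) marker positions to their known ideal grid positions. This is an illustrative sketch only, assuming a low-order polynomial distortion model; `fit_distortion_map` and the synthetic radial warp below are hypothetical and are not the methods of the cited references.

```python
import numpy as np

def fit_distortion_map(observed, ideal, order=3):
    """Least-squares fit of a 2D polynomial mapping distorted marker
    coordinates back to their known ideal grid positions. `observed`
    and `ideal` are (N, 2) arrays of (x, y) points. Real systems use
    denser marker grids and richer distortion models."""
    def design(pts):
        x, y = pts[:, 0], pts[:, 1]
        # Monomial basis x^i * y^j up to the given total order.
        return np.column_stack([x**i * y**j
                                for i in range(order + 1)
                                for j in range(order + 1 - i)])
    coeff, *_ = np.linalg.lstsq(design(observed), ideal, rcond=None)
    return lambda pts: design(pts) @ coeff

# Synthetic example: a mild radial warp applied to a 5x5 marker grid.
gx, gy = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
ideal = np.column_stack([gx.ravel(), gy.ravel()])
r2 = (ideal**2).sum(axis=1, keepdims=True)
observed = ideal * (1 + 0.02 * r2)              # distorted marker positions
undistort = fit_distortion_map(observed, ideal)
residual = np.abs(undistort(observed) - ideal).max()
print(residual < np.abs(observed - ideal).max())  # True: distortion reduced
```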


While the optional imaging device 12 is shown in FIG. 1, any other alternative 2D, 3D or 4D imaging modality may also be used. For example, any 2D, 3D or 4D imaging device, such as isocentric fluoroscopy, bi-plane fluoroscopy, ultrasound, computed tomography (CT), multi-slice computed tomography (MSCT), magnetic resonance imaging (MRI), high-intensity focused ultrasound (HIFU), positron emission tomography (PET), optical coherence tomography (OCT), intra-vascular ultrasound (IVUS), intra-operative CT or MRI, may also be used to acquire 2D, 3D or 4D pre- or post-operative and/or real-time images or image data of the patient 14. The images may also be obtained and displayed in two, three or four dimensions. In more advanced forms, four-dimensional surface rendering regions of the body may also be achieved by incorporating patient data or other data from an atlas or anatomical model map or from pre-operative image data captured by MRI, CT, or echocardiography modalities. A more detailed discussion of optical coherence tomography (OCT) is set forth in U.S. Pat. No. 5,740,808, issued Apr. 21, 1998, entitled "Systems And Methods For Guiding Diagnostic Or Therapeutic Devices In Interior Tissue Regions," which is hereby incorporated by reference.


Image datasets from hybrid modalities, such as positron emission tomography (PET) combined with CT, or single photon emission computed tomography (SPECT) combined with CT, could also provide functional image data superimposed onto anatomical data to be used to confidently reach target sites within the patient 14. It should further be noted that the optional imaging device 12, as shown in FIG. 1, provides a virtual bi-plane image using a single-head C-arm fluoroscope as the optional imaging device 12 by simply rotating the C-arm 16 about at least two planes, which could be orthogonal planes, to generate two-dimensional images that can be converted to three-dimensional volumetric images. By acquiring images in more than one plane, an icon representing the location of an impacter, stylet, reamer driver, taps, drill, or other instrument, introduced and advanced in the patient 14, may be superimposed in more than one view on display 36 allowing simulated bi-plane or even multi-plane views, including two and three-dimensional views.


These types of imaging modalities may provide certain distinct benefits for their use. For example, magnetic resonance imaging (MRI) is generally performed pre-operatively using a non-ionizing field. This type of imaging provides very good tissue visualization in three-dimensional form and also provides anatomy and functional information from the imaging. MRI imaging data is generally registered and compensated for motion correction using dynamic reference frames (DRF) discussed further herein.


Positron emission tomography (PET) imaging is generally a pre-operative imaging procedure that exposes the patient to some level of radiation to provide a 3D image. PET imaging provides functional information and also generally requires registration and motion correction using dynamic reference frames.


Computed tomography (CT) imaging is also generally a pre-operative technique that exposes the patient to a limited level of radiation. CT imaging, however, is a very fast imaging procedure. A multi-slice CT system provides 3D images having good resolution and anatomy information. Again, CT imaging is generally registered and needs to account for motion correction, via dynamic reference frames.


Fluoroscopy imaging is generally an intra-operative imaging procedure that exposes the patient to certain amounts of radiation to provide either two-dimensional or rotational three-dimensional images. Fluoroscopic images generally provide good resolution and anatomy information. Fluoroscopic images can be either manually or automatically registered and also need to account for motion correction using dynamic reference frames.


Ultrasound imaging is also generally an intra-operative procedure that uses a non-ionizing field to provide 2D, 3D, or 4D imaging, including anatomy and blood flow information. Ultrasound imaging provides automatic registration and does not need to account for any motion correction.


With continuing reference to FIG. 1, the navigation system 10 can further include an electromagnetic navigation or tracking system 44 that includes a localizer, such as a transmitter coil array 46, the coil array controller 48, a navigation probe interface 50, an instrument 52 and a dynamic reference frame 54. The dynamic reference frame 54 can be interconnected with a removable tracking sensor 54a or can include a more integral tracking sensor 54aa and a dynamic reference frame positioning member 80, according to various embodiments. It will be understood that reference to either the tracking sensor 54a or the integral tracking sensor 54aa can be a reference to either, unless specifically taught otherwise. Generally, the tracking sensor 54aa is tracked by the navigation system and the dynamic reference frame positioning member 80 fixes, as discussed further herein, the tracking sensor 54aa relative to the patient 14.


The instrument 52 may be any appropriate instrument, such as an instrument for preparing a portion of the patient or positioning an implant. The transmitter coil array 46 may also be supplemented or replaced with a mobile localizer 46a. The mobile localizer 46a may be one such as that described in U.S. patent application Ser. No. 10/941,782, filed Sep. 15, 2004, and entitled "METHOD AND APPARATUS FOR SURGICAL NAVIGATION", herein incorporated by reference. It will be understood that the tracking system may be any appropriate tracking system, such as an optical localizer, illustrated in phantom at 47, such as the StealthStation® TRIA™ sold by Medtronic Navigation of Louisville, Colo. Other localization systems include acoustic, radiation-based, and other systems.


Further included in the navigation system 10 may be an isolator circuit or box 55. The isolator circuit or box 55 may be included in a transmission line to interrupt a line carrying a signal or a voltage to the navigation probe interface 50. Alternatively, the isolator circuit included in the isolator box 55 may be included in the navigation probe interface 50, the instrument 52, the dynamic reference frame 54, the transmission lines coupling the devices, or any other appropriate location. The isolator box 55 is operable to isolate any of the instruments or patient coincidence instruments or portions that are in contact with the patient should an undesirable electrical surge or voltage take place.


It should further be noted that the entire tracking system 44 or parts of the tracking system 44 may be incorporated into the imaging device 12, including the work station 34 and radiation sensors 24. Incorporating the tracking system 44 may provide an integrated imaging and tracking system. Any combination of these components may also be incorporated into the imaging system 12, which again can include a fluoroscopic C-arm imaging device or any other appropriate imaging device.


The transmitter coil array 46 is shown attached to the receiving section 20 of the C-arm 16. It should be noted, however, that the transmitter coil array 46 may also be positioned at any other location as well. For example, the transmitter coil array 46 may be positioned at the x-ray source 18, within or atop the OR table 56 positioned below the patient 14, on siderails associated with the table 56, or positioned on the patient 14 in proximity to the region being navigated, such as on the patient's chest. The transmitter coil array 46 may also be positioned in the items being navigated, further discussed herein. The transmitter coil array 46 includes a plurality of coils that are each operable to generate distinct electromagnetic fields into the navigation region of the patient 14, which is sometimes referred to as patient space. Representative electromagnetic systems are set forth in U.S. Pat. No. 5,913,820, entitled “Position Location System,” issued Jun. 22, 1999 and U.S. Pat. No. 5,592,939, entitled “Method and System for Navigating a Catheter Probe,” issued Jan. 14, 1997, each of which are hereby incorporated by reference.


The transmitter coil array 46 is controlled or driven by the coil array controller 48. The coil array controller 48 drives each coil in the transmitter coil array 46 in a time division multiplex or a frequency division multiplex manner. In this regard, each coil may be driven separately at a distinct time or all of the coils may be driven simultaneously with each being driven by a different frequency. Upon driving the coils in the transmitter coil array 46 with the coil array controller 48, electromagnetic fields are generated within the patient 14 in the area where the medical procedure is being performed, which is again sometimes referred to as patient space. The electromagnetic fields generated in the patient space induce currents in a sensor 58 positioned on or in the instrument 52. These induced signals from the instrument 52 are delivered to the navigation probe interface 50 through the isolation circuit 55 and subsequently forwarded to the coil array controller 48. The navigation probe interface 50 may provide all the necessary electrical isolation for the navigation system 10. Alternatively, the electrical isolation may also be provided in the isolator box 55. Nevertheless, the isolator assembly 55 may be included in the navigation probe interface 50 or may be integrated into the instrument 52, and any other appropriate location. The navigation probe interface 50 can also include amplifiers, filters and buffers to directly interface with the sensors 58 in the instrument 52. Alternatively, the instrument 52 may employ a wireless communications channel, such as that disclosed in U.S. Pat. No. 6,474,341, entitled “Surgical Communication Power System,” issued Nov. 5, 2002, herein incorporated by reference, as opposed to being coupled directly to the navigation probe interface 50.
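The frequency-division variant described above can be sketched numerically: all coils are driven simultaneously, each at its own frequency, the sensor sees the superposition, and correlating against each drive tone separates the per-coil couplings (in the time-division variant, coils would instead be energized one at a time). The frequencies, coupling values, and sample rate below are illustrative assumptions, not system parameters from the patent.

```python
import numpy as np

fs, n = 10_000, 1_000                  # sample rate (Hz) and sample count
t = np.arange(n) / fs
freqs = [300, 400, 500]                # one distinct drive frequency per coil
couplings = [0.8, 0.3, 0.5]            # unknown coil-to-sensor couplings

# Frequency-division multiplexing: the sensor signal is the
# superposition of all simultaneously driven coils.
sensor = sum(c * np.sin(2 * np.pi * f * t) for c, f in zip(couplings, freqs))

# Demodulate: correlating against each drive tone recovers each
# coil's coupling despite the simultaneous transmission.
recovered = [2 / n * np.abs(np.sum(sensor * np.exp(-2j * np.pi * f * t)))
             for f in freqs]
print(np.round(recovered, 2))  # approximately [0.8, 0.3, 0.5]
```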


Various portions of the navigation system 10, such as the instrument 52, the dynamic reference frame (DRF) 54, the probe 66, and others as will be described in detail below, are equipped with at least one, and generally multiple, tracking sensors 58, that may also be referred to as localization sensors. The instrument 52 can be a handle or inserter that interconnects with an attachment and may assist in placing an implant or in driving a portion. The instrument 52 can include a graspable or manipulable portion at a proximal end, and the tracking sensor 58 may be fixed near the manipulable portion of the instrument 52. The tracking sensor 58 may be any appropriate tracking sensor, such as an optical sensor, an acoustic sensor, or an electromagnetic sensor. If the sensor 58 includes an electromagnetic sensor, the electromagnetic field generated by the transmitter coil array 46 may induce a current in the electromagnetic sensor 58. An alternative sensor may include an optical sensor, such as the optical sensor 58a, and may be used in addition to or in place of the electromagnetic sensor 58. The optical sensor may work with the optional optical array 47.


In an alternate embodiment, the electromagnetic sources or generators may be located within the instrument 52, DRF 54 (such as the integral tracking sensor 54aa), or probe 66, and one or more receiver coils may be provided externally to the patient 14 forming a receiver coil array similar to the transmitter coil array 46. In this regard, the tracking sensors 58 could generate electromagnetic fields that would be received by the receiving coils in the receiving coil array similar to the transmitter coil array 46. Other types of tracking systems include optical, acoustic, electrical field, RF and accelerometers. Accelerometers enable both dynamic sensing due to motion and static sensing due to gravity. An additional representative alternative localization and tracking system is set forth in U.S. Pat. No. 5,983,126, entitled "Catheter Location System and Method," issued Nov. 9, 1999, which is hereby incorporated by reference. Alternatively, the localization system may be a hybrid system that includes components from various systems.


The dynamic reference frame 54 of the tracking system 44 is also coupled to the navigation probe interface 50 to forward the information to the coil array controller 48. The dynamic reference frame 54, according to various embodiments, may include a small magnetic field detector. The dynamic reference frame 54 may be fixed to the patient 14 adjacent to the region being navigated so that any movement of the patient 14 is detected as relative motion between the transmitter coil array 46 and the dynamic reference frame 54. The dynamic reference frame 54 can be interconnected with the patient in any appropriate manner, including those discussed herein. This relative motion is forwarded to the coil array controller 48, which updates registration correlation and maintains accurate navigation, further discussed herein. The dynamic reference frame 54 may be any appropriate tracking sensor used as the dynamic reference frame 54 in the navigation system 10. Therefore, the dynamic reference frame 54 may also be optical, acoustic, etc. If the dynamic reference frame 54 is electromagnetic, it can be configured as a pair of orthogonally oriented coils, each having the same center, or may be configured in any other non-coaxial or coaxial coil configuration.
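The registration update described above can be sketched as transform bookkeeping: when the dynamic reference frame reports a new pose relative to the transmitter coil array, the detected patient motion is folded into the image-to-patient transform so image space stays locked to the moved patient. The 4x4 homogeneous-matrix formulation and the `update_registration` name are illustrative assumptions, not the actual controller logic.

```python
import numpy as np

def update_registration(image_T_patient, drf_pose_old, drf_pose_new):
    """Fold detected patient motion into the registration.
    All arguments are 4x4 homogeneous transforms; image_T_patient
    maps patient (tracker) coordinates to image coordinates."""
    # Patient motion in the tracker frame, inferred from the DRF.
    patient_motion = drf_pose_new @ np.linalg.inv(drf_pose_old)
    # Undo that motion before applying the original registration.
    return image_T_patient @ np.linalg.inv(patient_motion)

T = np.eye(4)                              # initial registration (identity)
old = np.eye(4)
new = np.eye(4)
new[:3, 3] = [0.0, 0.0, 10.0]              # patient slid 10 mm along z
T2 = update_registration(T, old, new)
pt = np.array([5.0, 0.0, 10.0, 1.0])       # anatomy point after the slide
print((T2 @ pt)[:3])                       # maps back to the pre-slide image point
```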


The dynamic reference frame 54 may be affixed externally to the patient 14, adjacent to the region of navigation, such as on the patient's chest or pelvis, as shown in FIG. 1. The dynamic reference frame 54 can be affixed to the patient's skin, by way of a selected adhesive patch and/or a tensioning system. The dynamic reference frame 54 may also be removably attachable to fiducial markers 60 also positioned on the patient's body and further discussed herein. The dynamic reference frame 54 can also be connected to a bone portion of the anatomy. The bone portion can be adjacent to the area of the procedure, can be the bone of the procedure, or can be any appropriate bone portion.


The dynamic reference frame 54 may also be attached to various bony portions such as a femur, pelvis, cranium, or other bony portions. The movement of various portions, such as the instrument 52, relative to these bony portions can then be determined, even if the bony portion is also moved. This may assist in positioning an implant or in performing a planned procedure.


Briefly, the navigation system 10 operates as follows. The navigation system 10 creates a translation map between all points in the radiological image generated from the imaging device 12 and the corresponding points in the patient's anatomy in patient space. After this map is established, whenever a tracked instrument, such as the instrument 52 or a pointing device or probe 66 is used, the work station 34 in combination with the coil array controller 48 and the C-arm controller 28 uses the translation map to identify the corresponding point on the pre-acquired image or atlas model, which is displayed on display 36. This identification is known as navigation or localization. An icon representing the localized point or instruments is shown on the display 36 within several two-dimensional image planes, as well as on three and four dimensional images and models.
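For illustration only, the translation map described above can be understood as a homogeneous transform that carries points from patient space into image space. The following minimal sketch uses hypothetical function and variable names (they do not appear in the navigation system 10 itself) and assumes a rigid transform:

```python
import numpy as np

def to_homogeneous(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    transform = np.eye(4)
    transform[:3, :3] = rotation
    transform[:3, 3] = translation
    return transform

def map_to_image_space(registration, point_patient):
    """Map a tracked point from patient space into image space."""
    point_h = np.append(point_patient, 1.0)  # homogeneous coordinates
    return (registration @ point_h)[:3]

# Example: the two spaces differ by a pure translation of +10 mm along x.
registration = to_homogeneous(np.eye(3), np.array([10.0, 0.0, 0.0]))
tip_patient = np.array([5.0, 2.0, 3.0])  # instrument tip in patient space (mm)
tip_image = map_to_image_space(registration, tip_patient)
# tip_image is [15.0, 2.0, 3.0]
```

Once such a map is established, any tracked point (for example, the tip of the instrument 52) can be converted to image coordinates and drawn as an icon on the display.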


To enable navigation, the navigation system 10 must be able to detect both the position of the patient's anatomy and the position of the instrument 52 or attachment member attached to the instrument 52. Knowing the location of these two items allows the navigation system 10 to compute and display the position of the instrument 52 in relation to the patient 14. The tracking system 44 is employed to track the instrument 52 and the anatomy simultaneously.


The tracking system 44, if it is using an electromagnetic tracking assembly, essentially works by positioning the transmitter coil array 46 adjacent to the patient space to generate a low-energy magnetic field generally referred to as a navigation field. Because every point in the navigation field or patient space is associated with a unique field strength, the electromagnetic tracking system 44 can determine the position of the instrument 52 by measuring the field strength at the tracking sensor 58 location. The dynamic reference frame 54 is fixed to the patient 14 to identify the location of the patient in the navigation field. The electromagnetic tracking system 44 continuously recomputes the relative position of the dynamic reference frame 54 and the instrument 52 during localization and relates this spatial information to patient registration data to enable image guidance of the instrument 52 within and/or relative to the patient 14.


Patient registration is the process of determining how to correlate the position of the instrument 52 relative to the patient 14 to the position on the diagnostic or pre-acquired images. To register the patient 14, a physician or user 67 may use point registration by selecting and storing particular points from the pre-acquired images and then touching the corresponding points on the patient's anatomy with the pointer probe 66. The navigation system 10 analyzes the relationship between the two sets of points that are selected and computes a match, which correlates every point in the image data with its corresponding point on the patient's anatomy or the patient space. The points that are selected to perform registration are the fiducial markers or landmarks 60, such as anatomical landmarks. Again, the landmarks or fiducial points 60 are identifiable on the images and identifiable and accessible on the patient 14. The landmarks 60 can be artificial landmarks 60 that are positioned on the patient 14 or anatomical landmarks that can be easily identified in the image data. The artificial landmarks, such as the fiducial markers 60, can also form part of the dynamic reference frame 54, such as those disclosed in U.S. Pat. No. 6,381,485, entitled “Registration of Human Anatomy Integrated for Electromagnetic Localization,” issued Apr. 30, 2002, herein incorporated by reference.
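The point-matching computation described above can be implemented, for example, as a least-squares rigid fit between the two fiducial point sets. The sketch below uses the well-known Kabsch (SVD-based) method as one possible realization; it is not asserted to be the specific algorithm of the navigation system 10, and the fiducial coordinates are hypothetical:

```python
import numpy as np

def rigid_register(image_points, patient_points):
    """Least-squares rigid fit (Kabsch): returns R, t such that
    R @ patient_point + t approximates the matching image point."""
    centroid_img = image_points.mean(axis=0)
    centroid_pat = patient_points.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (patient_points - centroid_pat).T @ (image_points - centroid_img)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # correct an improper rotation (reflection)
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = centroid_img - R @ centroid_pat
    return R, t

# Hypothetical fiducials: patient-space points rotated 90 degrees about z
# plus a translation should be recovered exactly.
patient = np.array([[0.0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]])
true_R = np.array([[0.0, -1, 0], [1, 0, 0], [0, 0, 1]])
true_t = np.array([5.0, -3.0, 2.0])
image = patient @ true_R.T + true_t
R, t = rigid_register(image, patient)
# R and t recover the true rotation and translation.
```

With at least three non-collinear fiducial points the fit is unique, which is why the landmarks 60 must be both identifiable in the images and accessible on the patient 14.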


The system 10 may also perform registration using anatomic surface information or path information as is known in the art. The system 10 may also perform 2D to 3D registration by utilizing the acquired 2D images to register 3D volume images by use of contour algorithms, point algorithms or density comparison algorithms, as is known in the art. An exemplary 2D to 3D registration procedure is set forth in U.S. Ser. No. 60/465,615, entitled “Method and Apparatus for Performing 2D to 3D Registration” filed on Apr. 25, 2003, hereby incorporated by reference.


In order to maintain registration accuracy, the navigation system 10 continuously tracks the position of the patient 14 during registration and navigation. This is because the patient 14, dynamic reference frame 54, and transmitter coil array 46 may all move during the procedure, even when this movement is not desired. Therefore, if the navigation system 10 did not track the position of the patient 14 or area of the anatomy, any patient movement after image acquisition would result in inaccurate navigation within that image. The dynamic reference frame 54 allows the electromagnetic tracking device 44 to register and track the anatomy. Because the dynamic reference frame 54 is rigidly fixed to the patient 14, any movement of the anatomy or the transmitter coil array 46 is detected as the relative motion between the transmitter coil array 46 and the dynamic reference frame 54. This relative motion is communicated to the coil array controller 48, via the navigation probe interface 50, which updates the registration correlation to thereby maintain accurate navigation.
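The correction described above amounts to composing the original registration with the measured motion of the dynamic reference frame 54, so that a tissue point maps to the same image-space location before and after the patient moves. The following sketch is illustrative only; the transform names are hypothetical and a simple translation stands in for arbitrary patient motion:

```python
import numpy as np

def update_registration(reg0, drf_at_reg, drf_now):
    """Compose the original registration with the DRF's measured motion so the
    image-space mapping stays valid after the patient moves.
    All arguments are 4x4 homogeneous transforms in the tracker frame."""
    motion = drf_now @ np.linalg.inv(drf_at_reg)  # patient motion since registration
    return reg0 @ np.linalg.inv(motion)

def transform(T, p):
    return (T @ np.append(p, 1.0))[:3]

# Hypothetical example: patient (and DRF) slides +5 mm along y after registration.
reg0 = np.eye(4)                      # identity registration at time zero
drf0 = np.eye(4)                      # DRF pose at registration time
drf1 = np.eye(4)
drf1[:3, 3] = [0.0, 5.0, 0.0]         # DRF pose after the shift
reg1 = update_registration(reg0, drf0, drf1)

anatomy_point_t0 = np.array([1.0, 2.0, 3.0])       # tracker-frame location, time 0
anatomy_point_t1 = anatomy_point_t0 + [0, 5.0, 0]  # same tissue after the shift
# Both observations map to the same image-space point:
# transform(reg1, anatomy_point_t1) == transform(reg0, anatomy_point_t0)
```

Because the dynamic reference frame 54 is rigidly fixed to the anatomy, its measured motion is exactly the motion that must be divided out of the registration.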


The navigation system 10 can be used according to any appropriate method or system. For example, pre-acquired images, atlas or 3D models may be registered relative to the patient and patient space. Generally, the navigation system allows the images on the display 36 to be registered and accurately display the real time location of the various instruments, such as the instrument 52, and other appropriate items, such as the pointer 66. In addition, the pointer 66 may be used to register the patient space to the pre-acquired images or the atlas or 3D models. In addition, the dynamic reference frame 54 may be used to ensure that any planned or unplanned movement of the patient or the transmitter coil array 46 is determined and used to correct the image on the display 36.


With additional reference to FIG. 3, the dynamic reference frame 54 can be affixed to any appropriate portion of the patient 14, and can be used to register the patient to the image data, as discussed above. For example, when a spinal procedure is being performed, the dynamic reference frame 54 can be interconnected with a portion of a spine 15 of the patient. The spine 15 can include various vertebral bodies 15a and portions of the vertebral bodies. In addition, or alternatively, the dynamic reference frame 54 can be affixed to any appropriate portion of the patient 14. The dynamic reference frame 54 can be interconnected with a portion of a pelvis 17 of the patient 14. The dynamic reference frame 54 can be interconnected with the pelvis 17 in any appropriate manner, such as those discussed herein according to various embodiments.


Affixing the dynamic reference frame 54 to the pelvis can be appropriate if the procedure being performed is performed in a portion of the anatomy that is held substantially still or stable relative to the pelvis 17. For example, various portions of the lumbar spine 15 are held substantially constant relative to the pelvis 17. In other words, if the pelvis 17 moves a selected amount, the selected lumbar vertebrae 15a are held at a substantially constant distance relative to the pelvis 17. Therefore, it will be understood that the dynamic reference frame 54 can be interconnected with any selected portion of the anatomy.


To obtain a maximum reference, the dynamic reference frame 54 can be fixed in each of at least six degrees of freedom. Thus, the dynamic reference frame 54 can be fixed relative to axial motion X, translational motion Y, rotational motion Z, yaw, pitch, and roll relative to the portion of the patient 14 to which it is attached. Any appropriate coordinate system can be used to describe the various degrees of freedom. Fixing the dynamic reference frame relative to the patient 14 in this manner can assist in maintaining maximum accuracy of the navigation system 10.
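A pose fixed in all six degrees of freedom can be summarized, for illustration, as three translations plus three rotations assembled into a single rigid transform. The sketch below is a minimal example under an assumed Z-Y-X (yaw-pitch-roll) angle convention; the function name is hypothetical:

```python
import numpy as np

def pose_to_transform(x, y, z, roll, pitch, yaw):
    """Build a 4x4 rigid transform from a 6-DOF pose (translations in mm,
    angles in radians), using the common Z-Y-X (yaw-pitch-roll) convention."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = [x, y, z]
    return T

# A pose with zero rotation reduces to a pure translation.
T = pose_to_transform(1.0, 2.0, 3.0, 0.0, 0.0, 0.0)
# T[:3, 3] is [1, 2, 3]; T[:3, :3] is the identity.
```

Constraining all six of these parameters is what prevents the tracked frame from drifting relative to the anatomy.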


With additional reference to FIG. 3, a dynamic reference frame fixation device 80 according to various embodiments is illustrated. The dynamic reference frame fixation device 80 generally includes a body 82 that extends between a first or proximal end 84 and a distal or second end 86. Extending from the distal end 86 is a bone engaging or fixation section 88. The bone engaging section 88 can be provided to engage a selected portion of the bone or another portion of the anatomy, such as the pelvis 17. It will also be understood that the bone engaging portion 88 can be provided to engage a portion of the anatomy other than bone.


The bone engaging section 88 can be formed in any appropriate manner, but can include at least a first arm or portion 90 and a second arm or portion 92, and may also include a third arm portion 94 and a fourth arm portion 96. Generally, at least two of the arm portions 90 and 92 can be provided, though any appropriate number, such as the third and fourth arm portions 94, 96, or more, can also be provided. The arm portions 90-96, which engage the bone, such as the pelvis 17, can resist rotation of the dynamic reference frame fixation member 80.


Further, a distal end 98 of the bone engaging section 88 can be formed in any appropriate manner. Generally, the distal end 98 can be sharpened, such that the dynamic reference frame fixation member 80 can be driven into the selected bone portion, such as with a hammer. The bone engaging section can be sharpened in any appropriate manner so that a generally straight axial motion can drive the dynamic reference frame fixation member 80 into the bone. In other words, the distal end 98 can allow the dynamic reference frame fixation member 80 to be driven into the bone with a hammer or similar device, such that a rotation of the dynamic reference frame fixation member 80 is not required.


The body 82 can include any appropriate dimension, which may be similar to that of the bone engaging section 88. For example, the bone engaging section 88 can include a large or largest dimension A that defines the width between the ends of the first arm 90 and the third arm 94. If only two arms, such as the arms 90, 92, are provided at substantially right angles to one another, the largest dimension may be smaller than the dimension A. Regardless, the dimension A may be about 1.5 mm to about 10 mm.


The dimension A can also be the largest dimension in the body 82. This can allow the dynamic reference frame fixation member 80 to be passed through a small incision or puncture wound in a soft tissue of the patient 14. This can allow the dynamic reference frame fixation member 80 to be implanted or positioned substantially percutaneously or through a very small incision. Also, as discussed herein, the dynamic reference frame fixation member 80 can be positioned in the anatomy through a puncture wound formed with a dilator and cannula.


Further, near the proximal end 84 of the body 82, a dynamic reference frame holding portion 100 can be provided. The dynamic reference frame holding portion 100 can include a bore or opening 102 that can selectively engage the tracking sensor 54a. Further, the dynamic reference frame holding section 100 can include a second bore 104 to further fix the tracking sensor 54a. Further, or in addition to the tracking sensor 54a, the integral or included tracking sensor that can act as a tracking sensor 54aa can be included in the body 82. The included tracking sensor 54aa can be an electromagnetic tracking sensor. The included or one piece tracking sensor 54aa can act as the tracking sensor for the dynamic reference frame so that an additional one need not be interconnected with the body 82. Nevertheless, it will be understood that any appropriate tracking sensor can be used as the dynamic reference frame, such as an electromagnetic tracking sensor, an acoustic tracking sensor, a nuclear tracking sensor, an optical or IR tracking sensor, or combinations thereof. The dynamic reference frame positioning member 80 can be provided to interconnect the dynamic reference frame with the bony portion, regardless of whether the dynamic reference frame is selectively interconnected with the body 82 or formed with or in the body 82.


If the tracking sensor 54a is provided, it can be interconnected with the dynamic reference frame holding portion 100. For example, the tracking sensor 54a can be formed in a shape that complements the dynamic reference frame holding portion 100, such that positioning the tracking sensor 54a into the dynamic reference frame holding portion 100 fixes it relative to the body 82. Further, screws or pins can be provided to further interconnect the tracking sensor 54a with the dynamic reference frame holding portion 100. Alternatively, the locking screw 170 (FIG. 6) can engage any appropriate one of the bores 102, 104 to fix the tracking sensor 54a relative to the body 82.


With reference to FIG. 4, a dynamic reference frame positioning member 110 according to various embodiments is illustrated. The dynamic reference frame positioning member 110 includes portions that are similar to those illustrated in the dynamic reference frame positioning member 80 illustrated in FIG. 3 and like reference numerals are used to reference like portions.


Extending from the distal end 86 of the dynamic reference frame positioning member 110 is a bone engaging section or portion 112. The bone engaging section 112 can include an extending member 114 that extends from the distal end 86. The extending member 114 can include a substantially smooth portion 116 and a second portion 118 from which bone engaging fins 120, 122, 124, and 126 extend. It will be understood, similar to the bone engaging portion 88, that any appropriate number of fins may be provided and four is merely provided as an example. The smooth portion 116 can terminate in a blunted or sharpened end.


The bone engaging portion 112 can be driven into the bone similar to the bone engaging portion 88. Therefore, the smooth portion 116 may include a sharpened or bone driving portion so that the dynamic reference frame positioning member 110 can be driven into a selected portion of the anatomy, such as the pelvis 17. Similar to the bone engaging portion 88, the bone engaging portion 112 can allow the dynamic reference frame positioning member 110 to be hammered or impacted to be driven axially into the bone. Therefore, the dynamic reference frame positioning member 110 need not be screwed or rotated to drive the dynamic reference frame positioning member 110 into the bone. The various fins 120, 122, 124, 126 can be sharpened on the distal portion thereof to assist in driving the dynamic reference frame positioning member 110 into the selected portion of the anatomy.


Further, the various fins 120, 122, 124, 126 can engage the bone to substantially resist rotation of the dynamic reference frame positioning member 110 after insertion thereof. Therefore, the tracking sensor 54a, 54aa interconnected with the dynamic reference frame positioning member 110 can be held relative to the bone in a selected manner, such as to resist rotation, translation, axial movement, and movement in pitch, yaw, and roll. Also the dynamic reference frame positioning member 110 can have an included or one piece tracking sensor 54aa, similar to that of the dynamic reference frame positioning member 80. Thus, the separate or modular tracking sensor 54a may not be used. Further, either or both of the dynamic reference frames can be any appropriate tracking sensor, such as those discussed above and herein.


With reference to FIG. 5, a dynamic reference frame positioning member 130 is illustrated. The dynamic reference frame positioning member 130 includes a body 132 that extends between a first or distal end 134 and a second or proximal end 136. Near the second end 136 is a dynamic reference frame positioning portion 135. The dynamic reference frame positioning portion 135 can be formed in any appropriate manner, such as the dynamic reference frame positioning portion 100 illustrated in the dynamic reference frame positioning members 80, 110. Nevertheless, the dynamic reference frame positioning portion 135 can include a selected geometry, such as a hexagon, square, cylinder or the like, that can be interconnected with the tracking sensor 54a, such as that discussed herein. For example, the locking screw 170 (FIG. 6) can contact one of the flats of the dynamic reference frame positioning portion 135 to hold the tracking sensor 54a relative thereto. It will be understood, however, that an included or one-piece tracking sensor 54aa may be provided in the dynamic reference frame positioning member 130.


The dynamic reference frame positioning member 130 also includes a bone engaging portion 138 that extends from the first end 134. Similar to the bone engaging portion 88, the bone engaging portion 138 can include any appropriate number of fins 138, such as a first fin 138a, a second fin 138b, a third fin 138c and a fourth fin 138d. It will be understood that any appropriate number of the fins 138 can be provided. The fins 138 can include sharpened edges and sharpened ends to assist in their movement into a selected bone portion. As discussed herein, the dynamic reference frame positioning member 130 may be driven substantially axially, such as with an impacting motion, into the selected bone portion. Further, as discussed herein, the selected number of fins or geometry of the fins 138 can provide for a reduction or elimination of rotation of the dynamic reference frame positioning member 130.


It will be understood that the dynamic reference frame positioning members 80, 110, 130 can be used according to any appropriate embodiments and any selected procedure. Further, the selection of the dynamic reference frame positioning member 80, 110, 130 can be based on the procedure, a selection by a user, inclusion in a selected kit, or the like. Therefore, it will be understood that the dynamic reference frame positioning member 80, 110, 130 can be selected for any appropriate reason.


The dynamic reference frame positioning members 80, 110, 130 can interconnect a selected tracking sensor, such as an optical reflective tracking sensor 54a or an electromagnetic tracking sensor 54aa that can be interconnected or formed in the body 82, relative to the anatomy. Therefore, the dynamic reference frame positioning members 80, 110, 130 can be driven through substantially small or puncture wounds of the soft tissue to engage a selected portion of the anatomy, especially bony portions therein. This can allow the tracking sensor 54a, 54aa to be held relative to a selected portion of the anatomy by providing the dynamic reference frame positioning member 80, 110, 130 through a small incision with a hammer force or other similar force producing device.


It can also be understood, according to various embodiments, that registration techniques can be used to determine the position of the dynamic reference frame 54 relative to any portion of the selected dynamic reference frame positioning member 80, 110, 130. For example, the probe 66 can be tracked and touched to the first end of the respective dynamic reference frame positioning member 80, 110, 130 so that the navigation system 10 can determine the position of the anatomy. Alternatively, or in addition thereto, such information can be preprogrammed or stored in the navigation system 10.


Also, the dynamic reference frame 54 can include a fiducial portion. The fiducial portion can be formed into the tracking sensor 54a, the dynamic reference frame positioning member 80, 110, 130, or any appropriate portion. For example, a dimple can be formed in the dynamic reference frame that the probe 66 can touch. This can allow for registration of the patient space to the image space. Further, it will be understood that the fiducial portion of the dynamic reference frame 54 can be formed with or separate from any other portion of the dynamic reference frame.


With reference to FIG. 6, the dynamic reference frame positioning member 80 can be provided in a kit 150 that can include a plurality of instruments or portions. It will be understood that the kit 150 can be understood as a system for positioning the dynamic reference frame 54 and/or a part of the navigation system 10. Further, the kit 150 can include all, part, or more parts than those illustrated and discussed. It will be understood that the dynamic reference frame positioning member 110 and/or 130 can also be provided in addition to or in the alternative of the dynamic reference frame positioning member 80, and only one is shown for clarity of the following discussion. The kit 150 can include any appropriate portions; those described herein are merely exemplary and are not intended to limit the scope of the present teachings.


Regardless, the kit 150 can include the dynamic reference frame positioning member 80 (which can also be referenced as a percutaneous reference pin). Further, the kit 150 can include the tracking sensor 54a, which can be interconnected with the dynamic reference frame positioning member 80. It will be understood, however, that the dynamic reference frame positioning member 80 may have included therein the tracking sensor 54aa.


The kit 150 may also include a tap cap 152, a cannula 154, a dilator 156, an impactor 157, and a slap hammer 158. The various portions of the kit 150 can be used according to any appropriate embodiment. Further, the portions of the kit 150 can be selected to include selected features. For example, the cannula 154 can be flexible, rigid, or a combination thereof. Also, the kit 150 may be used according to an exemplary method described herein. Therefore, it will be understood that the portions of the kit 150 may be used with any appropriate system or method, and the method described herein is merely exemplary.


It will be understood that each of the portions of the kit 150 may be substantially single use and can be disposed of after a selected procedure. Nevertheless, it will be understood that the various portions of the kit 150 may also be multi-use and used for a plurality of procedures. Regardless, various portions of the kit 150, such as the dynamic reference frame positioning member 80, can be formed of any appropriate materials, such as various metals or metal alloys, polymers, or any other appropriate materials. The various portions of the kit 150, such as the dynamic reference frame positioning member 80, can be sterilized according to various procedures to reduce or eliminate the possibility of contamination or infection during use. Further, the kit 150 can be provided in a container 159 that can be sterilized with each of the portions included therewith. Also, the kit 150 can be provided in a sterile manner such that no additional procedures need to occur to provide a sterile kit.


According to a selected procedure, as illustrated in FIGS. 7-9, the dynamic reference frame positioning member 80 can be inserted into a selected portion of the anatomy. A small or stab incision can be formed in an appropriate portion of the anatomy, such as over the posterior superior iliac spine (PSIS) or crest 17a′ or in any area relative to the pelvis 17 or the iliac crest 17a. The incision can be formed in any appropriate manner, such as with a scalpel or other appropriate instrument. The incision can also be formed with the dilator 156 and/or cannula 154 being pushed or moved through a skin and/or soft tissue layer 166.


With reference to FIG. 7, the cannula 154 can be passed percutaneously through a layer of soft tissue, including skin 166, into the iliac crest 17a. As discussed above, the iliac crest 17a generally includes the PSIS 17a′. The PSIS 17a′ can be accessed through a posterior portion of the patient 14 through the skin 166. The cannula 154 can be positioned percutaneously by positioning the dilator 156 through the cannula 154 and simultaneously inserting both members through the soft tissue, including the skin 166.


Both the dilator 156 and the cannula 154 can include cutting or puncturing edges, which allow them to be passed through the soft tissue, including the skin 166. Positioning the dilator 156 through the cannula 154 can assist in assuring that no soft tissue or other material passes into the cannula 154 prior to a selected procedure. Further, the use of the dilator 156 with the cannula 154 can substantially eliminate the necessity of forming any other incisions through the soft tissue, including the skin 166, to position the cannula 154 relative to the PSIS 17a′. The use of the cannula 154 and the dilator 156 allows for ease of the operation further discussed herein.


Once the cannula 154 is positioned next to or relative to the PSIS 17a′, the dilator 156 can be removed from the cannula 154. Once the dilator is removed from the cannula 154, the bore defined by the cannula 154 can be used to position a selected member relative to the PSIS 17a′. The portion to be positioned can include the dynamic reference frame positioning member 80. It will be understood that the present exemplary method discusses specifically the dynamic reference frame positioning member 80, but any appropriate member may be used such as the dynamic reference frame positioning members 110, 130.


The dynamic reference frame positioning member 80 can be passed through the cannula 154 until it engages or touches the PSIS 17a′. The dynamic reference frame positioning member 80 can then be driven into the PSIS 17a′ in any appropriate manner. It will be understood that any other appropriate preparatory steps may also occur. For example, a pilot or preformed hole may be made in the pelvis 17 prior to positioning the dynamic reference frame positioning member 80. This can allow the dynamic reference frame positioning member 80 to be driven through the pilot hole formed in the pelvis 17. It will be understood, however, that a pilot hole or other preformed opening is not necessary and is described merely as an example.


The tap cap 152 can be selectively interconnected with the dynamic reference frame positioning member 80 and be used, with the impactor 157, to drive or impact the dynamic reference frame positioning member 80 into the PSIS 17a′. Any appropriate instrument can be used to assist in this procedure, such as the impactor 157, which may be a hammer or mallet. The impactor 157 can be used to impact the proximal or exposed end of the tap cap 152 to drive the dynamic reference frame positioning member 80 into the bone.


The dynamic reference frame positioning member 80 can be driven in any appropriate distance, such as until the tap cap 152 engages a portion of the cannula 154. Also, the body 82 of the dynamic reference frame positioning member 80 can include selected indicia or markings to assist in determining an amount of movement of the dynamic reference frame positioning member 80 relative to the cannula or the patient 14. Therefore, it will be understood that the bone engaging section 88 of the dynamic reference frame positioning member 80 can be determined to be positioned if the body 82 of the dynamic reference frame positioning member 80 is substantially similar in length to the cannula 154. It will be understood, however, that any appropriate system may be used to determine appropriate positioning of the dynamic reference frame positioning member 80.


Once the dynamic reference frame positioning member 80 has been driven into the PSIS 17a′, the tracking sensor 54a can be interconnected with the dynamic reference frame positioning member 80. It will be understood, however, that the cannula 154 and/or the tap cap 152 can also be removed before interconnecting the tracking sensor 54a with the dynamic reference frame positioning member 80. Therefore, for a majority of the procedure, only the dynamic reference frame positioning member 80 is provided percutaneously to engage the PSIS 17a′.


Also, prior to impacting the dynamic reference frame positioning member 80, a fiducial may be used to determine an appropriate location. Further, the dynamic reference frame positioning member 80 may act as a fiducial that is positioned when image data is collected regarding the patient. Thus, the tracking sensor 54a need only be connected to the dynamic reference frame positioning member 80 during an operative procedure. Thus, the dynamic reference frame positioning member 80 can be a fiducial for use in registering the image data or image space to patient space.


Further, if the integral tracking sensor 54aa is provided, driving the dynamic reference frame positioning member 80 into the PSIS 17a′ may substantially complete positioning the dynamic reference frame. Nevertheless, if the tracking sensor 54a is provided, it can be selectively interconnected with the dynamic reference frame positioning member 80. When the tracking sensor 54a is used it can include a locking screw 170 that can engage the dynamic reference frame positioning portion 100. As discussed above, the dynamic reference frame positioning portion 100 can include a bore 102, which the locking screw 170 may engage. The tracking sensor 54a can include other positioning portions, such as an angle screw, a translation screw, or the like, which can allow for adjustment or positioning the tracking sensor 54a in any of the 6 degrees of freedom or any selected number thereof. Nevertheless, the tracking sensor 54a can be interconnected with the dynamic reference frame positioning member 80 in any appropriate manner.


Once the tracking sensor 54a is interconnected with the dynamic reference frame positioning member 80, (as illustrated in FIG. 1) if necessary, the tracking sensor 54a can be localized or registered with the navigation system 10. It will be understood that the tracking sensor 54a can be any appropriate dynamic reference frame, such as an optical dynamic reference frame, an electromagnetic dynamic reference frame, or any appropriate dynamic reference frame. Regardless, the dynamic reference frame positioning member 80 allows the tracking sensor 54a, 54aa to be held at a selected location relative to a portion of the anatomy for a period during the procedure. The bone engaging portion 88, according to various embodiments, can substantially reduce or eliminate rotation of the dynamic reference frame positioning member 80, and therefore, the tracking sensor 54a. Further, the bone engaging portion 88 can also substantially reduce or eliminate translation or axial movement of the dynamic reference frame positioning member 80 and, consequently, motion of the tracking sensor 54a. Therefore, the dynamic reference frame positioning member 80 can allow for percutaneous holding of the tracking sensor 54a relative to the patient 14 for a selected procedure. The dynamic reference frame positioning member 80 can hold the tracking sensor 54a, 54aa in any selected amount, such as in six degrees of freedom including rotation, translation, axial motion, roll, pitch, and yaw.


After a selected procedure is performed, such as a disc replacement, a nucleus replacement, a vertebral implant placement, or other appropriate procedures, the dynamic reference frame positioning member 80 and the tracking sensor 54a, 54aa can be removed.


Although the tracking sensor 54a and the dynamic reference frame positioning member 80 can be removed in any appropriate manner, the following is an exemplary method. First, if provided, the tracking sensor 54a can be disconnected from the dynamic reference frame positioning member 80. The locking screw 170 can be loosened or disconnected to allow for removal of the tracking sensor 54a. If the integral tracking sensor 54aa is provided, the tracking sensor 54a need not be present and may not need to be removed.


After the tracking sensor 54a, if provided, is removed, the slap hammer 158 can engage a portion of the dynamic reference frame positioning member 80, as illustrated in FIG. 9, such as the dynamic reference frame positioning portion 100 of the dynamic reference frame positioning member 80. Once the slap hammer 158 has appropriately engaged the dynamic reference frame positioning member 80, the slap hammer 158 can be operated in an appropriate manner to remove the dynamic reference frame positioning member 80.


The slap hammer 158 can include a handle 172 that can be operated by a user, such as a physician. An engaging end 174 is provided to engage the dynamic reference frame positioning member 80 in a selected manner. The handle 172 can be moved in the direction of arrow B to provide axial movement of the slap hammer 158 to withdraw the dynamic reference frame positioning member 80 from the PSIS 17a′. Once the dynamic reference frame positioning member 80 has been removed from the PSIS 17a′, it can be disposed of in an appropriate manner or cleaned and sterilized for further procedures.


Therefore, as discussed above, the dynamic reference frame 54 can be positioned relative to a selected portion of the patient 14 substantially percutaneously, such as through a puncture or through a small incision. The small incision can be closed in any appropriate manner, with or without sutures.


Regardless, the disruption of natural tissue with the use of the dynamic reference frame positioning member 80, 110, 130 according to various embodiments is substantially minimal. Therefore, recovery time due to the positioning of the dynamic reference frame 54 can be substantially reduced or eliminated. Also, the ability to drive the dynamic reference frame positioning member 80 substantially axially into the bone, such as the ilium 17, can provide for ease of use by a user, such as a physician, and can further reduce trauma to the soft tissue surrounding the area of positioning of the dynamic reference frame positioning member 80. This can further assist in reducing trauma to the patient 14 and in speeding recovery.


It will be understood that the dynamic reference frame positioning member 80 can be used to position any appropriate modular tracking sensor 54a or can include the integral tracking sensor 54aa. As discussed above, the modular tracking sensor 54a can be an optical, electromagnetic, acoustic, or any other appropriate dynamic reference frame. Further, the modular tracking sensor 54a can be formed in any appropriate geometry for selected instrumentation. The modular tracking sensor 54a, using the dynamic reference frame positioning member 80, can be used to perform any appropriate procedure and can be used to track any appropriate portion of the anatomy.
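The role of the bone-fixed dynamic reference frame during tracking can be illustrated computationally: expressing a tracked instrument's position relative to the reference frame, rather than relative to the tracker itself, cancels any patient motion observed by the tracker. The following is an illustrative sketch only (function names and poses are hypothetical, not part of the disclosure):

```python
import numpy as np

def pose_matrix(R, t):
    """Pack a tracked six-degree-of-freedom pose (3x3 rotation, translation)
    into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def to_reference_frame(T_tracker_ref, T_tracker_tool, p_tool):
    """Express a tool-space point in the dynamic reference frame's
    coordinates. Because both poses are measured by the same tracker,
    any rigid motion of the patient (and attached reference frame)
    cancels out of the composed transform."""
    T_ref_tool = np.linalg.inv(T_tracker_ref) @ T_tracker_tool
    p = T_ref_tool @ np.append(np.asarray(p_tool, dtype=float), 1.0)
    return p[:3]
```

If the patient moves, both T_tracker_ref and T_tracker_tool are premultiplied by the same rigid motion, which cancels in the composition; this is the computational reason a bone-fixed reference frame maintains registration despite patient movement.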


For example, the dynamic reference frame positioning member 80 can be driven into the iliac crest, such as described above, into a portion of the leg, such as the femur, or into a portion of the arm, such as the humerus. The dynamic reference frame positioning member 80, 110 can be sized to allow it to be interconnected with any appropriate portion of the anatomy; driving it into the iliac crest is merely exemplary. Regardless, the dynamic reference frame 54 can be positioned relative to a selected portion of the anatomy to allow for referencing or dynamically tracking a portion of the anatomy during a procedure.


Turning to FIGS. 10 and 11, a dynamic reference frame holding member 200 according to various embodiments is illustrated. The dynamic reference frame holding member 200 can include a plurality of portions that are similar to the previously disclosed dynamic reference frame positioning members 80, 110, 130. The similar portions will not be described in detail here, as they will be understood by one skilled in the art. Briefly, however, the dynamic reference frame holding member 200 can include a shaft 202 extending between two ends. Near a first end, a bone engaging portion 204 can be formed. The bone engaging portion 204 can include any appropriate engaging portion, such as a plurality of fins, points, and the like. Nevertheless, the bone engaging portion 204 can hold the dynamic reference frame holding member 200 relatively fixed to the anatomy of the patient 14 in translation, axial movement, rotation, yaw, pitch, and roll.


Near the second end of the shaft 202 is a resiliently deformable member 208, such as a spring, a rubber component, or another similar resilient member. Further, an engaging pin 206 is formed to extend from the shaft 202. It will be understood that the engaging pin 206 can extend from a plurality of positions or include a plurality of extending portions. Nevertheless, a single extending portion is illustrated for clarity of the current discussion.


The dynamic reference frame holding member 200 can be inserted in a manner substantially similar to that discussed above. Nevertheless, a tap cap 152′ can include a slot or passage 210 that is able to extend over a proximal or second end of the dynamic reference frame holding member 200 so as not to engage the engaging pin 206 in a substantial manner. Therefore, the tap cap 152′ can engage mostly the resilient member 208 rather than directing forces onto the engaging pin 206. The dynamic reference frame holding member 200 can thus be driven into a selected portion of the anatomy, such as the PSIS 17a′, as discussed above. The dynamic reference frame holding member 200 can include an integral or single-piece tracking sensor 54aa. Alternatively, the engaging pin 206 can be used to engage a modular tracking sensor 54a′, illustrated in FIG. 11. The modular tracking sensor 54a′ can include an engaging shaft 212 that can include a portion operable to move or pass over the second end of the dynamic reference frame holding member 200. An opening 214 can be defined in the engaging shaft 212 of the modular tracking sensor 54a′ to allow the engaging pin 206 to move into a selected portion of the opening 214.


When positioning the tracking sensor 54a′ relative to the dynamic reference frame holding member 200, a force can be applied to the tracking sensor 54a′ to deform the resilient member 208. The modular tracking sensor 54a′ can then be rotated to move the engaging pin 206 to a selected portion of the opening 214. Once the engaging pin 206 is positioned in a selected area of the opening 214, the force applied to the modular tracking sensor 54a′ can be removed. The resilient member 208 can then push against the modular tracking sensor 54a′ to move it in a manner that allows the engaging pin 206 to engage an engaging portion 216 of the opening 214.


Therefore, the modular tracking sensor 54a′ can be easily and quickly interconnected with the dynamic reference frame holding member 200, and can be easily and repeatedly interconnected with it before, during, or after a selected procedure. The engaging pin 206, in cooperation with the resilient member 208 and the engaging portion 216, can allow for quick attachment and substantial repeatability of the engagement. Nevertheless, it will be understood that a modular tracking sensor can be interconnected with any appropriate dynamic reference frame positioning member 80, 110, 130 for various purposes.


Further areas of applicability of the present teachings will become apparent from the detailed description provided above. It should be understood that the detailed description and specific examples, while indicating various embodiments, are intended for purposes of illustration only and are not intended to limit the scope of the teachings.

Claims
  • 1. A surgical navigation system for use in a surgical procedure on an anatomy including a bone, comprising: a tracking system; a dynamic reference frame tracking sensor operable to be tracked by the tracking system as a dynamic reference frame; a dynamic reference frame positioning member having a substantially non-cylindrical bone engaging portion including at least one arm extending from a central region that is configured to be axially driven into the bone to at least resist rotation of the dynamic reference frame positioning member about a long axis of the dynamic reference frame positioning member; wherein the dynamic reference frame tracking sensor is attached to the dynamic reference frame positioning member; wherein the dynamic reference frame positioning member is operable to engage a bone in the anatomy in a selectively fixed manner at least by being impacted into the bone in the axial direction to hold the dynamic reference frame tracking sensor in a selected position relative to the bone.
  • 2. The system of claim 1, wherein the at least one arm includes at least four arms extending radially from the central region.
  • 3. The system of claim 2, wherein the substantially non-cylindrical bone engaging portion has a cruciform cross-section when taken substantially perpendicular to the long axis of the dynamic reference frame positioning member defined by the at least four arms extending radially from the central region.
  • 4. The system of claim 1, wherein the dynamic reference frame positioning member further comprises: a sensor holding portion that is at a fixed position relative to the substantially non-cylindrical bone engaging portion.
  • 5. The system of claim 1, wherein the dynamic reference frame positioning member further comprises: a pin extending transverse to the long axis of the dynamic reference frame positioning member; and a resilient member to bias the dynamic reference frame tracking sensor to engage the pin.
  • 6. The system of claim 1, wherein the dynamic reference frame positioning member includes incorporated therein the dynamic reference frame tracking sensor such that the dynamic reference frame tracking sensor is at a fixed position relative to the substantially non-cylindrical bone engaging portion.
  • 7. The system of claim 1, further comprising: a cannula adapted to pass through soft tissue relative to the bone and adapted to assist in passage of the dynamic reference frame positioning member towards and into the bone; a dilator configured to initially pass through the cannula prior to the dynamic reference frame positioning member passing towards and into the bone through the cannula; and an impaction instrument adapted to impact the dynamic reference frame positioning member into the bone through the cannula; wherein the cannula and the dilator are adapted to prepare the soft tissue for passage of the dynamic reference frame positioning member towards and into the bone; wherein the dynamic reference frame positioning member includes indicia to assist in determining an amount of movement of the dynamic reference frame positioning member relative to at least one of the cannula or the bone.
  • 8. A surgical navigation system for use in a surgical procedure on an anatomy including a bone, comprising: a tracking system; a dynamic reference frame tracking sensor operable to be tracked by the tracking system as a dynamic reference frame; and a dynamic reference frame positioning member having: a bone engaging portion, wherein the bone engaging portion has at least a first section that has at least a first arm and a second arm that extend transverse to a long axis of the dynamic reference frame positioning member and that is configured to be axially driven into the bone to at least resist rotation of the dynamic reference frame positioning member about the long axis, and a sensor holding section, wherein the dynamic reference frame tracking sensor is configured to removably engage the sensor holding section in a fixed position relative to the bone engaging portion; wherein the dynamic reference frame positioning member is operable to engage a bone in the anatomy in a selectively fixed manner at least by being impacted into the bone in the axial direction to hold said dynamic reference frame tracking sensor in a selected position relative to the bone.
  • 9. The system of claim 8, wherein the dynamic reference frame tracking sensor includes a locking screw to engage the sensor holding section.
  • 10. The system of claim 8, wherein the sensor holding section includes a pin extending transverse to the long axis of the dynamic reference frame positioning member and a resilient member to bias the dynamic reference frame tracking sensor to engage the pin; wherein the dynamic reference frame tracking sensor includes an opening and an engaging portion; wherein the pin is operable to move through the opening and engage the engaging portion of the dynamic reference frame tracking sensor when the dynamic reference frame tracking sensor is biased by the resilient member.
  • 11. The system of claim 8, wherein the first arm and the second arm are substantially thin relative to the remaining portion of the dynamic reference frame positioning member.
  • 12. The system of claim 8, wherein the bone engaging portion further includes a third arm and a fourth arm; wherein all of the first arm, the second arm, the third arm, and the fourth arm together form a cruciform configuration extending from an end region of the dynamic reference frame positioning member and forming a terminal end of the dynamic reference frame positioning member.
  • 13. The system of claim 8, wherein the bone engaging portion further includes a third arm and a fourth arm, and an extending portion; wherein all of the first arm, the second arm, the third arm, and the fourth arm extend radially from the extending portion.
  • 14. The system of claim 13, wherein the extending portion includes a smooth portion that extends beyond all of the first arm, the second arm, the third arm, and the fourth arm along the long axis and forms a terminal end of the dynamic reference frame positioning member.
  • 15. The system of claim 8, further comprising: a removal tool to engage the dynamic reference frame positioning member to axially remove the bone engaging portion from the bone after it is impacted into the bone.
  • 16. A surgical navigation system for use in a surgical procedure on an anatomy including a bone, comprising: a tracking system; a dynamic reference frame tracking sensor operable to be tracked by the tracking system as a dynamic reference frame; and a dynamic reference frame positioning member having: a bone engaging portion, wherein the bone engaging portion has at least a first section that has at least a first arm and a second arm that extend transverse to a long axis of the dynamic reference frame positioning member and that is configured to be axially driven into the bone to at least resist rotation of the dynamic reference frame positioning member about the long axis, and a sensor holding section having a pin extending transverse to the long axis of the dynamic reference frame positioning member, wherein the dynamic reference frame tracking sensor is configured to removably engage the sensor holding section in a fixed position relative to the bone engaging portion; and a cap having a pin passage to allow the pin to pass relative to the cap during an impaction of the dynamic reference frame positioning member; wherein the cap is configured to be directly impacted and transfer an impaction force to the dynamic reference frame positioning member; wherein the pin passage limits contact and an impaction force applied to the pin during impaction of the dynamic reference frame positioning member; wherein the dynamic reference frame positioning member is operable to engage a bone in the anatomy in a selectively fixed manner at least by being impacted into the bone in the axial direction to hold said dynamic reference frame tracking sensor in a selected position relative to the bone.
  • 17. A method for using a surgical navigation system during a surgical procedure on an anatomy including a bone, comprising: tracking a dynamic reference frame tracking sensor with a tracking system as a dynamic reference frame; and providing a dynamic reference frame positioning member extending along a long axis and formed with: a bone engaging portion, wherein the bone engaging portion has at least a first section that has at least a first arm and a second arm that extend transverse to the long axis and is configured to be axially driven into the bone to at least resist rotation of the dynamic reference frame positioning member about the long axis, and a sensor holding section configured to engage the dynamic reference frame tracking sensor in a fixed position relative to the bone engaging portion; wherein the dynamic reference frame positioning member is operable to engage a bone in the anatomy in a selectively fixed manner at least by being impacted into the bone axially in a direction along the long axis to hold said dynamic reference frame tracking sensor in a selected position relative to the bone.
  • 18. The method of claim 17, further comprising: axially impacting, along the long axis, at least the bone engaging portion of the provided dynamic reference frame positioning member into the bone along the long axis to penetrate the bone with at least the bone engaging portion including the first arm and the second arm to at least rotationally fix the provided dynamic reference frame positioning member relative to the bone.
  • 19. The method of claim 18, further comprising: connecting the dynamic reference frame tracking sensor to the driven dynamic reference frame positioning member; performing a procedure relative to the driven dynamic reference frame positioning member; and after performing the procedure, engaging the driven dynamic reference frame positioning member to remove at least the bone engaging portion from the bone.
  • 20. The method of claim 17, wherein providing the dynamic reference frame positioning member with the bone engaging portion further includes at least a third arm and a fourth arm such that the first arm, the second arm, the third arm, and the fourth arm form a cruciform configuration in cross-section substantially transverse to the long axis.
  • 21. The method of claim 17, further comprising: forming the provided dynamic reference frame positioning member to incorporate the dynamic reference frame tracking sensor as a single member; wherein the dynamic reference frame tracking sensor is fixed relative to the bone engaging section.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 11/232,445 filed on Sep. 21, 2005, now U.S. Pat. No. 7,835,784. The entire disclosure of the above application is incorporated herein by reference.

5987349 Schulz Nov 1999 A
5987960 Messner et al. Nov 1999 A
5999837 Messner et al. Dec 1999 A
5999840 Grimson et al. Dec 1999 A
6001130 Bryan et al. Dec 1999 A
6006126 Cosman Dec 1999 A
6006127 Van Der Brug et al. Dec 1999 A
6013087 Adams et al. Jan 2000 A
6014580 Blume et al. Jan 2000 A
6016439 Acker Jan 2000 A
6019725 Vesely et al. Feb 2000 A
6024695 Taylor et al. Feb 2000 A
6033406 Mathews Mar 2000 A
6050724 Schmitz et al. Apr 2000 A
6052477 Wang et al. Apr 2000 A
6059718 Taniguchi et al. May 2000 A
6063022 Ben-Haim May 2000 A
6071288 Carol et al. Jun 2000 A
6073043 Schneider Jun 2000 A
6076008 Bucholz Jun 2000 A
6096050 Audette Aug 2000 A
6104944 Martinelli Aug 2000 A
6118845 Simon et al. Sep 2000 A
6122538 Sliwa, Jr. et al. Sep 2000 A
6122541 Cosman et al. Sep 2000 A
6131396 Duerr et al. Oct 2000 A
6139183 Graumann Oct 2000 A
6147480 Osadchy et al. Nov 2000 A
6149592 Yanof et al. Nov 2000 A
6156067 Bryan et al. Dec 2000 A
6161032 Acker Dec 2000 A
6165181 Heilbrun et al. Dec 2000 A
6167296 Shahidi Dec 2000 A
6172499 Ashe Jan 2001 B1
6175756 Ferre et al. Jan 2001 B1
6178345 Vilsmeier et al. Jan 2001 B1
6194639 Botella et al. Feb 2001 B1
6201387 Govari Mar 2001 B1
6201984 Funda et al. Mar 2001 B1
6203497 Dekel et al. Mar 2001 B1
6211666 Acker Apr 2001 B1
6223067 Vilsmeier et al. Apr 2001 B1
6226548 Foley et al. May 2001 B1
6233476 Strommer et al. May 2001 B1
6246231 Ashe Jun 2001 B1
6259942 Westermann et al. Jul 2001 B1
6273896 Franck et al. Aug 2001 B1
6285902 Kienzle, III et al. Sep 2001 B1
6298262 Franck et al. Oct 2001 B1
6314310 Ben-Haim et al. Nov 2001 B1
6332089 Acker et al. Dec 2001 B1
6341231 Ferre et al. Jan 2002 B1
6348058 Melkent et al. Feb 2002 B1
6351659 Vilsmeier Feb 2002 B1
6381485 Hunter et al. Apr 2002 B1
6401348 Cavanaugh et al. Jun 2002 B1
6424856 Vilsmeier et al. Jul 2002 B1
6427314 Acker Aug 2002 B1
6428547 Vilsmeier et al. Aug 2002 B1
6430434 Mittelstadt Aug 2002 B1
6434415 Foley et al. Aug 2002 B1
6437567 Schenck et al. Aug 2002 B1
6445943 Ferre et al. Sep 2002 B1
6450978 Brosseau et al. Sep 2002 B1
6470207 Simon et al. Oct 2002 B1
6474341 Hunter et al. Nov 2002 B1
6478802 Kienzle, III et al. Nov 2002 B2
6484049 Seeley et al. Nov 2002 B1
6490467 Bucholz et al. Dec 2002 B1
6490475 Seeley et al. Dec 2002 B1
6493573 Martinelli et al. Dec 2002 B1
6498944 Ben-Haim et al. Dec 2002 B1
6499488 Hunter et al. Dec 2002 B1
6516046 Frohlich et al. Feb 2003 B1
6527443 Vilsmeier et al. Mar 2003 B1
6551325 Neubauer et al. Apr 2003 B2
6575899 Foley et al. Jun 2003 B1
6584174 Schubert et al. Jun 2003 B2
6609022 Vilsmeier et al. Aug 2003 B2
6611700 Vilsmeier et al. Aug 2003 B1
6640128 Vilsmeier et al. Oct 2003 B2
6662036 Cosman Dec 2003 B2
6694162 Hartlep Feb 2004 B2
6694168 Traxel et al. Feb 2004 B2
6701179 Martinelli et al. Mar 2004 B1
6725082 Sati et al. Apr 2004 B2
6980849 Sasso Dec 2005 B2
6993374 Sasso Jan 2006 B2
7237556 Smothers et al. Jul 2007 B2
7570791 Frank et al. Aug 2009 B2
7751865 Jascob et al. Jul 2010 B2
7835778 Foley et al. Nov 2010 B2
7835784 Mire et al. Nov 2010 B2
20010007918 Vilsmeier et al. Jul 2001 A1
20020038085 Immerz Mar 2002 A1
20020095081 Vilsmeier et al. Jul 2002 A1
20020198451 Carson Dec 2002 A1
20030066538 Martinelli et al. Apr 2003 A1
20030153829 Sarin et al. Aug 2003 A1
20030196671 Sasso Oct 2003 A1
20030225329 Rossner et al. Dec 2003 A1
20040024309 Ferre et al. Feb 2004 A1
20040030236 Mazzocchi et al. Feb 2004 A1
20040030237 Lee et al. Feb 2004 A1
20040068263 Chouinard et al. Apr 2004 A1
20040097952 Sarin et al. May 2004 A1
20040106926 Leitner et al. Jun 2004 A1
20040167393 Solar et al. Aug 2004 A1
20040171924 Mire et al. Sep 2004 A1
20040230199 Jansen et al. Nov 2004 A1
20040267242 Grimm et al. Dec 2004 A1
20050085720 Jascob et al. Apr 2005 A1
20060253152 Evans et al. Nov 2006 A1
Foreign Referenced Citations (69)
Number Date Country
964149 Mar 1975 CA
3042343 Jun 1982 DE
3508730 Sep 1986 DE
3717871 Dec 1988 DE
3831278 Mar 1989 DE
3838011 Jul 1989 DE
4213426 Oct 1992 DE
4225112 Dec 1993 DE
4233978 Apr 1994 DE
19715202 Oct 1998 DE
19751761 Oct 1998 DE
19832296 Feb 1999 DE
19747427 May 1999 DE
10085137 Nov 2002 DE
0062941 Oct 1982 EP
0119660 Sep 1984 EP
0155857 Sep 1985 EP
0319844 Jun 1989 EP
0326768 Aug 1989 EP
350996 Jan 1990 EP
0419729 Apr 1991 EP
0427358 May 1991 EP
0456103 Nov 1991 EP
0469966 Feb 1992 EP
0581704 Feb 1994 EP
0651968 May 1995 EP
0655138 May 1995 EP
0894473 Feb 1999 EP
0930046 Jul 1999 EP
2417970 Sep 1979 FR
2618211 Jan 1989 FR
2094590 Sep 1982 GB
2164856 Apr 1986 GB
62327 Jan 1983 JP
2765738 Jun 1988 JP
63240851 Oct 1988 JP
3267054 Nov 1991 JP
6194639 Jul 1994 JP
WO-8809151 Dec 1988 WO
WO-8905123 Jun 1989 WO
WO-9005494 May 1990 WO
WO-9103982 Apr 1991 WO
WO-9104711 Apr 1991 WO
WO-9107726 May 1991 WO
WO-9203090 Mar 1992 WO
WO-9206645 Apr 1992 WO
WO-9404938 Mar 1994 WO
WO-9423647 Oct 1994 WO
WO-9424933 Nov 1994 WO
WO-9507055 Mar 1995 WO
WO-9611624 Apr 1996 WO
WO-9632059 Oct 1996 WO
WO-9736192 Oct 1997 WO
WO-9749453 Dec 1997 WO
WO-9808554 Mar 1998 WO
WO-9838908 Sep 1998 WO
WO-9915097 Apr 1999 WO
WO-9921498 May 1999 WO
WO-9923956 May 1999 WO
WO-9926549 Jun 1999 WO
WO-9927839 Jun 1999 WO
WO-9929253 Jun 1999 WO
WO-9933406 Jul 1999 WO
WO-9937208 Jul 1999 WO
WO-9938449 Aug 1999 WO
WO-9952094 Oct 1999 WO
WO-9960939 Dec 1999 WO
WO-0130437 May 2001 WO
WO-2007038135 Apr 2007 WO
Non-Patent Literature Citations (129)
Entry
Adams et al., “Orientation Aid for Head and Neck Surgeons,” Innov. Tech. Biol. Med., vol. 13, No. 4, 1992, pp. 409-424.
Adams et al., Computer-Assisted Surgery, IEEE Computer Graphics & Applications, pp. 43-51, (May 1990).
Barrick et al., “Prophylactic Intramedullary Fixation of the Tibia for Stress Fracture in a Professional Athlete,” Journal of Orthopaedic Trauma, vol. 6, No. 2, pp. 241-244 (1992).
Barrick et al., “Technical Difficulties with the Brooker-Wills Nail in Acute Fractures of the Femur,” Journal of Orthopaedic Trauma, vol. 4, No. 2, pp. 144-150 (1990).
Barrick, “Distal Locking Screw Insertion Using a Cannulated Drill Bit: Technical Note,” Journal of Orthopaedic Trauma, vol. 7, No. 3, 1993, pp. 248-251.
Batnitzky et al., “Three-Dimensional Computer Reconstructions of Brain Lesions from Surface Contours Provided by Computed Tomography: A Prospectus,” Neurosurgery, vol. 11, No. 1, Part 1, 1982, pp. 73-84.
Benzel et al., “Magnetic Source Imaging: a Review of the Magnes System of Biomagnetic Technologies Incorporated,” Neurosurgery, vol. 33, No. 2 (Aug. 1993), pp. 252-259.
Bergstrom et al. Stereotaxic Computed Tomography, Am. J. Roentgenol, vol. 127 pp. 167-170 (1976).
Bouazza-Marouf et al.; “Robotic-Assisted Internal Fixation of Femoral Fractures”, IMECHE., pp. 51-58 (1995).
Brack et al., “Accurate X-ray Based Navigation in Computer-Assisted Orthopedic Surgery,” CAR '98, pp. 716-722.
Brown, R., M.D., A Stereotactic Head Frame for Use with CT Body Scanners, Investigative Radiology © J.B. Lippincott Company, pp. 300-304 (Jul.-Aug. 1979).
Bryan, “Bryan Cervical Disc System Single Level Surgical Technique”, Spinal Dynamics, 2002, pp. 1-33.
Bucholz et al., “Variables affecting the accuracy of stereotactic localization using computerized tomography,” Journal of Neurosurgery, vol. 79, Nov. 1993, pp. 667-673.
Bucholz, R.D., et al. Image-guided surgical techniques for infections and trauma of the central nervous system, Neurosurg. Clinics of N. A., vol. 7, No. 2, pp. 187-200 (1996).
Bucholz, R.D., et al., A Comparison of Sonic Digitizers Versus Light Emitting Diode-Based Localization, Interactive Image-Guided Neurosurgery, Chapter 16, pp. 179-200 (1993).
Bucholz, R.D., et al., Intraoperative localization using a three dimensional optical digitizer, SPIE—The Intl. Soc. For Opt. Eng., vol. 1894, pp. 312-322 (Jan. 17-19, 1993).
Bucholz, R.D., et al., Intraoperative Ultrasonic Brain Shift Monitor and Analysis, Stealth Station Marketing Brochure (2 pages) (undated).
Bucholz, R.D., et al., The Correction of Stereotactic Inaccuracy Caused by Brain Shift Using an Intraoperative Ultrasound Device, First Joint Conference, Computer Vision, Virtual Reality and Robotics in Medicine and Medical Robotics and Computer-Assisted Surgery, Grenoble, France, pp. 459-466 (Mar. 19-22, 1997).
Champleboux et al., “Accurate Calibration of Cameras and Range Imaging Sensors: the NPBS Method,” IEEE International Conference on Robotics and Automation, Nice, France, May 1992.
Champleboux, “Utilisation de Fonctions Splines pour la Mise au Point D'un Capteur Tridimensionnel sans Contact,” Quelques Applications Medicales, Jul. 1991.
Cinquin et al., “Computer Assisted Medical Interventions,” IEEE Engineering in Medicine and Biology, May/Jun. 1995, pp. 254-263.
Cinquin et al., “Computer Assisted Medical Interventions,” International Advanced Robotics Programme, Sep. 1989, pp. 63-65.
Clarysse et al., “A Computer-Assisted System for 3-D Frameless Localization in Stereotaxic MRI,” IEEE Transactions on Medical Imaging, vol. 10, No. 4, Dec. 1991, pp. 523-529.
Cutting M.D. et al., Optical Tracking of Bone Fragments During Craniofacial Surgery, Second Annual International Symposium on Medical Robotics and Computer Assisted Surgery, pp. 221-225, (Nov. 1995).
Feldmar et al., “3D-2D Projective Registration of Free-Form Curves and Surfaces,” Rapport de recherche (Inria Sophia Antipolis), 1994, pp. 1-44.
Foley et al., “Fundamentals of Interactive Computer Graphics,” The Systems Programming Series, Chapter 7, Jul. 1984, pp. 245-266.
Foley et al., “Image-guided Intraoperative Spinal Localization,” Intraoperative Neuroprotection, Chapter 19, 1996, pp. 325-340.
Foley, “The StealthStation: Three-Dimensional Image-Interactive Guidance for the Spine Surgeon,” Spinal Frontiers, Apr. 1996, pp. 7-9.
Friets, E.M., et al. A Frameless Stereotaxic Operating Microscope for Neurosurgery, IEEE Trans. On Biomed. Eng., vol. 36, No. 6, pp. 608-617 (Jul. 1989).
Gallen, C.C., et al., Intracranial Neurosurgery Guided by Functional Imaging, Surg. Neurol., vol. 42, pp. 523-530 (1994).
Galloway, R.L., et al., Interactive Image-Guided Neurosurgery, IEEE Trans. on Biomed. Eng., vol. 89, No. 12, pp. 1226-1231 (1992).
Galloway, R.L., Jr. et al., Optical localization for interactive, image-guided neurosurgery, SPIE, vol. 2164, pp. 137-145 (undated).
Germano, “Instrumentation, Technique and Technology”, Neurosurgery, vol. 37, No. 2, Aug. 1995, pp. 348-350.
Gildenberg et al., “Calculation of Stereotactic Coordinates from the Computed Tomographic Scan,” Neurosurgery, vol. 10, No. 5, May 1982, pp. 580-586.
Gomez, C.R., et al., Transcranial Doppler Ultrasound Following Closed Head Injury: Vasospasm or Vasoparalysis?, Surg. Neurol., vol. 35, pp. 30-35 (1991).
Gonzalez, “Digital Image Fundamentals,” Digital Image processing, Second Edition, 1987, pp. 52-54.
Gottesfeld Brown et al., “Registration of Planar Film Radiographs with Computer Tomography,” Proceedings of MMBIA, Jun. 1996, pp. 42-51.
Grimson, W.E.L., An Automatic Registration Method for Frameless Stereotaxy, Image Guided Surgery, and enhanced Reality Visualization, IEEE, pp. 430-436 (1994).
Grimson, W.E.L., et al., Virtual-reality technology is giving surgeons the equivalent of x-ray vision helping them to remove tumors more effectively, to minimize surgical wounds and to avoid damaging critical tissues, Sci. Amer., vol. 280, No. 6, pp. 62-69 (Jun. 1999).
Gueziec et al., “Registration of Computed Tomography Data to a Surgical Robot Using Fluoroscopy: A Feasibility Study,” Computer Science/Mathematics, Sep. 27, 1996, 6 pages.
Guthrie, B.L., Graphic-Interactive Cranial Surgery: The Operating Arm System, Handbook of Stereotaxy Using the CRW Apparatus, Chapter 13, pp. 193-211 (undated).
Hamadeh et al, “Kinematic Study of Lumbar Spine Using Functional Radiographies and 3D/2D Registration,” TIMC UMR 5525—IMAG.
Hamadeh et al., “Automated 3-Dimensional Computed Tomographic and Fluoroscopic Image Registration,” Computer Aided Surgery (1998), 3:11-19.
Hamadeh et al., “Towards Automatic Registration Between CT and X-ray Images: Cooperation Between 3D/2D Registration and 2D Edge Detection,” MRCAS '95, pp. 39-46.
Hardy, T., M.D., et al., CASS: A Program for Computer Assisted Stereotaxic Surgery, The Fifth Annual Symposium on Computer Applications in Medical Care, Proceedings, Nov. 1-4, 1981, IEEE, pp. 1116-1126, (1981).
Hatch, “Reference-Display System for the Integration of CT Scanning and the Operating Microscope,” Thesis, Thayer School of Engineering, Oct. 1984, pp. 1-189.
Hatch, et al., “Reference-Display System for the Integration of CT Scanning and the Operating Microscope”, Proceedings of the Eleventh Annual Northeast Bioengineering Conference, Mar. 14-15, 1985, pp. 252-254.
Heilbrun et al., “Preliminary experience with Brown-Roberts-Wells (BRW) computerized tomography stereotaxic guidance system,” Journal of Neurosurgery, vol. 59, Aug. 1983, pp. 217-222.
Heilbrun, M.D., Progressive Technology Applications, Neurosurgery for the Third Millennium, Chapter 15, J. Whitaker & Sons, Ltd., Amer. Assoc. of Neurol. Surgeons, pp. 191-198 (1992).
Heilbrun, M.P., Computed Tomography—Guided Stereotactic Systems, Clinical Neurosurgery, Chapter 31, pp. 564-581 (1983).
Heilbrun, M.P., et al., Stereotactic Localization and Guidance Using a Machine Vision Technique, Stereotact. & Funct. Neurosurg., Proceed. of the Mtg. of the Amer. Soc. for Stereot. and Funct. Neurosurg. (Pittsburgh, PA) vol. 58, pp. 94-98 (1992).
Henderson et al., “An Accurate and Ergonomic Method of Registration for Image-guided Neurosurgery,” Computerized Medical Imaging and Graphics, vol. 18, No. 4, Jul.-Aug. 1994, pp. 273-277.
Hoerenz, “The Operating Microscope I. Optical Principles, Illumination Systems, and Support Systems,” Journal of Microsurgery, vol. 1, 1980, pp. 364-369.
Hofstetter et al., “Fluoroscopy Based Surgical Navigation—Concept and Clinical Applications,” Computer Assisted Radiology and Surgery, 1997, pp. 956-960.
Horner et al., “A Comparison of CT-Stereotaxic Brain Biopsy Techniques,” Investigative Radiology, Sep.-Oct. 1984, pp. 367-373.
Hounsfield, “Computerized transverse axial scanning (tomography): Part 1. Description of system,” British Journal of Radiology, vol. 46, No. 552, Dec. 1973, pp. 1016-1022.
International Search Report and Written Opinion mailed Mar. 20, 2007 for PCT/US06/36647 claiming benefit of U.S. Appl. No. 11/232,445, filed Sep. 21, 2005.
Jacques et al., “A Computerized Microstereotactic Method to Approach, 3-Dimensionally Reconstruct, Remove and Adjuvantly Treat Small CNS Lesions,” Applied Neurophysiology, vol. 43, 1980, pp. 176-182.
Jacques et al., “Computerized three-dimensional stereotaxic removal of small central nervous system lesion in patients,” J. Neurosurg., vol. 53, Dec. 1980, pp. 816-820.
Joskowicz et al., “Computer-Aided Image-Guided Bone Fracture Surgery: Concept and Implementation,” CAR '98, pp. 710-715.
Kall, B., The Impact of Computer and Imaging Technology on Stereotactic Surgery, Proceedings of the Meeting of the American Society for Stereotactic and Functional Neurosurgery, pp. 10-22 (1987).
Kato, A., et al., A frameless, armless navigational system for computer-assisted neurosurgery, J. Neurosurg., vol. 74, pp. 845-849 (May 1991).
Kelly et al., “Computer-assisted stereotaxic laser resection of intra-axial brain neoplasms,” Journal of Neurosurgery, vol. 64, Mar. 1986, pp. 427-439.
Kelly et al., “Precision Resection of Intra-Axial CNS Lesions by CT-Based Stereotactic Craniotomy and Computer Monitored CO2 Laser,” Acta Neurochirurgica, vol. 68, 1983, pp. 1-9.
Kelly, P.J., Computer Assisted Stereotactic Biopsy and Volumetric Resection of Pediatric Brain Tumors, Brain Tumors in Children, Neurologic Clinics, vol. 9, No. 2, pp. 317-336 (May 1991).
Kelly, P.J., Computer-Directed Stereotactic Resection of Brain Tumors, Neurologica Operative Atlas, vol. 1, No. 4, pp. 299-313 (1991).
Kelly, P.J., et al., Results of Computed Tomography-based Computer-assisted Stereotactic Resection of Metastatic Intracranial Tumors, Neurosurgery, vol. 22, No. 1, Part 1, 1988, pp. 7-17 (Jan. 1988).
Kelly, P.J., Stereotactic Imaging, Surgical Planning and Computer-Assisted Resection of Intracranial Lesions: Methods and Results, Advances and Technical Standards in Neurosurgery, vol. 17, pp. 78-118, (1990).
Kim, W.S. et al., A Helmet Mounted Display for Telerobotics, IEEE, pp. 543-547 (1988).
Klimek, L., et al., Long-Term Experience with Different Types of Localization Systems in Skull-Base Surgery, Ear, Nose & Throat Surgery, Chapter 51, pp. 635-638 (undated).
Kosugi, Y., et al., An Articulated Neurosurgical Navigation System Using MRI and CT Images, IEEE Trans. on Biomed, Eng. vol. 35, No. 2, pp. 147-152 (Feb. 1988).
Krybus, W., et al., Navigation Support for Surgery by Means of Optical Position Detection, Computer Assisted Radiology Proceed. of the Intl. Symp. CAR '91 Computed Assisted Radiology, pp. 362-366 (Jul. 3-6, 1991).
Kwoh, Y.S., Ph.D., et al., A New Computerized Tomographic-Aided Robotic Stereotaxis System, Robotics Age, vol. 7, No. 6, pp. 17-22 (Jun. 1985).
Laitinen et al., “An Adapter for Computed Tomography-Guided Stereotaxis,” Surg. Neurol., 1985, pp. 559-566.
Laitinen, “Noninvasive multipurpose stereoadapter,” Neurological Research, Jun. 1987, pp. 137-141.
Lavallee “VI Adaption de la Methodologie a Quelques Applications Cliniques,” Chapitre VI, pp. 133-148.
Lavallee et al, “Matching 3-D Smooth Surfaces with their 2-D Projections using 3-D Distance Maps,” SPIE, vol. 1570, Geometric Methods in Computer Vision, 1991, pp. 322-336.
Lavallee et al., “Computer Assisted Driving of a Needle into the Brain,” Proceedings of the International Symposium CAR '89, Computer Assisted Radiology, 1989, pp. 416-420.
Lavallee et al., “Computer Assisted Interventionist Imaging: The Instance of Stereotactic Brain Surgery,” North-Holland MEDINFO 89, Part 1, 1989, pp. 613-617.
Lavallee et al., “Computer Assisted Spine Surgery: A Technique for Accurate Transpedicular Screw Fixation Using CT Data and a 3-D Optical Localizer,” TIMC, Faculte de Medecine de Grenoble.
Lavallee et al., “Image guided operating robot: a clinical application in stereotactic neurosurgery,” Proceedings of the 1992 IEEE International Conference on Robotics and Automation, May 1992, pp. 618-624.
Lavallee et al., “Matching of Medical Images for Computed and Robot Assisted Surgery,” IEEE EMBS, Orlando, 1991.
Lavallee, “A New System for Computer Assisted Neurosurgery,” IEEE Engineering in Medicine & Biology Society 11th Annual International Conference, 1989, pp. 0926-0927.
Lavallee, S., et al., Computer Assisted Knee Anterior Cruciate Ligament Reconstruction First Clinical Tests, Proceedings of the First International Symposium on Medical Robotics and Computer Assisted Surgery, pp. 11-16 (Sep. 1994).
Lavallee, S., et al., Computer Assisted Medical Interventions, NATO ASI Series, vol. F 60, 3d Imaging in Medic., pp. 301-312 (1990).
Leavitt, D.D., et al., Dynamic Field Shaping to Optimize Stereotactic Radiosurgery, I.J. Rad. Onc. Biol. Physc., vol. 21, pp. 1247-1255 (1991).
Leksell et al., “Stereotaxis and Tomography—A Technical Note,” ACTA Neurochirurgica, vol. 52, 1980, pp. 1-7.
Lemieux et al., “A Patient-to-Computed-Tomography Image Registration Method Based on Digitally Reconstructed Radiographs,” Med. Phys. 21 (11), Nov. 1994, pp. 1749-1760.
Levin et al., “The Brain: Integrated Three-dimensional Display of MR and PET Images,” Radiology, vol. 172, No. 3, Sep. 1989, pp. 783-789.
Maurer, Jr., et al., Registration of Head CT Images to Physical Space Using a Weighted Combination of Points and Surfaces, IEEE Trans. on Med. Imaging, vol. 17, No. 5, pp. 753-761 (Oct. 1998).
Mazier et al., “Computer-Assisted Interventionist Imaging: Application to the Vertebral Column Surgery,” Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 12, No. 1, 1990, pp. 0430-0431.
Mazier et al., Chirurgie de la Colonne Vertebrale Assistee par Ordinateur: Application au Vissage Pediculaire, Innov. Tech. Biol. Med., vol. 11, No. 5, 1990, pp. 559-566.
McGirr, S., M.D., et al., Stereotactic Resection of Juvenile Pilocytic Astrocytomas of the Thalamus and Basal Ganglia, Neurosurgery, vol. 20, No. 3, pp. 447-452, (1987).
Merloz, et al., “Computer Assisted Spine Surgery”, Clinical Assisted Spine Surgery, No. 337, pp. 86-96.
Ng, W.S. et al., Robotic Surgery—A First-Hand Experience in Transurethral Resection of the Prostate Surgery, IEEE Eng. in Med. and Biology, pp. 120-125 (Mar. 1993).
Pelizzari et al., “Accurate Three-Dimensional Registration of CT, PET, and/or MR Images of the Brain,” Journal of Computer Assisted Tomography, Jan./Feb. 1989, pp. 20-26.
Pelizzari et al., “Interactive 3D Patient-Image Registration,” Information Processing in Medical Imaging, 12th International Conference, IPMI '91, Jul. 7-12, 136-141 (A.C.F. Colchester et al. eds. 1991).
Pelizzari et al., No. 52813 “Three Dimensional Correlation of PET, CT and MRI Images,” The Journal of Nuclear Medicine, vol. 28, No. 4, Apr. 1987, p. 682.
Penn, R.D., et al., Stereotactic Surgery with Image Processing of Computerized Tomographic Scans, Neurosurgery, vol. 3, No. 2, pp. 157-163 (Sep.-Oct. 1978).
Phillips et al., “Image Guided Orthopaedic Surgery Design and Analysis,” Trans Inst. MC, vol. 17, No. 5, 1995, pp. 251-264.
Pixsys, 3-D Digitizing Accessories, by Pixsys (marketing brochure)(undated) (2 pages).
Potamianos et al., “Intra-Operative Imaging Guidance for Keyhole Surgery Methodology and Calibration,” First International Symposium on Medical Robotics and Computer Assisted Surgery, Sep. 22-24, 1994, pp. 98-104.
Prestige Cervical Disc System Surgical Technique, 12 pgs.
Reinhardt et al., “CT-Guided ‘Real Time’ Stereotaxy,” ACTA Neurochirurgica, 1989.
Reinhardt, H., et al., A Computer-Assisted Device for Intraoperative CT-Correlated Localization of Brain Tumors, pp. 51-58 (1988).
Reinhardt, H.F. et al., Sonic Stereometry in Microsurgical Procedures for Deep-Seated Brain Tumors and Vascular Malformations, Neurosurgery, vol. 32, No. 1, pp. 51-57 (Jan. 1993).
Reinhardt, H.F., et al., Mikrochirurgische Entfernung tiefliegender Gefäßmißbildungen mit Hilfe der Sonar-Stereometrie (Microsurgical Removal of Deep-Seated Vascular Malformations Using Sonar Stereometry), Ultraschall in Med. 12, pp. 80-83 (1991).
Reinhardt, Hans. F., Neuronavigation: A Ten-Year Review, Neurosurgery, pp. 329-341 (undated).
Roberts et al., “A frameless stereotaxic integration of computerized tomographic imaging and the operating microscope,” J. Neurosurg., vol. 65, Oct. 1986, pp. 545-549.
Rosenbaum et al., “Computerized Tomography Guided Stereotaxis: A New Approach,” Applied Neurophysiology, vol. 43, No. 3-5, 1980, pp. 172-173.
Sautot, “Vissage Pediculaire Assiste Par Ordinateur,” Sep. 20, 1994.
Schueler et al., “Correction of Image Intensifier Distortion for Three-Dimensional X-Ray Angiography,” SPIE Medical Imaging 1995, vol. 2432, pp. 272-279.
Selvik et al., “A Roentgen Stereophotogrammetric System,” Acta Radiologica Diagnosis, 1983, pp. 343-352.
Shelden et al., “Development of a computerized microsteroetaxic method for localization and removal of minute CNS lesions under direct 3-D vision,” J. Neurosurg., vol. 52, 1980, pp. 21-27.
Simon, D.A., Accuracy Validation in Image-Guided Orthopaedic Surgery, Second Annual Intl. Symp. on Med. Rob. and Comp-Assisted Surgery, MRCAS '95, pp. 185-192 (undated).
Smith et al., “Computer Methods for Improved Diagnostic Image Display Applied to Stereotactic Neurosurgery,” Automedical, vol. 14, 1992, pp. 371-382 (4 unnumbered pages).
Smith et al., “The Neurostation™—A Highly Accurate, Minimally Invasive Solution to Frameless Stereotactic Neurosurgery,” Computerized Medical Imaging and Graphics, vol. 18, Jul.-Aug. 1994, pp. 247-256.
Smith, K.R., et al. Multimodality Image Analysis and Display Methods for Improved Tumor Localization in Stereotactic Neurosurgery, Annul Intl. Conf. of the IEEE Eng. in Med. And Biol. Soc., vol. 13, No. 1, p. 210 (1991).
Tan, K., Ph.D., et al., A frameless stereotactic approach to neurosurgical planning based on retrospective patient-image registration, J Neurosurgy, vol. 79, pp. 296-303 (Aug. 1993).
The Laitinen Stereotactic System, E2-E6.
Thompson, et al., A System for Anatomical and Functional Mapping of the Human Thalamus, Computers and Biomedical Research, vol. 10, pp. 9-24 (1977).
Trobraugh, J.W., et al., Frameless Stereotactic Ultrasonography: Method and Applications, Computerized Medical Imaging and Graphics, vol. 18, No. 4, pp. 235-246 (1994).
Viant et al., “A Computer Assisted Orthopaedic System for Distal Locking of Intramedullary Nails,” Proc. of MediMEC '95, Bristol, 1995, pp. 86-91.
Von Hanwehr et al., Foreword, Computerized Medical Imaging and Graphics, vol. 18, No. 4, pp. 225-228, (Jul.-Aug. 1994).
Wang, M.Y., et al., An Automatic Technique for Finding and Localizing Externally Attached Markers in CT and MR volume Images of the Head, IEEE Trans. on Biomed. Eng., vol. 43, No. 6, pp. 627-637 (Jun. 1996).
Watanabe et al., “Three-Dimensional Digitizer (Neuronavigator): New Equipment for Computed Tomography-Guided Stereotaxic Surgery,” Surgical Neurology, vol. 27, No. 6, Jun. 1987, pp. 543-547.
Watanabe, “Neuronavigator,” Igaku-no-Ayumi, vol. 137, No. 6, May 10, 1986, pp. 1-4.
Watanabe, E., M.D., et al., Open Surgery Assisted by the Neuronavigator, a Stereotactic, Articulated, Sensitive Arm, Neurosurgery, vol. 28, No. 6, pp. 792-800 (1991).
Weese et al., “An Approach to 2D/3D Registration of a Vertebra in 2D X-ray Fluoroscopies with 3D CT Images,” pp. 119-128.
Related Publications (1)
Number Date Country
20110060213 A1 Mar 2011 US
Continuations (1)
Number Date Country
Parent 11232445 Sep 2005 US
Child 12946309 US