Navigating a surgical instrument

Abstract
An instrument tracking device cooperates with an instrument during a surgical procedure, where the instrument has an elongated body extending along a first longitudinal axis and includes a working member at a distal tip. The tracking device includes a body having a distal end and a proximal end. An opening can be formed through the body along a second longitudinal axis. The opening can define a passthrough in the body from the distal end to the proximal end. A first tracking coil can be disposed in the body and can define a first tracking coil axis that is substantially coaxial with the second longitudinal axis of the body. A connection mechanism cooperates between the elongated body of the instrument and the body of the tracking device to secure the body to the elongated body upon passing at least a portion of the elongated body through the passthrough in an assembled position.
Description
FIELD

The present disclosure relates generally to image-guided surgery systems, and particularly to a removable electromagnetic instrument tracking device that selectively couples with a surgical handpiece.


BACKGROUND

The statements in this section merely provide background information related to the present disclosure and may not constitute prior art.


Surgical procedures can be performed on anatomies such as the human anatomy to provide a therapy to the anatomy. One area of surgery includes procedures performed on facial cavities of a patient, such as on the ear, nose or throat (ENT). In such a procedure, a working member such as a shaver, bur, or other tool connected to an instrument (e.g., a handpiece) may be inserted into such a cavity, for example to perform a shaping procedure. Because it is desirable to minimize trauma produced by such procedures, it is favorable to avoid forming any supplemental incisions or other access portals in the patient tissue.


Because the surgeon's view of the area of interest can be obscured by the surrounding tissue of the cavity, the ability of the surgeon to effectively apply a therapy can be reduced. Therefore, it is desirable to provide a mechanism that allows a surgeon to apply a therapy without reducing the effectiveness of the procedure or the ability to view the area being treated. Navigation of instruments relative to the anatomy can be used for this purpose.


In some navigation systems, instruments are provided with tracking devices. Sometimes, however, such tracking devices can be difficult to manipulate or can make the instrument cumbersome. In other instances, the tracking devices can protrude a significant distance from the instrument and interfere with a surgeon's sight line. Other tracking devices fail to provide accurate positional information in 3-dimensional space or may be susceptible to electromagnetic interference from nearby metal objects.


SUMMARY

An instrument tracking device cooperates with an instrument during a surgical procedure, where the instrument has an elongated body extending along a first longitudinal axis and includes a working member at a distal tip. The tracking device includes a body having a distal end and a proximal end. An opening can be formed through the body along a second longitudinal axis. The opening can define a passthrough in the body from the distal end to the proximal end. A first tracking coil can be disposed in the body and can define a first tracking coil axis that is substantially coaxial with the second longitudinal axis of the body. The first tracking coil can generate an output signal indicative of a position of the working member. A connection mechanism cooperates between the elongated body of the instrument and the body of the tracking device to secure the body to the elongated body upon passing at least a portion of the elongated body through the passthrough in an assembled position.


According to additional features, the first tracking coil can define a first tracking coil plane. A second tracking coil can be disposed in the body. The second tracking coil can define a second tracking coil plane that is substantially orthogonal relative to the first tracking coil plane. The second and third tracking coils can be disposed in the body at radially offset positions relative to the first tracking coil. The first and second longitudinal axes can be collinear in the assembled position.


According to other features, a third tracking coil can be disposed in the body. The third tracking coil can define a third tracking coil plane that is substantially orthogonal to the first and second tracking coil planes. The second tracking coil can define a second tracking coil axis. The third tracking coil can define a third tracking coil axis. The second and third tracking coil axes can be substantially perpendicular.


In one configuration, the body can include a conical section that extends radially outwardly from the distal end to the proximal end. The second and third tracking coils can be offset toward the proximal end of the body relative to the first tracking coil. The body can include a pair of lobes that project radially from the proximal end of the body and accommodate the second and third tracking coils, respectively. The body can include an extension portion formed intermediate of the pair of lobes that accommodates a wire harness. According to one configuration, the body can comprise a base having the first, second and third tracking coils disposed therein and a cap that covers the base. The base and the cap can be formed of plastic.


According to one configuration, the body can be low profile relative to the instrument, such that an outermost distance of the body measured radially from the first tracking coil axis is less than three times an inner diameter of the opening. The instrument can comprise a surgical handpiece. The working member can comprise a blade or a bur. In one configuration, the body can be removably attached to the elongated body of the instrument with an adhesive. In one example, the working member can be removable from the body. In another example, the working member can be fixed and non-removable relative to the body.


Further areas of applicability will become apparent from the description provided herein. It should be understood that the description and specific examples are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.





DRAWINGS

The drawings described herein are for illustration purposes only and are not intended to limit the scope of the present disclosure in any way.



FIG. 1 is an environmental view of a surgical navigation system, according to various embodiments;



FIG. 2A is a perspective view of an exemplary handpiece having a working member such as a blade and shown with an electromagnetic tracking device in an assembled position on a distal end of the handpiece;



FIG. 2B is a perspective view of a blade assembly including the blade shown in FIG. 2A;



FIG. 3A is an exploded view of the instrument tracking device illustrating a body having a base and a cap;



FIG. 3B is a perspective view of an instrument tracking device having a body according to additional features;



FIG. 4 is a longitudinal end view of the electromagnetic tracking device illustrated in FIG. 3A and shown with the cap in phantom;



FIG. 5 is a sectional view of the instrument tracking device taken along line 5-5 of FIG. 4;



FIG. 6 is a sectional view of the electromagnetic instrument tracking device taken along line 6-6 of FIG. 4;



FIG. 7 is a perspective view of the electromagnetic instrument tracking device shown operatively connected with a wire harness and electrical connector and shown removed from a bag; and



FIG. 8 is a longitudinal end view of an electromagnetic tracking device according to additional features.





DETAILED DESCRIPTION

The following description is merely exemplary in nature and is not intended to limit the present disclosure, application, or uses. Initially, one skilled in the art will understand that the system and apparatus disclosed herein can be used in any appropriate procedure. Although a head frame is illustrated attached to a cranium and image data is illustrated for a cranium, any appropriate portion of the anatomy can be imaged. Moreover, a head frame may not be necessary and a dynamic reference frame can be attached to any appropriate structure, such as a bone screw, an adhesive base, an orifice, etc. Furthermore, the following discussion is directed toward instrumentation used during a surgical procedure focused on a sinus cavity of a patient. It will be appreciated however that the teachings may be similarly applied to other ear, nose, throat (ENT) procedures or other surgical procedures in general, such as, but not limited to anterior skull base surgery.


With initial reference to FIG. 1, an electromagnetic (EM) image-guided surgery system according to one example of the present teachings is shown and generally identified at reference numeral 10. The EM image-guided surgery system 10 can generally include an instrument such as a handpiece 12 having a removable EM instrument tracking device 14 configured to communicate with an exemplary navigation system 20. The navigation system 20 can be used to track the location of the handpiece 12 relative to a patient 22 to assist in a surgical procedure. As will be described in greater detail herein, the EM instrument tracking device 14 can receive a signal, transmit a signal, or combinations thereof to provide information to the navigation system 20 to determine a location of the EM instrument tracking device 14. Prior to discussing in detail the EM instrument tracking device 14 according to the present teachings, a general discussion of the exemplary navigation system 20 is warranted.


The navigation system 20 can include an imaging device 24 that is used to acquire pre-, intra-, or post-operative or real-time image data of the patient 22. The image data acquired with the imaging device 24 can be used as part of the image data in the EM image-guided surgery system 10. In addition, data from atlas models can be used to produce patient images, such as those disclosed in U.S. patent application Ser. No. 10/687,539, filed Oct. 16, 2003, now U.S. Pat. App. Pub. No. 2005/0085714, entitled “METHOD AND APPARATUS FOR SURGICAL NAVIGATION OF A MULTIPLE PIECE CONSTRUCT FOR IMPLANTATION”, incorporated herein by reference. The imaging device 24 is, for example, a fluoroscopic X-ray imaging device that may be configured as a C-arm 26 having an X-ray source 28, an X-ray receiving section 30, an optional calibration and tracking target 32 and optional radiation sensors. The calibration and tracking target 32 includes calibration markers (not illustrated). Image data may also be acquired using other imaging devices, such as those discussed herein.


In the exemplary configuration, the EM instrument tracking device 14 is coupled to the handpiece 12 such that the navigation system 20 can track the location of the handpiece 12 relative to a patient 22 to assist in a surgical procedure. As will be described, the EM instrument tracking device 14 can be either removably attached or fixed (non-removably attached) to the handpiece 12. It should be further noted that the EM instrument tracking device 14 can be coupled to other devices including: catheters, probes, needles, leads, implants, etc., for tracking their location with the navigation system 20. Moreover, the navigated device may be used in any region of the body. The navigation system 20 and the various devices may be used in any appropriate procedure, such as one that is generally minimally invasive, arthroscopic, percutaneous, stereotactic, or an open procedure. Although an exemplary navigation system 20 including the imaging device 24 is discussed herein, one skilled in the art will understand that the disclosure is merely for clarity of the present discussion and any appropriate imaging system, navigation system, patient specific data, and non-patient specific data can be used. For example, the intraoperative imaging system can include an MRI imaging system, such as the PoleStar® MRI system or an O-arm™ imaging system, both sold by Medtronic, Inc. It will be understood that the navigation system 20 can incorporate or be used with any appropriate preoperatively or intraoperatively acquired image data.


An optional imaging device controller 34 may control the imaging device 24, such as the C-arm 26, which can capture the x-ray images received at the X-ray receiving section 30 and store the images for later use. The controller 34 may also be separate from the C-arm 26 and can be part of or incorporated into a work station 36. The controller 34 can control the rotation of the C-arm 26. For example, the C-arm 26 can move in the direction of arrow 38 or rotate about a longitudinal axis 22a of the patient 22, allowing anterior or lateral views of the patient 22 to be imaged. Each of these movements involves rotation about a mechanical axis 39 of the C-arm 26. The movements of the imaging device 24, such as the C-arm 26 can be tracked with a tracking device 40.


In the example of FIG. 1, the longitudinal axis 22a of the patient 22 is substantially in line with the mechanical axis 39 of the C-arm 26. This enables the C-arm 26 to be rotated relative to the patient 22, allowing images of the patient 22 to be taken from multiple directions or in multiple planes. An example of a fluoroscopic C-arm X-ray device that may be used as the imaging device 24 is the “Series 9600 Mobile Digital Imaging System,” from GE Healthcare, (formerly OEC Medical Systems, Inc.) of Salt Lake City, Utah. Other exemplary fluoroscopes include bi-plane fluoroscopic systems, ceiling mounted fluoroscopic systems, cath-lab fluoroscopic systems, fixed C-arm fluoroscopic systems, isocentric C-arm fluoroscopic systems, 3D fluoroscopic systems, intraoperative O-arm™ imaging systems, etc.


The C-arm X-ray imaging system 26 can include any appropriate image capture system, such as a digital or CCD camera, as is well understood in the art. Two-dimensional fluoroscopic images taken by the imaging device 24 are captured and stored in the imaging device 24. Multiple two-dimensional images taken by the imaging device 24 may also be captured and assembled to provide a larger view or image of a whole region of the patient 22, as opposed to being directed to only a portion of a region of the patient. The multiple image data can include multiple 2D slices that are assembled into a 3D model or image.


The image data can then be forwarded from the imaging device controller 34 to the navigation computer and/or processor controller or work station 36, which has a display device 42 to display image data 44 and a user interface 46. The work station 36 can also include or be connected to an image processor, a navigation processor, and a memory to hold instructions and data. The work station 36 can include an optimization processor that assists in a navigated procedure. It will also be understood that the image data is not necessarily first retained in the controller 34, but may also be directly transmitted to the work station 36. Moreover, processing for the navigation system 20 and optimization can all be done with a single processor or multiple processors, all of which may or may not be included in the work station 36.


The work station 36 provides facilities for displaying the image data 44 as an image on the display device 42, saving, digitally manipulating, or printing a hard copy image of the received image data. The user interface 46, which may be a keyboard, mouse, touch pen, touch screen or other suitable device, allows a physician or user 50 to provide inputs to control the imaging device 24, via the controller 34, or adjust the display settings of the display 42. The work station 36 may also direct the controller 34 to adjust the rotational axis 38 of the C-arm 26 to obtain various two-dimensional images in different planes in order to generate representative two-dimensional and three-dimensional images.


While the imaging device 24 is shown in FIG. 1, any other alternative 2D, 3D or 4D imaging modality may also be used. For example, any 2D, 3D or 4D imaging device may be used, such as isocentric fluoroscopy, bi-plane fluoroscopy, ultrasound, computed tomography (CT), multi-slice computed tomography (MSCT), magnetic resonance imaging (MRI), positron emission tomography (PET), optical coherence tomography (OCT) (a more detailed discussion of OCT is set forth in U.S. Pat. No. 5,740,808, issued Apr. 21, 1998, entitled “Systems And Methods For Guiding Diagnostic Or Therapeutic Devices In Interior Tissue Regions”, which is hereby incorporated by reference), intra-vascular ultrasound (IVUS), intraoperative CT, single photon emission computed tomography (SPECT), or planar gamma scintigraphy (PGS). Additional imaging systems include intraoperative MRI systems, such as the PoleStar® MRI system sold by Medtronic, Inc. Further systems include the O-Arm™ imaging system sold by Medtronic, Inc. The images may also be obtained and displayed in two, three or four dimensions. In more advanced forms, four-dimensional surface rendering of regions of the body may also be achieved by incorporating patient data or other data from an atlas or anatomical model map, or from pre-operative image data captured by MRI, CT, or echocardiography modalities.


Image datasets from hybrid modalities, such as positron emission tomography (PET) combined with CT, or single photon emission computed tomography (SPECT) combined with CT, could also provide functional image data superimposed onto anatomical data to be used to confidently reach target sites within the patient 22. It should further be noted that the imaging device 24, as shown in FIG. 1, provides a virtual bi-plane image using a single-head C-arm fluoroscope as the imaging device 24 by simply rotating the C-arm 26 about at least two planes, which could be orthogonal planes, to generate two-dimensional images that can be converted to three-dimensional volumetric images. By acquiring images in more than one plane, an icon representing the location of an impacter, stylet, reamer driver, tap, drill, or other instrument or probe introduced and advanced in the patient 22 may be superimposed in more than one view on the display 42, allowing simulated bi-plane or even multi-plane views, including two- and three-dimensional views.


Four-dimensional (4D) image information can be used with the navigation system 20 as well. For example, the user 50 can use a physiologic signal, such as heart rate (EKG) or breath rate (breath gating), and combine this data with image data 44 acquired during the phases of the physiologic signal to represent the anatomy at various stages of the physiologic cycle. For example, the brain pulses (and therefore moves) with each heartbeat. Images can be acquired to create a 4D map of the brain, onto which atlas data and representations of the instrument can be projected. This 4D data set can be matched and co-registered with the physiologic signal (EKG) to represent a compensated image within the EM image-guided surgery system 10. The image data registered with the 4D information can show the brain (or anatomy of interest) moving during the cardiac or breath cycle. This movement can be displayed on the display 42 as the image data 44.
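

The gating step can be illustrated conceptually with the short sketch below. It is a minimal example only, with hypothetical names and no connection to the disclosed system: acquired image frames are binned by cardiac phase computed from recorded EKG R-peak times, so that each bin approximates the anatomy at one stage of the physiologic cycle.

```python
# Illustrative sketch only (hypothetical names): bin image frames by cardiac
# phase so each bin approximates the anatomy at one stage of the cycle.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Frame:
    image: object            # placeholder for pixel data
    acquisition_time: float  # seconds, on the same clock as the EKG


def cardiac_phase(t: float, r_peaks: List[float]) -> float:
    """Return phase in [0, 1) for time t, given sorted R-peak times spanning t."""
    for start, end in zip(r_peaks, r_peaks[1:]):
        if start <= t < end:
            return (t - start) / (end - start)
    raise ValueError("time lies outside the recorded EKG span")


def bin_by_phase(frames: List[Frame], r_peaks: List[float], n_bins: int = 10) -> Dict[int, List[Frame]]:
    bins: Dict[int, List[Frame]] = {b: [] for b in range(n_bins)}
    for f in frames:
        b = min(int(cardiac_phase(f.acquisition_time, r_peaks) * n_bins), n_bins - 1)
        bins[b].append(f)
    return bins
```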


Likewise, other imaging modalities can be used to gather the 4D dataset to which pre-operative 2D and 3D data can be matched. One need not necessarily acquire multiple 2D or 3D images during the physiologic cycle of interest (breath or heartbeat). Ultrasound imaging or other 4D imaging modalities can be used to create image data that allows a single static pre-operative image to be matched via image-fusion techniques and/or non-linear matching algorithms that account for the distortion of the anatomy caused by movement during the physiologic cycle. The combination of a dynamic reference frame 60 and 4D registration techniques can help compensate for anatomic distortions during movements of the anatomy associated with normal physiologic processes.


The EM image-guided surgery system 10 includes a localizer, such as a coil array 64 and/or a second coil array 66, a coil array controller 68, a navigation handpiece interface 70, the handpiece 12 (e.g., catheter, needle, or instruments, as discussed herein) and the dynamic reference frame 60. The dynamic reference frame 60 can include a dynamic reference frame holder or EM ENT head frame 74 and a removable EM ENT patient tracker 76. In one example, the EM ENT patient tracker 76 can be used with Synergy™ ENT software provided by Medtronic, Inc., on a computer-assisted surgery system (such as the EM image-guided surgery system 10 disclosed herein) to track the position of a patient's head during surgery. The EM ENT head frame 74 can be mounted to the patient's forehead, such as by using an adhesive pad, a silicone pad and a frame strap. Alternatively, the head frame 74 can include a tracking device that can be formed integrally with the head frame 74.


Moreover, the dynamic reference frame (DRF) 60 can be provided as separate pieces and can be positioned at any appropriate position on the anatomy. For example, the ENT patient tracker 76 of the DRF 60 can be fixed to the skin of the patient 22 with an adhesive. Also, the DRF 60 can be positioned near a leg, arm, etc. of the patient 22. Thus, the DRF 60 does not need to be provided with a head frame or require any specific base or holding portion.


The tracking devices 14, 40, 76, or any tracking device as discussed herein, can include a sensor, a transmitter, or combinations thereof. Further, the tracking devices can be wired or wireless to provide a signal emitter or receiver within the navigation system 20. For example, any or all of the tracking devices 14, 40, 76 can include an electromagnetic coil to sense a field produced by the localizing arrays 64, 66. Nevertheless, one will understand that the tracking devices 14, 40, 76 can receive a signal, transmit a signal, or combinations thereof to provide information to the navigation system 20 to determine a location of the tracking device 14, 40, 76. Therefore, as used herein, “generating an output signal” is used to mean any combination of receiving a signal, transmitting a signal, or combinations thereof. The navigation system 20 can then determine a position of the handpiece 12 or tracking device 14, 40, 76 to allow for navigation relative to the patient 22 and patient space. One suitable navigation system is disclosed in commonly owned U.S. Publication No. 2008/0132909, filed Jun. 5, 2008, the contents of which are expressly incorporated herein by reference. One suitable commercially available navigation system is the “STEALTHSTATION® AXIEM™ Navigation System.”


The coil arrays 64, 66 may also be supplemented or replaced with a mobile localizer. The mobile localizer may be one such as that described in U.S. patent application Ser. No. 10/941,782, filed Sep. 15, 2004, now U.S. Pat. App. Pub. No. 2005/0085720, entitled “METHOD AND APPARATUS FOR SURGICAL NAVIGATION”, herein incorporated by reference. As is understood, the localizer array can transmit signals that are received by the tracking devices 14, 40, 76. The tracking devices 14, 40, 76 can then transmit or receive signals based upon the signals received from or transmitted to the arrays 64, 66.


Further included in the navigation system 20 may be an isolator circuit or assembly (not illustrated separately). The isolator circuit or assembly may be included in a transmission line to interrupt a line carrying a signal or a voltage to the navigation handpiece interface 70. Alternatively, the isolator circuit included in the isolator box may be included in the navigation handpiece interface 70, the handpiece 12, the dynamic reference frame 60, the transmission lines coupling the devices, or any other appropriate location. The isolator assembly is operable to isolate any of the instruments or patient coincidence instruments or portions that are in contact with the patient 22 should an undesirable electrical surge or voltage take place.


The EM image-guided surgery system 10 uses the coil arrays 64, 66 to create an electromagnetic field used for navigation. The coil arrays 64, 66 can include a plurality of coils that are each operable to generate distinct electromagnetic fields into the navigation region of the patient 22, which is sometimes referred to as patient space. Representative electromagnetic systems are set forth in U.S. Pat. No. 5,913,820, entitled “Position Location System,” issued Jun. 22, 1999, and U.S. Pat. No. 5,592,939, entitled “Method and System for Navigating a Catheter Probe,” issued Jan. 14, 1997, each of which is hereby incorporated by reference.


The coil array 64 is controlled or driven by the coil array controller 68. The coil array controller 68 drives each coil in the coil array 64 in a time division multiplex or a frequency division multiplex manner. In this regard, each coil may be driven separately at a distinct time or all of the coils may be driven simultaneously with each being driven by a different frequency.
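

The difference between the two drive schemes can be illustrated with the short sketch below. The function names and parameter values are hypothetical and do not represent the coil array controller 68; the sketch only shows that, under time-division multiplexing, a single coil is energized during each time slot, whereas under frequency-division multiplexing all coils are energized simultaneously, each at a distinct frequency.

```python
# Illustrative sketch only (hypothetical names and values): two ways a controller
# could drive the transmit coils so their fields remain distinguishable.
import math
from typing import List


def tdm_drive(n_coils: int, t: float, slot_s: float = 0.01, freq_hz: float = 10_000.0) -> List[float]:
    """Time-division multiplexing: only one coil is energized in each time slot."""
    active = int(t / slot_s) % n_coils
    return [math.sin(2 * math.pi * freq_hz * t) if i == active else 0.0
            for i in range(n_coils)]


def fdm_drive(n_coils: int, t: float, base_hz: float = 10_000.0, spacing_hz: float = 500.0) -> List[float]:
    """Frequency-division multiplexing: all coils driven at once, each at its own frequency."""
    return [math.sin(2 * math.pi * (base_hz + i * spacing_hz) * t)
            for i in range(n_coils)]
```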


Upon driving the coils in the coil array 64 with the coil array controller 68, electromagnetic fields are generated within the patient 22 in the area where the medical procedure is being performed, which is again sometimes referred to as patient space. The electromagnetic fields generated in the patient space induce currents in the EM instrument tracking device 14 and patient tracking device 76. These induced signals from the tracking devices 14, 40, 76 are delivered to the navigation handpiece interface 70 and subsequently forwarded to the coil array controller 68. The navigation handpiece interface 70 can also include amplifiers, filters and buffers to directly interface with the EM instrument tracking device 14 on the handpiece 12. Alternatively, the EM instrument tracking device 14, or any other appropriate portion, may employ a wireless communications channel, such as that disclosed in U.S. Pat. No. 6,474,341, entitled “Surgical Communication Power System,” issued Nov. 5, 2002, herein incorporated by reference, as opposed to being coupled directly to the navigation handpiece interface 70.


Various portions of the navigation system 20, such as the handpiece 12, or the dynamic reference frame 60, are equipped with at least one, and generally multiple, EM or other tracking devices (i.e., such as 14 and 40), that may also be referred to as localization sensors. The EM tracking devices 14, 40 can include one or more coils that are operable with the EM localizer arrays 64, 66. For example, the EM instrument tracking device 14 includes three coils as will be described. An additional representative alternative localization and tracking system is set forth in U.S. Pat. No. 5,983,126, entitled “Catheter Location System and Method,” issued Nov. 9, 1999, which is hereby incorporated by reference. Alternatively, the localization system may be a hybrid system that includes components from various systems.


The dynamic reference frame 60 may be fixed to the patient 22 adjacent to the region being navigated so that any movement of the patient 22 is detected as relative motion between the coil array 64, 66 and the dynamic reference frame 60. The dynamic reference frame 60 can be interconnected with the patient 22 in any appropriate manner, including those discussed herein. Relative motion is forwarded to the coil array controller 68, which updates registration correlation and maintains accurate navigation, further discussed herein. The dynamic reference frame 60 may include any appropriate tracking sensor. Therefore, the dynamic reference frame 60 may also be optical, acoustic, etc. If the dynamic reference frame 60 is electromagnetic it can be configured as a pair of orthogonally oriented coils, each having the same center or may be configured in any other non-coaxial or co-axial coil configurations.


Prior to tracking the EM instrument tracking device 14, patient registration is performed. Patient registration is the process of determining how to correlate the position of the handpiece 12 relative to the patient 22 to the position on the diagnostic or image data. To register the patient 22, the physician or user 50 may use point registration by selecting and storing particular points (e.g. fiducial points) from the image data and then touching the corresponding points on the patient's anatomy (i.e., in this example various facial features such as the nose, lips, chin, cheeks or other locations near the nose) with the handpiece 12. The navigation system 20 analyzes the relationship between the two sets of points that are selected and computes a match, which correlates every point in the image data with its corresponding point on the patient's anatomy or the patient space.
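

One common way to compute such a match from paired fiducial points is a least-squares rigid fit via singular value decomposition (the Kabsch/Horn method). The sketch below is illustrative only; the disclosure does not specify this particular algorithm, and the function and variable names are assumptions.

```python
# Illustrative paired-point registration (Horn/Kabsch via SVD), with assumed
# names; it maps patient-space points into image space given N >= 3 point pairs.
import numpy as np


def register_points(image_pts: np.ndarray, patient_pts: np.ndarray):
    """Return rotation R (3x3) and translation t (3,) with image ~= R @ patient + t."""
    ci, cp = image_pts.mean(axis=0), patient_pts.mean(axis=0)
    H = (patient_pts - cp).T @ (image_pts - ci)                  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = ci - R @ cp
    return R, t


# Usage: R, t = register_points(image_fiducials, touched_fiducials)
# Any tracked patient-space point p then maps to image space as R @ p + t.
```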


The points that are selected to perform registration are the fiducial markers or landmarks, such as anatomical landmarks. Again, the landmarks or fiducial points are identifiable on the images and identifiable and accessible on the patient 22. The landmarks can be artificial landmarks that are positioned on the patient 22. The artificial landmarks, such as the fiducial markers, can also form part of the dynamic reference frame 60, such as those disclosed in U.S. Pat. No. 6,381,485, entitled “Registration of Human Anatomy Integrated for Electromagnetic Localization,” issued Apr. 30, 2002, herein incorporated by reference.


The patient 22 can include one or a plurality of fiducial markers (not specifically shown) affixed to the anatomy of a patient 22, such as a cranium. It will be understood that any appropriate number of the fiducial markers can be affixed to the patient 22 and in any appropriate location. The fiducial markers can be randomly attached to the patient 22 or attached in specific locations.


The fiducial markers can include any appropriate marker to be interconnected with the patient 22. For example, the markers sold by IZI Medical Products, Baltimore, Md., can be used. The markers can include an adhesive base that is adhered to the cranium. The fiducial markers can be associated or connected to the patient 22 with an adhesive, mounting screw, clamp, etc. According to various embodiments, the fiducial markers can be attached to the patient 22 with an adhesive in a selected manner prior to acquiring image data. The fiducial markers can be placed in predetermined locations or in random locations for imaging. The fiducial marker can also include a fiducial divot or identification divot. Once registration is performed, the navigation system 20 is ready to track the EM instrument tracking device 14.


Briefly, the navigation system 20 operates as follows. The navigation system 20 creates a translation map between all points in the image data generated from the imaging device 24 which can include external and internal portions, and the corresponding points in the patient's anatomy in patient space. After this map is established, whenever the tracked handpiece 12 is used, the work station 36 in combination with the coil array controller 68 uses the translation map to identify the corresponding point on the image data or atlas model, which is displayed on display 42. This identification is known as navigation or localization. An icon representing the localized point or instruments is shown on the display 42 within several two-dimensional image planes, as well as on three and four dimensional images and models.


To enable navigation, the navigation system 20 must be able to detect both the position of the patient's anatomy and the position of the handpiece 12 (or of the EM instrument tracking device 14 attached to the handpiece 12). Knowing the location of these two items allows the navigation system 20 to compute and display the position of the handpiece 12 or any portion thereof (i.e., a shaver blade, as will be described) in relation to the patient 22. The EM image-guided surgery system 10 is employed to track the handpiece 12 and the anatomy simultaneously.


The EM image-guided surgery system 10 essentially works by positioning the coil array 64, 66 adjacent to the patient 22 to generate a magnetic field, which can be low energy, and generally referred to as a navigation field. Because every point in the navigation field or patient space is associated with a unique field strength, the EM image-guided surgery system 10 can determine the position of the handpiece 12 by measuring the field strength at the location of the EM instrument tracking device 14. The dynamic reference frame 60 is fixed to the patient 22 to identify the location of the patient 22 in the navigation field. The EM image-guided surgery system 10 continuously recomputes or recalculates the relative position of the dynamic reference frame 60 and the handpiece 12 during localization and relates this spatial information to patient registration data to enable navigation of the handpiece 12 within and/or relative to the patient 22. Navigation can include image guidance or imageless guidance.
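

The position determination can be illustrated, in highly simplified form, by the sketch below. A practical system relies on a calibrated field model and also solves for sensor orientation; here, for illustration only, a point-dipole field stands in for each transmit coil and the sensor coil axis is assumed to be known. All names and the field model are hypothetical, not the disclosed implementation.

```python
# Conceptual sketch only: fit a sensor position so that predicted coupling with
# each transmit coil matches the measured induction amplitudes. A point-dipole
# model and a known sensor axis are simplifying assumptions, not the real system.
import numpy as np
from scipy.optimize import least_squares


def dipole_field(tx_pos: np.ndarray, tx_moment: np.ndarray, p: np.ndarray) -> np.ndarray:
    """Field of a dipole at tx_pos with moment tx_moment, evaluated at p (constants omitted)."""
    r = p - tx_pos
    d = np.linalg.norm(r)
    return 3.0 * r * (tx_moment @ r) / d**5 - tx_moment / d**3


def estimate_position(tx_positions, tx_moments, sensor_axis, measured, p0):
    """Least-squares fit of the sensor position given one amplitude per transmit coil."""
    def residual(p):
        predicted = [dipole_field(tp, tm, p) @ sensor_axis
                     for tp, tm in zip(tx_positions, tx_moments)]
        return np.asarray(predicted) - measured
    return least_squares(residual, p0).x
```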


The navigation system 20 may also perform registration using anatomic surface information or path information as is known in the art (and may be referred to as auto-registration). The EM image-guided surgery system 10 may also perform 2D to 3D registration by utilizing the acquired 2D images to register 3D volume images by use of contour algorithms, point algorithms or density comparison algorithms, as is known in the art. An exemplary 2D to 3D registration procedure is set forth in U.S. Ser. No. 10/644,680, now U.S. Pat. App. Pub. No. 2004/0215071, entitled “Method and Apparatus for Performing 2D to 3D Registration” filed on Aug. 20, 2003, hereby incorporated by reference.


In order to maintain registration accuracy, the navigation system 20 continuously tracks the position of the patient 22 during registration and navigation. This is because the patient 22, dynamic reference frame 60, and transmitter coil array 64, 66 may all move during the procedure, even when this movement is not desired. Alternatively the patient 22 may be held immobile once the registration has occurred, such as with a fixed head frame. Therefore, if the navigation system 20 did not track the position of the patient 22 or area of the anatomy, any patient movement after registration would result in inaccurate navigation within that image. The dynamic reference frame 60 allows the EM image-guided surgery system 10 to register and track the anatomy. Because the dynamic reference frame 60 is rigidly fixed to the patient 22, any movement of the anatomy or the coil array 64, 66 is detected as the relative motion between the coil array 64, 66 and the dynamic reference frame 60. This relative motion is communicated to the coil array controller 68, via the navigation handpiece interface 70, which updates the registration correlation to thereby maintain accurate navigation.
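

The role of the rigidly fixed dynamic reference frame can be expressed as a short transform-chain sketch (illustrative only; the names are assumptions, not from the disclosure): because both the tool and the DRF are measured in the same localizer frame, expressing the tool relative to the DRF cancels any common motion of the patient and DRF relative to the coil array before the registration is applied.

```python
# Illustrative sketch (assumed names): express the tracked tool relative to the
# dynamic reference frame so that common patient/DRF motion in the localizer
# frame cancels, then map the result into image space via the registration.
import numpy as np


def invert(T: np.ndarray) -> np.ndarray:
    """Invert a 4x4 homogeneous rigid transform."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti


def tool_in_image(T_loc_tool: np.ndarray, T_loc_dref: np.ndarray, T_image_dref: np.ndarray) -> np.ndarray:
    """T_loc_*: poses measured by the localizer; T_image_dref: registration result."""
    T_dref_tool = invert(T_loc_dref) @ T_loc_tool  # tool pose relative to the DRF
    return T_image_dref @ T_dref_tool              # tool pose relative to the image volume
```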


The navigation system 20 can be used according to any appropriate method or system. For example, image data, atlas or 3D models may be registered relative to the patient and patient space, as discussed further herein. Generally, the navigation system 20 allows the images on the display 42 to be registered and accurately display the real time location of the various instruments and other appropriate items. In addition, the handpiece 12 may be used to register the patient space to the pre-acquired image data or the atlas or 3D models. In addition, the dynamic reference frame 60 may be used to ensure that any planned or unplanned movement of the patient or the array 64, 66 is determined and used to correct the image on the display 42.


To obtain a maximum reference, the dynamic reference frame 60 can be fixed in each of at least six degrees of freedom. Thus, the dynamic reference frame 60 can be fixed relative to axial motion X, translational motion Y, rotational motion Z, yaw, pitch, and roll relative to the portion of the patient 22 to which it is attached. Any appropriate coordinate system can be used to describe the various degrees of freedom. Fixing the dynamic reference frame 60 relative to the patient 22 in this manner can assist in maintaining maximum accuracy of the navigation system 20.


As mentioned briefly above, the display 42 can display any appropriate type of image data 44. For example, the image data 44 can include patient specific image data that can be acquired at any appropriate time. The image data can include magnetic resonance imaging data (MRI) that can provide structural anatomical image data (i.e., such as a nasal cavity, etc.) of the patient 22. The image data 44 can be displayed on the display 42 for use during a procedure by the user 50. The display 42 can also include various atlas image data. Atlas image data can include two-dimensional image data sets, three-dimensional image data sets, and even four-dimensional image data sets that show the change of various anatomical structures over time.


With reference now to FIG. 2A, the electromagnetic tracking device 14 is shown operatively coupled with the exemplary handpiece 12. One suitable handpiece is commercially available under the name “MENT M4 Straightshot™ Handpiece”, sold by Medtronic, Inc. The handpiece 12 can include a removably (or non-removably) attached working member 80, such as a debrider, bur or blade for example. The blade 80 can include a blade tip 82. According to the exemplary embodiment, the blade tip 82 can be specifically adapted for ear, nose and throat (ENT) surgical procedures for burring or shaping various ENT cavities. While only one blade 80 is shown, the EM image-guided surgery system 10 can include a plurality of blades each having different characteristics. The location of a respective tip (i.e., 82) for each blade (i.e., 80) can be programmed into the image-guided surgery system 10, such as at a terminal connector of the handpiece 12.


In one example, the working member 80 can be part of a blade assembly 84 (FIG. 2B) that is fixed to the electromagnetic tracking device 14. In another example, the working member 80, or the blade assembly 84 as a whole, can be releasably coupled to the electromagnetic tracking device 14. The blade assembly 84 can include an outer hub 86 having locating tabs 88 and an aspirating port 90. A rotating hub 92 can be arranged intermediate of an inner tapered hub 94 and the outer hub 86. In an assembled position (FIG. 2A), the locating tabs 88 can engage structure (not specifically shown) on the handpiece 12. During use, rotational motion is imparted into the rotating hub 92, which rotates within and relative to the outer hub 86.


With reference now to FIG. 3A, an exploded view of the EM instrument tracking device 14 is shown. The EM instrument tracking device 14 can generally include a body or housing 100 that comprises a base 102 and a cap 104. The cap 104 defines a cap opening 106 at a distal end 108 and first and second lobes 110 and 112, respectively, formed at a proximal end 114. The cap 104 includes a cap body 116 that defines a generally conical shape that expands radially outwardly from the distal end 108 to the proximal end 114. A cap extension portion 120 is formed on the cap 104 generally between the first and second lobes 110 and 112. An accommodation portion 122 is formed generally on the conical section of the cap body 116 and aligned with the cap extension portion 120.


The base 102 generally includes a base opening 126 formed at a distal end 128. The base opening 126 can be concentric with a shaft of the working member 80. A base longitudinal axis 129 can be defined through the base opening 126. The longitudinal axis 170 of the handpiece 12 can be collinear with the base longitudinal axis 129 (FIG. 5). A second tracking coil support 130 and a third tracking coil support 132 can be formed at a proximal end 134 of the base 102. The base 102 generally provides a base body 136 that is in the form of a conically tapered section that expands radially outwardly from the distal end 128 of the base 102 to the proximal end 134 of the base 102. A cable support 140 is formed on the base 102 generally between the second tracking coil support 130 and the third tracking coil support 132. A first tracking coil 142 is generally disposed around the distal end 128 of the base 102 proximate to the base opening 126. A second tracking coil 144 is disposed generally within the second tracking coil support 130. A third tracking coil 146 is generally disposed within the third tracking coil support 132. The second and third tracking coils 144 and 146 are disposed radially outwardly in the body 100 relative to the first tracking coil 142. Explained differently, each of the second and third tracking coils 144 and 146 occupies a position offset radially relative to a circumference of the first tracking coil 142. Each of the first, second and third tracking coils 142, 144 and 146 is electrically connected to a terminal, such as a printed circuit board 148. A cable assembly 151 is electrically connected to the printed circuit board 148. In another example, a base 102′ can be provided (FIG. 3B) in which the printed circuit board 148 is eliminated and the terminal ends of the tracking coils 142, 144 and 146 are each electrically connected to respective wires of the cable assembly 151. A longitudinal support 150 can extend from the cable assembly 151 and be secured to the base 102′. The longitudinal support 150 can provide additional strength to the cable assembly 151 where it attaches to the base 102′ to inhibit inadvertent separation of the cable assembly 151 from the base 102′. In one example, the longitudinal support 150 can be formed of Kevlar® and be adhesively attached to the base 102′. Adhesive can also be disposed in grooves of the base 102′ at the respective connections between the terminal ends of the tracking coils 142, 144 and 146 and the corresponding wires of the cable assembly 151. Again, as mentioned above, the instrument tracking device 14 can alternatively communicate wirelessly with the navigation handpiece interface 70.


As can be appreciated, in an assembled position (i.e., FIGS. 2A, 4 and 5), the cap 104 is configured to be assembled generally over the base 102. In the assembled position, the distal end 108 of the cap 104 can enclose the first tracking coil 142 disposed on the distal end 128 of the base 102. Similarly, the first lobe 110 of the cap 104 can enclose the second tracking coil 144 and the second tracking coil support 130. The second lobe 112 of the cap 104 can enclose the third tracking coil 146 and the third tracking coil support 132. The cap extension portion 120 can enclose the cable support 140. The accommodation portion 122 of the cap 104 can enclose the printed circuit board 148. In one example, a flowable adhesive, such as an epoxy, can be disposed within the annular space defined between the base 102 and the cap 104 upon assembly for maintaining the cap 104 and the base 102 in a secured position relative to each other. In one example, the body 100 can be formed of molded plastic. The second and third tracking coils 144 and 146 can be molded into the base 102. Alternatively, the second and third tracking coils 144 and 146 can be glued to the base 102. Epoxy can then be urged around the second and third tracking coils 144 and 146 in the base 102 to retain them. The EM instrument tracking device 14 is simple to manufacture and easy to manipulate onto and off of the instrument 12.


With additional reference now to FIGS. 4-6, additional features of the EM instrument tracking device 14 will be described. The first tracking coil 142 defines a first tracking coil plane 152. The second tracking coil 144 defines a second tracking coil plane 154. The third tracking coil 146 defines a third tracking coil plane 156. For purposes of discussion, each of the tracking coil planes 152, 154 and 156 is defined through a horizontal centerline of the respective tracking coil 142, 144 or 146. As illustrated in FIGS. 4 and 5, the first, second and third tracking coil planes 152, 154 and 156 are all arranged orthogonally relative to each other. Additionally, the first, second and third tracking coils 142, 144 and 146 define center points (i.e., the points at which their respective axes 162, 164 and 166 intersect their respective tracking coil planes 152, 154 and 156) that do not overlap.


A first tracking coil axis 162 is defined through the first tracking coil 142. A second coil axis 164 is defined through the second tracking coil 144. A third tracking coil axis 166 is defined through the third tracking coil 146. As best illustrated in FIG. 4, the second coil axis 164 is perpendicular relative to the third coil axis 166. The first tracking coil axis 162 is collinear with a longitudinal axis 170 defined by the handpiece 12. The first tracking coil 142 is therefore arranged perpendicular relative to the longitudinal axis 170 of the handpiece 12 and concentric with the working tool axis.


In use, each of the first, second and third tracking coils 142, 144 and 146 sense the electromagnetic field produced by the coil arrays 64 and 66 and generate an output signal to the navigation handpiece interface 70 indicative of a position of the working member 80, and more specifically the blade tip 82. The specific tracking coil configuration of the present disclosure where the first, second and third tracking coils 142, 144 and 146 are positioned orthogonally relative to each other and in the specific configuration relative to the shaver reduces electromagnetic interference from the handpiece 12, which contains a substantial amount of metallic material. The tracking coils 142, 144 and 146 can provide positional, translational and rotational information (6 degrees of freedom) along the respective axes 162, 164 and 166, along the respective planes 152, 154 and 156, and around the respective axes 162, 164 and 166 for the instrument tracking device 14 as a whole.


As can be appreciated, by utilizing three tracking coils (142, 144 and 146), the EM image-guided surgery system 10 can identify the location and orientation of the set of coils 142, 144 and 146 in space and therefore can calculate the location of the blade tip 82 in 3-dimensional space. In this way, translational motion along the X, Y and Z axes as well as yaw, pitch and roll relative to the patient 22 can be determined.
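

Once the pose of the tracking device is known in six degrees of freedom, locating the blade tip reduces to applying a fixed tip offset expressed in the device's own coordinate frame. The sketch below is a generic illustration of that step, with assumed names; the tip offset could come, for example, from the blade geometry programmed into the system as described above.

```python
# Illustrative sketch (assumed names): apply a fixed tip offset, expressed in the
# tracking device's own frame, to the tracked 6-DOF pose to obtain the tip position.
import numpy as np


def tip_position(R_device: np.ndarray, t_device: np.ndarray, tip_offset: np.ndarray) -> np.ndarray:
    """R_device (3x3), t_device (3,): tracked pose; tip_offset (3,): device-frame tip vector."""
    return R_device @ tip_offset + t_device
```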


As discussed above, the EM instrument tracking device 14 may be removably attached or fixed (non-removably attached) to the handpiece 12. In one example, the EM instrument tracking device 14 can be slidably advanced over a distal end of the handpiece 12 and secured with a connection mechanism 168. The connection mechanism 168 can comprise adhesive, cooperating threads, clips, snaps, fasteners or another suitable device. Explained generally, the distal end of the handpiece 12 can be inserted through a passthrough defined by the base opening 126. In other arrangements, the EM instrument tracking device 14 may be releasably coupled to the handpiece 12, such as by way of a snap-fit connection. It is contemplated, however, that the EM instrument tracking device 14 can be removably affixed to the handpiece 12 by any suitable method. In general, however, the EM instrument tracking device 14 is disposable, and because it is removably attached to the handpiece 12, it can also be easily replaced.


The electromagnetic tracking device 14 is low profile and aesthetically pleasing to the user 50 when attached to the handpiece 12. Explained further, the first and second lobes 110 and 112, which accommodate the second and third tracking coils 144 and 146, respectively, protrude radially outwardly only a minimal distance. In one example, the base opening 126 can define a diameter 174. An outermost distance 176 can be defined from the first tracking coil axis 162 to an outermost dimension of the body 100 (i.e., at the first or second lobe 110 or 112). The distance 176 is less than three times the diameter 174. Because the body 100 provides a low profile configuration, the user 50 is afforded a greater viewing angle at the area of interest. In this way, the body 100 is substantially free from obstructing a line of sight of the user 50 during the surgical procedure, such as down the sight line or axis of the instrument 12.
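

The stated low-profile relationship can be expressed as a simple check (illustrative only; the example values are hypothetical, not dimensions from the disclosure):

```python
# Simple check of the stated relationship; the numeric values are hypothetical.
def is_low_profile(outermost_distance_mm: float, opening_diameter_mm: float) -> bool:
    """True when the radial extent 176 is less than three times the opening diameter 174."""
    return outermost_distance_mm < 3.0 * opening_diameter_mm


assert is_low_profile(outermost_distance_mm=20.0, opening_diameter_mm=8.0)  # example only
```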


With reference to FIG. 7, the EM instrument tracking device 14 can be provided with the cable assembly 151 and an electrical connector 180 at an opposite end. The EM instrument tracking device 14, the cable assembly 151 and electrical connector 180 can be contained in a hermetically sealed bag 182 prior to being attached to the blade 80.


In other examples, the EM instrument tracking device 14 can be provided with fewer or additional tracking coils than the three tracking coils 142, 144 and 146 described above. Moreover, the tracking coils 142, 144 and 146 can be arranged at alternate locations on the body 100. In one configuration illustrated in FIG. 8, an EM instrument tracking device 14′ can include a first tracking coil 142′ that is located around the perimeter of a base 102′ at a radially offset position relative to second and third tracking coils 144′ and 146′. In one example, the first tracking coil 142′ can be located within a third lobe 113′ of a cap 104′. The second and third tracking coils 144′ and 146′ can be located within a first and a second lobe 110′ and 112′, respectively. The first, second and third radially extending tracking coils 142′, 144′ and 146′ can be supported by tracking coil supports 133′, 130′ and 132′, respectively. In the example illustrated, the first, second and third tracking coils 142′, 144′ and 146′ define tracking coil planes 157′, 154′ and 156′ that are located at substantially 120° relative to each other and substantially perpendicular to the longitudinal axis 170. Other configurations and/or radial offsets are contemplated.


The description of the present teachings is merely exemplary in nature and, thus, variations that do not depart from the gist of the present teachings are intended to be within the scope of the present teachings. Such variations are not to be regarded as a departure from the spirit and scope of the present teachings.

Claims
  • 1. A method of tracking a handpiece having a working member removably attached to a distal end of the handpiece, comprising: slidably advancing a tracking device over the distal end of the handpiece; removably affixing the tracking device to the distal end of the handpiece; and unobstructably viewing along a sight line of the handpiece as a distal tip of the working member is tracked with the tracking device; wherein the method further comprising, (i) positioning a first tracking coil positioned around a passthrough of the tracking device and attached to the tracking device about the working member such that the first tracking coil is positioned concentric to a longitudinal axis of the working member; or (ii) positioning first, second, and third tracking coils that are each radially spaced from a longitudinal axis of the working member and radially offset at 120° positions relative to each other about a second longitudinal axis of the tracking device on the handpiece.
  • 2. The method of claim 1, further comprising inserting the distal tip of the working member through an opening defined by the tracking device to removably affix the tracking device to the distal end of the handpiece.
  • 3. The method of claim 1, further comprising removably affixing the tracking device to the distal end of the handpiece with a connection member selected from the group consisting of adhesive, cooperating threads, clips, snaps, and fasteners.
  • 4. The method of claim 1, wherein removably affixing the tracking device to the distal end of the handpiece includes releasably coupling the tracking device to the handpiece by way of a snap-fit connection.
  • 5. The method of claim 1, further comprising tracking the position of the handpiece with a navigation system and displaying the handpiece relative to image data.
  • 6. The method of claim 1, further comprising removing the tracking device from the distal end of the handpiece.
  • 7. The method of claim 1, further comprising removing the tracking device from a sealed container prior to releasably coupling the tracking device to the distal end of the handpiece.
  • 8. The method of claim 1, wherein the handpiece is a surgical handpiece, the working member is a blade or a bur, and the tracking device is an electromagnetic tracking device.
  • 9. A method of tracking a handpiece having a working member removably attached to a distal end of the handpiece, comprising: inserting a distal tip of the working member through an opening defined by a body of a tracking device; slidably advancing the body of the tracking device over the distal end of the handpiece; releasably coupling the body of the tracking device to the distal end of the handpiece; and navigating and tracking the distal tip of the working member with the tracking device attached thereto; wherein the method further comprising, (i) positioning a first tracking coil positioned around the opening of the body of the tracking device and attached to the body of the tracking device about the working member such that the first tracking coil is positioned concentric to a longitudinal axis of the working member; or (ii) positioning first, second, and third tracking coils that are each radially spaced from a longitudinal axis of the working member and radially offset at 120° positions relative to each other about a second longitudinal axis of the body of the tracking device on the handpiece.
  • 10. The method of claim 9, further comprising removing the tracking device from a sealed container prior to releasably coupling the tracking device to the distal end of the handpiece.
  • 11. The method of claim 9, wherein navigating the handpiece with the tracking device attached thereto further includes tracking a position of the handpiece relative to a patient and displaying the handpiece on image data of the patient.
  • 12. A method of tracking a handpiece having a working member removably attached to a distal end of the handpiece, comprising: providing a body of a tracking device that defines an opening extending along a longitudinal axis of the body; providing at least one tracking coil attached to the body of the tracking device such that a distal tip of the working member can be inserted and slidably advanced through the opening defined by the body to position the at least one tracking coil relative to the distal end of the handpiece; removably attaching the tracking device to the distal end of the handpiece; and tracking a position of the distal tip of the working member of the handpiece with the tracking device and a navigation system; wherein the method further comprising, (i) positioning the at least one tracking coil positioned around the opening of the body of the tracking device and attached to the body of the tracking device about the working member such that the at least one tracking coil is positioned concentric to a longitudinal axis of the working member; or (ii) positioning first, second, and third tracking coils that are each radially spaced from a longitudinal axis of the working member and radially offset at 120° positions relative to each other about a second longitudinal axis of the body of the tracking device on the handpiece.
  • 13. The method of claim 12, further comprising providing the tracking device in a sealed container prior to removably attaching the tracking device to the handpiece.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 13/959,123 filed on Aug. 5, 2013, which is a continuation of U.S. patent application Ser. No. 12/400,951 filed on Mar. 10, 2009, now U.S. Pat. No. 8,504,139 issued on Aug. 6, 2013. The disclosure of the above application is incorporated herein by reference.

US Referenced Citations (199)
Number Name Date Kind
3568660 Crites et al. Mar 1971 A
4188979 Nakamura et al. Feb 1980 A
4317078 Weed et al. Feb 1982 A
4788987 Nickel Dec 1988 A
4806182 Rydell et al. Feb 1989 A
4813210 Masuda Mar 1989 A
5005592 Cartmell Apr 1991 A
5226423 Tenerz et al. Jul 1993 A
5391199 Ben-Haim Feb 1995 A
5443489 Ben-Haim Aug 1995 A
5538511 Van Antwerp Jul 1996 A
5591141 Nettekoven Jan 1997 A
5592939 Martinelli Jan 1997 A
5645559 Hachtman et al. Jul 1997 A
5693085 Buirge et al. Dec 1997 A
5740808 Panescu et al. Apr 1998 A
5762637 Berg et al. Jun 1998 A
5840024 Taniguchi et al. Nov 1998 A
5913820 Bladen et al. Jun 1999 A
5938602 Lloyd Aug 1999 A
5963120 Zaviska Oct 1999 A
5983126 Wittkampf Nov 1999 A
6010529 Herweck et al. Jan 2000 A
6021343 Foley et al. Feb 2000 A
6102885 Bass Aug 2000 A
6106486 Tenerz et al. Aug 2000 A
6201387 Govari Mar 2001 B1
6235038 Hunter et al. May 2001 B1
6253770 Acker et al. Jul 2001 B1
6254600 Willink et al. Jul 2001 B1
6332891 Himes Dec 2001 B1
6336906 Hammarstrom et al. Jan 2002 B1
6348058 Melkent et al. Feb 2002 B1
6381485 Hunter et al. Apr 2002 B1
6427079 Schneider et al. Jul 2002 B1
6434507 Clayton et al. Aug 2002 B1
6474341 Hunter et al. Nov 2002 B1
6478802 Kienzle, III et al. Nov 2002 B2
6556857 Estes et al. Apr 2003 B1
6610066 Dinger et al. Aug 2003 B2
6615155 Gilboa Sep 2003 B2
6616651 Stevens Sep 2003 B1
6687531 Ferre et al. Feb 2004 B1
6689049 Miyagi et al. Feb 2004 B1
6695764 Silverman et al. Feb 2004 B2
6747539 Martinelli Jun 2004 B1
6796988 Melkent et al. Sep 2004 B2
6833814 Gilboa et al. Dec 2004 B2
6926674 Tenerz et al. Aug 2005 B2
6940941 Gregerson et al. Sep 2005 B2
6977575 Bernier Dec 2005 B2
6980849 Sasso Dec 2005 B2
6993374 Sasso Jan 2006 B2
7001045 Gregerson et al. Feb 2006 B2
7106825 Gregerson et al. Sep 2006 B2
7108421 Gregerson et al. Sep 2006 B2
7118378 Karapetyan Oct 2006 B1
7135978 Gisselberg et al. Nov 2006 B2
7153308 Peterson Dec 2006 B2
7166114 Moctezuma De La Barrera et al. Jan 2007 B2
7188998 Gregerson et al. Mar 2007 B2
7226456 O'Neil et al. Jun 2007 B2
7346417 Luth et al. Mar 2008 B2
7366562 Dukesherer et al. Apr 2008 B2
7410480 Muni et al. Aug 2008 B2
7462175 Chang et al. Dec 2008 B2
7500971 Chang et al. Mar 2009 B2
7537594 Sartor May 2009 B2
7559137 Beer et al. Jul 2009 B2
7604609 Jervis Oct 2009 B2
7625617 Anderson et al. Dec 2009 B1
7629015 Anderson et al. Dec 2009 B2
7637896 Voegele et al. Dec 2009 B2
7647083 Al-Ali et al. Jan 2010 B2
7657301 Mate et al. Feb 2010 B2
7697972 Verard et al. Apr 2010 B2
7751865 Jascob et al. Jul 2010 B2
7763035 Melkent et al. Jul 2010 B2
7774933 Wilson et al. Aug 2010 B2
7797032 Martinelli et al. Sep 2010 B2
7818044 Dukesherer et al. Oct 2010 B2
7840253 Tremblay et al. Nov 2010 B2
7844319 Susil et al. Nov 2010 B2
7881769 Sobe Feb 2011 B2
7971341 Dukesherer et al. Jul 2011 B2
7979032 Lomnitz Jul 2011 B2
8075969 Anderson et al. Dec 2011 B2
8086298 Whitmore, III et al. Dec 2011 B2
8105339 Melkent et al. Jan 2012 B2
8147486 Honour et al. Apr 2012 B2
8239001 Verard et al. Aug 2012 B2
8251949 Warnack Aug 2012 B2
8255027 Al-Ali et al. Aug 2012 B2
8504139 Jacobsen et al. Aug 2013 B2
8644907 Hartmann et al. Feb 2014 B2
8648605 Nakamura et al. Feb 2014 B2
8674694 Hyde et al. Mar 2014 B2
8862204 Sobe et al. Oct 2014 B2
9232985 Jacobsen Jan 2016 B2
9504530 Hartmann Nov 2016 B2
9623208 Wright et al. Apr 2017 B2
9750486 Burg et al. Sep 2017 B2
20010034549 Bartholf et al. Oct 2001 A1
20020153015 Garibaldi et al. Oct 2002 A1
20020165448 Ben-Haim et al. Nov 2002 A1
20020167313 Taimisto Nov 2002 A1
20030009095 Skarda Jan 2003 A1
20030050552 Vu Mar 2003 A1
20030187347 Nevo et al. Oct 2003 A1
20030216639 Gilboa et al. Nov 2003 A1
20030236549 Bonadio et al. Dec 2003 A1
20040097804 Sobe May 2004 A1
20040116803 Jascob et al. Jun 2004 A1
20040199072 Sprouse et al. Oct 2004 A1
20040215071 Frank et al. Oct 2004 A1
20050027339 Schrom et al. Feb 2005 A1
20050027340 Schrom et al. Feb 2005 A1
20050027341 Schrom et al. Feb 2005 A1
20050060885 Johnson et al. Mar 2005 A1
20050085715 Dukesherer et al. Apr 2005 A1
20050085720 Jascob et al. Apr 2005 A1
20050105212 Sato May 2005 A1
20050137576 Packard Jun 2005 A1
20050154294 Uchiyama et al. Jul 2005 A1
20050171508 Gilboa Aug 2005 A1
20050215922 Tsonton et al. Sep 2005 A1
20050222554 Wallace et al. Oct 2005 A1
20060004286 Chang et al. Jan 2006 A1
20060025677 Verard et al. Feb 2006 A1
20060036189 Martinelli et al. Feb 2006 A1
20060084867 Tremblay et al. Apr 2006 A1
20060095066 Chang et al. May 2006 A1
20060129061 Kaneto et al. Jun 2006 A1
20060142656 Malackowski et al. Jun 2006 A1
20060173284 Ackerman et al. Aug 2006 A1
20060206039 Wilson et al. Sep 2006 A1
20060206170 Denker et al. Sep 2006 A1
20060224142 Wilson et al. Oct 2006 A1
20060229624 May et al. Oct 2006 A1
20060235314 Migliuolo et al. Oct 2006 A1
20070032862 Weber et al. Feb 2007 A1
20070088416 Atalar et al. Apr 2007 A1
20070157828 Susel et al. Jul 2007 A1
20070197899 Ritter et al. Aug 2007 A1
20070208252 Makower Sep 2007 A1
20070220746 Anderson et al. Sep 2007 A1
20070225595 Malackowski et al. Sep 2007 A1
20070255132 Shalgi et al. Nov 2007 A1
20080058767 Rotman et al. Mar 2008 A1
20080097195 Urquhart et al. Apr 2008 A1
20080097347 Arvanaghi Apr 2008 A1
20080119727 Barbagli et al. May 2008 A1
20080119919 Atalar et al. May 2008 A1
20080132909 Jascob et al. Jun 2008 A1
20080154296 Taylor et al. Jun 2008 A1
20080171934 Greenan et al. Jul 2008 A1
20080171937 Dukesherer et al. Jul 2008 A1
20080172069 Dukesherer et al. Jul 2008 A1
20080228195 von Jako et al. Sep 2008 A1
20090088728 Dollar et al. Apr 2009 A1
20090118742 Hartmann et al. May 2009 A1
20090171187 Gerhart et al. Jul 2009 A1
20090204023 Goldenberg Aug 2009 A1
20090209947 Gordin et al. Aug 2009 A1
20100063383 Anderson et al. Mar 2010 A1
20100081965 Mugan et al. Apr 2010 A1
20100087789 Leeflang et al. Apr 2010 A1
20100130852 Neidert et al. May 2010 A1
20100134096 Chiba et al. Jun 2010 A1
20100185083 Neidert et al. Jul 2010 A1
20100210939 Hartmann et al. Aug 2010 A1
20100228117 Hartmann Sep 2010 A1
20100234724 Jacobsen et al. Sep 2010 A1
20100253361 Nakamura et al. Oct 2010 A1
20100280363 Skarda et al. Nov 2010 A1
20100331763 Wilson et al. Dec 2010 A1
20110014705 Leach et al. Jan 2011 A1
20110060214 Makower Mar 2011 A1
20110066029 Lyu et al. Mar 2011 A1
20110118592 Sobe et al. May 2011 A1
20110251519 Romoscanu Oct 2011 A1
20110258842 Dukesherer et al. Oct 2011 A1
20110270081 Burg et al. Nov 2011 A1
20120112746 Hyde et al. May 2012 A1
20120172696 Kallback et al. Jul 2012 A1
20120197108 Hartmann et al. Aug 2012 A1
20120197109 Hartmann et al. Aug 2012 A1
20120197110 Hartmann et al. Aug 2012 A1
20120245665 Friedman et al. Sep 2012 A1
20120271135 Burke et al. Oct 2012 A1
20120283570 Tegg Nov 2012 A1
20130066194 Seter et al. Mar 2013 A1
20130137954 Jacobsen et al. May 2013 A1
20130317355 Jacobsen et al. Nov 2013 A1
20140012130 Jacobsen et al. Jan 2014 A1
20140148692 Hartmann et al. May 2014 A1
20140158555 Nakamura et al. Jun 2014 A1
20140276004 Strupeck et al. Sep 2014 A1
20150005625 Sobe et al. Jan 2015 A1
Foreign Referenced Citations (42)
Number Date Country
2011245296 Dec 2012 AU
2797359 Nov 2011 CA
101621966 Jan 2010 CN
103068332 Apr 2013 CN
102009030731 Dec 2010 DE
0425319 May 1991 EP
1302172 Apr 2003 EP
1510182 Mar 2005 EP
1552795 Jul 2005 EP
1658818 May 2006 EP
1743591 Jan 2007 EP
1806756 Jul 2007 EP
2114263 Nov 2009 EP
2123220 Nov 2009 EP
2563260 Mar 2013 EP
2000151041 May 2000 JP
03-207344 Sep 2001 JP
2006167119 Jun 2006 JP
2007-527296 Sep 2007 JP
2008-155033 Jul 2008 JP
2008194475 Aug 2008 JP
2010082446 Apr 2010 JP
WO-9632060 Oct 1996 WO
WO-9729682 Aug 1997 WO
WO-9729684 Aug 1997 WO
WO-9940856 Aug 1999 WO
WO-0038571 Jul 2000 WO
WO-0134050 May 2001 WO
WO-2006096685 Sep 2006 WO
WO-2006116597 Nov 2006 WO
2008054423 May 2008 WO
WO-2008105874 Sep 2008 WO
WO-2009152486 Dec 2009 WO
WO-2010049834 May 2010 WO
WO-2010124285 Oct 2010 WO
WO-2010144419 Dec 2010 WO
WO-2011137301 Nov 2011 WO
WO-2012103304 Aug 2012 WO
WO-2012103304 Aug 2012 WO
WO-2012103407 Aug 2012 WO
WO-2012103410 Aug 2012 WO
WO-2013062869 May 2013 WO
Non-Patent Literature Citations (80)
Entry
European Office Action dated Mar. 19, 2018 in corresponding/related European Application No. 17208055.8.
European Office Action dated Nov. 9, 2017 in corresponding/related European Application No. 11719933.1.
European Office Action dated May 3, 2018 in corresponding/related European Application No. 14725316.5.
Canadian Office Action dated Jul. 6, 2018 in corresponding/related Canadian Application No. 2,942,656.
Japanese Office Action dated Aug. 29, 2016 for JP Application No. 2015-555345 corresponding to PCT/US2014/012967 which claims benefit of U.S. Appl. No. 13/751,032, filed Jan. 25, 2013.
Japanese Office Action dated Sep. 13, 2016 for JP Application No. 2016-510697 corresponding to PCT/US2014/034022 which claims benefit of U.S. Appl. No. 13/871,616, filed Apr. 26, 2013.
Chinese Office Action dated Feb. 4, 2017 for Chinese Application No. 2014800059516.
Chinese Office Action dated Mar. 9, 2017 for Chinese Application No. 201480004264.2.
European Office Action dated Mar. 1, 2017 for European Application No. 12709722.8.
Australian Office Action dated Sep. 27, 2017 in corresponding/related Australian Application No. 2014209251.
Japanese Office Action dated Dec. 1, 2017 in corresponding/related Japanese Application No. 2016-510697.
Australian Office Action dated Apr. 10, 2018 in corresponding/related Australian Application No. 2014257385.
Chinese Office Action dated Sep. 25, 2017 in corresponding/related Chinese Application No. 201480004264.2.
European Office Action dated Apr. 13, 2017 in European Application No. 14721108.0 corresponding to PCT/US2014/028100 which claims priority to U.S. Appl. No. 14/209,696, filed Mar. 13, 2014 and U.S. Appl. No. 61/790,479, filed Mar. 15, 2013.
Japanese Office Action dated May 19, 2017 for Japanese Application No. 2016-510697 corresponding to PCT/US2014/034022 which claims benefit of U.S. Appl. No. 13/871,616 filed Apr. 26, 2013.
European Office Action dated Nov. 23, 2017 in corresponding/related European Application No. 14721108.0.
Chinese Office Action dated Mar. 23, 2018 in corresponding/related Chinese Application No. 201480023678.X.
Australian Office Action dated Aug. 22, 2017 in corresponding Australian Application No. 2014209323.
Chinese Office Action dated Aug. 29, 2017 in corresponding Chinese Application No. 201480005951.6.
Chinese Office Action dated Jun. 29, 2017 in corresponding Chinese Application No. 201480023678.X.
European Office Action dated Apr. 13, 2017 in corresponding European Application No. 14706985.0.
European Office Action dated Jun. 22, 2017 in corresponding European Application No. 14703512.5.
Japanese Office Action dated Jul. 3, 2017 in corresponding Japanese Application No. 2015-555278.
Chinese Office Action dated Oct. 30, 2017 in corresponding/related Chinese Application No. 201610206046.8.
Communication pursuant to Article 94(3) EPC dated Nov. 24, 2016 for European Application No. 107084790 corresponding to PCT/US2010/026655 claiming benefit of U.S. Appl. No. 12/400,951, filed Mar. 10, 2009.
Communication pursuant to Article 94(3) EPC dated Feb. 1, 2017 for European Application No. 117199331 corresponding to PCT/US2011/034475 filed Apr. 29, 2011.
Canadian Office Action dated Oct. 24, 2017 in corresponding/related Canadian Application No. 2,942,656.
“Flexible electronics,” Dec. 19, 2012 (Dec. 19, 2012), XP055112518, en.wikipedia.org. Retrieved from the Internet: <URL:http://en.wikipedia.org/w/index.php?title=Flexible_electronics&oldid=528841651> [retrieved on Apr. 7, 2014]. (6 sheets).
“Flexible Printed Circuit Manufacturer—Capabilities,” Aug. 16, 2012 (Aug. 16, 2012), XP055112534, fpcexpress.com. Retrieved from the Internet: URL: <http://web.archive.org/web/20120816030431/http://fpcexpress.com/capabilities.htm1>.[retrieved on Apr. 7, 2014][retrieved on May 8, 2014]. (3 sheets).
“InstaTrak 3500 Plus. Applications: ENT. Cranial.” http://www.gehealthcare/usen/xr/surgery/products/nav.html (printed Dec. 14, 2009).
“InstaTrak® Image Guided Sinus Surgery, Introduction to the InstaTrak System.” Sinus-Clear.com http:/www.sinus-clear.com/instatrak.htm (printed Dec. 14, 2009).
“InstaTrak™ 3500 plus—Cranial. Multi-application electromagnetic surgical navigation system for ENT, Cranial, and Spine procedures.” GE Healthcare http://www.gehealthcare.com/euen/surgery/products/instatrak-3500-plus-cranial/index.html (printed Dec. 14, 2009).
“InstaTrak™ 3500 plus—ENT. Multi-application electromagnetic surgical navigation system for ENT and Cranial.” GE Healthcare http://www.gehealthcare.com/euen/surgery/products/instatrak-3500-plus-ent/index.html (printed Dec. 14, 2009).
“Mayfield® Skull Clamps and Headrest Systems,” Mayfield® Surgical Devices Product Index, pp. 1-6, Dec. 2004 Integra LifeSciences Corporation.
“Medtronic O-Arm Multi-Dimensional Surgical Imaging System”; Brochure, 24pp, 2009.
“Minco Bulletin FC-3,” Jul. 31, 2002 (Jul. 31, 2002). XP055115671, Retrieved from the Internet: <URL:http://www.temflexcontrols.com/pdf/fc3.pdf> [retrieved on Apr. 29, 2014]. (1 sheet).
“Sectional design standard for flexible printed boards,” Internet Citation, Nov. 30, 1998 (Nov. 30, 1998), pp. 1-35, XP002691487, Retrieved from the Internet: <URL:http://222.184.16.210/smt/tzxt/bz/IPC-2223.pdf>. [retrieved on Feb. 1, 2013].
“StealthStation_S7_System® Information Center in the OR,” (2009) Medtronic, Inc.
“StealthStation® TRIA™ plus Treatment Guidance System,” brochure, Medtronic Surgical Navigation Technologies (2004) 2 pages.
“The doctor can see you now” brochure. GE Medical Systems (2003) General Electric Company.
“TREON, StealthStation,” brochure, Medtronic Surgical Navigation Technologies (2001) 8 pages.
Acclarent™ “Instructions for Use. Balloon Sinuplasty™ System. Relieva™ Devices, ReliENT™ Navigation System, and OptiLINK™ Extension.” (Aug. 21, 2009) pp. 1-13.
Acclarent™ “Instructions for Use. Relieva Flex™ Sinus Guide Catheter, Relieva® Sinus Guide Catheter.” (Sep. 19, 2009) pp. 1-6.
Chinese Office Action dated Apr. 3, 2015 for Chinese Application No. 201180031075.0 claiming benefit of PCT/US2011/034475 filed Apr. 29, 2011, claiming benefit from U.S. Appl. No. 61/330,024, filed Apr. 30, 2010 and U.S. Appl. No. 13/097,243, filed Apr. 29, 2011.
Chinese Office Action dated Sep. 3, 2014 for Chinese Application No. 201180031075.0 claiming benefit of PCT/US2011/034475 filed Apr. 29, 2011, claiming benefit from U.S. Appl. No. 61/330,024, filed Apr. 30, 2010 and U.S. Appl. No. 13/097,243, filed Apr. 29, 2011.
Communication pursuant to Article 94(3) EPC for European Application No. 12703208.4-1654 dated Apr. 24, 2015.
European Office Action dated Oct. 20, 2015 for European Application No. 127841781 claiming benefit of PCT/US2012/061086 filed Oct. 19, 2012.
Examiner's Report dated Dec. 18, 2013 for Canadian Application No. 2,797,359 claiming benefit of U.S. Appl. No. 13/097,243, filed Apr. 29, 2011.
http://oxforddictionaries.com/definition/english/barrel (accessed Dec. 3, 2012).
International Preliminary Report on Patentability and Written Opinion dated May 8, 2014 for PCT/US2012/061086 claiming benefit of U.S. Appl. No. 13/281,001, filed Oct. 25, 2011.
International Preliminary Report on Patentability and Written Opinion dated Aug. 6, 2015 for PCT/US2014/012967 claiming benefit of U.S. Appl. No. 13/751,032, filed Jan. 25, 2013.
International Preliminary Report on Patentability and Written Opinion dated Aug. 6, 2015 for PCT/US2014/012786 claiming benefit of U.S. Appl. No. 13/748,150, filed Jan. 23, 2012.
International Preliminary Report on Patentability and Written Opinion dated Aug. 6, 2015 for PCT/US2014/012786 claiming benefit of U.S. Appl. No. 13/748,150, filed Jan. 23, 2013.
International Preliminary Report on Patentability and Written Opinion dated Sep. 22, 2011 for PCT/US2010/026655 claiming benefit of U.S. Appl. No. 12/400,951, filed Mar. 10, 2009.
International Preliminary Report on Patentability and Written Opinion dated Sep. 24, 2015 for PCT/US2014/028100 claiming benefit to U.S. Appl. No. 14/209,696, filed Mar. 13, 2014.
International Preliminary Report on Patentability dated Aug. 8, 2013 for PCT/US2012/022676 claiming benefit of U.S. Appl. No. 13/016,740 filed Jan. 28, 2011.
International Preliminary Report on Patentability dated Aug. 8, 2013 for PCT/US2012/022840 claiming benefit of U.S. Appl. No. 13/016,762, filed Jan. 28, 2011.
International Preliminary Report on Patentability dated Aug. 8, 2013 for PCT/US2012/022846 claiming benefit of U.S. Appl. No. 13/016,765, filed Jan. 28, 2011.
International Preliminary Report on Patentability dated Nov. 15, 2012 for PCT/US2011/34475 claiming benefit of U.S. Appl. No. 13/097,243, filed Apr. 29, 2011.
International Preliminary Report on Patentability dated Oct. 27, 2015 for PCT/US2014/034022 claiming benefit of U.S. Appl. No. 13/871,616, filed Apr. 26, 2013.
International Search Report and Written Opinion dated Oct. 27, 2014 for PCT/US2014/028100 claiming benefit of U.S. Appl. No. 14/209,696, filed Mar. 13, 2014.
International Search Report and Written Opinion dated Apr. 23, 2014 for PCT/US2014/012786 claiming benefit of U.S. Appl. No. 13/748,150, filed Jan. 23, 2013.
International Search Report and Written Opinion dated Feb. 6, 2013 for PCT/US2012/061086 claiming benefit of U.S. Appl. No. 13/281,001, filed Oct. 25, 2011.
International Search Report and Written Opinion dated Jul. 6, 2012 for PCT/US2012/022840 claiming benefit to U.S. Appl. No. 13/016,762, filed Jan. 28, 2011.
International Search Report and Written Opinion dated May 12, 2014 for PCT/US2014/012967 claiming benefit of U.S. Appl. No. 13/751,032, filed Jan. 25, 2013.
International Search Report and Written Opinion dated May 9, 2012 for PCT/US2012/022676 claiming benefit of U.S. Appl. No. 13/016,740, filed Jan. 28, 2011.
International Search Report and Written Opinion dated May 9, 2012 for PCT/US2012/022846 claiming benefit of U.S. Appl. No. 13/016,765, filed Jan. 28, 2011.
International Search Report and Written Opinion dated Oct. 31, 2011, claiming benefit of U.S. Appl. No. 13/097,243, filed Apr. 29, 2011.
International Search Report dated Jul. 15, 2010 for PCT/US2010/026655 claiming benefit of U.S. Appl. No. 12/400,951, filed Mar. 10, 2009.
International Search Report dated Jul. 31, 2014 for PCT/US2014/034022 claiming benefit of U.S. Appl. No. 13/871,616, filed Apr. 26, 2013.
Invitation to Pay Additional Fees and Where Applicable, Protest Fee dated Aug. 14, 2014 for PCT/US2014/028100 claiming benefit of U.S. Appl. No. 14/209,696, filed Mar. 13, 2014.
Invitation to Pay Additional Fees dated Dec. 17, 2012 for PCT/US2012/061086 claiming benefit of U.S. Appl. No. 13/281,001, filed Oct. 25, 2011.
Invitation to Pay Additional Fees dated Jul. 25, 2011, claiming benefit of U.S. Appl. No. 13/097,243, filed Apr. 29, 2011.
Invitation to Pay Additional Fees dated May 8, 2012 for PCT/US2012/022840 claiming benefit of U.S. Appl. No. 13/016,762, filed Jan. 28, 2011.
Japanese Office Action dated Jan. 7, 2014 for Japan Application No. 2013-508273 claiming benefit of U.S. Appl. No. 13/097,243, filed Apr. 29, 2011.
Medtronic Navigation, “StealthStation® AXIEM™ Electromagnetic Navigation . . . ”, 2005, www.stealthstation.com/physician/spine/library/axiem_ent.jsp, printed Aug. 19, 2006 (2 pages).
Australian Office Action dated Sep. 21, 2018 in corresponding/related Australian Application No. 2014209251.
Office Action dated Mar. 15, 2019 in corresponding/related European Application No. 17208055.8.
Extended European Search Report dated Sep. 9, 2019 in corresponding/related European Application No. 19179149.0.
Extended European Search Report, dated Sep. 20, 2019, in corresponding/related European Application No. 19171550.7.
Related Publications (1)
Number Date Country
20160120609 A1 May 2016 US
Continuations (2)
Number Date Country
Parent 13959123 Aug 2013 US
Child 14992418 US
Parent 12400951 Mar 2009 US
Child 13959123 US