Apparatus and method for four dimensional soft tissue navigation in endoscopic applications

Information

  • Patent Grant
  • Patent Number
    11,109,740
  • Date Filed
    Wednesday, March 20, 2019
  • Date Issued
    Tuesday, September 7, 2021
Abstract
A surgical instrument navigation system is provided that visually simulates a virtual volumetric scene of a body cavity of a patient from a point of view of a surgical instrument residing in the cavity of the patient. The surgical instrument navigation system includes: a surgical instrument; an imaging device which is operable to capture scan data representative of an internal region of interest within a given patient; a tracking subsystem that employs electro-magnetic sensing to capture in real-time position data indicative of the position of the surgical instrument; a data processor which is operable to render a volumetric, perspective image of the internal region of interest from a point of view of the surgical instrument; and a display which is operable to display the volumetric perspective image of the patient.
Description
BACKGROUND

The invention relates generally to a medical device and particularly to an apparatus and methods associated with a range of image guided medical procedures.


Image guided surgery (IGS), also known as image guided intervention (IGI), enhances a physician's ability to locate instruments within anatomy during a medical procedure. IGS can include 2-dimensional (2-D), 3-dimensional (3-D), and 4-dimensional (4-D) applications. The fourth dimension of IGS can include multiple parameters either individually or together such as time, motion, electrical signals, pressure, airflow, blood flow, respiration, heartbeat, and other patient measured parameters.


Existing imaging modalities can capture the movement of dynamic anatomy. Such modalities include electrocardiogram (ECG)-gated or respiratory-gated magnetic resonance imaging (MRI) devices, ECG-gated or respiratory-gated computer tomography (CT) devices, standard computed tomography (CT), 3D Fluoroscopic images (Angio-suites), and cinematography (CINE) fluoroscopy and ultrasound. Multiple image datasets can be acquired at different times, cycles of patient signals, or physical states of the patient. The dynamic imaging modalities can capture the movement of anatomy over a periodic cycle of that movement by sampling the anatomy at several instants during its characteristic movement and then creating a set of image frames or volumes.


A need exists for an apparatus that can be used with such imaging devices to capture pre-procedural or intra-procedural images of a targeted anatomical body and use those images intra-procedurally to help guide a physician to the correct location of the anatomical body during a medical procedure.


SUMMARY OF THE INVENTION

A method includes receiving during a first time interval image data associated with an image of a dynamic body. The image data includes an indication of a position of a first marker on a patient tracking device (PTD) coupled to the dynamic body and a position of a second marker on the PTD. Some registration methods, such as 2D to 3D registration techniques, allow the image data containing the target or patient anatomy of interest to omit the PTD. A registration step is performed to calculate the transformation from image space to patient space using an additional dataset to register (e.g., a 2D fluoroscopic set of images is used to register a 3D fluoroscopic dataset). This technique is not limited to fluoroscopic procedures, as it can be implemented in any procedure that acquires 2D images, such as ultrasound, OCT (optical coherence tomography), EBUS (endobronchial ultrasound), or IVUS (intravascular ultrasound). This technique uses the markers that are within multiple 2D images to register the 3D volume that is reconstructed from these 2D images. Because the reconstructed 3D volume is smaller than the field of view of the 2D images, this technique allows the PTD markers to be visible in a subset of the 2D images, but not within the 3D volume. In certain embodiments, the first marker is coupled to the PTD at a first location and the second marker is coupled to the PTD at a second location. A distance between the position of the first marker and the position of the second marker is determined. During a second time interval after the first time interval, data associated with a position of a first localization element coupled to the PTD at the first location and data associated with a position of a second localization element coupled to the PTD at the second location are received. A distance between the first localization element and the second localization element is determined based on the data associated with the position of the first localization element and the position of the second localization element. A difference is calculated between the distance between the first marker and the second marker during the first time interval and the distance between the first localization element and the second localization element during the second time interval. In addition, the PTD can be tracked continuously during the procedure, and a sequence of motion of the PTD that represents the motion of an organ or the patient's respiratory cycle can be collected. The sequence of motion can then be analyzed to find similar points within the dataset, which can then be grouped.


Other objects and features will be in part apparent and in part pointed out hereinafter.





BRIEF DESCRIPTION OF THE DRAWINGS

The details of the present invention, both as to its construction and operation, can best be understood with reference to the accompanying drawings, in which like numerals refer to like parts, and in which:



FIG. 1 is a schematic illustration of various devices used with a method according to an embodiment of the invention.



FIG. 2 is a schematic illustration of various devices used with a method according to an embodiment of the invention.



FIG. 3 is a schematic illustrating vector distances on an apparatus according to an embodiment of the invention.



FIG. 4A is a schematic illustrating vector distances from a localization device according to an embodiment of the invention.



FIG. 4B is a schematic illustrating vector distances from image data according to an embodiment of the invention.



FIG. 5 is a front perspective view of an apparatus according to an embodiment of the invention.



FIG. 6 is a graphical representation illustrating the function of an apparatus according to an embodiment of the invention.



FIG. 7 is a flowchart illustrating a method according to an embodiment of the invention.



FIG. 8 shows the layout of a system that may be used to carry out image guided interventions using certain of the present methods that involve gated datasets.



FIG. 9 illustrates one example of samples of a periodic human characteristic signal (specifically, an ECG waveform) associated, or gated, with images of dynamic anatomy.



FIG. 10 is a diagram of an exemplary surgical instrument navigation system in accordance with the present invention;



FIG. 11 is a flowchart that depicts a technique for simulating a virtual volumetric scene of a body cavity from a point of view of a surgical instrument positioned within the patient in accordance with the present invention;



FIG. 12 is an exemplary display from the surgical instrument navigation system of the present invention;



FIG. 12A is an illustration of the position of the surgical instrument within the trachea as depicted in FIG. 12.



FIGS. 12B and 12C each illustrate the video image provided by the surgical instrument when undergoing the “wiggle maneuver” along the plane shown in FIG. 12A at the respective positions indicated.



FIG. 12D is an illustration of the video image provided by the surgical instrument and corresponding to the 3D navigation model image of FIG. 12.



FIG. 13 is a flowchart that depicts a technique for synchronizing the display of an indicia or graphical representation of the surgical instrument with cardiac or respiratory cycle of the patient in accordance with the present invention; and



FIG. 14 is a flowchart that depicts a technique for generating four-dimensional image data that is synchronized with the patient in accordance with the present invention.



FIG. 15 is a graph depicting an axis or point about which the instrument (e.g., a bronchoscope) deflects in a single planar direction. The graph shows the instrument (e.g., a bronchoscope) being maneuvered in six different orientations in a 3D localizer volume, with all orientations converging about a common axis or point of deflection.



FIG. 16 is a graph depicting the eigenvalues (e0,e1,e2) for a moving 3.0 sec PCA (principal component analysis) window over a data file including 1800 samples. The square wave represents an on/off “wiggle detector” state based on the algorithm described herein, and this square wave demonstrates that the algorithm exhibits no false negatives for the validation test data and that the seven exemplary “wiggle” periods are clearly matched to the “on” state of the wiggle detector. The implementation of the algorithm uses low pass filtering and an appropriate comparator function to eliminate any false positive traces or spots (“blips”) indicated in FIG. 16.



FIG. 17 is an image of an exemplary synthetic radiograph in accordance with the present invention, depicting the historical instrument position trace.



FIG. 18 depicts an exemplary curvature warning system in accordance with the invention described herein.



FIG. 19 depicts an exemplary real-time respiration compensation algorithm.





DETAILED DESCRIPTION

The accompanying Figures and this description depict and describe embodiments of a navigation system (and related methods and devices) in accordance with the present invention, and features and components thereof. It should also be noted that any references herein to front and back, right and left, top and bottom and upper and lower are intended for convenience of description, not to limit the present invention or its components to any one positional or spatial orientation.


It is noted that the terms “comprise” (and any form of comprise, such as “comprises” and “comprising”), “have” (and any form of have, such as “has” and “having”), “contain” (and any form of contain, such as “contains” and “containing”), and “include” (and any form of include, such as “includes” and “including”) are open-ended linking verbs. Thus, a method, an apparatus, or a system that “comprises,” “has,” “contains,” or “includes” one or more items possesses at least those one or more items, but is not limited to possessing only those one or more items. For example, a method that comprises receiving a position of an instrument reference marker coupled to an instrument; transforming the position into image space using a position of a non-tissue internal reference marker implanted in a patient; and superimposing a representation of the instrument on an image in which the non-tissue internal reference marker appears possesses at least the receiving, transforming, and superimposing steps, but is not limited to possessing only those steps. Accordingly, the method also covers instances where the transforming includes transforming the position into image space using a transformation that is based, in part, on the position of the non-tissue internal reference marker implanted in the patient, and calculating the transformation using image space coordinates of the internal reference marker in the image. The term “use” should be interpreted the same way. Thus, a calculation that uses certain items uses at least those items, but also covers the use of additional items.


Individual elements or steps of the present methods, apparatuses, and systems are to be treated in the same manner. Thus, a step that calls for creating a dataset that includes images, one of the images (a) depicting a non-tissue internal reference marker, (b) being linked to non-tissue internal reference marker positional information, and (c) being at least 2-dimensional covers the creation of at least such a dataset, but also covers the creation of a dataset that includes images, where each image (a) depicts the non-tissue internal reference marker, and (b) is linked to non-tissue internal reference marker positional information.


The terms “a” and “an” are defined as one or more than one. The term “another” is defined as at least a second or more. The term “coupled” encompasses both direct and indirect connections, and is not limited to mechanical connections.


Those of skill in the art will appreciate that in the detailed description below, certain well known components and assembly techniques have been omitted so that the present methods, apparatuses, and systems are not obscured in unnecessary detail.


An apparatus according to an embodiment of the invention includes a PTD and two or more markers coupled to the PTD. The apparatus can also include two or more localization elements coupled to the PTD proximate the markers. The apparatus is configured to be coupled to a dynamic body, such as selected dynamic anatomy of a patient. Dynamic anatomy can be, for example, any anatomy that moves during its normal function (e.g., the heart, lungs, kidneys, liver and blood vessels). A processor, such as a computer, is configured to receive image data associated with the dynamic body taken during a pre-surgical or pre-procedural first time interval. The image data can include an indication of a position of each of the markers for multiple instants in time during the first time interval. The processor can also receive position data associated with the localization elements during a second time interval in which a surgical procedure or other medical procedure is being performed. The processor can use the position data received from the localization elements to determine a distance between the elements for a given instant in time during the second time interval. The processor can also use the image data to determine the distance between the markers for a given instant in time during the first time interval. The processor can then find an image in which the distance between the markers at a given instant in time during the first time interval is the same as the distance between the elements associated with those markers at a given instant in time during the medical procedure, or second time interval. Additionally, the processor can determine a sequence of motion of the markers and match this sequence of motion to the recorded motion of the markers over the complete procedure or a significant period of time. Distance alone between the markers may not be sufficient to match the patient space to image space in many instances; it is important for the system to know the direction in which the markers are moving, as well as the range and speed of that motion, to find the appropriate match for a complex signal or sequence of motion by the patient.


A physician or other healthcare professional can use the images selected by the processor during a medical procedure performed during the second time interval. For example, when a medical procedure is performed on a targeted anatomy of a patient, such as a heart or lung, the physician may not be able to utilize an imaging device during the medical procedure to guide him to the targeted area within the patient. A PTD according to an embodiment of the invention can be positioned or coupled to the patient proximate the targeted anatomy prior to the medical procedure, and pre-procedural images can be taken of the targeted area during a first time interval. Markers or fiducials coupled to the PTD can be viewed with the image data, which can include an indication of the position of the markers during a given path of motion of the targeted anatomy (e.g., the heart) during the first time interval. Such motion can be due, for example, to inspiration (i.e., inhaling) and expiration (i.e., exhaling) of the patient, or due to the heart beating. During a medical procedure, performed during a second time interval, such as a procedure on a heart or lung, the processor receives data from the localization elements associated with a position of the elements at a given instant in time during the medical procedure (or second time interval). The distance between selected pairs of markers can be determined from the image data and the distance, range, acceleration, and speed between corresponding selected pairs of localization elements can be determined based on the element data for given instants in time. From multiple image datasets the range and speed of the markers motion can be calculated.


Because the localization elements are coupled to the PTD proximate the location of the markers, the distance between a selected pair of elements can be used to determine an intra-procedural distance between the pair of corresponding markers to which the localization elements are coupled. An image from the pre-procedural image data taken during the first time interval can then be selected where the distance between the pair of selected markers in that image corresponds with or closely approximates the same distance determined using the localization elements at a given instant in time during the second time interval. This process can be done continuously during the medical procedure, producing simulated real-time, intra-procedural images illustrating the orientation and shape of the targeted anatomy as a catheter, sheath, needle, forceps, guidewire, fiducial delivery devices, therapy device (ablation modeling, drug diffusion modeling, etc.), or similar structure(s) is/are navigated to the targeted anatomy. Thus, during the medical procedure, the physician can view selected image(s) of the targeted anatomy that correspond to and simulate real-time movement of the anatomy. In addition, during a medical procedure being performed during the second time interval, such as navigating a catheter or other instrument or component thereof to a targeted anatomy, the location(s) of a sensor (e.g., an electromagnetic coil sensor) coupled to the catheter during the second time interval can be superimposed on an image of a catheter. The superimposed image(s) of the catheter can then be superimposed on the selected image(s) from the first time interval, providing simulated real-time images of the catheter location relative to the targeted anatomy. This process and other related methods are described in U.S. Pat. No. 7,398,116, entitled Methods, Apparatuses, and Systems Useful in Conducting Image Guided Interventions, filed Aug. 26, 2003.


In one embodiment, a real-time pathway registration is applied to a pre-acquired dataset that does not contain the PTD. It will be understood that the pre-acquired dataset can be at only one cycle of a patient's respiratory, heartbeat, or other path of motion. In order to optimize the registration of a pre-acquired dataset that does not contain the PTD, a PTD can be subsequently applied to the patient, and the PTD signal can be used to collect registration information throughout the full range or path of motion, but only the information that is captured at a similar PTD orientation, shape, or point along the PTD cycle of motion is used. This method enhances the registration accuracy by ensuring that the registration points being used correspond to the same point in the cycle of motion as the initial dataset acquisition. In preferred embodiments, the method uses multiple subsets of the acquired registration data that are collected based on the PTD signal. These multiple subsets are then applied against the pre-acquired dataset to find the optimal registration fit.
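By way of illustration only, the following sketch shows one way such phase-gated registration could be organized. It assumes paired patient-space and image-space points (for example, sampled along the segmented pathway), a normalized PTD respiratory phase, and a simple least-squares rigid fit; none of these specifics are mandated by the embodiments above.

```python
import numpy as np

def kabsch(patient_pts, image_pts):
    """Least-squares rigid transform (rotation R, translation t) mapping
    patient-space points onto paired image-space points."""
    pc, ic = patient_pts.mean(axis=0), image_pts.mean(axis=0)
    H = (patient_pts - pc).T @ (image_pts - ic)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, ic - R @ pc

def best_phase_gated_registration(samples, image_pts, phases, n_bins=8):
    """samples: (N, 3) tracked patient-space positions; phases: (N,) PTD
    respiratory phase in [0, 1); image_pts: (N, 3) paired image-space points.
    Each phase-gated subset is registered separately and the lowest-residual
    fit is kept, mirroring the use of points from a similar PTD cycle point."""
    best = (None, None, np.inf)
    for b in range(n_bins):
        mask = (phases * n_bins).astype(int) == b
        if mask.sum() < 4:                           # need enough points per subset
            continue
        R, t = kabsch(samples[mask], image_pts[mask])
        rms = np.sqrt(np.mean(np.sum((samples[mask] @ R.T + t - image_pts[mask]) ** 2, axis=1)))
        if rms < best[2]:
            best = (R, t, rms)
    return best                                      # (R, t, residual) of best subset
```

The per-bin residual simply serves as the "optimal registration fit" criterion; a practical system might instead weight bins by point count or combine several well-fitting subsets.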


In another embodiment, the device can be integrated with one or more fiber optic localization (FDL) devices and/or techniques. In this way, the sensor (such as an EM sensor) provides the 3D spatial orientation of the device, while the FDL provides shape sensing of the airway, vessel, pathway, organ, environment and surroundings. Conventional FDL techniques can be employed. In various embodiments, for example, the FDL device can be used to create localization information for the complete pathway or to refine the localization accuracy in a particular segment of the pathway. By either using 3D localization information, shape, or both detected by the FDL device, the system can use a weighted algorithm between multiple localization devices to determine the location and orientation of the instrument in the patient. The FDL device can also be used as or in conjunction with the PTD to track the patient's motion such as respiration or heartbeat.


Other aspects involve using a guidewire or other navigated instrument with one to one rotation to continuously align a virtual display view to be consistent with the actual bronchoscopic video view. A similar technique can be used with OCT, IVUS, or EBUS devices to orient the virtual view to the image captured by the OCT, IVUS, or EBUS devices.


Other aspects involve using video input of the bronchoscope to adjust the virtual “fly-through” view to be consistent with the user's normal perspective. For example, conventional video processing and matching techniques can be used to align the real-time video and the virtual image.


Other aspects involve using bronchoscopic video to provide angular information at a current location, providing targeting or directional cues to the user. Angular information can be derived from the location of patient anatomy in the image and the relative size of each anatomical feature within the image. Using information extracted from the video captured by the bronchoscope, the system can determine the direction in which the display should be oriented. This can be done using, for example, translation, rotation, or a combination of both. By comparing the real-time image captured to the virtual image constructed from the 3D dataset (e.g., CT), the system can use this information to align the virtual image and/or enhance the system accuracy.
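As a hedged illustration of the video-to-virtual comparison described above, the sketch below estimates the in-plane rotation between a real-time frame and the corresponding virtual rendering using off-the-shelf feature matching; the specific detector, matcher, and robust-fit routine are implementation choices for the example, not part of the embodiments themselves, and texture-poor endoluminal images may require different matching strategies.

```python
import cv2
import numpy as np

def estimate_video_to_virtual_rotation(video_frame, virtual_frame):
    """Estimate the in-plane rotation (degrees) aligning a grayscale real-time
    bronchoscope frame with a grayscale virtual rendering by matching ORB
    features and fitting a partial affine (rotation + translation + scale)."""
    orb = cv2.ORB_create(500)
    k1, d1 = orb.detectAndCompute(video_frame, None)
    k2, d2 = orb.detectAndCompute(virtual_frame, None)
    if d1 is None or d2 is None:
        return None
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    if len(matches) < 8:
        return None
    src = np.float32([k1[m.queryIdx].pt for m in matches])
    dst = np.float32([k2[m.trainIdx].pt for m in matches])
    M, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    if M is None:
        return None
    return float(np.degrees(np.arctan2(M[1, 0], M[0, 0])))  # rotation angle
```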


In another aspect, a high-speed three-dimensional imaging device, such as an optical coherence tomography (OCT) device, can be tracked. In accordance with conventional methods, such a device can only view 1-2 mm below the surface. With an EM sensor attached in accordance with the systems and methods described herein, multiple 3D volumes of data can be collected and a larger 3D volume of collected data can be constructed. Knowing the 3D location and orientation of the multiple 3D volumes will allow the user to view a more robust image of, for example, pre-cancerous changes in the esophagus or colon. This data can also be correlated to pre-acquired or intra-procedurally acquired CT, fluoroscopic, ultrasound, or 3D fluoroscopic images to provide additional information.


Among several potential enhancements that could be provided by an endolumenal system as described herein is that a user could overlay the planned pathway information on to the actual/real-time video image of the scope or imaging device (such as ultrasound based device). Additionally, the system and apparatus could provide a visual cue on the real-time video image showing the correct direction or pathway to take.


Additionally, or alternatively, one could use a 5DOF sensor 1024 and a limited or known range of motion of a localization device 134 to determine its orientation in the field. This is particularly relevant, for example, in determining which way is up or down or the overall rotation of an image in 3D space. In bronchoscopy, for instance, this can be used to orient the bronchoscopic view to the user's normal expected visual orientation. Because a bronchoscope 1012 is typically only able to move in one plane up and down, the system can use the 3D location of a tip sensor 1024 moving up and down to determine the sixth degree of freedom. In this implementation, such as is shown in FIG. 10, a user could steer the bronchoscope 1012 to a bifurcation or close to a bifurcation such as at location 29 shown in FIG. 12 and then perform a motion with the scope (e.g., up and down) using the thumb control to wiggle or flutter the tip 1015 and sensor 1024. With this motion (i.e., described herein generally as the “wiggle maneuver”), the system can determine the orientation and display the correct image such as depicted in bronchoscope video image of FIG. 12D corresponding to the 3D navigation model image provided by the imaging device shown at view 38 of FIG. 12. Typically, 5DOF sensing of instrument tip POSE (position and orientation) determines 5 of the 6 POSE parameters (x, y, z, pitch, and yaw). Such sensing may be unable in certain applications, however, to determine the instantaneous roll of the device. This roll determination can be critical in matching the video coming from the device to the images. The techniques described herein advantageously allow for users to relate orientation of a navigation model or virtual endoscopic “fly-thru” display 38 to actual video orientation presented by the endoscopic instrument as illustrated in FIG. 12D.


In general, the methods described herein provide for the user to select a location, most typically at a branching point in the bronchial tree, such as is shown in FIG. 12A, where the catheter tip 1015 is shown in a position corresponding to location 29 depicted in FIG. 12, and to perform a "wiggle" maneuver with the tip of the device. In preferred embodiments, the wiggle maneuver generally consists of three steps:


(i). At the desired branch or other point, the physical or translational location of the device is substantially secured or held in place. For example, with a bronchoscope, the user should ensure that the scope is held securely so it cannot translate relative to the airway.


(ii). Perform a tip wiggle in plane, illustrated by arc 31 in FIG. 12A, by a rhythmic actuation of the scope steering mechanism. The magnitude of the actuation should be sufficient for the tip to cover an approximate 1 cm-2 cm range of motion, but less motion may be sufficient in some applications or embodiments. The video view provided by the bronchoscope 1012 during the wiggle maneuver is shown in FIGS. 12B-12C.


(iii). Continue the wiggle maneuver, keeping the scope itself substantially stationary, until the systems described herein and related software define the sixth degree of freedom and the orientation of the video display of the scope, as illustrated in FIG. 12D, matches the primary perspective image 38 of the virtual "fly-thru" or navigation model.


Algorithmically, detecting this operation consists of recognizing a unique signature of motion over time. One such technique, for example, consists of two parts:


(a) performance of a continuous PCA analysis of a specified time window of 5DOF sensor locations. A repeated motion in a plane by the instrument tip will produce a covariance matrix such that the foremost eigenvalue (e0) will reflect the variance of a 1 cm-2 cm motion over a given time window. In addition, the secondary and tertiary eigenvalues (e1 and e2) will reflect a very small variance, as the tip should preferably be moving in an arc constrained to the plane defined by the spline mechanism of the scope.


(b) once an appropriate PCA signature is detected, the orientation of the tip is compared to the eigenvector E0, which represents the historical vector of motion for the instrument tip. An exemplary way to do this is the dot product of the measured tip vector V and E0, which represents the acute angle between them:

r = V·E0


By way of example and not by way of limitation, in an ideal wiggle maneuver, the orientation of the tip should show a rhythmic oscillation of about 90° (the approximate range could be, for instance, +/−45°). This comparison of tip orientation to E0 provides the basis for a determination of the wiggle plane normal using, for example, a cross-product or Gram-Schmidt orthogonalization. The physical wiggle plane normal is in a constant relationship to the coordinate space of the video signal, and thus can be related (calibrated) thereto.
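A minimal sketch of parts (a) and (b) follows, assuming 5DOF samples arriving at a fixed rate. The eigenvalue thresholds, sampling rate, and oscillation test are illustrative placeholders rather than values specified by the embodiments.

```python
import numpy as np

def wiggle_state(tip_positions, tip_vectors, window_s=3.0, rate_hz=40.0):
    """Two-part wiggle signature test. tip_positions: (N, 3) recent 5DOF
    sensor locations in mm; tip_vectors: (N, 3) unit tip direction vectors.
    Returns the estimated wiggle plane normal, or None if no wiggle detected."""
    n = int(window_s * rate_hz)
    window = tip_positions[-n:]
    # (a) PCA of the position window: eigenvalues of the covariance matrix.
    cov = np.cov(window - window.mean(axis=0), rowvar=False)
    evals, evecs = np.linalg.eigh(cov)                # ascending order
    e0, e1, e2 = evals[2], evals[1], evals[0]
    e0_vec = evecs[:, 2]                              # dominant motion direction
    # e0 should reflect a ~1-2 cm planar arc; e1 and e2 should stay near noise level.
    planar_wiggle = (5.0 < e0 < 100.0) and (e1 < 1.0) and (e2 < 1.0)  # illustrative
    if not planar_wiggle:
        return None
    # (b) compare tip orientation to the dominant motion vector: r = V . E0.
    r = np.array([np.dot(v, e0_vec) for v in tip_vectors[-n:]])
    if r.std() < 0.5:                                 # expect rhythmic swing of r
        return None
    # Wiggle plane normal from the two largest principal directions.
    normal = np.cross(evecs[:, 2], evecs[:, 1])
    return normal / np.linalg.norm(normal)
```

The same eigenvalue pattern distinguishes the other motion modes listed below: a stationary scope yields three small, roughly equal eigenvalues, and straight-line travel yields a single large e0 with |r| near 1.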


Interestingly, this algorithm can be used in a variety of different modes to detect certain forms of scope motion:

    • stationary scope (wherein the e0, e1, and e2 values will be very small and roughly equal, representing the small magnitude, unbiased Gaussian noise of the direct localization measurement);
    • scope in motion along a straight line in space, such as when traversing individual segments (wherein the e0 will be very large, with very small e1 and e2 values, representing a large co-linear translation in space). In addition, the absolute value of r will be close to 1, indicating that the orientation of the tip is nearly collinear with the translation of the tip over time.


In general, the wiggle techniques described herein for determining direction and orientation, including "up" and "down" orientation relative to (or independent of) the instrument steering mechanism, may be used in a range of different endoscopic applications. Although the above examples and embodiments generally refer to the use of the wiggle maneuver in connection with bronchoscopic applications, it will be understood that the techniques described herein may also be used in other applications including, but not limited to, enteroscopy, colonoscopy, sigmoidoscopy, rhinoscopy, proctoscopy, otoscopy, cystoscopy, gynoscopy, colposcopy, hysteroscopy, falloposcopy, laparoscopy, arthroscopy, thoracoscopy, amnioscopy, fetoscopy, panendoscopy, epiduroscopy, and the like. The wiggle techniques described herein may also be applicable in non-medical endoscopic uses, such as the internal inspection of complex technical systems, surveillance, and the like.


In general, the systems and methods described herein can be implemented regardless of the number of sensors that are used. In some embodiments, serial orientation or positioning of multiple sensors allows the determination of one or more parameters such as shape, position, orientation, and mechanical status of a complete or partial section of guidewire or other device or instrument. For example, the placement of multiple sensors can assist in visualizing the shape of the device and any bends in the path by providing a number of data points on the path (e.g., 8 sensors, spaced 1 mm apart) to create a 3D shape model of the device. Various parameters can be used to track past or present movement and changes in device shape including, for example, elasticity, bend radius, limiting, and durometer rating of the device material. These parameters and accompanying data can provide visual cues to the user during the procedure, for example, when the device has a certain bend or curvature (based on path or surroundings), e.g., to provide a notice or warning that the device is on the correct or incorrect path, or to provide notice regarding, or track, a particular parameter(s) that the user is interested in. Such a sensor pathway is generally depicted in FIG. 18, which shows exemplary curvature warning scenarios in the differently marked sections or segments.
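The following sketch illustrates one way a curvature warning of the kind depicted in FIG. 18 might be computed from serial sensor positions; the triplet circumradius estimate and the rated bend-radius limit are assumptions made for illustration, not values taken from the embodiments.

```python
import numpy as np

def min_bend_radius(sensor_xyz):
    """Estimate the tightest bend along a chain of closely spaced sensors
    (e.g., 8 sensors ~1 mm apart) from the circumradius of consecutive triplets."""
    radii = []
    for p0, p1, p2 in zip(sensor_xyz, sensor_xyz[1:], sensor_xyz[2:]):
        a = np.linalg.norm(p1 - p2)
        b = np.linalg.norm(p0 - p2)
        c = np.linalg.norm(p0 - p1)
        area = 0.5 * np.linalg.norm(np.cross(p1 - p0, p2 - p0))
        radii.append(np.inf if area < 1e-9 else (a * b * c) / (4.0 * area))
    return min(radii)

def curvature_warning(sensor_xyz, device_limit_mm=15.0):
    """Flag the segment if the estimated bend radius falls below the device's
    rated minimum bend radius (device_limit_mm is an illustrative value)."""
    return min_bend_radius(np.asarray(sensor_xyz, dtype=float)) < device_limit_mm
```

In a display, segments whose local radius approaches the limit could be color-coded, in the spirit of the differently marked sections of FIG. 18.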


In various aspects and embodiments described herein, one can use the knowledge of the path traveled by the instrument and the segmented airway or vessel from the acquired image (e.g., CT) to limit the possibilities of where the instrument is located in the patient. The techniques described herein, therefore, can be valuable to improve virtual displays for users. Fly-through, fly-above, or image displays related to segmented paths are commonly dependent upon relative closeness to the segmented path. For a breathing patient, for example, or a patient with a moving vessel related to heartbeat, it is valuable to use the path-traveled information to determine where in the 4D patient motion cycle the system is located within the patient. By comparing the 3D location, the patient's tracked physiological signal (which is used to determine the 4D patient motion cycle), and the instrument's traveled path, one can determine the optimal location relative to a segmented airway or vessel and use this information to provide the optimal virtual display.



FIGS. 1 and 2 are schematic illustrations of devices that can be used in conjunction with, or to perform, various procedures described herein. As shown in FIG. 1, an apparatus 110 includes a PTD 120. The PTD 120 can be coupled to a dynamic body B. The dynamic body B can be, for example, a selected dynamic portion of the anatomy of a patient. The PTD 120 can be a variety of different shapes and sizes. For example, in one embodiment the PTD 120 is substantially planar, such as in the form of a patch that can be disposed at a variety of locations on a patient's body. Such a PTD 120 can be coupled to the dynamic body with adhesive, straps, hook and pile, snaps, or any other suitable coupling method. In another embodiment the PTD can be a catheter type device with a pigtail or anchoring mechanism that allows it to be attached to an internal organ or along a vessel.


Two or more markers or fiducials 122 are coupled to the PTD 120 at selected locations as shown in FIG. 1. The markers 122 are constructed of a material that can be viewed on an image, such as an X-ray or CT. The markers 122 can be, for example, radiopaque, and can be coupled to the PTD 120 using any known methods of coupling such devices. FIGS. 1 and 2 illustrate the apparatus 110 having four markers 122, but any number of two or more markers can be used. In one embodiment the marker or fiducials and the localization element can be the same device.


An imaging device 140 can be used to take images of the dynamic body B while the PTD 120 is coupled to the dynamic body B, pre-procedurally during a first time interval. As stated above, the markers 122 are visible on the images and can provide an indication of a position of each of the markers 122 during the first time interval. The position of the markers 122 at given instants in time through a path of motion of the dynamic body B can be illustrated with the images. The imaging device 140 can be, for example, a computed tomography (CT) device (e.g., respiratory-gated CT device, ECG-gated CT device), a magnetic resonance imaging (MRI) device (e.g., respiratory-gated MRI device, ECG-gated MRI device), an X-ray device, or any other suitable medical imaging device. In one embodiment, the imaging device 140 is a computed tomography positron emission tomography device that produces a fused computed tomography positron emission tomography image dataset. The imaging device 140 can be in communication with a processor 130 and send, transfer, copy and/or provide image data taken during the first time interval associated with the dynamic body B to the processor 130.


The processor 130 includes a processor-readable medium storing code representing instructions to cause the processor 130 to perform a process. The processor 130 can be, for example, a commercially available personal computer, or a less complex computing or processing device that is dedicated to performing one or more specific tasks. For example, the processor 130 can be a terminal dedicated to providing an interactive graphical user interface (GUI). The processor 130, according to one or more embodiments of the invention, can be a commercially available microprocessor. Alternatively, the processor 130 can be an application-specific integrated circuit (ASIC) or a combination of ASICs, which are designed to achieve one or more specific functions, or enable one or more specific devices or applications. In yet another embodiment, the processor 130 can be an analog or digital circuit, or a combination of multiple circuits.


The processor 130 can include a memory component 132. The memory component 132 can include one or more types of memory. For example, the memory component 132 can include a read only memory (ROM) component and a random access memory (RAM) component. The memory component can also include other types of memory that are suitable for storing data in a form retrievable by the processor 130. For example, electronically programmable read only memory (EPROM), erasable electronically programmable read only memory (EEPROM), flash memory, as well as other suitable forms of memory can be included within the memory component. The processor 130 can also include a variety of other components, such as for example, coprocessors, graphic processors, etc., depending upon the desired functionality of the code.


The processor 130 can store data in the memory component 132 or retrieve data previously stored in the memory component 132. The components of the processor 130 can communicate with devices external to the processor 130 by way of an input/output (I/O) component (not shown). According to one or more embodiments of the invention, the I/O component can include a variety of suitable communication interfaces. For example, the I/O component can include, for example, wired connections, such as standard serial ports, parallel ports, universal serial bus (USB) ports, S-video ports, local area network (LAN) ports, small computer system interface (SCSI) ports, and so forth. Additionally, the I/O component can include, for example, wireless connections, such as infrared ports, optical ports, Bluetooth® wireless ports, wireless LAN ports, or the like.


The processor 130 can be connected to a network, which may be any form of interconnecting network including an intranet, such as a local or wide area network, or an extranet, such as the World Wide Web or the Internet. The network can be physically implemented on a wireless or wired network, on leased or dedicated lines, including a virtual private network (VPN).


As stated above, the processor 130 can receive image data from the imaging device 140. The processor 130 can identify the position of selected markers 122 within the image data or voxel space using various segmentation techniques, such as Hounsfield unit thresholding, convolution, connected component, or other combinatory image processing and segmentation techniques. The processor 130 can determine a distance and direction between the position of any two markers 122 during multiple instants in time during the first time interval, and store the image data, as well as the position and distance data, within the memory component 132. Multiple images can be produced providing a visual image at multiple instants in time through the path of motion of the dynamic body. The processor 130 can also include a receiving device or localization device 134, which is described in more detail below.
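As an illustration of the Hounsfield-unit thresholding and connected-component approach mentioned above, a simple sketch might look like the following; the threshold and component-size limits are assumed values that would be tuned to the actual marker material and scanner, and real segmentation pipelines typically add more filtering.

```python
import numpy as np
from scipy.ndimage import label, center_of_mass

def find_marker_positions(ct_volume, spacing_mm, hu_threshold=2000.0):
    """Locate radiopaque PTD markers in a CT volume by Hounsfield-unit
    thresholding followed by connected-component analysis. Returns marker
    centroids in millimeters (voxel indices scaled by voxel spacing)."""
    mask = ct_volume > hu_threshold
    labels, n = label(mask)
    centroids = []
    for idx in range(1, n + 1):
        voxels = int(np.sum(labels == idx))
        if 3 <= voxels <= 500:                       # reject noise and large metal objects
            centroids.append(np.array(center_of_mass(labels == idx)) * spacing_mm)
    return centroids

def marker_distances(centroids):
    """Pairwise distances between marker centroids for one instant in time."""
    return {(i, j): float(np.linalg.norm(centroids[i] - centroids[j]))
            for i in range(len(centroids)) for j in range(i + 1, len(centroids))}
```

Repeating this over each gated image frame yields the per-instant marker positions and distances that are stored alongside the image data.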


A deformation field may also be included in the analysis in various embodiments described herein. For example, the deformation field can be applied to fuse 3D fluoroscopic images to CT images in order to compensate for different patient orientations, patient position, respiration, deformation induced by the catheter or other instrument, and/or other changes or perturbations that occur due to therapy delivery or resection or ablation of tissue.


In some embodiments, for example, real-time respiration compensation can be determined by applying an inspiration-to-expiration deformation vector field. In combination with the PTD respiratory signal, for example, the instrument location can be calculated using the deformation vector field. A real-time instrument tip correction vector can be applied to a 3D localized instrument tip. The real-time correction vector is computed by scaling an inspiration-to-expiration deformation vector (found from the inspiration-to-expiration deformation vector field) based on the PTD respiratory signal. This correction vector can then be applied to the 3D localized instrument tip. This can further optimize accuracy during navigation.


An example of an algorithm for real-time respiration compensation can be found in FIG. 19. In accordance with this algorithm, for each localized instrument tip position p:

    • (a) find vi such that the scalar distance d is minimized;
    • (b) compute c, wherein:

      c = −vi t
    • and (c) compute p′, wherein:

      p′ = p + c

      Thus, p′ is a respiration compensated version of p.
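A minimal sketch of this correction follows, assuming the inspiration-to-expiration deformation field is available as sampled points with displacement vectors and interpreting the scale factor t as the normalized PTD respiratory signal (0 at inspiration, 1 at expiration). These are assumptions made for illustration, not requirements of the algorithm.

```python
import numpy as np

def compensate_tip(p, field_points, field_vectors, ptd_phase):
    """Respiration-compensate a 3D localized tip position p.
    field_points: (N, 3) sample locations of the inspiration-to-expiration
    deformation field; field_vectors: (N, 3) displacement vectors at those
    locations; ptd_phase: normalized PTD respiratory signal in [0, 1]."""
    # (a) find the field vector v_i whose sample point is closest to p.
    d = np.linalg.norm(field_points - p, axis=1)
    v_i = field_vectors[np.argmin(d)]
    # (b) scale it by the respiratory signal and negate to form the correction c.
    c = -v_i * ptd_phase
    # (c) apply the correction to obtain the compensated position p'.
    return p + c
```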


Although FIG. 19 and the above discussion generally relate to real-time respiration motion, it will be understood that these calculations and determinations may also be applied to real-time heartbeat and/or vessel motion compensation, or any other motion of a dynamic body as described herein. In one embodiment, for example, the deformation matrix is calculated based upon inspiration and expiration. In another embodiment, for example, the deformation matrix is calculated based upon heartbeat. In yet another embodiment, for example, the deformation matrix is based upon vessel motion. In these and other embodiments, it is also possible to extend these calculations and determinations to develop multiple deformation matrices across multiple patient datasets, by acquiring the multiple datasets over the course of, for example, a single heartbeat cycle or a single respiratory cycle.


Deformation on 2D images can also be calculated based upon therapeutic change of tissue, changes in Hounsfield units for images, patient motion compensation during the imaging sequence, therapy monitoring, and temperature monitoring with fluoroscopic imaging, among other things. One potential issue with conventional therapy delivery, for instance, is monitoring the therapy for temperature or tissue changes. In accordance with the methods described herein, this monitoring can be carried out using intermittent fluoroscopic imaging, where the images are compensated between acquisition times to show very small changes in image density, which can represent temperature changes or tissue changes as a result of the therapy and/or navigation.


In general, it may also be preferable to reduce the level of radiation that patients are exposed to before or during a procedure (or pre-procedural analysis) as described herein. One method of reducing radiation during the acquisition of a 3D fluoroscopic dataset (or other dataset described herein), for example, is to use a deformation field between acquired 2D images to reduce the actual number of 2D images that need to be acquired to create the 3D dataset. In one particular embodiment, the deformation field is used to calculate the deformation between images in the acquisition sequence to produce 2D images between the acquired slices, and these new slices can be used to calculate the 3D fluoroscopic dataset. For example, if 180 2D image slices were previously required, e.g., an image(s) taken every 2 degrees of a 360 degree acquisition sequence, in accordance with some embodiments 90 2D images can be acquired over a 360 degree acquisition sequence and the data from the images that would have ordinarily been acquired between each slice can be calculated and imported into the 3D reconstruction algorithm. Thus, the radiation is effectively reduced by 50%.
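The sketch below illustrates the general idea of synthesizing an intermediate 2D projection by warping an acquired projection halfway along a deformation field between neighboring acquisition angles. The backward-warping scheme and the field representation are assumptions for illustration; a practical implementation would estimate the inter-projection deformation and handle forward/backward warping far more carefully before feeding slices into the 3D reconstruction.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def synthesize_midway_projection(proj_a, flow_ab):
    """Approximate the projection that would have been acquired between two
    neighboring angles by warping projection A halfway along the A->B field.
    flow_ab has shape (2, H, W): per-pixel (row, col) displacement from A to B."""
    h, w = proj_a.shape
    grid = np.mgrid[0:h, 0:w].astype(float)
    rows, cols = grid[0], grid[1]
    # Backward warp: sample A at locations displaced by half of the A->B field.
    coords = np.stack([rows - 0.5 * flow_ab[0], cols - 0.5 * flow_ab[1]])
    return map_coordinates(proj_a, coords, order=1, mode='nearest')
```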


As shown in FIG. 2, two or more localization elements 124 are coupled to the PTD 120 proximate the locations of the markers 122 for use during a medical procedure to be performed during a second time interval. The localization elements 124 can be, for example, electromagnetic coils, infrared light emitting diodes, and/or optical passive reflective markers. The localization elements 124 can also be, or be integrated with, one or more fiber optic localization (FDL) devices. The markers 122 can include plastic or non-ferrous fixtures or dovetails or other suitable connectors used to couple the localization elements 124 to the markers 122. A medical procedure can then be performed with the PTD 120 coupled to the dynamic body B at the same location as during the first time interval when the pre-procedural images were taken. During the medical procedure, the localization elements 124 are in communication or coupled to the localization device 134 included within processor 130. The localization device 134 can be, for example, an analog to digital converter that measures voltages induced onto localization coils in the field; creates a digital voltage reading; and maps that voltage reading to a metric positional measurement based on a characterized volume of voltages to millimeters from a fixed field emitter. Position data associated with the elements 124 can be transmitted or sent to the localization device 134 continuously during the medical procedure during the second time interval. Thus, the position of the localization elements 124 can be captured at given instants in time during the second time interval. Because the localization elements 124 are coupled to the PTD 120 proximate the markers 122, the localization device 134 can use the position data of the elements 124 to deduce coordinates or positions associated with the markers 122 intra-procedurally during the second time interval. The distance, range, acceleration, and speed between one or more selected pairs of localization elements 124 (and corresponding markers 122) can then be determined and various algorithms can be used to analyze and compare the distance between selected elements 124 at given instants in time, to the distances between and orientation among corresponding markers 122 observed in the pre-operative images.
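Purely as an illustration of mapping voltage readings to metric positions against a characterized volume, a naive nearest-neighbor lookup is sketched below; an actual localization device would interpolate and solve a proper inverse field model, so every name and structure here should be treated as hypothetical.

```python
import numpy as np

def voltages_to_position(measured_v, grid_positions_mm, characterized_v):
    """grid_positions_mm: (N, 3) known positions, in millimeters from the fixed
    field emitter, visited during field characterization; characterized_v:
    (N, C) voltage vector recorded at each of those positions. The measured
    voltage vector is matched to the closest characterized sample."""
    idx = np.argmin(np.linalg.norm(characterized_v - measured_v, axis=1))
    return grid_positions_mm[idx]
```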


An image can then be selected from the pre-operative images taken during the first time interval that indicates a distance between corresponding markers 122 at a given instant in time, or that is grouped in a similar sequence of motion, that most closely approximates or matches the distance or sequence of motion between the selected elements 124. The process of comparing the distances is described in more detail below. Thus, the apparatus 110 and processor 130 can be used to provide images corresponding to the actual movement of the targeted anatomy during the medical procedure being performed during the second time interval. The images illustrate the orientation and shape of the targeted anatomy during a path of motion of the anatomy, for example, during inhaling and exhaling.



FIG. 3 illustrates an example set of distances or vectors d1 through d6 between a set of markers 122, labeled m1 through m9 that are disposed at spaced locations on a PTD 120. As described above, pre-procedure images can be taken of a dynamic body for which the PTD 120 is to be coupled during a first time interval. The distances between the markers can be determined for multiple instants in time through the path of motion of the dynamic body. Then, during a medical procedure, performed during a second time interval, localization elements (not shown in FIG. 3) coupled proximate to the location of markers 122 can provide position data for the elements to a localization device (not shown in FIG. 3). The localization device can use the position data to determine distances or vectors between the elements for multiple instants in time during the medical procedure or second time interval.



FIG. 4A shows an example of distance or vector data from the localization device. Vectors a1 through a6 represent distance data for one instant in time and vectors n1 through n6 for another instant in time, during a time interval from a to n. As previously described, the vector data can be used to select an image from the pre-procedural images that includes distances between the markers m1 through m9 that correspond to or closely approximate the distances a1 through a6 for time a, for example, between the localization elements. The same process can be performed for the vectors n1 through n6 captured during time n.


One method of selecting the appropriate image from the pre-procedural images is to execute an algorithm that can sum all of the distances a1 through a6 and then search for and match this sum to an image containing a sum of all of the distances d1 through d6 obtained pre-procedurally from the image data that is equal to the sum of the distances a1 through a6. When the difference between these sums is equal to zero, the relative position and orientation of the anatomy or dynamic body D during the medical procedure will substantially match the position and orientation of the anatomy in the particular image. The image associated with distances d1 through d6 that match or closely approximate the distances a1 through a6 can then be selected and displayed. For example, FIG. 4B illustrates examples of pre-procedural images, Image a and Image n, of a dynamic body D that correspond to the distances a1 through a6 and n1 through n6, respectively. An example of an algorithm for determining a match is as follows:

Does Σai=Σdi (i=1 to 6 in this example) OR
Does Σ(ai−di)=0 (i=1 to 6 in this example).

If yes to either of these, then the image is a match to the vector or distance data obtained during the medical procedure.
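A compact sketch of this summed-distance matching follows. The data structures (dictionaries of marker positions per image and the chosen marker pairs) are assumptions made for illustration; in practice the comparison would run continuously and would also account for the direction and sequence of motion discussed above.

```python
import numpy as np

def select_matching_image(element_positions, image_marker_positions, pairs):
    """Select the pre-procedural image whose summed marker distances best match
    the summed localization-element distances at the current instant.
    element_positions: dict marker_id -> 3D position from the localization device.
    image_marker_positions: list of such dicts, one per pre-procedural image.
    pairs: the selected marker pairs, e.g., [(1, 2), (2, 6), ...]."""
    def summed(positions):
        return sum(np.linalg.norm(np.asarray(positions[i]) - np.asarray(positions[j]))
                   for i, j in pairs)
    target = summed(element_positions)                       # sum of the a_i
    diffs = [abs(summed(img) - target) for img in image_marker_positions]  # |sum d_i - sum a_i|
    return int(np.argmin(diffs))                             # index of best-matching image
```

When the minimum difference is (near) zero, the selected image substantially matches the position and orientation of the anatomy at that instant of the procedure.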



FIG. 5 illustrates an apparatus 210 according to an embodiment of the invention. The apparatus 210 includes a tubular shaped PTD 220 that can be constructed with a rigid material or, alternatively, a flexible and/or stretchable material. In one embodiment, for example, the PTD 220 is substantially rigid in structure. In another embodiment, for example, the PTD 220 has a flexible or stretchable structure. The PTD 220 can be positioned over a portion of a patient's body, such as around the upper or lower torso of the patient. In the embodiments in which the PTD 220 is constructed with a stretchable and/or flexible material, for instance, the stretchability of the PTD 220 allows the PTD 220 to at least partially constrict some of the movement of the portion of the body for which it is coupled. The apparatus 210 further includes multiple markers or fiducials 222 coupled to the PTD 220 at spaced locations. A plurality of localization elements 224 are removably coupled proximate to the locations of markers 222, such that during a first time interval as described above, images can be taken without the elements 224 being coupled to the PTD 220. The localization elements need not be removably coupled. For example, the elements can be fixedly coupled to the PTD. In addition, the elements can be coupled to the PTD during the pre-procedure imaging.



FIG. 6 is a graphical illustration indicating how the apparatus 210 (shown without localization elements 224) can move and change orientation and shape during movement of a dynamic body, such as a mammalian body M. The graph is one example of how the lung volume can change during inhalation (inspiration) and exhalation (expiration) of the mammalian body M. The corresponding changes in shape and orientation of the apparatus 210 during inhalation and exhalation are also illustrated. The six markers 222 shown in FIG. 5 are labeled a, b, c, d, e, and f. As described above, images of the apparatus 210 can be taken during a first time interval. The images can include an indication of relative position of each of the markers 222; that is, the markers 222 are visible in the images, and the position of each marker 222 can then be observed over a period of time. A distance between any two markers 222 can then be determined for any given instant of time during the first time interval. For example, a distance X between markers a and b is illustrated, and a distance Y between markers b and f is illustrated. These distances can be determined for any given instant in time during the first time interval from an associated image that illustrates the position and orientation of the markers 222. As illustrated, during expiration of the mammalian body M at times indicated as A and C, the distance X is smaller than during inspiration of the mammalian body M, at the time indicated as B. Likewise, the distance Y is greater during inspiration than during expiration. The distance between any pair of markers 222 can be determined and used in the processes described herein. Thus, the above embodiments are merely examples of possible pair selections. For example, a distance between a position of marker e and a position of marker b may be determined. In addition, multiple pairs or only one pair may be selected for a given procedure.



FIG. 7 is a flowchart illustrating a method according to an embodiment of the invention. A method 51 includes at step 52 receiving image data during a pre-procedural or first time interval. As discussed above, images are taken of a dynamic body using an appropriate imaging modality (e.g., CT Scan, MRI, etc.). The image data is associated with one or more images taken of a PTD (as described herein) coupled to a dynamic body, where the PTD includes two or more markers coupled thereto. In other words, the image data of the dynamic body is correlated with image data related to the PTD. The one or more images can be taken using a variety of different imaging modalities as described previously. The image data can include an indication of a position of a first marker and an indication of a position of a second marker, as illustrated at step 54. The image data can include position data for multiple positions of the markers during a range or path of motion of the dynamic body over a selected time interval. As described above, the image data can include position data associated with multiple markers, however, only two are described here for simplicity. A distance between the position of the first marker and the position of the second marker can be determined for multiple instants in time during the first time interval, at step 56. As also described above, the determination can include determining the distance based on the observable distance between the markers on a given image. The image data, including all of the images received during the first time interval, the position, and the distance data can be stored in a memory and/or recorded at step 58.


Then at step 60, during a second time interval, while performing a medical procedure on the patient with the PTD positioned on the patient at substantially the same location, position data can be received for a first localization element and a second localization element. The localization elements can be coupled to the PTD proximate the locations of the markers, such that the position data associated with the elements can be used to determine the relative position of the markers in real-time during the medical procedure. The position data of the elements can be stored and/or recorded at step 62.


A distance between the first and second localization elements can be determined at step 64. Although only two localization elements are described, as with the markers, position data associated with more than two localization elements can be received and the distances between the additional elements can be determined.


The next step is to determine which image from the one or more images taken during the first time interval represents the relative position and/or orientation of the dynamic body at a given instant in time during the second time interval or during the medical procedure. To determine this, at step 66, the distance between the positions of the first and second localization elements at a given instant in time during the second time interval is compared to the distance(s) determined in step 56 between the positions of the first and second markers obtained with the image data during the first time interval.


An image can be selected from the first time interval that best represents the same position and orientation of the dynamic body at a given instant in time during the medical procedure. To do this, the distance between a given pair of localization elements during the second time interval is used to select the image that contains the same distance between the corresponding pair of markers from the image data received during the first time interval. This can be accomplished, for example, by executing an algorithm to perform the calculations. When there are multiple pairs of markers and localization elements, the algorithm can sum the distances between all of the selected pairs of elements for a given instant in time during the second time interval and sum the distances between all of the associated selected pairs of markers for each instant in time during the first time interval when the pre-procedural image data was received.


When an image is found that provides the sum of distances for the selected pairs of markers that is substantially the same as the sum of the distances between the localization elements during the second time interval, then that image is selected at step 68. The selected image can then be displayed at step 70. The physician can then observe the image during the medical procedure on a targeted portion of the dynamic body. Thus, during the medical procedure, the above process can be continuously executed such that multiple images are displayed and images corresponding to real-time positions of the dynamic body can be viewed.



FIG. 8 shows one embodiment of a system (system 100) that includes components that can be used to perform image guided interventions using a gated imaging modality, such as ECG-gated MRI or ECG-gated CT. The figure depicts a patient 10 positioned on an operating table 12 with a physician 14 performing a medical procedure on the patient.


Specifically, FIG. 8 depicts physician 14 steering a medical instrument 16 through the patient's internal anatomy in order to deliver therapy. In this particular instance, instrument 16 is depicted as a catheter entering the right atrium by way of the inferior vena cava from a femoral access point; however, the present systems are not limited to catheter use indications. The position of virtually any instrument may be tracked as discussed below and a representation of it superimposed on the proper image, consistent with the present methods, apparatuses, and systems. An "instrument" is any device controlled by physician 14 for the purpose of delivering therapy, and includes needles, guidewires, stents, filters, occluders, retrieval devices, imaging devices (such as OCT, EBUS, IVUS, and the like), and leads. Instrument 16 is fitted with one or more instrument reference markers 18. A tracker 20 (which is sometimes referred to in the art as a "tracking system") is configured to track the type of reference marker or markers coupled to instrument 16. Tracker 20 can be any type of tracking system, including but not limited to an electromagnetic tracking system. An example of a suitable electromagnetic tracking system is the AURORA electromagnetic tracking system, commercially available from Northern Digital Inc. of Waterloo, Ontario, Canada. If tracker 20 is an electromagnetic tracking system, element 20 would represent an electromagnetic field generator that emits a series of electromagnetic fields designed to engulf patient 10, and reference marker or markers 18 coupled to medical instrument 16 could be coils in which an induced voltage is monitored and translated into a coordinate position of the marker(s).


As noted herein, a variety of instruments and devices can be used in conjunction with the systems and methods described herein. In one embodiment, for example, an angled coil sensor is employed during the targeted navigation. In accordance with this embodiment, for example, instead of using a conventional wire sensor wrapped at about a 90° angle (i.e., roughly perpendicular) to the axial length (or core) of the sensor, the coil is wrapped at an acute angle (i.e., the angle is less than about 90°) relative to the axial length of the sensor. In one embodiment, the coil is positioned (e.g., wrapped) at an angle of about 30° to about 60° relative to the axial length. In one preferred embodiment, the coil is positioned at about a 45° angle relative to the axial length. The positioning of the coil in accordance with the exemplary embodiments described herein advantageously provides a directional vector that is not parallel with the sensor core. Thus, the physical axis is different and, as the sensor moves, this additional directional vector can be quantified and used to detect up and down (and other directional) movement. This motion can be captured over time as described herein to determine orientation and prepare and display more accurate images.
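
The geometric effect of the angled winding can be illustrated with a toy model in which the induced signal is taken as proportional to the cosine of the angle between the coil's sensitive axis and the excitation field; the function names, angles, and field direction below are illustrative assumptions, not parameters of the sensor described above:

```python
import numpy as np

def coil_axis(axis_tilt_deg, roll_deg):
    """Unit vector of the coil's sensitive axis for a sensor whose core lies
    along z. axis_tilt_deg is the angle between that axis and the core;
    roll_deg is rotation of the sensor about its own core axis."""
    tilt = np.radians(axis_tilt_deg)
    roll = np.radians(roll_deg)
    return np.array([np.sin(tilt) * np.cos(roll),
                     np.sin(tilt) * np.sin(roll),
                     np.cos(tilt)])

# Illustrative excitation-field direction, deliberately not parallel to the core.
field = np.array([1.0, 0.0, 1.0]) / np.sqrt(2.0)

# Conventional winding (wire roughly perpendicular to the core) leaves the
# sensitive axis parallel to the core: the signal does not change with roll.
conventional = [np.dot(coil_axis(0.0, r), field) for r in (0.0, 90.0, 180.0)]

# A coil wound at about 45 degrees tilts the sensitive axis off the core, so
# its projection onto the field changes as the sensor rolls, exposing the
# additional directional information described above.
angled = [np.dot(coil_axis(45.0, r), field) for r in (0.0, 90.0, 180.0)]
```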


An external reference marker 22 can be placed in a location close to the region of the patient where the procedure is to be performed, yet in a stable location that will not move (or that will move a negligible amount) with the patient's heart beat and respiration. If patient 10 is securely fixed to table 12 for the procedure, external reference marker 22 (which may be described as "static") can be affixed to table 12. If patient 10 is not completely secured to table 12, external reference marker 22 can be placed on a region of the back of patient 10 exhibiting the least amount of movement. Tracker 20 can be configured to track external reference marker 22.


One or more non-tissue internal reference markers 24 can be placed in the gross region where the image guided navigation will be carried out. Non-tissue internal reference marker(s) 24 should be placed in an anatomic location that exhibits movement that is correlated with the movement of the anatomy intended for image guided navigation. This location will be internal to the patient, in the gross location of the anatomy of interest.


Medical instrument 16, instrument reference marker(s) 18, external reference marker 22, and non-tissue internal reference marker(s) 24 can be coupled to converter 26 of system 100. Converter 26, one example of which may be referred to in the art as a break-out box, can be configured to convert analog measurements received from the reference markers and tracker 20 into digital data understandable by image guidance computing platform 30, and relay that data to image guidance computing platform 30 to which converter 26 can be coupled. Image guidance computing platform 30 can take the form of a computer, and may include a monitor on which a representation of one or more instruments used during the IGI can be displayed over an image of the anatomy of interest.


System 100 also includes a periodic human characteristic signal monitor, such as ECG monitor 32, which can be configured to receive a periodic human characteristic signal. For example, ECG monitor 32 can be configured to receive an ECG signal in the form of the ECG data transmitted to it by ECG leads 34 coupled to patient 10. The periodic human characteristic signal monitor (e.g., ECG monitor 32) can also be configured to relay a periodic human characteristic signal (e.g., ECG data) to image guidance computing platform 30, to which it can be coupled.


Prior to the start of the image guided intervention, non-tissue internal reference marker(s) 24—but not necessarily static external reference marker 22—should be placed in the gross region of interest for the procedure. After placement of non-tissue internal reference marker(s) 24, patient 10 is to be scanned with an imaging device, such as gated scanner 40, and the resulting gated image dataset transferred to image guidance computing platform 30, to which the imaging device is coupled and which can reside in the operating or procedure theatre. Examples of suitable imaging devices, and more specifically suitable gated scanners, include ECG-gated MRI scanners and ECG-gated CT scanners. A hospital network 50 may be used to couple gated scanner 40 to image guidance computing platform 30.


The imaging device (e.g., gated scanner 40) can be configured to create a gated dataset that includes pre-operative images, one or more of which (up to all) are taken using the imaging device and are linked to a sample of a periodic human characteristic signal (e.g., a sample, or a phase, of an ECG signal). Once patient 10 is scanned using the imaging device and the gated dataset is transferred to and received by image guidance computing platform 30, patient 10 can be secured to operating table 12 and the equipment making up system 100 (e.g., tracker 20, converter 26, image guidance computing platform 30, ECG monitor 32, and gated scanner 40) set up as shown in FIG. 9. Information can then flow among the system 100 components.


At this point, a gated dataset created by gated scanner 40 resides on image guidance computing platform 30. FIG. 9 highlights the relationship between the samples (S1 . . . Sn) and the images (I1 . . . In) that were captured by gated scanner 40. The designations P, Q, R, S, and T are well known in the art; they denote the depolarizations and repolarizations of the heart. Gated scanner 40 essentially creates an image of the anatomy of interest at a particular instant in time during the anatomy's periodic movement. Image I1 corresponds to the image that was captured at the S1 moment of patient 10's ECG cycle. Similarly, I2 is correlated with S2, and In with Sn.
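
One way to represent this sample-to-image correspondence in software is a simple lookup keyed by ECG phase; the sketch below is purely illustrative and assumes images are referenced by identifier rather than held in memory:

```python
from dataclasses import dataclass

@dataclass
class GatedImage:
    ecg_sample: str   # e.g. "S1" -- the ECG phase at which the image was taken
    image_id: str     # e.g. "I1" -- identifier of the corresponding image

# Illustrative gated dataset as transferred to the image guidance platform.
gated_dataset = [GatedImage("S1", "I1"), GatedImage("S2", "I2"), GatedImage("S3", "I3")]

def image_for_phase(dataset, ecg_sample):
    """Return the image captured at the requested ECG phase, if any."""
    for entry in dataset:
        if entry.ecg_sample == ecg_sample:
            return entry.image_id
    return None

assert image_for_phase(gated_dataset, "S2") == "I2"
```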



FIG. 10 is a diagram of another exemplary surgical instrument navigation system 1010. In accordance with one aspect of the present invention, the surgical instrument navigation system 1010 is operable to visually simulate a virtual volumetric scene within the body of a patient, such as an internal body cavity, from a point of view of a surgical instrument 1012 residing in the cavity of a patient 10. To do so, the surgical instrument navigation system 1010 is primarily comprised of a surgical instrument 1012, a data processor 1016 having a display 1018, and a tracking subsystem 1020. The surgical instrument navigation system 1010 may further include (or be accompanied by) an imaging device 1014 that is operable to provide image data to the system.


The surgical instrument 1012 is preferably a relatively inexpensive, flexible and/or steerable catheter that may be of a disposable type. The surgical instrument 1012 is modified to include one or more tracking sensors 1024 that are detectable by the tracking subsystem 1020. It is readily understood that other types of surgical instruments (e.g., a guide wire, a needle, a forceps, a pointer probe, a stent, a seed, an implant, an endoscope, an energy delivery device, a therapy delivery device, etc.) are also within the scope of the present invention. It is also envisioned that at least some of these surgical instruments may be wireless or have wireless communications links. It is also envisioned that the surgical instruments may encompass medical devices which are used for exploratory purposes, testing purposes or other types of medical procedures.


Referring to FIG. 11, the imaging device 1014 is used to capture volumetric scan data 1132 representative of an internal region of interest within the patient 10. The three-dimensional scan data is preferably obtained prior to surgery on the patient 10. In this case, the captured volumetric scan data may be stored in a data store associated with the data processor 1016 for subsequent processing. However, one skilled in the art will readily recognize that the principles of the present invention may also extend to scan data acquired during surgery. It is readily understood that volumetric scan data may be acquired using various known medical imaging devices 1014, including but not limited to a magnetic resonance imaging (MRI) device, a computed tomography (CT) imaging device, a positron emission tomography (PET) imaging device, a 2D or 3D fluoroscopic imaging device, and 2D, 3D or 4D ultrasound imaging devices. In the case of a two-dimensional ultrasound imaging device or other two-dimensional image acquisition device, a series of two-dimensional data sets may be acquired and then assembled into volumetric data as is well known in the art using a two-dimensional to three-dimensional conversion.
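
By way of a simplified illustration of the two-dimensional to three-dimensional conversion mentioned above, a series of acquired frames can be stacked into a volume; a practical conversion also uses the tracked pose of each frame rather than assuming parallel, evenly spaced slices, and the array shapes and spacing values below are illustrative:

```python
import numpy as np

# Minimal illustration of assembling a volume from a series of 2-D
# acquisitions (e.g., ultrasound frames swept through the region of
# interest). The frames here are placeholders assumed parallel and
# evenly spaced for simplicity.
frames = [np.random.rand(256, 256) for _ in range(40)]  # placeholder 2-D frames
volume = np.stack(frames, axis=0)                        # shape (40, 256, 256)
voxel_spacing_mm = (1.0, 0.5, 0.5)                       # slice, row, column spacing
```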


The multi-dimensional imaging modalities described herein may also be coupled with digitally reconstructed radiography (DRR) techniques. In accordance with a fluoroscopic image acquisition, for example, radiation passes through a physical medium to create a projection image on a radiation-sensitive film or an electronic image intensifier. Given a 3D or 4D dataset as described herein, for example, a simulated image can be generated in conjunction with DRR methodologies. DRR is generally known in the art, and is described, for example, by Lemieux et al. (Med. Phys. 21(11), November 1994, pp. 1749-60).


When a DRR image is created, a fluoroscopic image is formed by computationally projecting volume elements, or voxels, of the 3D or 4D dataset onto one or more selected image planes. Using a 3D or 4D dataset of a given patient as described herein, for example, it is possible to generate a DRR image that is similar in appearance to a corresponding patient image. This similarity can be due, at least in part, to similar intrinsic imaging parameters (e.g., projective transformations, distortion corrections, etc.) and extrinsic imaging parameters (e.g., orientation, view direction, etc.). The intrinsic imaging parameters can be derived, for instance, from the calibration of the equipment. Advantageously, this provides another method to see the up-and-down (and other directional) movement of the instrument. This arrangement further provides the ability to see how the device moves in an image (or images), which translates to improved maneuvering of the device within the patient. An exemplary pathway in accordance with the disclosure herein can be seen in FIG. 17.
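
A heavily simplified sketch of the voxel-projection idea follows; it approximates a parallel-beam projection by summation along one axis, whereas a clinical DRR additionally models the divergent geometry, projective transformations, and distortion corrections noted above (all names below are illustrative):

```python
import numpy as np

def simple_drr(volume, axis=0):
    """Form a digitally reconstructed radiograph by projecting voxels onto
    an image plane. A parallel-beam projection is approximated here by
    summing attenuation values along one axis of the volume."""
    return np.sum(volume, axis=axis)

# Example: project a CT-like volume along one anatomical direction.
ct_volume = np.random.rand(128, 128, 128)   # placeholder attenuation values
drr_image = simple_drr(ct_volume, axis=1)   # 128 x 128 projection image
```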


A dynamic reference frame 19 is attached to the patient proximate to the region of interest within the patient 10. To the extent that the region of interest is a vessel or a cavity within the patient, it is readily understood that the dynamic reference frame 19 may be placed within the patient 10. To determine its location, the dynamic reference frame 19 is also modified to include tracking sensors detectable by the tracking subsystem 1020. The tracking subsystem 1020 is operable to determine position data for the dynamic reference frame 19 as further described below.


The volumetric scan data is then registered as shown at 1134. Registration of the dynamic reference frame 19 generally relates information in the volumetric scan data to the region of interest associated with the patient. This process is referred to as registering image space to patient space. Often, the image space must also be registered to another image space. Registration is accomplished through knowledge of the coordinate vectors of at least three non-collinear points in the image space and the patient space.
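
By way of illustration only, the following sketch estimates a rigid image-space-to-patient-space transform from three or more non-collinear corresponding points using a least-squares (SVD-based) formulation; the function and variable names are assumptions for this example and do not correspond to reference numerals above:

```python
import numpy as np

def register_points(image_points, patient_points):
    """Estimate the rigid transform mapping image space to patient space
    from at least three non-collinear point correspondences.

    Both inputs are (N, 3) arrays of corresponding points. Returns a 3x3
    rotation matrix R and translation vector t such that
    patient ~= R @ image + t (least-squares, Kabsch/SVD method).
    """
    src = np.asarray(image_points, dtype=float)
    dst = np.asarray(patient_points, dtype=float)
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    u, _, vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(vt.T @ u.T))          # guard against reflection
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = dst.mean(axis=0) - r @ src.mean(axis=0)
    return r, t
```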


Registration for image guided surgery can be completed by different known techniques. First, point-to-point registration is accomplished by identifying points in an image space and then touching the same points in patient space. These points are generally anatomical landmarks that are easily identifiable on the patient. Second, surface registration involves the user's generation of a surface in patient space by either selecting multiple points or scanning, and then accepting the best fit to that surface in image space by iteratively calculating with the data processor until a surface match is identified. Third, repeat fixation devices entail the user repeatedly removing and replacing a device (e.g., a dynamic reference frame) in known relation to the patient or image fiducials of the patient. Fourth, automatic registration is accomplished by attaching the dynamic reference frame to the patient prior to acquiring the image data. It is envisioned that other known registration procedures are also within the scope of the present invention, such as that disclosed in U.S. Pat. No. 6,470,207, filed on Mar. 23, 1999, entitled "NAVIGATIONAL GUIDANCE VIA COMPUTER-ASSISTED FLUOROSCOPIC IMAGING", which is hereby incorporated by reference.



FIG. 12 illustrates another type of secondary image 28 which may be displayed in conjunction with the primary perspective image 38. In this instance, the primary perspective image is an interior view of an air passage within the patient 10. The secondary image 28 is an exterior view of the air passage which includes an indicia or graphical representation 29 that corresponds to the location of the surgical instrument 1012 within the air passage. In FIG. 12, the indicia 29 is shown as crosshairs. It is envisioned that other indicia may be used to signify the location of the surgical instrument in the secondary image. As further described below, the secondary image 28 is constructed by superimposing the indicia 29 of the surgical instrument 1012 onto the manipulated image data 1138 (see FIG. 11).


Referring to FIG. 13, the display of an indicia of the surgical instrument 1012 on the secondary image may be synchronized with an anatomical function, such as the cardiac or respiratory cycle, of the patient. In certain instances, the cardiac or respiratory cycle of the patient may cause the surgical instrument 1012 to flutter or jitter within the patient. For instance, a surgical instrument 1012 positioned in or near a chamber of the heart will move in relation to the patient's heart beat. In these instances, the indicia of the surgical instrument 1012 will likewise flutter or jitter on the displayed image 1140. It is envisioned that other anatomical functions which may affect the position of the surgical instrument 1012 within the patient are also within the scope of the present invention.


To eliminate the flutter of the indicia on the displayed image 1140, position data for the surgical instrument 1012 is acquired at a repetitive point within each cycle of either the cardiac cycle or the respiratory cycle of the patient. As described above, the imaging device 1014 is used to capture volumetric scan data 1342 representative of an internal region of interest within a given patient. A secondary image may then be rendered 1344 from the volumetric scan data by the data processor 1016.


In order to synchronize the acquisition of position data for the surgical instrument 1012, the surgical instrument navigation system 1010 may further include a timing signal generator 1026. The timing signal generator 1026 is operable to generate and transmit a timing signal 1346 that correlates to the cardiac cycle and/or the respiratory cycle of the patient 10. For a patient having a consistent rhythmic cycle, the timing signal might be in the form of a periodic clock signal. Alternatively, the timing signal may be derived from an electrocardiogram signal from the patient 10. One skilled in the art will readily recognize other techniques for deriving a timing signal that correlates to at least one of the cardiac cycle, the respiratory cycle, or another anatomical cycle of the patient.
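
One simple way to derive such a timing signal from an electrocardiogram trace is to emit an event at each detected R-peak. The sketch below uses naive local-maximum thresholding purely for illustration; clinical QRS detectors are considerably more robust, and the threshold and sample rate are assumptions:

```python
import numpy as np

def r_peak_times(ecg_samples, sample_rate_hz, threshold):
    """Return the times (in seconds) of R-peaks found by simple thresholding.

    A sample is treated as an R-peak when it exceeds the threshold and is a
    local maximum of the trace."""
    ecg = np.asarray(ecg_samples, dtype=float)
    peaks = []
    for i in range(1, len(ecg) - 1):
        if ecg[i] > threshold and ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1]:
            peaks.append(i / sample_rate_hz)
    return peaks
```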


As described above, the indicia of the surgical instrument 1012 tracks the movement of the surgical instrument 1012 as it is moved by the surgeon within the patient 10. Rather than display the indicia of the surgical instrument 1012 on a real-time basis, the display of the indicia of the surgical instrument 1012 is periodically updated 1348 based on the timing signal from the timing signal generator 1026. In one exemplary embodiment, the timing signal generator 1026 is electrically connected to the tracking subsystem 1020. The tracking subsystem 1020 is in turn operable to report position data for the surgical instrument 1012 in response to a timing signal received from the timing signal generator 1026. The position of the indicia of the surgical instrument 1012 is then updated 1350 on the display of the image data. It is readily understood that other techniques for synchronizing the display of an indicia of the surgical instrument 1012 based on the timing signal, thereby eliminating any flutter or jitter which may appear on the displayed image 1352, are within the scope of the present invention. It is also envisioned that a path (or projected path) of the surgical instrument 1012 may also be illustrated on the displayed image data 1352.
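
A minimal sketch of such gated updating follows, assuming a stream of timing events (for example, R-peak times) and a callable that queries the tracking subsystem for a position; both names are illustrative:

```python
def gated_indicia_updates(timing_events, read_instrument_position):
    """Yield instrument positions only at timing events (e.g., R-peaks),
    so the displayed indicia is refreshed once per cycle instead of
    continuously, suppressing cycle-induced flutter."""
    for event_time in timing_events:
        # The tracking subsystem reports a position in response to the event.
        yield event_time, read_instrument_position(event_time)

# Illustrative use: positions sampled at one phase of each of three cycles.
events = [0.8, 1.6, 2.4]  # seconds of successive timing events
positions = list(gated_indicia_updates(events, lambda t: (10.0 + t, 5.0, 2.0)))
```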


In another aspect of the present invention, the surgical instrument navigation system 1010 may be further adapted to display four-dimensional image data for a region of interest as shown in FIG. 14. In this case, the imaging device 1014 is operable to capture volumetric scan data 1462 for an internal region of interest over a period of time, such that the region of interest includes motion that is caused by either the cardiac cycle or the respiratory cycle of the patient 10. A volumetric perspective view of the region may be rendered 1464 from the volumetric scan data 1462 by the data processor 1016 as described above. The four-dimensional image data may be further supplemented with other patient data, such as temperature or blood pressure, using color coding techniques.


In order to synchronize the display of the volumetric perspective view in real-time with the cardiac or respiratory cycle of the patient, the data processor 1016 is adapted to receive a timing signal from the timing signal generator 1026. As described above, the timing signal generator 1026 is operable to generate and transmit a timing signal that correlates to either the cardiac cycle or the respiratory cycle of the patient 10. In this way, the volumetric perspective image may be synchronized 1466 with the cardiac or respiratory cycle of the patient 10. The synchronized image 1466 is then displayed 1468 on the display 1018 of the system. The four-dimensional synchronized image may be either or both of the primary image rendered from the point of view of the surgical instrument and the secondary image depicting the indicia of the position of the surgical instrument 1012 within the patient 10. It is readily understood that the synchronization process is also applicable to two-dimensional image data acquired over time.


To enhance visualization and refine the accuracy of the displayed image data, the surgical navigation system can use prior knowledge, such as the segmented vessel or airway structure, to compensate for error in the tracking subsystem or for inaccuracies caused by an anatomical shift occurring since acquisition of the scan data. For instance, it is known that the surgical instrument 1012 being localized is located within a given vessel or airway and, therefore, should be displayed within that vessel or airway. Statistical methods can be used to determine the most likely location within the vessel or airway with respect to the reported location and then to compensate so that the display accurately represents the instrument 1012 within the center of the vessel or airway. The center of the vessel or airway can be found by segmenting the vessels or airways from the three-dimensional datasets and using commonly known imaging techniques to define the centerline of the vessel or airway tree. Statistical methods may also be used to determine whether the surgical instrument 1012 has potentially punctured the vessel or airway. This can be done by determining that the reported location is too far from the centerline or that the trajectory of the path traveled forms more than a certain angle (worst case, 90 degrees) with respect to the vessel or airway. Reporting this type of trajectory error is very important to clinicians. The tracking along the center of the vessel may also be further refined by correcting for motion of the respiratory or cardiac cycle, as described above. While navigating along the vessel or airway tree, prior knowledge about the last known location can be used to aid in determining the new location. The instrument or navigated device must follow a pre-defined vessel or airway tree and therefore cannot jump from one branch to another without traveling along an allowed path. The orientation of the instrument or navigated device can also be used to select the most likely pathway being traversed. The orientation information can be used to increase the probability or weight for a selected location or to exclude potential pathways, and therefore enhance system accuracy.
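
A minimal sketch of the centerline constraint and puncture check described above, assuming the vessel or airway centerline has already been extracted from the segmented three-dimensional dataset (the distance threshold and names are illustrative assumptions, not clinically validated values):

```python
import numpy as np

def constrain_to_centerline(reported_position, centerline_points,
                            puncture_distance_mm=10.0):
    """Snap a reported instrument position to the nearest centerline point
    of the segmented vessel/airway and flag a possible wall puncture when
    the reported position lies too far from the centerline.

    centerline_points is an (N, 3) array of points along the centerline.
    """
    p = np.asarray(reported_position, dtype=float)
    c = np.asarray(centerline_points, dtype=float)
    distances = np.linalg.norm(c - p, axis=1)
    nearest = int(np.argmin(distances))
    possible_puncture = distances[nearest] > puncture_distance_mm
    return c[nearest], possible_puncture
```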


The surgical instrument navigation system of the present invention may also incorporate atlas maps. It is envisioned that three-dimensional or four-dimensional atlas maps may be registered with patient specific scan data or generic anatomical models. Atlas maps may contain kinematic information (e.g., heart and lung models) that can be synchronized with four-dimensional image data, thereby supplementing the real-time information. In addition, the kinematic information may be combined with localization information from several instruments to provide a complete four-dimensional model of organ motion. The atlas maps may also be used to localize bones or soft tissue which can assist in determining placement and location of implants.


CONCLUSION

While various embodiments of the invention have been described above, it should be understood that they have been presented by way of example only, and not limitation. Thus, the breadth and scope of the invention should not be limited by any of the above-described embodiments, but should be defined only in accordance with the following claims and their equivalents.


The previous description of the embodiments is provided to enable any person skilled in the art to make or use the invention. While the invention has been particularly shown and described with reference to embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention. For example, the PTD, markers and localization elements can be constructed from any suitable material, and can be a variety of different shapes and sizes, not necessarily specifically illustrated, while still remaining within the scope of the invention.

Claims
  • 1. An image-guided method comprising: directing an endoscope having a tip that is equipped with a five degree of freedom electromagnetic sensor to an anatomical position in a patient, the tip configured to facilitate generation of real-time video image data having a direction of view along a path; forming a 3D navigation model representative of the patient's lung anatomy from CT data; capturing video of the patient's anatomy with the endoscope; presenting an image of the anatomical position within the 3D navigation model of the patient's lung anatomy; activating a steering mechanism of the endoscope to repeatedly move the tip in an arc constrained to a plane to acquire a sixth degree of freedom representing orientation of said direction of view and defining a view along the path.
  • 2. The image-guided method of claim 1, further comprising: determining the orientation and location of the sensor in the patient as the sensor is being repeatedly moved; correlating the determined sensor location and orientation with one or more patient images acquired prior to the repeated movement of the sensor.
  • 3. The image-guided method of claim 2, wherein orientation and location determination comprise applying a deformation vector field.
  • 4. The image-guided method of claim 3, wherein the one or more patient images are derived from an imaging device in conjunction with a patient tracking device disposed at an external region of the patient.
  • 5. The image-guided method of claim 1, wherein the anatomical position is a branching point of a bronchial tree.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 14/182,896, filed Feb. 18, 2014 (abandoned), entitled "Apparatus and Method for Four Dimensional Soft Tissue Navigation in Endoscopic Applications," which is a divisional of U.S. patent application Ser. No. 13/215,017, filed on Aug. 22, 2011, and issued as U.S. Pat. No. 8,696,549, also entitled "Apparatus and Method for Four Dimensional Soft Tissue Navigation in Endoscopic Applications," which claims the benefit of U.S. Provisional Application Ser. Nos. 61/375,439, 61/375,484, 61/375,523, and 61/375,533, each filed Aug. 20, 2010, each of which is hereby incorporated by reference in its entirety, including any figures, tables, and drawings.

US Referenced Citations (414)
Number Name Date Kind
3788324 Lim Jan 1974 A
4421106 Uehara Dec 1983 A
4583538 Onik et al. Apr 1986 A
5053042 Bidwell Oct 1991 A
5158088 Nelson et al. Oct 1992 A
5186174 Schlondorff et al. Feb 1993 A
5251165 James, III Oct 1993 A
5251635 Dumoulin et al. Oct 1993 A
5253770 Rosenthal Oct 1993 A
5265610 Darrow et al. Nov 1993 A
5348011 NessAiver Sep 1994 A
5377678 Dumoulin et al. Jan 1995 A
5391199 Ben-Haim Feb 1995 A
5417210 Funda et al. May 1995 A
5437292 Kipshidze et al. Aug 1995 A
5483691 Heck et al. Jan 1996 A
5483961 Kelly et al. Jan 1996 A
5577502 Darrow et al. Nov 1996 A
5581183 Lindstedt et al. Dec 1996 A
5644612 Moorman et al. Jul 1997 A
5671739 Darrow et al. Sep 1997 A
5674498 Inoue et al. Oct 1997 A
5704897 Truppe Jan 1998 A
5718241 Ben-Haim et al. Feb 1998 A
5730129 Darrow et al. Mar 1998 A
5740808 Panescu et al. Apr 1998 A
5765561 Chen et al. Jun 1998 A
5769789 Wang et al. Jun 1998 A
5769861 Vilsmeier Jun 1998 A
5771306 Stork et al. Jun 1998 A
5787866 Sugiyama et al. Aug 1998 A
5787886 Kelly et al. Aug 1998 A
5803089 Ferre et al. Sep 1998 A
5814022 Antanavich et al. Sep 1998 A
5814066 Spotnitz Sep 1998 A
5833608 Acker Nov 1998 A
5840025 Ben-Haim Nov 1998 A
5868673 Vesely Feb 1999 A
5899672 Salamey May 1999 A
5928248 Acker Jul 1999 A
5951461 Nyo et al. Sep 1999 A
5978696 VomLehn et al. Nov 1999 A
6016439 Acker Jan 2000 A
6019724 Gronningsaeter et al. Feb 2000 A
6026173 Svenson et al. Feb 2000 A
6078175 Foo Jun 2000 A
6122538 Sliwa, Jr. et al. Sep 2000 A
6122541 Cosman et al. Sep 2000 A
6132396 Antanavich et al. Oct 2000 A
6144875 Schweikard et al. Nov 2000 A
6167296 Shahidi Dec 2000 A
6173201 Front Jan 2001 B1
6188355 Gilboa Feb 2001 B1
6198959 Wang Mar 2001 B1
6201987 Dumoulin Mar 2001 B1
6226543 Gilboa et al. May 2001 B1
6226548 Foley et al. May 2001 B1
6233476 Strommer et al. May 2001 B1
6235038 Hunter et al. May 2001 B1
6236875 Bucholz et al. May 2001 B1
6246896 Dumoulin et al. Jun 2001 B1
6246898 Vesely et al. Jun 2001 B1
6253770 Acker et al. Jul 2001 B1
6267769 Truwit Jul 2001 B1
6275560 Blake et al. Aug 2001 B1
6282442 DeStefano et al. Aug 2001 B1
6285902 Kienzle, III et al. Sep 2001 B1
6298259 Kucharczyk et al. Oct 2001 B1
6314310 Ben-Haim et al. Nov 2001 B1
6314311 Williams et al. Nov 2001 B1
6314312 Wessels et al. Nov 2001 B1
6317616 Glossop Nov 2001 B1
6317619 Boernert et al. Nov 2001 B1
6330356 Sundareswaran et al. Dec 2001 B1
6332089 Acker et al. Dec 2001 B1
6332891 Himes Dec 2001 B1
6335617 Osadchy et al. Jan 2002 B1
6335623 Damadian et al. Jan 2002 B1
6340363 Bolger et al. Jan 2002 B1
6341231 Ferre et al. Jan 2002 B1
6347240 Foley et al. Feb 2002 B1
6348058 Melkent et al. Feb 2002 B1
6351573 Schneider Feb 2002 B1
6351659 Vilsmeier Feb 2002 B1
6361759 Frayne et al. Mar 2002 B1
6362821 Gibson et al. Mar 2002 B1
6368331 Front et al. Apr 2002 B1
6369571 Damadian et al. Apr 2002 B1
6369574 Ederlov et al. Apr 2002 B1
6373998 Thirion et al. Apr 2002 B2
6379302 Kessman et al. Apr 2002 B1
6380732 Gilboa Apr 2002 B1
6381485 Hunter et al. Apr 2002 B1
6402762 Hunter et al. Jun 2002 B2
6418238 Shiratani et al. Jul 2002 B1
6421551 Kuth et al. Jul 2002 B1
6424856 Vilsmeier et al. Jul 2002 B1
6425865 Salcudean et al. Jul 2002 B1
6428328 Haba et al. Aug 2002 B2
6430430 Gosche Aug 2002 B1
6434415 Foley et al. Aug 2002 B1
6434507 Clayton et al. Aug 2002 B1
6437571 Danby et al. Aug 2002 B1
6442417 Shahidi et al. Aug 2002 B1
6445186 Damadian et al. Sep 2002 B1
6445943 Ferre et al. Sep 2002 B1
6455182 Silver Sep 2002 B1
6461372 Jensen et al. Oct 2002 B1
6468265 Evans et al. Oct 2002 B1
6469508 Damadian et al. Oct 2002 B1
6470066 Takagi et al. Oct 2002 B2
6470207 Simon et al. Oct 2002 B1
6473032 Trimble Oct 2002 B1
6473635 Rasche Oct 2002 B1
6477400 Barrick Nov 2002 B1
6478793 Cosman et al. Nov 2002 B1
6478802 Kienzle, III et al. Nov 2002 B2
6483948 Spink et al. Nov 2002 B1
6484049 Seeley et al. Nov 2002 B1
6485413 Boppart et al. Nov 2002 B1
D466609 Glossop Dec 2002 S
D466610 Ashton et al. Dec 2002 S
6490467 Bucholz et al. Dec 2002 B1
6490475 Seeley et al. Dec 2002 B1
6490477 Zylka et al. Dec 2002 B1
6491699 Henderson et al. Dec 2002 B1
6491702 Heilbrun et al. Dec 2002 B2
6493574 Ehnholm et al. Dec 2002 B1
6496007 Damadian et al. Dec 2002 B1
6501981 Schweikard et al. Dec 2002 B1
6504893 Flohr et al. Jan 2003 B1
6504894 Pan et al. Jan 2003 B2
6516213 Nevo Feb 2003 B1
6517485 Torp et al. Feb 2003 B2
6527443 Vilsmeier et al. Mar 2003 B1
6535756 Simon et al. Mar 2003 B1
6538634 Chui et al. Mar 2003 B1
6539127 Roche et al. Mar 2003 B1
6541947 Dittmer et al. Apr 2003 B1
6541973 Danby et al. Apr 2003 B1
6544041 Damadian Apr 2003 B1
6547782 Taylor Apr 2003 B1
6558333 Gilboa et al. May 2003 B2
6562059 Edwards et al. May 2003 B2
6567687 Front et al. May 2003 B2
6580938 Acker Jun 2003 B1
6584174 Schubert et al. Jun 2003 B2
6584339 Galloway, Jr. et al. Jun 2003 B2
6591130 Shahidi Jul 2003 B2
6593884 Gilboa et al. Jul 2003 B1
6606513 Lardo et al. Aug 2003 B2
6609022 Vilsmeier et al. Aug 2003 B2
6615155 Gilboa Sep 2003 B2
6636757 Jascob et al. Oct 2003 B1
6650924 Kuth et al. Nov 2003 B2
6666579 Jensen Dec 2003 B2
6674833 Shahidi et al. Jan 2004 B2
6675032 Chen et al. Jan 2004 B2
6675033 Lardo et al. Jan 2004 B1
6687531 Ferre et al. Feb 2004 B1
6690386 Edelson et al. Feb 2004 B2
6690960 Chen et al. Feb 2004 B2
6694167 Ferre et al. Feb 2004 B1
6697664 Kienzle, III et al. Feb 2004 B2
6702780 Gilboa et al. Mar 2004 B1
6711429 Gilboa et al. Mar 2004 B1
6714629 Vilsmeier Mar 2004 B2
6714810 Grzeszczuk et al. Mar 2004 B2
6725080 Melkent et al. Apr 2004 B2
6738656 Ferre et al. May 2004 B1
6774624 Anderson et al. Aug 2004 B2
6782287 Grzeszczuk et al. Aug 2004 B2
6796988 Melkent et al. Sep 2004 B2
6799569 Danielsson et al. Oct 2004 B2
6823207 Jensen et al. Nov 2004 B1
6826423 Hardy et al. Nov 2004 B1
6833814 Gilboa et al. Dec 2004 B2
6850794 Shahidi Feb 2005 B2
6856826 Seeley et al. Feb 2005 B2
6856827 Seeley et al. Feb 2005 B2
6892090 Verard et al. May 2005 B2
6898303 Armato, III et al. May 2005 B2
6907281 Grzeszczuk Jun 2005 B2
6920347 Simon et al. Jul 2005 B2
6925200 Wood et al. Aug 2005 B2
6934575 Ferre et al. Aug 2005 B2
6947788 Gilboa et al. Sep 2005 B2
6968224 Kessman et al. Nov 2005 B2
6978166 Foley et al. Dec 2005 B2
6992477 Govari Jan 2006 B2
6996430 Gilboa et al. Feb 2006 B1
7015859 Anderson Mar 2006 B2
7015907 Tek et al. Mar 2006 B2
7035683 Guendel Apr 2006 B2
7050845 Vilsmeier May 2006 B2
7063660 Chen et al. Jun 2006 B2
7139601 Bucholz et al. Nov 2006 B2
7153297 Peterson Dec 2006 B2
7171257 Thomson Jan 2007 B2
7174201 Govari et al. Feb 2007 B2
7233820 Gilboa Jun 2007 B2
7260426 Schweikard et al. Aug 2007 B2
7302295 Stahmann et al. Nov 2007 B2
7339587 Kropfeld Mar 2008 B2
7356367 Liang et al. Apr 2008 B2
7366562 Dukesherer et al. Apr 2008 B2
7371067 Anderson et al. May 2008 B2
7398116 Edwards Jul 2008 B2
7505806 Masutani et al. Mar 2009 B2
7555330 Gilboa et al. Jun 2009 B2
7594925 Danek et al. Sep 2009 B2
7599730 Hunter et al. Oct 2009 B2
7641609 Ohnishi et al. Jan 2010 B2
7659912 Akimoto et al. Feb 2010 B2
7697972 Verard et al. Apr 2010 B2
7756563 Higgins et al. Jul 2010 B2
7889905 Higgins et al. Feb 2011 B2
7901348 Soper et al. Mar 2011 B2
7969143 Gilboa Jun 2011 B2
7985187 Wibowo et al. Jul 2011 B2
7998062 Gilboa Aug 2011 B2
8016749 Clerc et al. Sep 2011 B2
8046052 Verard et al. Oct 2011 B2
8048777 Eguchi et al. Nov 2011 B2
8049777 Akimoto et al. Nov 2011 B2
8064669 Higgins et al. Nov 2011 B2
8102416 Ito et al. Jan 2012 B2
8150138 Ohnishi Apr 2012 B2
8202213 Ito et al. Jun 2012 B2
8218846 Trumer et al. Jul 2012 B2
8218847 Averbuch et al. Jul 2012 B2
8219179 Ganatra et al. Jul 2012 B2
8317149 Greenburg et al. Nov 2012 B2
8382662 Soper et al. Feb 2013 B2
8433159 Nord et al. Apr 2013 B1
8468003 Gibbs et al. Jun 2013 B2
8483801 Edwards Jul 2013 B2
8494246 Trumer et al. Jul 2013 B2
8494612 Vetter et al. Jul 2013 B2
8611983 Glossop Dec 2013 B2
8611984 Greenburg et al. Dec 2013 B2
8632461 Glossop Jan 2014 B2
8672836 Higgins et al. Mar 2014 B2
8675935 Higgins et al. Mar 2014 B2
8696548 Gilboa Apr 2014 B2
8696685 Gilboa Apr 2014 B2
8700132 Ganatra et al. Apr 2014 B2
9218179 Huff, II et al. Dec 2015 B2
20010007918 Vilsmeier et al. Jul 2001 A1
20010007919 Shahidi Jul 2001 A1
20010025142 Wessels et al. Sep 2001 A1
20010029333 Shahidi Oct 2001 A1
20010031919 Strommer et al. Oct 2001 A1
20010031985 Gilboa et al. Oct 2001 A1
20010036245 Kienzle, III et al. Nov 2001 A1
20010041835 Front et al. Nov 2001 A1
20020044631 Graumann et al. Apr 2002 A1
20020049375 Strommer et al. Apr 2002 A1
20020049378 Grzeszczuk et al. Apr 2002 A1
20020070970 Wood et al. Jun 2002 A1
20020075994 Shahidi et al. Jun 2002 A1
20020077543 Grzeszczuk et al. Jun 2002 A1
20020077544 Shahidi Jun 2002 A1
20020082492 Grzeszczuk Jun 2002 A1
20020085681 Jensen Jul 2002 A1
20020140708 Sauer Oct 2002 A1
20020143317 Glossop Oct 2002 A1
20020161295 Edwards et al. Oct 2002 A1
20030000535 Galloway, Jr. et al. Jan 2003 A1
20030004411 Govari et al. Jan 2003 A1
20030016852 Kaufman et al. Jan 2003 A1
20030018251 Solomon Jan 2003 A1
20030023161 Govari et al. Jan 2003 A1
20030028091 Simon et al. Feb 2003 A1
20030029464 Chen et al. Feb 2003 A1
20030032878 Shahidi Feb 2003 A1
20030040667 Feussner et al. Feb 2003 A1
20030074011 Gilboa et al. Apr 2003 A1
20030088179 Seeley et al. May 2003 A1
20030125622 Schweikard et al. Jul 2003 A1
20030130576 Seeley et al. Jul 2003 A1
20030139663 Graumann Jul 2003 A1
20030199785 Hibner et al. Oct 2003 A1
20030208116 Liang et al. Nov 2003 A1
20030208122 Melkent et al. Nov 2003 A1
20030216631 Bloch et al. Nov 2003 A1
20030220557 Cleary et al. Nov 2003 A1
20040006268 Gilboa et al. Jan 2004 A1
20040013548 Seto et al. Jan 2004 A1
20040034300 Verard et al. Feb 2004 A1
20040049121 Yaron Mar 2004 A1
20040076259 Jensen et al. Apr 2004 A1
20040092815 Schweikard et al. May 2004 A1
20040097805 Verard et al. May 2004 A1
20040097806 Hunter et al. May 2004 A1
20040116803 Jascob et al. Jun 2004 A1
20040122311 Cosman Jun 2004 A1
20040138548 Strommer et al. Jul 2004 A1
20040152970 Hunter et al. Aug 2004 A1
20040152974 Solomon Aug 2004 A1
20040167393 Solar et al. Aug 2004 A1
20040193042 Scampini et al. Sep 2004 A1
20040210125 Chen et al. Oct 2004 A1
20040249267 Gilboa Dec 2004 A1
20050010099 Raabe et al. Jan 2005 A1
20050020900 Yngvesson et al. Jan 2005 A1
20050027186 Chen et al. Feb 2005 A1
20050033149 Strommer et al. Feb 2005 A1
20050038337 Edwards Feb 2005 A1
20050065433 Anderson Mar 2005 A1
20050085718 Shahidi Apr 2005 A1
20050085793 Glossop Apr 2005 A1
20050107679 Geiger et al. May 2005 A1
20050107688 Strommer May 2005 A1
20050113809 Melkent et al. May 2005 A1
20050143651 Verard et al. Jun 2005 A1
20050169510 Zuhars et al. Aug 2005 A1
20050182295 Soper et al. Aug 2005 A1
20050182319 Glossop Aug 2005 A1
20050187482 OBrien et al. Aug 2005 A1
20050197568 Vass et al. Sep 2005 A1
20050203383 Moctezuma de la Barrera et al. Sep 2005 A1
20050234335 Simon et al. Oct 2005 A1
20050288574 Thornton et al. Dec 2005 A1
20050288578 Durlak Dec 2005 A1
20060004281 Saracen Jan 2006 A1
20060025677 Verard et al. Feb 2006 A1
20060025679 Viswanathan et al. Feb 2006 A1
20060045318 Schoisswohl et al. Mar 2006 A1
20060050988 Kraus et al. Mar 2006 A1
20060058647 Strommer et al. Mar 2006 A1
20060059842 McCafferty Babcock et al. Mar 2006 A1
20060063998 von Jako et al. Mar 2006 A1
20060064006 Strommer et al. Mar 2006 A1
20060074292 Thomson et al. Apr 2006 A1
20060074299 Sayeh Apr 2006 A1
20060074304 Sayeh Apr 2006 A1
20060079759 Vaillant et al. Apr 2006 A1
20060084867 Tremblay et al. Apr 2006 A1
20060093089 Vertatschitsch et al. May 2006 A1
20060094958 Marquart et al. May 2006 A1
20060106292 Anderson May 2006 A1
20060116634 Shachar Jun 2006 A1
20060122497 Glossop Jun 2006 A1
20060142798 Holman et al. Jun 2006 A1
20060173291 Glossop Aug 2006 A1
20060173356 Feilkas Aug 2006 A1
20060174269 Hansen-Turton Aug 2006 A1
20060189867 Revie et al. Aug 2006 A1
20060247511 Anderson Nov 2006 A1
20060253032 Altmann et al. Nov 2006 A1
20060258933 Ellis et al. Nov 2006 A1
20070032723 Glossop Feb 2007 A1
20070038058 West et al. Feb 2007 A1
20070055128 Glossop Mar 2007 A1
20070060799 Lyon et al. Mar 2007 A1
20070066881 Edwards et al. Mar 2007 A1
20070066887 Mire et al. Mar 2007 A1
20070110289 Fu et al. May 2007 A1
20070129629 Beauregard et al. Jun 2007 A1
20070159337 Tethrake et al. Jul 2007 A1
20070160312 Blaffert et al. Jul 2007 A1
20070167714 Kiraly et al. Jul 2007 A1
20070167738 Timinger et al. Jul 2007 A1
20070167744 Beauregard et al. Jul 2007 A1
20070167784 Shekhar et al. Jul 2007 A1
20070225553 Shahidi Sep 2007 A1
20070225559 Clerc et al. Sep 2007 A1
20070232896 Gilboa et al. Oct 2007 A1
20070244355 Shaw Oct 2007 A1
20070249896 Goldfarb et al. Oct 2007 A1
20070276180 Greenburg et al. Nov 2007 A1
20080071142 Gattani et al. Mar 2008 A1
20080071143 Gattani et al. Mar 2008 A1
20080118135 Averbuch et al. May 2008 A1
20080125760 Gilboa May 2008 A1
20080132757 Tgavalekos Jun 2008 A1
20080140114 Edwards et al. Jun 2008 A1
20080161640 Weisman Jul 2008 A1
20080167639 Gilboa Jul 2008 A1
20080207997 Higgins et al. Aug 2008 A1
20080228086 Ilegbusi et al. Sep 2008 A1
20080247622 Aylward et al. Oct 2008 A1
20080255416 Gilboa Oct 2008 A1
20080262297 Gilboa et al. Oct 2008 A1
20080262342 Averbruch Oct 2008 A1
20080262430 Anderson et al. Oct 2008 A1
20080269561 Banik et al. Oct 2008 A1
20080287803 Li et al. Nov 2008 A1
20090088600 Meloul Apr 2009 A1
20090092300 Jerebko et al. Apr 2009 A1
20090124883 Wibowo et al. May 2009 A1
20090156895 Higgins et al. Jun 2009 A1
20090156951 Averbuch Jun 2009 A1
20090209817 Averbuch Aug 2009 A1
20090227861 Ganatra et al. Sep 2009 A1
20090240140 Fitelzon et al. Sep 2009 A1
20090240198 Averbuch Sep 2009 A1
20090284255 Zur Nov 2009 A1
20100030063 Lee et al. Feb 2010 A1
20100041949 Tolkowsky Feb 2010 A1
20100087705 Byers et al. Apr 2010 A1
20100160733 Gilboa Jun 2010 A1
20100183206 Carlsen et al. Jul 2010 A1
20100310146 Higgins et al. Dec 2010 A1
20110058721 Zhang et al. Mar 2011 A1
20110093243 Tawhai et al. Apr 2011 A1
20110166418 Aoyagi et al. Jul 2011 A1
20110184238 Higgins et al. Jul 2011 A1
20110282151 Trovato et al. Nov 2011 A1
20110311115 Li et al. Dec 2011 A1
20120059248 Holsing et al. Mar 2012 A1
20120123296 Hashimshony et al. May 2012 A1
20120288173 Rai et al. Nov 2012 A1
Related Publications (1)
Number Date Country
20190320876 A1 Oct 2019 US
Provisional Applications (4)
Number Date Country
61375523 Aug 2010 US
61375439 Aug 2010 US
61375484 Aug 2010 US
61375533 Aug 2010 US
Divisions (1)
Number Date Country
Parent 13215017 Aug 2011 US
Child 14182896 US
Continuations (1)
Number Date Country
Parent 14182896 Feb 2014 US
Child 16358882 US