Computed tomography enhanced fluoroscopic system, device, and method of utilizing the same

Information

  • Patent Grant
  • Patent Number
    11,871,913
  • Date Filed
    Monday, June 10, 2019
  • Date Issued
    Tuesday, January 16, 2024
Abstract
A system and method for enhanced navigation for use during a surgical procedure including planning a navigation path to a target using a first data set of computed tomography images previously acquired; navigating a marker placement device to the target using the navigation path; placing a plurality of markers in tissue proximate the target; acquiring a second data set of computed tomography images including the plurality of markers; planning a second navigation path to a second target using the second data set of computed tomography images; navigating a medical instrument to a second target; capturing fluoroscopic data of tissue proximate the target; and registering the fluoroscopic data to the second data set of computed tomography images based on marker position and orientation within the real-time fluoroscopic data and the second data set of computed tomography images.
Description
BACKGROUND
Technical Field

The present disclosure relates to a system, apparatus, and method of navigation and position confirmation for surgical procedures. More particularly, the present disclosure relates to a system and method for enhanced navigation of an extended working channel or catheter and one or more medical instruments positionable therethrough in one or more branched luminal networks of a patient and confirming placement of those medical instruments prior to initiating treatment or biopsy.


Description of Related Art

Microwave ablation is a commonly applied method for treating various maladies affecting organs including the liver, brain, heart, lung, and kidney. Commonly, one or more imaging modalities, whether magnetic resonance imaging, ultrasound imaging, computed tomography (CT), or others, will be employed by a clinician to identify areas of interest within the patient and ultimately targets for treatment. Once identified, an area of interest will typically require a biopsy using a biopsy tool to confirm whether treatment and/or observation are necessitated at a particular time. This biopsy is typically performed under one of a number of image guidance modalities, and/or in conjunction with a navigation system. If the biopsy reveals that the area of interest is malignant, it may prove useful to treat the area using microwave ablation.


Microwave ablation may be performed by transmitting microwave energy through a needle inserted percutaneously in the patient to ablate the area of interest. Alternatively, where practicable, an endoscopic approach can be undertaken, where, once navigated to the identified target, a flexible microwave ablation catheter can be placed in the target to ablate the area of interest. The endoscopic approach is particularly useful when treating luminal networks of the body such as the lungs.


To enable the endoscopic approach, for example in the lungs, endobronchial navigation systems have been developed that use CT image data to create a navigation plan to facilitate advancing a navigation catheter (or other suitable device) through a bronchoscope and a branch of the bronchus of a patient to the area of interest. Endobronchial navigation may be employed both in the diagnostic (i.e., biopsy) phase and the treatment phases. Electromagnetic tracking may be utilized in conjunction with the CT data to facilitate guiding the navigation catheter through the branch of the bronchus to the area of interest. In certain instances, the navigation catheter may be positioned within one of the airways of the branched luminal networks adjacent to or within the area of interest to provide access for one or more medical instruments.


Once the navigation catheter is in position, fluoroscopy may be used to visualize medical instruments including biopsy tools, such as, for example, brushes, needles and forceps, as well as treatment tools such as an ablation catheter, as they are passed through the navigation catheter and into the lung and to the area of interest. Conventional fluoroscopy is widely used during medical procedures as a visualization imaging tool for guiding medical instruments inside the human body. Although medical instruments like catheters, biopsy tools, etc., are clearly visible on a fluoroscopic picture, organic features such as soft tissue, blood vessels, suspicious tumor lesions etc., are either somewhat or completely transparent and thus hard to identify with conventional fluoroscopy.


During procedures, such as a biopsy or ablation, a fluoroscopic image may be used by a clinician to aid in visualizing the placement of a medical instrument within a patient's body. However, although the medical instrument is visible in the fluoroscopic image, the area of interest or target tissue is generally somewhat transparent and not necessarily clearly visible within the image. Moreover, fluoroscopy renders flat two-dimensional images in which it can be challenging to assess the three-dimensional position of the medical instrument. As such, the clinician is not provided with all the information that could be desired to visualize the placement of the medical instrument within the patient's body relative to the area of interest.


SUMMARY

As can be appreciated, a microwave ablation catheter that is positionable through one or more branched luminal networks of a patient to treat tissue may prove useful in the surgical arena.


Aspects of the present disclosure are described in detail with reference to the figures wherein like reference numerals identify similar or identical elements. As used herein, the term “distal” refers to the portion that is being described which is further from a user, while the term “proximal” refers to the portion that is being described which is closer to a user.


According to one aspect of the present disclosure, a method of enhanced navigation is provided including planning a navigation path to a target using a first data set of computed tomography images previously acquired, navigating a marker placement device to the target using the navigation path, placing a plurality of markers in tissue proximate the target, acquiring a second data set of computed tomography images including the plurality of markers, planning a second navigation path to a second target using the second data set of computed tomography images, navigating a medical instrument to the second target, capturing fluoroscopic data of tissue proximate the markers, and registering the fluoroscopic data to the second data set of computed tomography images based on marker position and/or orientation within the fluoroscopic data and the marker position and/or orientation within the second data set of computed tomography images.


A sample of the target tissue, such as tissue proximate the target, may be retrieved for biopsy or other purposes. Additionally, the method may further include displaying a representation of the second data set of computed tomography images and the fluoroscopic data on a graphical user interface. The first target and the second target may identify substantially the same area of interest. Further, at least a portion of the second data set of computed tomography images may be combined with the fluoroscopic data to generate a combined image for display on the graphical user interface. The combined image may be generated via superimposing, fusing, or overlaying the second data set of computed tomography images with the fluoroscopic data. The fluoroscopic data may be a fluoroscopic image, fluoroscopic images, or fluoroscopic video.


Additionally, the method may further include navigating a microwave ablation device to the target and activating the microwave ablation device to ablate tissue proximate the target. Additionally, the method may further include analyzing the fluoroscopic data and determining whether a medical instrument is correctly positioned relative to the target, and adjusting a position of the medical instrument relative to the target. A second fluoroscopic data set of the tissue proximate the target may also be acquired from a second perspective relative to a patient such that a three-dimensional position of the medical instrument is viewable from a different angle relative to the patient. The second fluoroscopic data set may also be analyzed to determine whether the three-dimensional position of the medical instrument relative to the target is correct, and if not, the three-dimensional position of the medical instrument relative to the target may be adjusted.
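The two-view confirmation described above amounts to triangulation: once the C-arm pose for each fluoroscopic perspective is known, the instrument's three-dimensional position can be recovered from its two 2D projections. The disclosure does not specify an algorithm; the following is a minimal linear (DLT) sketch, with hypothetical 3x4 projection matrices standing in for calibrated C-arm poses:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Recover a 3D point from its 2D projections in two calibrated
    fluoroscopic views via linear (DLT) triangulation.

    P1, P2 : 3x4 projection matrices for the two C-arm poses.
    uv1, uv2 : (u, v) image coordinates of the instrument tip in each view.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.array([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # Solve A @ X = 0 by SVD; the solution is the right singular vector
    # associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

In practice the projection matrices would come from the imaging device's calibration at each acquisition angle; here they are assumptions for illustration.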


In yet another aspect of the present disclosure a non-transitory computer readable storage medium is provided including instructions that when executed by a computing device, cause the computing device to plan a navigation path to a target using a first data set of computed tomography images previously acquired, navigate a marker placement device to the target using the navigation path, acquire a second data set of computed tomography images including a plurality of markers previously placed in tissue proximate the target, plan a second navigation path to a second target using the second data set of computed tomography images, navigate a medical instrument to the second target using the second navigation path, capture fluoroscopic data of tissue proximate the plurality of markers using a fluoroscope, and register the fluoroscopic data to the second data set of computed tomography images based on marker position and/or orientation within the fluoroscopic data and marker position and/or orientation within the second data set of computed tomography images.


The first target and the second target may identify substantially the same area of interest. A sample of the target, such as tissue proximate the target, may be retrieved for biopsy or other purposes. Additionally, the computing device may further display a representation of the second data set of computed tomography images and the fluoroscopic data on a graphical user interface. Further, at least a portion of the second data set of computed tomography images may be combined with the fluoroscopic data to generate a combined image for display on the graphical user interface. The combined image may be generated via superimposing, fusing, or overlaying the second data set of computed tomography images with the fluoroscopic data. The fluoroscopic data may be a fluoroscopic image, fluoroscopic images, or fluoroscopic video.


Additionally, the computing device may further enable navigation of a microwave ablation device to the target and activation of the microwave ablation device to ablate tissue proximate the target. Additionally, the computing device may further analyze the fluoroscopic data and determine whether the medical instrument is correctly positioned relative to the target. A second fluoroscopic data set of the first or second target may also be acquired from a second perspective relative to the patient such that a three-dimensional position of the medical instrument is viewable from a different angle. The second fluoroscopic data set may also be analyzed to determine whether the three-dimensional position of the medical instrument relative to the target tissue is correct, and if not, the three-dimensional position of the medical instrument relative to the target may be adjusted.





BRIEF DESCRIPTION OF THE DRAWINGS

Various aspects and embodiments of the present disclosure are described hereinbelow with references to the drawings, wherein:



FIG. 1 depicts a portion of a user interface with navigational data from a navigation plan overlaid on a live fluoroscopic image;



FIG. 2 is a perspective view of one illustrative embodiment of an electromagnetic navigation (EMN) system in accordance with the present disclosure;



FIG. 3 is an end view of a fluoroscopic imaging C-arm incorporated in the EMN system of FIG. 2;



FIG. 4 is a flow chart of a method for performing a procedure with enhanced navigation using the system of FIG. 3 in accordance with the instant disclosure;



FIG. 5 is a flow chart of a method for performing enhanced navigation using the system of FIG. 3 in accordance with the instant disclosure;



FIG. 6 is an illustration of an example fluoroscopic image/video captured by a C-arm showing markers and an extended working channel of a catheter assembly positioned within a target region of a patient in accordance with the instant disclosure; and



FIG. 7 is a flow chart of a method for adjusting the position of a medical instrument relative to a target in accordance with the instant disclosure.





DETAILED DESCRIPTION

The present disclosure is generally directed to addressing the navigational and location confirmatory shortcomings of the previously known navigation and fluoroscopic imaging confirmation methods and devices. According to one embodiment of the present disclosure, following navigation of a catheter to an area of interest, a fluoroscopic image (or series of fluoroscopic images) is captured. By registering the location of markers previously placed within the patient and captured in the fluoroscopic image to the location of markers which appear in 3D model data generated from a previously acquired CT image data set, the fluoroscopic image can be overlaid with data from the 3D model data including target location data, navigation pathway data, luminal network data and more.
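The registration step above hinges on matching marker locations across modalities. As an illustrative sketch only (the disclosure does not give the algorithm), a 3x4 projection matrix mapping CT-space marker positions to their fluoroscopic image positions can be estimated by direct linear transform; once estimated, any planning data (target location, pathway) can be projected into the fluoroscopic frame for overlay:

```python
import numpy as np

def estimate_projection(pts3d, pts2d):
    """Estimate a 3x4 projection matrix mapping CT-space marker positions
    to their fluoroscopic image positions (DLT; needs >= 6 markers)."""
    A = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    # Null-space solution of the stacked constraints, up to scale.
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    return Vt[-1].reshape(3, 4)

def project(P, pts3d):
    """Project CT-space points (e.g., target or pathway data) into the image."""
    h = np.hstack([np.asarray(pts3d, float), np.ones((len(pts3d), 1))])
    x = h @ P.T
    return x[:, :2] / x[:, 2:3]
```

The marker count, correspondence matching, and distortion handling of a clinical system are all elided here; this shows only the geometric core.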


Detailed embodiments of the present disclosure are disclosed herein. However, the disclosed embodiments are merely examples of the disclosure, which may be embodied in various forms and aspects. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present disclosure in virtually any appropriately detailed structure.



FIG. 1 depicts the image outcome of one embodiment of the present disclosure. In FIG. 1, a composite fluoroscopic image 10 is displayed. The composite fluoroscopic image 10 may be presented on a display as an additional view of an Electromagnetic Navigation (EMN) system 100 (FIG. 2) used for navigation. Alternatively, the image may be presented on a fluoroscopic image viewer separate from the EMN system 100. The field of view of the fluoroscopic image 10 includes a distal portion of an extended working channel (EWC) 12 that has been maneuvered pursuant to a pathway plan, as will be described in greater detail below. The fluoroscopic image 10 is also overlaid with a variety of data originally developed and derived from navigation software. This additional data overlaid on the fluoroscopic image 10 includes a target 14, a pathway plan 16, luminal pathways of the area being imaged 18, and markers 20. With this enhanced fluoroscopic image 10, a clinician can visualize in real time the final placement of the EWC 12 in relation to the pathway plan 16, the target 14, and the markers 20 to ensure accurate final placement, and can discern whether there is any unintended movement of the EWC 12 as a result of tool exchanges into and out of the EWC 12.
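A composite such as image 10 could, for instance, be produced by alpha-blending rendered planning annotations onto the fluoroscopic frame. The following minimal sketch assumes the annotations (target, pathway, markers) have already been rasterized into an RGB overlay; this rendering pipeline is an assumption, not the patent's disclosed implementation:

```python
import numpy as np

def composite(fluoro, overlay, alpha=0.4):
    """Alpha-blend planning data, rendered into an RGB overlay, onto a
    grayscale fluoroscopic frame. Pixels where the overlay is black
    (no annotation) are left untouched."""
    base = np.repeat(fluoro[..., None], 3, axis=2).astype(float)
    mask = overlay.any(axis=2, keepdims=True)  # annotated pixels only
    out = np.where(mask, (1 - alpha) * base + alpha * overlay, base)
    return out.astype(np.uint8)
```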



FIG. 2 depicts an aspect of an EMN system 100 configured for reviewing CT image data to identify one or more targets 14, planning a pathway to an identified target 14 (planning phase), navigating an EWC 12 to the target 14 (navigation phase) via a user interface, and confirming placement of the EWC 12 relative to the target 14. One such EMN system is the ELECTROMAGNETIC NAVIGATION BRONCHOSCOPY® system currently sold by Covidien LP. The target 14 is a computer generated representation, created during the planning phase, of the tissue of interest identified by review of the CT image data. As described above, following navigation, a medical instrument such as a biopsy tool may be inserted into the EWC 12 to obtain a tissue sample from the tissue located at, or proximate to, the target 14.


As shown in FIG. 2, EWC 12 is part of a catheter guide assembly 40. In practice, the EWC 12 is inserted into bronchoscope 30 for access to a luminal network of the patient “P.” Specifically, EWC 12 of catheter guide assembly 40 may be inserted into a working channel of bronchoscope 30 for navigation through a patient's luminal network. A locatable guide (LG) 32, including a sensor 44, is inserted into the EWC 12 and locked into position such that the sensor 44 extends a desired distance beyond the distal tip of the EWC 12. The position and orientation (6 DOF) of the sensor 44 relative to the reference coordinate system, and thus the distal end of the EWC 12, within an electromagnetic field can be derived. Catheter guide assemblies 40 are currently marketed and sold by Covidien LP under the brand names SUPERDIMENSION® Procedure Kits or EDGE™ Procedure Kits, and are contemplated as useable with the present disclosure. For a more detailed description of the catheter guide assemblies 40, reference is made to commonly-owned U.S. patent application Ser. No. 13/836,203 filed on Mar. 15, 2013 by Ladtkow et al., and U.S. Pat. No. 7,233,820, the entire contents of both of which are hereby incorporated by reference.


EMN system 100 generally includes an operating table 20 configured to support a patient “P”; a bronchoscope 30 configured for insertion through the patient's “P's” mouth into the patient's “P's” airways; monitoring equipment 120 coupled to bronchoscope 30 (e.g., a video display, for displaying the video images received from the video imaging system of bronchoscope 30); a tracking system 50 including a tracking module 52, a plurality of reference sensors 54, and a transmitter mat 56; and a computing device 125 including software and/or hardware used to facilitate identification of a target 14, pathway planning to the target 14, navigation of a medical instrument to the target 14, and confirmation of placement of an EWC 12, or a suitable device therethrough, relative to the target 14.



FIG. 3 depicts another view of the EMN system 100, including a fluoroscopic imaging device 110 capable of acquiring fluoroscopic or x-ray images or video of the patient “P.” The images, series of images, or video captured may be stored within the imaging device 110 or transmitted to computing device 125 for storage, processing, and display. Additionally, the imaging device 110 may rotate about the patient “P” so that images may be acquired from different angles or perspectives relative to the patient “P.” Imaging device 110 may include a single imaging device or more than one imaging device. In embodiments including multiple imaging devices, each imaging device may be a different type of imaging device or the same type. Further details regarding the imaging device 110 are described in U.S. Pat. No. 8,565,858, which is incorporated by reference in its entirety herein.


Computing device 125 may be any suitable computing device including a processor and storage medium, wherein the processor is capable of executing instructions stored on the storage medium. The computing device 125 may further include a database configured to store patient data, CT data sets including CT images, fluoroscopic data sets including fluoroscopic images and video, navigation plans, and any other such data. Although not explicitly illustrated, the computing device 125 may include inputs, or may otherwise be configured to receive, CT data sets and other data described herein. Additionally, computing device 125 includes a display configured to display graphical user interfaces such as those described below. Computing device 125 may be connected to one or more networks through which one or more databases may be accessed.


With respect to the planning phase, computing device 125 utilizes computed tomographic (CT) image data for generating and viewing a three-dimensional model of the patient's “P's” airways, enables the identification of a target 14 on the three-dimensional model (automatically, semi-automatically, or manually), and allows for determining a pathway through the patient's “P's” airways to tissue located at the target 14. More specifically, the CT scans are processed and assembled into a three-dimensional CT volume, which is then utilized to generate a three-dimensional model of the patient's “P's” airways. The three-dimensional model may be displayed on a display associated with computing device 125, or in any other suitable fashion. Using computing device 125, various views of the three-dimensional model or two-dimensional images generated from the three-dimensional model are presented. The three-dimensional model may be manipulated to facilitate identification of target 14 on the three-dimensional model or two-dimensional images, and selection of a suitable pathway through the patient's “P's” airways to access tissue located at the target 14 can be made. Once selected, the pathway plan, 3D model, and images derived therefrom can be saved and exported to a navigation system for use during the navigation phase(s). One such planning software is the ILOGIC® planning suite currently sold by Covidien LP.
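The pathway determination described above can be viewed as a shortest-path search over the branched airway model. The following toy sketch uses Dijkstra's algorithm over a hypothetical branch graph; the branch names and lengths are invented for illustration and do not come from the disclosure:

```python
import heapq

def plan_pathway(branches, start, target_branch):
    """Shortest pathway through a branched luminal network ('BT') from
    the trachea to the branch nearest the target, via Dijkstra's algorithm.

    branches : dict mapping branch id -> list of (child id, length-mm) pairs
    """
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == target_branch:
            break
        if d > dist.get(node, float('inf')):
            continue  # stale heap entry
        for child, length in branches.get(node, []):
            nd = d + length
            if nd < dist.get(child, float('inf')):
                dist[child] = nd
                prev[child] = node
                heapq.heappush(heap, (nd, child))
    # Walk back from the target to recover the ordered pathway.
    path, node = [target_branch], target_branch
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]
```

A clinical planner would additionally weight candidate branches by diameter and proximity to the identified target; only the graph-search core is shown.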


With respect to the navigation phase, a six degrees-of-freedom electromagnetic tracking system 50, e.g., similar to those disclosed in U.S. Pat. Nos. 8,467,589, 6,188,355, and published PCT Application Nos. WO 00/10456 and WO 01/67035, the entire contents of each of which is incorporated herein by reference, or other suitable positioning measuring system, is utilized for performing registration of the images and the pathway and navigation, although other configurations are also contemplated. Tracking system 50 includes a tracking module 52, a plurality of reference sensors 54, and a transmitter mat 56. Tracking system 50 is configured for use with a locatable guide 32 and particularly sensor 44. As described above, locatable guide 32 and sensor 44 are configured for insertion through an EWC 12 into a patient's “P's” airways (either with or without bronchoscope 30) and are selectively lockable relative to one another via a locking mechanism.


As shown in FIGS. 2 and 3, transmitter mat 56 is positioned beneath patient “P.” Transmitter mat 56 generates an electromagnetic field around at least a portion of the patient “P” within which the position of a plurality of reference sensors 54 and the sensor element 44 can be determined with use of a tracking module 52. One or more of reference sensors 54 are attached to the chest of the patient “P.” The six degrees of freedom coordinates of reference sensors 54 are sent to computing device 125 (which includes the appropriate software) where they are used to calculate a patient coordinate frame of reference. Registration, as detailed below, is generally performed to coordinate locations of the three-dimensional model and two-dimensional images from the planning phase with the patient's “P's” airways as observed through the bronchoscope 30, and allow for the navigation phase to be undertaken with precise knowledge of the location of the sensor 44, even in portions of the airway where the bronchoscope 30 cannot reach. Further details of such a registration technique and its implementation in luminal navigation can be found in U.S. Patent Application Pub. No. 2011/0085720, the entire contents of which are incorporated herein by reference, although other suitable techniques are also contemplated.


Registration of the patient “P's” location on the transmitter mat 56 is performed by moving LG 32 through the airways of the patient “P.” More specifically, data pertaining to locations of sensor element 44, while locatable guide 32 is moving through the airways, is recorded using transmitter mat 56, reference sensors 54, and tracking module 52. A shape resulting from this location data is compared to an interior geometry of passages of the three-dimensional model generated in the planning phase, and a location correlation between the shape and the three-dimensional model is determined based on the comparison, e.g., utilizing the software on computing device 125. In addition, the software identifies non-tissue space (e.g., air-filled cavities) in the three-dimensional model. The software aligns, or registers, an image representing a location of sensor 44 with the three-dimensional model and two-dimensional images generated from the three-dimensional model, which are based on the recorded location data and an assumption that locatable guide 32 remains located in non-tissue space in the patient's “P's” airways. Alternatively, a manual registration technique may be employed by navigating the bronchoscope 30 with the sensor 44 to pre-specified locations in the lungs of the patient “P”, and manually correlating the images from the bronchoscope to the model data of the 3D model.
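The shape-to-model comparison described above is, in spirit, a rigid point-set registration. One inner step of such a scheme is the Kabsch/Procrustes alignment, sketched below under the assumption that correspondences between recorded sensor locations and model points are already known; a full survey registration would iterate this with nearest-point matching, as in ICP:

```python
import numpy as np

def register_rigid(sensor_pts, model_pts):
    """One rigid-alignment step between recorded sensor locations and
    matched points on the 3D airway model (Kabsch/Procrustes).
    Returns (R, t) such that R @ sensor + t ~ model."""
    A = np.asarray(sensor_pts, float)
    B = np.asarray(model_pts, float)
    ca, cb = A.mean(0), B.mean(0)
    # Cross-covariance of the centered point sets.
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflection
    R = Vt.T @ np.diag([1, 1, d]) @ U.T
    t = cb - R @ ca
    return R, t
```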


Following registration of the patient “P” to the image data and pathway plan, a user interface is displayed in the navigation software which sets forth the pathway that the clinician is to follow to reach the target 14. One such navigation software is the ILOGIC® navigation suite currently sold by Covidien LP.


Once EWC 12 has been successfully navigated proximate the target 14 as depicted on the user interface, the locatable guide 32 may be unlocked from EWC 12 and removed, leaving EWC 12 in place as a guide channel for guiding medical instruments including without limitation, optical systems, ultrasound probes, biopsy tools, ablation tools (i.e., microwave ablation devices), laser probes, cryogenic probes, sensor probes, and aspirating needles to the target 14.


Having described the components of system 100 depicted in FIGS. 2 and 3, the following description of FIGS. 4-7 provides an exemplary workflow of using the components of system 100 in conjunction with CT imaging to achieve the result depicted in FIG. 1. FIGS. 4-7 enable a method of identifying a target 14 and a pathway to the target 14 utilizing computed tomographic (“CT”) images and, once identified, further enable the use of a navigation or guidance system to position the EWC 12 of a catheter guide assembly 40, and a medical instrument positioned therethrough, relative to the target 14. In addition, the following enables accurate live image confirmation of the location of the EWC 12 before, during, and after treatment.


CT image data facilitates the identification of a target 14, planning of a pathway to an identified target 14, as well as providing the ability to navigate through the body to the target 14 via a user interface. This includes a preoperative component and an operative component (i.e., pathway planning and pathway navigation) as will be described in further detail below. Live fluoroscopic visualization of the placement of the EWC 12 and/or medical instruments positioned therethrough, relative to the target 14 is enabled, thus enabling the clinician to actually see the proper placement of the device relative to the target 14 in real time using a combination of live fluoroscopic data and the CT image data (or selected portions thereof). Once placement of the medical instrument/EWC 12 is confirmed within the target 14, a surgical treatment or diagnostic sampling may be performed. For example, microwave energy can be transmitted to an ablation device positioned through EWC 12 to treat tissue located at the target 14.


Following treatment of tissue located at the target 14, the live fluoroscopic imaging may be utilized to confirm, for example, that a suitable ablation zone has been formed around the tissue and whether additional application of energy is necessary. These steps of treating and imaging may be repeated iteratively until a determination is made that the tissue located at the target 14 has been successfully treated. Moreover, the methodology described above using the imaging modalities to confirm the extent of treatment and determine whether additional application of energy is necessary can be combined with radiometry and temperature sensing techniques to both confirm what is depicted by the imaging modality and to assist in determining treatment cessation points.


Turning now to FIGS. 4-7, methods for performing enhanced navigation using system 100 will now be described in particular detail. Although the methods are illustrated and described as occurring in a particular order and requiring particular steps, any of the methods may include some or all of the steps and may be implemented in any order not specifically described.


With particular reference to FIG. 4, a method for performing enhanced navigation is illustrated and will be described as method 400. Method 400 begins with the pathway planning step 401. In embodiments, the pathway planning step 401 includes acquiring a first set of CT images for generation of a first CT data set. However, the acquisition of the CT images and/or the generating of the CT data set may be completed prior to the pathway planning step 401 in which the pre-acquired CT data set is uploaded into system 100. In embodiments, the pathway planning step 401 includes three general steps. The first step involves using software for generating and viewing a three-dimensional model of the bronchial airway tree (“BT”) and viewing the CT data to identify targets (i.e., target 14). The second step involves using the software for selection of a pathway on the BT to the identified target 14, either automatically, semi-automatically, or manually, if desired. Optionally, the pathway may be automatically segmented into a set of waypoints along the path that can be visualized on a display. In embodiments, a third step may include confirmation of the plan using a fly-through view, and then exporting the pathway plan for use in a navigation system. It is to be understood that the airways are being used herein as an example of a branched luminal network. Hence, the term “BT” is being used in a general sense to represent any such luminal network (e.g., the circulatory system, or the gastro-intestinal tract, etc.). Further details regarding the planning step are described in U.S. patent application Ser. No. 13/838,805, filed Mar. 15, 2013, the entire contents of which are incorporated by reference herein.
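The optional segmentation of the pathway into waypoints can be sketched as resampling the planned route at a fixed arc-length spacing; the polyline representation and spacing value below are illustrative assumptions, not details from the disclosure:

```python
import numpy as np

def segment_waypoints(path_pts, spacing):
    """Resample a planned pathway (a polyline of 3D points) into waypoints
    at a fixed arc-length spacing, keeping the first and last points."""
    pts = np.asarray(path_pts, float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])  # cumulative arc length
    targets = np.arange(0.0, s[-1], spacing)
    way = [np.array([np.interp(t, s, pts[:, k]) for k in range(3)])
           for t in targets]
    way.append(pts[-1])  # always include the pathway terminus
    return np.vstack(way)
```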


Method 400 then proceeds to a first navigation step 403. In step 403, using the plan developed in step 401, an EWC 12 is navigated to a target 14. Specifically, with reference back to FIGS. 1-3, the plan developed in step 401 is imported into computing device 125, or generated by computing device 125, and the plan is registered with the patient's “P's” location enabling a clinician to follow the plan within the patient's “P's” BT with EWC 12 and LG 32. A clinician follows the plan by advancing the bronchoscope 30, and once the bronchoscope 30 is wedged, advancing the EWC 12 of the catheter guide assembly 40 through the working channel of the bronchoscope 30 to the target 14. The location of the distal end of the EWC 12, where the LG 32 is located, is monitored by the tracking system 50 as it is advanced through the BT. Further details regarding the navigation are described in U.S. Pat. No. 7,233,820, the entire contents of which are hereby incorporated by reference in its entirety.


After navigating the EWC 12 proximate the target 14 (via the user interface), in 404 the EWC 12 is used in conjunction with marker placement tools and biopsy tools to place markers 20 in tissue located around the target 14 and, optionally, for the retrieval of biopsy samples of the tissue proximate target 14. As understood by those of skill in the art, and as described above, the target 14 is a computer generated representation, created during the planning phase, of the tissue of interest identified by review of the CT image data. Thus, markers are placed in, and biopsy samples may be taken from, the tissue of the patient “P” at the location the navigation system identifies as corresponding to the location of the target 14 in the pathway plan.


After the markers 20 are placed, the medical instrument used to place the markers 20, along with the EWC 12, is removed from the patient's “P's” BT and the method proceeds to step 405 where a second set of CT images is acquired for generating a second CT data set. The second CT data set acquired in step 405 includes CT images of the patient “P” including the markers 20 placed in step 404. This may be performed immediately or following cytopathologic examination of the biopsy samples.


Following acquisition of the second CT image set, analysis of any biopsy samples taken, and confirmation that further biopsy or treatment is necessary, a new pathway plan is developed by the clinician and a second navigation step 407 is performed, which includes navigating to the target 14 using a pathway plan generated from the second CT data set. This second pathway plan may selectively include data from the navigation plan generated in step 401 using the first CT data set. In step 407, the EWC 12 is navigated to the target 14 in a similar manner as in the first navigation step 403 and therefore will not be described in further detail.


Subsequent to navigating the EWC 12 to the target 14 in step 407, method 400 proceeds to step 409 to perform enhanced medical imaging and device placement. Specifically, after the EWC 12 is navigated to the target 14 in step 407, the LG 32 may again be removed from the EWC 12 and a medical instrument may be positioned proximate the target 14 via the EWC 12. Fluoroscopic imaging is undertaken and a composite fluoroscopic image 10 (FIG. 1) including data from the pathway plan is displayed to the clinician. Step 409 enables a clinician to verify the position of the medical instrument relative to the target 14 and make adjustments to the position of the medical instrument relative to the target 14 before performing a surgical procedure (e.g., retrieval of sample tissue, ablation of tissue, placement of additional markers). Details with respect to the enhanced medical device placement of step 409 will be described in further detail below with respect to method 500 in FIG. 5. Subsequent to performing the enhanced medical imaging and device placement in step 409, method 400 proceeds to step 411 where the medical instrument, properly positioned relative to the target 14, is used for its intended purpose (e.g., a microwave ablation device is activated to treat tissue, a biopsy tool retrieves a sample of tissue, a marker placement tool places the marker(s)).


Turning now to FIG. 5 and with reference to FIGS. 1-3, a method for performing enhanced navigation will be described in particular detail and will be referred to as method 500. Method 500 begins at step 501 after the EWC 12 is navigated to the target 14 following the second navigating step 407 of method 400 (FIG. 4). Method 500 may be used to confirm the placement of the EWC 12, or of any medical instrument positioned through the EWC 12, relative to the target 14, and to verify and adjust its position relative to the target 14 prior to performing a surgical procedure (e.g., retrieving a sample of the target tissue, ablating the target tissue).


In step 501, a real-time fluoroscopic image of the patient “P” is captured. FIG. 6 illustrates an example of a real-time fluoroscopic image 601 captured in step 501. The real-time fluoroscopic image 601 is captured using the imaging device 110 (FIG. 3). As seen in FIG. 6, the markers 20 placed in the proximity of the target 14 (step 404 of method 400) and the EWC 12 previously navigated to the target 14 in the pathway plan (step 407 of method 400) are visible in the captured fluoroscopic image 601. In embodiments, step 501 includes capturing a series of fluoroscopic images of the target region and/or a live fluoroscopic video stream.


In step 503 the fluoroscopic image 601 captured in step 501 is registered with the second CT data set acquired in step 405 of method 400. In embodiments, the registration of the fluoroscopic image 601 and the second CT data set is based on a comparison of the position and orientation of the markers 20 within the fluoroscopic image 601 and the position and orientation of the markers 20 within the second CT data set (not shown). Specifically, computing device 125 detects markers 20 in the CT images of the second CT data set using methods such as intensity thresholding or via manual identification by a clinician. Possible false indicators, such as calcifications or other metal objects visible in the CT images, may be detected and disregarded. In embodiments, the second CT data set may be displayed for a clinician to identify the markers 20 on a graphical user interface. Additionally, in step 503, the computing device 125 detects the markers 20 depicted in the fluoroscopic image(s) 601 acquired in step 501. For detection of markers 20 in the fluoroscopic image(s) 601, computing device 125 may employ techniques such as contrast detection, intensity detection, shape detection, minimum-axis detection, and/or any combination thereof. Additionally, computing device 125 may also detect the marker center and marker end points for each marker 20 detected. After detecting the markers 20 in the fluoroscopic image 601 acquired in step 501 and in the CT data set stored in computing device 125, computing device 125 registers the fluoroscopic image 601 with the CT data set by comparing one or more of the position, length, angle, orientation, and distance between each of the markers 20, or between all of the markers 20, in the fluoroscopic image 601 with the corresponding values in the CT data set.
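The intensity-thresholding marker detection described above can be sketched as a threshold followed by connected-component grouping. This is a minimal pure-Python/NumPy illustration under assumed conventions (bright radiopaque markers on a darker background); the function name and the 4-connected flood fill are illustrative choices, not the disclosed implementation.

```python
import numpy as np
from collections import deque

def detect_marker_centroids(image, threshold):
    """Detect markers as bright connected blobs; return (row, col) centroids."""
    img = np.asarray(image)
    mask = img > threshold
    visited = np.zeros_like(mask, dtype=bool)
    centroids = []
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not visited[r, c]:
                # Breadth-first flood fill collects one blob's pixels.
                queue, blob = deque([(r, c)]), []
                visited[r, c] = True
                while queue:
                    y, x = queue.popleft()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            queue.append((ny, nx))
                # Centroid of the blob approximates the marker center.
                centroids.append(tuple(np.mean(blob, axis=0)))
    return centroids
```

The marker end points mentioned in the passage could be recovered similarly, e.g., as the blob pixels farthest apart, though that refinement is omitted here.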


In step 507, the fluoroscopic image(s) 601 and/or video captured in step 501 are displayed on the display of computing device 125.


In step 509, computing device 125 analyzes the position and/or orientation of the markers 20 depicted in the fluoroscopic image 601 and performs a mathematical calculation to identify a 2D slice of the 3D model generated from the second CT data set such that one or more of the position, length, angle, orientation, and distance between each of the markers 20 or between all of the markers 20 in the identified 2D slice correspond with the same factors in the fluoroscopic image. This may be performed in conjunction with position and/or orientation data received from the imaging device 110. Once the 2D image from the CT data set corresponding to the fluoroscopic image is ascertained, the clinician may selectively identify what portions of the data included on the 2D image to incorporate into the displayed fluoroscopic image 601. Alternatively, data from the fluoroscopic image 601 may be incorporated into the 2D image from the CT data set. As an example, the target 14 which was identified in the CT data set during the planning phase may be available for selection. In addition, the pathway 16 and luminal network 18, as well as other data from the CT data set, may be available for selection. As a result, a clinician may select an object that is viewable in a CT image of the CT data set but not viewable in the fluoroscopic image 601 (e.g., a portion of soft tissue), such that the selection may be combined with the fluoroscopic image 601 to create a combined image 10 (FIG. 1).
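One way to realize the slice-matching calculation described above is to compare a rotation- and translation-invariant signature of the marker configuration, such as the sorted inter-marker distances, between the fluoroscopic image and each candidate 2D slice. The sketch below is an assumption-laden simplification: the helper names are hypothetical, and only the distance factor is used, whereas the passage also contemplates position, length, angle, and orientation.

```python
import numpy as np
from itertools import combinations

def distance_signature(points):
    """Sorted inter-marker distances; invariant to rotation and translation."""
    pts = np.asarray(points, dtype=float)
    return np.sort([np.linalg.norm(a - b) for a, b in combinations(pts, 2)])

def best_matching_slice(slice_marker_sets, fluoro_markers):
    """Return the index of the candidate 2D slice whose inter-marker
    distances best match those detected in the fluoroscopic image.

    slice_marker_sets: list of (M, 2) marker coordinate arrays, one per slice.
    fluoro_markers: (M, 2) marker coordinates from the fluoroscopic image.
    """
    target = distance_signature(fluoro_markers)
    errors = [np.mean(np.abs(distance_signature(s) - target))
              for s in slice_marker_sets]
    return int(np.argmin(errors))
```

A candidate slice whose markers are merely a shifted or rotated copy of the fluoroscopic configuration scores a zero error, which is the desired behavior for registration.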


In addition to permitting selection, the computing device 125 may also output an indicator of whether the markers 20 detected in the fluoroscopic image have been resolved in the CT data set. For example, in FIG. 1 each marker 20 is circumscribed by a line indicating that it has been positively identified. If the markers 20 are not resolved in the CT data set, this may indicate that the 2D image and the fluoroscopic image 601 are not actually registered to one another, and provides an indication to the clinician that another fluoroscopic image should be acquired before proceeding.


In step 511, with reference to FIG. 1, the combined or composite image 10 is displayed on the display of computing device 125 and/or another device. The combined image 10 displayed in step 511 includes the portion selected in step 509 (e.g., the target 14) and the fluoroscopic image(s) 601 (FIG. 6) or video displayed in step 507. The combined image 10 may be a fused image, an overlay of images, or any other display of multiple images and/or video known in the art. For example, as illustrated in FIG. 1, where a user selects the target 14 in an image of the CT data in step 509 (or when the target 14 is automatically selected in step 509), in step 511 the combined image 10 includes the fluoroscopic image 601 (FIG. 6) (including visibility of the markers 20 and EWC 12, as well as any medical instrument placed therein) and the selection from the image of the CT data set (the target 14). Using the registration between the fluoroscopic image(s) 601 and/or video and the CT data set from step 503, the system 100 determines where the selected portion (e.g., target 14) is to be positioned (e.g., overlaid, fused, etc.) within the fluoroscopic image 601 and/or video to create the combined image 10.
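A composite of the kind shown in FIG. 1 can be produced by any of the display techniques mentioned (fusion, overlay, etc.). As one minimal sketch, an alpha-blended overlay of a CT-derived target mask onto the registered fluoroscopic frame might look like the following; the function name, the grayscale-in-[0, 1] convention, and the `alpha` parameter are assumptions for illustration.

```python
import numpy as np

def composite_overlay(fluoro, target_mask, color=1.0, alpha=0.4):
    """Blend a CT-derived structure (e.g., the planned target) into a
    grayscale fluoroscopic frame.

    fluoro: 2D float array in [0, 1] (the registered fluoroscopic frame).
    target_mask: boolean array, True where the CT structure projects.
    """
    out = np.asarray(fluoro, dtype=float).copy()
    m = np.asarray(target_mask, dtype=bool)
    # Alpha-blend only inside the mask; the rest of the frame is untouched.
    out[m] = (1.0 - alpha) * out[m] + alpha * color
    return out
```

The registration from step 503 supplies where `target_mask` sits within the frame; the blend itself leaves the markers and instrument fully visible outside the masked region.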


In step 513, the position of the EWC 12, or the medical instrument positioned within the EWC 12, is adjusted relative to the target 14 and displayed using the combined image 10 generated in step 511. Further details regarding the adjustment in step 513 will be described below with reference to FIG. 7.


Turning now to FIG. 7, a method for adjusting the position/placement of the EWC 12, or the medical instrument positioned therein, will now be described and referred to as method 700. After the EWC 12 is navigated to the target 14, method 700 enables a clinician to verify that the medical instrument positioned within the EWC 12 of the catheter guide assembly 40 is properly positioned relative to the target 14, or to adjust the position of the medical instrument relative to the target 14 until it is properly positioned. Method 700 begins at step 701 where a medical instrument is positioned relative to a target 14 via the EWC 12.


In step 703, using imaging device 110, a fluoroscopic image/video is captured from a first angle. The fluoroscopic image/video captured in step 703 is transmitted to computing device 125 for display on a graphical user interface and for the generation of a combined image 10 (FIG. 1). Viewing the combined image 10, which displays both the target 14 and the medical instrument in real-time relative to the target 14, a clinician may determine whether the position of the medical instrument relative to the target 14 is correct (step 705). If the position of the medical instrument relative to the target 14 is correct (yes in step 705), then method 700 proceeds to step 707. Alternatively, if the position of the medical instrument is not correct (no in step 705), then method 700 proceeds to step 706.


In step 706, a clinician adjusts the position of the medical instrument by manipulating the catheter guide assembly 40 and therewith the EWC 12 and any medical instrument located therein. If the imaging device 110 is capturing live video, then the adjustment of the medical instrument/EWC 12 in step 706 is viewed in real time on the display of computing device 125 or any other suitable device. However, if the imaging device 110 is only capturing still images, then method 700 reverts back to step 703, where a new fluoroscopic image is captured displaying the new/adjusted position of the medical instrument/EWC 12. This process is repeated until the position of the medical instrument/EWC 12 is correct (yes in step 705). Once the position of the EWC 12 is correct (yes in step 705), method 700 proceeds to step 707.
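The capture-check-adjust cycle of steps 703 through 706 amounts to a simple feedback loop. A schematic sketch follows; every callable here is a stand-in for the clinician's actions and the imaging hardware (hypothetical names, not a real API), and the `max_iters` guard is an added safeguard not recited in the method.

```python
def position_until_correct(capture_image, position_ok, adjust, max_iters=10):
    """Image-guided placement loop: capture, check, adjust, repeat.

    capture_image: callable returning the current fluoroscopic frame (step 703).
    position_ok: callable(frame) -> bool, the clinician's check (step 705).
    adjust: callable nudging the catheter/instrument position (step 706).
    """
    for _ in range(max_iters):
        frame = capture_image()
        if position_ok(frame):
            return True   # position correct; proceed to the second angle (step 707)
        adjust()          # not correct; adjust and re-image
    return False          # bail out after max_iters attempts
```

With live video instead of still images, the loop body collapses: the clinician adjusts continuously while watching the stream, and only the `position_ok` check gates progression to step 707.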


In step 707, a second fluoroscopic image/video is captured from a second angle relative to the patient. That is, the imaging device 110 is moved to a new location such that a second fluoroscopic image/video may be captured from a different viewing angle. The fluoroscopic image/video captured in step 707 is transmitted to computing device 125 for display on a graphical user interface and for the generation of the combined image 10 (FIG. 1). Viewing the combined image 10, which displays both the target 14 and the medical instrument in real-time relative to the target 14, a clinician may determine whether the three-dimensional position of the medical instrument relative to the target 14 is correct (step 709). If the three-dimensional position of the medical instrument relative to the target 14 is correct (yes in step 709), then method 700 proceeds to step 711. Alternatively, if the three-dimensional position of the medical instrument is not correct (no in step 709), then method 700 proceeds to step 710.
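The reason a second angle is needed is that a single fluoroscopic projection collapses one spatial axis. With two views, idealized below as orthogonal projections sharing the cranio-caudal axis (an assumption for illustration only; the disclosure does not fix the angles), a 3D estimate can be sketched as:

```python
import numpy as np

def position_from_two_views(view1_xy, view2_zy):
    """Combine tool/marker coordinates from two (assumed orthogonal)
    fluoroscopic angles into a single 3D estimate.

    view1_xy: (x, y) seen from the first angle.
    view2_zy: (z, y) seen from the second angle.
    The shared y axis appears in both views, so its readings are averaged.
    """
    x, y1 = view1_xy
    z, y2 = view2_zy
    return np.array([x, (y1 + y2) / 2.0, z])
```

In practice the two views would rarely be exactly orthogonal, and a full implementation would triangulate using the imaging device's actual poses; the sketch only conveys why one projection cannot confirm depth while two can.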


In step 710, the clinician adjusts the three-dimensional position of the medical instrument relative to the target 14 by pushing/pulling the catheter guide assembly 40, and therewith the EWC 12 and any medical instrument located therein, relative to the target 14. Following the adjustment of the three-dimensional position of the medical instrument/EWC 12, a clinician may wish to revert back to step 703 to view the position of the medical instrument/EWC 12 relative to the target 14 again from the first angle.


Once the three-dimensional position of the medical instrument/EWC 12 relative to the target 14 is correct (yes in step 709), method 700 proceeds to step 711, where the treatment is performed. As described above, depending on the intended treatment, step 711 may include retrieving samples of tissue for biopsy or testing, ablating tissue located at the target 14, placing markers 20, or performing any other suitable surgical procedure.


From the foregoing and with reference to the various figures, those skilled in the art will appreciate that certain modifications can also be made to the present disclosure without departing from the scope of the same. For example, one or more modifications may be made in the way of device delivery and placement; device cooling and antenna buffering; and sensor feedback.


As can be appreciated, a medical instrument such as a biopsy tool or an energy device, such as a microwave ablation catheter, that is positionable through one or more branched luminal networks of a patient to treat tissue may prove useful in the surgical arena, and the present disclosure is directed to such apparatus, systems, and methods. Access to luminal networks may be percutaneous or through a natural orifice. In the case of natural orifice access, an endobronchial approach may be particularly useful in the treatment of lung disease. Targets, navigation, access, and treatment may be planned pre-procedurally using a combination of imaging and/or planning software. In accordance with these aspects of the present disclosure, the planning software may offer custom guidance using pre-procedure images. Navigation of the luminal network may be accomplished using image-guidance. These image-guidance systems may be separate from or integrated with the energy device or a separate access tool and may include MRI, CT, fluoroscopy, ultrasound, electrical impedance tomography, optical, and/or device tracking systems. Methodologies for locating the access tool include EM, IR, echolocation, optical, and others. Tracking systems may be integrated with an imaging device, where tracking is done in virtual space or fused with preoperative or live images. In some cases the treatment target may be directly accessed from within the lumen, such as for the treatment of the endobronchial wall for COPD, asthma, lung cancer, etc. In other cases, the energy device and/or an additional access tool may be required to pierce the lumen and extend into other tissues to reach the target, such as for the treatment of disease within the parenchyma. Final localization and confirmation of energy device placement may be performed with imaging and/or navigational guidance using the modalities described above.
The energy device has the ability to deliver an energy field for treatment (including but not limited to electromagnetic fields).


While several embodiments of the disclosure have been shown in the drawings, it is not intended that the disclosure be limited thereto, as it is intended that the disclosure be as broad in scope as the art will allow and that the specification be read likewise. Therefore, the above description should not be construed as limiting, but merely as exemplifications of particular embodiments. Those skilled in the art will envision other modifications within the scope and spirit of the claims appended hereto.

Claims
  • 1. A method for enhanced surgical navigation, comprising: generating a first navigation path based on a first data set of computed tomography (CT) images of a branched luminal network, the first navigation path defining a route through the branched luminal network to a target; displaying the route of the first navigation path for navigation to the target; receiving a second data set of CT images of the branched luminal network, the second data set of CT images including a marker proximate the target; generating a three-dimensional model of the branched luminal network based on the second data set of CT images; generating a second navigation path to the target based on the three-dimensional model, the second navigation path defining a second route through the branched luminal network to the target; displaying the second route to the target; registering fluoroscopic data of tissue proximate the marker to at least one of the first data set of CT images or the second data set of CT images; and creating a composite fluoroscopic image including: the fluoroscopic data; an object derived from the second data set of CT images; and a representation of the branched luminal network derived from the second data set of CT images.
  • 2. The method according to claim 1, further comprising displaying the composite fluoroscopic image as a medical instrument is navigated to tissue proximate the target.
  • 3. The method according to claim 1, wherein displaying the route of the first navigation path for a first navigation to the target by following the route includes displaying the route of the first navigation path for a first navigation to the target by following the route and for placement of a plurality of markers in tissue proximate the target.
  • 4. The method according to claim 1, wherein the composite fluoroscopic image further includes a representation of the second route through the branched luminal network from the second data set of CT images.
  • 5. The method according to claim 1, further comprising: displaying a representation of the second data set of CT images; and displaying the fluoroscopic data.
  • 6. The method according to claim 1, further comprising: receiving a selection of at least a portion of the second data set of CT images or the fluoroscopic data; and combining the selection with at least one of the second data set of CT images or the fluoroscopic data into the composite fluoroscopic image.
  • 7. The method according to claim 1, wherein the composite fluoroscopic image includes at least one of a fused, superimposed, or overlaid image of at least a portion of the second data set of CT images with the fluoroscopic data.
  • 8. The method according to claim 1, wherein the fluoroscopic data includes image data of a medical instrument positioned relative to tissue proximate the target and the method further comprises: determining whether the medical instrument is correctly positioned relative to the target based on an analysis of the composite fluoroscopic image.
  • 9. The method according to claim 8, further comprising: acquiring second fluoroscopic data of tissue proximate the marker from an imaging device; and determining whether a three-dimensional position of the medical instrument relative to the target is correct based on an analysis of the second fluoroscopic data.
  • 10. The method according to claim 1, wherein the fluoroscopic data is real-time fluoroscopic video of tissue proximate the marker.
  • 11. The method according to claim 1, wherein the fluoroscopic data is at least one fluoroscopic image of tissue proximate the marker.
  • 12. The method according to claim 1, wherein registering fluoroscopic data of tissue proximate the marker to the second data set of CT images includes registering the fluoroscopic data to the second data set of CT images based on a position and orientation of the marker.
  • 13. The method according to claim 1, further comprising: identifying a slice of the second data set of CT images having a marker position and orientation corresponding to a marker position and orientation within the fluoroscopic data; and registering the fluoroscopic data to the second data set of CT images based on the slice.
  • 14. A method for enhanced surgical navigation comprising: receiving a data set of CT images of a branched luminal network, the data set of CT images including a marker proximate a target; generating a three-dimensional model of the branched luminal network based on the data set of CT images; generating a navigation path through the branched luminal network based on the three-dimensional model, the navigation path defining a route through the branched luminal network to the target; displaying the route through a representation of the branched luminal network for navigation to the target; registering live fluoroscopic data of tissue proximate the marker to the data set of CT images; and creating a two-dimensional (2D) composite fluoroscopic image including: the live fluoroscopic data; a 2D representation of an object derived from the data set of CT images; and the representation of the branched luminal network.
  • 15. The method according to claim 14, wherein registering the live fluoroscopic data of tissue proximate the marker to the data set of CT images includes registering the live fluoroscopic data to the data set of CT images based on a position and orientation of the marker.
  • 16. The method according to claim 14, wherein the marker includes a plurality of markers.
  • 17. The method according to claim 14, wherein the marker includes a portion of a medical tool positioned proximate the target.
  • 18. The method according to claim 14, wherein registering the live fluoroscopic data of tissue proximate the marker to the data set of CT images includes registering a series of live fluoroscopic images to the data set of CT images.
  • 19. A method for enhanced surgical navigation comprising: receiving a data set of CT images of a branched luminal network; generating a three-dimensional model of the branched luminal network from the data set of CT images; generating a navigation path through the branched luminal network based on the three-dimensional model, the navigation path defining a route through the branched luminal network to a target; displaying the route through a representation of the branched luminal network for navigation of a medical tool to the target; registering fluoroscopic data of tissue proximate the target to the data set of CT images based on a position and orientation of the medical tool within the fluoroscopic data; and creating a two-dimensional (2D) composite fluoroscopic image including: the live fluoroscopic data; a 2D representation of an object derived from the data set of CT images; the representation of the branched luminal network; the medical tool; and the route defined by the navigation path.
  • 20. The method according to claim 19, wherein registering the live fluoroscopic data of tissue proximate the target to the data set of CT images includes registering a series of live fluoroscopic images to the data set of CT images.
CROSS REFERENCE TO RELATED APPLICATION

The present application is a continuation of U.S. application Ser. No. 15/972,156, filed on May 6, 2018, which is a continuation of U.S. application Ser. No. 14/880,338, now U.S. Pat. No. 9,974,525, filed on Oct. 12, 2015, which claims the benefit of and priority to U.S. Provisional Application Nos. 62/073,287 and 62/073,306, filed on Oct. 31, 2014. This application is related to U.S. patent application Ser. No. 14/880,361, now U.S. Pat. No. 9,986,983, filed on Oct. 12, 2015. The entire contents of each of the above applications are hereby incorporated herein by reference.

US Referenced Citations (325)
Number Name Date Kind
5852646 Klotz et al. Dec 1998 A
5930329 Navab Jul 1999 A
5951475 Gueziec et al. Sep 1999 A
5963612 Navab Oct 1999 A
5963613 Navab Oct 1999 A
6038282 Wiesent et al. Mar 2000 A
6049582 Navab Apr 2000 A
6050724 Schmitz et al. Apr 2000 A
6055449 Navab Apr 2000 A
6081577 Webber Jun 2000 A
6120180 Graumann Sep 2000 A
6236704 Navab et al. May 2001 B1
6314310 Ben-Haim et al. Nov 2001 B1
6317621 Graumann et al. Nov 2001 B1
6351513 Bani-Hashemi et al. Feb 2002 B1
6373916 Inoue et al. Apr 2002 B1
6389104 Bani-Hashemi et al. May 2002 B1
6404843 Vaillant Jun 2002 B1
6424731 Launay et al. Jul 2002 B1
6470207 Simon et al. Oct 2002 B1
6473634 Barni Oct 2002 B1
6484049 Seeley et al. Nov 2002 B1
6485422 Mikus et al. Nov 2002 B1
6490475 Seeley et al. Dec 2002 B1
6491430 Seissler Dec 2002 B1
6535756 Simon et al. Mar 2003 B1
6539127 Roche et al. Mar 2003 B1
6546068 Shimura Apr 2003 B1
6546279 Bova et al. Apr 2003 B1
6549607 Webber Apr 2003 B1
6697664 Kienzle, III et al. Feb 2004 B2
6707878 Claus et al. Mar 2004 B2
6714810 Grzeszczuk et al. Mar 2004 B2
6725080 Melkent et al. Apr 2004 B2
6731283 Navab May 2004 B1
6731970 Schlossbauer et al. May 2004 B2
6768784 Green et al. Jul 2004 B1
6782287 Grzeszczuk et al. Aug 2004 B2
6785356 Grass et al. Aug 2004 B2
6785571 Glossop Aug 2004 B2
6801597 Webber Oct 2004 B2
6823207 Jensen et al. Nov 2004 B1
6856826 Seeley et al. Feb 2005 B2
6856827 Seeley et al. Feb 2005 B2
6865253 Blumhofer et al. Mar 2005 B2
6898263 Avinash et al. May 2005 B2
6944260 Hsieh et al. Sep 2005 B2
6956927 Sukeyasu et al. Oct 2005 B2
7010080 Mitschke et al. Mar 2006 B2
7010152 Bojer et al. Mar 2006 B2
7035371 Boese et al. Apr 2006 B2
7106825 Gregerson et al. Sep 2006 B2
7117027 Zheng et al. Oct 2006 B2
7129946 Ditt et al. Oct 2006 B2
7130676 Barrick Oct 2006 B2
7165362 Jobs et al. Jan 2007 B2
7251522 Essenreiter et al. Jul 2007 B2
7327872 Vaillant et al. Feb 2008 B2
7343195 Strommer et al. Mar 2008 B2
7356367 Liang et al. Apr 2008 B2
7369641 Tsubaki et al. May 2008 B2
7440538 Tsujii Oct 2008 B2
7467007 Lothert Dec 2008 B2
7474913 Durlak Jan 2009 B2
7499743 Vass et al. Mar 2009 B2
7502503 Bojer et al. Mar 2009 B2
7505549 Ohishi et al. Mar 2009 B2
7508388 Barfuss et al. Mar 2009 B2
7551759 Hristov et al. Jun 2009 B2
7603155 Jensen Oct 2009 B2
7620223 Xu et al. Nov 2009 B2
7639866 Pomero et al. Dec 2009 B2
7664542 Boese et al. Feb 2010 B2
7689019 Boese et al. Mar 2010 B2
7689042 Brunner et al. Mar 2010 B2
7693263 Bouvier et al. Apr 2010 B2
7697972 Verard et al. Apr 2010 B2
7711082 Fujimoto et al. May 2010 B2
7711083 Heigl et al. May 2010 B2
7711409 Keppel et al. May 2010 B2
7720520 Willis May 2010 B2
7725165 Chen et al. May 2010 B2
7734329 Boese et al. Jun 2010 B2
7742557 Brunner et al. Jun 2010 B2
7761135 Pfister et al. Jul 2010 B2
7778685 Evron et al. Aug 2010 B2
7787932 Vilsmeier et al. Aug 2010 B2
7804991 Abovitz et al. Sep 2010 B2
7831096 Williamson, Jr. Nov 2010 B2
7835779 Anderson et al. Nov 2010 B2
7853061 Gorges et al. Dec 2010 B2
7877132 Rongen et al. Jan 2011 B2
7899226 Pescatore et al. Mar 2011 B2
7907989 Borgert et al. Mar 2011 B2
7912180 Zou et al. Mar 2011 B2
7912262 Timmer et al. Mar 2011 B2
7916918 Suri et al. Mar 2011 B2
7949088 Nishide et al. May 2011 B2
7991450 Virtue et al. Aug 2011 B2
7995819 Vaillant et al. Aug 2011 B2
8000436 Seppi et al. Aug 2011 B2
8043003 Vogt et al. Oct 2011 B2
8045780 Boese et al. Oct 2011 B2
8050739 Eck et al. Nov 2011 B2
8090168 Washburn et al. Jan 2012 B2
8098914 Liao et al. Jan 2012 B2
8111894 Van De Haar Feb 2012 B2
8111895 Spahn Feb 2012 B2
8126111 Uhde et al. Feb 2012 B2
8126241 Zarkh et al. Feb 2012 B2
8150131 Harer et al. Apr 2012 B2
8180132 Gorges et al. May 2012 B2
8195271 Rahn Jun 2012 B2
8200316 Keppel et al. Jun 2012 B2
8208708 Homan et al. Jun 2012 B2
8218843 Edlauer et al. Jul 2012 B2
8229061 Hanke et al. Jul 2012 B2
8238625 Strommer et al. Aug 2012 B2
8248413 Gattani et al. Aug 2012 B2
8270691 Xu et al. Sep 2012 B2
8271068 Khamene et al. Sep 2012 B2
8275448 Camus et al. Sep 2012 B2
8295577 Zarkh et al. Oct 2012 B2
8306303 Bruder et al. Nov 2012 B2
8311617 Keppel et al. Nov 2012 B2
8320992 Frenkel et al. Nov 2012 B2
8335359 Fidrich et al. Dec 2012 B2
8340379 Razzaque et al. Dec 2012 B2
8345817 Fuchs et al. Jan 2013 B2
8346344 Pfister et al. Jan 2013 B2
8358874 Haras Jan 2013 B2
8374416 Gagesch et al. Feb 2013 B2
8374678 Graumann Feb 2013 B2
8423117 Pichon et al. Apr 2013 B2
8442618 Strommer et al. May 2013 B2
8482606 Razzaque et al. Jul 2013 B2
8515527 Vaillant et al. Aug 2013 B2
8526688 Groszmann et al. Sep 2013 B2
8526700 Isaacs Sep 2013 B2
8532258 Bulitta et al. Sep 2013 B2
8532259 Shedlock et al. Sep 2013 B2
8548567 Maschke et al. Oct 2013 B2
8625865 Zarkh et al. Jan 2014 B2
8625869 Harder et al. Jan 2014 B2
8666137 Nielsen et al. Mar 2014 B2
8670603 Tolkowsky et al. Mar 2014 B2
8675996 Liao et al. Mar 2014 B2
8693622 Graumann et al. Apr 2014 B2
8693756 Tolkowsky et al. Apr 2014 B2
8694075 Groszmann et al. Apr 2014 B2
8706184 Mohr et al. Apr 2014 B2
8706186 Fichtinger et al. Apr 2014 B2
8712129 Strommer et al. Apr 2014 B2
8718346 Isaacs et al. May 2014 B2
8750582 Boese et al. Jun 2014 B2
8755587 Bender et al. Jun 2014 B2
8781064 Fuchs et al. Jul 2014 B2
8792704 Isaacs Jul 2014 B2
8798339 Mielekamp et al. Aug 2014 B2
8827934 Chopra et al. Sep 2014 B2
8831310 Razzaque et al. Sep 2014 B2
8855748 Keppel et al. Oct 2014 B2
9001121 Finlayson et al. Apr 2015 B2
9001962 Funk Apr 2015 B2
9008367 Tolkowsky et al. Apr 2015 B2
9031188 Belcher et al. May 2015 B2
9036777 Ohishi et al. May 2015 B2
9042624 Dennerlein May 2015 B2
9044190 Rubner et al. Jun 2015 B2
9087404 Hansis et al. Jul 2015 B2
9095252 Popovic Aug 2015 B2
9104902 Xu et al. Aug 2015 B2
9111175 Strommer et al. Aug 2015 B2
9135706 Zagorchev et al. Sep 2015 B2
9171365 Mareachen et al. Oct 2015 B2
9179878 Jeon Nov 2015 B2
9216065 Cohen et al. Dec 2015 B2
9232924 Liu et al. Jan 2016 B2
9262830 Bakker et al. Feb 2016 B2
9265468 Rai et al. Feb 2016 B2
9277893 Tsukagoshi et al. Mar 2016 B2
9280837 Grass et al. Mar 2016 B2
9282944 Fallavollita et al. Mar 2016 B2
9401047 Bogoni et al. Jul 2016 B2
9406134 Klingenbeck-Regn Aug 2016 B2
9433390 Nathaniel et al. Sep 2016 B2
9445772 Callaghan Sep 2016 B2
9445776 Han et al. Sep 2016 B2
9466135 Koehler et al. Oct 2016 B2
9833167 Cohen et al. Dec 2017 B2
9888898 Imagawa et al. Feb 2018 B2
9918659 Chopra et al. Mar 2018 B2
9974525 Weingarten et al. May 2018 B2
9986983 Weingarten et al. Jun 2018 B2
10127629 Razzaque et al. Nov 2018 B2
10130316 Funabasama et al. Nov 2018 B2
10321898 Weingarten et al. Jun 2019 B2
10373719 Soper et al. Aug 2019 B2
10376178 Chopra Aug 2019 B2
10405753 Sorger Sep 2019 B2
10478162 Barbagli et al. Nov 2019 B2
10480926 Froggatt et al. Nov 2019 B2
10524866 Srinivasan et al. Jan 2020 B2
10555788 Panescu et al. Feb 2020 B2
10610306 Chopra Apr 2020 B2
10638953 Duindam et al. May 2020 B2
10674970 Averbuch et al. Jun 2020 B2
10682070 Duindam Jun 2020 B2
10706543 Donhowe et al. Jul 2020 B2
10709506 Coste-Maniere et al. Jul 2020 B2
10772485 Schlesinger et al. Sep 2020 B2
10796432 Mintz et al. Oct 2020 B2
10823627 Sanborn et al. Nov 2020 B2
10827913 Ummalaneni et al. Nov 2020 B2
10835153 Rafii-Tari et al. Nov 2020 B2
10885630 Li et al. Jan 2021 B2
10896506 Zhao et al. Jan 2021 B2
20030013972 Makin Jan 2003 A1
20050027193 Mitschke et al. Feb 2005 A1
20050096522 Reddy May 2005 A1
20060033493 Biglieri Feb 2006 A1
20060167416 Mathis et al. Jul 2006 A1
20080146916 Okerlund et al. Jun 2008 A1
20080243142 Gildenberg Oct 2008 A1
20080262342 Averbruch Oct 2008 A1
20080269588 Csavoy et al. Oct 2008 A1
20090137952 Ramamurthy et al. May 2009 A1
20090257551 Dafni et al. Oct 2009 A1
20100008475 Maschke Jan 2010 A1
20110038458 Spahn Feb 2011 A1
20120281903 Trumer et al. Nov 2012 A1
20120289825 Rai et al. Nov 2012 A1
20130303945 Blumenkranz et al. Nov 2013 A1
20130317339 Waldstreicher et al. Nov 2013 A1
20140035798 Kawada et al. Feb 2014 A1
20150148690 Chopra et al. May 2015 A1
20150227679 Kamer et al. Aug 2015 A1
20150265368 Chopra et al. Sep 2015 A1
20160005194 Schretter et al. Jan 2016 A1
20160157939 Larkin et al. Jun 2016 A1
20160183841 Duindam et al. Jun 2016 A1
20160192860 Allenby et al. Jul 2016 A1
20160206380 Sparks et al. Jul 2016 A1
20160287343 Eichler et al. Oct 2016 A1
20160287344 Donhowe et al. Oct 2016 A1
20170112576 Coste-Maniere et al. Apr 2017 A1
20170209071 Zhao et al. Jul 2017 A1
20170265952 Donhowe et al. Sep 2017 A1
20170311844 Zhao et al. Nov 2017 A1
20170319165 Averbuch Nov 2017 A1
20180078318 Barbagli et al. Mar 2018 A1
20180153621 Duindam et al. Jun 2018 A1
20180235709 Donhowe et al. Aug 2018 A1
20180240237 Donhowe et al. Aug 2018 A1
20180256262 Duindam et al. Sep 2018 A1
20180263706 Averbuch Sep 2018 A1
20180279852 Rafii-Tari et al. Oct 2018 A1
20180325419 Zhao et al. Nov 2018 A1
20190000559 Berman et al. Jan 2019 A1
20190000560 Berman et al. Jan 2019 A1
20190008413 Duindam et al. Jan 2019 A1
20190038365 Soper et al. Feb 2019 A1
20190065209 Mishra et al. Feb 2019 A1
20190110839 Rafii-Tari et al. Apr 2019 A1
20190175062 Rafii-Tari et al. Jun 2019 A1
20190183318 Froggatt et al. Jun 2019 A1
20190183585 Rafii-Tari et al. Jun 2019 A1
20190183587 Rafii-Tari et al. Jun 2019 A1
20190192234 Gadda et al. Jun 2019 A1
20190209016 Herzlinger et al. Jul 2019 A1
20190209043 Zhao et al. Jul 2019 A1
20190216548 Ummalaneni Jul 2019 A1
20190239723 Duindam et al. Aug 2019 A1
20190239831 Chopra Aug 2019 A1
20190250050 Sanborn et al. Aug 2019 A1
20190254649 Walters et al. Aug 2019 A1
20190269470 Barbagli et al. Sep 2019 A1
20190272634 Li et al. Sep 2019 A1
20190298160 Ummalaneni et al. Oct 2019 A1
20190298451 Wong et al. Oct 2019 A1
20190320878 Duindam et al. Oct 2019 A1
20190320937 Duindam et al. Oct 2019 A1
20190336238 Yu et al. Nov 2019 A1
20190343424 Blumenkranz et al. Nov 2019 A1
20190350659 Wang et al. Nov 2019 A1
20190365199 Zhao et al. Dec 2019 A1
20190365479 Rafii-Tari Dec 2019 A1
20190365486 Srinivasan et al. Dec 2019 A1
20190380787 Ye et al. Dec 2019 A1
20200000319 Saadat et al. Jan 2020 A1
20200000526 Zhao Jan 2020 A1
20200008655 Schlesinger et al. Jan 2020 A1
20200030044 Wang et al. Jan 2020 A1
20200030461 Sorger Jan 2020 A1
20200038750 Kojima Feb 2020 A1
20200043207 Lo et al. Feb 2020 A1
20200046431 Soper et al. Feb 2020 A1
20200046436 Tzeisler et al. Feb 2020 A1
20200054399 Duindam et al. Feb 2020 A1
20200060771 Lo et al. Feb 2020 A1
20200069192 Sanborn et al. Mar 2020 A1
20200077870 Dicarlo et al. Mar 2020 A1
20200078095 Chopra et al. Mar 2020 A1
20200078103 Duindam et al. Mar 2020 A1
20200085514 Blumenkranz Mar 2020 A1
20200109124 Pomper et al. Apr 2020 A1
20200129045 Prisco Apr 2020 A1
20200129239 Bianchi et al. Apr 2020 A1
20200138515 Wong May 2020 A1
20200155116 Donhowe et al. May 2020 A1
20200170623 Averbuch Jun 2020 A1
20200170720 Ummalaneni Jun 2020 A1
20200179058 Barbagli et al. Jun 2020 A1
20200188038 Donhowe et al. Jun 2020 A1
20200205903 Srinivasan et al. Jul 2020 A1
20200205904 Chopra Jul 2020 A1
20200214664 Zhao et al. Jul 2020 A1
20200229679 Zhao et al. Jul 2020 A1
20200242767 Zhao et al. Jul 2020 A1
20200275860 Duindam Sep 2020 A1
20200297442 Adebar et al. Sep 2020 A1
20200315554 Averbuch et al. Oct 2020 A1
20200330795 Sawant et al. Oct 2020 A1
20200352427 Deyanov Nov 2020 A1
20200364865 Donhowe et al. Nov 2020 A1
Foreign Referenced Citations (25)
Number Date Country
0013237 Jul 2003 BR
0116004 Jun 2004 BR
1503184 Jun 2004 CN
101410060 Mar 2007 CN
103260518 Aug 2013 CN
104050348 Sep 2014 CN
486540 Sep 2016 CZ
2709512 Aug 2017 CZ
2884879 Jan 2020 CZ
1510182 Mar 2005 EP
3413830 Sep 2019 EP
3478161 Feb 2020 EP
3641686 Apr 2020 EP
3644885 May 2020 EP
3644886 May 2020 EP
PA03005028 Jan 2004 MX
225663 Jan 2005 MX
226292 Feb 2005 MX
246862 Jun 2007 MX
265247 Mar 2009 MX
284569 Mar 2011 MX
2005013841 Feb 2005 WO
2006078678 Jul 2006 WO
2007113703 Oct 2007 WO
2014025550 Feb 2014 WO
Non-Patent Literature Citations (8)
Entry
International Search Report and Written Opinion from PCT Appl. No. PCT/US2015/056376 dated Jan. 26, 2016.
Extended European Search Report from Appl. No. EP 15 854 370.2 dated Jun. 11, 2018 (8 pages).
Chinese Office Action issued in corresponding Appl. No. CN 201580060018.3 dated Dec. 28, 2018, together with English language translation (22 pages).
Australian Examination Report issued in corresponding Appl. No. AU 2015339687 dated Jul. 31, 2019 (4 pages).
Japanese Office Action issued in corresponding Appl. No. JP 2017-523487 dated Aug. 28, 2019, together with English language translation (7 pages).
Canadian Office action issued in Canadian application No. 2,966,319 dated Sep. 16, 2021.
Extended European Search Report issued in European Patent Application No. 21198048.7 dated Feb. 7, 2022.
Chinese Office Action issued in Chinese Patent Application No. 201911186082.2 dated Oct. 21, 2022, with English translation.
Related Publications (1)
Number Date Country
20190290249 A1 Sep 2019 US
Provisional Applications (2)
Number Date Country
62073287 Oct 2014 US
62073306 Oct 2014 US
Continuations (2)
Number Date Country
Parent 15972156 May 2018 US
Child 16436404 US
Parent 14880338 Oct 2015 US
Child 15972156 US