The present disclosure relates to bronchial registration and, more particularly, to devices, systems, and methods for automatically registering a three-dimensional bronchial tree model with a patient's real bronchial tree.
A common device for inspecting the airway of a patient is a bronchoscope. Typically, the bronchoscope is inserted into a patient's airways through the patient's nose or mouth and can extend into the lungs of the patient. A typical bronchoscope includes an elongated flexible tube having an illumination assembly for illuminating the region distal to the bronchoscope's tip, an imaging assembly for providing a video image from the bronchoscope's tip, and a working channel through which instruments, e.g., diagnostic instruments such as biopsy tools or therapeutic instruments, can be inserted.
Bronchoscopes, however, are limited in how far they may be advanced through the airways due to their size. Where the bronchoscope is too large to reach a target location deep in the lungs, a clinician may utilize certain real-time imaging modalities such as fluoroscopy. Fluoroscopic images, while useful, present certain drawbacks for navigation because it is often difficult to distinguish luminal passageways from solid tissue. Moreover, the images generated by the fluoroscope are two-dimensional, whereas navigating the airways of a patient requires the ability to maneuver in three dimensions.
To address these issues, systems have been developed that enable the development of three-dimensional models of the airways or other luminal networks, typically from a series of computed tomography (CT) images. One such system has been developed as part of the ILOGIC® ELECTROMAGNETIC NAVIGATION BRONCHOSCOPY® (ENB™) system currently sold by Medtronic PLC. The details of such a system are described in commonly assigned U.S. Pat. No. 7,233,820, entitled ENDOSCOPE STRUCTURES AND TECHNIQUES FOR NAVIGATING TO A TARGET IN BRANCHED STRUCTURE, filed on Mar. 29, 2004, by Gilboa, the entire contents of which are incorporated herein by reference.
While the system as described in U.S. Pat. No. 7,233,820 is quite capable, there is always a need for development of improvements and additions to such systems.
Provided in accordance with the present disclosure is a method of using carina locations to improve registration of a luminal network to a 3D model of the luminal network.
In an aspect of the present disclosure, the method includes generating a 3D model of a luminal network based on images of the luminal network, identifying a target within the 3D model of the luminal network, determining locations of a plurality of carinas in the luminal network proximate the target, displaying guidance for navigating a location sensor within the luminal network, tracking the location of the location sensor while the location sensor is navigated within the luminal network, comparing the tracked locations of the location sensor within the luminal network and the portions of the 3D model representative of open space, displaying guidance for navigating the location sensor a predetermined distance into each lumen originating at the plurality of carinas proximate the target, tracking the location of the location sensor while the location sensor is navigated the predetermined distance into each lumen, and updating the registration of the 3D model with the luminal network based on the tracked locations of the location sensor as it is navigated past the plurality of carinas proximate the target.
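The comparison step recited above, matching tracked sensor locations against the portions of the 3D model representing open airway space, can be sketched as a nearest-neighbor distance check. This is a hypothetical illustration only; the function name and the toy point clouds are not from the disclosure:

```python
import numpy as np

def nearest_airway_distance(tracked_points, airway_points):
    """For each tracked sensor sample, compute the distance to the nearest
    point of the 3D model's open-space (airway lumen) point cloud - a crude
    consistency check between tracked locations and the model."""
    diffs = tracked_points[:, None, :] - airway_points[None, :, :]
    return np.linalg.norm(diffs, axis=2).min(axis=1)

# Toy data: three airway-lumen points and two tracked sensor samples (mm).
airway = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [10.0, 10.0, 0.0]])
tracked = np.array([[0.5, 0.0, 0.0], [10.0, 9.0, 0.0]])
dists = nearest_airway_distance(tracked, airway)
print(dists)  # -> [0.5 1. ]
```

In practice such distances would feed a registration-quality metric rather than be inspected directly; a brute-force pairwise computation like this is only viable for small point sets.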
In a further aspect of the present disclosure, the luminal network is an airway of a patient.
In yet a further aspect of the present disclosure, the 3D model is a model of the airway of the patient.
In another aspect of the present disclosure, the carinas are used as fiducial markers for identifying the location of the target.
Provided in accordance with the present disclosure is a system of using carina locations to improve registration of a luminal network to a 3D model of the luminal network.
In an aspect of the present disclosure, the system comprises a location sensor capable of being navigated within a luminal network inside a patient's body, an electromagnetic field generator configured to detect the location of the location sensor as it is navigated within the luminal network, and a computing device including a processor and a memory storing instructions which, when executed by the processor, cause the computing device to generate a 3D model of the luminal network based on images of the luminal network, identify a target within the 3D model of the luminal network, determine locations of a plurality of carinas in the luminal network proximate the target, display guidance for navigating the location sensor within the luminal network, track the location of the location sensor while the location sensor is navigated within the luminal network, compare the tracked locations of the location sensor within the luminal network and the portions of the 3D model representative of open space, display guidance for navigating the location sensor a predetermined distance into each lumen originating at the plurality of carinas proximate the target, track the location of the location sensor while the location sensor is navigated the predetermined distance into each lumen, and update the registration of the 3D model with the luminal network based on the tracked locations of the location sensor as it is navigated past the plurality of carinas proximate the target.
In a further aspect of the present disclosure, the luminal network is an airway of a patient.
In yet a further aspect of the present disclosure, the 3D model is a model of the airway of the patient.
In another aspect of the present disclosure, the carinas are used as fiducial markers for identifying the location of the target.
Provided in accordance with the present disclosure is a non-transitory computer-readable storage medium storing instructions which, when executed by a processor, cause a computing device to use carina locations to improve registration of a luminal network to a 3D model of the luminal network.
In an aspect of the present disclosure, the non-transitory computer-readable storage medium stores instructions which, when executed by a processor, cause a computing device to generate a 3D model of a luminal network based on images of the luminal network, identify a target within the 3D model of the luminal network, determine locations of a plurality of carinas in the luminal network proximate the target, display guidance for navigating a location sensor within the luminal network, track the location of the location sensor while the location sensor is navigated within the luminal network, compare the tracked locations of the location sensor within the luminal network and the portions of the 3D model representative of open space, display guidance for navigating the location sensor a predetermined distance into each lumen originating at the plurality of carinas proximate the target, track the location of the location sensor while the location sensor is navigated the predetermined distance into each lumen, and update the registration of the 3D model with the luminal network based on the tracked locations of the location sensor as it is navigated past the plurality of carinas proximate the target.
In a further aspect of the present disclosure, the luminal network is an airway of a patient.
In yet a further aspect of the present disclosure, the 3D model is a model of the airway of the patient.
In another aspect of the present disclosure, the carinas are used as fiducial markers for identifying the location of the target.
Any of the above aspects and embodiments of the present disclosure may be combined without departing from the scope of the present disclosure.
Various aspects and features of the present disclosure are described hereinbelow with reference to the drawings, wherein:
The present disclosure is directed to devices, systems, and methods for registering a three-dimensional bronchial tree model (hereinafter referred to as a “3D model”) with a patient's airways. Various methods for generating the 3D model and identifying target lesions are envisioned, some of which are more fully described in co-pending U.S. Patent Application Publication Nos. US 2014/0281961, US 2014/0270441, and US 2014/0282216, all entitled PATHWAY PLANNING SYSTEM AND METHOD, filed on Mar. 15, 2013, by Baker, the entire contents of all of which are incorporated herein by reference. Following generation of the 3D model and identification of the target lesions, the 3D model must be registered with the patient's airways. Various methods of manual and automatic registration are envisioned, some of which are more fully described in co-pending U.S. patent application Ser. No. 14/790,581, entitled REAL TIME AUTOMATIC REGISTRATION FEEDBACK, filed on Jul. 2, 2015, by Brown et al., the entire contents of which is incorporated herein by reference. As is described in more detail below, to further improve registration accuracy between the 3D model and the patient's airways, the clinician may, following automatic registration, perform additional localized registration of the airways surrounding the identified target lesions.
The registration system of the present disclosure, for example, generally includes at least one sensor whose position is tracked within an electromagnetic field. The location sensor may be incorporated into different types of tools, and enables determination of the current location of the tools within a patient's airways by comparing the sensed location in space to locations within the 3D model. The registration facilitates navigation of the sensor or a tool to a target location and/or manipulation of the sensor or tool relative to the target location. Navigation of the sensor or tool to the target location is more fully described in co-pending U.S. patent application Ser. No. 14/753,288, entitled SYSTEM AND METHOD FOR NAVIGATING WITHIN THE LUNG, filed on Jun. 29, 2015, by Brown et al., the entire contents of which is incorporated herein by reference.
Additional features of the ENB system of the present disclosure are described in co-pending U.S. patent application Ser. No. 14/753,229, entitled METHODS FOR MARKING BIOPSY LOCATION, filed on Jun. 29, 2015, by Brown; Ser. No. 14/754,058, entitled INTELLIGENT DISPLAY, filed on Jun. 29, 2015, by Kehat et al.; Ser. No. 14/788,952, entitled UNIFIED COORDINATE SYSTEM FOR MULTIPLE CT SCANS OF PATIENT LUNGS, filed on Jul. 1, 2015, by Greenburg; Ser. No. 14/790,395, entitled ALIGNMENT CT, filed on Jul. 2, 2015, by Klein et al.; Ser. No. 14/725,300, entitled FLUOROSCOPIC POSE ESTIMATION, filed on May 29, 2015, by Merlet; Ser. No. 14/753,674, entitled TRACHEA MARKING, filed on Jun. 29, 2015, by Lachmanovich et al.; Ser. Nos. 14/755,708 and 14/755,721, both entitled SYSTEM AND METHOD FOR DETECTING TRACHEA, filed on Jun. 30, 2015, by Markov et al.; Ser. No. 14/754,867, entitled SYSTEM AND METHOD FOR SEGMENTATION OF LUNG, filed on Jun. 30, 2015, by Markov et al.; Ser. No. 14/790,107, entitled SYSTEM AND METHOD FOR PROVIDING DISTANCE AND ORIENTATION FEEDBACK WHILE NAVIGATING IN 3D, filed on Jul. 2, 2015, by Lachmanovich et al.; and Ser. No. 14/751,257, entitled DYNAMIC 3D LUNG MAP VIEW FOR TOOL NAVIGATION INSIDE THE LUNG, filed on Jun. 26, 2015, by Weingarten et al., the entire contents of all of which are incorporated herein by reference.
Detailed embodiments of such devices, systems incorporating such devices, and methods using the same are described below. However, these detailed embodiments are merely examples of the disclosure, which may be embodied in various forms. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting but merely as a basis for the claims and as a representative basis for allowing one skilled in the art to variously employ the present disclosure in virtually any appropriately detailed structure. While the example embodiments described below are directed to the bronchoscopy of a patient's airways, those skilled in the art will realize that the same or similar devices, systems, and methods may also be used in other lumen networks, such as, for example, the vascular, lymphatic, and/or gastrointestinal networks.
With reference to
EMN system 10 generally includes an operating table 40 configured to support a patient; a bronchoscope 50 configured for insertion through the patient's mouth and/or nose into the patient's airways; monitoring equipment 60 coupled to bronchoscope 50 for displaying video images received from bronchoscope 50; a tracking system 70 including a tracking module 72, a plurality of reference sensors 74, and an electromagnetic (EM) field generator 76; and a workstation 80 including software and/or hardware used to facilitate pathway planning, identification of target tissue, navigation to target tissue, and digitally marking the biopsy location.
As illustrated in
Catheter guide assemblies 90, 100 including LG 92 and EWC 96 are configured for insertion through a working channel of bronchoscope 50 into the patient's airways (although the catheter guide assemblies 90, 100 may alternatively be used without bronchoscope 50). LG 92 and EWC 96 are selectively lockable relative to one another via a locking mechanism 99. A six degrees-of-freedom electromagnetic tracking system 70, e.g., similar to those disclosed in U.S. Pat. No. 6,188,355 and published PCT Application Nos. WO 00/10456 and WO 01/67035, the entire contents of each of which is incorporated herein by reference, or any other suitable positioning measuring system, is utilized for performing navigation, although other configurations are also contemplated. Tracking system 70 is configured for use with catheter guide assemblies 90, 100 to track the position of EM sensor 94 as it moves in conjunction with EWC 96 through the airways of the patient, as detailed below.
As shown in
Also shown in
Although navigation is detailed above with respect to EM sensor 94 being included in LG 92, it is also envisioned that EM sensor 94 may be embedded or incorporated within biopsy tool 102, in which case biopsy tool 102 may alternatively be utilized for navigation without need of LG 92 or the tool exchanges that use of LG 92 requires. A variety of useable biopsy tools are described in U.S. Provisional Patent Application No. 61/906,732, entitled DEVICES, SYSTEMS, AND METHODS FOR NAVIGATING A BIOPSY TOOL TO A TARGET LOCATION AND OBTAINING A TISSUE SAMPLE USING THE SAME, filed Nov. 20, 2013, U.S. patent application Ser. No. 14/488,754, entitled DEVICES, SYSTEMS, AND METHODS FOR NAVIGATING A BIOPSY TOOL TO A TARGET LOCATION AND OBTAINING A TISSUE SAMPLE USING THE SAME, filed Sep. 17, 2014, and U.S. patent application Ser. No. 14/564,779, entitled DEVICES, SYSTEMS, AND METHODS FOR NAVIGATING A BIOPSY TOOL TO A TARGET LOCATION AND OBTAINING A TISSUE SAMPLE USING THE SAME, filed on Dec. 9, 2014, the entire contents of each of which is incorporated herein by reference and useable with EMN system 10 as described herein.
During procedure planning, workstation 80 utilizes computed tomographic (CT) image data for generating and viewing the 3D model of the patient's airways, enables the identification of target tissue on the 3D model (automatically, semi-automatically or manually), and allows for the selection of a pathway through the patient's airways to the target tissue. More specifically, the CT scans are processed and assembled into a 3D volume, which is then utilized to generate the 3D model of the patient's airways. The 3D model may be presented on a display monitor associated with workstation 80, or in any other suitable fashion. Using workstation 80, various slices of the 3D volume and views of the 3D model may be presented and/or may be manipulated by a clinician to facilitate identification of a target and selection of a suitable pathway through the patient's airways to access the target. The 3D model may also show marks of the locations where previous biopsies were performed, including the dates, times, and other identifying information regarding the tissue samples obtained. These marks may also be selected as the target to which a pathway can be planned. Once selected, the pathway is saved for use during the navigation procedure. An example of a suitable pathway planning system and method is described in U.S. Patent Application Publication Nos. US 2014/0281961, US 2014/0270441, and US 2014/0282216, all entitled PATHWAY PLANNING SYSTEM AND METHOD, filed on Mar. 15, 2013, by Baker, the entire contents of each of which is incorporated herein by reference.
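The assembly of CT slices into a 3D volume described above can be sketched in a few lines. The toy Hounsfield-unit values and the -900 HU air threshold are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

# Toy Hounsfield-unit slices: one slice of air (~-1000 HU), one of soft tissue.
slices = [np.full((4, 4), -1000.0), np.full((4, 4), 40.0)]
volume = np.stack(slices, axis=0)            # 3D volume, shape (z, y, x)
airway_mask = volume < -900                  # crude air/lumen segmentation
print(volume.shape, int(airway_mask.sum()))  # (2, 4, 4) 16
```

A real pipeline would read DICOM slices, sort them by position, and apply far more robust airway segmentation; the point here is only that the per-slice 2D images become a single indexed 3D array from which the airway model is derived.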
During navigation, EM sensor 94, in conjunction with tracking system 70, enables tracking of EM sensor 94 and/or biopsy tool 102 as EM sensor 94 or biopsy tool 102 is advanced through the patient's airways.
Turning now to
Memory 202 includes any non-transitory computer-readable storage media for storing data and/or software that is executable by processor 204 and which controls the operation of workstation 80. In an embodiment, memory 202 may include one or more solid-state storage devices such as flash memory chips. Alternatively or in addition to the one or more solid-state storage devices, memory 202 may include one or more mass storage devices connected to the processor 204 through a mass storage controller (not shown) and a communications bus (not shown). Although the description of computer-readable media contained herein refers to solid-state storage, it should be appreciated by those skilled in the art that computer-readable storage media can be any available media that can be accessed by the processor 204. That is, computer-readable storage media includes non-transitory, volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, computer-readable storage media includes RAM, ROM, EPROM, EEPROM, flash memory or other solid-state memory technology, CD-ROM, DVD, Blu-Ray or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by workstation 80.
Memory 202 may store application 81 and/or CT data 214. Application 81 may, when executed by processor 204, cause display 206 to present user interface 216. Network interface 208 may be configured to connect to a network such as a local area network (LAN) consisting of a wired network and/or a wireless network, a wide area network (WAN), a wireless mobile network, a Bluetooth network, and/or the internet. Input device 210 may be any device by means of which a user may interact with workstation 80, such as, for example, a mouse, keyboard, foot pedal, touch screen, and/or voice interface. Output module 212 may include any connectivity port or bus, such as, for example, parallel ports, serial ports, universal serial busses (USB), or any other similar connectivity port known to those skilled in the art.
Referring now to
At step 306, application 81 displays guidance for performing automatic registration of the 3D model with the patient's airways, as described above, and in particular as described in co-pending U.S. patent application Ser. No. 14/790,581, entitled REAL TIME AUTOMATIC REGISTRATION FEEDBACK, filed on Jul. 2, 2015, by Brown et al., the entire contents of which is incorporated herein by reference. During registration, the location of EM sensor 94 within the patient's airways is tracked, and a plurality of points denoting the location of EM sensor 94 within the EM field generated by EM generator 76 is stored. At step 308, application 81 determines whether automatic registration has been completed. If no, processing returns to step 306, where further guidance is displayed to complete the automatic registration process. If yes, processing proceeds to step 310.
At step 310, application 81 begins the localized registration process by displaying guidance for navigating EM sensor 94 proximate a target 404. Thereafter, at step 312, application 81 determines one or more carina locations proximate the target. Application 81 determines the carina locations by analyzing the area of the 3D model proximate the target and any bifurcations in the airways. As shown in
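The bifurcation analysis described above, locating carinas (airway branch points) proximate a target, can be sketched over a toy airway-tree graph. All node names, coordinates, and the distance threshold below are hypothetical, chosen only to illustrate the idea:

```python
import math

# Toy airway tree: parent -> child branches, plus 2D node coordinates (mm).
children = {"trachea": ["L", "R"], "R": ["RB1", "RB2"],
            "RB1": [], "RB2": [], "L": []}
coords = {"trachea": (0, 0), "L": (-5, 8), "R": (5, 8),
          "RB1": (8, 12), "RB2": (4, 13)}

def carinas_near(target, max_dist=6.0):
    # A carina sits where an airway bifurcates, i.e. a node with >= 2 children.
    return [n for n, kids in children.items()
            if len(kids) >= 2 and math.dist(coords[n], target) <= max_dist]

print(carinas_near((6, 11)))  # -> ['R']
```

In an actual 3D model the tree would come from airway segmentation and centerline extraction, and the distances would be 3D, but the selection logic, branch points within some radius of the target, is the same shape.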
At step 314, application 81 displays guidance for navigating EM sensor 94 into each airway branch 408 originating from a bifurcation at a carina 406. The clinician follows the displayed guidance to navigate EM sensor 94 in the patient's airways. For example, the guidance may instruct the clinician to navigate EM sensor 94 approximately 1 cm into each airway branch 408. Application 81 tracks the location of EM sensor 94 at step 316 as EM sensor 94 is navigated into the airway branches 408 originating from carina 406 and stores a plurality of points denoting the location of EM sensor 94 within the EM field generated by EM generator 76. Application 81 uses the stored points denoting the location of EM sensor 94 to, at step 318, perform localized registration of the 3D model with the patient's airways proximate the target. For example, localized registration may be performed based on a range of interpolation techniques, such as Thin Plate Splines (TPS) interpolation. In embodiments, TPS interpolation may be used for non-rigid registration of the points denoting the location of EM sensor 94 within the EM field generated by EM generator 76 stored during automatic registration with the 3D model, and may be augmented by additional points stored during localized registration.
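A minimal numpy-only sketch of thin-plate-spline interpolation for this kind of non-rigid, localized registration is shown below, in 2D for brevity. The control points and the uniform 1 mm shift are toy data, not measurements from the disclosure; real correspondences would come from the stored sensor points and their matched model locations:

```python
import numpy as np

def tps_fit(src, dst):
    """Fit a 2D thin-plate-spline warp taking control points src to dst.
    Solves the standard TPS linear system [K P; P^T 0][w; a] = [dst; 0],
    with kernel U(r) = r^2 log r."""
    n = len(src)
    d2 = np.sum((src[:, None] - src[None, :]) ** 2, axis=2)
    K = np.where(d2 > 0, 0.5 * d2 * np.log(d2 + 1e-300), 0.0)
    P = np.hstack([np.ones((n, 1)), src])
    A = np.zeros((n + 3, n + 3))
    A[:n, :n], A[:n, n:], A[n:, :n] = K, P, P.T
    params = np.linalg.solve(A, np.vstack([dst, np.zeros((3, 2))]))

    def warp(pts):
        r2 = np.sum((pts[:, None] - src[None, :]) ** 2, axis=2)
        U = np.where(r2 > 0, 0.5 * r2 * np.log(r2 + 1e-300), 0.0)
        return U @ params[:n] + np.hstack([np.ones((len(pts), 1)), pts]) @ params[n:]
    return warp

# Toy control points: sensor-space locations and their model-space matches,
# here related by a uniform 1 mm shift in x.
src = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
dst = src + np.array([1.0, 0.0])
warp = tps_fit(src, dst)
mapped = warp(np.array([[5.0, 5.0]]))
print(mapped)  # ~ [[6. 5.]]
```

Because TPS includes an affine term, a pure translation of the control points is reproduced exactly; the radial kernel terms only contribute where the displacement field actually bends, which is what makes TPS suitable for locally refining an otherwise rigid registration.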
Thereafter, at step 320, application 81 determines whether localized registration has been completed for the current target. If no, processing returns to step 314 where further guidance is displayed. If yes, processing proceeds to step 322 where application 81 determines if there are any more targets remaining in the navigation plan for which localized registration has not been performed. If yes, processing returns to step 310, where application 81 displays guidance for navigating EM sensor 94 proximate the next target. If no, the localized registration process is complete, and processing ends.
In addition to using carinas 406 for localized registration, carinas 406 may also be used as fiducial markers for locating target 404. Carinas 406 are particularly useful as fiducial markers because, unlike implanted foreign body markers, carinas 406 cannot migrate.
While several embodiments of the disclosure have been shown in the drawings, it is not intended that the disclosure be limited thereto, as it is intended that the disclosure be as broad in scope as the art will allow and that the specification be read likewise. Therefore, the above description should not be construed as limiting, but merely as exemplifications of particular embodiments. Those skilled in the art will envision other modifications within the scope and spirit of the claims appended hereto.
The present application claims the benefit of and priority to U.S. Provisional Application Ser. No. 62/246,721, filed on Oct. 27, 2015, the entire contents of which are incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
5592939 | Martinelli | Jan 1997 | A |
5611025 | Lorensen et al. | Mar 1997 | A |
5676673 | Ferre et al. | Oct 1997 | A |
5697377 | Wittkampf | Dec 1997 | A |
5699799 | Xu et al. | Dec 1997 | A |
5715836 | Kliegis et al. | Feb 1998 | A |
5729129 | Acker | Mar 1998 | A |
5752513 | Acker et al. | May 1998 | A |
5782762 | Vining | Jul 1998 | A |
5881124 | Giger et al. | Mar 1999 | A |
5891030 | Johnson et al. | Apr 1999 | A |
5913820 | Bladen et al. | Jun 1999 | A |
5920319 | Vining et al. | Jul 1999 | A |
5967980 | Ferre et al. | Oct 1999 | A |
5971767 | Kaufman et al. | Oct 1999 | A |
5987960 | Messner et al. | Nov 1999 | A |
6019725 | Vesely et al. | Feb 2000 | A |
6047080 | Chen et al. | Apr 2000 | A |
6083162 | Vining | Jul 2000 | A |
6138045 | Kupinski et al. | Oct 2000 | A |
6151404 | Pieper | Nov 2000 | A |
6167296 | Shahidi | Dec 2000 | A |
6181348 | Geiger | Jan 2001 | B1 |
6201387 | Govari | Mar 2001 | B1 |
6233476 | Strommer et al. | May 2001 | B1 |
6246784 | Summers et al. | Jun 2001 | B1 |
6266551 | Osadchy et al. | Jul 2001 | B1 |
6332089 | Acker et al. | Dec 2001 | B1 |
6346940 | Fukunaga | Feb 2002 | B1 |
6366800 | Vining et al. | Apr 2002 | B1 |
6381485 | Hunter et al. | Apr 2002 | B1 |
6387092 | Burnside et al. | May 2002 | B1 |
6466815 | Saito et al. | Oct 2002 | B1 |
6496188 | Deschamps et al. | Dec 2002 | B1 |
6501848 | Carroll et al. | Dec 2002 | B1 |
6501981 | Schweikard et al. | Dec 2002 | B1 |
6505065 | Yanof et al. | Jan 2003 | B1 |
6522907 | Bladen et al. | Feb 2003 | B1 |
6526162 | Asano et al. | Feb 2003 | B2 |
6535756 | Simon et al. | Mar 2003 | B1 |
6578579 | Burnside et al. | Jun 2003 | B2 |
6584174 | Schubert et al. | Jun 2003 | B2 |
6603868 | Ludwig et al. | Aug 2003 | B1 |
6611793 | Burnside et al. | Aug 2003 | B1 |
6650927 | Keidar | Nov 2003 | B1 |
6651669 | Burnside | Nov 2003 | B1 |
6694163 | Vining | Feb 2004 | B1 |
6757557 | Bladen et al. | Jun 2004 | B1 |
6783523 | Qin et al. | Aug 2004 | B2 |
6792390 | Burnside et al. | Sep 2004 | B1 |
6829379 | Knoplioch et al. | Dec 2004 | B1 |
6850794 | Shahidi | Feb 2005 | B2 |
6892090 | Verard et al. | May 2005 | B2 |
6898263 | Avinash et al. | May 2005 | B2 |
6909913 | Vining | Jun 2005 | B2 |
6920347 | Simon et al. | Jul 2005 | B2 |
6925200 | Wood et al. | Aug 2005 | B2 |
7006677 | Manjeshwar et al. | Feb 2006 | B2 |
7072501 | Wood et al. | Jul 2006 | B2 |
7085400 | Holsing et al. | Aug 2006 | B1 |
7096148 | Anderson et al. | Aug 2006 | B2 |
7149564 | Vining et al. | Dec 2006 | B2 |
7167180 | Shibolet | Jan 2007 | B1 |
7174202 | Bladen et al. | Feb 2007 | B2 |
7179220 | Kukuk | Feb 2007 | B2 |
7233820 | Gilboa | Jun 2007 | B2 |
7236558 | Saito et al. | Jun 2007 | B2 |
7301332 | Govari et al. | Nov 2007 | B2 |
7315639 | Kuhnigk | Jan 2008 | B2 |
7324104 | Bitter et al. | Jan 2008 | B1 |
7336809 | Zeng et al. | Feb 2008 | B2 |
7397937 | Schneider et al. | Jul 2008 | B2 |
7428334 | Schoisswohl et al. | Sep 2008 | B2 |
7452357 | Vlegele et al. | Nov 2008 | B2 |
7505809 | Strommer et al. | Mar 2009 | B2 |
7517320 | Wibowo et al. | Apr 2009 | B2 |
7518619 | Stoval, III et al. | Apr 2009 | B2 |
7630752 | Viswanathan | Dec 2009 | B2 |
7630753 | Simon et al. | Dec 2009 | B2 |
7659912 | Akimoto et al. | Feb 2010 | B2 |
7702153 | Hong et al. | Apr 2010 | B2 |
7751865 | Jascob et al. | Jul 2010 | B2 |
7756316 | Odry et al. | Jul 2010 | B2 |
7788060 | Schneider | Aug 2010 | B2 |
7792565 | Vining | Sep 2010 | B2 |
7805269 | Glossop | Sep 2010 | B2 |
7809176 | Gundel | Oct 2010 | B2 |
7811294 | Strommer et al. | Oct 2010 | B2 |
7822461 | Geiger et al. | Oct 2010 | B2 |
7901348 | Soper et al. | Mar 2011 | B2 |
7907772 | Wang et al. | Mar 2011 | B2 |
7929014 | Akimoto et al. | Apr 2011 | B2 |
7951070 | Ozaki et al. | May 2011 | B2 |
7969142 | Krueger et al. | Jun 2011 | B2 |
7985187 | Wibowo et al. | Jul 2011 | B2 |
8009891 | de Vaan | Aug 2011 | B2 |
8049777 | Akimoto et al. | Nov 2011 | B2 |
8055323 | Sawyer | Nov 2011 | B2 |
8102416 | Ito et al. | Jan 2012 | B2 |
8126241 | Zarkh et al. | Feb 2012 | B2 |
8131344 | Strommer et al. | Mar 2012 | B2 |
8170328 | Masumoto et al. | May 2012 | B2 |
8199981 | Koptenko et al. | Jun 2012 | B2 |
8200314 | Bladen et al. | Jun 2012 | B2 |
8202213 | Ito et al. | Jun 2012 | B2 |
8208708 | Homan et al. | Jun 2012 | B2 |
8219179 | Ganatra et al. | Jul 2012 | B2 |
8257346 | Qin et al. | Sep 2012 | B2 |
8267927 | Dalal et al. | Sep 2012 | B2 |
8290228 | Cohen et al. | Oct 2012 | B2 |
8298135 | Ito et al. | Oct 2012 | B2 |
8391952 | Anderson | Mar 2013 | B2 |
8417009 | Mizuno | Apr 2013 | B2 |
8494612 | Vetter et al. | Jul 2013 | B2 |
8509877 | Mori et al. | Aug 2013 | B2 |
8672836 | Higgins et al. | Mar 2014 | B2 |
8682045 | Vining et al. | Mar 2014 | B2 |
8696549 | Holsing et al. | Apr 2014 | B2 |
8698806 | Kunert et al. | Apr 2014 | B2 |
8700132 | Ganatra et al. | Apr 2014 | B2 |
8706193 | Govari et al. | Apr 2014 | B2 |
8709034 | Keast et al. | Apr 2014 | B2 |
8730237 | Ruijters et al. | May 2014 | B2 |
8768029 | Helm et al. | Jul 2014 | B2 |
8784400 | Roschak | Jul 2014 | B2 |
8798227 | Tsukagoshi et al. | Aug 2014 | B2 |
8798339 | Mielekamp et al. | Aug 2014 | B2 |
8801601 | Prisco et al. | Aug 2014 | B2 |
8819591 | Wang et al. | Aug 2014 | B2 |
8821376 | Tolkowsky | Sep 2014 | B2 |
8862204 | Sobe et al. | Oct 2014 | B2 |
20040249267 | Gilboa | Dec 2004 | A1 |
20050182295 | Soper et al. | Aug 2005 | A1 |
20080183073 | Higgins et al. | Jul 2008 | A1 |
20090012390 | Pescatore et al. | Jan 2009 | A1 |
20090030306 | Miyoshi et al. | Jan 2009 | A1 |
20100030064 | Averbuch | Feb 2010 | A1 |
20100310146 | Higgins et al. | Dec 2010 | A1 |
20100312094 | Guttman et al. | Dec 2010 | A1 |
20110237897 | Gilboa | Sep 2011 | A1 |
20110251607 | Kruecker et al. | Oct 2011 | A1 |
20120203065 | Higgins et al. | Aug 2012 | A1 |
20120249546 | Tschirren et al. | Oct 2012 | A1 |
20120280135 | Bal | Nov 2012 | A1 |
20120287238 | Onishi et al. | Nov 2012 | A1 |
20130165854 | Sandhu et al. | Jun 2013 | A1 |
20130223702 | Holsing | Aug 2013 | A1 |
20160000356 | Brown et al. | Jan 2016 | A1 |
Number | Date | Country |
---|---|---|
2012614738 | Aug 2012 | JP |
Entry |
---|
Canadian Office Action for application No. 2,945,884 dated Feb. 28, 2018 (4 pages). |
“Three-dimensional CT-Guided Bronchoscopy With a Real-Time Electromagnetic Position Sensor: A Comparison of Two Image Registration Methods”, Stephen B. Solomon, et al.; Chest, American College of Chest Physicians, US, vol. 118, Dec. 1, 2000 (5 pages). |
Extended European Search Report issued by the European Patent Office corresponding to European Patent Application No. 16195857.4; dated Mar. 13, 2017 (9 pages). |
Office Action issued by the Japanese Patent Office corresponding to Japanese Patent Application No. 2016-209580; dated Aug. 31, 2017 with English Translation (7 pages). |
Australian Examination Report for application No. 2016250341 dated Oct. 14, 2017 (3 pages). |
Japanese Office Action for application No. 2016-209580 dated Jan. 4, 2018 with English translation (5 pages). |
Pre-Appeal Examination Report issued by the Japanese Patent Office in Application No. 2016-209680, dated Jul. 20, 2018. |
Australian Examination Report No. 3 issued in Appl. No. AU 2016250341 dated Aug. 7, 2018 (3 pages). |
Japanese Office Action issued in corresponding Appl. No. JP 2018-088636 dated Mar. 22, 2019, together with English language translation (7 pages). |
Japanese Office Action issued in corresponding Appl. No. JP 2016-209580 dated Mar. 29, 2019, together with English language translation (11 pages). |
Number | Date | Country | |
---|---|---|---|
20170112411 A1 | Apr 2017 | US |
Number | Date | Country | |
---|---|---|---|
62246721 | Oct 2015 | US |