The present disclosure relates to tools for assisting surgeons during performance of medical procedures, and more particularly, to systems, devices, and methods for providing image-based guidance during medical procedures.
Pulmonary disease may cause one or more portions of a patient's lungs to lose the ability to function normally and thus require treatment. Lung treatment procedures can be very complex, and are greatly aided if the surgeon performing the procedure can visualize how airways and other structures in the patient's lungs are shaped, and where tools are located within them. Traditional pre-operative images are helpful, to an extent, with the former, but provide no guidance with regard to the latter.
Systems for displaying images and tracking tools in the patient's lungs generally rely on pre-operative data, such as computed tomography (CT) scans performed before the treatment procedure begins, sometimes days or weeks in advance. However, such systems do not account for changes that may have occurred after the CT scan was performed, or for movement occurring during the treatment procedure. Systems, devices, and methods for improving the process of identifying and visualizing a patient's lungs, as well as structures and tools located therein, are described below.
Provided in accordance with the present disclosure is a method of providing visual guidance for navigating inside a patient's chest. According to an embodiment of the present disclosure, the method includes presenting a three-dimensional (3D) model of a luminal network, generating, by an electromagnetic (EM) field generator, an EM field about the patient's chest, detecting a location of an EM sensor within the EM field, determining a position of a tool within the patient's chest based on the detected location of the EM sensor, displaying an indication of the position of the tool on the 3D model, receiving cone beam computed tomography (CBCT) image data of the patient's chest, detecting the location of the tool within the patient's chest based on the CBCT image data, and updating the indication of the position of the tool on the 3D model based on the detected location.
In another aspect of the present disclosure, the method further includes providing guidance for positioning a CBCT imaging device based on the detected location of the EM sensor.
In yet another aspect of the present disclosure, the method further includes imaging a portion of the patient's chest about the detected location of the EM sensor to generate the CBCT image data.
In a further aspect of the present disclosure, the presented 3D model includes an indication of a target location and a pathway for navigating the tool to the target location.
In yet a further aspect of the present disclosure, a first portion of the pathway to the target location is located inside of the luminal network and a second portion of the pathway to the target location is located outside of the luminal network.
In still a further aspect of the present disclosure, the CBCT data is received while the tool is navigated along the second portion of the pathway to the target location.
In yet a further aspect of the present disclosure, the second portion of the pathway to the target location is updated based on the CBCT data.
In another aspect of the present disclosure, the method further includes providing a notification that the EM field may be distorted based on the position of the CBCT imaging device.
In a further aspect of the present disclosure, the EM field generator compensates for distortion of the EM field based on the position of the CBCT imaging device.
In another aspect of the present disclosure, the method further includes providing a notification when one or more EM sensors on the patient's chest indicate that the patient has moved during the imaging.
In yet another aspect of the present disclosure, the method further includes disabling the EM field generator prior to imaging the portion of the patient's chest, and enabling the EM field generator after imaging the portion of the patient's chest.
In still another aspect of the present disclosure, the CBCT image data is generated during a first phase of the patient's respiratory cycle, and the method further includes imaging the portion of the patient's chest about the detected location of the EM sensor to generate additional CBCT image data during a second phase of the patient's respiratory cycle.
In a further aspect of the present disclosure, the method further includes monitoring the patient's respiratory cycle to determine a current phase of the patient's respiratory cycle, and displaying one of the CBCT image data or the additional CBCT image data corresponding to the current phase of the patient's respiratory cycle.
In yet a further aspect of the present disclosure, the monitoring of the patient's respiratory cycle is based on sensors located on the patient's chest.
In still a further aspect of the present disclosure, the monitoring of the patient's respiratory cycle is based on a ventilator connected to the patient.
In another aspect of the present disclosure, the method further includes monitoring the patient's respiratory cycle to determine a current phase of the patient's respiratory cycle, and displaying a notification when the current phase of the patient's respiratory cycle does not correspond to the CBCT image data or the additional CBCT image data.
In yet another aspect of the present disclosure, the method further includes identifying the tool in the CBCT image data, orienting the CBCT image data based on a position and orientation of the tool in the CBCT image data, and displaying the CBCT image data based on the orienting.
In a further aspect of the present disclosure, the tool is an ablation device, and the method further includes identifying a radiating portion of the ablation device, determining a projected ablation zone based on the position and orientation of the ablation device and ablation settings, and displaying an indication of the projected ablation zone on the CBCT image data.
In another aspect of the present disclosure, the method further includes determining an estimated ablation zone based on the position and orientation of the ablation device, the ablation settings, and an elapsed time, and displaying an indication of the estimated ablation zone on the CBCT image data.
In yet another aspect of the present disclosure, the method further includes receiving fluoroscopic image data of the patient's chest, identifying the tool within the patient's chest in the fluoroscopic image data, and overlaying the CBCT image data onto the fluoroscopic image data based on the location of the tool.
Provided in accordance with an embodiment of the present disclosure is a non-transitory computer-readable storage medium storing instructions which, when executed by a processor, cause a computing device to receive first image data of a patient's chest, identify a luminal network in the patient's chest based on the first image data, generate a three-dimensional (3D) model of the luminal network based on the first image data, detect a location of an EM sensor within an electromagnetic (EM) field generated about the patient's chest by an EM field generator, determine a position of a tool within the patient's chest based on the detected location of the EM sensor, cause a display device to display an indication of the position of the tool on the 3D model, receive second image data of the patient's chest, detect a location of the tool within the patient's chest based on the second image data, and update the indication of the position of the tool on the 3D model based on the detected location.
Any of the above aspects and embodiments of the present disclosure may be combined without departing from the scope of the present disclosure.
Various aspects and features of the present disclosure are described hereinbelow with reference to the drawings, wherein:
The present disclosure is directed to devices, systems, and methods for using cone beam computed tomography (CBCT) images while navigating tools and performing treatment in a patient's lungs. More particularly, the disclosure relates to integrating CBCT image data with a lung navigation system to update and/or improve localization and visualization of treatment targets and tools within the patient's lungs, and to update image-based guidance for a treatment procedure. The CBCT image data may be displayed in conjunction with or alongside a digital reconstruction, such as a three-dimensional (3D) model or map, of the patient's lungs, as well as image data from other imaging modalities including computed tomography (CT) images, magnetic resonance (MR) images, positron emission tomography (PET) images, fluoroscopic and other X-ray type images, and/or ultrasound images. The 3D model may be constructed based on preoperative image data from one or more of the aforementioned imaging modalities. Alternatively or in addition, a CBCT scan performed at the start of the treatment procedure, which may also be used for registration purposes as further described below, may further be used for constructing and/or enhancing the 3D model.
To create the 3D model, a preoperative segmental and subsegmental delineation and extrapolation may be performed based on image data of the patient's lungs to create a visual representation of the patient's lungs, including lumens, pleural surfaces and fissures of the patient's lungs, and/or tumors or other aberrant structures that may be present in the patient's lungs. The delineation may be performed using one or more software applications executing on a computer. The application may generate the 3D model of the patient's lungs based on radiographically obtained image data, noted above, to use for the visual representation of the patient's lungs.
The 3D model may show, among other things, the lumens, pleural surfaces and fissures, and other structures of the patient's lungs. The image data may further be processed to identify one or more treatment targets, such as tumors or other aberrant structures, in the patient's lungs. For example, the application may identify the locations of lumens, such as airways, blood vessels, and/or lymphatic structures from the image data, and, based on the locations of the lumens, determine where lung fissures are located and a degree of completeness of the fissures, as well as determine the locations of the pleural surfaces and/or treatment targets. The 3D model and image data may then be viewed by a clinician and/or surgeon to plan a medical treatment procedure, such as a surgical or interventional procedure. The 3D model and/or treatment plan may further be stored for later viewing during the treatment procedure in an operating room or the like.
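Purely as an illustration of the kind of processing involved, the Python sketch below segments a synthetic airway by region growing from a seed voxel in the trachea; the Hounsfield threshold, 6-connectivity, and toy volume are assumptions and not the delineation algorithm of the disclosure.

```python
# A toy region-growing airway segmentation: flood-fill voxels darker than an
# air-like Hounsfield threshold, keeping only the connected component that
# contains a seed placed in the trachea. All values here are illustrative.
import numpy as np
from scipy.ndimage import label

def segment_airways(volume_hu, seed, air_threshold_hu=-900.0):
    mask = volume_hu < air_threshold_hu          # candidate air-filled voxels
    labeled, _ = label(mask)                     # 6-connected components in 3D
    return labeled == labeled[seed]              # seed must lie in an air voxel

vol = np.full((64, 64, 64), 40.0)                # soft tissue (~40 HU) everywhere
vol[10:54, 30:34, 30:34] = -1000.0               # synthetic air-filled "airway"
airway_mask = segment_airways(vol, seed=(32, 32, 32))
print(int(airway_mask.sum()))                    # 704 voxels in the toy airway
```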
As described further below, the treatment plan may include identified locations for one or more treatment targets, such as tumors, lesions, or other aberrant structures identified in the image data, and a pathway between the patient's trachea and each of the treatment targets. The pathway may include a portion located inside lumens, such as airways, of the patient's lungs, and a portion located outside of the airways of the patient's lungs. An “exit point” may mark the transition point between the portion of the pathway located inside the patient's airways and the portion of the pathway located outside of the patient's airways.
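As an illustrative aside, such a two-part pathway might be represented as in the following sketch; the class and field names (`PlannedPathway`, `airway_portion`, `exit_point`, `tissue_portion`) are hypothetical and not drawn from the disclosure.

```python
# A minimal sketch of one way to represent a planned pathway split at an
# "exit point"; names are hypothetical placeholders.
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float, float]    # (x, y, z) in model coordinates

@dataclass
class PlannedPathway:
    airway_portion: List[Point]       # trachea -> exit point, inside airways
    exit_point: Point                 # transition through the airway wall
    tissue_portion: List[Point]       # exit point -> target, outside airways
    target: Point

    def full_route(self) -> List[Point]:
        # Concatenated route for display on the 3D model.
        return self.airway_portion + self.tissue_portion

pathway = PlannedPathway(
    airway_portion=[(0.0, 0.0, 0.0), (5.0, 2.0, 1.0)],
    exit_point=(5.0, 2.0, 1.0),
    tissue_portion=[(5.0, 2.0, 1.0), (7.5, 3.0, 1.5)],
    target=(7.5, 3.0, 1.5),
)
print(len(pathway.full_route()))      # 4 waypoints across both portions
```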
During the treatment procedure, the 3D model may be displayed, as further described below, to assist the clinician in navigating one or more tools to the treatment target. The 3D model may include an indicator of a tracked position of the tool inside the patient's lungs. At various times during the treatment procedure, additional image data, such as the CBCT data described above, may be collected to show a real-time location of the tool and/or the treatment target in the patient's lungs. For example, after the tool passes the "exit point" and is located outside of the patient's airways, or at any other time of the clinician's choosing, a CBCT scan may be performed and the collected data processed to identify the tool and/or the treatment target. The indicator of the tracked position of the tool on the 3D model may then be updated based on the image data collected from the CBCT scan, thereby showing a confirmed location of the tool and/or the treatment target. The image data collected from the CBCT scan may further show, and thus provide a software application the ability to track, the location of the tool during various phases of the patient's respiration cycle. While the 3D model may be generated based on image data acquired while the patient was in a particular phase of the respiration cycle, e.g., full breath hold, the patient will not maintain that phase of the respiration cycle for the entire duration of the treatment procedure. Thus, acquiring image data during the treatment procedure, during various phases of the patient's respiration cycle, particularly during normal tidal volume breathing, may provide a clearer and more accurate visualization of the location of the tool and the treatment target inside the patient's lungs, as well as the position of the treatment tool relative to the treatment target. As such, the intra-procedural CBCT scan may be used to confirm placement of the tool at the treatment target.
The methods, systems, devices, and computer-readable media described herein are useful in various planning and/or navigation contexts for treatment procedures performed on the patient's lungs and surrounding tissue. For example, in an embodiment in which a clinician is performing treatment of an area of the patient's lungs, the methods and systems may provide the clinician with various views of the patient's lungs and the location of the tool and treatment target therein. Additionally, as will be described in further detail below, the methods and systems may provide the clinician with the ability to update the indicated location of the tool and/or treatment target on the 3D model at a time of the clinician's choosing, by performing one or more intra-operative CBCT scans to collect image data about the location of the tool and/or treatment target in the patient's lungs. These and other aspects of the present disclosure are detailed hereinbelow.
An electromagnetic navigation (EMN) system, such as the ELECTROMAGNETIC NAVIGATION BRONCHOSCOPY® system currently sold by Medtronic PLC under the brand name SUPERDIMENSION®, may be used for planning and performing treatment of an area of a patient's lungs. Generally, in an embodiment, the EMN system may be used in planning treatment of an area of the patient's lungs by identifying the locations of one or more treatment targets in the patient's lungs, selecting one or more of the treatment targets as a target location, determining a pathway to the target location, navigating a positioning assembly to the target location, and navigating a variety of tools to the target location via the positioning assembly. The EMN system may be configured to display various views of the patient's lungs, including the aforementioned image data and 3D model.
With reference to
Bronchoscope 50 is configured for insertion through the patient's mouth and/or nose into the patient's airways. Bronchoscope 50 includes a source of illumination and a video imaging system (not explicitly shown) and is coupled to monitoring equipment 30, for example, a video display, for displaying the video images received from the video imaging system of bronchoscope 50. In an embodiment, bronchoscope 50 may operate in conjunction with a catheter guide assembly 90. Catheter guide assembly 90 includes a locatable guide (LG) 92 and an extended working channel (EWC) 96 configured for insertion through a working channel of bronchoscope 50 into the patient's airways (although the catheter guide assembly 90 may alternatively be used without bronchoscope 50). Catheter guide assembly 90 includes a handle 91 connected to EWC 96, and which can be manipulated by rotation and compression to steer LG 92 and EWC 96. EWC 96 is sized for placement into the working channel of bronchoscope 50. In the operation of catheter guide assembly 90, LG 92, including an EM sensor 94, is inserted into EWC 96 and locked into position such that EM sensor 94 extends a desired distance beyond a distal tip 93 of EWC 96. The location of EM sensor 94, and thus distal tip 93 of EWC 96, within an EM field generated by EM field generator 76, can be derived by tracking module 72 and computing device 80. For a more detailed description of catheter guide assembly 90, reference is made to commonly-owned U.S. Pat. No. 9,247,992, entitled “MICROWAVE ABLATION CATHETER AND METHOD OF UTILIZING THE SAME”, filed on Mar. 15, 2013, by Ladtkow et al., the entire contents of which are hereby incorporated by reference.
LG 92 and EWC 96 are selectively lockable relative to one another via a locking mechanism 99. A six degrees-of-freedom EM tracking system 70, e.g., similar to those disclosed in U.S. Pat. No. 6,188,355 and published PCT Application Nos. WO 00/10456 and WO 01/67035, entitled "WIRELESS SIX-DEGREE-OF-FREEDOM LOCATOR", filed on Dec. 14, 1998 by Gilboa, the entire contents of each of which are incorporated herein by reference, or any other suitable positioning measuring system, is utilized for performing navigation, although other configurations are also contemplated.
EM tracking system 70 may be configured for use with catheter guide assembly 90 to track a position of EM sensor 94 as it moves in conjunction with EWC 96 through the airways of the patient, as detailed below. In an embodiment, EM tracking system 70 includes a tracking module 72, a plurality of reference sensors 74, and an EM field generator 76. As shown in
Although EM sensor 94 is described above as being included in LG 92, it is also envisioned that EM sensor 94 may be embedded or incorporated within a treatment tool, such as a biopsy tool 62 and/or an ablation tool 64, where the treatment tool may alternatively be utilized for navigation without need of LG 92 or the necessary tool exchanges that use of LG 92 requires. EM sensor 94 may also be embedded or incorporated within EWC 96, such as at a distal portion of EWC 96, thereby enabling tracking of the distal portion of EWC 96 without the need for a separate LG 92.
According to an embodiment, treatment tools 62, 64 are configured to be insertable into catheter guide assembly 90 following navigation to a target location and removal of LG 92. Biopsy tool 62 may be used to collect one or more tissue samples from the target location, and in an embodiment, is further configured for use in conjunction with tracking system 70 to facilitate navigation of biopsy tool 62 to the target location, and tracking of a location of biopsy tool 62 as it is manipulated relative to the target location to obtain the tissue sample. Ablation tool 64 is configured to be operated with a generator 66, such as a radio frequency generator or a microwave generator, and may include any of a variety of ablation tools and/or catheters, examples of which are more fully described in U.S. Pat. Nos. 9,259,269; 9,247,993; 9,044,254; and 9,370,398, and U.S. Patent Application Publication No. 2014/0046211, all entitled "MICROWAVE ABLATION CATHETER AND METHOD OF USING THE SAME", filed on Mar. 15, 2013, by Ladtkow et al., the entire contents of each of which are incorporated herein by reference. Though shown as a biopsy tool and microwave ablation tool in
A radiographic imaging device 20, such as a C-arm imaging device capable of performing a CBCT scan of at least a portion of the patient's lungs, may be used in conjunction with EMN system 100. Imaging device 20 may further be capable of performing fluoroscopic scans of the patient's lungs. As shown in
Computing device 80 includes software and/or hardware, such as application 81, used to facilitate the various phases of an EMN procedure, including generating the 3D model, identifying a target location, planning a pathway to the target location, registering the 3D model with the patient's actual airways, navigating to the target location, and performing treatment at the target location. For example, computing device 80 utilizes data acquired from a CT scan, CBCT scan, magnetic resonance imaging (MRI) scan, positron emission tomography (PET) scan, and/or any other suitable imaging modality to generate and display the 3D model of the patient's airways, to enable identification of a target location on the 3D model (automatically, semi-automatically or manually) by analyzing the image data and/or 3D model, and allow for the determination and selection of a pathway through the patient's airways to the target location. While the image data may have gaps, omissions, and/or other imperfections included in the image data, the 3D model is a smooth representation of the patient's airways, with any such gaps, omissions, and/or imperfections in the image data filled in or corrected. The 3D model may be presented on a display monitor associated with computing device 80, or in any other suitable fashion. An example of the planning software described herein can be found in U.S. Patent Publication Nos. 2014/0281961, 2014/0270441, and 2014/0282216, filed by Baker et al. on Mar. 15, 2013, and entitled “PATHWAY PLANNING SYSTEM AND METHOD”, the contents of all of which are incorporated herein by reference. Further examples of the planning software can be found in commonly assigned U.S. Patent Publication No. 2016/0000302, entitled “SYSTEM AND METHOD FOR NAVIGATING WITHIN THE LUNG”, filed on Jun. 29, 2015, by Brown et al., the contents of which are incorporated herein by reference.
Using computing device 80, various views of the image data and/or 3D model may be displayed to and manipulated by a clinician to facilitate identification of the target location. As noted above, the target location may be a site within the patient's lungs where treatment is to be performed. For example, the treatment target may be located in lung tissue adjacent to an airway. The 3D model may include, among other things, a model airway tree corresponding to the actual airways of the patient's lungs, and show the various passages, branches, and bifurcations of the patient's actual airway tree. Additionally, the 3D model may include lesions, markers, blood vessels and vascular structures, lymphatic vessels and structures, organs, other physiological structures, and/or a 3D rendering of the pleural surfaces and fissures of the patient's lungs. Some or all of the aforementioned elements may be selectively displayed, such that the clinician may choose which elements should be displayed when viewing the 3D model.
After identifying the target location, application 81 may determine a pathway between the patient's trachea and the target location via the patient's airways. In instances where the target location is located in lung tissue that is not directly adjacent an airway, at least a portion of the pathway will be located outside of the patient's airways to connect an exit point on an airway wall to the target location. In such instances, LG 92 and EWC 96 will first be navigated along a first portion of the pathway through the patient's airways to the exit point on the airway wall. LG 92 may then be removed from EWC 96 and an access tool, such as a piercing or puncture tool, inserted into EWC 96 to create an opening in the airway wall at the exit point. EWC 96 may then be advanced through the airway wall into the parenchyma surrounding the airways. The access tool may then be removed from EWC 96 and LG 92 and/or tools 62, 64 reinserted into EWC 96 to navigate EWC 96 along a second portion of the pathway outside of the airways to the target location.
During a procedure, EM sensor 94, in conjunction with tracking system 70, enables tracking of EM sensor 94 (and thus distal tip 93 of EWC 96 or tools 62, 64) as EM sensor 94 is advanced through the patient's airways following the pathway planned during the planning phase. As an initial step of the procedure, the 3D model is registered with the patient's actual airways to enable application 81 to display an indication of the location of EM sensor 94 on the 3D model corresponding to the location of EM sensor 94 within the patient's airways.
One potential method of registration involves performing a survey of the patient's lungs by navigating LG 92 into each lobe of the patient's lungs to at least the second bifurcation of the airways of that lobe. The position of LG 92 is tracked during this registration phase, and the 3D model is iteratively updated based on the tracked position of the locatable guide within the actual airways of the patient's lungs. This registration process is described in commonly-owned U.S. Patent Application Publication No. 2011/0085720, entitled “AUTOMATIC REGISTRATION TECHNIQUE,” filed on May 14, 2010, by Barak et al., and U.S. Patent Publication No. 2016/0000356, entitled “REAL-TIME AUTOMATIC REGISTRATION FEEDBACK”, filed on Jul. 2, 2015, by Brown et al., the contents of each of which are incorporated herein by reference. While the registration process focuses on aligning the patient's actual airways with the airways of the 3D model, registration also ensures that the position of vascular structures, pleural surfaces, and fissures of the lungs are accurately determined.
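For readers unfamiliar with point-set registration, the sketch below shows one generic approach, an SVD-based rigid fit often called the Kabsch algorithm, for aligning tracked survey points with corresponding model points; it is not asserted to be the algorithm of the cited publications.

```python
# Generic rigid (Kabsch/SVD) point-set registration: find R, t that best map
# EM-tracked survey points onto corresponding 3D-model points. This is a
# textbook illustration, not the registration method of the cited patents.
import numpy as np

def rigid_register(tracked, model):
    """tracked, model: (N, 3) arrays of corresponding points."""
    ct, cm = tracked.mean(axis=0), model.mean(axis=0)
    H = (tracked - ct).T @ (model - cm)           # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cm - R @ ct
    return R, t                                   # model ~ R @ tracked + t

# Toy check: recover a known rotation and translation from noiseless points.
rng = np.random.default_rng(0)
pts = rng.normal(size=(20, 3))
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
R, t = rigid_register(pts, pts @ R_true.T + t_true)
assert np.allclose(R, R_true, atol=1e-6) and np.allclose(t, t_true, atol=1e-6)
```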
Another potential method of registration uses image data from a CBCT scan performed at the start of the treatment procedure to generate the 3D model, with the patient remaining on table 40 while the clinician performs the above-described planning phase. Because the scan is taken with reference sensors 74 placed on the patient, the anatomy of the patient relative to reference sensors 74 is known, and performing registration by using the lung survey technique described above becomes unnecessary. Additionally, features and sensors in EM field generator 76 under the patient may also be used to help ensure the target location is placed within the ENB field. The clinician may then start the navigation phase of the procedure without performing the above-described survey of the patient's lungs, because the patient will still be in substantially the same position as when the image data on which the 3D model is based were obtained. Thus, application 81 may extrapolate sufficient data points from the position of EM sensor 94 within the EM field while LG 92 is navigated along the planned pathway to register the 3D model to the patient's actual airways while the navigation phase is in process.
At various times during the procedure, the clinician may request that additional CBCT scans be performed on the patient. The additional CBCT scans may be directed at a particular location in the patient's body, such as an area of the patient's lungs about the position of LG 92, for which the clinician wants additional image data. For example, the additional image data may be used to confirm the position of EM sensor 94 (representing the location of LG 92 and/or tool 62, 64) and/or the target location within the patient's lungs. Application 81 may receive the image data acquired by the additional CBCT scan and process the additional image data to identify the position of EM sensor 94 and/or the target location within the patient's lungs. Application 81 may then update the indicator of the position of EM sensor 94 on the 3D model based on the additional CBCT image data if the additional image data indicates that the position displayed based on the original image data is incorrect. In some embodiments, the additional CBCT scans may be performed based on the patient's breathing or respiratory cycle, such as to acquire image data during different phases of the patient's respiratory cycle, as further described below. In addition to CBCT scans, the clinician may also request that fluoroscopic scans be performed at various times during the procedure. Image data acquired from the fluoroscopic scans may further be used to assist the clinician in navigating and positioning LG 92 about the target location.
Turning now to
With reference to
Starting with
After receiving the image data, application 81 processes the image data, at step S304, to identify a luminal network in the image data. The luminal network may be the patient's airways, blood vessels, and/or lymphatic vessels in the patient's lungs. Application 81 may further process the image data to identify other structures, such as the pleura and fissures of the patient's lungs, other organs and critical structures, and/or aberrant structures in and/or around the patient's lungs. Application 81 then, at step S306, generates a 3D model based on the processed image data. Based on the image data and/or the 3D model, at least one treatment target is identified, either automatically by application 81, semi-automatically with input from the clinician, or manually by the clinician. After identifying the treatment target, a target location representative of the identified location of the treatment target is marked on the 3D model, and application 81 determines a pathway between the patient's trachea and the target location via the patient's airways.
As noted above, the pathway may include various portions, including at least one portion located inside the patient's airways running between the trachea and an exit point in an airway wall proximate the target location, and at least one portion located outside the patient's airways running from the exit point to the target location. The pathway represents a recommended route along which LG 92 or other tool including sensor 94 should be navigated through the patient's airways and, as described further below, after reaching the exit point, through the tissue and space surrounding the patient's airways. Application 81 displays the pathway and the target location on the 3D model at step S308.
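One generic way to compute the airway portion of such a pathway is to treat the segmented airway tree as a weighted graph and search it; the Dijkstra sketch below, with invented branch names and edge lengths, is an illustration rather than the planning algorithm of the cited publications.

```python
# Treat the segmented airway tree as a weighted graph and run Dijkstra from
# the trachea; branch names and edge lengths are invented for illustration.
import heapq

def shortest_path(graph, start, goal):
    """graph: {node: [(neighbor, length_mm), ...]}; returns node list."""
    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

airways = {                               # toy airway tree (lengths in mm)
    "trachea": [("left_main", 50.0), ("right_main", 40.0)],
    "right_main": [("RB1", 25.0), ("RB2", 30.0)],
    "RB2": [("exit_point", 12.0)],
}
print(shortest_path(airways, "trachea", "exit_point"))
# ['trachea', 'right_main', 'RB2', 'exit_point']
```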
Next, the navigation phase of the treatment procedure commences. As an initial task, the 3D model must be registered to the actual airways of the patient's lungs. As described above, there are various methods of registration that may be used for this purpose, including the lung survey method described above. Alternatively, as also mentioned above, if the 3D model is generated based on image data from a CBCT scan performed at the start of the treatment procedure, and the patient remains in substantially the same position on table 40 during the above-described planning phase, a lung survey may not be necessary, because application 81 may collect sufficient data points regarding the position of LG 92 in the patient's airways during the initial portion of the navigation phase to register the 3D model to the patient's airways while navigation is in progress.
To start this process, at step S310, EM field generator 76 generates an EM field about the patient's chest. An EM sensor 94, whether included in LG 92, tools 62, 64, and/or directly in EWC 96, is then inserted into the patient's airways, and a location of EM sensor 94 in the EM field is detected by tracking system 70 at step S311. The detected location of EM sensor 94 is relayed to application 81 to determine, at step S312, a position of LG 92, tools 62, 64, and/or EWC 96 based on the detected location of EM sensor 94. As mentioned herein, the detected position of EM sensor 94 may be reflective of whichever tool EM sensor 94 is included in, such as LG 92, tools 62, 64, and/or EWC 96, but for purposes of brevity and ease of description, an example wherein LG 92 is used for navigation is described hereinafter. However, those skilled in the art will realize that any of the other aforementioned tools and devices including an EM sensor 94 could be substituted for LG 92.
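Conceptually, once registration yields a rigid transform, each tracked sensor reading can be mapped into model coordinates for display. The minimal sketch below assumes a transform (R, t) like the one fit in the earlier registration sketch; the values are placeholders.

```python
# Map an EM-sensor reading in field coordinates into model coordinates for
# drawing the tool indicator; R and t are stand-ins purely for illustration.
import numpy as np

def sensor_to_model(sensor_xyz, R, t):
    """Map a tracked EM-sensor position into 3D-model coordinates."""
    return R @ np.asarray(sensor_xyz, dtype=float) + t

R = np.eye(3)                                  # stand-in registration rotation
t = np.array([1.0, -2.0, 0.5])                 # stand-in registration offset
indicator_xyz = sensor_to_model([10.0, 4.0, 3.0], R, t)
print(indicator_xyz)                           # position to draw on the 3D model
```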
Next, at step S313, application 81 displays an indication of the determined position of LG 92 on the 3D model. For example, as shown in
As noted above, the clinician may, at various times during the procedure, request that an intra-procedural CBCT scan be performed to verify the determined position of LG 92. For example, the clinician may request that a CBCT scan be performed when LG 92 reaches the exit point where the pathway moves from within the patient's airways to outside of the patient's airways. The clinician may also request that a CBCT scan be performed after LG 92 has been navigated through the airway wall and towards the target location outside of the patient's airways. Further, the clinician may request that a CBCT scan be performed to confirm the location of LG 92 when LG 92 is navigated proximate the target location, such as to confirm placement of LG 92 at the treatment target. Thus, application 81 determines, at step S314, whether a request, such as a button press or other user input, for a CBCT scan has been received. If a request for a CBCT scan has not been received, application 81 continues tracking the location of EM sensor 94 at step S315, whereafter processing returns to step S311. However, if application 81 determines that a request for a CBCT scan has been received, processing proceeds to step S320.
Turning now to
If imaging device 20 is within the EM field, application 81 may provide a notification, at step S322, informing the clinician that the EM field may be distorted, such as by metal and/or other EM components included in imaging device 20 being present in the EM field. The notification may be a visual and/or audio notification, and may alert the clinician that the displayed location of LG 92 may be incorrect due to the distortion of the EM field. Application 81 and/or tracking system 70 may then, at step S323, compensate for the distortion of the EM field, such as by adjusting the EM field and/or the displayed position of LG 92 on the 3D model based on the distortion of the EM field.
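A minimal sketch of such a notification and compensation step follows; the field radius, proximity test, and constant correction offset are invented placeholders for whatever distortion model a real tracking system would calibrate.

```python
# Invented proximity test and correction: warn when a metal-bearing imaging
# device encroaches on the tracking volume, then apply a stand-in offset in
# place of a real calibrated distortion model. All constants are assumptions.
import numpy as np

FIELD_CENTER_MM = np.array([0.0, 0.0, 0.0])
FIELD_RADIUS_MM = 300.0

def imaging_device_in_field(device_xyz):
    inside = np.linalg.norm(np.asarray(device_xyz) - FIELD_CENTER_MM) < FIELD_RADIUS_MM
    if inside:
        print("WARNING: imaging device inside EM field; "
              "displayed tool position may be distorted.")
    return inside

def compensate(sensor_xyz, correction_mm=(0.0, 0.0, 1.5)):
    # Stand-in for a calibrated distortion model (e.g., a lookup table).
    return np.asarray(sensor_xyz, dtype=float) + np.asarray(correction_mm)

if imaging_device_in_field([120.0, 40.0, -80.0]):
    print(compensate([10.0, 4.0, 3.0]))
```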
Thereafter, or if application 81 determines at step S321 that imaging device 20 is not in the EM field, processing proceeds to step S324 where application 81 determines whether the imaging device is ready for imaging. In some embodiments, application 81 may be able to actively track the location of imaging device 20, such as by sensors included in imaging device 20. Alternatively, application 81 may determine whether imaging device 20 is ready for imaging based on input from the clinician. If application 81 determines that imaging device 20 is not yet ready for imaging, processing returns to step S320. Alternatively, if application 81 determines that imaging device 20 is ready for imaging, processing proceeds to step S325, where application 81 determines a current phase of the patient's respiratory cycle. The current phase of the patient's respiratory cycle may be determined based on data received from sensors, such as reference sensors 74 located on the patient's chest. Further information on determination of a patient's respiratory cycle and compensation for movement occurring during the patient's respiratory cycle may be found in commonly-owned co-pending U.S. patent application Ser. No. 15/254,141, entitled RESPIRATION MOTION STABILIZATION FOR LUNG MAGNETIC NAVIGATION SYSTEM, filed on Sep. 1, 2016, by Koyrakh et al., the entire contents of which are incorporated herein by reference. Alternatively, or in addition, the current phase of the patient's respiratory cycle may be determined based on data received from a ventilator coupled to the patient. The clinician may request that the CBCT scan be performed during a particular desired phase of the patient's respiratory cycle, such as full breath hold, full exhale, etc. Therefore, at step S326, application 81 determines whether the current phase of the patient's respiratory cycle corresponds to the desired phase of the patient's respiratory cycle requested by the clinician. If application 81 determines that the current phase of the patient's respiratory cycle does not correspond to the desired phase, the method proceeds to step S327, where application 81 waits for the patient's respiratory cycle to enter the desired phase. Thereafter, processing returns to step S325 to again determine the current phase of the patient's respiratory cycle. If application 81 determines that the current phase of the patient's respiratory cycle corresponds to the desired phase, processing proceeds to step S330.
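The phase gating described above can be illustrated with a toy displacement signal: classify the instantaneous phase from the signal's slope and wait for the desired phase before triggering the scan. Everything in the sketch, signal, threshold, and phase labels, is a synthetic assumption.

```python
# Toy respiratory-phase gating from a chest-displacement signal: classify the
# phase from the local slope, then wait for the desired phase.
import numpy as np

def classify_phase(displacement, i, eps=0.01):
    """Return 'inhale', 'exhale', or 'pause' at sample index i >= 1."""
    slope = displacement[i] - displacement[i - 1]
    if slope > eps:
        return "inhale"
    if slope < -eps:
        return "exhale"
    return "pause"

chest = np.sin(np.linspace(0, 2 * np.pi, 200))   # one synthetic breath

desired = "exhale"
for i in range(1, len(chest)):
    if classify_phase(chest, i) == desired:
        print(f"desired phase '{desired}' reached at sample {i}; trigger CBCT")
        break
```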
At step S330, application 81 causes tracking system 70 to disable EM field generator 76 to avoid interference with imaging device 20 during the imaging process. Thereafter, at step S331, the CBCT imaging is performed. The CBCT imaging may be performed manually by the clinician interacting with imaging device 20 to perform the CBCT imaging. Alternatively, the CBCT imaging may be performed automatically or semi-automatically via application 81 directly or indirectly controlling imaging device 20. After the CBCT imaging is complete, detected either based on input from the clinician or based on a signal received from imaging device 20, processing proceeds to step S332 where application 81 causes tracking system 70 to re-enable EM field generator 76.
Thereafter, application 81 determines, at step S333, whether the patient moved during the CBCT imaging process. The determination may be based on data received from sensors, such as reference sensors 74, indicating the patient's current position relative to the patient's position prior to the CBCT imaging, and/or indicating movement of the patient during the CBCT imaging process. If application 81 determines that the patient moved during the CBCT imaging process, application 81 determines, at step S334, whether an amount of the movement is within a predetermined threshold. For example, minor movement may be insufficient to affect the CBCT image data collected during the CBCT imaging process, while more significant movement may cause the CBCT image data to be unusable. Therefore, if application 81 determines that the movement is not within the predetermined threshold, and thus exceeds the predetermined threshold, application 81 may mark the CBCT image data received from imaging device 20 after the CBCT imaging as unusable, and reinitiate the CBCT imaging process at step S335, whereafter processing returns to step S324. Alternatively, if application 81 determines that the movement is within the predetermined threshold, application 81 may provide a notification at step S336, indicating that the patient moved but that the movement was within the predetermined threshold. Thereafter, or if application 81 determined at step S333 that the patient did not move during the CBCT imaging process, processing proceeds to step S340.
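A minimal version of the movement check at steps S333 and S334 might compare reference-sensor positions captured before and after the scan against a threshold; the 5 mm value below is an arbitrary placeholder, not a threshold specified by the disclosure.

```python
# Compare reference-sensor positions before and after the scan against a
# movement threshold; the threshold value is an assumption.
import numpy as np

MOVEMENT_THRESHOLD_MM = 5.0

def movement_check(before, after):
    """before/after: (N, 3) reference-sensor positions in mm."""
    worst = float(np.linalg.norm(np.asarray(after) - np.asarray(before), axis=1).max())
    return worst, worst <= MOVEMENT_THRESHOLD_MM

before = np.array([[0.0, 0.0, 0.0], [50.0, 0.0, 0.0], [0.0, 80.0, 0.0]])
after = before + np.array([1.2, 0.0, -0.8])     # small shift during the scan
worst, ok = movement_check(before, after)
print(f"max displacement {worst:.1f} mm -> {'usable' if ok else 'rescan'}")
```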
Returning now to
Turning now to
If application 81 determined at step S351 that CBCT image data corresponding to the current phase of the patient's respiratory cycle is available, application 81 selects such CBCT image data at step S352. At step S355, application 81 identifies LG 92 in the CBCT image data, similar to the identification of LG 92 at step S341. Application 81 further determines a position and orientation of LG 92 based on the CBCT image data, and orients the CBCT image data based on the determined position and orientation of LG 92 at step S356. Application 81 then displays the CBCT image data at step S357, as shown in view 430 of
Turning now to
Application 81 then, at step S363, displays the CBCT image data in conjunction with the fluoroscopic image data. In an embodiment, application 81 may display the CBCT image data as an overlay onto the fluoroscopic image data, as shown in view 420 of
Thereafter, at step S364, application 81 again determines whether LG 92 has been placed at the target location. As with step S358 described above, the determination may be based on processing of the CBCT image data, tracking data received from tracking system 70, and/or input provided by the clinician. If application 81 determines that LG 92 has not been placed at the target location, or if application 81 cannot determine that LG 92 has been placed at the target location, processing returns to step S314 for further imaging and/or navigation.
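The overlay display at step S363 can be illustrated by translating a CBCT-derived projection so that the tool positions detected in the two images coincide, and then alpha-blending; a real system would solve a full 2D/3D pose rather than the pure translation assumed in this sketch.

```python
# Translate a CBCT-derived 2D projection so the tool pixel found in it lands
# on the tool pixel found in the fluoroscopic frame, then alpha-blend.
import numpy as np

def overlay(fluoro, cbct_proj, tool_fluoro_rc, tool_cbct_rc, alpha=0.4):
    out = fluoro.astype(float).copy()
    dr = tool_fluoro_rc[0] - tool_cbct_rc[0]     # row shift aligning the tools
    dc = tool_fluoro_rc[1] - tool_cbct_rc[1]     # column shift
    h, w = cbct_proj.shape
    r0, c0 = max(dr, 0), max(dc, 0)
    r1, c1 = min(dr + h, out.shape[0]), min(dc + w, out.shape[1])
    src = cbct_proj[r0 - dr:r1 - dr, c0 - dc:c1 - dc]
    out[r0:r1, c0:c1] = (1 - alpha) * out[r0:r1, c0:c1] + alpha * src
    return out

fluoro = np.zeros((256, 256))                    # stand-in fluoroscopic frame
cbct_projection = np.ones((64, 64))              # stand-in rendered CBCT view
blended = overlay(fluoro, cbct_projection,
                  tool_fluoro_rc=(120, 140), tool_cbct_rc=(32, 32))
print(blended[120, 140])                         # 0.4 at the aligned tool pixel
```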
Alternatively, if application 81 determines at either step S358 or step S364 that LG 92 has been placed at the target location, processing proceeds to step S359 (
Turning now to
After identifying the radiating portion of ablation tool 64, application 81, at step S371, determines a projected ablation zone based on the location of ablation tool 64, and thus the radiating portion of ablation tool 64, inside the patient's chest, and configuration settings for the ablation procedure. As part of the above-described planning phase, and/or after placement of LG 92 at the target location or at any point prior to step S371, the clinician may enter configuration settings, such as time, temperature, wattage, etc., for the ablation procedure into computing device 80. Application 81 may then, at step S371, use such configuration settings to determine a projected ablation zone representing a maximum area around the radiating portion of ablation tool 64 that will be ablated according to the configuration settings.
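As an illustration of step S371, the projected zone might be derived from the configuration settings through a calibrated lookup anchored at the radiating portion; the table values below are invented and have no clinical basis.

```python
# Hypothetical mapping from generator settings to a projected spherical zone
# anchored at the radiating portion. Table values are invented placeholders.
ZONE_TABLE_MM = {            # (watts, seconds) -> projected sphere radius, mm
    (45, 300): 12.0,
    (45, 600): 16.0,
    (100, 300): 20.0,
    (100, 600): 27.0,
}

def projected_zone(radiating_tip_xyz, watts, seconds):
    return {"center": radiating_tip_xyz, "radius_mm": ZONE_TABLE_MM[(watts, seconds)]}

zone = projected_zone((7.5, 3.0, 1.5), watts=100, seconds=600)
print(f"render sphere r={zone['radius_mm']} mm at {zone['center']}")
```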
Application 81 may then, at step S372, display an indicator of the projected ablation zone on the CBCT image data, the fluoroscopic image data, and/or the 3D model. At step S373, application 81 determines whether the clinician has approved the projected ablation zone. For example, the projected ablation zone may be represented by a sphere surrounding ablation tool 64, with an anchor point of the sphere being the radiating portion of ablation tool 64. Based on the displayed projected ablation zone, the clinician may decide to adjust the configuration settings for the ablation procedure and/or the location of ablation tool 64. For example, based on the displayed projected ablation zone, the clinician may determine that the projected ablation zone does not sufficiently cover the treatment target, and thus choose to adjust the configuration settings or the location of ablation tool 64. The clinician may input a decision to proceed with the ablation procedure or return to navigation into computing device 80. If application 81 determines that the clinician does not approve of the displayed projected ablation zone, processing returns to step S314 for further navigation and/or imaging. Alternatively, if application 81 determines that the clinician approves of the displayed projected ablation zone, processing proceeds to step S374, where application 81 determines whether the ablation process has started. For example, application 81 may detect that an activation button on ablation tool 64 has been depressed. In another embodiment, ablation tool 64 may be controlled by application 81 based on input received from the clinician. Thus, if application 81 determines that the ablation process has not started, processing returns to step S372. Alternatively, if application 81 determines that the ablation process has started, processing proceeds to step S375.
At step S375, application 81 determines an estimated ablation zone. The determination of the estimated ablation zone may be based on the configuration settings for the ablation procedure, the location of the radiating portion of ablation tool 64, an elapsed time since the ablation process was started, and/or additional image data received during the ablation process. For example, application 81 may determine that, based on the configuration settings, after a particular amount of time has elapsed since the ablation process was started, a particular area around the radiating portion of ablation tool 64 would be expected to have been ablated. In another embodiment, additional CBCT and/or fluoroscopic scans may be performed during the ablation process to provide image data showing the progress of the ablation process. Application 81 may then, at step S376, display an indicator of such area, such as by a sphere or other shape around ablation tool 64 in the CBCT image data, the fluoroscopic image data, and/or the 3D model.
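Step S375's time-dependent estimate could, for example, grow the zone toward the projected maximum as a saturating function of elapsed time; the exponential form and time constant below are assumptions, not a model given in the disclosure.

```python
# Grow the estimated zone toward the projected maximum as a saturating
# function of elapsed time; the 120 s time constant is an assumption standing
# in for whatever growth model a real system calibrates.
import math

def estimated_radius_mm(max_radius_mm, elapsed_s, total_s, tau_s=120.0):
    if elapsed_s >= total_s:
        return max_radius_mm                     # ablation cycle complete
    return max_radius_mm * (1.0 - math.exp(-elapsed_s / tau_s))

for t in (0, 60, 120, 300, 600):                 # seconds since ablation start
    print(t, round(estimated_radius_mm(27.0, t, 600), 1))
```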
Thereafter, at step S377, application 81 determines if the ablation process has been completed. For example, application 81 may determine based on the elapsed time since the ablation process was started reaching the time included in the configuration settings that the ablation process is complete. Application 81 may also receive input from the clinician that the ablation process is complete. If application 81 determines that the ablation process has not been completed, processing returns to step S375. Alternatively, if application 81 determines that the ablation process has been completed, processing proceeds to step S380.
At step S380, application 81 marks the location where treatment, whether biopsy or ablation, was performed. For example, application 81 may store the position information received from tracking system 70 regarding the location of tool 62, 64 in the patient's chest while treatment was performed. In embodiments where an ablation procedure was performed, application 81 may also store the last determined estimated ablation zone of the ablation procedure. Such stored treatment locations and estimated ablation zones may further be displayed on the 3D model.
Next, at step S381, application 81 determines whether additional treatments are required. For example, application 81 may determine based on the above-described treatment plan that additional treatments are required. Application 81 may also receive input from the clinician that additional treatments are required. Thus, if application 81 determines that additional treatments are required, processing returns to step S311. If navigation to a different target location is required, and tool 62, 64 does not include an EM sensor 94, tool 62, 64 may be removed from EWC 96 and replaced by LG 92. Alternatively, if application 81 determines that additional treatments are not required, processing ends.
With reference now to
Turning now to
Memory 502 may include any non-transitory computer-readable storage media for storing data and/or software that is executable by processor 504 and which controls the operation of computing device 80. In an embodiment, memory 502 may include one or more solid-state storage devices such as flash memory chips. Alternatively or in addition to the one or more solid-state storage devices, memory 502 may include one or more mass storage devices connected to the processor 504 through a mass storage controller (not shown) and a communications bus (not shown). Although the description of computer-readable media contained herein refers to solid-state storage, it should be appreciated by those skilled in the art that computer-readable storage media can be any available media that can be accessed by the processor 504. That is, computer-readable storage media include non-transitory, volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, computer-readable storage media include RAM, ROM, EPROM, EEPROM, flash memory or other solid-state memory technology, CD-ROM, DVD, Blu-Ray or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computing device 80.
Network interface 508 may be configured to connect to a network such as a local area network (LAN) consisting of a wired network and/or a wireless network, a wide area network (WAN), a wireless mobile network, a Bluetooth network, and/or the internet. Input device 510 may be any device by means of which a user may interact with computing device 80, such as, for example, a mouse, keyboard, foot pedal, touch screen, and/or voice interface. Output module 512 may include any connectivity port or bus, such as, for example, parallel ports, serial ports, universal serial busses (USB), or any other similar connectivity port known to those skilled in the art.
While several embodiments of the disclosure have been shown in the drawings, it is not intended that the disclosure be limited thereto, as it is intended that the disclosure be as broad in scope as the art will allow and that the specification be read likewise. Therefore, the above description should not be construed as limiting, but merely as exemplifications of particular embodiments. Those skilled in the art will envision other modifications within the scope and spirit of the claims appended hereto.
The present application claims the benefit of and priority to U.S. Provisional Patent Application Ser. No. 62/264,145 entitled VISUALIZATION, NAVIGATION, AND PLANNING WITH ELECTROMAGNETIC NAVIGATION BRONCHOSCOPY AND CONE BEAM COMPUTED TOMOGRAPHY INTEGRATED, filed on Dec. 7, 2015, by Haley et al., the entire contents of which is incorporated by reference herein.
U.S. Patent Documents

Number | Name | Date | Kind |
---|---|---|---|
4587975 | Salo et al. | May 1986 | A |
4593687 | Gray | Jun 1986 | A |
5419324 | Dillow | May 1995 | A |
5797849 | Vesely et al. | Aug 1998 | A |
5852646 | Klotz et al. | Dec 1998 | A |
5930329 | Navab | Jul 1999 | A |
5951475 | Gueziec et al. | Sep 1999 | A |
5963612 | Navab | Oct 1999 | A |
5963613 | Navab | Oct 1999 | A |
6028912 | Navab | Feb 2000 | A |
6038282 | Wiesent et al. | Mar 2000 | A |
6049582 | Navab | Apr 2000 | A |
6050724 | Schmitz et al. | Apr 2000 | A |
6055449 | Navab | Apr 2000 | A |
6081577 | Webber | Jun 2000 | A |
6120180 | Graumann | Sep 2000 | A |
6167296 | Shahidi | Dec 2000 | A |
6188355 | Gilboa | Feb 2001 | B1 |
6236704 | Navab et al. | May 2001 | B1 |
6314310 | Ben-Haim et al. | Nov 2001 | B1 |
6317621 | Graumann et al. | Nov 2001 | B1 |
6351513 | Bani-Hashemi et al. | Feb 2002 | B1 |
6351573 | Schneider | Feb 2002 | B1 |
6381483 | Hareyama et al. | Apr 2002 | B1 |
6389104 | Bani-Hashemi et al. | May 2002 | B1 |
6404843 | Vaillant | Jun 2002 | B1 |
6424731 | Launay et al. | Jul 2002 | B1 |
6470207 | Simon et al. | Oct 2002 | B1 |
6484049 | Seeley et al. | Nov 2002 | B1 |
6485422 | Mikus et al. | Nov 2002 | B1 |
6490475 | Seeley et al. | Dec 2002 | B1 |
6491430 | Seissler | Dec 2002 | B1 |
6520934 | Lee et al. | Feb 2003 | B1 |
6535756 | Simon et al. | Mar 2003 | B1 |
6539127 | Roche et al. | Mar 2003 | B1 |
6546068 | Shimura | Apr 2003 | B1 |
6546279 | Bova et al. | Apr 2003 | B1 |
6549607 | Webber | Apr 2003 | B1 |
6628977 | Graumann et al. | Sep 2003 | B2 |
6697664 | Kienzle, III et al. | Feb 2004 | B2 |
6707878 | Claus et al. | Mar 2004 | B2 |
6714810 | Grzeszczuk et al. | Mar 2004 | B2 |
6725080 | Melkent et al. | Apr 2004 | B2 |
6731283 | Navab | May 2004 | B1 |
6731970 | Schlossbauer et al. | May 2004 | B2 |
6768784 | Green et al. | Jul 2004 | B1 |
6782287 | Grzeszczuk et al. | Aug 2004 | B2 |
6785356 | Grass et al. | Aug 2004 | B2 |
6785571 | Glossop | Aug 2004 | B2 |
6801597 | Webber | Oct 2004 | B2 |
6823207 | Jensen et al. | Nov 2004 | B1 |
6856826 | Seeley et al. | Feb 2005 | B2 |
6856827 | Seeley et al. | Feb 2005 | B2 |
6865253 | Blumhofer et al. | Mar 2005 | B2 |
6898263 | Avinash et al. | May 2005 | B2 |
6944260 | Hsieh et al. | Sep 2005 | B2 |
6956927 | Sukeyasu et al. | Oct 2005 | B2 |
7010080 | Mitschke et al. | Mar 2006 | B2 |
7010152 | Bojer et al. | Mar 2006 | B2 |
7033325 | Sullivan | Apr 2006 | B1 |
7035371 | Boese et al. | Apr 2006 | B2 |
7106825 | Gregerson et al. | Sep 2006 | B2 |
7117027 | Zheng et al. | Oct 2006 | B2 |
7129946 | Ditt et al. | Oct 2006 | B2 |
7130676 | Barrick | Oct 2006 | B2 |
7165362 | Jobs et al. | Jan 2007 | B2 |
7251522 | Essenreiter et al. | Jul 2007 | B2 |
7321677 | Evron et al. | Jan 2008 | B2 |
7327872 | Valliant et al. | Feb 2008 | B2 |
7343195 | Strommer et al. | Mar 2008 | B2 |
7356367 | Liang et al. | Apr 2008 | B2 |
7369641 | Tsubaki et al. | May 2008 | B2 |
7440538 | Tsujii | Oct 2008 | B2 |
7467007 | Lothert | Dec 2008 | B2 |
7474913 | Durlak | Jan 2009 | B2 |
7499743 | Vass et al. | Mar 2009 | B2 |
7502503 | Bojer et al. | Mar 2009 | B2 |
7505549 | Ohishi et al. | Mar 2009 | B2 |
7505809 | Strommer et al. | Mar 2009 | B2 |
7508388 | Barfuss et al. | Mar 2009 | B2 |
7587074 | Zarkh et al. | Sep 2009 | B2 |
7603155 | Jensen | Oct 2009 | B2 |
7620223 | Xu et al. | Nov 2009 | B2 |
7639866 | Pomero et al. | Dec 2009 | B2 |
7664542 | Boese et al. | Feb 2010 | B2 |
7689019 | Boese et al. | Mar 2010 | B2 |
7689042 | Brunner et al. | Mar 2010 | B2 |
7693263 | Bouvier et al. | Apr 2010 | B2 |
7697972 | Verard et al. | Apr 2010 | B2 |
7711082 | Fujimoto et al. | May 2010 | B2 |
7711083 | Heigl et al. | May 2010 | B2 |
7711409 | Keppel et al. | May 2010 | B2 |
7720520 | Willis | May 2010 | B2 |
7725165 | Chen et al. | May 2010 | B2 |
7734329 | Boese et al. | Jun 2010 | B2 |
7742557 | Brunner et al. | Jun 2010 | B2 |
7742629 | Zarkh et al. | Jun 2010 | B2 |
7761135 | Pfister et al. | Jul 2010 | B2 |
7778685 | Evron et al. | Aug 2010 | B2 |
7787932 | Vilsmeier et al. | Aug 2010 | B2 |
7804991 | Abovitz et al. | Sep 2010 | B2 |
7831096 | Williamson, Jr. | Nov 2010 | B2 |
7835779 | Anderson et al. | Nov 2010 | B2 |
7853061 | Gorges et al. | Dec 2010 | B2 |
7877132 | Rongen et al. | Jan 2011 | B2 |
7899226 | Pescatore et al. | Mar 2011 | B2 |
7907989 | Borgert et al. | Mar 2011 | B2 |
7912180 | Zou et al. | Mar 2011 | B2 |
7912262 | Timmer et al. | Mar 2011 | B2 |
7941000 | Rongen et al. | May 2011 | B2 |
7949088 | Nishide et al. | May 2011 | B2 |
7991450 | Virtue et al. | Aug 2011 | B2 |
7995819 | Vaillant et al. | Aug 2011 | B2 |
8000436 | Seppi et al. | Aug 2011 | B2 |
8043003 | Vogt et al. | Oct 2011 | B2 |
8045780 | Boese et al. | Oct 2011 | B2 |
8050739 | Eck et al. | Nov 2011 | B2 |
8090168 | Washburn et al. | Jan 2012 | B2 |
8098914 | Liao et al. | Jan 2012 | B2 |
8111894 | Van De Haar | Feb 2012 | B2 |
8111895 | Spahn | Feb 2012 | B2 |
8126111 | Uhde et al. | Feb 2012 | B2 |
8126241 | Zarkh et al. | Feb 2012 | B2 |
8150131 | Harer et al. | Apr 2012 | B2 |
8180132 | Gorges et al. | May 2012 | B2 |
8195271 | Rahn | Jun 2012 | B2 |
8200316 | Keppel et al. | Jun 2012 | B2 |
8208708 | Homan et al. | Jun 2012 | B2 |
8218843 | Edlauer et al. | Jul 2012 | B2 |
8229061 | Hanke et al. | Jul 2012 | B2 |
8238625 | Strommer et al. | Aug 2012 | B2 |
8248413 | Gattani et al. | Aug 2012 | B2 |
8270691 | Xu et al. | Sep 2012 | B2 |
8271068 | Khamene et al. | Sep 2012 | B2 |
8275448 | Camus et al. | Sep 2012 | B2 |
8295577 | Zarkh et al. | Oct 2012 | B2 |
8306303 | Bruder et al. | Nov 2012 | B2 |
8311617 | Keppel et al. | Nov 2012 | B2 |
8320992 | Frenkel et al. | Nov 2012 | B2 |
8340379 | Razzaque et al. | Dec 2012 | B2 |
8345817 | Fuchs et al. | Jan 2013 | B2 |
8346344 | Pfister et al. | Jan 2013 | B2 |
8358874 | Haras | Jan 2013 | B2 |
8374416 | Gagesch et al. | Feb 2013 | B2 |
8374678 | Graumann | Feb 2013 | B2 |
8423117 | Pichon et al. | Apr 2013 | B2 |
8442618 | Strommer et al. | May 2013 | B2 |
8504588 | Hirschbeck et al. | Aug 2013 | B2 |
8515527 | Valliant et al. | Aug 2013 | B2 |
8526688 | Groszmann et al. | Sep 2013 | B2 |
8526700 | Isaacs | Sep 2013 | B2 |
8532258 | Bulitta et al. | Sep 2013 | B2 |
8532259 | Shedlock et al. | Sep 2013 | B2 |
8548567 | Maschke et al. | Oct 2013 | B2 |
8625865 | Zarkh et al. | Jan 2014 | B2 |
8625869 | Harder et al. | Jan 2014 | B2 |
8666137 | Nielsen et al. | Mar 2014 | B2 |
8670603 | Tolkowsky et al. | Mar 2014 | B2 |
8675996 | Liao et al. | Mar 2014 | B2 |
8693622 | Graumann et al. | Apr 2014 | B2 |
8693756 | Tolkowsky et al. | Apr 2014 | B2 |
8694075 | Groszmann et al. | Apr 2014 | B2 |
8706186 | Fichtinger et al. | Apr 2014 | B2 |
8712129 | Strommer et al. | Apr 2014 | B2 |
8718346 | Isaacs et al. | May 2014 | B2 |
8750582 | Boese et al. | Jun 2014 | B2 |
8755587 | Bender et al. | Jun 2014 | B2 |
8781064 | Fuchs et al. | Jul 2014 | B2 |
8792704 | Isaacs | Jul 2014 | B2 |
8798339 | Mielekamp et al. | Aug 2014 | B2 |
8831310 | Razzaque et al. | Sep 2014 | B2 |
8855748 | Keppel et al. | Oct 2014 | B2 |
9001121 | Finlayson et al. | Apr 2015 | B2 |
9001962 | Funk | Apr 2015 | B2 |
9008367 | Tolkowsky et al. | Apr 2015 | B2 |
9031188 | Belcher et al. | May 2015 | B2 |
9036777 | Ohishi et al. | May 2015 | B2 |
9042624 | Dennerlein | May 2015 | B2 |
9044190 | Rubner et al. | Jun 2015 | B2 |
9044254 | Ladtkow et al. | Jun 2015 | B2 |
9087404 | Hansis et al. | Jul 2015 | B2 |
9095252 | Popovic | Aug 2015 | B2 |
9104902 | Xu et al. | Aug 2015 | B2 |
9111175 | Strommer et al. | Aug 2015 | B2 |
9135706 | Zagorchev et al. | Sep 2015 | B2 |
9171365 | Mareachen et al. | Oct 2015 | B2 |
9179878 | Jeon | Nov 2015 | B2 |
9216065 | Cohen et al. | Dec 2015 | B2 |
9232924 | Liu et al. | Jan 2016 | B2 |
9247992 | Ladtkow et al. | Feb 2016 | B2 |
9247993 | Ladtkow et al. | Feb 2016 | B2 |
9259269 | Ladtkow et al. | Feb 2016 | B2 |
9262830 | Bakker et al. | Feb 2016 | B2 |
9265468 | Rai et al. | Feb 2016 | B2 |
9277893 | Tsukagoshi et al. | Mar 2016 | B2 |
9280837 | Grass et al. | Mar 2016 | B2 |
9282944 | Fallavollita et al. | Mar 2016 | B2 |
9370398 | Ladtkow et al. | Jun 2016 | B2 |
9401047 | Bogoni et al. | Jul 2016 | B2 |
9406134 | Klingenbeck-Regn | Aug 2016 | B2 |
9445772 | Callaghan | Sep 2016 | B2 |
9445776 | Han et al. | Sep 2016 | B2 |
9466135 | Koehler et al. | Oct 2016 | B2 |
20020143324 | Edwards | Oct 2002 | A1 |
20030220555 | Heigl et al. | Nov 2003 | A1 |
20040143317 | Stinson et al. | Jul 2004 | A1 |
20050008210 | Evron et al. | Jan 2005 | A1 |
20050027193 | Mitschke et al. | Feb 2005 | A1 |
20050215874 | Wang et al. | Sep 2005 | A1 |
20060002630 | Fu et al. | Jan 2006 | A1 |
20060023840 | Boese | Feb 2006 | A1 |
20070049861 | Gundel | Mar 2007 | A1 |
20070232898 | Huynh et al. | Oct 2007 | A1 |
20080033420 | Nields | Feb 2008 | A1 |
20080119725 | Lloyd | May 2008 | A1 |
20080178880 | Christopher | Jul 2008 | A1 |
20080243142 | Gildenberg | Oct 2008 | A1 |
20080262346 | Assis et al. | Oct 2008 | A1 |
20090080737 | Battle et al. | Mar 2009 | A1 |
20090216112 | Assis et al. | Aug 2009 | A1 |
20090281417 | Hartmann | Nov 2009 | A1 |
20090287443 | Jascob | Nov 2009 | A1 |
20110085720 | Barak et al. | Apr 2011 | A1 |
20110112398 | Zarkh et al. | May 2011 | A1 |
20120109260 | Stancer | May 2012 | A1 |
20120281903 | Trumer et al. | Nov 2012 | A1 |
20130116739 | Brada et al. | May 2013 | A1 |
20130259338 | Brehm | Oct 2013 | A1 |
20130259341 | Mountney et al. | Oct 2013 | A1 |
20130279780 | Grbic et al. | Oct 2013 | A1 |
20140046211 | Ladtkow et al. | Feb 2014 | A1 |
20140148808 | Inkpen | May 2014 | A1 |
20140270441 | Baker | Sep 2014 | A1 |
20140281961 | Baker | Sep 2014 | A1 |
20140282216 | Baker | Sep 2014 | A1 |
20140343416 | Panescu et al. | Nov 2014 | A1 |
20150042643 | Shibata et al. | Feb 2015 | A1 |
20150088120 | Garcia | Mar 2015 | A1 |
20150227679 | Kamer et al. | Aug 2015 | A1 |
20160000302 | Brown et al. | Jan 2016 | A1 |
20160000356 | Brown et al. | Jan 2016 | A1 |
20160005168 | Merlet | Jan 2016 | A1 |
20160005194 | Schretter et al. | Jan 2016 | A1 |
20160120521 | Weingarten et al. | May 2016 | A1 |
20160166329 | Langan | Jun 2016 | A1 |
20160206380 | Sparks et al. | Jul 2016 | A1 |
20160287343 | Eichler et al. | Oct 2016 | A1 |
20170340240 | Jacobsen | Nov 2017 | A1 |
Foreign Patent Documents

Number | Date | Country |
---|---|---|
2655001 | Aug 2010 | CA |
2923457 | Mar 2015 | CA |
103402453 | Nov 2013 | CN |
104582622 | Apr 2015 | CN |
10323008 | Dec 2004 | DE |
2007113703 | Oct 2007 | WO |
2012177470 | Dec 2012 | WO |
2015087206 | Jun 2015 | WO |
2015089013 | Jun 2015 | WO |
Other Publications

Leira et al., "A novel research platform for electromagnetic navigated bronchoscopy using cone beam CT imaging and an animal model," Minim Invasive Ther Allied Technol., Jan. 2011;20(1):30-41. doi: 10.3109/13645706.2010.518747. Epub Sep. 27, 2010.
Lugez et al., "Electromagnetic tracking in surgical and interventional environments: usability study," International Journal of Computer Assisted Radiology and Surgery, Mar. 2015, vol. 10, Issue 3, pp. 253-262. Epub Sep. 6, 2014.
European Examination Report for application No. 16 202 781.7, dated Mar. 9, 2018, 3 pages.
Yaniv et al., "Electromagnetic tracking in the clinical environment," Medical Physics, vol. 36, No. 3, Mar. 2009, pp. 876-892.
Extended European Search Report issued by the European Patent Office, corresponding to European Patent Application No. 16202781.7, dated Apr. 24, 2017 (8 pages).
Chinese Office Action issued in Appl. No. CN 201611117979.6 dated Oct. 9, 2018, together with English language translation (15 pages).
Chinese Office Action issued in corresponding Appl. No. CN 201611117979.6 dated May 21, 2019, together with English language translation (7 pages).
Prior Publication Data

Number | Date | Country |
---|---|---|
20170156685 A1 | Jun 2017 | US |
Related U.S. Application Data

Number | Date | Country |
---|---|---|
62264145 | Dec 2015 | US |