Cone beam and 3D fluoroscope lung navigation

Information

  • Patent Grant
  • Patent Number
    12,089,902
  • Date Filed
    Tuesday, June 23, 2020
  • Date Issued
    Tuesday, September 17, 2024
Abstract
A method and system for reducing divergence between computed tomography images and a patient using three-dimensional reconstructions. The method utilizes cone beam imaging or three-dimensional fluoroscopy to supplement or supplant pre-operative computed tomography imaging.
Description
FIELD

The disclosure relates to methods and systems for reducing divergence between computed tomography images and a patient through the use of cone beam computed tomography imaging.


BACKGROUND

Pulmonary disease may cause one or more portions of a patient's lungs to lose the ability to function normally and thus require treatment. Lung treatment procedures may be very complex and would be greatly aided if the surgeon performing the procedure could visualize the way airways and other structures in the patient's lungs are shaped and where tools are located. Traditional pre-operative images are helpful, to an extent, with the former, but provide no guidance with regard to the latter.


Systems for displaying images and tracking tools in the patient's lungs generally rely on pre-operative data, such as computed tomography (CT) scans performed before the treatment procedure begins, sometimes days or weeks in advance. However, such systems do not account for changes that may have occurred after the CT scan was performed, or for movement occurring during the treatment procedure. Systems, devices, and methods for improving the process of identifying and visualizing a patient's lungs, as well as structures and tools located therein, are described below.


SUMMARY

The disclosure is directed to systems and methods of registering an image to a luminal network including detecting a position of a sensor in a luminal network. The method of registering also includes receiving images for 3D reconstruction of the luminal network with the sensor within the luminal network; presenting the 3D reconstruction image on a user interface; receiving an indication of a location of a target in the 3D reconstruction image; generating a pathway through the luminal network to the target; and determining if the sensor moved from the detected position following receipt of the images for 3D reconstruction, where when it is determined that the position of the sensor is the same as the detected position, the luminal network and the 3D reconstruction are registered. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.


Implementations may include one or more of receiving survey data when it is determined that the position of the sensor has changed, registering the luminal network to the 3D reconstruction based on the survey data, or generating a 3D model of the luminal network. The method may further include displaying the pathway in the 3D reconstruction, 2D slice images derived from the 3D reconstruction, a 3D model derived from the 3D reconstruction, or a virtual bronchoscopy. Additionally, or alternatively, the method may further include displaying the position of the sensor along the pathway in a user interface.


Another aspect of the disclosure is a method of registering an image to a luminal network including receiving a pre-operative computed tomography (CT) image of the luminal network, receiving an indication of a target within the luminal network, generating a pathway through the luminal network to the target. The method of registering also includes receiving images for 3D reconstruction of the luminal network, transforming coordinates of the pre-operative CT image to coordinates of the 3D reconstruction to register the pre-operative CT image to the 3D reconstruction, and updating a position of a catheter in the 3D reconstruction image or a 3D model upon detection of movement of the catheter.


The method may further include displaying the 3D reconstruction, 2D slice images derived from the 3D reconstruction, a 3D model derived from the 3D reconstruction, or a virtual bronchoscopy on a user interface. In another aspect the method includes generating a 3D model from the 3D reconstruction image before transforming the pre-operative CT coordinates and the 3D reconstruction coordinates and may also include matching features from the CT images to the 3D reconstruction and the 3D model derived from the 3D reconstruction. Alternatively, the method includes generating a 3D model from the 3D reconstruction after transferring the target and pathway from the pre-operative CT image to the 3D reconstruction. The method may include receiving survey data, where the survey data is received prior to receipt of the 3D reconstruction or the survey data is received after transfer of the target and pathway to the 3D reconstruction from the pre-operative CT image to register the 3D reconstruction to the luminal network.


A further aspect of the disclosure is a method for registering an image to a luminal network including receiving a pre-operative computed tomography (CT) image of the luminal network, receiving an indication of a target within the luminal network, generating a pathway through the luminal network to the target, generating a CT 3D model, detecting a position of a catheter within the luminal network, registering the pre-operative CT image to the detected position of the catheter, receiving an indication of a location of a sensor in the pre-operative CT or CT 3D model and updating the location in a user interface until proximate the target, receiving images for 3D reconstruction of the luminal network, and detecting a position of the catheter and updating the position on a user interface.


The method may further include generating a 3D model from the 3D reconstruction. Still further the method may include recalling survey data from memory, presenting the 3D reconstruction on a user interface, receiving an indication of a location of a target in the 3D reconstruction, and generating a pathway in the 3D reconstruction or 3D model. Still further the method may include determining a relative position of the target and the catheter in the 3D reconstruction and updating the relative position of the target and the catheter in the pre-operative CT image and CT 3D model based on the determined relative position in the 3D reconstruction or 3D model. Additionally, the method may include registering the pre-operative CT image with the 3D reconstruction and transferring the target and pathway from the pre-operative CT image and 3D model to the 3D reconstruction and 3D model.





BRIEF DESCRIPTION OF THE DRAWINGS

Various aspects and features of the disclosure are described hereinbelow with references to the drawings, wherein:



FIG. 1 is a schematic diagram depicting an imaging and navigation system in accordance with the disclosure;



FIG. 1A is a schematic diagram depicting an end view of the imaging and navigation system of FIG. 1 in accordance with aspects of the disclosure;



FIG. 2 is a flow chart of an imaging and navigation procedure in accordance with aspects of the disclosure;



FIG. 3 is a flow chart of an imaging and navigation procedure in accordance with aspects of the disclosure;



FIG. 4A is a partial flow chart of an imaging and navigation procedure in accordance with aspects of the disclosure;



FIG. 4B is a partial flow chart of an imaging and navigation procedure in accordance with aspects of the disclosure;



FIG. 5 is a flow chart of an imaging and navigation procedure in accordance with aspects of the disclosure;



FIG. 6 is a flow chart of an imaging and navigation procedure in accordance with aspects of the disclosure;



FIG. 7 is a block diagram depicting features and components of a computing device in accordance with aspects of the disclosure;



FIG. 8 is a flow chart of an imaging and navigation procedure in accordance with aspects of the disclosure;



FIG. 9 is a flow chart of an imaging and navigation procedure in accordance with aspects of the disclosure;



FIG. 10 is a flow chart of an imaging and navigation procedure in accordance with aspects of the disclosure.





DETAILED DESCRIPTION

The disclosure is directed to a system and method for using a cone beam computed tomography (CBCT) image or a 3D fluoroscopy image in connection with intraluminal navigation techniques and systems.


There exist a number of systems that utilize the output from a pre-procedural computed tomography (CT) scan (e.g., CT image data) for purposes of identifying areas of interest or targets to which navigation of an endoscope or catheter is desired. Typically, this navigation will be of luminal networks such as the airways of the lungs or the biliary tract, but it could also be of spaces such as the thoracic cavity generally or other locations within a patient. These systems generally have two phases. A first phase is a planning phase where the targets are identified, and a three-dimensional (3D) model is generated. A second phase is a navigation phase where the location of the catheter within the patient is detected and depicted on the 3D model or other images to allow the clinician to navigate to the identified targets. By updating the position of a catheter within the 3D model, the clinician is able to perform procedures such as biopsy or treatment at the target location. One such system is the ILLUMISITE system sold by Medtronic PLC, which is an electromagnetic navigation (EMN) system.



FIG. 1 depicts a system 100 suitable for implementing methods described herein. As shown in FIG. 1, system 100 is used to perform one or more procedures on a patient supported on an operating table 40. In this regard, system 100 generally includes a bronchoscope 50, monitoring equipment 30, a tracking system 70, and a computing device 80.


Bronchoscope 50 is configured for insertion through the patient's mouth and/or nose into the patient's airways. Bronchoscope 50 includes a source of illumination and a video imaging system (not explicitly shown) and is coupled to monitoring equipment 30, for example, a video display, for displaying the video images received from the video imaging system of bronchoscope 50. In an embodiment, bronchoscope 50 may operate in conjunction with a catheter guide assembly 90. Catheter guide assembly 90 includes a locatable guide (LG) 92 and catheter 96. Catheter 96 may act as an extended working channel (EWC) and be configured for insertion through a working channel of bronchoscope 50 into the patient's airways (although the catheter guide assembly 90 may alternatively be used without bronchoscope 50). Catheter guide assembly 90 includes a handle 91 connected to catheter 96, which can be manipulated by rotation and compression to steer LG 92 and catheter 96. Catheter 96 is sized for placement into the working channel of bronchoscope 50. In the operation of catheter guide assembly 90, LG 92, including an EM sensor 94, is inserted into catheter 96 and locked into position such that EM sensor 94 extends a desired distance beyond a distal tip 93 of catheter 96. The location of EM sensor 94, and thus distal tip 93 of catheter 96, within an EM field generated by EM field generator 76, can be derived by tracking module 72 and computing device 80.


LG 92 and catheter 96 are selectively lockable relative to one another via a locking mechanism 99. A six degrees-of-freedom tracking system 70 is utilized for performing navigation, although other configurations are also contemplated. Tracking system 70 may be configured for use with catheter guide assembly 90 to track a position of EM sensor 94 as it moves in conjunction with catheter 96 through the airways of the patient, as detailed below. In an embodiment, tracking system 70 includes a tracking module 72, a plurality of reference sensors 74, and an EM field generator 76. As shown in FIG. 1, EM field generator 76 is positioned beneath the patient. EM field generator 76 and the plurality of reference sensors 74 are interconnected with tracking module 72, which derives the location of each reference sensor 74 in the six degrees of freedom. One or more of reference sensors 74 are attached to the chest of the patient. The six degrees of freedom coordinates of reference sensors 74 are sent as data to computing device 80, which includes an application 81, where the data from reference sensors 74 are used to calculate a patient coordinate frame of reference.


Although EM sensor 94 is described above as being included in LG 92, it is also envisioned that EM sensor 94 may be embedded or incorporated within a treatment tool, such as a biopsy tool 62 or a treatment tool 64 (e.g., an ablation catheter), where the treatment tool may alternatively be utilized for navigation without need of LG 92 or the tool exchanges that use of LG 92 requires. EM sensor 94 may also be embedded or incorporated within catheter 96, such as at a distal portion of catheter 96, thereby enabling tracking of the distal portion of catheter 96 without the need for LG 92.


According to an embodiment, biopsy and treatment tools 62, 64 are configured to be insertable into catheter guide assembly 90 following navigation to a target location and removal of LG 92. Biopsy tool 62 may be used to collect one or more tissue samples from the target location, and in an embodiment, is further configured for use in conjunction with tracking system 70 to facilitate navigation of biopsy tool 62 to the target location, and tracking of a location of biopsy tool 62 as it is manipulated relative to the target location to obtain the tissue sample. Treatment tool 64 is configured to be operated with a generator 66, such as a radio frequency generator or a microwave generator, and may include any of a variety of ablation tools and/or catheters. Though shown as a biopsy tool and a microwave ablation tool in FIG. 1, those of skill in the art will recognize that other tools including, for example, RF ablation tools, brachytherapy tools, and others may be similarly deployed and tracked without departing from the scope of the present disclosure. Additionally, a piercing tool and/or puncture tool may be used with and/or incorporated in LG 92 to create an exit point where LG 92, and thereby catheter 96, is navigated outside of the patient's airways and toward the target location, as further described below.


A radiographic imaging device 20, such as a C-arm imaging device capable of capturing images of at least a portion of the patient's lungs, is used in conjunction with system 100. Radiographic imaging device 20, such as a CBCT device or a 3D fluoroscopy device, captures images from which a 3D reconstruction can be generated. Generally, both CBCT images and 3D fluoroscopy images are captured by sweeping the radiographic imaging device 20 through a defined sweep angle (e.g., 30-180 degrees and any integer value within that range). By processing the individual images or video captured during the sweep, a 3D reconstruction can be generated which is similar to a traditional CT image. As will be understood, CBCT images have similar resolution to CT images whereas fluoroscopy images have a lower resolution.


As shown in FIG. 1, radiographic imaging device 20 is connected to computing device 80 such that application 81 may receive and process image data obtained by radiographic imaging device 20. However, radiographic imaging device 20 may also have a separate computing device located within itself, within the treatment room, or in a separate control room to first receive the image data obtained by radiographic imaging device 20 and relay such image data to computing device 80. In one example, the radiographic imaging device 20 is connected to a picture archiving and communications system (PACS) server which in turn is connected to the computing device 80 and application 81. To avoid exposing the clinician to unnecessary radiation from repeated radiographic scans, the clinician may exit the treatment room and wait in an adjacent room, such as the control room, while radiographic imaging device 20 performs the CBCT and/or fluoroscopic scans. FIG. 1A depicts an end view of the radiographic imaging device as it might be used to image a patient lying on table 40 in accordance with the disclosure.


Computing device 80 includes software and/or hardware, such as application 81, used to facilitate the various phases of an EMN procedure, including generating the 3D model, identifying a target location, planning a pathway to the target location, registering the 3D model with the patient's actual airways, navigating to the target location, and performing treatment at the target location. For example, computing device 80 utilizes data acquired from a CT scan, CBCT scan, magnetic resonance imaging (MRI) scan, positron emission tomography (PET) scan, and/or any other suitable imaging modality to generate and display the 3D model of the patient's airways, to enable identification of a target location on the 3D model (automatically, semi-automatically or manually) by analyzing the image data and/or 3D model, and allow for the determination and selection of a pathway through the patient's airways to the target location. While the image data may have gaps, omissions, and/or other imperfections included in the image data, the 3D model is a smooth representation of the patient's airways, with any such gaps, omissions, and/or imperfections in the image data filled in or corrected. The 3D model may be presented on a display monitor associated with computing device 80, or in any other suitable fashion.
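
By way of a non-limiting illustration of the model-generation step described above, the short Python sketch below extracts a rough binary airway mask from a CT or CBCT volume using a simple threshold and connected-component step. It assumes the image data is already available as a NumPy array of Hounsfield units with the air surrounding the patient masked out; the function name, threshold value, and approach are illustrative assumptions rather than the specific technique of the disclosure, which also fills gaps and smooths the resulting model.

```python
# Minimal sketch: extracting a rough air-filled mask from a CT/CBCT volume.
# Assumes the volume is a NumPy array of Hounsfield units and that the exterior
# air around the patient has already been cropped or masked out. Illustrative only.
import numpy as np
from scipy import ndimage


def rough_airway_mask(volume_hu: np.ndarray, air_threshold: float = -900.0) -> np.ndarray:
    """Return a binary mask of the largest connected air-filled region."""
    air = volume_hu < air_threshold                  # voxels that look like air
    labels, count = ndimage.label(air)               # connected components
    if count == 0:
        return np.zeros_like(air, dtype=bool)
    sizes = ndimage.sum(air, labels, range(1, count + 1))
    largest = int(np.argmax(sizes)) + 1              # keep the biggest component
    mask = labels == largest
    return ndimage.binary_fill_holes(mask)           # patch small holes in the mask
```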


Though described herein generally as generating a 3D model from either pre-operative CT images, CBCT images, or 3D fluoroscopy images, application 81 may not need to generate the 3D model or even a 3D reconstruction. Instead, that functionality may reside in the computing device associated with the radiographic imaging device 20 or the PACS server. In such scenarios, the application 81 need merely import the 3D reconstruction or 3D model generated from a CT image, CBCT image, or fluoroscopy images by the radiographic imaging device 20 or the PACS server.


Using computing device 80, various views of the image data and/or 3D model may be displayed to and manipulated by a clinician to facilitate identification of the target location. As noted above, the target location may be a site within the patient's lungs where treatment is to be performed. For example, the treatment target may be located in lung tissue adjacent to an airway. The 3D model may include, among other things, a model airway tree corresponding to the actual airways of the patient's lungs, and show the various passages, branches, and bifurcations of the patient's actual airway tree. Additionally, the 3D model may include lesions, markers, blood vessels and vascular structures, lymphatic vessels and structures, organs, other physiological structures, and/or a 3D rendering of the pleural surfaces and fissures of the patient's lungs. Some or all of the aforementioned elements may be selectively displayed, such that the clinician may choose which elements should be displayed when viewing the 3D model.


After identifying the target location, application 81 may determine a pathway between the patient's trachea and the target location via the patient's airways. In instances where the target location is located in lung tissue that is not directly adjacent an airway, at least a portion of the pathway will be located outside of the patient's airways to connect an exit point on an airway wall to the target location. In such instances, LG 92 and catheter 96 will first be navigated along a first portion of the pathway through the patient's airways to the exit point on the airway wall. LG 92 may then be removed from catheter 96 and an access tool, such as a piercing or puncture tool, inserted into catheter 96 to create an opening in the airway wall at the exit point. Catheter 96 may then be advanced through the airway wall into the parenchyma surrounding the airways. The access tool may then be removed from catheter 96 and LG 92 and/or tools 62, 64 reinserted into catheter 96 to navigate catheter 96 along a second portion of the pathway outside of the airways to the target location.


During a procedure, EM sensor 94, in conjunction with tracking system 70, enables tracking of EM sensor 94 (and thus distal tip 93 of catheter 96 or tools 62, 64) as EM sensor 94 is advanced through the patient's airways following the pathway planned during the planning phase. Though generally described herein in connection with EM sensors 94, the disclosure is not so limited. Rather, the position of the bronchoscope 50, catheter 96, or tools 62, 64 can be determined through the use of flex sensors (e.g., fiber Bragg grating sensors) which are used to match the shape of the catheter 96 with the shape of the airways in the 3D model. By sensing the shape of the sensors and matching that shape to the airways, the position of the sensor or a distal portion of the bronchoscope 50, catheter 96, or tools 62, 64 can be accurately determined and displayed on the 3D model.


As an initial step of the procedure, when using a 3D model generated from a CT scan, the 3D model must be registered with the patient's actual airways to enable application 81 to display an indication of the location of EM sensor 94 on the 3D model corresponding to the location of EM sensor 94 within the patient's airways. The registration is necessary because the CT scan may have been taken days, and even weeks or months, prior to the actual procedure. Even if the CT scan were taken the same day, such CT scans are not undertaken within a surgical suite, so registration is still necessary.


One potential method of registration involves performing a survey of the patient's lungs by navigating LG 92 into each lobe of the patient's lungs to at least the second bifurcation of the airways of that lobe. The position of LG 92 is tracked during this registration phase, and the 3D model is iteratively updated based on the tracked position of the sensor 94 within the actual airways of the patient's lungs. While the registration process focuses on aligning the patient's actual airways with the airways of the 3D model, registration also ensures that the position of vascular structures, pleural surfaces, and fissures of the lungs are accurately determined.


Registration, however, does not achieve a perfect match of the position of the patient's lungs and the 3D model. There are a number of reasons for this mismatch, typically called CT-to-body divergence. As an initial matter, traditional CT images are taken at full breath hold. That is, the patient is asked to expand their lungs to a maximum and hold that position while undergoing the imaging. This has the benefit of inflating the airways and increasing their visibility in the CT images, making it easier to generate a highly detailed 3D model. However, when performing the procedure, the patient is not at a full breath hold; rather, the patient is typically sedated and experiencing tidal volume breathing. This results in a difference in shape and position of the airways in the lungs of the patient during the procedure as compared to during the CT imaging. As a result, even when the airways have been registered to the 3D model (e.g., using the airway sweep or another method) there will be differences between the relative positions of the airways or targets identified in the lungs in the model and the actual relative positions of the patient's airways and the target.


One method of addressing the CT-to-body divergence is to utilize a CBCT image data set from radiographic imaging device 20, rather than a traditional CT scan, as the starting point for the procedure. In this process, the CBCT image data is used to generate and display the 3D model of the patient's airways, to enable identification of a target location on the 3D model (automatically, semi-automatically or manually) by analyzing the image data and/or 3D model, and to allow for the determination and selection of a pathway through the patient's airways to the target location. Though the following techniques are described in connection with CBCT images, those of skill in the art will appreciate that they are equally applicable to any imaging technique capable of generating a 3D reconstruction, such as 3D fluoroscopy, as noted above.



FIG. 2 presents a method 200 for employing CBCT in conjunction with system 100 of FIG. 1 such that the planning phase occurs in conjunction with the navigation and treatment of the patient. As will be appreciated, the patient is situated on the table 40, and reference sensors 74 are placed on the patient's chest and connected to tracking system 70. A bronchoscope 50 and/or catheter 96 is inserted into the patient's airways and images may be displayed on the monitoring equipment 30. A position of the sensor 94 (e.g., one associated with the bronchoscope 50, catheter 96, or another tool) can be detected, and an indication that a sensor position has been received by tracking system 70 can be presented on a user interface on computing device 80 at step 202.


Radiographic imaging device 20 may then be engaged and the computing device 80 receives the CBCT image data at step 204. The computing device 80 includes one or more applications for processing the CBCT data and presenting it on one or more user interfaces for manipulation and assessment. At step 206, the application analyzes the CBCT data and generates a 3D model of the airways. This 3D model can be manipulated by the user via a user interface to ensure that it has sufficient resolution and sufficiently captures the airways of the patient (e.g., to a particular bifurcation point). A rejection of the 3D model may be received by the application, at which point a further CBCT image may be acquired and the process restarted at step 204. The rejection may be based, for example, on the clinician not being satisfied with the 3D model (e.g., insufficient bifurcation generation, missing a lobe or a significant portion thereof). Alternatively, the 3D model may simply appear incorrect based on the clinician's experience with the physiology of patients. These types of deficiencies may be the result of improper or insufficient CBCT imaging or an improper setting on the radiographic imaging device 20.


Acceptance of the 3D model is received by the application at step 208 and the user interface presents CBCT images or virtual CBCT images of the patient's lungs from the CBCT at step 210. These CBCT images are slice images either taken at or generated for different points of the patient's lungs in cross section. The user interface allows the user to scroll through these images, which show one or more of the axial, coronal or sagittal planes (though others are also possible) of the lungs, and allows the user to identify a target within the images. The application receives the indication of a target at step 212 and generates a pathway to reach the target through the airways at step 214. The target indication may be a manual marking by a clinician providing the indication through a user interface on computing device 80. Alternatively, the application 81 may perform an image analysis and automatically detect the target and provide the indication of its location. The user interface then displays the pathway through the airways in one or more CBCT images, virtual CBCT images, or a virtual bronchoscopy view of the 3D model at step 216. The user interface may additionally or alternatively display the pathway on one or more of CBCT images or virtual CBCT images.
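
The pathway generation at step 214 can be treated as a shortest-path search over a graph representation of the airway tree. The sketch below is a minimal Dijkstra search over a hypothetical adjacency structure; the graph format and branch labels are illustrative assumptions, not the application's internal representation.

```python
# Minimal sketch of pathway generation as a shortest-path search over an airway graph.
# The graph structure and node names are hypothetical stand-ins for whatever
# airway-tree representation the application maintains.
import heapq


def plan_pathway(airway_graph, start, target):
    """Dijkstra search; airway_graph maps node -> list of (neighbor, length_mm)."""
    best = {start: 0.0}
    previous = {}
    heap = [(0.0, start)]
    while heap:
        cost, node = heapq.heappop(heap)
        if node == target:
            break
        if cost > best.get(node, float("inf")):
            continue
        for neighbor, length in airway_graph.get(node, []):
            new_cost = cost + length
            if new_cost < best.get(neighbor, float("inf")):
                best[neighbor] = new_cost
                previous[neighbor] = node
                heapq.heappush(heap, (new_cost, neighbor))
    if target not in previous and target != start:
        return None                                  # no airway route to the target
    path, node = [target], target
    while node != start:
        node = previous[node]
        path.append(node)
    return list(reversed(path))


# Example with made-up branch labels:
# plan_pathway({"trachea": [("LMB", 40.0)], "LMB": [("LB6", 25.0)], "LB6": []},
#              "trachea", "LB6")  ->  ["trachea", "LMB", "LB6"]
```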


The CBCT images will also show the presence of the bronchoscope 50 or catheter 96 that had been previously inserted into the patient. At step 217 the application can determine whether the sensor 94 has moved since the acquisition of the CBCT images. If the sensor 94 has not moved since the taking of the CBCT images in step 204, then the position of the EM sensor detected in step 202 corresponds to the position of the distal end of the bronchoscope 50 or catheter 96 in the images. As such, the EM coordinate system and the CBCT image coordinate system are registered to one another and no further steps need be taken to register the patient's lungs to the 3D model generated from the CBCT images, and further navigation can be undertaken following the planned pathway through the 3D model with confidence.
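
A minimal sketch of the step 217 check follows, assuming the pre-sweep and current sensor positions are available as 3D coordinates in millimetres; the 2 mm tolerance is an illustrative assumption, not a value specified by the disclosure.

```python
# Sketch of the movement check at step 217: if the EM sensor position recorded before
# the CBCT sweep matches the current position to within a tolerance, the EM and CBCT
# coordinate frames are treated as registered. Names and tolerance are illustrative.
import numpy as np


def sensor_is_stationary(pre_sweep_pos, current_pos, tolerance_mm: float = 2.0) -> bool:
    """Return True when the sensor moved less than `tolerance_mm` during imaging."""
    displacement = np.linalg.norm(np.asarray(current_pos, float) - np.asarray(pre_sweep_pos, float))
    return displacement <= tolerance_mm

# If the check fails, the method falls back to the airway survey (step 218).
```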


If the determination at step 217 is that the sensor 94 has moved, or has moved more than some threshold, then an indicator can be presented on the user interface suggesting that the user perform a survey, as described above, and the application 81 receives the survey data at step 218. The survey involves the insertion of the EM sensor 94 into the lobes of the lungs and receipt by the tracking system 70 of the position of the EM sensor as it moves through the airways. As many hundreds or thousands of these positions (EMN coordinates) are collected, a point cloud of positions is created. The point cloud, all points of which are assumed to be taken from within the luminal network, has a three-dimensional shape that can then be matched to the 3D shape of the airways to register the 3D model and the airways of the patient to one another. Once registered, the detected position of the EM sensor can be used to follow a pathway in the 3D model to the identified target. The detected position of the EM sensor relative to the pathway and the target is continually updated on the user interface at step 220 until it is determined at step 222 that the target has been reached, and a procedure is undertaken at step 224. The procedure may be a biopsy or a treatment of the target such as ablation (e.g., RF, microwave, cryo, thermal, chemical, immunotherapy, or combinations of these).
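
One common way to realize the point-cloud-to-airway matching described above is an iterative-closest-point style fit; the disclosure does not prescribe a specific algorithm, so the sketch below is an assumption for illustration. It expects the survey positions and points sampled from the 3D model's airways as Nx3 arrays in a shared unit of measure.

```python
# Illustrative ICP-style fit of the EM survey point cloud to points sampled from the
# airways of the 3D model. Not necessarily the method used by the disclosed system.
import numpy as np
from scipy.spatial import cKDTree


def best_fit_rigid(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid transform (rotation R, translation t) mapping src onto dst."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                       # avoid reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t


def register_survey_to_model(survey_pts, model_pts, iterations: int = 30):
    """Iteratively match survey points to their nearest model points and refit."""
    model_pts = np.asarray(model_pts, dtype=float)
    pts = np.asarray(survey_pts, dtype=float)
    tree = cKDTree(model_pts)
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iterations):
        _, idx = tree.query(pts)                   # nearest airway point for each sample
        R, t = best_fit_rigid(pts, model_pts[idx])
        pts = pts @ R.T + t                        # apply the incremental transform
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total                        # maps EM coordinates into model space
```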


Whether the patient and the CBCT images are registered because the sensor 94 did not move following the imaging (step 217), or by use of the survey (step 218), this registration using a CBCT image should essentially eliminate any CT-to-body divergence issue because the CBCT images were acquired with the patient in exactly the same position as when the navigation procedure commences. Moreover, the CBCT images are taken while the patient is undergoing tidal breathing as opposed to full breath hold, thus avoiding the differences between the patient's lungs and the 3D model that arise when traditional CT images, taken while the patient is at full breath hold, are used.


Though not described in detail here, the positioning and navigation of the EM sensor 94 (e.g., on bronchoscope 50, catheter 96, or other tools) may be done manually as described above in connection with catheter guide assembly 90 or may be achieved using a robotically driven catheter guide assembly.


A further method 300 that may be used with system 100 is described in connection with FIG. 3. In the method 300 a CT image and/or a CT 3D model is received and stored in a memory associated with computing device 80 at step 302. This is a standard pre-operative CT image taken with traditional CT imaging systems while the patient is at full breath hold, as described above. This pre-operative CT image is processed by the application 81 and a pathway is generated to targets within the luminal networks which have been imaged (e.g., the airways of the lungs) at step 304. Steps 302 and 304 achieve the planning phase.


At optional step 306, which may be at any time following completion of the planning phase, the patient is situated on the table 40 and the data from a survey (e.g., insertion of an EM sensor 94 into the airways) is received by the tracking system 70 and processed by application 81 in computing device 80. At step 308 a CBCT image is acquired by application 81 of the desired portion of the patient using radiographic imaging device 20. This CBCT image may include the bronchoscope 50 or another device including EM sensor 94. Optionally, at step 310 a CBCT 3D model may be generated from the CBCT image. Alternatively, the acquired CBCT image received at step 308 may include a 3D model that was generated by software resident on the radiographic imaging device 20, or on the PACS server, and supplied to the computing device 80 and application 81 with the CBCT image.


Both the pre-operative CT image that was used for the planning phase and the CBCT image acquired in step 308 are in Digital Imaging and Communications in Medicine (DICOM) format. The DICOM format includes reference to the coordinate system with which the image was acquired. As a result, the application 81, at step 312, transforms the coordinate system of the pre-operative CT image to the coordinate system of the CBCT image taken by the radiographic imaging device 20. Step 312 effectively registers the pre-operative CT image with the CBCT image.
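
A minimal sketch of the coordinate transform at step 312 follows, assuming pydicom-style datasets that expose the standard ImagePositionPatient, ImageOrientationPatient, and PixelSpacing attributes and that both series report the same patient-based frame of reference; any residual misalignment would still be resolved by the registration steps described herein.

```python
# Sketch of step 312: build voxel-to-patient affines from DICOM geometry tags and
# compose them so pre-operative CT voxel indices can be expressed in the CBCT frame.
import numpy as np


def voxel_to_patient_affine(ds, slice_spacing_mm: float) -> np.ndarray:
    """4x4 affine mapping (column, row, slice, 1) indices to patient coordinates (mm)."""
    origin = np.array(ds.ImagePositionPatient, dtype=float)
    row_dir = np.array(ds.ImageOrientationPatient[:3], dtype=float)   # along increasing column index
    col_dir = np.array(ds.ImageOrientationPatient[3:], dtype=float)   # along increasing row index
    slice_dir = np.cross(row_dir, col_dir)
    row_spacing, col_spacing = map(float, ds.PixelSpacing)            # mm between rows / columns
    affine = np.eye(4)
    affine[:3, 0] = row_dir * col_spacing
    affine[:3, 1] = col_dir * row_spacing
    affine[:3, 2] = slice_dir * slice_spacing_mm
    affine[:3, 3] = origin
    return affine


def ct_index_to_cbct_index(ct_affine: np.ndarray, cbct_affine: np.ndarray) -> np.ndarray:
    """Composite transform taking pre-operative CT voxel indices into CBCT voxel indices."""
    return np.linalg.inv(cbct_affine) @ ct_affine
```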


Alternatively, at step 311 the application 81 aligns the CBCT 3D model generated at step 310 with a 3D model generated from the pre-operative CT image received at step 302. The alignment of the two 3D models registers the pre-operative CT image with the CBCT image. The application may present either or both of the pre-operative CT 3D model and the CBCT 3D model on a user interface and request confirmation of alignment by a user or allow for interaction by the user to finalize the orientation of the two 3D models relative to each other to finalize the registration of the two 3D models. Alternatively, this may be automatically performed by application 81.


A further alternative with respect to registration is to make an assumption as to alignment of the patient in the pre-operative CT image and the CBCT image. This process relies on the fact that during imaging with the radiographic imaging device 20 the patient is always lying flat on the table 40, chest facing away from the table 40, and aligned along the length of the table 40, and that the patient will be in essentially this position during the acquisition of the pre-operative CT. In this registration process, the application 81 may request via the user interface that the clinician identify a common point in both the pre-operative CT and the CBCT image. This point could be the target, as described above with respect to method 200, or it could be a point such as a main carina of the lungs or a rib or some other feature which appears in both image data sets. Alternatively, the application 81 may utilize various image processing techniques to identify these common features in the two image data sets and to register them to one another. Once identified, either manually or automatically, because of the assumption that the patient is aligned on the table 40 in essentially the same position in both images, the two image data sets (e.g., pre-operative CT and CBCT images) are registered to one another. As will be appreciated, the identification of 2, 3, 4, 5, or 10 points, either automatically or by a clinician using the user interface, will refine the registration even more, where desired. In some aspects this may be achieved using mutual information techniques of image brightness matching. This may be assisted by various deep learning methodologies where empirical algorithms are developed by processing hundreds, thousands, or more images and performing the registration.


At step 314, once the two CT images or 3D models are registered to one another, all the planning data that was generated using the pre-operative CT image can be transferred to the CBCT image acquired at step 308 or to the 3D model acquired at step 310. With features such as the target and a pathway to the target, among others, transferred from the pre-operative CT image to the CBCT image, if a 3D model of the CBCT image was not generated at step 310, it can now be generated at step 316 and will include the target and pathway that have been transferred from the pre-operative CT image to the CBCT image. Alternatively, where the CBCT 3D model was generated at step 310, but the pre-operative CT 3D model and the CBCT 3D model were not registered to one another at step 311, the transferred features can be matched to the CBCT 3D model at optional step 318. Regardless of when the transfer of the features occurs, the application 81 can cause a user interface at step 320 to display the CBCT 3D model and CBCT images together with the features from the planning phase that were identified in the pre-operative CT image.


In instances where a survey was not undertaken at step 306, a survey can be undertaken at step 322. This survey registers the CBCT image and the CBCT 3D model to the patient's lungs by navigating the sensor 94, which is embodied on the bronchoscope 50, catheter 96, or another tool, into the airways of the patient, generating the point cloud discussed above. As will be appreciated, other methods of registration may also be employed without departing from the scope of the present disclosure. If the survey were conducted in step 306, above, the application may proceed during the acquisition of the CBCT image at step 308 to conduct the EM sensor 94 movement analysis described above in step 217 to register the patient's airways to the CBCT image and 3D model generated therefrom. Once registered, the detected position of the EM sensor can be used to follow a pathway in the CBCT 3D model to the target originally identified in the pre-operative CT image. The detected position of the EM sensor 94 relative to the pathway and the target is continually updated on the user interface at step 324 until the application 81 determines at step 326 that the target has been reached, and a procedure may be undertaken upon arrival at step 328. As an alternative to the use of an EM sensor 94 and detection of its position, the radiographic imaging device 20 may be capable of generating fluoroscopic images. The position of the catheter 96 may be detected in one or more fluoroscopic images that are acquired by the radiographic imaging device 20. This detection may be manual by the clinician using a user interface on computing device 80 or may be performed by the application 81 via image processing techniques. Because the coordinate system is the same between the CBCT images and the fluoroscopic images acquired by the same device, the detected position of the catheter 96 in the fluoroscopic images can be transferred to the CBCT images or CBCT 3D model. The fluoroscopic images may be acquired periodically as the catheter 96 is navigated towards the target. The procedure may be a biopsy or a treatment of the target such as ablation (e.g., RF, microwave, cryo, thermal, chemical, immunotherapy, or combinations of these).


As with the method of FIG. 2, the method of FIG. 3 eliminates the CT-to-body divergence because the CBCT image and model are generated with the patient in the same position they are in for the navigation procedure. Further, the target and pathways are shown in the CBCT 3D model and CBCT images. Further, any discrepancies in registration are minimized either by the DICOM registration process, the acquisition of the CBCT image with the sensor 94 in the image, and/or receiving survey data and matching it to the airways of the CBCT images and CBCT 3D model.


A method 400 is described with reference to FIGS. 4A and 4B. In accordance with method 400, a pre-operative CT image is acquired at step 402 and saved in a memory of computing device 80. At step 404 the application 81 processes the CT image, generates a 3D model, presents on a user interface CT images on which to receive an indication of a target, and generates a pathway through the airways of a patient (or another luminal network) to reach the target. These steps complete the planning phase using a pre-operative CT image.


After the planning phase is complete, the patient may be placed on the table 40 and a bronchoscope 50 or catheter 96 inserted such that a sensor 94 can be detected by the tracking system 70 and that data provided to the application 81 at step 406. Next a survey can be conducted and a point cloud of positions of the sensor 94 received by the tracking system 70 as the survey is conducted at step 408. With the point cloud, the patient and the pre-operative CT image, as well as the 3D model generated therefrom, are registered to one another at step 410. As noted above, the sensor 94 may be an EM sensor, a flex sensor, or another sensor useable to determine a position of the catheter 96 or bronchoscope in a patient and depict that position in the pre-operative CT image, thus registering the patient and the pre-operative CT image and 3D model.


With the patient and the pre-operative CT image registered, navigation can commence with the tracking system 70 receiving indications of new locations of the sensor 94 as it is moved through the airways of the patient and the detected positions being updated on a user interface at step 412 as the pathway is followed to an identified target.


Once the sensor 94, and more particularly the bronchoscope 50, catheter 96, or other tool including the sensor 94, is proximate the target, a CBCT image can be generated with radiographic imaging device 20 at step 414. At this point at least two different options are available. In accordance with one option, at step 416, a CBCT 3D model is generated from the CBCT image. Next, at step 418, the point cloud that was generated by the survey at step 408 may be recalled from a memory in the computing device 80 in which it is stored and fit to the CBCT image and the CBCT 3D model. Alternatively, the method may skip forward to step 420, where the CBCT model and the CBCT images are registered by any of the methods described herein and can be presented on a user interface. Because the CBCT image and 3D model are registered with the patient based on the survey from step 408, the pathway and targets identified at step 404 can be transferred from the pre-operative CT image and 3D model to the CBCT image and CBCT 3D model at step 422 in FIG. 4B. Alternatively, in FIG. 4B the CBCT image and CBCT 3D model may be presented on the user interface at step 424 such that the target can be identified in the CBCT images. Once identification of the target is received by the application, the application generates a pathway from the location of the sensor 94, as depicted in the CBCT images and CBCT model, to the target. Again, because the CBCT image is generated about the patient while they are in position on the table 40 on which the navigation is being undertaken, there is no CT-to-body divergence. The result is that the “last mile” of navigation to the target (e.g., the final 3 mm to a target) can be undertaken with heightened confidence that the target will be properly reached for biopsy or treatment. Subsequent movement of the sensor 94 is detected at step 426 and the position of the sensor 94 in the CT 3D model can be updated and a procedure can be undertaken at step 428.


As noted above, after step 414 an alternative method can be followed. At step 430, the CBCT image, which includes the bronchoscope 50 or catheter 96 (or other tool) with sensor 94 therein, and/or the CBCT 3D model can be analyzed to determine the relative position of the target and a distal end of the bronchoscope 50 or catheter 96. This relative position determination can be automatically derived by the application 81. Alternatively, the relative position can be determined by receipt of an indication of the location of the bronchoscope 50 or catheter 96 via the user interface, where one or more of the target and the distal end of the bronchoscope 50 or catheter 96 are shown in 2D images or the 3D model. The position of the target can be assumed to be the same in both the pre-operative CT image and the CBCT image. The relative position data can then be used by the application 81 at step 432 to update the detected position of the sensor 94 in the pre-operative CT image and the 3D model derived from the pre-operative CT. This update of position will account for the CT-to-body divergence that results from the use of the pre-operative CT image and the 3D model for navigation. Again, the last mile movement of the sensor 94 to the target can be detected at step 426 and a procedure can be performed at step 428.
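
A minimal sketch of the correction at steps 430-432 follows, assuming the target and catheter-tip positions are expressed in millimetres in frames that share orientation and scale (for example, after the DICOM-based registration above); variable names are illustrative.

```python
# Sketch of steps 430-432: the target-to-catheter offset measured in the CBCT image is
# applied in the pre-operative CT frame, relying on the stated assumption that the
# target occupies the same anatomical position in both images.
import numpy as np


def corrected_sensor_in_ct(target_cbct, tip_cbct, target_ct):
    """Re-place the catheter tip in the pre-operative CT/3D model using the CBCT offset."""
    offset = np.asarray(tip_cbct, float) - np.asarray(target_cbct, float)
    return np.asarray(target_ct, float) + offset   # tip position expressed in CT coordinates
```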


As will be appreciated, the system 100, and particularly application 81 being run on computing device 80, can be configured to control operation of the radiographic imaging device 20. This control may be via user input to a user interface. Alternatively, the application can be configured, following registration of the pre-operative CT or an initial CBCT image to a patient (if required) and the identification of a target, to adjust the imaging field of the CBCT to focus on the target. The application 81 may, using the location of the target in the patient, focus all future CBCT imaging on the target. This may be done without any intervention by the user. Similarly, the application 81 may initiate CBCT imaging at points during any of the methods described with respect to methods 200-400, without interaction from a user. For example, in connection with a method 500 depicted in FIG. 5, as a bronchoscope 50, catheter 96, or any other tool including sensor 94 is navigated to and detected within a pre-determined distance from a target at step 502, the application 81 signals the radiographic imaging device 20 to initiate a CBCT image acquisition process at step 504. Alerts may be provided to the clinicians and surgical staff allowing them to move away from the patient and limit their exposure to the radiation emitted by the radiographic imaging device 20. These alerts may be audible or visual via the user interface.
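
The proximity trigger of steps 502-504 might be sketched as follows; `start_cbct_sweep` and `alert_staff` are hypothetical callbacks standing in for the device control and alerting described above, and the 20 mm trigger distance is an illustrative assumption.

```python
# Sketch of the automatic trigger in method 500: when the tracked sensor comes within a
# pre-set distance of the target, the application alerts staff and signals the imaging
# device. Callback names and the distance threshold are illustrative.
import numpy as np


def maybe_trigger_cbct(sensor_pos, target_pos, start_cbct_sweep, alert_staff,
                       trigger_distance_mm: float = 20.0) -> bool:
    """Fire one CBCT acquisition when the sensor is within `trigger_distance_mm` of the target."""
    distance = np.linalg.norm(np.asarray(sensor_pos, float) - np.asarray(target_pos, float))
    if distance <= trigger_distance_mm:
        alert_staff("CBCT acquisition starting; clear the imaging field")  # step 504 alert
        start_cbct_sweep()                                                 # step 504 acquisition
        return True
    return False
```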


The CBCT image is acquired via radiographic imaging device 20 and received by application 81 at step 506. This CBCT image may be used as described particularly with respect to the CBCT imaging described in the method 400. Alternatively, the use of CBCT imaging may be reviewed and considered completely separate from methods 200-400 and simply as another visual tool employed by the clinician to confirm placement, locations, and other clinical observations.


A further aspect of the disclosure is directed to breathing detection to assist in CBCT imaging. The method 600 is described with respect to FIG. 6. Employing the output from reference sensors 74, tracking system 70 and therewith application 81 can detect the phase of the patient's breathing at step 602. This phase can be monitored throughout a procedure at step 604. Whenever a request for a CBCT image is received (either directly or via an automatic process) the application 81 can determine the phase of the patient's breathing at step 606. There are a variety of options for the application at this point depending on which of methods 200-400 the system 100 is engaged in.


If the CBCT image is the first CT image acquired for a procedure, the application 81 can at step 608 direct the radiographic imaging device to acquire images only when the reference sensors are within a tolerance of a desired portion of the breathing phase (e.g., nearing the end of the exhale phase, or nearing the end of the inhale phase). For example, nearing the end of the inhale phase may allow the airways to be in an expanded state, resulting in potentially cleaner images and a more accurate 3D model due to the greater contrast of the expanded airways. Alternatively, when the breathing phase is approaching the end of the exhale phase, there may be a longer duration of the breathing cycle where there is substantially no movement of the lungs, thus allowing for more images to be captured and enhancing the stability of the images as they are acquired in the CBCT image.


At step 610 the application 81 signals the radiographic imaging device 20 to begin imaging. When application 81 determines at step 612 that the desired portion of the breathing phase is about to end, the application signals the radiographic imaging device 20 to stop imaging the patient at step 614. At step 616 the application can determine whether the CBCT imaging is complete. If not, the method continues to step 618 where the application 81 determines that the desired breathing phase is about to be entered, by monitoring the position of the reference sensors 74, and the method returns to step 610 where the radiographic imaging device 20 again acquires images during the desired portion of the breathing phase. If the CBCT imaging is complete at step 616, then the application 81 stops the radiographic imaging device 20 at step 619 and proceeds to the next steps in methods 200-400.
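
A minimal sketch of the gating loop (steps 610-618) follows. The `device` and `breathing` objects are hypothetical interfaces standing in for the radiographic imaging device 20 and the reference-sensor monitor, and the phase window is an illustrative assumption.

```python
# Sketch of respiratory-gated acquisition: imaging runs only while the breathing phase
# reported by the reference sensors sits inside the chosen window, until the device
# reports a complete acquisition. Interfaces and the window are illustrative.
import time


def gated_cbct_acquisition(device, breathing, phase_window=(0.8, 1.0), poll_s: float = 0.05):
    """Acquire CBCT data only inside `phase_window` of the normalized breathing cycle."""
    imaging = False
    while not device.acquisition_complete():
        phase = breathing.current_phase()        # 0.0-1.0, e.g. 1.0 ~ end of inhale
        in_window = phase_window[0] <= phase <= phase_window[1]
        if in_window and not imaging:
            device.start_imaging()               # step 610
            imaging = True
        elif not in_window and imaging:
            device.stop_imaging()                # step 614
            imaging = False
        time.sleep(poll_s)
    if imaging:
        device.stop_imaging()                    # final stop once the sweep is complete
```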


Where registration is desired to either a pre-operative CT image or a previously acquired CBCT image, the application 81 can at step 608 signal the radiographic imaging device 20 to acquire images only during those portions of the breathing cycle that most closely match the breathing cycle of the previously acquired CT or CBCT images. By matching the breathing cycles as closely as possible, the two image data sets will more closely resemble one another, making it easier to register the two and to transfer features such as a target or a pathway from the first image to the second. For example, in instances where registration to a pre-operative CT image is desired, the radiographic imaging device 20 can be directed by the application 81 to acquire CBCT images only during portions of the breathing cycle approaching the maximum inhale position of normal tidal breathing. When two CBCT images are to be acquired, the application 81 can store in memory the breathing phase of the first CBCT image and direct the radiographic imaging device 20 to acquire images at the same phase of breathing at step 608.


As will be appreciated, by limiting imaging to a specified portion of the breathing phase, the time required to acquire a CBCT image may be increased and may take several breathing cycles to complete. This may minimally extend the time required to acquire the CBCT image but results in greater fidelity of the captured image as the lungs are always imaged in about the same position of the breathing cycle. In addition, by monitoring the reference sensors 74, if a patient were to cough or move on the table 40 during the imaging process, the application 81, which is monitoring the positions of the reference sensors 74 through the breathing cycle, can detect the rapid movement of the sensors 74. If such a movement is detected during imaging by the radiographic imaging device 20 at step 622, the application 81 can reject the most recently acquired portion of the CBCT image at step 624. The application 81 can then direct the radiographic imaging device 20 to reposition itself at step 626 to reacquire a portion of the CBCT image that corresponds to that which was rejected in the next breathing phase, and the method proceeds back to step 610.
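
The motion guard of steps 622-626 might be sketched as below; the velocity threshold and the `reject_segment` and `reposition_device` callbacks are illustrative assumptions.

```python
# Sketch of the motion guard: a sudden jump in a reference-sensor position during
# imaging (e.g., a cough) causes the most recent image segment to be rejected and
# re-acquired on the next breathing cycle. Threshold and callbacks are illustrative.
import numpy as np


def check_for_patient_motion(prev_pos, curr_pos, dt_s: float,
                             reject_segment, reposition_device,
                             velocity_limit_mm_s: float = 30.0) -> bool:
    """Return True (and trigger re-acquisition) if the sensor moved faster than the limit."""
    speed = np.linalg.norm(np.asarray(curr_pos, float) - np.asarray(prev_pos, float)) / dt_s
    if speed > velocity_limit_mm_s:
        reject_segment()        # step 624: discard the corrupted portion of the CBCT sweep
        reposition_device()     # step 626: reacquire it during the next breathing cycle
        return True
    return False
```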


Turning now to FIG. 7, there is shown a simplified block diagram of computing device 80. Computing device 80 may include a memory 702, a processor 704, a display 706, a network interface 708, an input device 710, and/or an output module 712. Memory 702 may store application 81 and/or image data 514. Application 81 may, when executed by processor 704, cause display 706 to present user interface 716. Application 81 may also provide the interface between the sensed position of EM sensor 94 and the image and planning data developed in the pathway planning phase, described above.


Memory 702 may include any non-transitory computer-readable storage media for storing data and/or software that is executable by processor 704 and which controls the operation of computing device 80. In an embodiment, memory 702 may include one or more solid-state storage devices such as flash memory chips. Alternatively, or in addition to the one or more solid-state storage devices, memory 702 may include one or more mass storage devices connected to the processor 704 through a mass storage controller (not shown) and a communications bus (not shown). Although the description of computer-readable media contained herein refers to a solid-state storage, it should be appreciated by those skilled in the art that computer-readable storage media can be any available media that can be accessed by the processor 704. That is, computer readable storage media includes non-transitory, volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, computer-readable storage media includes RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROM, DVD, Blu-Ray or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computing device 80.


Network interface 708 may be configured to connect to a network such as a local area network (LAN) consisting of a wired network and/or a wireless network, a wide area network (WAN), a wireless mobile network, a Bluetooth network, and/or the internet. Input device 710 may be any device by means of which a user may interact with computing device 80, such as, for example, a mouse, keyboard, foot pedal, touch screen, and/or voice interface. Output module 712 may include any connectivity port or bus, such as, for example, parallel ports, serial ports, universal serial busses (USB), or any other similar connectivity port known to those skilled in the art.



FIG. 8 describes a method 800 requiring no pre-procedure CT scan data. In FIG. 8 method 800 follows steps of set up of the system 100, placement of the patient on the operating table 40, and initial navigation of catheter guide assembly 90 either alone or in combination with bronchoscope 50 to a location within the lungs. The location within the lungs could be a target lobe or other anatomical point. As an example, the location may be the third bifurcation in a desired lobe of the lung. Once at this location, method 800 starts with capturing a CBCT scan at step 802 using radiographic imaging device 20. The computing device 80 receives the CBCT scan at step 804. For example, the application 81 may retrieve the CBCT scan from a database in which the scan was stored following capture, or a user may direct the radiographic imaging device 20 to output the CBCT scan directly to the computing device 80. At step 806, the distal end of the catheter 96 and a target (e.g., a lesion or other location for treatment) are identified in one or more images of the CBCT scan. This identification can be manual where the user marks the distal end of the catheter 96 and the target in one or more of the images of the CBCT scan. These images may be displayed in a user interface on computing device 80. Alternatively, the application 81 may be configured to conduct image analysis and to automatically identify the distal end of the catheter 96 and the target. If either or both of the distal portion of the catheter or the target cannot be identified in the CBCT images from the scan, the process can return to step 802 to conduct another CBCT scan. This may require repositioning of the patient, radiographic imaging device 20, or the catheter 96.


Following identification, at step 808 the computing device 80 can register the CBCT scan data with the electromagnetic field generated by the tracking system 70. Registration may be undertaken in a variety of ways. If, for example, the coordinate system of the radiographic imaging device 20 is perpendicular to the operating table 40, all that is required is translation of the CBCT coordinates to match the tracking system (e.g., EM coordinates of the field produced by EM field generator 76). Alternatively, registration may be achieved utilizing a pose estimation technique.


To determine the pose for each slice making up the CBCT scan, fiducial markers which are formed in or on the EM field generator 76 placed under the patient are analyzed. The markers may be evenly spaced or may be varyingly spaced from one another in a known pattern. Regardless of how spaced, the orientation and placement of the markers is known, and the spacing and positioning of the markers in any slice of the CBCT can be analyzed to determine the angle of the radiographic imaging device 20 relative to the EM field generator 76. With the known position of the markers, and both a marked position of the distal portion of the catheter 96 and a detected position of the catheter as identified by the tracking system 70, a mathematical transform from the coordinate system of the CBCT scan data to the coordinate system of the tracking system 70 (e.g., EM coordinates) can be computed.


Once registration is complete, at step 810, a 3D model of the patient's lungs can be generated from the CBCT scan, similar to the process described above for the pre-procedure CT image. At step 812, a pathway is generated through the 3D model from the marked position of the distal portion of the catheter 96 to the marked position of the target. This pathway may be created manually by a user, or derived semi-automatically or automatically, much as it might be in a 3D model from a pre-procedure CT scan. Navigation to the target may now be undertaken. If at any time during the navigation the user wishes to perform another CBCT scan, that decision can be made at step 814 and the process can revert to step 802. The use of multiple CBCT scans may be desirable, for example, when performing microwave or RF ablation procedures within the lungs to ensure accurate placement of an ablation catheter in a desired location in the target. Once navigated to an appropriate location, a user or a robot may remove the LG 92 to allow for placement of an ablation catheter or other tool (e.g., a biopsy tool) to perform a procedure at step 816.
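As an illustration of automatic pathway derivation, the sketch below runs a shortest-path search over a hypothetical airway centerline graph derived from the 3D model; the graph representation and function names are assumptions, not part of the disclosure.

```python
import heapq

def airway_pathway(graph, start, target):
    """Shortest path (Dijkstra) through an airway centerline graph.

    graph: dict mapping node id -> list of (neighbor_id, length_mm) edges,
    derived from a skeletonized 3D airway model (a hypothetical
    representation). start is the branch point nearest the marked catheter
    tip, target the branch nearest the lesion; the target is assumed reachable.
    """
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == target:
            break
        if d > dist.get(node, float("inf")):
            continue
        for nbr, length in graph.get(node, []):
            nd = d + length
            if nd < dist.get(nbr, float("inf")):
                dist[nbr], prev[nbr] = nd, node
                heapq.heappush(heap, (nd, nbr))
    # Walk back from the target to recover the ordered pathway.
    path, node = [target], target
    while node != start:
        node = prev[node]
        path.append(node)
    return list(reversed(path))
```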



FIG. 9 depicts a method 900 requiring no pre-procedure CT scan data. As with method 800, method 900 follows set up of the system 100 and placement of the patient on the operating table 40. Rather than initiating navigation as described in method 800, at step 902 a CBCT scan of the patient is undertaken. The computing device 80 receives the CBCT scan at step 904. For example, the application 81 may retrieve the CBCT scan from a database in which the scan was stored following capture, or a user may direct the radiographic imaging device 20 to output the CBCT scan directly to the computing device 80. At step 906, a 3D model of the airways is generated from the CBCT scan. At step 908, either the 3D model or slice images from the CBCT scan are analyzed to identify a target (e.g., a lesion) and to generate a pathway to the target through the airways, in the same manner as can be done with a pre-procedure CT image. Following target identification and pathway planning, a sweep of the airways may be undertaken at step 910. As described above, in the sweep of the airways, sensor 94 of catheter 96 is inserted into the airways and a point cloud of position data is generated. At step 912, the point cloud of data is matched to the internal features of the 3D model, and the coordinate system of the radiographic imaging device 20 is registered to the coordinate system of the electromagnetic field output by the EM field generator 76. Once registered, navigation of the catheter 96, either manually or robotically, can be undertaken at step 914. Once proximate the target, a second CBCT scan may be undertaken at step 916. Reviewing the slice images of this CBCT scan, at step 918 the positions of the distal end of the catheter 96 and the target can be marked. At step 920, based on the marked positions of the distal portion of the catheter 96 and the target, an offset can be calculated. Since the position of the target is unlikely to have moved significantly during the procedure, this offset is substantially an indication of error in the detected position of the sensor 94 at the distal end of the catheter 96 in the EM field. With this offset calculated, at step 922 a displayed position of the distal portion of the catheter 96 in the 3D model can be updated to accurately depict the relative position of the catheter and the target in the 3D model, and in the other views provided by the user interface of application 81 described herein above. The combination of steps 920 and 922 is a local registration of the CBCT and EM field coordinate systems and provides greater accuracy, as may be desired when performing a procedure at step 924 such as microwave ablation or a diagnostic procedure such as biopsy of a lesion. If further movement of the catheter 96 is desired, further navigation can be undertaken at step 926, and the method can revert to step 916 to update the local registration.
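For illustration, the offset computation and display correction of steps 920 and 922 can be sketched as simple vector arithmetic, assuming the marked catheter-tip position from the CBCT has already been expressed in EM coordinates; the function and variable names are hypothetical.

```python
import numpy as np

def local_registration_offset(tip_marked_em: np.ndarray, tip_sensed_em: np.ndarray) -> np.ndarray:
    """Offset between the catheter tip as marked in the CBCT scan (already
    transformed into EM coordinates) and the tip position reported by the
    EM sensor (step 920)."""
    return tip_marked_em - tip_sensed_em

def corrected_display_position(sensed_em: np.ndarray, offset: np.ndarray) -> np.ndarray:
    """Shift the live sensed position by the offset before drawing it in the
    3D model (step 922), so catheter and target appear in their true relative pose."""
    return sensed_em + offset
```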



FIG. 10 provides yet a further method in accordance with the disclosure. In method 1000, at step 1002 pre-procedure planning (as described above) is undertaken utilizing a pre-procedure CT scan. At some time after the pre-procedure planning, system 100 is initialized; this may entail placement of the patient on the operating table 40, initialization of the tracking system 70, and other steps described above. Once the patient is in position, at step 1004 the radiographic imaging device 20 is employed to acquire a CBCT scan of a relevant portion of the patient (e.g., the lungs). At step 1006, the application 81 operating on computing device 80 receives the CBCT scan. For example, the application 81 may retrieve the CBCT scan from a database in which the CBCT scan was stored following capture, or a user may direct the radiographic imaging device 20 to output the CBCT scan directly to the computing device 80. At step 1008, the CBCT scan is registered to the pre-operative CT scan. A variety of means can be used for this registration; for example, image or 3D model matching may be employed to substantially match the pre-procedure CT scan to the CBCT scan. This registration enables the transfer of a planned pathway and a target from the pre-procedure plan generated from the pre-procedure CT scan to the CBCT scan. As a result of this registration, a user interface on the computing device 80 can display a pathway to the target through a 3D model and other views generated from the CBCT scan, with the target also now presented in the CBCT scan images. At step 1010, a sweep of the airways may be undertaken. As described above, in the sweep of the airways, sensor 94 of catheter 96 is inserted into the airways and a point cloud of position data is generated. At step 1012, the point cloud of data is matched to the internal features of the 3D model generated from the CBCT scan, and the coordinate system of the radiographic imaging device 20 is registered to the coordinate system of the electromagnetic field output by the EM field generator 76 (or other tracking system 70 described herein). Once registered, navigation of the catheter 96, either manually or robotically, can be undertaken at step 1014. Once proximate the target, a second CBCT scan may be undertaken at step 1016. Reviewing slice images of this CBCT scan, at step 1018 the positions of the distal end of the catheter 96 and the target can be marked. At step 1020, based on the marked positions of the distal portion of the catheter 96 and the target, an offset can be calculated. This offset is used to update the detected position of the sensor 94 in the EM field relative to the target. With this offset calculated, at step 1022 a displayed position of the distal portion of the catheter 96 in the 3D model generated from the CBCT scan can be updated to accurately depict the relative position of the catheter and the target in the 3D model, as well as in the other views provided by the user interface of application 81 described herein above. The combination of steps 1020 and 1022 is a local registration of the CBCT and EM field coordinate systems and provides greater accuracy, as may be desired when performing a treatment at step 1024 such as microwave ablation or a diagnostic procedure such as biopsy of a lesion. If further movement of the catheter 96 is desired, further navigation can be undertaken at step 1026, and the method can revert to step 1016 to update the local registration.
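For illustration, transferring a planned target or pathway from the pre-procedure CT frame into the CBCT frame can be sketched as applying a 4×4 homogeneous transform produced by the registration at step 1008. The matrix, names, and point layout below are assumptions made for the sketch only.

```python
import numpy as np

def transfer_points(points_ct: np.ndarray, ct_to_cbct: np.ndarray) -> np.ndarray:
    """Carry planned pathway/target points from pre-procedure CT space into
    CBCT space with a 4x4 homogeneous transform (e.g., the result of image-
    or model-based registration).

    points_ct: (N, 3) array of millimetre coordinates in the CT frame.
    """
    homog = np.hstack([points_ct, np.ones((points_ct.shape[0], 1))])
    return (ct_to_cbct @ homog.T).T[:, :3]

# Usage: target_cbct = transfer_points(target_ct[None, :], ct_to_cbct)[0]
```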


While several aspects of the disclosure have been shown in the drawings, it is not intended that the disclosure be limited thereto, as it is intended that the disclosure be as broad in scope as the art will allow and that the specification be read likewise. Therefore, the above description should not be construed as limiting, but merely as exemplifications of particular aspects.

Claims
  • 1. A method of registering an image to a luminal network comprising: detecting a position of a catheter-based sensor within a luminal network; receiving cone-beam computed tomography (CBCT) images of the luminal network with the sensor within the luminal network; presenting a CBCT image on a user interface; receiving an indication of a location of a target in the presented CBCT image; generating a 3D model of the luminal network from the CBCT images; generating a pathway through the luminal network from the detected position of the sensor to the target in the CBCT images and the 3D model; and comparing a detected position of the catheter-based sensor in the luminal network prior to receiving the CBCT images to a detected position of the catheter-based sensor in the luminal network after receiving the CBCT images, wherein when it is determined that the detected position of the catheter-based sensor prior to the receipt of the CBCT images is substantially the same as the detected position of the catheter-based sensor after receipt of the CBCT images, the luminal network, the CBCT images, and the 3D model are registered.
  • 2. The method of claim 1, further comprising receiving survey data from the catheter-based sensor when it is determined that the detected position of the sensor after receipt of the CBCT images is different from the detected position of the sensor prior to receipt of the CBCT images.
  • 3. The method of claim 2, further comprising registering the luminal network to the CBCT images based on survey data.
  • 4. The method of claim 1, further comprising displaying the pathway in the CBCT images, the 3D model generated from the CBCT images, or a virtual bronchoscopy view of the 3D model from the CBCT images.
  • 5. The method of claim 4 further comprising displaying the position of the sensor along the pathway in the user interface.
  • 6. A method of registering an image to a luminal network comprising: receiving a pre-operative computed tomography (CT) image of the luminal network; receiving an indication of a target within the luminal network in the CT image; generating a pathway through the luminal network to the target; receiving cone-beam computed tomography (CBCT) images of the luminal network; detecting a location of a catheter-based sensor within the luminal network; generating a 3D model of the luminal network from the CBCT images; transforming coordinates of the pre-operative CT image to coordinates of the CBCT images to register the pre-operative CT image to the CBCT images; matching features from the CT images to features of the CBCT images and 3D model derived from the CBCT images; and displaying the pathway from the detected location of the catheter-based sensor to the target in the CBCT images or 3D model.
  • 7. The method of claim 6, further comprising displaying the 3D model derived from the CBCT images, or a virtual bronchoscopy view of the 3D model from the CBCT images, on a user interface.
  • 8. The method of claim 6 further comprising generating the 3D model from the CBCT image before transforming the pre-operative CT coordinates and the CBCT coordinates.
  • 9. The method of claim 6, further comprising: transferring the target from the pre-operative CT image to the CBCT images; and generating the 3D model from the CBCT images after transferring the target and pathway from the pre-operative CT image to the CBCT images.
  • 10. The method of claim 6, further comprising receiving survey data, wherein the survey data is received prior to receipt of the CBCT images or the survey data is received after transfer of the target and pathway to the CBCT images from the pre-operative CT image to register the CBCT images to the luminal network.
Related Publications (1)
Number Date Country
20210030482 A1 Feb 2021 US
Provisional Applications (1)
Number Date Country
62880489 Jul 2019 US