The disclosure relates to a system, apparatus, and method of detecting a catheter in a series of fluoroscopic images, determining the position of the catheter based on the detection, and updating the displayed position of the catheter in an electromagnetic navigation system for surgical procedures.
There are several commonly applied methods for treating various maladies affecting organs including the liver, brain, heart, lung and kidney. Often, one or more imaging modalities, such as magnetic resonance imaging, ultrasound imaging, computed tomography (CT), as well as others are employed by clinicians to identify areas of interest within a patient and ultimately targets for treatment.
An endoscopic approach has proven useful in navigating to areas of interest within a patient, and particularly so for areas within luminal networks of the body such as the lungs. To enable the endoscopic, and more particularly the bronchoscopic, approach in the lungs, endobronchial navigation systems have been developed that use previously acquired MRI data or CT image data to generate a three dimensional rendering or volume of the particular body part, such as the lungs. In particular, images previously acquired from an MRI or CT scan of the patient are utilized to generate a three dimensional or volumetric rendering of the patient.
The resulting volume generated from the MRI scan or CT scan is then utilized to create a navigation plan to facilitate the advancement of a navigation catheter (or other suitable device) through a bronchoscope and a branch of the bronchus of a patient to an area of interest. Electromagnetic tracking may be utilized in conjunction with the CT data to facilitate guidance of the navigation catheter through the branch of the bronchus to the area of interest. In certain instances, the navigation catheter may be positioned within one of the airways of the branched luminal networks adjacent to, or within, the area of interest to provide access for one or more medical instruments.
A fluoroscopic imaging device is commonly located in the operating room during navigation procedures. The standard fluoroscopic imaging device may be used by a clinician to visualize and confirm the placement of a tool after it has been navigated to a desired location. However, although standard fluoroscopic images display highly dense objects, such as metal tools and bones, as well as large soft-tissue objects, such as the heart, small soft-tissue objects of interest, such as lesions or tumors, are difficult to resolve in fluoroscopic images. Further, the fluoroscope image is only a two dimensional projection. In order to see small soft-tissue objects in three dimensional space, an X-ray volumetric reconstruction is needed. Several solutions exist that provide three dimensional volume reconstruction of soft tissue, such as CT and Cone-beam CT, which are extensively used in the medical world. These machines algorithmically combine multiple X-ray projections from known, calibrated X-ray source positions into a three dimensional volume in which the soft tissue is visible.
The disclosure relates to a system, apparatus, and method of detecting a catheter in a series of fluoroscopic images, determining the position of the catheter based on the detection, and updating the displayed position of the catheter in an electromagnetic navigation system for surgical procedures. The disclosure utilizes a combination of shallow neural network operators and deep neural network operators to detect catheter candidates in a fluoroscopic data set. Additionally, false-positive candidate detections are eliminated according to the methods described herein. The position data of the catheter is acquired from the fluoroscopic data and is used as a correction factor for the displayed electromagnetically tracked position of the catheter.
The system of the disclosure constructs a fluoroscopic-based 3D construction of a target area which includes the catheter and a target (e.g., soft-tissue object, lesion, tumor, etc.) in order to determine the location of the catheter relative to the target. In particular, the system identifies the position, orientation, angle, and distance of the catheter relative to the target in each fluoroscopic frame of the fluoroscopic data. This relative location data is used to update a displayed electromagnetic position of the catheter over a CT-based rendering, for example, of a patient's luminal network. With this updated display, a clinician is able to more accurately navigate and confirm placement of the catheter and other surgical tools relative to a target during an electromagnetic navigation procedure.
Aspects of the disclosure are described in detail with reference to the figures wherein like reference numerals identify similar or identical elements. As used herein, the term “distal” refers to the portion that is being described which is further from a user, while the term “proximal” refers to the portion that is being described which is closer to a user.
According to one aspect of the disclosure, a method for detecting a catheter in fluoroscopic data is provided. The method includes acquiring fluoroscopic data from a fluoroscopic sweep of a target area. The target area may be, for example, within a patient's luminal network. The fluoroscopic data includes 2D fluoroscopic frames of the target area captured from different perspectives. The method further includes performing an initial catheter detection for catheter tip candidates in each 2D frame of the fluoroscopic data and performing a secondary catheter detection for catheter tip candidates in each 2D frame of the fluoroscopic data. Additionally, the method includes eliminating false-positive catheter tip candidates of the secondary catheter detection by reconstructing a 3D position of the catheter tip and finding an intersecting point of rays corresponding to each 2D frame, and reweighing the catheter tip candidates of the secondary catheter detection based on a distance of each catheter tip candidate from a projected 3D point.
The initial catheter detection may include applying a shallow neural network operator, and the secondary catheter detection may include applying a deep neural network operator. The secondary catheter detection for catheter tip candidates in each 2D frame of the fluoroscopic data may include considering the catheter tip candidates of the initial catheter detection. Reweighing the catheter tip candidates of the secondary catheter detection based on a distance of the catheter tip candidate from a projected 3D point may include decreasing a weight of a pixel corresponding to a candidate when the catheter tip candidate is far from the projected 3D point.
Additionally, the method may further include iteratively repeating the elimination of false-positive detections by reconstructing a 3D position of the catheter tip and finding an intersecting point of rays corresponding to each 2D frame. Additionally, or alternatively, the method may include displaying a user interface for manually selecting the catheter tip in a 2D fluoroscopic frame of the fluoroscopic data prior to performing the initial catheter detection for catheter tip candidates in each 2D frame of the fluoroscopic data.
In another aspect, a method for detecting a catheter in fluoroscopic data during a surgical navigation procedure is provided. The method includes tracking an electromagnetic position of a catheter using electromagnetic coordinates during a navigation procedure of the catheter to a target area, displaying the tracked electromagnetic position of the catheter on a display of a 3D rendering, and acquiring fluoroscopic data from a fluoroscopic sweep of the target area. The fluoroscopic data includes 2D fluoroscopic frames of the target area captured from different perspectives. The target area may be, for example, within a patient's luminal network and the 3D rendering may be, for example, a 3D rendering of the patient's luminal network. The method further includes constructing fluoroscopic-based three dimensional volumetric data of the target area from the acquired fluoroscopic data including a three-dimensional construction of a soft-tissue target in the target area, acquiring, for each 2D frame of the fluoroscopic data, data of a position of the catheter relative to the three-dimensional construction of the soft-tissue, and registering the acquired data of the position of the catheter relative to the three-dimensional construction of the soft-tissue with the electromagnetic position of the catheter. Additionally, the method includes displaying the position of the catheter on the display of the 3D rendering based on the registration of the acquired data of the position of the catheter relative to the three-dimensional construction of the soft-tissue with the electromagnetic position of the catheter.
The method may further include performing at least one catheter detection for catheter tip candidates in each 2D frame of the fluoroscopic data. An initial catheter detection may include applying a shallow neural network operator and a secondary catheter detection may include applying a deep neural network operator. Additionally, or alternatively, the method may further include eliminating false-positive catheter tip candidates of the secondary catheter detection by reconstructing a 3D position of the catheter tip and finding an intersecting point of rays corresponding to each 2D frame. The secondary catheter tip detection may consider the candidates identified in the initial catheter detection. Additionally, or alternatively, the method may include reweighing the catheter tip candidates of the secondary catheter detection based on a distance of the catheter tip candidate from a projected 3D point.
In another aspect, a system for performing an electromagnetic surgical navigation procedure is provided. The system includes an electromagnetic tracking system having electromagnetic tracking coordinates, a catheter including a sensor configured to couple to the electromagnetic tracking system for detecting a position of the catheter in the electromagnetic coordinates, and a computing device operably coupled to the electromagnetic tracking system and the catheter.
The computing device is configured to display a navigation path to guide navigation of the catheter to a target area, display the position of the catheter in the electromagnetic coordinates on a 3D rendering, and acquire fluoroscopic data from a fluoroscopic sweep of the target area. The target area may be, for example, within a patient's luminal network and the 3D rendering may be, for example, a 3D rendering of the patient's luminal network. The fluoroscopic data includes 2D fluoroscopic frames of the target area captured from different perspectives. Additionally, the computing device is configured to perform an initial catheter detection for catheter tip candidates in each 2D frame of the fluoroscopic data, perform a secondary catheter detection for catheter tip candidates in each 2D frame of the fluoroscopic data, eliminate false-positive catheter tip candidates of the secondary catheter detection by reconstructing a 3D position of the catheter tip and finding an intersecting point of rays corresponding to each 2D frame, and reweigh the catheter tip candidates of the secondary catheter detection based on a distance of the catheter tip candidate from a projected 3D point.
The computing device may be configured to perform an initial catheter detection for catheter tip candidates in each 2D frame of the fluoroscopic data by applying a shallow neural network operator and perform a secondary catheter detection for catheter tip candidates in each 2D frame of the fluoroscopic data by applying a deep neural network operator. Additionally, or alternatively, the computing device may be configured to construct fluoroscopic-based three dimensional volumetric data of the target area from the acquired fluoroscopic data. The fluoroscopic-based three dimensional volumetric data includes a three-dimensional construction of a soft-tissue target in the target area.
In an aspect, the computing device is further configured to acquire, for each 2D frame of the fluoroscopic data, data of a position of the catheter relative to the three-dimensional construction of the soft-tissue, register the acquired data of the position of the catheter relative to the three-dimensional construction of the soft-tissue with the electromagnetic position of the catheter, and display the position of the catheter on the display of the 3D rendering based on the registration of the acquired data of the position of the catheter relative to the three-dimensional construction of the soft-tissue with the electromagnetic position of the catheter.
Various aspects and embodiments of the disclosure are described hereinbelow with references to the drawings, wherein:
In order to navigate tools to a remote soft-tissue target for biopsy or treatment, both the tool and the target should be visible in some sort of three dimensional guidance system. The majority of these systems use some X-ray device to see through the body. For example, a CT machine can be used with iterative scans during the procedure to provide guidance through the body until the tools reach the target. This is a tedious procedure, as it requires several full CT scans, a dedicated CT room, and blind navigation between scans. In addition, each scan requires the staff to leave the room. Another option is a Cone-beam CT machine, which is available in some operating rooms and is somewhat easier to operate but is expensive and, like the CT, only provides blind navigation between scans, requires multiple iterations for navigation, and requires the staff to leave the room.
Accordingly, there is a need for a system that can achieve the benefits of the CT and Cone-beam CT three dimensional image guidance without the underlying costs, preparation requirements, and radiation side effects associated with these systems.
The disclosure is directed to a system and method for catheter detection in fluoroscopic data and constructing local three dimensional volumetric data, in which small soft-tissue objects are visible, from the fluoroscopic data captured by a standard fluoroscopic imaging device available in most procedure rooms. The catheter detection and constructed fluoroscopic-based local three dimensional volumetric data may be used for guidance, navigation planning, improved navigation accuracy, navigation confirmation, and treatment confirmation.
The following describes method 200, an exemplary method for detecting a catheter in fluoroscopic data using the system described herein.
Method 200 begins with step 201, where a catheter (or any surgical tool) is navigated to a region of interest within a patient's luminal network. The navigation in step 201 may utilize the electromagnetic tracking and pathway plans described above. In step 203, a fluoroscopic sweep of the target area is performed to acquire fluoroscopic data of the target area with the catheter positioned therein. In particular, in step 203, a fluoroscope is positioned about the patient such that fluoroscopic images of the target area may be captured along the entire sweep (e.g., ranging from −30 degrees to +30 degrees).
After the fluoroscopic data is acquired, in step 205 an initial catheter detection is performed. In the initial catheter detection, the computing device applies an operator to each 2D frame of the fluoroscopic data to detect possible catheter tip candidates in each 2D frame. In one aspect, step 205 includes utilizing a shallow neural network operator (e.g., four layers). Utilizing the shallow neural network operator provides the advantage of fast completion (approximately 1 second for every 32 2D frames of the fluoroscopic data), but simultaneously provides the disadvantage of identifying many false-positive catheter tip detections.
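By way of illustration only, the following is a minimal sketch of such a shallow per-pixel detector, written in PyTorch. The four-layer structure follows the example above, but the channel counts, kernel sizes, and the 0.8 threshold are illustrative assumptions rather than the specific architecture of the disclosure.

```python
# Sketch of a shallow (four-layer) fully-convolutional detector mapping a
# fluoroscopic frame to a per-pixel catheter-tip probability. All sizes
# below are illustrative assumptions, not the disclosure's architecture.
import torch
import torch.nn as nn

class ShallowTipDetector(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=1),  # per-pixel catheter-tip logit
        )

    def forward(self, frames):  # frames: (N, 1, H, W) grayscale fluoro sweep
        return torch.sigmoid(self.net(frames))  # (N, 1, H, W) probabilities

# Pixels whose probability exceeds a fixed threshold become tip candidates.
probs = ShallowTipDetector()(torch.rand(32, 1, 256, 256))
candidates = (probs > 0.8).nonzero()  # rows of (frame, channel, row, col)
```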
In step 207, a secondary catheter detection operation is performed to detect catheter tip candidates in each 2D frame of the fluoroscopic data. In one aspect, step 207 includes utilizing a second neural network operator, in this case a deep neural network operator (e.g., eleven layers). Utilizing the deep neural network operator takes longer to complete than the shallow neural network operator but results in fewer false-positive candidate detections. Additionally, the deep neural network operator alone, though not producing as many false-positive candidate detections as the shallow neural network operator, in some instances fails to detect the actual catheter tip as a candidate.
In view of the above, utilizing only one of the initial catheter detection in step 205 or the secondary catheter detection in step 207 would be insufficient for accurately identifying the catheter tip in the fluoroscopic data. As explained above, if the shallow neural network operator of step 205 were performed alone, the catheter would be detected in almost all of the 2D frames, but many false-positives would be detected as well, leading to an inaccurate catheter detection. Additionally, if the deep neural network operator were performed alone, then few false-positives would be detected, but some actual catheter detections would be missed, also leading to an inaccurate catheter detection. Accordingly, method 200 includes applying both the shallow neural network operator and the deep neural network operator to ensure an accurate catheter detection. Additionally, if the deep neural network has a misdetection in a fluoroscopic video frame, the catheter will not be detected in that frame (even if it was found in the initial catheter detection). In this case, valid detections from other frames prevent misdetection of the catheter, as described in further detail below. Another benefit of utilizing two separate neural networks is that splitting the detector into an initial network and a second, deeper network may enhance runtime performance, as described above.
In one aspect, in step 207, the deep neural network operator considers the candidates from the initial catheter detection of step 205. That is, by utilizing the candidates detected in the initial catheter detection, it can be ensured that the actual catheter will not be missed by the deep neural network operator and will always be identified as a catheter candidate after performance of the secondary catheter detection. In particular, the initial catheter detection outputs a "catheter probability" for each pixel in each fluoroscopic frame. Only pixels with a high probability (above a fixed threshold) are considered catheter candidates after the initial catheter detection. An image patch is extracted around each candidate pixel (having a catheter probability exceeding the threshold) and is input to the second catheter detection (e.g., the deep neural network).
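By way of illustration only, the following sketch shows one way the patch-based second stage could be arranged, assuming the candidate pixels from the first stage are available; the layer sizes and the 32-pixel patch size are illustrative assumptions.

```python
# Sketch of a second-stage classifier: an image patch is cropped around
# each first-stage candidate pixel and scored by a deeper network.
# Layer counts and patch size are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_deep_classifier():
    """Eleven weighted layers (ten conv + one linear); sizes illustrative."""
    layers, ch = [], 1
    for out_ch in (16, 16, 32, 32, 64, 64, 64, 128, 128, 128):
        layers += [nn.Conv2d(ch, out_ch, 3, padding=1), nn.ReLU()]
        ch = out_ch
    layers += [nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(ch, 1)]
    return nn.Sequential(*layers)

def score_candidates(frame, candidates, classifier, patch=32):
    """frame: (1, H, W); candidates: iterable of (row, col) pixels that
    exceeded the first-stage probability threshold. Returns tip scores."""
    half = patch // 2
    padded = F.pad(frame, (half, half, half, half))  # keep border patches valid
    patches = torch.stack([padded[:, r:r + patch, c:c + patch]
                           for r, c in candidates])
    with torch.no_grad():
        return torch.sigmoid(classifier(patches)).squeeze(1)
```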
In step 209, false-positive detections are eliminated by reconstructing a 3D position of the catheter tip. In particular, the 3D position of the catheter tip is reconstructed by finding the intersecting point of the rays corresponding to the catheter tip detections in each 2D frame, and candidates whose rays do not pass through or near that point are eliminated as false positives.
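By way of illustration only, the intersecting point of the rays can be computed as the least-squares point closest to all back-projected rays, assuming the fluoroscope geometry (a source position and a ray direction per frame) is known. The construction below is a standard sketch, not necessarily the disclosure's exact algorithm.

```python
# Least-squares "intersection" of back-projected rays: each 2D tip
# detection, together with the known fluoroscope pose for its frame,
# defines a ray from the X-ray source through the detected pixel.
import numpy as np

def intersect_rays(origins, directions):
    """origins: (K, 3) X-ray source positions; directions: (K, 3) unit
    vectors from each source through its frame's detected tip pixel.
    Returns the 3D point minimizing summed squared distance to all rays."""
    A, b = np.zeros((3, 3)), np.zeros(3)
    for o, d in zip(origins, directions):
        P = np.eye(3) - np.outer(d, d)  # projects onto the plane normal to d
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

def ray_to_point_distance(point, origin, direction):
    """Perpendicular distance from the reconstructed 3D point to one ray;
    detections whose rays pass far from the point are false positives."""
    v = point - origin
    return np.linalg.norm(v - (v @ direction) * direction)
```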
In step 213, the catheter candidate detections are reweighed according to their distance from the projection of the 3D point reconstructed in step 209. In particular, the reconstructed 3D point is projected into each 2D frame, and the weight of a pixel corresponding to a candidate is decreased as the candidate's distance from the projected 3D point increases.
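By way of illustration only, the reweighing could take the following form, assuming the projection of the reconstructed 3D point into each frame is available; the Gaussian falloff and its 15-pixel scale are assumptions, since the disclosure specifies only that weight decreases with distance.

```python
# Sketch of the reweighing step for one frame. The falloff shape and
# sigma_px value are illustrative assumptions.
import numpy as np

def reweigh(candidates_px, weights, projected_px, sigma_px=15.0):
    """candidates_px: (K, 2) candidate pixel coordinates in one frame;
    weights: (K,) detection scores; projected_px: (2,) projection of the
    reconstructed 3D tip into this frame (projection geometry assumed)."""
    d = np.linalg.norm(candidates_px - projected_px, axis=1)
    return weights * np.exp(-(d / sigma_px) ** 2)  # far candidates downweighted
```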
Method 700 begins at step 701 where the electromagnetic position of a catheter is tracked during navigation of the catheter to a target area within a patient's luminal network. In step 703, the tracked electromagnetic position of the catheter is displayed during the navigation on a display of a CT-based 3D rendering of the patient's luminal network. Displaying the electromagnetic position of the catheter on the 3D rendering of the patient's luminal network assists the clinician in navigating to the region of interest or target area. Step 703 may additionally include displaying a predetermined route or path to follow through the patient's luminal network to navigate to the target area. Such a display of the pathway or route enables a clinician to identify when the catheter has deviated from the desired path, the progress of the navigation, and when the target is reached.
Once the catheter is navigated to the target area as displayed in step 703, it is possible that the actual position of the catheter within the patient's luminal network is not exactly as displayed, because the displayed position of the catheter is based only on the electromagnetically tracked position of the catheter's sensor and is displayed over a previously acquired CT data set. Targets within the target area, and other such anatomical features, may have changed since the time of acquisition of the CT data, and the electromagnetic tracking is subject to interference and unreliability. Additionally, the display in step 703 merely shows a representation of the catheter, and not the actual catheter within the luminal network relative to structures therein. To more accurately display the actual position of the catheter within the patient's luminal network, method 700 proceeds to step 705.
In step 705, a fluoroscopic sweep of the target area is performed to acquire fluoroscopic data of the target area with the catheter positioned therein. In particular, in step 705, a fluoroscope is positioned about the patient such that fluoroscopic images of the target area may be captured along a sweep (e.g., ranging from −30 degrees to +30 degrees). In step 707, the catheter tip is detected throughout the frames of the fluoroscopic data. In one aspect, step 707 utilizes some or all of the steps of method 200 for the catheter detection.
In step 709, a fluoroscopic-based 3D rendering of the target area with the catheter positioned therein is constructed. Such a fluoroscopic-based 3D construction enables visualization of objects that are not otherwise visible in the fluoroscopic data itself. For example, small soft-tissue objects, such as tumors or lesions, are not visible in the fluoroscopic data, but are visible in the fluoroscopic-based 3D rendering of the target area. Further details regarding the construction of step 709 may be found in U.S. Patent Application Publication No. 2017/0035380, filed Aug. 1, 2016, entitled System and Method for Navigating to Target and Performing Procedure on Target Utilizing Fluoroscopic-Based Local Three Dimensional Volume Reconstruction, and U.S. Patent Application Publication No. 2017/0035379, filed Aug. 1, 2016, entitled System and Method for Local Three Dimensional Volume Reconstruction Using a Standard Fluoroscope, the entire contents of each of which are incorporated by reference herein.
In step 711, position data of the catheter relative to the target in the target area is acquired for each frame of the fluoroscopic data. In particular, the construction of the fluoroscopic-based 3D rendering in step 709 enables the target (e.g., soft-tissue object, lesion, tumor, etc.) to be visualized and its location to be determined or otherwise ascertained. As described above, prior to the construction of the fluoroscopic-based 3D rendering in step 709, the soft-tissue objects (e.g., target, tumor, lesion, etc.) are not visible and the location of the object(s) relative to the catheter cannot be determined. In one aspect, in step 711, the position, orientation, and distance of the catheter relative to the target are determined for each slice of the fluoroscopic-based 3D rendering. Additionally, the position data acquired in step 711 may be correlated to the fluoroscopic data acquired in step 705.
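By way of illustration only, the relative position data for a given frame or slice might be computed as follows, assuming the detected tip position, the catheter axis direction at the tip, and the target centroid are all expressed in the coordinates of the fluoroscopic-based 3D reconstruction.

```python
# Sketch of the catheter-to-target relation in reconstruction coordinates.
# The availability of tip_dir (catheter axis at the tip) is an assumption.
import numpy as np

def catheter_target_relation(tip_pos, tip_dir, target_pos):
    """tip_pos, target_pos: (3,) points in reconstruction coordinates;
    tip_dir: (3,) catheter axis direction at the tip (need not be unit)."""
    offset = target_pos - tip_pos                 # relative position
    distance = np.linalg.norm(offset)             # catheter-to-target distance
    cos_a = (offset @ tip_dir) / (distance * np.linalg.norm(tip_dir))
    angle_deg = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return offset, distance, angle_deg            # orientation as an angle
```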
In step 713, the position data acquired in step 711 is registered to the electromagnetically tracked coordinates of the catheter. In one aspect, in step 713, the system displays the catheter on the CT data by employing an "Antenna-to-CT" registration, where the catheter position (and optionally orientation) in antenna coordinates is continuously calculated by the electromagnetic localization algorithms. The position is then transformed to CT coordinates. This registration may be based on the first bifurcations of the airway tree (main carina and bronchi). Due to this (and other factors, such as patient sedation and different patient pose during the CT scan and the bronchoscopic procedure), the registration is less accurate in the periphery of the lungs. The registration is corrected for the target area by composing one or more of the following registrations: 1) Antenna-to-fluoro, where the catheter position is known in both antenna coordinates (using electromagnetic localization) and fluoro coordinates (using the automatic catheter detection described in the preceding steps), and is used as a basis for the registration; and/or 2) Fluoro-to-CT, where the target's position is marked on the CT data pre-operatively and then also marked on the fluoro-based 3D reconstruction intra-operatively. After performing calculations for the above registrations, a new catheter position in antenna coordinates is transformed to fluoro coordinates by employing the "Antenna-to-fluoro" registration, and then transformed to CT coordinates by employing the "Fluoro-to-CT" registration.
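By way of illustration only, composing the registrations can be expressed with 4x4 homogeneous transforms; the transform names below are hypothetical, and the transforms themselves are assumed to have been estimated as described above.

```python
# Sketch of composing the two corrections: antenna -> fluoro -> CT.
# T_fluoro_from_antenna and T_ct_from_fluoro are hypothetical names for
# the registrations estimated as described in the text.
import numpy as np

def apply_transform(T, p):
    """Apply a 4x4 homogeneous transform T to a 3D point p."""
    return (T @ np.append(p, 1.0))[:3]

def antenna_to_ct(p_antenna, T_fluoro_from_antenna, T_ct_from_fluoro):
    """Map a catheter position from antenna (EM) coordinates to CT
    coordinates by composing the two registrations."""
    p_fluoro = apply_transform(T_fluoro_from_antenna, p_antenna)
    return apply_transform(T_ct_from_fluoro, p_fluoro)
```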
In step 715, the displayed position of the catheter on the CT-based 3D rendering of the patient's luminal network is updated using the position data acquired in step 711. In particular, the position, orientation, and distance of the catheter relative to the target in the fluoroscopic-based 3D data set are compared to the position, orientation, and distance of the displayed catheter relative to the target in the CT-based 3D rendering. In one aspect, the orientation, position, and distance of the displayed catheter relative to the target in the CT-based rendering are updated to correspond to, or match, the orientation, position, and distance of the catheter relative to the target in the fluoroscopic-based 3D data set.
Referring back to the system described above, additional components of system 100 are now described.
A fluoroscopic imaging device 110 capable of acquiring fluoroscopic or x-ray images or video of the patient “P” is also included in this particular aspect of system 100. The fluoroscopic data (e.g., images, series of images, or video) captured by the fluoroscopic imaging device 110 may be stored within the fluoroscopic imaging device 110 or transmitted to computing device 125 for storage, processing, and display. Additionally, the fluoroscopic imaging device 110 may move relative to the patient “P” so that images may be acquired from different angles or perspectives relative to the patient “P” to create a fluoroscopic video from a fluoroscopic sweep. Fluoroscopic imaging device 110 may include a single imaging device or more than one imaging device. In embodiments including multiple imaging devices, each imaging device may be a different type of imaging device or the same type. Further details regarding the imaging device 110 are described in U.S. Pat. No. 8,565,858, which is incorporated by reference in its entirety herein.
Computing device 125 may be any suitable computing device including a processor and storage medium, wherein the processor is capable of executing instructions stored on the storage medium. The computing device 125 may further include a database configured to store patient data, CT data sets including CT images, fluoroscopic data sets including fluoroscopic images and video, navigation plans, and any other such data. Although not explicitly illustrated, the computing device 125 may include inputs for receiving, or may otherwise be configured to receive, CT data sets, fluoroscopic images/video, and other data described herein. Additionally, computing device 125 includes a display configured to display graphical user interfaces.
With respect to the planning phase, computing device 125 utilizes previously acquired CT image data for generating and viewing a three dimensional model of the patient's "P's" airways, enables the identification of a target on the three dimensional model (automatically, semi-automatically, or manually), and allows for determining a pathway through the patient's "P's" airways to tissue located at and around the target. More specifically, CT images acquired from previous CT scans are processed and assembled into a three dimensional CT volume, which is then utilized to generate a three dimensional model of the patient's "P's" airways. The three dimensional model may be displayed on a display associated with computing device 125, or in any other suitable fashion. Using computing device 125, various views of the three dimensional model or enhanced two dimensional images generated from the three dimensional model are presented. The enhanced two dimensional images may possess some three dimensional capabilities because they are generated from three dimensional data. The three dimensional model may be manipulated to facilitate identification of a target on the three dimensional model or two dimensional images, and a suitable pathway (e.g., a route to be followed during navigation) through the patient's "P's" airways to access tissue located at the target can be selected. Once selected, the pathway plan, three dimensional model, and images derived therefrom can be saved and exported to a navigation system for use during the navigation phase(s). One such planning software is the ILOGIC® planning suite currently sold by Medtronic PLC.
With respect to the navigation phase, a six degrees-of-freedom electromagnetic tracking system 50, e.g., similar to those disclosed in U.S. Pat. Nos. 8,467,589 and 6,188,355 and published PCT Application Nos. WO 00/10456 and WO 01/67035, the entire contents of each of which are incorporated herein by reference, or another suitable position measuring system, is utilized for performing registration of the images and the pathway for navigation, although other configurations are also contemplated. Tracking system 50 includes a tracking module 52, a plurality of reference sensors 54, and a transmitter mat 56. Tracking system 50 is configured for use with a sensor 44 of catheter guide assembly 40 to track the electromagnetic position thereof within an electromagnetic coordinate system.
Transmitter mat 56 is positioned beneath patient “P.” Transmitter mat 56 generates an electromagnetic field around at least a portion of the patient “P” within which the position of a plurality of reference sensors 54 and the sensor element 44 can be determined with use of a tracking module 52. One or more of reference sensors 54 are attached to the chest of the patient “P.” The six degrees of freedom coordinates of reference sensors 54 are sent to computing device 125 (which includes the appropriate software) where they are used to calculate a patient coordinate frame of reference. Registration, as detailed below, is generally performed to coordinate locations of the three dimensional model and two dimensional images from the planning phase with the patient's “P's” airways as observed through the bronchoscope 30, and allow for the navigation phase to be undertaken with precise knowledge of the location of the sensor 44, even in portions of the airway where the bronchoscope 30 cannot reach. Further details of such a registration technique and their implementation in luminal navigation can be found in U.S. Patent Application Publication No. 2011/0085720, the entire content of which is incorporated herein by reference, although other suitable techniques are also contemplated.
Registration of the patient's "P's" location on the transmitter mat 56 is performed by moving sensor 44 through the airways of the patient "P." More specifically, data pertaining to locations of sensor 44, while EWC 12 is moving through the airways, is recorded using transmitter mat 56, reference sensors 54, and tracking module 52. A shape resulting from this location data is compared to an interior geometry of passages of the three dimensional model generated in the planning phase, and a location correlation between the shape and the three dimensional model based on the comparison is determined, e.g., utilizing the software on computing device 125. In addition, the software identifies non-tissue space (e.g., air-filled cavities) in the three dimensional model. The software aligns, or registers, an image representing a location of sensor 44 with the three dimensional model and the two dimensional images generated from the three dimensional model, based on the recorded location data and an assumption that sensor 44 remains located in non-tissue space in the patient's "P's" airways. Alternatively, a manual registration technique may be employed by navigating the bronchoscope 30 with the sensor 44 to pre-specified locations in the lungs of the patient "P," and manually correlating the images from the bronchoscope to the model data of the three dimensional model.
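The disclosure does not specify the shape-matching algorithm; by way of illustration only, one common way to align a recorded sensor track with points sampled from the model's non-tissue space is rigid iterative-closest-point (ICP) registration, sketched below. This is illustrative only and not necessarily the registration of the referenced publication.

```python
# Sketch of rigid ICP aligning the recorded sensor track with the model.
# Using ICP here is an assumption; the actual matching method may differ.
import numpy as np
from scipy.spatial import cKDTree

def icp_rigid(track_pts, model_pts, iters=20):
    """track_pts: (N, 3) recorded sensor locations; model_pts: (M, 3)
    points sampled from the non-tissue space of the three dimensional
    model. Returns rotation R and translation t aligning track to model."""
    R, t = np.eye(3), np.zeros(3)
    tree = cKDTree(model_pts)
    for _ in range(iters):
        moved = track_pts @ R.T + t                 # current alignment
        nearest = model_pts[tree.query(moved)[1]]   # closest model points
        mu_a, mu_b = moved.mean(axis=0), nearest.mean(axis=0)
        H = (moved - mu_a).T @ (nearest - mu_b)     # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)                 # Kabsch rotation update
        dR = Vt.T @ np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)]) @ U.T
        R, t = dR @ R, dR @ (t - mu_a) + mu_b       # compose the update
    return R, t
```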
Following registration of the patient “P” to the image data and pathway plan, a user interface is displayed in the navigation software which sets forth the pathway that the clinician is to follow to reach the target. One such navigation software is the ILOGIC® navigation suite currently sold by Medtronic PLC.
Once EWC 12 has been successfully navigated proximate the target as depicted on the user interface, the EWC 12 is in place as a guide channel for guiding medical instruments to the target, including without limitation optical systems, ultrasound probes, marker placement tools, biopsy tools, ablation tools (e.g., microwave ablation devices), laser probes, cryogenic probes, sensor probes, and aspirating needles.
With the above-described updated position of the catheter relative to a target in a target area, a clinician is able to more accurately navigate to a region of interest or target within a patient's luminal network. Errors in the electromagnetically tracked and displayed location of the catheter in the CT-based 3D rendering of the patient's luminal network are corrected based on near real-time position data extracted from the fluoroscopic-based 3D data of the target area.
From the foregoing and with reference to the various figure drawings, those skilled in the art will appreciate that certain modifications can also be made to the disclosure without departing from the scope of the same. For example, although the systems and methods are described as usable with an EMN system for navigation through a luminal network such as the lungs, the systems and methods described herein may be utilized with systems that utilize other navigation and treatment devices such as percutaneous devices. Additionally, although the above-described system and method is described as used within a patient's luminal network, it is appreciated that the above-described systems and methods may be utilized in other target regions such as the liver. Further, the above-described systems and methods are also usable for transthoracic needle aspiration procedures.
Detailed embodiments of the disclosure are disclosed herein. However, the disclosed embodiments are merely examples of the disclosure, which may be embodied in various forms and aspects. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the disclosure in virtually any appropriately detailed structure.
As can be appreciated, a medical instrument, such as a biopsy tool or an energy device such as a microwave ablation catheter, that is positionable through one or more branched luminal networks of a patient to treat tissue may prove useful in the surgical arena, and the disclosure is directed to systems and methods that are usable with such instruments and tools. Access to luminal networks may be percutaneous or through a natural orifice using navigation techniques. Additionally, navigation through a luminal network may be accomplished using image guidance. These image-guidance systems may be separate from or integrated with the energy device or a separate access tool and may include MRI, CT, fluoroscopy, ultrasound, electrical impedance tomography, optical, and/or device tracking systems. Methodologies for locating the access tool include EM, IR, echolocation, optical, and others. Tracking systems may be integrated with an imaging device, where tracking is done in virtual space or fused with preoperative or live images. In some cases, the treatment target may be directly accessed from within the lumen, such as for the treatment of the endobronchial wall for COPD, asthma, lung cancer, etc. In other cases, the energy device and/or an additional access tool may be required to pierce the lumen and extend into other tissues to reach the target, such as for the treatment of disease within the parenchyma. Final localization and confirmation of energy device or tool placement may be performed with imaging and/or navigational guidance using a standard fluoroscopic imaging device incorporated with the methods and systems described above.
While several embodiments of the disclosure have been shown in the drawings, it is not intended that the disclosure be limited thereto, as it is intended that the disclosure be as broad in scope as the art will allow and that the specification be read likewise. Therefore, the above description should not be construed as limiting, but merely as exemplifications of particular embodiments. Those skilled in the art will envision other modifications within the scope and spirit of the claims appended hereto.
This application is a continuation of U.S. patent application Ser. No. 17/115,589, filed on Dec. 8, 2020, which is a continuation of U.S. patent application Ser. No. 16/259,731 filed on Jan. 28, 2019, now U.S. Pat. No. 10,905,498, which claims the benefit of the filing date of provisional U.S. Patent Application Ser. No. 62/627,911, filed Feb. 8, 2018, the entire contents of each of which are incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
5706324 | Wiesent et al. | Jan 1998 | A |
5852646 | Klotz et al. | Dec 1998 | A |
5930329 | Navab | Jul 1999 | A |
5951475 | Gueziec et al. | Sep 1999 | A |
5963612 | Navab | Oct 1999 | A |
5963613 | Navab | Oct 1999 | A |
6038282 | Wiesent et al. | Mar 2000 | A |
6049582 | Navab | Apr 2000 | A |
6050724 | Schmitz et al. | Apr 2000 | A |
6055449 | Navab | Apr 2000 | A |
6081577 | Webber | Jun 2000 | A |
6118845 | Simon et al. | Sep 2000 | A |
6120180 | Graumann | Sep 2000 | A |
6236704 | Navab et al. | May 2001 | B1 |
6243439 | Arai et al. | Jun 2001 | B1 |
6285739 | Rudin et al. | Sep 2001 | B1 |
6289235 | Webber et al. | Sep 2001 | B1 |
6317621 | Graumann et al. | Nov 2001 | B1 |
6351513 | Bani-Hashemi et al. | Feb 2002 | B1 |
6359960 | Wahl et al. | Mar 2002 | B1 |
6382835 | Graumann et al. | May 2002 | B2 |
6389104 | Bani-Hashemi et al. | May 2002 | B1 |
6404843 | Vaillant | Jun 2002 | B1 |
6424731 | Launay et al. | Jul 2002 | B1 |
6484049 | Seeley et al. | Nov 2002 | B1 |
6485422 | Mikus et al. | Nov 2002 | B1 |
6490475 | Seeley et al. | Dec 2002 | B1 |
6491430 | Seissler | Dec 2002 | B1 |
6546068 | Shimura | Apr 2003 | B1 |
6546279 | Bova et al. | Apr 2003 | B1 |
6549607 | Webber | Apr 2003 | B1 |
6585412 | Mitschke | Jul 2003 | B2 |
6662036 | Cosman | Dec 2003 | B2 |
6666579 | Jensen | Dec 2003 | B2 |
6697664 | Kienzle et al. | Feb 2004 | B2 |
6707878 | Claus et al. | Mar 2004 | B2 |
6714810 | Grzeszczuk et al. | Mar 2004 | B2 |
6731283 | Navab | May 2004 | B1 |
6731970 | Schlossbauer et al. | May 2004 | B2 |
6768784 | Green et al. | Jul 2004 | B1 |
6782287 | Grzeszczuk et al. | Aug 2004 | B2 |
6785356 | Grass et al. | Aug 2004 | B2 |
6785571 | Glossop | Aug 2004 | B2 |
6801597 | Webber | Oct 2004 | B2 |
6810278 | Webber et al. | Oct 2004 | B2 |
6823207 | Jensen et al. | Nov 2004 | B1 |
6851855 | Mitschke et al. | Feb 2005 | B2 |
6856826 | Seeley et al. | Feb 2005 | B2 |
6856827 | Seeley et al. | Feb 2005 | B2 |
6865253 | Blumhofer et al. | Mar 2005 | B2 |
6898263 | Avinash et al. | May 2005 | B2 |
6912265 | Hebecker et al. | Jun 2005 | B2 |
6928142 | Shao et al. | Aug 2005 | B2 |
6944260 | Hsieh et al. | Sep 2005 | B2 |
6956927 | Sukeyasu et al. | Oct 2005 | B2 |
7010080 | Mitschke et al. | Mar 2006 | B2 |
7010152 | Bojer et al. | Mar 2006 | B2 |
7035371 | Boese et al. | Apr 2006 | B2 |
7048440 | Graumann et al. | May 2006 | B2 |
7066646 | Pescatore et al. | Jun 2006 | B2 |
7106825 | Gregerson et al. | Sep 2006 | B2 |
7117027 | Zheng et al. | Oct 2006 | B2 |
7129946 | Ditt et al. | Oct 2006 | B2 |
7130676 | Barrick | Oct 2006 | B2 |
7142633 | Eberhard et al. | Nov 2006 | B2 |
7147373 | Cho et al. | Dec 2006 | B2 |
7165362 | Jobs et al. | Jan 2007 | B2 |
7186023 | Morita et al. | Mar 2007 | B2 |
7251522 | Essenreiter et al. | Jul 2007 | B2 |
7327872 | Vaillant et al. | Feb 2008 | B2 |
7343195 | Strommer et al. | Mar 2008 | B2 |
7369641 | Tsubaki et al. | May 2008 | B2 |
7426256 | Rasche et al. | Sep 2008 | B2 |
7440538 | Tsujii | Oct 2008 | B2 |
7467007 | Lothert | Dec 2008 | B2 |
7474913 | Durlak | Jan 2009 | B2 |
7502503 | Bojer et al. | Mar 2009 | B2 |
7505549 | Ohishi et al. | Mar 2009 | B2 |
7508388 | Barfuss et al. | Mar 2009 | B2 |
7603155 | Jensen et al. | Oct 2009 | B2 |
7620223 | Xu et al. | Nov 2009 | B2 |
7639866 | Pomero et al. | Dec 2009 | B2 |
7664542 | Boese et al. | Feb 2010 | B2 |
7671887 | Pescatore et al. | Mar 2010 | B2 |
7689019 | Boese et al. | Mar 2010 | B2 |
7689042 | Brunner et al. | Mar 2010 | B2 |
7693263 | Bouvier et al. | Apr 2010 | B2 |
7711082 | Fujimoto et al. | May 2010 | B2 |
7711083 | Heigl et al. | May 2010 | B2 |
7711409 | Keppel et al. | May 2010 | B2 |
7712961 | Horndler et al. | May 2010 | B2 |
7720520 | P et al. | May 2010 | B2 |
7725165 | Chen et al. | May 2010 | B2 |
7734329 | Boese et al. | Jun 2010 | B2 |
7742557 | Brunner et al. | Jun 2010 | B2 |
7761135 | Pfister et al. | Jul 2010 | B2 |
7778685 | Evron et al. | Aug 2010 | B2 |
7778690 | Boese et al. | Aug 2010 | B2 |
7787932 | Vilsmeier et al. | Aug 2010 | B2 |
7804991 | Abovitz et al. | Sep 2010 | B2 |
7831096 | Williamson et al. | Nov 2010 | B2 |
7835779 | Anderson et al. | Nov 2010 | B2 |
7844094 | Jeung et al. | Nov 2010 | B2 |
7853061 | Gorges et al. | Dec 2010 | B2 |
7877132 | Rongen et al. | Jan 2011 | B2 |
7899226 | Pescatore et al. | Mar 2011 | B2 |
7907989 | Borgert et al. | Mar 2011 | B2 |
7912180 | Zou et al. | Mar 2011 | B2 |
7912262 | Timmer et al. | Mar 2011 | B2 |
7949088 | Nishide et al. | May 2011 | B2 |
7950849 | Claus et al. | May 2011 | B2 |
7991450 | Virtue et al. | Aug 2011 | B2 |
8000436 | Seppi et al. | Aug 2011 | B2 |
8043003 | Vogt et al. | Oct 2011 | B2 |
8045780 | Boese et al. | Oct 2011 | B2 |
8050739 | Eck et al. | Nov 2011 | B2 |
8090168 | Washburn et al. | Jan 2012 | B2 |
8104958 | Weiser et al. | Jan 2012 | B2 |
8111894 | Haar | Feb 2012 | B2 |
8111895 | Spahn | Feb 2012 | B2 |
8126111 | Uhde et al. | Feb 2012 | B2 |
8126241 | Zarkh et al. | Feb 2012 | B2 |
8150131 | Harer et al. | Apr 2012 | B2 |
8180132 | Gorges et al. | May 2012 | B2 |
8195271 | Rahn | Jun 2012 | B2 |
8200316 | Keppel et al. | Jun 2012 | B2 |
8208708 | Homan et al. | Jun 2012 | B2 |
8229061 | Hanke et al. | Jul 2012 | B2 |
8248413 | Gattani et al. | Aug 2012 | B2 |
8270691 | Xu et al. | Sep 2012 | B2 |
8271068 | Khamene et al. | Sep 2012 | B2 |
8275448 | Camus et al. | Sep 2012 | B2 |
8306303 | Bruder et al. | Nov 2012 | B2 |
8311617 | Keppel et al. | Nov 2012 | B2 |
8320992 | Frenkel et al. | Nov 2012 | B2 |
8326403 | Pescatore et al. | Dec 2012 | B2 |
8335359 | Fidrich et al. | Dec 2012 | B2 |
8340379 | Razzaque et al. | Dec 2012 | B2 |
8345817 | Fuchs et al. | Jan 2013 | B2 |
8374416 | Gagesch et al. | Feb 2013 | B2 |
8374678 | Graumann | Feb 2013 | B2 |
8423117 | Pichon et al. | Apr 2013 | B2 |
8442618 | Strommer et al. | May 2013 | B2 |
8515527 | Vaillant et al. | Aug 2013 | B2 |
8526688 | Groszmann et al. | Sep 2013 | B2 |
8526700 | Isaacs | Sep 2013 | B2 |
8532258 | Bulitta et al. | Sep 2013 | B2 |
8532259 | Shedlock et al. | Sep 2013 | B2 |
8548567 | Maschke et al. | Oct 2013 | B2 |
8625869 | Harder et al. | Jan 2014 | B2 |
8666137 | Nielsen et al. | Mar 2014 | B2 |
8670603 | Tolkowsky et al. | Mar 2014 | B2 |
8675996 | Liao et al. | Mar 2014 | B2 |
8693622 | Graumann et al. | Apr 2014 | B2 |
8693756 | Tolkowsky et al. | Apr 2014 | B2 |
8694075 | Groszmann et al. | Apr 2014 | B2 |
8706184 | Mohr et al. | Apr 2014 | B2 |
8706186 | Fichtinger et al. | Apr 2014 | B2 |
8712129 | Strommer et al. | Apr 2014 | B2 |
8718346 | Isaacs et al. | May 2014 | B2 |
8750582 | Boese et al. | Jun 2014 | B2 |
8755587 | Bender et al. | Jun 2014 | B2 |
8781064 | Fuchs et al. | Jul 2014 | B2 |
8792704 | Isaacs | Jul 2014 | B2 |
8798339 | Mielekamp et al. | Aug 2014 | B2 |
8827934 | Chopra et al. | Sep 2014 | B2 |
8831310 | Razzaque et al. | Sep 2014 | B2 |
8855748 | Keppel et al. | Oct 2014 | B2 |
9001121 | Finlayson et al. | Apr 2015 | B2 |
9001962 | Funk | Apr 2015 | B2 |
9008367 | Tolkowsky et al. | Apr 2015 | B2 |
9031188 | Belcher et al. | May 2015 | B2 |
9036777 | Ohishi et al. | May 2015 | B2 |
9042624 | Dennerlein | May 2015 | B2 |
9044190 | Rubner et al. | Jun 2015 | B2 |
9087404 | Hansis et al. | Jul 2015 | B2 |
9095252 | Popovic | Aug 2015 | B2 |
9104902 | Xu et al. | Aug 2015 | B2 |
9111175 | Strommer et al. | Aug 2015 | B2 |
9135706 | Zagorchev et al. | Sep 2015 | B2 |
9171365 | Mareachen et al. | Oct 2015 | B2 |
9179878 | Jeon | Nov 2015 | B2 |
9216065 | Cohen et al. | Dec 2015 | B2 |
9232924 | Liu et al. | Jan 2016 | B2 |
9262830 | Bakker et al. | Feb 2016 | B2 |
9265468 | Rai et al. | Feb 2016 | B2 |
9277893 | Tsukagoshi et al. | Mar 2016 | B2 |
9280837 | Grass et al. | Mar 2016 | B2 |
9282944 | Fallavollita et al. | Mar 2016 | B2 |
9401047 | Bogoni et al. | Jul 2016 | B2 |
9406134 | Klingenbeck-Regn et al. | Aug 2016 | B2 |
9445772 | Callaghan et al. | Sep 2016 | B2 |
9445776 | Han et al. | Sep 2016 | B2 |
9466135 | Koehler et al. | Oct 2016 | B2 |
9743896 | Averbuch | Aug 2017 | B2 |
9918659 | Chopra et al. | Mar 2018 | B2 |
10373719 | Soper et al. | Aug 2019 | B2 |
10376178 | Chopra | Aug 2019 | B2 |
10405753 | Sorger | Sep 2019 | B2 |
10478162 | Barbagli et al. | Nov 2019 | B2 |
10480926 | Froggatt et al. | Nov 2019 | B2 |
10524866 | Srinivasan et al. | Jan 2020 | B2 |
10555788 | Panescu et al. | Feb 2020 | B2 |
10610306 | Chopra | Apr 2020 | B2 |
10638953 | Duindam et al. | May 2020 | B2 |
10674970 | Averbuch et al. | Jun 2020 | B2 |
10682070 | Duindam | Jun 2020 | B2 |
10706543 | Donhowe et al. | Jul 2020 | B2 |
10709506 | Coste-Maniere et al. | Jul 2020 | B2 |
10772485 | Schlesinger et al. | Sep 2020 | B2 |
10796432 | Mintz et al. | Oct 2020 | B2 |
10823627 | Sanborn et al. | Nov 2020 | B2 |
10827913 | Ummalaneni et al. | Nov 2020 | B2 |
10835153 | Rafii-Tari et al. | Nov 2020 | B2 |
10885630 | Li et al. | Jan 2021 | B2 |
10905498 | Birenbaum et al. | Feb 2021 | B2 |
20020122536 | Kerrien et al. | Sep 2002 | A1 |
20020163996 | Kerrien et al. | Nov 2002 | A1 |
20020188194 | Cosman | Dec 2002 | A1 |
20030013972 | Makin | Jan 2003 | A1 |
20030088179 | Seeley et al. | May 2003 | A1 |
20050220264 | Hornegger | Oct 2005 | A1 |
20050245807 | Boese et al. | Nov 2005 | A1 |
20050281385 | Johnson et al. | Dec 2005 | A1 |
20060182216 | Lauritsch et al. | Aug 2006 | A1 |
20060251213 | Bernhardt et al. | Nov 2006 | A1 |
20070276216 | Beyar et al. | Nov 2007 | A1 |
20130223702 | Holsing | Aug 2013 | A1 |
20130303945 | Blumenkranz et al. | Nov 2013 | A1 |
20140035798 | Kawada et al. | Feb 2014 | A1 |
20140275985 | Walker | Sep 2014 | A1 |
20150148690 | Chopra et al. | May 2015 | A1 |
20150227679 | Kamer et al. | Aug 2015 | A1 |
20150265368 | Chopra et al. | Sep 2015 | A1 |
20160005194 | Schretter et al. | Jan 2016 | A1 |
20160157939 | Larkin et al. | Jun 2016 | A1 |
20160183841 | Duindam et al. | Jun 2016 | A1 |
20160192860 | Allenby et al. | Jul 2016 | A1 |
20160206380 | Sparks et al. | Jul 2016 | A1 |
20160287343 | Eichler et al. | Oct 2016 | A1 |
20160287344 | Donhowe et al. | Oct 2016 | A1 |
20170112576 | Coste-Maniere et al. | Apr 2017 | A1 |
20170209071 | Zhao et al. | Jul 2017 | A1 |
20170265952 | Donhowe et al. | Sep 2017 | A1 |
20170311844 | Zhao et al. | Nov 2017 | A1 |
20170319165 | Averbuch | Nov 2017 | A1 |
20180078318 | Barbagli et al. | Mar 2018 | A1 |
20180153621 | Duindam et al. | Jun 2018 | A1 |
20180235709 | Donhowe et al. | Aug 2018 | A1 |
20180240237 | Donhowe et al. | Aug 2018 | A1 |
20180256262 | Duindam et al. | Sep 2018 | A1 |
20180263706 | Averbuch | Sep 2018 | A1 |
20180279852 | Rafii-Tari et al. | Oct 2018 | A1 |
20180325419 | Zhao et al. | Nov 2018 | A1 |
20190000559 | Berman et al. | Jan 2019 | A1 |
20190000560 | Berman et al. | Jan 2019 | A1 |
20190008413 | Duindam et al. | Jan 2019 | A1 |
20190038365 | Soper et al. | Feb 2019 | A1 |
20190065209 | Mishra et al. | Feb 2019 | A1 |
20190110839 | Rafii-Tari et al. | Apr 2019 | A1 |
20190175062 | Rafii-Tari et al. | Jun 2019 | A1 |
20190183318 | Froggatt et al. | Jun 2019 | A1 |
20190183585 | Rafii-Tari et al. | Jun 2019 | A1 |
20190183587 | Rafii-Tari et al. | Jun 2019 | A1 |
20190192234 | Gadda et al. | Jun 2019 | A1 |
20190209016 | Herzlinger et al. | Jul 2019 | A1 |
20190209043 | Zhao et al. | Jul 2019 | A1 |
20190216548 | Ummalaneni | Jul 2019 | A1 |
20190239723 | Duindam et al. | Aug 2019 | A1 |
20190239831 | Chopra | Aug 2019 | A1 |
20190250050 | Sanborn et al. | Aug 2019 | A1 |
20190254649 | Walters et al. | Aug 2019 | A1 |
20190269470 | Barbagli | Sep 2019 | A1 |
20190272634 | Li et al. | Sep 2019 | A1 |
20190298160 | Ummalaneni et al. | Oct 2019 | A1 |
20190298451 | Wong et al. | Oct 2019 | A1 |
20190320878 | Duindam et al. | Oct 2019 | A1 |
20190320937 | Duindam et al. | Oct 2019 | A1 |
20190336238 | Yu et al. | Nov 2019 | A1 |
20190343424 | Blumenkranz et al. | Nov 2019 | A1 |
20190350659 | Wang et al. | Nov 2019 | A1 |
20190365199 | Zhao et al. | Dec 2019 | A1 |
20190365479 | Rafii-Tari | Dec 2019 | A1 |
20190365486 | Srinivasan et al. | Dec 2019 | A1 |
20190380787 | Ye et al. | Dec 2019 | A1 |
20200000319 | Saadat et al. | Jan 2020 | A1 |
20200000526 | Zhao | Jan 2020 | A1 |
20200008655 | Schlesinger et al. | Jan 2020 | A1 |
20200030044 | Wang et al. | Jan 2020 | A1 |
20200030461 | Sorger | Jan 2020 | A1 |
20200038750 | Kojima | Feb 2020 | A1 |
20200043207 | Lo et al. | Feb 2020 | A1 |
20200046431 | Soper et al. | Feb 2020 | A1 |
20200046436 | Tzeisler et al. | Feb 2020 | A1 |
20200054399 | Duindam et al. | Feb 2020 | A1 |
20200060771 | Lo et al. | Feb 2020 | A1 |
20200069192 | Sanborn et al. | Mar 2020 | A1 |
20200077870 | Dicarlo et al. | Mar 2020 | A1 |
20200078095 | Chopra et al. | Mar 2020 | A1 |
20200078103 | Duindam et al. | Mar 2020 | A1 |
20200085514 | Blumenkranz | Mar 2020 | A1 |
20200109124 | Pomper et al. | Apr 2020 | A1 |
20200129045 | Prisco | Apr 2020 | A1 |
20200129239 | Bianchi et al. | Apr 2020 | A1 |
20200138515 | Wong | May 2020 | A1 |
20200155116 | Donhowe et al. | May 2020 | A1 |
20200170623 | Averbuch | Jun 2020 | A1 |
20200170720 | Ummalaneni | Jun 2020 | A1 |
20200179058 | Barbagli | Jun 2020 | A1 |
20200188038 | Donhowe et al. | Jun 2020 | A1 |
20200205903 | Srinivasan et al. | Jul 2020 | A1 |
20200205904 | Chopra | Jul 2020 | A1 |
20200214664 | Zhao et al. | Jul 2020 | A1 |
20200229679 | Zhao et al. | Jul 2020 | A1 |
20200242767 | Zhao et al. | Jul 2020 | A1 |
20200275860 | Duindam | Sep 2020 | A1 |
20200297442 | Adebar et al. | Sep 2020 | A1 |
20200315554 | Averbuch et al. | Oct 2020 | A1 |
20200330795 | Sawant et al. | Oct 2020 | A1 |
20200352427 | Deyanov | Nov 2020 | A1 |
20200364865 | Donhowe et al. | Nov 2020 | A1 |
Number | Date | Country |
---|---|---|
0013237 | Jul 2003 | BR |
0116004 | Jun 2004 | BR |
101190149 | Jun 2008 | CN |
486540 | Sep 2016 | CZ |
2709512 | Aug 2017 | CZ |
2884879 | Jan 2020 | CZ |
19919907 | Nov 2000 | DE |
69726415 | Sep 2004 | DE |
102004004620 | Aug 2005 | DE |
0917855 | May 1999 | EP |
1593343 | Nov 2005 | EP |
3413830 | Sep 2019 | EP |
3478161 | Feb 2020 | EP |
3641686 | Apr 2020 | EP |
3644885 | May 2020 | EP |
3644886 | May 2020 | EP |
PA03005028 | Jan 2004 | MX |
225663 | Jan 2005 | MX |
226292 | Feb 2005 | MX |
246862 | Jun 2007 | MX |
265247 | Mar 2009 | MX |
284569 | Mar 2011 | MX |
9944503 | Sep 1999 | WO |
0187136 | Nov 2001 | WO |
2004081877 | Sep 2004 | WO |
2005015125 | Feb 2005 | WO |
2005082246 | Sep 2005 | WO |
2009081297 | Jul 2009 | WO |
2015101948 | Jul 2015 | WO |
Entry |
---|
Australian Examination Report No. 2 issued in Appl. No. AU 2016210747 dated Oct. 18, 2017 (4 pages). |
Canadian Office Action issued in Appl. No. 2,937,825 dated Mar. 26, 2018 (4 pages). |
CT scan—Wikipedia, the free encyclopedia [retrieved from internet on Mar. 30, 2017] published on Jun. 30, 2015 as per Wayback Machine. |
Extended European Search Report from Appl. No. EP 16182953.6-1666 dated Jan. 2, 2017. |
Extended European Search Report issued in corresponding Appl. No. EP 19156034.1 dated Jul. 8, 2019 (10 pages). |
Lee, Hyunkwang, et al., "A Deep-Learning System for Fully-Automated Peripherally Inserted Central Catheter (PICC) Tip Detection", J. Digit Imaging, vol. 31, pp. 393-402 (2018). |
Office Action issued in Chinese Appl. No. 201610635896.X dated Jul. 23, 2018, together with English language translation (16 pages). |
Extended European search report issued in European Patent Application No. 22188667.4 dated Nov. 7, 2022. |
Number | Date | Country | |
---|---|---|---|
20220142719 A1 | May 2022 | US |
Number | Date | Country | |
---|---|---|---|
62627911 | Feb 2018 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17115589 | Dec 2020 | US |
Child | 17584039 | US | |
Parent | 16259731 | Jan 2019 | US |
Child | 17115589 | US |