Examples described herein are directed to systems and methods for graphically identifying a target nodule during the planning of an interventional procedure.
Minimally invasive medical techniques are intended to reduce the amount of tissue that is damaged during medical procedures, thereby reducing patient recovery time, discomfort, and harmful side effects. Such minimally invasive techniques may be performed through natural orifices in a patient anatomy or through one or more surgical incisions. Through these natural orifices or incisions clinicians may insert minimally invasive medical instruments (including surgical, diagnostic, therapeutic, or biopsy instruments) to reach a target tissue location. Some minimally invasive techniques use medical instruments that may be inserted into anatomic passageways and navigated toward a region of interest within the patient anatomy. Planning for such procedures may be conducted with reference to images of the patient anatomy that include the target tissue. Improved systems and methods may be used to identify the target tissue and plan an interventional procedure at the target tissue.
Consistent with some examples, a system may comprise one or more processors and memory having computer readable instructions stored thereon. The computer readable instructions, when executed by the one or more processors, may cause the system to receive image data including a segmented candidate target nodule, receive a seed point, and determine if the segmented candidate target nodule is within a threshold proximity of the seed point. Based on a determination that the segmented candidate target nodule is within the threshold proximity of the seed point, the segmented candidate target nodule may be identified as an identified target nodule. A target nodule boundary corresponding to the identified target nodule may be displayed.
In another example, a non-transitory machine-readable medium may comprise a plurality of machine-readable instructions which when executed by one or more processors associated with a planning workstation are adapted to cause the one or more processors to perform a method. The method may comprise receiving image data including a segmented candidate target nodule, receiving a seed point, and determining if the segmented candidate target nodule is within a threshold proximity of the seed point. Based on a determination that the segmented candidate target nodule is within the threshold proximity of the seed point, the segmented candidate target nodule may be identified as an identified target nodule. A target nodule boundary corresponding to the identified target nodule may be displayed.
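The proximity determination summarized above can be sketched as follows. This is a minimal illustration under stated assumptions, not the disclosed implementation: the dictionary layout, the function names, and the use of a Euclidean distance between the seed point and each nodule centroid are all illustrative choices.

```python
import math

def is_within_threshold(nodule_centroid, seed_point, threshold):
    """Return True if the candidate nodule's centroid lies within the
    threshold distance of the user-provided seed point (all in mm)."""
    return math.dist(nodule_centroid, seed_point) <= threshold

def identify_target(candidates, seed_point, threshold):
    """Scan segmented candidate nodules and return the first one within
    threshold proximity of the seed point, or None if none qualifies."""
    for nodule in candidates:
        if is_within_threshold(nodule["centroid"], seed_point, threshold):
            return nodule  # recorded as the identified target nodule
    return None
```

A real system would likely measure proximity against the segmented voxel surface rather than a centroid, but the control flow — receive segmentation, receive seed point, threshold test, mark as identified — is the same.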
It is to be understood that both the foregoing general description and the following detailed description are illustrative and explanatory in nature and are intended to provide an understanding of the present disclosure without limiting the scope of the present disclosure. Additional aspects, features, and advantages of the present disclosure will be apparent to one skilled in the art from the following detailed description.
Examples of the present disclosure and their advantages are best understood by referring to the detailed description that follows. It should be appreciated that like reference numerals are used to identify like elements illustrated in one or more of the figures, wherein showings therein are for purposes of illustrating examples of the present disclosure and not for purposes of limiting the same.
During the planning of a medical procedure using a steerable medical instrument, accurate target tissue identification may help to determine the most efficient and effective approach for accessing the interventional site. Pre-operative or intra-operative anatomical imaging data of the patient anatomy may be referenced to identify target tissue nodules, but fully automatic segmentation of target nodules from the imaging data may result in the identification of false target nodules or the omission of actual target nodules due to poor image quality, breathing motion, imaging artifacts, or other factors. However, purely manual identification of target nodules from the imaging data may be time consuming and inefficient. The systems and methods described herein may provide a hybrid identification approach that uses image segmentation to supplement the target tissue identification process. Because target tissue nodules may have irregular shapes that may not be approximated well by default symmetrical shapes such as spheres, circles, ellipses, and ellipsoids, the systems and methods described herein may allow a user to edit default shapes or the target tissue boundaries suggested by the image segmentation to better identify the irregular shape of the target nodule. Providing the planning system with accurate information about the size and shape of the target nodule may allow for procedure planning that fully considers one or more path options, multiple interventional points, and the proximity of vulnerable anatomic structures. Illustrative examples of a graphical user interface for planning a medical procedure, including but not limited to lung biopsy procedures, are also provided below.
The method 200 may be illustrated as a set of operations or processes 202 through 220 and is described with continuing reference to
In some examples, the patient image data of an anatomical region may be displayed. For example, as illustrated in
In some examples, additional panes may be displayed to present additional information or anatomic images. In some examples, fewer panes may be displayed. In some examples, the size or arrangement of the image panes 306-312 may be changed. For example, as shown in
Referring again to
In some examples, the display of one or more of the segmented structures or one or more categories of segmented structures may be suppressed. For example, in
Referring again to
In some examples, guidance may be provided to the user to assist with determining where the indicator 320 should be positioned to locate the seed point. The guidance may, for example, take the form of previously analyzed or annotated radiology reports or radiology images with identified target nodules. In other examples, the guidance may take the form of graphical cues, auditory tones, haptic forces, or other hints that are provided when the indicator moves within a predetermined range of a candidate target nodule identified by the image segmentation. Additionally or alternatively, guidance may be provided in the form of region shading, region outlines, or other graphical cues that indicate the general region where candidate target nodules are located in the segmented image data.
Referring again to
At a process 210, if a candidate target nodule is within the proximity of the seed point at process 208, the candidate target nodule may be displayed. The candidate target nodule may also be labeled, referenced, categorized, or otherwise recorded as an identified or selected target nodule within the planning system if the seed point is within the proximity of the candidate target nodule. Labels, user notes, characterizing information, or other information associated with the selected target nodule may also be displayed. With reference to
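When several segmented candidates lie near the seed point, a planner might select the nearest one. The sketch below is a hypothetical variant in which each candidate carries a list of segmented voxel coordinates and proximity is measured to the closest voxel; none of these structures or names comes from the disclosure.

```python
import math

def nearest_candidate(candidates, seed_point, threshold):
    """Return the candidate nodule whose nearest segmented voxel lies
    closest to the seed point, provided that distance is within the
    threshold; otherwise return None. Each candidate is assumed to be
    {"id": ..., "voxels": [(x, y, z), ...]} with a non-empty voxel list."""
    best, best_d = None, float("inf")
    for nodule in candidates:
        d = min(math.dist(v, seed_point) for v in nodule["voxels"])
        if d < best_d:
            best, best_d = nodule, d
    return best if best_d <= threshold else None
```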
At a process 212, edits may be made to the boundary 330. For example, the user may observe that the segmented boundary does not correspond with the boundaries or nodule margins that the user observes in the cross-sectional views. The editing process allows the user to modify or revise the segmented boundary to better correspond to the boundaries and margins preferred by the user. With reference to
As shown in
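The boundary-editing step at process 212 can be illustrated with a toy brush tool. The representation of the boundary as a 2D vertex polygon, the radial push direction, and the parameter names are all assumptions for illustration; the disclosure does not specify how the editing tools operate internally.

```python
import math

def edit_boundary(boundary, drag_point, radius, delta):
    """Move boundary vertices near drag_point radially away from the
    shape centroid (delta > 0 extends the boundary, delta < 0 reduces
    it). Vertices outside the editing radius are left untouched."""
    cx = sum(x for x, y in boundary) / len(boundary)
    cy = sum(y for x, y in boundary) / len(boundary)
    edited = []
    for x, y in boundary:
        if math.dist((x, y), drag_point) <= radius:
            d = math.dist((x, y), (cx, cy)) or 1.0  # avoid divide-by-zero
            x += delta * (x - cx) / d
            y += delta * (y - cy) / d
        edited.append((x, y))
    return edited
```

Applying a small positive delta near a vertex grows the boundary locally into an irregular shape, mirroring how a user might extend the segmented boundary toward the nodule margins they observe in the cross-sectional views.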
Referring again to
At a process 216, edits may be made to the size and/or shape of the boundary 350. In some examples, the user may adjust the size of the boundary 350 by adjusting any of the three axial dimensions of the ellipsoid to approximate the dimensions of the target nodule 322. The adjustments may be based on the user's visualization of the target nodule 322 in the cross-sectional views of image panes 306, 308, 310. In some examples, the size of the ellipsoid shape may be adjusted to fully encompass the shape of the target nodule visible to the user in the cross-sectional views of image panes 306, 308, 310. In some examples, an editing tool such as the tool 344, 346 may be used to extend or reduce the boundary 350 into an irregular shape that more closely approximates the shape of the target nodule visible to the user in the cross-sectional views of image panes 306, 308, 310.
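The ellipsoid boundary with three adjustable axial dimensions reduces to the standard implicit test. The helper below is a sketch: the axis-aligned assumption and the voxel-list representation of the nodule are simplifications not taken from the disclosure.

```python
def inside_ellipsoid(point, center, semi_axes):
    """Implicit ellipsoid test: (x/a)^2 + (y/b)^2 + (z/c)^2 <= 1,
    with the ellipsoid assumed axis-aligned at `center` and
    `semi_axes` = (a, b, c) giving the three adjustable half-lengths."""
    return sum(((p - c) / s) ** 2
               for p, c, s in zip(point, center, semi_axes)) <= 1.0

def encompasses(nodule_voxels, center, semi_axes):
    """True when every voxel of the visualized nodule lies inside the
    adjustable ellipsoid boundary -- the 'fully encompass' criterion."""
    return all(inside_ellipsoid(v, center, semi_axes) for v in nodule_voxels)
```

Enlarging any of the three semi-axes until `encompasses` returns True corresponds to the user growing the default shape to cover the nodule seen in the cross-sectional panes.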
At an optional process 218, another seed point associated with a different target nodule may be received based on a user input. For example and with reference to
At an optional process 220, an instrument path to the target nodule may be planned. The instrument (e.g. instrument 100) may include, for example, a biopsy instrument for performing a biopsy procedure on the target nodule 322. The path may be generated manually by a user, automatically by the planning system, or by a combination of manual and automatic inputs. With reference to
In an alternative example, the instrument path to the target nodule may be planned based on a user selected exit point and a destination point selected by the path planning system based on the configuration of the target nodule. For example, and with reference to
When a plurality of target nodules have been identified, the path planning system may identify paths, exit points, and destination points for each of the target nodules. The path planning may include planned efficiencies that minimize the airway traversal needed to reach multiple target nodules. In some examples, a target nodule may be biopsied in multiple locations, and thus multiple paths, exit points, and destination points may be determined in planning a biopsy for a single target nodule. In some examples, a plurality of target nodules may be identified and displayed before the paths to each of the target nodules are planned. In other examples, the order of the processes 218, 220 may be swapped such that a path is planned after each target nodule is identified and displayed.
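Path planning over an airway tree can be sketched as a shortest-path search. This toy Dijkstra implementation assumes the airways have been reduced to a weighted graph of named branch points with edge lengths; a real planner would also account for instrument bend limits, airway diameter, and proximity to vulnerable structures, none of which are modeled here.

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra over an airway graph given as {node: {neighbor: length}}.
    Returns (total_length, node_list), or (inf, []) if goal is unreachable."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    visited = set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in visited:
            continue
        visited.add(node)
        if node == goal:
            path = [node]          # walk predecessors back to start
            while node in prev:
                node = prev[node]
                path.append(node)
            return d, path[::-1]
        for nbr, length in graph.get(node, {}).items():
            nd = d + length
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    return float("inf"), []
```

For multiple target nodules, running this search to each candidate exit point and comparing total lengths gives one simple way to realize the "minimize airway traversal" efficiency mentioned above.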
In some examples, the planning techniques of this disclosure may be used in an image-guided medical procedure performed with a teleoperated or robot-assisted medical system as described in further detail below. As shown in
Robot-assisted medical system 400 also includes a display system 410 (which may include display system 302) for displaying an image or representation of the surgical site and medical instrument 404 generated by a sensor system 408 and/or an endoscopic imaging system 409. Display system 410 and master assembly 406 may be oriented so an operator O can control medical instrument 404 and master assembly 406 with the perception of telepresence. Any of the previously described graphical user interfaces may be displayable on a display system 410 and/or a display system of an independent planning workstation.
In some examples, medical instrument 404 may include components for use in surgery, biopsy, ablation, illumination, irrigation, or suction. Optionally, medical instrument 404, together with sensor system 408, may be used to gather (e.g., measure or survey) a set of data points corresponding to locations within anatomic passageways of a patient, such as patient P. In some examples, medical instrument 404 may include components of the imaging system 409, which may include an imaging scope assembly or imaging instrument that records a concurrent or real-time image of a surgical site and provides the image to the operator O through the display system 410. The concurrent image may be, for example, a two- or three-dimensional image captured by an imaging instrument positioned within the surgical site. In some examples, the imaging system components may be integrally or removably coupled to medical instrument 404. However, in some examples, a separate endoscope, attached to a separate manipulator assembly, may be used with medical instrument 404 to image the surgical site. The imaging system 409 may be implemented as hardware, firmware, software, or a combination thereof, which interacts with or is otherwise executed by one or more computer processors, which may include the processors of the control system 412.
The sensor system 408 may include a position/location sensor system (e.g., an electromagnetic (EM) sensor system) and/or a shape sensor system for determining the position, orientation, speed, velocity, pose, and/or shape of the medical instrument 404.
Robot-assisted medical system 400 may also include control system 412. Control system 412 includes at least one memory 416 and at least one computer processor 414 for effecting control between medical instrument 404, master assembly 406, sensor system 408, endoscopic imaging system 409, and display system 410. Control system 412 also includes programmed instructions (e.g., a non-transitory machine-readable medium storing the instructions) to implement a plurality of operating modes of the robot-assisted system including a navigation planning mode, a navigation mode, and/or a procedure mode. Control system 412 also includes programmed instructions (e.g., a non-transitory machine-readable medium storing the instructions) to implement some or all of the methods described in accordance with aspects disclosed herein, including, for example, instructions for providing information to display system 410, instructions for determining a target location, instructions for determining an anatomical boundary, instructions for determining a trajectory zone, instructions for determining a zone boundary, and instructions for receiving user (e.g., operator O) inputs to a planning mode.
Robot-assisted medical system 400 may also include a procedure planning system 418. The procedure planning system 418 may include a processing system, a memory, a user input device, and/or a display system for planning an interventional procedure that may be performed by the medical system 400. In some examples, the planning system 418 may incorporate other components of the medical system 400, including the control system 412, the master assembly 406, and/or the display system 410. Alternatively or additionally, the procedure planning system 418 may be located at a workstation dedicated to pre-operative planning.
A plan for a medical procedure, such as a biopsy procedure, may be saved and used by the control system 412 to provide automated navigation or operator navigation assistance of a medical instrument to perform the biopsy procedure. Control system 412 may optionally further include a virtual visualization system to provide navigation assistance to operator O when controlling medical instrument 404 during an image-guided surgical procedure. Virtual navigation using the virtual visualization system may be based upon reference to an acquired pre-operative or intra-operative dataset of anatomic passageways. The virtual visualization system processes images of the surgical site imaged using imaging technology such as computerized tomography (CT), magnetic resonance imaging (MRI), fluoroscopy, thermography, ultrasound, optical coherence tomography (OCT), thermal imaging, impedance imaging, laser imaging, nanotube X-ray imaging, and/or the like.
In this example, a sensor system (e.g., sensor system 408) includes a shape sensor 514. Shape sensor 514 may include an optical fiber extending within and aligned with elongate device 510. In one example, the optical fiber has a diameter of approximately 200 μm. In other examples, the dimensions may be larger or smaller. The optical fiber of shape sensor 514 forms a fiber optic bend sensor for determining the shape of the elongate device 510. In one alternative, optical fibers including Fiber Bragg Gratings (FBGs) are used to provide strain measurements in structures in one or more dimensions. Various systems and methods for monitoring the shape and relative position of an optical fiber in three dimensions are described in U.S. patent application Ser. No. 11/180,389 (filed Jul. 13, 2005) (disclosing “Fiber optic position and shape sensing device and method relating thereto”); U.S. patent application Ser. No. 12/047,056 (filed on Jul. 16, 2004) (disclosing “Fiber-optic shape and relative position sensing”); and U.S. Pat. No. 6,389,187 (filed on Jun. 17, 1998) (disclosing “Optical Fibre Bend Sensor”), which are all incorporated by reference herein in their entireties. Sensors in some examples may employ other suitable strain sensing techniques, such as Rayleigh scattering, Raman scattering, Brillouin scattering, and fluorescence scattering. In some examples, the shape of the catheter may be determined using other techniques.
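The core idea behind fiber-based shape sensing — integrating local bend measurements along the fiber to recover its shape — can be shown with a planar toy model. Real FBG reconstruction is three-dimensional and considerably more involved; this sketch, with its assumed per-segment curvature samples and constant segment length, is only an illustration of the integration step.

```python
import math

def reconstruct_shape_2d(curvatures, segment_length):
    """Integrate per-segment curvature (1/length units) along the fiber
    to recover a planar shape as a list of (x, y) points, starting at
    the origin and heading along +x. Each sample bends the heading by
    curvature * segment_length, then advances one segment."""
    x, y, heading = 0.0, 0.0, 0.0
    points = [(x, y)]
    for k in curvatures:
        heading += k * segment_length
        x += segment_length * math.cos(heading)
        y += segment_length * math.sin(heading)
        points.append((x, y))
    return points
```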
As shown in
Elongate device 510 includes a channel (not shown) sized and shaped to receive a medical instrument 522. In some examples, medical instrument 522 may be used for procedures such as surgery, biopsy, ablation, illumination, irrigation, or suction. Medical instrument 522 can be deployed through elongate device 510 and used at a target location within the anatomy. Medical instrument 522 may include, for example, image capture probes, biopsy instruments, laser ablation fibers, and/or other surgical, diagnostic, or therapeutic tools. Medical instrument 522 may be advanced from the distal end 518 of the elongate device 510 to perform the procedure and then retracted back into the channel when the procedure is complete. Medical instrument 522 may be removed from the proximal end of elongate device 510 or from another optional instrument port (not shown) along elongate device 510.
Elongate device 510 may also house cables, linkages, or other steering controls (not shown) to controllably bend distal end 518. In some examples, at least four cables are used to provide independent “up-down” steering to control a pitch of distal end 518 and “left-right” steering to control a yaw of distal end 518.
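The four-cable arrangement can be illustrated with a simple antagonistic-pair mixing function. The disclosure does not specify the command-to-cable mapping; the linear, decoupled scheme and parameter names below are purely hypothetical.

```python
def cable_displacements(pitch_cmd, yaw_cmd, gain=1.0):
    """Map pitch/yaw bend commands to displacements for four
    antagonistic cables: the up/down pair controls pitch of the distal
    end, the left/right pair controls yaw. Positive displacement pulls
    (shortens) a cable; each pair moves equal and opposite amounts."""
    return {
        "up": gain * pitch_cmd,
        "down": -gain * pitch_cmd,
        "left": gain * yaw_cmd,
        "right": -gain * yaw_cmd,
    }
```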
A position measuring device 520 may provide information about the position of instrument body 512 as it moves on insertion stage 508 along an insertion axis A. Position measuring device 520 may include resolvers, encoders, potentiometers, and/or other sensors that determine the rotation and/or orientation of the actuators controlling the motion of instrument carriage 506 and consequently the motion of instrument body 512. In some examples, insertion stage 508 is linear, while in other examples, the insertion stage 508 may be curved or have a combination of curved and linear sections.
In the description, specific details have been set forth describing some examples. Numerous specific details are set forth in order to provide a thorough understanding of the examples. It will be apparent, however, to one skilled in the art that some examples may be practiced without some or all of these specific details. The specific examples disclosed herein are meant to be illustrative but not limiting. One skilled in the art may realize other elements that, although not specifically described here, are within the scope and the spirit of this disclosure.
Elements described in detail with reference to one example, implementation, or application optionally may be included, whenever practical, in other examples, implementations, or applications in which they are not specifically shown or described. For example, if an element is described in detail with reference to one example and is not described with reference to a second example, the element may nevertheless be claimed as included in the second example. Thus, to avoid unnecessary repetition in the following description, one or more elements shown and described in association with one example, implementation, or application may be incorporated into other examples, implementations, or aspects unless specifically described otherwise, unless the one or more elements would make an example or implementation non-functional, or unless two or more of the elements provide conflicting functions. Not all the illustrated processes may be performed in all examples of the disclosed methods. Additionally, one or more processes that are not expressly illustrated may be included before, after, in between, or as part of the illustrated processes. In some examples, one or more of the processes may be performed by a control system or may be implemented, at least in part, in the form of executable code stored on non-transitory, tangible, machine-readable media that when run by one or more processors may cause the one or more processors to perform one or more of the processes.
Any alterations and further modifications to the described devices, instruments, methods, and any further application of the principles of the present disclosure are fully contemplated as would normally occur to one skilled in the art to which the disclosure relates. In addition, dimensions provided herein are for specific examples and it is contemplated that different sizes, dimensions, and/or ratios may be utilized to implement the concepts of the present disclosure. To avoid needless descriptive repetition, one or more components or actions described in accordance with one illustrative example can be used or omitted as applicable from other illustrative examples. For the sake of brevity, the numerous iterations of these combinations will not be described separately. For simplicity, in some instances the same reference numbers are used throughout the drawings to refer to the same or like parts.
The systems and methods described herein may be suited for navigation and treatment of anatomic tissues, via natural or surgically created connected passageways, in any of a variety of anatomic systems, including the lung, colon, the intestines, the kidneys and kidney calices, the brain, the heart, the circulatory system including vasculature, and/or the like. While some examples are provided herein with respect to medical procedures, any reference to medical or surgical instruments and medical or surgical methods is non-limiting. For example, the instruments, systems, and methods described herein may be used for non-medical purposes including industrial uses, general robotic uses, and sensing or manipulating non-tissue work pieces. Other example applications involve cosmetic improvements, imaging of human or animal anatomy, gathering data from human or animal anatomy, and training medical or non-medical personnel. Additional example applications include use for procedures on tissue removed from human or animal anatomies (without return to a human or animal anatomy) and performing procedures on human or animal cadavers. Further, these techniques can also be used for surgical and nonsurgical medical treatment or diagnosis procedures.
One or more elements in examples of this disclosure may be implemented in software to execute on a processor of a computer system such as control processing system. When implemented in software, the elements of the examples of this disclosure may be code segments to perform various tasks. The program or code segments can be stored in a processor readable storage medium or device that may have been downloaded by way of a computer data signal embodied in a carrier wave over a transmission medium or a communication link. The processor readable storage device may include any medium that can store information, including an optical medium, semiconductor medium, and/or magnetic medium. Examples of processor readable storage devices include an electronic circuit; a semiconductor device; a semiconductor memory device; a read only memory (ROM); a flash memory; an erasable programmable read only memory (EPROM); a floppy diskette; a CD-ROM; an optical disk; a hard disk; or other storage device. The code segments may be downloaded via computer networks such as the Internet, an intranet, etc. Any of a wide variety of centralized or distributed data processing architectures may be employed. Programmed instructions may be implemented as a number of separate programs or subroutines, or they may be integrated into a number of other aspects of the systems described herein. In some examples, the control system may support wireless communication protocols such as Bluetooth, Infrared Data Association (IrDA), HomeRF, IEEE 802.11, Digital Enhanced Cordless Telecommunications (DECT), ultra-wideband (UWB), ZigBee, and Wireless Telemetry.
Note that the processes and displays presented may not inherently be related to any particular computer or other apparatus. Various general-purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct a more specialized apparatus to perform the operations described. The required structure for a variety of these systems will appear as elements in the claims. In addition, the examples of the invention are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the invention as described herein.
This disclosure describes various instruments, portions of instruments, and anatomic structures in terms of their state in three-dimensional space. As used herein, the term “position” refers to the location of an object or a portion of an object in a three-dimensional space (e.g., three degrees of translational freedom along Cartesian x-, y-, and z-coordinates). As used herein, the term “orientation” refers to the rotational placement of an object or a portion of an object (three degrees of rotational freedom—e.g., roll, pitch, and yaw). As used herein, the term “pose” refers to the position of an object or a portion of an object in at least one degree of translational freedom and to the orientation of that object or portion of the object in at least one degree of rotational freedom (up to six total degrees of freedom). As used herein, the term “shape” refers to a set of poses, positions, or orientations measured along an object.
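The position/orientation/pose definitions above map naturally onto a small data type. The field names and the roll/pitch/yaw parameterization below are one conventional choice, not a representation prescribed by the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Position (three translational DOF along Cartesian x, y, z) plus
    orientation (three rotational DOF: roll, pitch, yaw) -- up to six
    degrees of freedom total, matching the definition of 'pose' above."""
    x: float
    y: float
    z: float
    roll: float
    pitch: float
    yaw: float

# A "shape", per the definition above, is then a set of poses measured
# along an object, e.g. a list of Pose samples along a catheter.
```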
While certain illustrative examples of the invention have been described and shown in the accompanying drawings, it is to be understood that such examples are merely illustrative of and not restrictive on the broad invention, and that the examples of the invention not be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art.
This application claims priority to and benefit of U.S. Provisional Application No. 63/249,096, filed Sep. 28, 2021 and entitled “Systems and Methods for Target Nodule Identification,” which is incorporated by reference herein in its entirety.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2022/076696 | 9/20/2022 | WO |
Number | Date | Country
---|---|---
63249096 | Sep 2021 | US