The present disclosure is directed to systems and methods for planning and performing an image-guided procedure and more particularly to systems and methods for automatically generating an anatomical boundary that may be viewed and/or manipulated via a graphical user interface.
Minimally invasive medical techniques are intended to reduce the amount of tissue that is damaged during medical procedures, thereby reducing patient recovery time, discomfort, and harmful side effects. Such minimally invasive techniques may be performed through natural orifices in a patient anatomy or through one or more surgical incisions. Through these natural orifices or incisions, clinicians may insert minimally invasive medical instruments (including surgical, diagnostic, therapeutic, or biopsy instruments) to reach a target tissue location. Some minimally invasive techniques use medical instruments that may be inserted into anatomic passageways and navigated toward a region of interest within the patient anatomy. Control of such an instrument during an image-guided procedure may involve the management of several degrees of freedom of movement, including insertion and retraction of the elongate device as well as steering of the device. Improved systems and methods may be used to reduce the risk of patient injury by identifying boundaries when planning the navigation and deployment of the instrument.
Consistent with some embodiments, a medical system is provided. The system includes a display system and a medical instrument. The system further includes a control system communicatively coupled to the display system. The control system is configured to display image data of a patient anatomy via the display system. The control system is further configured to determine, while the medical instrument navigates the patient anatomy, that a distal portion of the medical instrument is within a threshold distance of a target location in the patient anatomy. The control system is further configured to display an anatomical boundary via the display system if the distal portion of the medical instrument is within the threshold distance of the target location. The anatomical boundary indicates a surface of an anatomical structure of the patient anatomy.
In another example, a non-transitory machine readable medium is provided. The non-transitory machine readable medium includes a plurality of machine readable instructions which when executed by one or more processors associated with a workstation are adapted to cause the one or more processors to perform a method. The method includes displaying image data of a patient anatomy via a display system. The method further includes determining, while a medical instrument navigates the patient anatomy, that a distal portion of the medical instrument is within a threshold distance of a target location in the patient anatomy. The method further includes displaying an anatomical boundary via the display system if the distal portion of the medical instrument is within the threshold distance of the target location. The anatomical boundary indicates a surface of an anatomical structure of the patient anatomy.
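The threshold-distance determination described above can be sketched as a simple proximity check. This is a minimal illustration rather than the disclosed implementation; the function names and the 10 mm threshold value are assumptions chosen for demonstration only.

```python
import math

# Hypothetical sketch of the threshold-distance check: display the
# anatomical boundary once the instrument's distal portion comes within
# a threshold distance of the target location. The 10 mm value is an
# assumed example, not a value taken from the disclosure.
THRESHOLD_MM = 10.0

def distance(a, b):
    """Euclidean distance between two 3-D points (in mm)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def should_display_boundary(distal_tip, target, threshold=THRESHOLD_MM):
    """True when the distal portion is within the threshold distance."""
    return distance(distal_tip, target) <= threshold

# Example: a tip 6 mm from the target triggers display of the boundary.
print(should_display_boundary((0.0, 0.0, 0.0), (0.0, 0.0, 6.0)))  # True
```

In a control system, this check would run continuously during navigation, with the tip position supplied by a position or shape sensor.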
It is to be understood that both the foregoing general description and the following detailed description are illustrative and explanatory in nature and are intended to provide an understanding of the present disclosure without limiting the scope of the present disclosure. Additional aspects, features, and advantages of the present disclosure will be apparent to one skilled in the art from the following detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
Examples of the present disclosure and their advantages are best understood by referring to the detailed description that follows. It should be appreciated that like reference numerals are used to identify like elements illustrated in one or more of the figures, wherein showings therein are for purposes of illustrating examples of the present disclosure and not for purposes of limiting the same.
During the planning and execution of a medical procedure using a steerable medical instrument, an anatomical boundary or a virtual “hazard fence” may be defined by identifying an anatomical surface to be avoided by the medical instrument during the medical procedure. The anatomical boundary may shield vulnerable portions of the anatomy that are in the vicinity of a target location or may protect other anatomical structures of interest from being inadvertently penetrated by the medical instrument. Structures of interest, including vulnerable anatomic structures or surfaces, may include, for example, pulmonary pleurae, pulmonary fissures, large bullae, and blood vessels. For example, puncturing the lung pleura during the medical procedure could cause a dangerous pneumothorax. Generating an anatomical boundary corresponding to the lung pleura would allow the operator to constrain the path of the medical instrument to avoid the vulnerable portion of the anatomy. A candidate path identified during a planning procedure may be identified as invalid when it passes within a threshold distance of a vulnerable portion of the anatomy or breaches a vulnerable portion of the anatomy. Illustrative examples of a graphical user interface for planning a medical procedure, including but not limited to lung biopsy procedures, are provided below. The graphical user interface may include a plurality of modes including a data selection mode, a hybrid segmentation and planning mode, a preview mode, a save mode, a management mode, and a review mode. Some aspects of the graphical user interface are similar to features described in U.S. Provisional Patent Application No. 62/357,217, titled “Graphical User Interface for Displaying Guidance Information During an Image-Guided Procedure” and filed Jun. 30, 2016, and U.S. Provisional Patent Application No. 
62/357,258, titled “Graphical User Interface for Displaying Guidance Information in a Plurality of Modes During an Image-Guided Procedure” and filed Jun. 30, 2016, which are incorporated by reference herein in their entirety.
The method 200 is illustrated as a set of operations or processes 210 through 250 and is described with continuing reference to
At a process 210, image data of an anatomical region is displayed. For example, as illustrated in
Graphical user interface 400 displays information associated with planning a medical procedure in one or more views that are viewable to a user. Although illustrative arrangements of views are depicted in
At a process 220, a target location in the anatomical region is determined. For example, with reference to
As further illustrated in
At a process 230, and as illustrated in
An adjustment information input 264 may also or alternatively influence the determination of the anatomical boundary 420. For example, the anatomical boundary 420 may be determined by intersecting the target border region 418 with a surface of interest in the anatomical region. The anatomical boundary 420 may represent the area of intersection between the target border region 418 and the surface of interest. The size of the area of intersection may vary depending on the length of the radius R of the target border region 418. For example, the area of intersection may increase as the length of the radius R increases. Accordingly, in some examples, the anatomical boundary 420 shown in the image data 410 may increase in size as the length of the radius R of the target border region 418 increases. As further illustrated in
An image data characteristics input 266 may also or alternatively influence the determination of the anatomical boundary 420. For example, the anatomical boundary 420 may represent areas of image data 410 with a characteristic such as a high intensity gradient, as a high intensity gradient indicates the presence of a surface of interest (e.g., the pleura of the lungs, a fissure of the lungs, a blood vessel wall, etc.). Computer vision techniques, including machine learning algorithms, may be applied to image data 410 to identify image data characteristics associated with candidate anatomical boundaries. Consistent with such examples, anatomical boundary 420 may include or be a portion of a candidate anatomical boundary determined by such computer vision or machine learning techniques.
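The intensity-gradient heuristic above can be sketched as a crude surface detector over a volume of image data. This is an illustrative assumption-laden sketch: the toy volume, the gradient-magnitude measure, and the fraction-of-maximum threshold are all stand-ins, not the computer vision or machine learning techniques the disclosure contemplates.

```python
import numpy as np

def candidate_boundary_mask(volume, frac=0.5):
    """Flag voxels whose intensity-gradient magnitude is at least `frac`
    of the maximum -- a crude detector for surfaces of interest such as
    the pleura, a fissure, or a vessel wall."""
    gz, gy, gx = np.gradient(volume.astype(float))
    magnitude = np.sqrt(gx**2 + gy**2 + gz**2)
    return magnitude >= frac * magnitude.max()

# Toy volume: a bright sphere in a dark background. The high-gradient
# voxels concentrate at the sphere's surface, mimicking how a sharp
# intensity transition marks an anatomical surface.
z, y, x = np.mgrid[0:32, 0:32, 0:32]
volume = (((x - 16)**2 + (y - 16)**2 + (z - 16)**2) < 10**2).astype(float)
mask = candidate_boundary_mask(volume)
print(mask.any(), mask.all())  # True False
```

A real pipeline would refine such a mask (or a learned segmentation) into the candidate anatomical boundaries mentioned above.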
A patient movement input 268 may also or alternatively influence the determination of the anatomical boundary 420. During navigation, the patient anatomy and consequently the three-dimensional anatomical model may move or become deformed by, for example, forces from the medical instrument (e.g., medical instrument 100), lung expiration and inspiration, and/or beating of the heart. The deformation may be measured, for example by a shape sensor in the medical instrument, or predicted by simulation, and the deformation may be applied to the three-dimensional anatomical model. The anatomical boundary 420 may likewise be adjusted or deformed to correspond to actual or planned deformation of the three-dimensional anatomical model.
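The deformation step above amounts to applying a measured or simulated displacement field to the boundary geometry. The sketch below uses a uniform shift as a stand-in for a real displacement field; the array shapes and the 2 mm value are assumptions for illustration.

```python
import numpy as np

def deform_boundary(vertices, displacement_field):
    """Apply a per-vertex displacement (N x 3) to boundary vertices (N x 3),
    mirroring how a deformation measured by a shape sensor (or predicted by
    simulation) would be propagated to the anatomical boundary."""
    return vertices + displacement_field

boundary = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
# Stand-in displacement: a uniform 2 mm shift along z, e.g. from lung
# inspiration; a real field would vary per vertex.
shift = np.tile([0.0, 0.0, 2.0], (len(boundary), 1))
deformed = deform_boundary(boundary, shift)
print(deformed[:, 2])  # every vertex moved 2 mm along z
```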
In some examples, after the anatomical boundary 420 is determined, the user (e.g., the surgeon) may enter a manual adjustment mode to make further adjustments to the anatomical boundary 420. In the manual adjustment mode, the surgeon may manually edit and/or fine-tune the anatomical boundary 420. In that regard, the user may adjust any portion of the anatomical boundary 420. For example, the user may smooth out one or more portions of the anatomical boundary 420, connect one or more portions of the anatomical boundary 420 that may be disconnected, expand one or more portions of the anatomical boundary 420, reduce one or more portions of the anatomical boundary 420, etc. In some examples, an icon (e.g., a button, a pop up window, etc.) may be presented in the graphical user interface 400 to allow the user to enter the manual adjustment mode. For example, an icon 460 may be presented in the adjustment menu 430 that says “MODIFY BORDER” or “MANUAL EDIT.” In such examples, the user may enter the manual adjustment mode by clicking and/or pressing the icon 460.
In the manual adjustment mode, in some examples, the manual adjustments may be made to a three-dimensional anatomical boundary, such as the anatomical boundary 420 overlaid on the image data 410. The user may rotate the anatomical boundary 420 in the graphical user interface 400 to view the anatomical boundary 420 from all angles. This allows the user to determine if all desired adjustments are made to the anatomical boundary 420. To manipulate/modify the anatomical boundary 420, the user may click and drag a portion of the anatomical boundary 420, for example. Alternately or additionally, the user may modify the anatomical boundary 420 to fill in any portion that may be missing and/or to connect any portions that may be disconnected. These adjustments may be made via an additional button, slider bar, or other icon that may be presented in the adjustment menu 430. In some examples, the user may move at least a portion of the anatomical boundary 420 outward in a direction away from the target border region 418. In some examples, the user may move at least a portion of the anatomical boundary 420 inward in a direction toward the target border region 418. The user may select a discrete portion of the anatomical boundary 420 to move toward or away from the target border region 418. Alternately or additionally, the entire anatomical boundary 420 may be moved toward or away from the target border region 418. In some examples, the user may draw a modified anatomical boundary in freehand, in polyline form, in a series of plotted points, or the like.
In alternative examples, in the manual adjustment mode, the manual adjustments may be made to a two-dimensional anatomical boundary (e.g., the anatomical boundary 420 illustrated in the thumbnail view 412). In such examples, the graphical user interface 400 may present the view 412 in the main viewing window and may present the image data 410 in a thumbnail view. Two-dimensional adjustments may be made to the anatomical boundary 420 in one or more of the manners discussed above with respect to three-dimensional adjustments. For example, the user may click and drag a portion of the anatomical boundary 420, fill in any portion of the anatomical boundary 420 that may be missing, and/or connect any portions of the anatomical boundary 420 that may be disconnected. In some examples, the anatomical boundary 420 may be moved away from or toward the target border region 418, as discussed above. Additionally, the user may draw a modified anatomical boundary in freehand, in polyline form, in a series of plotted points, or the like.
Referring again to
In some examples, after the target border region 418 is generated, the target border region 418 is analyzed to determine whether at least a threshold percentage of the target border region 418 is located outside the surface of interest (which may be the pleura of the lungs, for example). In some examples, a default threshold percentage of the target border region 418 is 15%. In other examples, the threshold percentage of the target border region 418 may be larger or smaller. For example, the threshold percentage may range from 15%-30%. In other examples, the threshold percentage may be less than 15% or greater than 30%. In examples when less than the threshold percentage of the target border region 418 is outside the surface of interest, the anatomical boundary 420 remains unchanged. In examples when at least the threshold percentage is outside the surface of interest, a determination is made regarding whether the target border region 418 is near a portion of interest (e.g., a blood vessel, the heart, etc.) that may indicate a sensitive anatomical region that raises concerns for potential unintentional damage during a procedure.
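The threshold-percentage analysis can be sketched by sampling the target border region and measuring the fraction that falls outside the surface of interest. In this sketch the surface is modeled as a simple half-space; the sampling scheme, geometry, and all names are illustrative assumptions, with only the 15% default drawn from the text.

```python
import numpy as np

def fraction_outside(center, radius, outside_fn, n=2000, seed=0):
    """Estimate the fraction of a spherical target border region
    (radius about center) that lies outside the surface of interest."""
    rng = np.random.default_rng(seed)
    # Uniform points inside the unit sphere via rejection sampling.
    pts = rng.uniform(-1.0, 1.0, size=(4 * n, 3))
    pts = pts[np.linalg.norm(pts, axis=1) <= 1.0][:n]
    pts = center + radius * pts
    return outside_fn(pts).mean()

# Assumed stand-in for a segmented pleural surface: the plane z = 0,
# with "outside" meaning z > 0.
outside_pleura = lambda p: p[:, 2] > 0.0

frac = fraction_outside(np.array([0.0, 0.0, -2.0]), 10.0, outside_pleura)
needs_adjustment = frac >= 0.15  # 15% default threshold from the text
print(needs_adjustment)  # True
```

With the region's center 2 mm inside the surface and a 10 mm radius, roughly a third of the region falls outside, exceeding the 15% default and triggering the adjustment logic described next.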
If the target border region 418 is not near a portion of interest, then the anatomical boundary 420 may be adjusted to encompass the portion of the target border region 418 that was determined to be outside the surface of interest. For example, a portion of the anatomical boundary 420 may be expanded so that all portions of the target border region 418 are included within the anatomical boundary 420. To expand the anatomical boundary, a first margin is generated. At least a portion of the first margin may illustrate the expanded portion of the anatomical boundary 420 and as such may be displayed in the graphical user interface 400. The center C of the target border region 418 may also be the center of the first margin. The first margin may expand radially outward from its center. Additionally, the first margin may be larger than the target border region 418. In other examples, the first margin may be the same size as the target border region 418. In some examples, the first margin is sized based on the radius R of the target border region 418. For example, a radius of the first margin may be 5 mm greater than the radius R. In other examples, the radius of the first margin may be larger or smaller. For example, the radius of the first margin may be 3 mm greater than the radius R. In other examples, the radius of the first margin may be 6 mm greater than the radius R. The lengths discussed above for the radius of the first margin are discussed for exemplary purposes only—the radius of the first margin may be any other suitable length. After the first margin is generated, a second margin may be generated. The second margin may be larger than the first margin. As with the first margin, the center C of the target border region 418 may be the center of the second margin. The second margin may expand radially outward from its center. Additionally, the second margin may be sized based on the radius R of the target border region 418.
For example, a radius of the second margin may be 25 mm greater than the radius R. In other examples, the radius of the second margin may be larger or smaller. For example, the radius of the second margin may be 20 mm greater than the radius R. In other examples, the radius of the second margin may be 30 mm greater than the radius R. The lengths discussed above for the radius of the second margin are discussed for exemplary purposes only—the radius of the second margin may be any other suitable length. After the second margin is determined, the control system may use the second margin to smooth out the first margin. This may be done to ensure that the anatomical boundary 420 smoothly transitions from the originally-determined anatomical boundary 420 to the expanded portion of the anatomical boundary 420 (i.e., the first margin) and back to the originally-determined anatomical boundary 420. A smooth anatomical boundary 420, including the first margin, may more closely represent the surface of interest by not depicting sharp corners or bends that may not be present in the surface of interest.
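The two-margin expansion and smoothing can be sketched as a radial offset profile: points within the first margin receive the full expansion, points beyond the second margin keep the original boundary, and the offset tapers smoothly in between. The first (R + 5 mm) and second (R + 25 mm) radii follow the text's examples, while the 5 mm offset magnitude and the cosine taper are assumed smoothing choices.

```python
import math

def expansion_offset(dist_from_center, R, full_offset=5.0):
    """Outward offset applied to a boundary point as a function of its
    distance from center C of the target border region (radius R)."""
    first, second = R + 5.0, R + 25.0
    if dist_from_center <= first:
        return full_offset          # fully expanded region (first margin)
    if dist_from_center >= second:
        return 0.0                  # original boundary kept
    # Smooth cosine taper between the first and second margins.
    t = (dist_from_center - first) / (second - first)
    return full_offset * 0.5 * (1.0 + math.cos(math.pi * t))

R = 10.0
print(expansion_offset(12.0, R))   # inside first margin -> 5.0
print(expansion_offset(40.0, R))   # beyond second margin -> 0.0
```

The taper is what keeps the expanded portion from introducing sharp corners or bends that are not present in the surface of interest.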
In other examples, if the target border region 418 is near a portion of interest, then a trajectory path (e.g., the trajectory path 546) may be determined. In some examples, the trajectory path may point away from or substantially away from the portion of interest. In such examples, the anatomical boundary 420 may be adjusted to encompass the portion of the target border region 418 that was determined to be outside the surface of interest, as discussed above. In other examples, the trajectory path may point toward or substantially toward the portion of interest. In such examples, the control system may disable (e.g., delete, hide, etc.) the determined anatomical boundary 420 and instead prompt the user to manually generate an anatomical boundary. Various systems and methods for manually generating an anatomical boundary are described in U.S. Provisional Patent Application No. 62/741,157 (filed on Oct. 4, 2018) (entitled “Graphical User Interface for Defining an Anatomical Boundary”), which is incorporated by reference herein in its entirety.
At a process 250, a zone boundary may be determined based on one or more inputs as illustrated in
An adjustment information input 274 may influence the determination of the zone boundary 540. In some examples, as shown in
A multiple trajectory path input 276 may also or alternatively influence the determination of the zone boundary 540. With reference to
A trajectory path analysis input 278 may also or alternatively influence the determination of the zone boundary 540. In some embodiments, one or more trajectory paths (e.g., trajectory path 546) may be analyzed for viability to determine whether the distal end of the medical instrument is in an area of the patient anatomy that is separated from the target by a surface of interest. For example, a determination may be made as to whether the distal end of the medical instrument is in a different lobe of the patient lung, separated by a lung fissure, from the anatomic target. Because fissures of the patient anatomy separate the lobes of the patient anatomy, in situations when the instrument distal end and anatomic target are in different lobes, the biopsy needle would puncture a fissure when traveling along the trajectory path between the instrument distal end and the anatomic target. While this discussion makes reference to a fissure of the patient anatomy, it is to be understood that the discussion may also apply to any other portions of interest (e.g., large bullae, blood vessels, etc.). In some examples, an analysis may be conducted to determine whether a portion of the trajectory path 546 intersects with a portion of interest, such as a fissure. The fissure may be modeled as a mesh or a voxel mask generated based on the segmentation of image data. If the trajectory path 546 intersects the fissure model, a determination may be made that the trajectory path is unsafe and the trajectory path may be discarded, suppressed or otherwise not used in the determination of the zone boundary 540. In cases where a trajectory path does not intersect the fissure, a determination may be made that the trajectory path is acceptable and the trajectory path may be presented to a user as a candidate trajectory path.
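The trajectory-viability test above, using the voxel-mask form of the fissure model, can be sketched by marching along the straight path between the instrument's distal end and the target and checking whether any sampled point falls inside a masked voxel. The grid, the planar toy "fissure," and the sampling density are assumptions for illustration.

```python
import numpy as np

def path_intersects_mask(start, end, mask, n_samples=200):
    """True if the segment start->end passes through a masked voxel
    (e.g., a fissure voxel mask generated from segmented image data)."""
    ts = np.linspace(0.0, 1.0, n_samples)
    pts = start[None, :] + ts[:, None] * (end - start)[None, :]
    idx = np.round(pts).astype(int)
    # Discard samples that fall outside the volume.
    in_bounds = np.all((idx >= 0) & (idx < np.array(mask.shape)), axis=1)
    idx = idx[in_bounds]
    return bool(mask[idx[:, 0], idx[:, 1], idx[:, 2]].any())

# Toy fissure: the voxel plane x = 10 in a 20^3 volume.
mask = np.zeros((20, 20, 20), dtype=bool)
mask[10, :, :] = True

unsafe = path_intersects_mask(np.array([2.0, 5.0, 5.0]),
                              np.array([18.0, 5.0, 5.0]), mask)
safe = path_intersects_mask(np.array([2.0, 5.0, 5.0]),
                            np.array([8.0, 5.0, 5.0]), mask)
print(unsafe, safe)  # True False
```

A path flagged as intersecting the fissure would be discarded or suppressed, while a non-intersecting path could be presented as a candidate trajectory.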
Additionally or alternatively, image data 410a may include a single plane or “slice” of the image data, as depicted in a thumbnail view 412a of graphical user interface 400. In some examples, the thumbnail view 412a may also include the guidance information, as shown in
In some examples, one or more user inputs may be received to manipulate an anatomical boundary in a graphical user interface.
In some examples, when the anatomical boundary 420 is hidden, the distance D between the exit point 535 and the intersection point 555 may still be displayed. In other embodiments, when the anatomical boundary 420 is hidden, the distance D may also be hidden. Additionally or alternatively, when the anatomical boundary 420 is hidden, any other feature shown in the image data 410 (e.g., the target location 416, the target border region 418, the center C, the radius R, etc.) corresponding to the anatomical boundary 420 may also be hidden. In some examples, when the anatomical boundary 420 is hidden, the slider bars 432, 438 may be removed from the adjustment menu 430, as shown in
As shown in the embodiment of
In some examples, when the anatomical boundary 420 is deleted, the distance D between the exit point 535 and the intersection point 555 may also be deleted. Additionally, when the anatomical boundary 420 is deleted, any other feature shown in the image data 410 (e.g., the target location 416, the target border region 418, the center C, the radius R, etc.) corresponding to the anatomical boundary 420 may also be deleted.
In some examples, when one of the menus 580, 590 is selected, the details corresponding to the selected menu may be displayed in the image data 410. For example, as shown in
The anatomical boundary 560 may be generated in the same manner as discussed above with respect to the generation of the anatomical boundary 420. Similarly, the target border region 564 may be generated and/or adjusted in the same manner as discussed above with respect to the target border region 418. In several examples, one or both of the anatomical boundaries 420, 560 may be displayed, hidden, or deleted, as discussed above. For example, if the operator wants to analyze the portion of the three-dimensional anatomic model surrounding the target location 562, the operator may choose to hide the anatomical boundary 420. Similarly, if the operator wants to analyze the portion of the three-dimensional anatomic model surrounding the target location 416, the operator may choose to hide the anatomical boundary 560. While
As further shown in
During navigation, the control system 612 may display a trajectory zone and/or an anatomical boundary with a three-dimensional anatomical model of the patient anatomy or with other anatomical views presented on a user display. As discussed above, prior to the navigation phase of a medical procedure (e.g., a biopsy procedure), the plan for the medical procedure may be saved (e.g., as one or more digital files) and transferred to the robotic-assisted medical system used to perform the medical procedure. The saved plan may include the 3D model, identification of airways, target locations, trajectories to target locations, routes through the 3D model, and/or the like. Alternately or additionally, all of the data obtained prior to the planning phase, which may include the saved plan, may be transferred to the robotic-assisted medical system. This data may then be used to dynamically generate an anatomical boundary during the navigation phase of the medical procedure. The anatomical boundary may be displayed on a graphical user interface. The shape, position, and/or orientation of the anatomical boundary may change as the position and orientation of the medical instrument changes as the medical instrument navigates through the patient anatomy.
For example,
As shown in
In several examples, a trajectory zone 830 may be determined. The trajectory zone 830 may be displayed in the graphical user interface 800 as the medical instrument is inserted into and navigates through the patient anatomy. In some examples, the trajectory zone 830 may be partially transparent so that the airways of the three-dimensional anatomical model behind the trajectory zone 830 may be visible. The position and orientation of the trajectory zone 830 may be based on the position and orientation of a distal portion of the medical instrument. For example, the trajectory zone 830 is displayed in the graphical user interface 800 so as to appear to project from the distal portion of the medical instrument. In this way, the position and orientation of the trajectory zone 830 may substantially match the position and orientation of the distal portion of the medical instrument. Thus, as the position/orientation of the medical instrument changes during navigation, the position/orientation of the trajectory zone 830 also correspondingly changes.
A depth Dt of the trajectory zone 830 may be based on an extension length of a tool extendable from the medical instrument. For example, the depth Dt of the trajectory zone 830 may be set at the maximum extension length of the tool. The maximum extension length may be measured from a distal end of the medical instrument to a distal end of the tool when the tool is fully extended from the medical instrument. In some examples, the maximum extension length is 30 mm. In other examples, the maximum extension length may be larger or smaller. For example, the maximum extension length may be 25 mm. In other examples, the maximum extension length may be 35 mm. The lengths discussed above for the maximum extension length of the tool are discussed for exemplary purposes only—the maximum extension length of the tool may be any other suitable length. As discussed above, in some examples, the tool is a needle.
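Projecting the trajectory zone from the distal portion with a depth tied to the tool's maximum extension length can be sketched as follows. The 30 mm depth follows the text's example; the function names and the pose representation (a point plus a pointing direction) are assumptions, and a full implementation would also carry a cone or cylinder cross-section for display.

```python
import numpy as np

# Assumed example value from the text: maximum needle extension of 30 mm.
MAX_EXTENSION_MM = 30.0

def trajectory_zone_axis(tip_position, tip_direction, depth=MAX_EXTENSION_MM):
    """Far endpoint of the trajectory zone's axis: the zone projects from
    the distal tip along the tip's pointing direction to depth Dt."""
    d = np.asarray(tip_direction, dtype=float)
    d = d / np.linalg.norm(d)
    return np.asarray(tip_position, dtype=float) + depth * d

# Tip at the origin pointing along +z: the zone reaches 30 mm ahead.
far_end = trajectory_zone_axis([0.0, 0.0, 0.0], [0.0, 0.0, 2.0])
print(far_end)  # [ 0.  0. 30.]
```

Recomputing this endpoint each time the tip pose updates is what makes the displayed zone track the instrument during navigation.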
As shown in
As further shown in
In several examples, during navigation, the patient anatomy and consequently the three-dimensional anatomical model may move or become deformed by, for example, forces from the medical instrument, lung expiration and inspiration, and/or beating of the heart. The deformation may be measured, for example by a shape sensor in the medical instrument, or predicted by simulation, and the deformation may be applied to the three-dimensional anatomical model. The displayed traversal path 820, trajectory zone 830, target location 840, and anatomical boundary 860 (
As discussed above, the position/orientation of the traversal path 820 and the trajectory zone 830 are updated as the medical instrument traverses the patient anatomy. In the example shown in
In
As the medical instrument is inserted further into the patient anatomy, as shown in
Also shown in
As the medical instrument is inserted further into the patient anatomy (e.g., to the exit point or substantially close to the exit point), as shown in
In some examples, the designs 862, 864 may be displayed when the medical instrument reaches discrete insertion distances, as discussed above. In other examples, the anatomical boundary 860 changes from being displayed without any designs, to being displayed with the first design 862, to being displayed with the second design 864 in a dynamic manner corresponding to the insertion of the medical instrument into the patient anatomy. The dynamically changing display of the anatomical boundary 860 may aid the user in determining when the medical instrument is approaching the target location 840 and how close the medical instrument may be getting to the target location 840. In examples when the designs 862, 864 include colors, the color of the anatomical boundary 860 may gradually change (e.g., in a gradient) as the medical instrument is inserted closer to the target location 840. For example, the anatomical boundary 860 may gradually change from yellow when the anatomical boundary 860 is first displayed (e.g., when the distal portion of the medical instrument reaches the threshold distance), to orange when the distal portion of the medical instrument reaches the first distance, to red when the distal portion of the medical instrument reaches the exit point. The gradually changing display of the anatomical boundary 860 may be illustrated in any other suitable manner. For example, as shown in
In alternative examples, the gradually changing display of the anatomical boundary 860 may be illustrated through changes in brightness of the anatomical boundary 860, shading of the anatomical boundary 860, a pattern of the anatomical boundary 860, a border of the anatomical boundary 860, or in any other suitable manner. In further examples, a different audible indicator/indication may be presented to the user when the medical instrument reaches each of the threshold distance, the first distance, and the second distance to indicate that the medical instrument is getting closer to the target location 840. Alternatively, a gradually changing audible indicator may be presented to the user as the medical instrument gets closer to the target location 840. For example, as the medical instrument gets closer to the target location 840, the volume of a tone may change (e.g., louder or softer), a pitch of a tone may change (e.g., low to high or high to low), etc. Any other audible indicator may be used to indicate that the medical instrument is approaching the target location 840. In further examples, a different and/or gradually changing textual indicator/indication may be presented to the user when the medical instrument reaches each of the threshold distance, the first distance, and the second distance to indicate that the medical instrument is getting closer to the target location 840. In further alternative examples, a different and/or gradually changing haptic indicator/indication (e.g., haptic feedback) may change as the medical instrument gets closer to the target location 840.
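The yellow-to-orange-to-red gradient described above can be sketched as a piecewise linear interpolation over the tip's remaining distance to the exit point. The RGB values and the 30 mm / 15 mm threshold and first-distance values are assumptions for illustration, not values specified in the disclosure.

```python
# Assumed RGB anchors for the gradient cue.
YELLOW, ORANGE, RED = (255, 255, 0), (255, 165, 0), (255, 0, 0)

def lerp(c0, c1, t):
    """Linearly interpolate between two RGB colors (t in [0, 1])."""
    return tuple(round(a + (b - a) * t) for a, b in zip(c0, c1))

def boundary_color(dist_to_exit, threshold=30.0, first=15.0):
    """Map the tip's remaining distance to the exit point onto a color:
    yellow at the display threshold, orange at the first distance, and
    red at the exit point, blending smoothly in between."""
    if dist_to_exit >= threshold:
        return YELLOW
    if dist_to_exit > first:      # between threshold and first distance
        t = (threshold - dist_to_exit) / (threshold - first)
        return lerp(YELLOW, ORANGE, t)
    if dist_to_exit > 0.0:        # between first distance and exit point
        t = (first - dist_to_exit) / first
        return lerp(ORANGE, RED, t)
    return RED

print(boundary_color(30.0), boundary_color(0.0))  # (255, 255, 0) (255, 0, 0)
```

The same distance-to-color mapping could equally drive brightness, shading, audible pitch, or haptic intensity, per the alternatives listed above.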
In some embodiments, the planning and/or navigation techniques of this disclosure may be used in an image-guided medical procedure performed with a teleoperated medical system as described in further detail below. As shown in
Teleoperated medical system 600 also includes a display system 610 (which may include graphical user interface 400) for displaying an image or representation of the surgical site and medical instrument 604 generated by a sensor system 608 and/or an endoscopic imaging system 609. Display system 610 and master assembly 606 may be oriented so an operator O can control medical instrument 604 and master assembly 606 with the perception of telepresence. Any of the previously described graphical user interfaces may be displayable on the display system 610 and/or a display system of an independent planning workstation.
In some embodiments, medical instrument 604 may include components for use in surgery, biopsy, ablation, illumination, irrigation, or suction. Optionally, medical instrument 604, together with sensor system 608, may be used to gather (e.g., measure or survey) a set of data points corresponding to locations within anatomic passageways of a patient, such as patient P. In some embodiments, medical instrument 604 may include components of the imaging system 609, which may include an imaging scope assembly or imaging instrument that records a concurrent or real-time image of a surgical site and provides the image to operator O through the display system 610. The concurrent image may be, for example, a two- or three-dimensional image captured by an imaging instrument positioned within the surgical site. In some embodiments, the imaging system components may be integrally or removably coupled to medical instrument 604. However, in some embodiments, a separate endoscope, attached to a separate manipulator assembly, may be used with medical instrument 604 to image the surgical site. The imaging system 609 may be implemented as hardware, firmware, software, or a combination thereof, which interacts with or is otherwise executed by one or more computer processors, which may include the processors of the control system 612.
The sensor system 608 may include a position/location sensor system (e.g., an electromagnetic (EM) sensor system) and/or a shape sensor system for determining the position, orientation, speed, velocity, pose, and/or shape of the medical instrument 604.
Teleoperated medical system 600 may also include control system 612. Control system 612 includes at least one memory 616 and at least one computer processor 614 for effecting control between medical instrument 604, master assembly 606, sensor system 608, endoscopic imaging system 609, and display system 610. Control system 612 also includes programmed instructions (e.g., a non-transitory machine-readable medium storing the instructions) to implement a plurality of operating modes of the teleoperational system including a navigation planning mode, a navigation mode, and/or a procedure mode. Control system 612 also includes programmed instructions (e.g., a non-transitory machine-readable medium storing the instructions) to implement some or all of the methods described in accordance with aspects disclosed herein, including, for example, instructions for providing information to display system 610, instructions for determining a target location, instructions for determining an anatomical boundary, instructions for determining a trajectory zone, instructions for determining a zone boundary, and instructions for receiving user (e.g., operator O) inputs to a planning mode.
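As a non-limiting illustration, the threshold-distance behavior summarized above (displaying an anatomical boundary once the distal portion of the instrument comes within a threshold distance of a target location) may be sketched as follows. All names and the threshold constant are hypothetical, not taken from the disclosure:

```python
import math

# Hypothetical sketch of the control-system check: show the anatomical
# boundary only once the instrument tip is within a threshold distance
# of the target location. The 20 mm value is illustrative only.
THRESHOLD_MM = 20.0

def distance(a, b):
    """Euclidean distance between two 3-D points, in millimeters."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def should_display_boundary(tip_position, target_location, threshold=THRESHOLD_MM):
    """Return True when the distal tip is within the threshold of the target."""
    return distance(tip_position, target_location) <= threshold

# Example: a tip 15 mm from the target would trigger the boundary display.
print(should_display_boundary((0.0, 0.0, 0.0), (0.0, 0.0, 15.0)))  # True
```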
A plan for a medical procedure, such as a biopsy procedure, may be saved and used by the control system 612 to provide automated navigation or operator navigation assistance of a medical instrument to perform the biopsy procedure. During navigation, the control system 612 may display an anatomical boundary and/or a zone boundary with a three-dimensional anatomic model of the anatomic region, with an endoluminal view, or with other anatomical views presented on a user display. The anatomical boundary and/or a zone boundary may also or alternatively be displayed with (e.g., overlaid on) registered images from other imaging technology such as fluoroscopic images obtained during a medical procedure.
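Overlaying a boundary on registered images, as described above, involves mapping boundary points from the anatomic-model frame into the image frame. A minimal sketch of applying a rigid registration transform (rotation matrix R and translation t, both hypothetical values) to one boundary point:

```python
# Illustrative sketch: transform an anatomical-boundary point from model
# space into a registered image frame using a rigid transform p' = R p + t.
def register_point(p, R, t):
    """Apply a 3x3 row-major rotation R and translation t to point p."""
    return tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3))

# Identity rotation with a 5 mm shift along x, for demonstration only.
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
t = (5.0, 0.0, 0.0)
print(register_point((10.0, 0.0, 0.0), R, t))  # (15.0, 0.0, 0.0)
```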
Control system 612 may optionally further include a virtual visualization system to provide navigation assistance to operator O when controlling medical instrument 604 during an image-guided surgical procedure. Virtual navigation using the virtual visualization system may be based upon reference to an acquired pre-operative or intra-operative dataset of anatomic passageways. The virtual visualization system processes images of the surgical site imaged using imaging technology such as computerized tomography (CT), magnetic resonance imaging (MRI), fluoroscopy, thermography, ultrasound, optical coherence tomography (OCT), thermal imaging, impedance imaging, laser imaging, nanotube X-ray imaging, and/or the like.
In this embodiment, a sensor system (e.g., sensor system 608) includes a shape sensor 714. Shape sensor 714 may include an optical fiber extending within and aligned with elongate device 710. In one embodiment, the optical fiber has a diameter of approximately 200 μm. In other embodiments, the dimensions may be larger or smaller. The optical fiber of shape sensor 714 forms a fiber optic bend sensor for determining the shape of the elongate device 710. In one alternative, optical fibers including Fiber Bragg Gratings (FBGs) are used to provide strain measurements in structures in one or more dimensions. Various systems and methods for monitoring the shape and relative position of an optical fiber in three dimensions are described in U.S. patent application Ser. No. 11/180,389 (filed Jul. 13, 2005) (disclosing “Fiber optic position and shape sensing device and method relating thereto”); U.S. patent application Ser. No. 12/047,056 (filed on Jul. 16, 2004) (disclosing “Fiber-optic shape and relative position sensing”); and U.S. Pat. No. 6,389,187 (filed on Jun. 17, 1998) (disclosing “Optical Fibre Bend Sensor”), which are all incorporated by reference herein in their entireties. Sensors in some embodiments may employ other suitable strain sensing techniques, such as Rayleigh scattering, Raman scattering, Brillouin scattering, and fluorescence scattering. In some embodiments, the shape of the catheter may be determined using other techniques. For example, a history of the distal end pose of elongate device 710 can be used to reconstruct the shape of elongate device 710 over the interval of time.
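The bend-sensing idea above can be illustrated with a simplified planar sketch: integrating curvature samples (such as might be derived from FBG strain measurements) along the fiber's arc length yields an estimate of the fiber's shape. Real shape sensors operate in three dimensions and account for twist; this two-dimensional version, with hypothetical names and values, only shows the integration principle:

```python
import math

# Minimal planar sketch of fiber-shape reconstruction: accumulate heading
# from per-segment curvature samples, then step forward along the heading.
# A 3-D implementation would also track torsion/twist; this is 2-D only.
def reconstruct_shape_2d(curvatures, segment_length):
    """Return (x, y) points along the fiber given per-segment curvature (1/mm)."""
    x, y, heading = 0.0, 0.0, 0.0
    points = [(x, y)]
    for kappa in curvatures:
        heading += kappa * segment_length   # bend accumulates along the fiber
        x += segment_length * math.cos(heading)
        y += segment_length * math.sin(heading)
        points.append((x, y))
    return points

# A fiber with zero curvature everywhere reconstructs as a straight line.
straight = reconstruct_shape_2d([0.0] * 4, 10.0)
```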
As shown in
Elongate device 710 includes a channel (not shown) sized and shaped to receive a medical instrument 722. In some embodiments, medical instrument 722 may be used for procedures such as surgery, biopsy, ablation, illumination, irrigation, or suction. Medical instrument 722 can be deployed through elongate device 710 and used at a target location within the anatomy. Medical instrument 722 may include, for example, image capture probes, biopsy instruments, laser ablation fibers, and/or other surgical, diagnostic, or therapeutic tools. Medical instrument 722 may be advanced from the distal end 718 of the elongate device 710 to perform the procedure and then retracted back into the channel when the procedure is complete. Medical instrument 722 may be removed from the proximal end of elongate device 710 or from another optional instrument port (not shown) along elongate device 710.
Elongate device 710 may also house cables, linkages, or other steering controls (not shown) to controllably bend distal end 718. In some examples, at least four cables are used to provide independent “up-down” steering to control a pitch of distal end 718 and “left-right” steering to control a yaw of distal end 718.
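A non-limiting sketch of the four-cable steering scheme described above: opposing "up/down" cable displacements drive pitch at the distal end, and opposing "left/right" displacements drive yaw. The gain relating differential cable travel to bend angle is a made-up constant, not a value from the disclosure:

```python
# Illustrative mapping from four steering-cable displacements (mm) to tip
# pitch and yaw (degrees). The gain constant is hypothetical; in practice
# the relationship depends on catheter geometry and is often nonlinear.
BEND_GAIN_DEG_PER_MM = 3.0

def tip_angles(up_mm, down_mm, left_mm, right_mm, gain=BEND_GAIN_DEG_PER_MM):
    """Return (pitch_deg, yaw_deg) from the four cable displacements."""
    pitch = gain * (up_mm - down_mm) / 2.0   # up/down pair controls pitch
    yaw = gain * (right_mm - left_mm) / 2.0  # left/right pair controls yaw
    return pitch, yaw

# Pulling the "up" cable 4 mm while releasing "down" 4 mm pitches the tip up.
print(tip_angles(4.0, -4.0, 0.0, 0.0))  # (12.0, 0.0)
```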
A position measuring device 720 may provide information about the position of instrument body 712 as it moves on insertion stage 708 along an insertion axis A. Position measuring device 720 may include resolvers, encoders, potentiometers, and/or other sensors that determine the rotation and/or orientation of the actuators controlling the motion of instrument carriage 706 and consequently the motion of instrument body 712. In some embodiments, insertion stage 708 is linear, while in other embodiments, the insertion stage 708 may be curved or have a combination of curved and linear sections.
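For a linear insertion stage, the encoder-based position measurement above amounts to converting actuator rotation into carriage travel. A minimal sketch, with hypothetical constants (counts per revolution and travel per revolution are illustrative, not from the text):

```python
# Hypothetical sketch: convert rotary-encoder counts from the insertion-stage
# actuator into linear position of instrument body 712 along insertion axis A.
COUNTS_PER_REV = 2048   # encoder resolution (illustrative)
TRAVEL_MM_PER_REV = 5.0  # carriage travel per actuator revolution (illustrative)

def insertion_position_mm(encoder_counts):
    """Linear position of the instrument body along the insertion axis."""
    revolutions = encoder_counts / COUNTS_PER_REV
    return revolutions * TRAVEL_MM_PER_REV

print(insertion_position_mm(4096))  # 10.0 mm of insertion
```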
In the description, numerous specific details have been set forth in order to provide a thorough understanding of some embodiments. It will be apparent, however, to one skilled in the art that some embodiments may be practiced without some or all of these specific details. The specific embodiments disclosed herein are meant to be illustrative but not limiting. One skilled in the art may realize other elements that, although not specifically described here, are within the scope and the spirit of this disclosure.
Elements described in detail with reference to one embodiment, implementation, or application optionally may be included, whenever practical, in other embodiments, implementations, or applications in which they are not specifically shown or described. For example, if an element is described in detail with reference to one embodiment and is not described with reference to a second embodiment, the element may nevertheless be claimed as included in the second embodiment. Thus, to avoid unnecessary repetition in the following description, one or more elements shown and described in association with one embodiment, implementation, or application may be incorporated into other embodiments, implementations, or aspects unless specifically described otherwise, unless the one or more elements would make an embodiment or implementation non-functional, or unless two or more of the elements provide conflicting functions. Not all the illustrated processes may be performed in all embodiments of the disclosed methods. Additionally, one or more processes that are not expressly illustrated may be included before, after, in between, or as part of the illustrated processes. In some embodiments, one or more of the processes may be performed by a control system or may be implemented, at least in part, in the form of executable code stored on non-transitory, tangible, machine-readable media that, when run by one or more processors, may cause the one or more processors to perform one or more of the processes.
Any alterations and further modifications to the described devices, instruments, methods, and any further application of the principles of the present disclosure are fully contemplated as would normally occur to one skilled in the art to which the disclosure relates. In addition, dimensions provided herein are for specific examples and it is contemplated that different sizes, dimensions, and/or ratios may be utilized to implement the concepts of the present disclosure. To avoid needless descriptive repetition, one or more components or actions described in accordance with one illustrative embodiment can be used or omitted as applicable from other illustrative embodiments. For the sake of brevity, the numerous iterations of these combinations will not be described separately. For simplicity, in some instances the same reference numbers are used throughout the drawings to refer to the same or like parts.
The systems and methods described herein may be suited for navigation and treatment of anatomic tissues, via natural or surgically created connected passageways, in any of a variety of anatomic systems, including the lungs, the colon, the intestines, the kidneys and kidney calices, the brain, the heart, the circulatory system including vasculature, and/or the like. While some embodiments are provided herein with respect to medical procedures, any reference to medical or surgical instruments and medical or surgical methods is non-limiting. For example, the instruments, systems, and methods described herein may be used for non-medical purposes including industrial uses, general robotic uses, and sensing or manipulating non-tissue workpieces. Other example applications involve cosmetic improvements, imaging of human or animal anatomy, gathering data from human or animal anatomy, and training medical or non-medical personnel. Additional example applications include use for procedures on tissue removed from human or animal anatomies (without return to a human or animal anatomy) and performing procedures on human or animal cadavers. Further, these techniques can also be used for surgical and nonsurgical medical treatment or diagnosis procedures.
One or more elements in embodiments of this disclosure may be implemented in software to execute on a processor of a computer system such as control processing system. When implemented in software, the elements of the embodiments of this disclosure may be code segments to perform various tasks. The program or code segments can be stored in a processor readable storage medium or device that may have been downloaded by way of a computer data signal embodied in a carrier wave over a transmission medium or a communication link. The processor readable storage device may include any medium that can store information including an optical medium, semiconductor medium, and/or magnetic medium. Processor readable storage device examples include an electronic circuit; a semiconductor device such as a semiconductor memory device, a read-only memory (ROM), a flash memory, or an erasable programmable read-only memory (EPROM); and a floppy diskette, a CD-ROM, an optical disk, a hard disk, or other storage device. The code segments may be downloaded via computer networks such as the Internet, Intranet, etc. Any of a wide variety of centralized or distributed data processing architectures may be employed. Programmed instructions may be implemented as a number of separate programs or subroutines, or they may be integrated into a number of other aspects of the systems described herein. In some examples, the control system may support wireless communication protocols such as Bluetooth, Infrared Data Association (IrDA), HomeRF, IEEE 802.11, Digital Enhanced Cordless Telecommunications (DECT), ultra-wideband (UWB), ZigBee, and Wireless Telemetry.
Note that the processes and displays presented may not inherently be related to any particular computer or other apparatus. Various general-purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct a more specialized apparatus to perform the operations described. The required structure for a variety of these systems will appear as elements in the claims. In addition, the embodiments of the invention are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the invention as described herein.
This disclosure describes various instruments, portions of instruments, and anatomic structures in terms of their state in three-dimensional space. As used herein, the term “position” refers to the location of an object or a portion of an object in a three-dimensional space (e.g., three degrees of translational freedom along Cartesian x-, y-, and z-coordinates). As used herein, the term “orientation” refers to the rotational placement of an object or a portion of an object (three degrees of rotational freedom—e.g., roll, pitch, and yaw). As used herein, the term “pose” refers to the position of an object or a portion of an object in at least one degree of translational freedom and to the orientation of that object or portion of the object in at least one degree of rotational freedom (up to six total degrees of freedom). As used herein, the term “shape” refers to a set of poses, positions, or orientations measured along an object.
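The terminology defined above can be illustrated with a small data-structure sketch: a pose combines a position (up to three translational degrees of freedom) with an orientation (up to three rotational degrees of freedom), and a shape is a set of poses measured along an object. Class and field names here are illustrative only:

```python
from dataclasses import dataclass
from typing import List, Tuple

# Sketch of the terms defined above: "pose" = position plus orientation
# (up to six total degrees of freedom); "shape" = poses sampled along an
# object such as the elongate device. Names are hypothetical.
@dataclass
class Pose:
    position: Tuple[float, float, float]      # x, y, z (translational DOF)
    orientation: Tuple[float, float, float]   # roll, pitch, yaw (rotational DOF)

Shape = List[Pose]  # poses measured along the object's length

tip = Pose(position=(12.0, 3.5, 40.0), orientation=(0.0, 0.1, 1.57))
shape: Shape = [Pose((0.0, 0.0, float(z)), (0.0, 0.0, 0.0)) for z in range(0, 50, 10)]
```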
While certain exemplary embodiments of the invention have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that the embodiments of the invention are not limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art.
This application claims the benefit of and priority to U.S. Provisional Application No. 62/955,184, filed Dec. 30, 2019, which is incorporated by reference herein in its entirety.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2020/066267 | 12/19/2020 | WO |
Number | Date | Country
---|---|---
62955184 | Dec 2019 | US