The present disclosure is directed to systems and methods for generating three-dimensional images from two-dimensional images and associated image system localization information.
Minimally invasive medical techniques are intended to reduce the amount of tissue that is damaged during medical procedures, thereby reducing patient recovery time, discomfort, and harmful side effects. Such minimally invasive techniques may be performed through natural orifices in a patient anatomy or through one or more surgical incisions. Through these natural orifices or incisions, an operator may insert minimally invasive medical tools to reach a target tissue location. Minimally invasive medical tools include instruments such as therapeutic, diagnostic, biopsy, and surgical instruments. Medical tools may be inserted into anatomic passageways and navigated toward a region of interest within a patient anatomy. Navigation and interventional procedures at the region of interest may be assisted using intra-operative images of the anatomic passageways and surrounding anatomy. Improved systems and methods are needed to generate intra-operative images to visualize target anatomic structures during interventional procedures.
The following presents a simplified summary of various examples described herein and is not intended to identify key or critical elements or to delineate the scope of the claims.
Consistent with some examples, a system may comprise an elongate flexible instrument including an imaging device disposed at a distal portion of the elongate flexible instrument and a localization sensor within the elongate flexible instrument. The system may also comprise a controller comprising one or more processors configured to capture a first two-dimensional image with the imaging device in a first imaging configuration and receive first localization data for the distal portion of the elongate flexible instrument from the localization sensor while the imaging device is in the first imaging configuration. The one or more processors may also be configured to create a first image data set including the first localization data and the first two-dimensional image, capture a second two-dimensional image with the imaging device in a second imaging configuration, and receive second localization data for the distal portion of the elongate flexible instrument from the localization sensor while the imaging device is in the second imaging configuration. The one or more processors may also be configured to create a second image data set including the second localization data and the second two-dimensional image and generate a three-dimensional image based on a plurality of image data sets, including the first and second image data sets.
Consistent with some examples, a system may comprise an elongate flexible instrument including an imaging device disposed at a distal portion of the elongate flexible instrument and a localization sensor extending within the elongate flexible instrument to the distal portion. The system may also comprise a controller comprising one or more processors configured to capture a first two-dimensional image with the imaging device in a first configuration, receive first localization data for the distal portion of the elongate flexible instrument from the localization sensor while the imaging device is in the first configuration, and create a first image data set including the first two-dimensional image and the first localization data. The one or more processors may also be configured to generate a plurality of image data sets with the imaging device arranged in a plurality of different configurations, the plurality of image data sets including the first image data set, and generate a three-dimensional image based on the plurality of image data sets.
Consistent with some examples, a method may comprise capturing a first two-dimensional image with an imaging device in a first configuration, the imaging device disposed at a distal portion of an elongate flexible instrument, receiving first localization data for the distal portion of the elongate flexible instrument from a localization sensor extending within the elongate flexible instrument to the distal portion while the imaging device is in the first configuration, and creating a first image data set including the first localization data and the first two-dimensional image. The method may also comprise capturing a second two-dimensional image with the imaging device in a second configuration, receiving second localization data for the distal portion of the elongate flexible instrument from the localization sensor while the imaging device is in the second configuration, creating a second image data set including the second localization data and the second two-dimensional image, and generating a three-dimensional image based on the first and second image data sets.
Other examples include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
It is to be understood that both the foregoing general description and the following detailed description are illustrative and explanatory in nature and are intended to provide an understanding of the present disclosure without limiting the scope of the present disclosure. In that regard, additional aspects, features, and advantages of the present disclosure will be apparent to one skilled in the art from the following detailed description.
Embodiments of the present disclosure and their advantages are best understood by referring to the detailed description that follows. It should be appreciated that like reference numerals are used to identify like elements illustrated in one or more of the figures, wherein showings therein are for purposes of illustrating embodiments of the present disclosure and not for purposes of limiting the same.
The techniques disclosed in this document may be used to enhance intra-operative sensing instruments, including intra-operative imaging instruments, and their use in minimally invasive procedures. In some examples, intra-operative sensing data, including imaging data and localization data, may be utilized to generate three-dimensional intra-operative images of target tissue. The sensing instrument may include sensing systems including an imaging system and an imaging localization system. Although some of the imaging systems described herein are ultrasound imaging systems and some of the imaging localization systems described herein are optical fiber shape sensor systems, it is contemplated that the systems and methods described herein may be applied to other imaging and sensing modalities without departing from the scope of the present disclosure.
The systems and techniques described in this document may be used in a variety of medical procedures that may improve accuracy and outcomes through use of intra-operative imaging. For example, intra-operative imaging may be used to biopsy lesions or other tissue to, for example, evaluate the presence or extent of diseases such as cancer or surveil transplanted organs. As another example, intra-operative imaging may be used in cancer staging to determine via biopsy whether the disease has spread to lymph nodes. The medical procedure may be performed using hand-held or otherwise manually controlled imaging probes and tools (e.g., a bronchoscope). In other examples, the described imaging probes and tools may be manipulated with a robot-assisted medical system.
As shown in
In some examples, the imaging localization system 156 may include, for example, a localization sensor such as an optical fiber shape sensor, an electromagnetic (EM) sensor, or a plurality of EM sensors positioned at a known location relative to the imaging system 154 to track the position and orientation of the imaging system 154. The localization system 156 may be used to track the configuration, including position and orientation, of the distal end portion 163 of the sensing instrument 152, including the imaging device 161, in six degrees of freedom. Thus, the localization data from the localization system 156 may be used to record the configuration of the image data from the imaging device 161 in three-dimensional space. In one example, an optical fiber forms a fiber optic bend sensor for determining the shape of the sensing instrument 152. The optical fiber or a portion of the optical fiber may be fixed at the distal end portion 163 of the sensing instrument 152 or at a known location relative to the imaging device 161 to provide localization data, including position and/or orientation data, for the imaging device 161. With a distal end of the optical fiber shape sensor fixed at a constant offset to the imaging device, the position and orientation of the imaging device may be determined from the measured shape of the sensor. A proximal end of the shape sensor may be fixed or known relative to a robot-assisted medical system. In other examples, if the sensing instrument is manually manipulated, the proximal end of the shape sensor may be fixed to the patient body or another fixed or tracked location near the patient.
Optical fibers including Fiber Bragg Gratings (FBGs) may be used to provide strain measurements in structures in one or more dimensions. Various systems and methods for monitoring the shape and relative position of an optical fiber in three dimensions are described in U.S. patent application Ser. No. 11/180,389 (filed Jul. 13, 2005) (disclosing “Fiber optic position and shape sensing device and method relating thereto”); U.S. patent application Ser. No. 12/047,056 (filed on Jul. 16, 2004) (disclosing “Fiber-optic shape and relative position sensing”); and U.S. Pat. No. 6,389,187 (filed on Jun. 17, 1998) (disclosing “Optical Fiber Bend Sensor”), which are all incorporated by reference herein in their entireties. Sensors in some embodiments may employ other suitable strain sensing techniques, such as Rayleigh scattering, Raman scattering, Brillouin scattering, and fluorescence scattering. In some examples, the localization of the imaging system 154 may be determined using other techniques. For example, a history of the distal end pose of the sensing instrument can be used to reconstruct the shape of the sensing instrument over an interval of time. In some embodiments, a shape sensor may comprise a plurality of position sensors (such as electromagnetic position sensors) which collectively provide shape data regarding a shape of at least a portion of the sensing instrument. It should be appreciated that “shape sensor” as used herein may refer to any suitable localization sensor. Generally, a shape sensor as that term is used herein may provide any number of data points in any number of degrees of freedom, including three or six degrees of freedom, at a series of points monitored by the shape sensor along the length of the elongate instrument.
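By way of a non-limiting illustration, the pose computation described above may be sketched as follows. The routine assumes the shape sensor reports per-segment curvature and bend direction at evenly spaced points along the fiber; all function and variable names are hypothetical rather than taken from any particular sensor interface.

```python
import numpy as np

def rotation_about_axis(axis, angle):
    """Rodrigues' formula: rotation matrix about a unit axis by an angle."""
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

def distal_pose_from_shape(curvatures, bend_dirs, seg_len, tip_offset):
    """Integrate short, piecewise-straight segments from a fixed proximal
    frame and return the position and orientation of an imaging device held
    at a constant offset from the distal end of the fiber."""
    R = np.eye(3)    # orientation of the current frame (local z = fiber tangent)
    p = np.zeros(3)  # position of the current frame
    for kappa, phi in zip(curvatures, bend_dirs):
        p = p + R @ np.array([0.0, 0.0, seg_len])  # advance along the tangent
        # Bend axis lies in the local x-y plane at direction angle phi.
        axis = R @ np.array([np.cos(phi), np.sin(phi), 0.0])
        R = rotation_about_axis(axis, kappa * seg_len) @ R
    return p + R @ np.asarray(tip_offset), R  # constant tip-to-device offset
```

A small segment length keeps the piecewise-straight approximation close to the true arc shape of the bent fiber.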
In some examples, the sensing instrument 152 may include a steering system 160, including control wires, cables, or other control apparatus to bend or steer a distal end portion of the sensing instrument, which may include the imaging system 154. In some examples, the sensing instrument 152 may also include a channel or passage 164 through which an interventional tool 166 may be extended to emerge from a distal or side port of the sensing instrument 152 to engage the target tissue 113. The interventional tool 166 may include, for example, a biopsy or tissue sampling tool, an ablation tool including a heated probe or cryo-probe, an electroporation tool, a forceps, a medication delivery device, a fiducial delivery device, or another type of diagnostic or therapeutic device. In some examples, the interventional tool may have a flexible shaft. The interventional tool may include control wires or other control apparatus to bend or steer the direction of the interventional tool. Since it may be beneficial to provide real-time visualization of the interventional tool positioned within or near a target tissue, the interventional tool may be delivered within the imaging field of view of an ultrasound imaging instrument (e.g., instrument 152) for direct visualization of the interventional tool as it is advanced into the target tissue 113. The sensing instrument 152 may, optionally, include other sensing systems such as a visible light optical imaging system 158 positioned at a distal end portion of the sensing instrument 152.
At a process 202, an image data set may be generated. An image data set may include a constituent image of a composite image and a set of localization information associated with the constituent image. In some examples, an image data set may include image-only data, while in other examples an image data set may include localization information for each image. In some examples, some image data sets used in the composite image may include localization data and others may include image-only data. As explained below with reference to process 214, one or more subprocesses of process 202 may be repeated to generate a plurality of image data sets.
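As a non-limiting illustration of such an image data set, one simple pairing of a constituent image with its localization data is sketched below; the field names are hypothetical and chosen only to reflect the description above, including the possibility of image-only data sets.

```python
from dataclasses import dataclass
from typing import Optional
import numpy as np

@dataclass
class ImageDataSet:
    image: np.ndarray                  # 2D constituent image (H x W)
    position: Optional[np.ndarray]     # 3D position of the imaging device, if available
    orientation: Optional[np.ndarray]  # 3x3 rotation of the imaging device, if available

    @property
    def has_localization(self) -> bool:
        # A composite image may mix localized and image-only data sets.
        return self.position is not None and self.orientation is not None
```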
In some examples, each image data set used in the generation of a composite image may be generated at a periodic stage of an anatomic cycle. Thus, at an optional process 204, the periodic stage of an anatomic cycle may be identified. The capture of each constituent image data set used to form a composite image may be gated or confined to the same stage of the identified anatomic cycle. The anatomic cycle may be, for example, a cardiac or respiratory cycle. For example, each image data set used in the generation of a composite image may be gathered at a gated stage of a respiratory cycle, such as full inhalation or full exhalation. Alternatively, if the patient's breathing is under control of a respirator, the capture of the constituent image data set may be performed under a breath hold controlled by the respirator.
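A minimal gating sketch is shown below, assuming a normalized respiration signal (0 at full exhalation, 1 at full inhalation) is available from a respiration monitor; the interface functions and gating window are assumptions for illustration only.

```python
import time

def at_gated_stage(respiration_signal: float, gate_level: float = 0.05) -> bool:
    """True when the normalized respiration signal is near full exhalation."""
    return respiration_signal <= gate_level

def gated_capture(read_respiration, capture_image, read_localization):
    """Wait for the gated stage of the cycle, then capture one image data set
    so that every constituent image shares the same anatomic stage."""
    while not at_gated_stage(read_respiration()):
        time.sleep(0.01)  # poll the respiration monitor
    return capture_image(), read_localization()
```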
At a process 206, a constituent image may be captured with a sensing instrument while the sensing instrument is arranged in an initial configuration. For example, a constituent image may be a two-dimensional ultrasound image of the target tissue 113 captured by the imaging device 161 (e.g., an ultrasound transducer) of the sensing instrument 152 while the imaging device is located in an initial configuration within the passageway 102. In the example of
At a process 208, localization data for the sensing instrument may be received, recorded, or otherwise captured while the imaging device is located in the initial configuration. For example, the imaging localization system 156 may capture localization data, including, for example, position and/or orientation data while the imaging device 161 is located in the initial configuration within the passageway 102. In the example of
At a process 210, an image data set may be created including the constituent image and the localization data. For example, an image data set may include the two-dimensional ultrasound image of the target tissue 113 generated by the sensing instrument 152 while the imaging device 161 is at the initial position and orientation and may include the associated localization data for the imaging device 161 at the initial position and orientation. In the example of
At a process 212, the sensing instrument may be moved from the initial configuration to a different imaging configuration. For example, the distal end portion 163 of sensing instrument 152 may be moved from the initial configuration to a second configuration to change a position and/or orientation of the imaging device 161. More specifically, the distal end portion 163 of the sensing instrument 152 may be translated (e.g., in an XA, YA, and/or ZA direction) or rotated (e.g., about any of the axes XA, YA, ZA) in any of six degrees of freedom to change the configuration and field of view of the imaging device 161.
At a process 214, with the sensing instrument moved to another configuration, the processes 202-212 may be repeated one or more times to generate a plurality of image data sets, each at a different imaging configuration. For example, the distal end portion 163 of the sensing instrument 152 may be moved through a series of configurations, collecting an image data set, including a two-dimensional ultrasound image and a set of associated localization data, at each configuration in the series of configurations. In the example of
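The overall acquisition loop of processes 202-214 might be summarized, in simplified form, as follows; the device interface functions (move_to, capture_image, read_pose) are hypothetical placeholders for the instrument and localization system interfaces.

```python
def acquire_image_data_sets(configurations, move_to, capture_image, read_pose):
    """Collect one constituent image and its localization data at each
    imaging configuration in a series of configurations."""
    image_data_sets = []
    for config in configurations:
        move_to(config)                      # process 212: reposition the imaging device
        image = capture_image()              # process 206: capture a 2D constituent image
        position, orientation = read_pose()  # process 208: capture localization data
        image_data_sets.append((image, position, orientation))  # process 210
    return image_data_sets                   # input to 3D image generation (process 216)
```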
At a process 216, a three-dimensional image may be generated from the plurality of image data sets. For example, the two-dimensional constituent images may be stitched together with imaging software to generate a three-dimensional composite image. The image stitching algorithms may utilize the localization data from each of the image data sets with feature detection and matching from the two-dimensional images to register, align, calibrate, and blend the two-dimensional constituent images to form the three-dimensional composite image.
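As a deliberately simplified stand-in for full feature-based stitching, the sketch below compounds the two-dimensional images into a common voxel grid purely from their recorded poses, with averaging of overlapping samples in place of the blending step; the image-plane convention and grid geometry parameters are assumptions for illustration.

```python
import numpy as np

def compound_volume(image_data_sets, pixel_size, voxel_size, grid_shape, grid_origin):
    """Place each pixel of each localized 2D image into a 3D voxel grid
    using that image's recorded pose, averaging overlapping samples."""
    accum = np.zeros(grid_shape)
    count = np.zeros(grid_shape)
    for image, position, rotation in image_data_sets:
        h, w = image.shape
        # In-plane pixel coordinates (meters); image plane spanned by x and z.
        us, vs = np.meshgrid(np.arange(w), np.arange(h))
        pts = np.stack([(us - w / 2) * pixel_size,
                        np.zeros_like(us, dtype=float),
                        vs * pixel_size], axis=-1).reshape(-1, 3)
        # Map pixels into the common 3D frame with the recorded pose.
        world = pts @ rotation.T + position
        idx = np.round((world - grid_origin) / voxel_size).astype(int)
        ok = np.all((idx >= 0) & (idx < np.array(grid_shape)), axis=1)
        ix, iy, iz = idx[ok].T
        np.add.at(accum, (ix, iy, iz), image.reshape(-1)[ok])
        np.add.at(count, (ix, iy, iz), 1)
    return np.divide(accum, count, out=np.zeros_like(accum), where=count > 0)
```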
At an optional process 218, the three-dimensional composite image may be registered to a three-dimensional, pre-operative or intra-operative model. Pre-operative and/or intra-operative image data may be captured and used to generate a three-dimensional model using imaging technology such as computerized tomography (CT), cone-beam CT, magnetic resonance imaging (MRI), fluoroscopy, tomosynthesis, thermography, optical coherence tomography (OCT), thermal imaging, impedance imaging, laser imaging, nanotube X-ray imaging, and/or the like. For example, a CT scan of the patient's anatomy may be performed pre-operatively or intra-operatively and the resulting image data may be used to construct a segmented 3D model. The 3D CT model may be registered with the three-dimensional composite image so that information such as annotations, navigation routes, identified target structures, identified sensitive tissues, or other structures and features identified, highlighted, or otherwise noted on the 3D CT model may be transferred to or associated with the three-dimensional composite image. If the sensing instrument is integrated with a catheter of a robot-assisted medical system that has been registered to the patient anatomy and the 3D CT model, the three-dimensional composite image may be registered to the 3D CT model based on the registration of the catheter. If the sensing instrument is integrated with a manual catheter such as a manual bronchoscope, the proximal end of the shape sensor of the sensing instrument may be fixed to or near the patient or may be trackable (e.g., with encoders, optical fiducials, shape sensors, and/or the like) relative to the patient.
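One non-limiting way the registration of process 218 could be computed, assuming corresponding landmark points (e.g., airway branch points) have been identified in both the composite-image frame and the 3D CT model frame, is the standard closed-form rigid (Kabsch) alignment sketched below.

```python
import numpy as np

def rigid_registration(src_pts, dst_pts):
    """Return rotation R and translation t minimizing ||R @ src + t - dst||
    over corresponding Nx3 point sets."""
    src_c, dst_c = src_pts.mean(axis=0), dst_pts.mean(axis=0)
    H = (src_pts - src_c).T @ (dst_pts - dst_c)     # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t
```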
At an optional process 220, an interventional procedure may be conducted with reference to the three-dimensional composite image. In some examples, the three-dimensional composite image may be displayed to a clinician for use in better understanding the configuration of the target tissue, the surrounding structures, possible directions of approach, or other considerations for planning an interventional procedure. In some examples, the three-dimensional image may be displayed with a real-time optical and/or ultrasound image to assist the interventional procedure. In some examples, the three-dimensional image may be displayed with (including adjacent to, overlaid with, or merged with) the registered pre-operative or intra-operative image (e.g., CT image or model). In some examples, markers may be overlaid and stored with the three-dimensional image to track the locations where an interventional procedure (e.g., a biopsy or ablation) occurred. In some examples, the surrounding structures may include blood vessels or other vasculature structures which may be imaged using ultrasound or be part of the three-dimensional ultrasound composite image. The images of the vasculature together with the target tissue may help the clinician avoid damaging the vasculature while performing a procedure.
An interventional procedure may be performed, for example, using the interventional tool 166, which may be, for example, a biopsy or tissue sampling tool, an ablation tool including a heated probe or cryo-probe, an electroporation tool, a forceps, a medication delivery device, a fiducial delivery device, or another type of diagnostic or therapeutic device. In some examples, the three-dimensional composite image may be used to evaluate the size and shape of the target tissue 113 and to plan multiple locations for performing biopsies to gather sample tissue for evaluation. In some examples, a user-selected virtual interventional path for the interventional instrument may be displayed with the three-dimensional composite image. The interventional instrument may be aligned with the virtual interventional path. A real-time two-dimensional ultrasound image may be displayed with or overlaid on the three-dimensional composite image during the interventional procedure to help guide the intervention, such as a biopsy. Virtual markers may be placed on the three-dimensional image to mark the location of interventional areas, such as biopsy locations.
In various examples, supplemental images may be provided to a clinician to guide or assist with positioning an interventional tool and/or previewing an activity. The supplemental images may be two-dimensional guidance images used, for example, in planning, navigation, or conducting an interventional procedure. Further, the supplemental images may be used to supplement a registered three-dimensional pre-operative model (e.g., a pre-operative CT model), a three-dimensional composite image, or other models or images that relate to the procedure.
As shown in
In some alternative examples, a side-facing imaging device may include a side-facing curved phased array of ultrasound transducer elements.
As shown in
In other examples, the movement of the distal end portion 413 may be in another translational direction along the wall of the passageway 102, for example the +Y direction or a Z direction. In some examples, the motion of the distal end portion 413 may be achieved by movement of a control cable (e.g., a control cable of the steering system 160). In some examples, a rotational degree of freedom of motion may be provided by manual or robot-assisted control. For example, the motion of the distal end portion may be a 180 degree reciprocating rotational motion. In some examples, translation and rotation may be combined or performed simultaneously. If the target tissue 113 can be imaged from other nearby passageways (e.g., the target tissue is located near a passageway bifurcation), the sensing instrument 402 may be moved to the nearby airway to obtain additional constituent image data sets that may be used to form a composite three-dimensional image.
In various alternative examples, the sensing instruments described herein, such as the sensing instrument 402, may be slidable through a working channel of a delivery catheter and extendable from a distal opening of the delivery catheter (e.g., the elongate device 802). The sensing instrument may also be withdrawn proximally and removed from the delivery catheter. In examples in which the sensing instrument is separable from a delivery catheter, each of the sensing instrument and delivery catheter may include a shape sensor (e.g., an optical fiber shape sensor). The two shape sensors may be commonly referenced, allowing images generated with data from the sensing instrument to be located relative to the delivery catheter. As the sensing instrument is extended from a delivery catheter, the translational motion and rotational motion of the sensing instrument may be relative to the delivery catheter. For example, movement of the sensing instrument in a −Y direction may cause the sensing instrument to be withdrawn into the delivery catheter, and movement of the sensing instrument in a +Y direction may cause the sensing instrument to extend away from a distal end of the delivery catheter. To register the three-dimensional composite image generated by the sensing instrument to a pre-operative 3D CT model, the delivery catheter of a robot-assisted medical system may first be registered to the 3D CT model. The sensing instrument, separable from the delivery catheter, may then be registered to the shape sensor of the delivery catheter.
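The common referencing described above can be sketched with homogeneous transforms: with both shape sensors expressed as 4x4 poses in a shared base frame, a pose measured by the sensing instrument is located relative to the delivery catheter by chaining transforms. The variable names below are illustrative only.

```python
import numpy as np

def instrument_in_catheter_frame(T_base_catheter: np.ndarray,
                                 T_base_instrument: np.ndarray) -> np.ndarray:
    """T_catheter_instrument = inverse(T_base_catheter) @ T_base_instrument."""
    return np.linalg.inv(T_base_catheter) @ T_base_instrument

# Chaining a prior catheter-to-CT registration then locates the sensing
# instrument (and images generated with its data) in the CT model frame:
#   T_ct_instrument = T_ct_catheter @ T_catheter_instrument
```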
In some examples, a forward-facing imaging device (e.g., imaging device 511) may include a linear array of ultrasound transducer elements.
As shown in
In other examples, the movement of the distal end portion 513 may be in another translational direction along the wall of the passageway 102, for example the −Y direction or a Z direction. In some examples, the motion of the distal end portion 513 may be achieved by movement of a control cable (e.g., a control cable of the steering system 160). In some examples, a dedicated control cable may actuate the rotational degree of freedom of motion. In some examples, motion of the distal end portion may be a reciprocating rotational motion. In some examples, translation and rotation may be combined or performed simultaneously. If the target tissue 113 can be imaged from other nearby passageways (e.g., the target tissue is located near a passageway bifurcation), the sensing instrument 502 may be moved to the nearby airway to obtain additional constituent image data sets that may be used to form a composite three-dimensional image.
In some examples, a forward-facing imaging device (e.g., imaging device 511) may include an annular array of ultrasound transducer elements.
As shown in
In some examples, if electronically rotating the image plane does not sufficiently image the full area of interest, the distal end portion 513 of the sensing instrument 502 may be translated or pivoted in a generally Y or Z direction (e.g., along or about an axis C) to a second position within the passageway 102. While at the second longitudinal position, a plurality of image data sets may be obtained by capturing image data and localization data as the annular array electronically rotates the image plane at the second position.
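A minimal sweep of the electronically rotated image plane might look like the following, with the interface functions and angular step being assumptions for illustration rather than any particular transducer API.

```python
import numpy as np

def sweep_image_planes(set_plane_angle, capture_image, read_pose, step_deg=10.0):
    """Gather image data sets over a half revolution of the image plane
    without moving the instrument; each set records the plane angle so the
    images can later be placed in three-dimensional space."""
    data_sets = []
    for angle_deg in np.arange(0.0, 180.0, step_deg):
        set_plane_angle(np.deg2rad(angle_deg))  # electronic rotation, no motion
        position, orientation = read_pose()     # pose of the imaging device
        data_sets.append((capture_image(), position, orientation, angle_deg))
    return data_sets
```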
As shown in
In some examples, medical procedures may be performed using hand-held or otherwise manually controlled medical instrument systems of this disclosure. In other examples, the described medical instrument systems or components thereof may be manipulated with a robot-assisted medical system as shown in
Robot-assisted medical system 700 also includes a display system 710 (which may display, for example, a constituent ultrasound image generated by the sensing instrument or the composite three-dimensional image) for displaying an image or representation of the interventional site and medical instrument system 704 generated by a sensor system 708, an intra-operative imaging system 718, and/or an endoscopic imaging system 709. Display system 710 and master assembly 706 may be oriented so operator O can control medical instrument system 704 and master assembly 706 with the perception of telepresence.
In some examples, medical instrument system 704 may include components for use in surgery, biopsy, ablation, illumination, irrigation, or suction. In some examples, medical instrument system 704 may include components of the endoscopic imaging system 709, which may include an imaging scope assembly or imaging instrument (e.g., for visible light and/or near-infrared light imaging) that records a concurrent or real-time image of an interventional site and provides the image to the operator O through the display system 710. The concurrent image may be, for example, a two- or three-dimensional image captured by an imaging instrument positioned within the interventional site. In some examples, the endoscopic imaging system components may be integrally or removably coupled to medical instrument system 704. However, in some examples, a separate endoscope, attached to a separate manipulator assembly, may be used with medical instrument system 704 to image the interventional site. The endoscopic imaging system 709 may be implemented as hardware, firmware, software, or a combination thereof, which interacts with or is otherwise executed by one or more computer processors, which may include the processors of the control system 712.
The sensor system 708 may include a position/location sensor system (e.g., an electromagnetic (EM) sensor system) and/or a shape sensor system for determining the position, orientation, speed, velocity, pose, and/or shape of the medical instrument system 704. The imaging localization systems described herein may include all or portions of the sensor system 708.
Robot-assisted medical system 700 may also include control system 712. Control system 712 includes at least one memory 716 and at least one computer processor 714 for effecting control between medical instrument system 704, master assembly 706, sensor system 708, endoscopic imaging system 709, intra-operative imaging system 718, and display system 710. Control system 712 also includes programmed instructions (e.g., a non-transitory machine-readable medium storing the instructions) to implement some or all of the methods described in accordance with aspects disclosed herein, including instructions for providing information to display system 710.
Control system 712 may further include a virtual visualization system to provide navigation assistance to operator O when controlling medical instrument system 704 during an image-guided interventional procedure. Virtual navigation using the virtual visualization system may be based upon reference to an acquired pre-operative or intra-operative dataset of anatomic passageways. The virtual visualization system processes images of the interventional site imaged using imaging technology such as computerized tomography (CT), magnetic resonance imaging (MRI), fluoroscopy, thermography, ultrasound, optical coherence tomography (OCT), thermal imaging, impedance imaging, laser imaging, nanotube X-ray imaging, and/or the like.
An intra-operative imaging system 718 may be arranged in the surgical environment 701 near the patient P to obtain images of the anatomy of the patient P during a medical procedure. The intra-operative imaging system 718 may provide real-time or near real-time images of the patient P. In some examples, the intra-operative imaging system 718 may comprise an ultrasound imaging system (e.g., imaging system 154, 404, 504, 604) for generating two-dimensional and/or three-dimensional ultrasound images. For example, the intra-operative imaging system 718 may be at least partially incorporated into sensing instrument 152. In this regard, the intra-operative imaging system 718 may be partially or fully incorporated into the medical instrument system 704.
Medical instrument system 800 includes elongate device 802, such as a flexible catheter, coupled to a drive unit 804. Elongate device 802 includes a flexible body 816 having proximal end 817 and distal end, or tip portion, 818. In some embodiments, flexible body 816 has an approximately 3 mm outer diameter. Other flexible body outer diameters may be larger or smaller.
Medical instrument system 800 further includes a tracking system 830 for determining the position, orientation, speed, velocity, pose, and/or shape of distal end 818 and/or of one or more segments 824 along flexible body 816 using one or more sensors and/or imaging devices as described in further detail below. The entire length of flexible body 816, between distal end 818 and proximal end 817, may be effectively divided into segments 824. Tracking system 830 may optionally be implemented as hardware, firmware, software or a combination thereof which interact with or are otherwise executed by one or more computer processors, which may include the processors of control system 712 in
Tracking system 830 may optionally track distal end 818 and/or one or more of the segments 824 using a shape sensor 822. Shape sensor 822 may optionally include an optical fiber aligned with flexible body 816 (e.g., provided within an interior channel (not shown) or mounted externally). In one embodiment, the optical fiber has a diameter of approximately 200 μm. In other embodiments, the dimensions may be larger or smaller. The tracking system 830 may include any of the imaging localization systems described above (e.g., imaging localization system 156). The optical fiber of shape sensor 822 forms a fiber optic bend sensor for determining the shape of flexible body 816. In one alternative, optical fibers including Fiber Bragg Gratings (FBGs) are used to provide strain measurements in structures in one or more dimensions. Sensors in some embodiments may employ other suitable strain sensing techniques, such as Rayleigh scattering, Raman scattering, Brillouin scattering, and fluorescence scattering. In some embodiments, the shape of the elongate device may be determined using other techniques. In some embodiments, tracking system 830 may optionally and/or additionally track distal end 818 using a position sensor system 820, such as an electromagnetic (EM) sensor system. An EM sensor system may include one or more conductive coils that may be subjected to an externally generated electromagnetic field. Each coil of the EM sensor system then produces an induced electrical signal having characteristics that depend on the position and orientation of the coil relative to the externally generated electromagnetic field. In some examples, position sensor system 820 may be configured and positioned to measure six degrees of freedom, e.g., three position coordinates X, Y, Z and three orientation angles indicating pitch, yaw, and roll of a base point, or five degrees of freedom, e.g., three position coordinates X, Y, Z and two orientation angles indicating pitch and yaw of a base point.
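By way of illustration, a six-degree-of-freedom measurement of the kind described above can be assembled into a single 4x4 pose; the intrinsic rotation order used here is an assumption, and a five-degree-of-freedom sensor would simply omit the roll term.

```python
import numpy as np

def pose_from_6dof(x, y, z, pitch, yaw, roll):
    """Build a 4x4 homogeneous pose from three position coordinates and
    pitch, yaw, and roll angles (radians)."""
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    cr, sr = np.cos(roll), np.sin(roll)
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])  # pitch about x
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])  # yaw about y
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])  # roll about z
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx  # assumed rotation order: roll * yaw * pitch
    T[:3, 3] = [x, y, z]
    return T
```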
Flexible body 816 includes a channel 821 sized and shaped to receive a medical instrument 826 including any of the interventional tools described above (e.g., interventional tool 166).
Medical instrument 826 may additionally house cables, linkages, or other actuation controls (not shown) that extend between its proximal and distal ends to controllably bend the distal end of medical instrument 826. Flexible body 816 may also house cables, linkages, or other steering controls (not shown) that extend between drive unit 804 and distal end 818 to controllably bend distal end 818 as shown, for example, by dashed-line depictions 819 of distal end 818. In some examples, at least four cables are used to provide independent “up-down” steering to control a pitch of distal end 818 and “left-right” steering to control a yaw of distal end 818. In embodiments in which medical instrument system 800 is actuated by a robot-assisted assembly, drive unit 804 may include drive inputs that removably couple to and receive power from drive elements, such as actuators, of the robot-assisted assembly. In some embodiments, medical instrument system 800 may include gripping features, manual actuators, or other components for manually controlling the motion of medical instrument system 800.
In some embodiments, medical instrument system 800 may include a flexible bronchial instrument, such as a bronchoscope or bronchial catheter, for use in examination, diagnosis, biopsy, or treatment of a lung. Medical instrument system 800 is also suited for navigation and treatment of other tissues, via natural or surgically created connected passageways, in any of a variety of anatomic systems, including the colon, the intestines, the kidneys and kidney calices, the brain, the heart, the circulatory system including vasculature, and/or the like. The information from tracking system 830 may be sent to a navigation system 832 where it is combined with information from visualization system 831 and/or the preoperatively obtained models to provide the physician or other operator with real-time position information. In some examples, the real-time position information may be displayed on display system 710 of
In the description, specific details have been set forth describing some examples. Numerous specific details are set forth in order to provide a thorough understanding of the examples. It will be apparent, however, to one skilled in the art that some examples may be practiced without some or all of these specific details. The specific examples disclosed herein are meant to be illustrative but not limiting. One skilled in the art may realize other elements that, although not specifically described here, are within the scope and the spirit of this disclosure.
Elements described in detail with reference to one example, implementation, or application optionally may be included, whenever practical, in other examples, implementations, or applications in which they are not specifically shown or described. For example, if an element is described in detail with reference to one example and is not described with reference to a second example, the element may nevertheless be claimed as included in the second example. Thus, to avoid unnecessary repetition in the following description, one or more elements shown and described in association with one example, implementation, or application may be incorporated into other examples, implementations, or aspects unless specifically described otherwise, unless the one or more elements would make an example or implementation non-functional, or unless two or more of the elements provide conflicting functions.
Any alterations and further modifications to the described devices, instruments, methods, and any further application of the principles of the present disclosure are fully contemplated as would normally occur to one skilled in the art to which the disclosure relates. In particular, it is fully contemplated that the features, components, and/or steps described with respect to one example may be combined with the features, components, and/or steps described with respect to other examples of the present disclosure. In addition, dimensions provided herein are for specific examples and it is contemplated that different sizes, dimensions, and/or ratios may be utilized to implement the concepts of the present disclosure. To avoid needless descriptive repetition, one or more components or actions described in accordance with one illustrative example can be used or omitted as applicable from other illustrative examples. For the sake of brevity, the numerous iterations of these combinations will not be described separately. For simplicity, in some instances the same reference numbers are used throughout the drawings to refer to the same or like parts.
The systems and methods described herein may be suited for imaging, via natural or surgically created connected passageways, in any of a variety of anatomic systems, including the lung, colon, the intestines, the stomach, the liver, the kidneys and kidney calices, the brain, the heart, the circulatory system including vasculature, and/or the like. While some examples are provided herein with respect to medical procedures, any reference to medical or surgical instruments and medical or surgical methods is non-limiting. For example, the instruments, systems, and methods described herein may be used for non-medical purposes including industrial uses, general robotic uses, and sensing or manipulating non-tissue work pieces. Other example applications involve cosmetic improvements, imaging of human or animal anatomy, gathering data from human or animal anatomy, and training medical or non-medical personnel. Additional example applications include use for procedures on tissue removed from human or animal anatomies (without return to a human or animal anatomy) and performing procedures on human or animal cadavers. Further, these techniques can also be used for surgical and nonsurgical medical treatment or diagnosis procedures.
The methods described herein are illustrated as a set of operations or processes. Not all the illustrated processes may be performed in all examples of the methods. Additionally, one or more processes that are not expressly illustrated or described may be included before, after, in between, or as part of the example processes. In some examples, one or more of the processes may be performed by the control system (e.g., control system 712) or may be implemented, at least in part, in the form of executable code stored on non-transitory, tangible, machine-readable media that when run by one or more processors (e.g., the processors 714 of control system 712) may cause the one or more processors to perform one or more of the processes.
One or more elements in examples of this disclosure may be implemented in software to execute on a processor of a computer system such as a control processing system. When implemented in software, the elements of the examples may be the code segments to perform the necessary tasks. The program or code segments can be stored in a processor readable storage medium or device that may have been downloaded by way of a computer data signal embodied in a carrier wave over a transmission medium or a communication link. The processor readable storage device may include any medium that can store information, including an optical medium, semiconductor medium, and magnetic medium. Processor readable storage device examples include an electronic circuit; a semiconductor device; a semiconductor memory device; a read-only memory (ROM); a flash memory; an erasable programmable read-only memory (EPROM); a floppy diskette; a CD-ROM; an optical disk; a hard disk; or another storage device. The code segments may be downloaded via computer networks such as the Internet, an intranet, etc. Any of a wide variety of centralized or distributed data processing architectures may be employed. Programmed instructions may be implemented as a number of separate programs or subroutines, or they may be integrated into a number of other aspects of the systems described herein. In one example, the control system supports wireless communication protocols such as Bluetooth, IrDA, HomeRF, IEEE 802.11, DECT, and Wireless Telemetry.
Note that the processes and displays presented may not inherently be related to any particular computer or other apparatus. Various general-purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct a more specialized apparatus to perform the operations described. The required structure for a variety of these systems will appear as elements in the claims. In addition, the examples described herein are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings as described herein.
In some instances well known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the examples. This disclosure describes various instruments, portions of instruments, and anatomic structures in terms of their state in three-dimensional space. As used herein, the term “position” refers to the location of an object or a portion of an object in a three-dimensional space (e.g., three degrees of translational freedom along Cartesian x-, y-, and z-coordinates). As used herein, the term “orientation” refers to the rotational placement of an object or a portion of an object (three degrees of rotational freedom—e.g., roll, pitch, and yaw). As used herein, the term “pose” refers to the position of an object or a portion of an object in at least one degree of translational freedom and to the orientation of that object or portion of the object in at least one degree of rotational freedom (up to six total degrees of freedom). As used herein, the term “shape” refers to a set of poses, positions, or orientations measured along an object.
While certain illustrative examples have been described and shown in the accompanying drawings, it is to be understood that such examples are merely illustrative of and not restrictive on the broad invention, and that the examples of the invention not be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art.
This application claims priority to and benefit of U.S. Provisional Application No. 63/497,644, filed Apr. 21, 2023 and entitled “Systems and Methods for Three-Dimensional Imaging,” which is incorporated by reference herein in its entirety.