The present disclosure is directed to systems and methods for planning and performing an image-guided procedure.
Minimally invasive medical techniques are intended to reduce the amount of tissue that is damaged during medical procedures, thereby reducing patient recovery time, discomfort, and harmful side effects. Such minimally invasive techniques may be performed through natural orifices in a patient anatomy or through one or more surgical incisions. Through these natural orifices or incisions, an operator may insert minimally invasive medical tools to reach a target tissue location. Minimally invasive medical tools include instruments such as therapeutic, diagnostic, biopsy, and surgical instruments. Medical tools may be inserted into anatomic passageways and navigated toward a region of interest within a patient anatomy. Navigation may be assisted using optical or ultrasound images of the anatomic passageways and surrounding anatomy, obtained pre-operatively and/or intra-operatively. Intra-operative imaging of a tool from a probe or catheter through which the tool is inserted may provide improved navigational guidance and confirmation of engagement of the tool with the target tissue.
Improved systems and methods are needed to efficiently track a tool location with respect to a catheter or probe through which the tool is inserted to improve imaging of the tool.
Consistent with some examples, a system may include a flexible elongate device, a tool, and a controller. The flexible elongate device may include a working channel extending to an opening in a distal portion of the flexible elongate device and a plurality of imaging elements. The tool may be extendable through the working channel and the opening. The controller may have one or more processors configured to determine a position associated with a portion of the tool, and control at least a portion of the plurality of imaging elements to generate an image that includes an imaging plane that includes the position associated with the portion of the tool. In some examples, the portion of the tool may include a distal tip of the tool.
In some examples, the plurality of imaging elements may include an annular array of the imaging elements disposed around the opening. The annular array may include at least two rows of imaging elements. In some examples, the plurality of imaging elements may include an array of ultrasound transmitters configured to generate ultrasound waves and an array of ultrasound sensors configured to detect reflected ultrasound waves. The ultrasound sensors may include whispering gallery mode (WGM) resonators. In some examples, the plurality of imaging elements may include an array of transducers configured to generate and detect ultrasound waves. In some examples, the plurality of imaging elements may include a grid array in which the plurality of imaging elements are arranged in at least two rows and at least two columns.
In some examples, the plurality of imaging elements may be configured to capture images including imaging planes defined within an imaging field located distally of the distal portion of the flexible elongate device. The working channel may extend through a distal end surface of the flexible elongate device to direct the tool into the imaging field. In some examples, the plurality of imaging elements may be configured to capture images including imaging planes defined within an imaging field located radially outward from an outer side surface of the flexible elongate device. The working channel may extend through a side wall of the flexible elongate device to direct the tool into the imaging field.
In some examples, the controller may be further configured to determine a second position associated with one of a second portion of the tool or a region of interest in a target tissue. The imaging plane may include both the position and the second position.
In some examples, the flexible elongate device may further comprise a balloon configured for inflation with a fluid for acoustic transmission between the plurality of imaging elements and tissue.
In some examples, the tool may comprise at least one of a biopsy needle, a therapeutic-delivery needle, an ablation device, or a cryogenic device. At least the portion of the tool may be flexible. The tool may include a localization sensor located at the portion of the tool. The localization sensor may comprise a fiber optic sensor configured to detect a change in a wavelength of light in a fiber in response to pressure waves generated by the plurality of imaging elements. The localization sensor may comprise a WGM resonator. The localization sensor may comprise a piezoelectric sensor. The localization sensor may comprise a capacitive micromachined ultrasonic transducer (CMUT).
In some examples, the controller may be configured to determine the position of the portion of the tool by sequentially activating a subset of the plurality of imaging elements to transmit pressure waves, detecting a pressure wave from each imaging element of the subset, determining a time of travel of the pressure wave from each imaging element of the subset to the localization sensor, and triangulating the position of the portion of the tool based on the time of travel of each pressure wave from the respective imaging element to the localization sensor.
In some examples, the plurality of imaging elements may be arranged in an annular array and the subset may comprise three imaging elements of the plurality of imaging elements. The three imaging elements may be spaced around a perimeter of the annular array with 120° of separation.
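Purely as a non-limiting sketch, the triangulation described above can be expressed in Python. The annular geometry (three elements with 120° of separation), the assumed soft-tissue sound speed, and all function and variable names are illustrative assumptions, not part of the disclosure:

```python
import numpy as np

SOUND_SPEED_MM_PER_US = 1.54  # assumed ~1540 m/s in soft tissue (illustrative)

def trilaterate(elements, times, forward=np.array([0.0, 0.0, 1.0])):
    """Locate a sensor from one-way travel times to three imaging elements.

    elements: (3, 3) array of element positions in mm.
    times:    one-way travel times in microseconds.
    forward:  axis pointing into the imaging field, used to pick the
              physically meaningful of the two mirror-image solutions.
    """
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in elements)
    r1, r2, r3 = (SOUND_SPEED_MM_PER_US * t for t in times)

    # Build an orthonormal frame in the plane of the three elements.
    ex = (p2 - p1) / np.linalg.norm(p2 - p1)
    i = ex @ (p3 - p1)
    ey = p3 - p1 - i * ex
    ey /= np.linalg.norm(ey)
    ez = np.cross(ex, ey)
    d = np.linalg.norm(p2 - p1)
    j = ey @ (p3 - p1)

    # Closed-form intersection of the three range spheres.
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2 - 2 * i * x) / (2 * j)
    z = np.sqrt(max(r1**2 - x**2 - y**2, 0.0))

    a = p1 + x * ex + y * ey + z * ez
    b = p1 + x * ex + y * ey - z * ez
    return a if (a @ forward) >= (b @ forward) else b
```

The two candidate intersections mirror each other about the plane of the array; the sketch resolves the ambiguity by assuming the sensor lies in the imaging field in front of the array.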
In some examples, the controller may be configured to determine the position of the portion of the tool by activating the localization sensor to transmit a pressure wave, activating a subset of the plurality of imaging elements to detect the pressure wave, determining a time of travel of the pressure wave from the localization sensor to each imaging element of the subset, and triangulating the position of the portion of the tool based on the time of travel of the pressure wave from the localization sensor to each imaging element of the subset.
In some examples, the controller may be configured to control the at least a portion of the plurality of imaging elements to generate the image by determining a firing sequence of the at least a portion of the plurality of imaging elements based on the imaging plane. The firing sequence may include a timing delay between successive firings of the imaging elements. The timing delay may be determined based on a geometry of an imaging field of the image with respect to a position of the plurality of imaging elements.
In some examples, the controller may be further configured to associate sets of imaging elements with different imaging planes, determine which imaging plane is nearest to the position associated with the portion of the tool, and generate the image using the set of imaging elements associated with the determined imaging plane.
Consistent with some examples, a method for capturing an image performed by a controller including one or more processors may include determining a position associated with a portion of a tool. The tool may be at least partially located in a flexible elongate device. The flexible elongate device may include a plurality of imaging elements and a working channel extending to an opening in a distal portion of the flexible elongate device through which the portion of the tool extends. The method may further include controlling at least a portion of the plurality of imaging elements to generate an image that includes an imaging plane that includes the position associated with the portion of the tool.
In some examples, the method may include displaying a virtual indicator providing a visual representation of a position of the portion of the tool with respect to the imaging plane. The virtual indicator may include a distance indication providing a visual representation of a distance between the portion of the tool and the imaging plane. The virtual indicator may include a direction indication providing a representation of a direction of the portion of the tool with respect to the imaging plane. The direction indication may include a color in which at least a portion of the virtual indicator is displayed.
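One non-limiting way to compute such an indicator is sketched below; the signed-distance convention, the 1 mm proximity threshold, and the color scheme are illustrative assumptions:

```python
import numpy as np

def tip_indicator(tip, plane_point, plane_normal, threshold_mm=1.0):
    """Distance and direction of the tool tip relative to the imaging plane,
    plus a display color for the virtual indicator (scheme is illustrative)."""
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    signed = float((np.asarray(tip, dtype=float) - plane_point) @ n)
    if abs(signed) < 1e-6:
        direction = "in_plane"
    else:
        direction = "above" if signed > 0 else "below"
    color = "green" if abs(signed) <= threshold_mm else "yellow"
    return {"distance_mm": abs(signed), "direction": direction, "color": color}
```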
Other examples include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory in nature and are intended to provide an understanding of the present disclosure without limiting the scope of the present disclosure. In that regard, additional aspects, features, and advantages of the present disclosure will be apparent to one skilled in the art from the following detailed description.
Examples of the present disclosure and their advantages are best understood by referring to the detailed description that follows. It should be appreciated that like reference numerals are used to identify like elements illustrated in one or more of the figures, wherein showings therein are for purposes of illustrating examples of the present disclosure and not for purposes of limiting the same.
The techniques disclosed in this document may be used to enhance the workflow processes of minimally invasive procedures using intra-operative direct visualization of a tool, such as intra-operative ultrasound imaging. In some examples, imaging data from an imaging probe may be utilized to verify real-time accurate placement of a treatment or diagnostic tool within an anatomical target during a medical procedure. Although described in the context of an ultrasound imaging probe, it is contemplated that the systems and methods described herein may be applied to other imaging modalities without departing from the scope of the present disclosure. For example, an imaging probe may be used to provide direct visual guidance of a tool as the tool is delivered via a flexible elongate device (e.g., a catheter or imaging probe) into a target. Such a tool may include, but is not limited to, an ablation device, a biopsy tool (e.g., needle), a therapeutic delivery tool, a cryogenic device, an irrigation or suction device, or any other suitable medical treatment tool.
The imaging probe 104 may include one or more lumens for delivery of instruments, tools, devices, etc. In the illustrated example, a tool 106 comprising a needle has been delivered through a lumen of the imaging probe 104 into the target tissue 108. In some examples, the imaging probe 104 may include an imaging device for visualization of the tool 106, e.g., an ultrasound or optical imaging array, capturing images in an imaging field including an imaging plane 110. The imaging probe 104 includes a plurality (e.g., one or more arrays) of imaging elements. The imaging elements may be arranged in any size, configuration, or shape. For example, the imaging elements may be arranged along a surface to form an array having a moon, ring, circle, or rectangle pattern. In some embodiments, the imaging elements include an array of transducers (e.g., lead zirconate titanate (PZT) transducers) that generate ultrasound waves and/or detect reflected ultrasound waves. In some embodiments, the imaging elements include an array of ultrasound receivers (e.g., whispering gallery mode (WGM) resonators) and an array of ultrasound transmitters (e.g., piezoelectric array). For additional description of an array of separate ultrasound receivers and ultrasound transmitters, see International Pat. Pub. No. WO 2021/119182 filed Dec. 9, 2020 (disclosing “Whispering Gallery Mode Resonators for Sensing Applications”), which is incorporated herein by reference in its entirety for all purposes.
The imaging probe 104 may generate images at various imaging planes based on control of the ultrasound transmission by the imaging elements. The transmitting imaging elements may use phased array ultrasonics or synthetic aperture techniques to capture images with selectable imaging planes. For phased array ultrasonics, in order to direct the ultrasound energy in a desired direction to create a desired imaging plane, the imaging elements (e.g., transducers) are fired at a desired phase or time delay. In some examples, the phase or time delay can be calculated from a desired focal point/beam pattern. The firing sequence of the ultrasound imaging elements may be set to direct the ultrasound energy towards a target tissue. In some embodiments, different imaging planes that can be selected are associated with different configurations of the imaging elements. For example, various combinations and/or firing sequences of imaging elements may each be associated with one or more particular imaging planes. The control of the imaging elements to generate an image including the desired imaging plane may be based on the configuration and/or firing sequence associated with the desired imaging plane. A particular set of potential imaging planes may be preset and the associated combinations and/or firing sequences for each preset potential imaging plane may be stored in memory. In an example including an imaging array with imaging elements arranged radially around a central working channel, a preset set of potential imaging planes may be established at 5° intervals about an axis of the working channel. The physical arrangement of imaging elements may at least partially influence the spacing and orientation between the preset imaging planes.
For example, an imaging array with a relatively high number of imaging elements in a dense arrangement may allow for a greater number of potential imaging planes than an imaging array with a relatively low number of imaging elements spaced apart. Upon determining a desired (or “optimum”) imaging plane as discussed further below, the controller may select the appropriate preset imaging plane of the set of preset imaging planes that most closely corresponds to the desired imaging plane. The controller may then activate the combination of imaging elements associated with the selected preset imaging plane in the firing sequence associated with the selected preset imaging plane. In some examples, one or more imaging element selection algorithms and/or firing sequence selection algorithms may be stored in memory and, upon determining the desired imaging plane, the controller may execute the one or more algorithms to select the appropriate imaging elements and/or firing sequence to generate an image along the desired imaging plane.
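The delay calculation underlying the phased-array control described above can be sketched as follows; the geometry, sound speed, and names are illustrative assumptions. The idea is to delay each element's firing so that all wavefronts arrive at the desired focal point simultaneously, steering the beam toward it:

```python
import numpy as np

SOUND_SPEED_MM_PER_US = 1.54  # assumed soft-tissue sound speed (illustrative)

def focal_delays(element_positions, focal_point):
    """Per-element firing delays (in µs) so that all wavefronts arrive at
    the focal point at the same time.

    element_positions: (N, 3) array of element coordinates in mm.
    focal_point:       desired focal point in mm.
    """
    d = np.linalg.norm(np.asarray(element_positions, dtype=float)
                       - np.asarray(focal_point, dtype=float), axis=1)
    times = d / SOUND_SPEED_MM_PER_US
    # The farthest element fires first (zero delay); nearer elements wait
    # so their wavefronts do not arrive at the focus early.
    return times.max() - times
```

In a practical system these delays would typically be precomputed per preset imaging plane and stored in memory, as the surrounding text describes.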
Since it can be beneficial to provide real time visualization of a tool positioned within a target, e.g., lesion, tumor, or nodule, to ensure accurate delivery of the tool within the target, the tool 106 may be delivered within the imaging plane 110 of the imaging probe 104 for direct visualization of the tool into the target tissue 108. Vasculature 109 is illustrated as being visualized within the imaging plane 110 in proximity to the target tissue 108, for example, by Doppler processing of ultrasound data. Identifying and avoiding vasculature 109 in the region of interest around the target tissue 108 may be particularly important in certain regions of the anatomy, such as in the mediastinum.
The imaging probe 104 may be steerable to navigate the imaging probe to a deployment location near the target tissue. For example, a plurality of pull wires or tendons may extend along a length of the imaging probe 104 and may be manipulated to steer the distal end of the imaging probe. Alternatively, the imaging probe 104 may be passively flexible and may be navigated to a deployment location by a steerable catheter or sheath having a lumen through which the imaging probe is disposed. In some examples, imaging probe 104 may include a lumen for receipt of a guidewire to navigate the imaging probe to a deployment location. Further discussion of navigation of an imaging probe 104 to a deployment location is provided below in relation to
Surface enhancements may be included on the tool 106 to facilitate imaging. For example, an external surface of the tool 106 may be coated, roughened, or include grooves or slits to affect a scatter pattern of a signal (e.g., pressure or acoustic wave) from the imaging elements. In some examples, such a surface enhancement may be provided on only a limited portion or portions of the tool 106. In an example, a portion of the tool 106 near or at the distal tip 105 may be smooth while a portion of the tool proximal of the distal tip may include a roughened surface to disperse imaging signals. In this regard, the portion including the distal tip 105 may appear more distinct in an image than a portion proximal thereof, providing visual confirmation that the distal tip 105 has been captured in the image(s). In an example, a portion of the tool 106 near or at the distal tip 105 may be coated in an echogenic material while a portion of the tool proximal of the distal tip may not. In this regard, the portion including the distal tip 105 may appear more distinct in an image than a portion proximal thereof, providing visual confirmation that the distal tip 105 has been captured in the image(s).
As will be appreciated, in the event that the tool 106 is rigid, it may be expected to extend distally from the opening of the working channel 112 directly along an axis 115 extending from the working channel. As such, the alignment of the tool 106 with respect to the imaging array 114 may be assumed based on a known location of the axis 115 of the working channel 112 with respect to the imaging array 114. In this regard, an optimum imaging plane, e.g., the imaging plane aligned with the axis 115, may be predetermined. However, in the event that the tool 106 is flexible, the tool may bend away from the axis 115 as it is extended from the imaging probe 104. For example, a flexible biopsy needle may bend away from the axis 115 as it encounters resistance from tissue between the imaging probe 104 and the target tissue. Accordingly, it will be appreciated that an imaging plane aligned with the working channel 112 may not always be suitable for displaying the distal tip of a flexible tool. Moreover, when a flexible tool bends out of plane, it may not be readily apparent when viewing an image from the imaging probe that the portion of the tool being displayed does not include the distal tip. Therefore, it may be desirable to use an alternative imaging plane laterally displaced from the axis 115 of the working channel 112 that may provide an optimum imaging plane based on the determined position of the distal tip 105.
After determining the position of the distal tip 105, the controller may identify another subset and a corresponding activation order of the imaging elements 118 suitable to generate an imaging plane 110 that includes the distal tip 105. For example, different imaging planes 110 are associated with different positions, and thus the determined position of the distal tip 105 may be used to select the appropriate imaging plane 110. In some examples, the spatial resolution of the potential imaging planes of the imaging array 114 may result in an inability of the imaging array to generate an imaging plane that captures the distal tip 105. That is, the potential imaging planes of the imaging array 114 may be spaced apart by some known distance (e.g., 0.25 mm) and the distal tip 105 may be determined to be positioned between two adjacent potential imaging planes. Accordingly, the controller may determine which potential imaging plane is nearest to the distal tip 105 and select that imaging plane as the optimum imaging plane 110. The controller may then activate the selected imaging elements 118 to generate an image in the optimum imaging plane 110. That is, the controller may control at least a portion of the plurality of imaging elements 118 to generate an image that includes an imaging plane that includes the position of the distal tip.
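The nearest-plane selection described above reduces to a point-to-plane distance test. A minimal sketch follows, assuming each preset plane is represented by a point on the plane and a unit normal (this representation, and all names, are illustrative):

```python
import numpy as np

def nearest_preset_plane(tip, planes):
    """Index of the preset imaging plane closest to the determined tip position.

    tip:    (3,) tip position in mm.
    planes: list of (point_on_plane, unit_normal) pairs for each preset plane.
    """
    tip = np.asarray(tip, dtype=float)
    # Perpendicular distance from the tip to each candidate plane.
    dists = [abs((tip - np.asarray(p, dtype=float)) @ np.asarray(n, dtype=float))
             for p, n in planes]
    return int(np.argmin(dists))
```

For example, with preset planes spaced 0.25 mm apart, a tip determined to lie between two adjacent planes is simply assigned to whichever plane it is closer to.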
Advantageously, the imaging technique described herein may reduce the computational burden of image processing across a plurality of imaging planes by identifying the optimum imaging plane and capturing an image only in that plane. This may reduce the time required to construct and display an image to a user, enhancing real-time visualization of a tool intra-operatively. The imaging plane may be selected to display any portion of the tool 106 of interest (e.g., not necessarily the distal tip 105), or some other feature (e.g., target tissue 108) of interest.
In some embodiments, an imaging plane 110 may be selected based on multiple positions or features of interest. For example, an imaging plane 110 that includes both the distal tip 105 of the tool 106 and a portion of the target tissue 108 may be selected. In another example, an imaging plane 110 that includes the distal tip 105 of the tool 106 and another (e.g., more proximal) portion of the tool 106 may be selected. As such, an optimal imaging plane 110 may be selected that includes multiple positions or features of interest.
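When the selectable planes all pass through a common point on the array (an assumption for this sketch), a plane containing two features of interest is determined by the cross product of their direction vectors from that point; the names below are illustrative:

```python
import numpy as np

def plane_through_features(origin, feature_a, feature_b):
    """Unit normal of the plane containing the array origin and two features
    of interest (e.g., the tool's distal tip and the target tissue)."""
    origin = np.asarray(origin, dtype=float)
    u = np.asarray(feature_a, dtype=float) - origin
    v = np.asarray(feature_b, dtype=float) - origin
    n = np.cross(u, v)
    norm = np.linalg.norm(n)
    if norm < 1e-9:
        # Both features lie on one line through the origin; any plane
        # containing that line includes both features.
        raise ValueError("features are collinear with the origin")
    return n / norm
```

The returned normal could then be matched against the preset planes to pick the closest realizable imaging plane.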
In some of the above-described examples, the imaging elements 118 may be transducers capable of both transmitting and detecting acoustic waves. In some examples, the imaging elements 118 may each be a piezoelectric transmitter or a resonator. In some examples, the imaging array includes a piezoelectric array configured to generate ultrasound waves and WGM resonators configured to detect reflected ultrasound waves. The WGM resonators may include optical fiber resonators, microsphere resonators, microbubble resonators, microbottle resonators, microtoroid resonators, microdisk resonators, microring resonators, or some other structure of resonators. WGM resonators measure the intensity of reflected ultrasound waves. For example, incoming ultrasound echoes shift the resonant frequency of the WGM resonators by modulating the refractive index of the resonator material or by physically deforming the resonators.
Although the examples of
As with the imaging probe 104 of
It should be appreciated that the shapes, arrangements, sizes, and placement of the imaging arrays 114 of the illustrated examples may be adjusted as is suitable for any given implementation. That is, the figures should not be considered to be limiting with regard to the illustrated specific array shapes, specific number of imaging elements in the imaging arrays, the specific shapes of each imaging element, the spacing between adjacent imaging elements, etc. These illustrations are provided only as examples of the various configurations contemplated within the scope of the present disclosure.
It should further be appreciated that with ultrasound imaging, it is often desirable to secure the imaging probe in place for stabilization and to minimize the volume of air and other gases between an imaging array and the tissue which is to be imaged. Accordingly, a variety of balloons or inflatable/expandable fluid-containing devices can be used to ensure fluid contact. For example, a balloon 119a may be disposed adjacent to or around the distal end of the imaging probe 104 and/or a balloon 119b may be disposed adjacent to the distal end of the catheter 124. A balloon 119 may be inflated with a coupling fluid to expand the balloon into contact with the surrounding tissue of an anatomical passageway to park or secure the imaging probe 104 in place and/or to fill the space between the imaging array and tissue that is to be imaged with a fluid that is conducive to imaging.
In some embodiments, an imaging array may be disposed within an inflatable balloon 119a and the inflatable balloon may be inflated with a fluid that is conducive to the applicable imaging medium, thereby reducing the volume of air or other gases between the imaging array and the patient's tissue. Inflating the balloon 119a in this manner may also park the imaging probe 104 to secure it in a desired location within the passageway. A balloon 119b may be disposed on catheter 124 to aid in parking the catheter.
In some examples, an inflatable balloon 119 may also be used to seal an anatomical passageway for suction or insufflation through a device such as the imaging probe 104 or catheter 124. In this regard, a balloon 119 may be used to block a passageway so that air may be suctioned from a portion of the passageway distal to the device, thereby collapsing the passageway. As an alternative to collapsing the passageway, a fluid or imaging gel may be injected into a portion of the passageway distal to the device after inflating the balloon, thereby filling the portion of the passageway with a medium that is conducive to imaging. Collapsing the passageway or filling the passageway with fluid as described may eliminate or reduce any volume of air which otherwise may hinder imaging quality.
In some examples, a balloon 119 may be used to retain an imaging array in direct contact with tissue. For example, in the case of a side-facing imaging array, a balloon on one side of an imaging probe may be inflated to push the imaging probe laterally into contact with tissue. In this regard, an imaging array on an opposing side of the imaging probe from the balloon may be forced into direct contact with the tissue. In another example, in the case of a forward-facing imaging array, the imaging device at the distal end of the imaging probe may be driven into direct contact with tissue to be imaged. Then a balloon 119a extending radially around the imaging probe may be inflated into contact with surrounding tissue to secure the imaging probe in place.
In this regard, it should be appreciated that an inflatable balloon may be beneficial for use with any forward-facing imaging array or side-facing imaging array discussed herein.
Additional description of examples of a flexible elongate instrument which may be similar to the imaging probes discussed herein and/or an inflatable balloon for facilitating imaging are described in U.S. Provisional Patent Application No. 63/240,471 filed Sep. 3, 2021 (disclosing “Ultrasound Elongate Instrument Systems And Methods”), which is incorporated herein by reference in its entirety for all purposes. Although illustrated only in
Once the position of the distal tip 105 has been determined relative to the imaging array, the controller may select an imaging plane that includes or is nearest to the distal tip 105. In the illustrated example of
In the illustrated image 130 of
Although shown as having an arcuate shape, it will be appreciated that each imaging element 118 of
After navigating an imaging probe (e.g., imaging probe 104) to a deployment location in the vicinity of a target tissue, at process 1002 a position of a distal tip of a tool (e.g., a tool 106 such as a needle) is determined. This may include any suitable method for determining the position of the distal tip including, but not limited to, triangulating the location of a localization sensor disposed on the tool, determining the position of an EM position sensor on the tool, capturing a plurality of images with different imaging planes and processing the images to identify the distal tip, etc. With regard to the latter, multiple or all of the possible combinations of imaging elements to capture different imaging planes may be used sequentially to capture an image in each plane. Image processing may then be performed to identify the optimal image that captures the distal tip, or other portion of interest, of the tool. Based on the known orientation of the imaging plane of the optimal image, the location of the distal tip can be determined from the optimal image. In this regard, a localization or EM sensor on the tool may not be necessary. In some examples, only a subset of all possible imaging planes may be used to determine the position of the distal tip. For example, an imaging probe having 20 possible imaging planes may utilize only a subset of 4 or 5 imaging planes. In the event the distal tip is not captured in one of the images of the subset of imaging planes, the position of the distal tip may be determined by interpolating between the imaging planes.
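The interpolation step mentioned above can be sketched as a simple peak interpolation over per-plane tip-detection scores. The scoring of each image for the presence of the tip, the plane parameterization by angle, and all names are illustrative assumptions:

```python
def interpolate_peak(angles_deg, scores):
    """Estimate the plane angle of the tool tip by parabolic interpolation
    of per-plane tip-detection scores from a subset of imaging planes.

    angles_deg: evenly spaced plane angles that were imaged.
    scores:     tip-detection score for each captured image.
    """
    i = max(range(len(scores)), key=scores.__getitem__)
    if i == 0 or i == len(scores) - 1:
        return float(angles_deg[i])  # peak at the edge of the subset
    y0, y1, y2 = scores[i - 1], scores[i], scores[i + 1]
    denom = y0 - 2 * y1 + y2
    if denom == 0:
        return float(angles_deg[i])
    # Vertex of the parabola through the three samples around the peak.
    offset = 0.5 * (y0 - y2) / denom
    step = angles_deg[i + 1] - angles_deg[i]
    return float(angles_deg[i] + offset * step)
```

For instance, an imaging probe with 20 possible planes might image only every fourth plane and interpolate the tip's angular position between the two best-scoring planes.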
At a process 1004, the controller selects the imaging elements of an imaging array on the imaging probe that are to be used to generate an image that includes the distal tip of the tool. The selected imaging elements may be selected based on the determined position of the distal tip of the tool. The selected imaging elements may include all of the imaging elements of a particular imaging array or a subset thereof. For example, a single column of imaging elements extending across the imaging array may be selected (e.g., column 122a including imaging elements 118a-118f of
At a process 1006, the controller may proceed to generate the image using the selected imaging elements. If the generated image is determined to be unsatisfactory, for example if none or only a small portion of the tool is displayed, the controller may select a different subset of imaging elements and repeat the process. This selection of a different subset may be initiated automatically based on processing of the image data or in response to a user input.
As such, the controller controls at least a portion of the plurality of imaging elements to generate the image that includes an imaging plane that includes a position associated with a portion of the tool. The position may be that of the distal tip of the tool, or of some other portion. The imaging elements may be controlled to generate only an image that includes the determined position of the portion of the tool. Alternatively, the imaging elements may be controlled to generate multiple images in different imaging planes, and image processing may then be performed to identify, from the multiple images, the image that includes the imaging plane that includes the position associated with the portion of the tool. In some embodiments, generating the image via the image processing may include generating a (e.g., 3D) model of the tool based on the multiple images in different imaging planes, identifying a position of a desired portion of the tool (e.g., distal tip) from the model, and selecting the image that includes the portion of the tool based on the identified position of the distal portion of the tool.
After navigating the imaging probe (e.g., imaging probe 104) to a deployment location in the vicinity of a target tissue, at process 1052 a tool (e.g., a tool 106 such as a needle) is extended from a working channel of the imaging probe.
At process 1054, a controller in communication with the imaging array (e.g., imaging array 114) of the imaging probe selects a subset of imaging elements (e.g., imaging elements 118) of the imaging array for performing localization of a localization sensor (e.g., sensor 116) and, in turn, a distal tip (e.g., distal tip 105) of the tool by a triangulation process. In some examples, the subset of imaging elements may be a default subset of imaging elements used for all localization processes, or the subset may be selected based on properties or characteristics of the current procedure, for example, a type of tool being used, a flexibility of the tool, a distance by which the tool has been extended from the imaging probe, a type of tissue in which the target tissue is disposed, etc. If a tool is sufficiently rigid or the tissue is sufficiently soft, it may be acceptable to select a subset of only two imaging elements to determine the position of the localization sensor and distal tip. On the other hand, if the tool is substantially flexible, at least three imaging elements may be needed to accurately determine the position of the localization sensor and distal tip. With reference to
At process 1056, the imaging elements of the selected subset are activated by the controller. The selected imaging elements may be activated sequentially in an order determined by the controller or in a default order. During process 1056, a first of the selected imaging elements may be “fired” to transmit an acoustic wave through the tissue between the imaging array and the localization sensor (e.g., along travel paths 128a-128c of
Alternatively, such as when the localization sensor comprises a CMUT, the controller may instruct the localization sensor to transmit a signal and may activate the selected imaging elements to detect the signal. A time at which the signal is transmitted may be recorded by the controller, as well as the time at which each respective imaging element or localization element received the signal.
Regardless of the direction in which the signal is sent between the imaging elements and the localization sensor, at a process 1058 the controller determines the position of the localization sensor with respect to the imaging array by triangulation using the time of travel of the respective signal between the selected imaging elements and the localization sensor. When the type of tissue through which the respective signals are transmitted is uniform, a longer time of travel indicates a longer distance between the localization sensor and an imaging element, while a shorter time of travel indicates a shorter distance. When different types of tissue are disposed along different travel paths, the controller may adjust the recorded travel times to compensate for the different signal speeds through the different tissues. Using known properties regarding the speed at which a signal travels through the particular tissue(s), the controller may convert the travel times into distances. With the distances along each travel path determined, the controller can determine the position of the localization sensor and, in turn, the position of the distal tip. In some examples, the controller may determine the position of the distal tip using a known fixed relationship between the distal tip and the localization sensor. In other examples, the localization sensor may be disposed sufficiently close to the distal tip that the position of the distal tip may be assumed to be the position of the localization sensor.
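The time-of-flight triangulation at process 1058 can be sketched numerically. In this illustrative Python sketch (the function name, array layout, and nominal sound speed are assumptions, not from the disclosure), each travel time is converted to a distance and the sphere equations are linearized into a least-squares problem for the sensor position; a uniform medium is assumed, with any per-path tissue corrections applied to the travel times beforehand.

```python
import numpy as np

SPEED_OF_SOUND_TISSUE = 1540.0  # m/s, a nominal value for soft tissue

def trilaterate(element_positions, travel_times, c=SPEED_OF_SOUND_TISSUE):
    """Estimate the localization-sensor position from one-way travel times.

    Each travel time is converted to a distance d_i = c * t_i, and the
    sphere equations |x - p_i|^2 = d_i^2 are linearized by subtracting
    the first equation, yielding a linear least-squares problem for x.
    """
    p = np.asarray(element_positions, dtype=float)
    d = c * np.asarray(travel_times, dtype=float)
    # Subtracting equation 0 from equations 1..n-1 removes the |x|^2 term:
    #   2 (p_i - p_0) . x = |p_i|^2 - |p_0|^2 - d_i^2 + d_0^2
    A = 2.0 * (p[1:] - p[0])
    b = (np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2)
         - d[1:] ** 2 + d[0] ** 2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```

With four non-coplanar elements the solution is unique; with only two or three elements (as in the rigid-tool case above) the remaining ambiguity would be resolved by the known geometry of the working channel.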
With the position of the localization sensor and/or distal tip having been determined, at process 1060 the controller selects an imaging plane (e.g., imaging plane 110c of
At a process 1062, the controller identifies the imaging elements of the imaging array that are to be used to generate an image of the selected imaging plane. The identified imaging elements may include all of the imaging elements of a particular array or a subset thereof. For example, a single column of imaging elements extending across the imaging array may be identified (e.g., column 122a including imaging elements 118a-118f of
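The element-identification step at process 1062 can be sketched for an array whose columns each span a distinct imaging plane. This is an illustrative sketch under an assumed parameterization (columns fanning out at known angles); the function name and angle representation are hypothetical.

```python
def select_column_for_plane(num_columns, column_angles_deg, tip_angle_deg):
    """Pick the array column whose imaging plane best covers the tool tip.

    Hypothetical parameterization: each column's plane is indexed by an
    angle, and the column whose angle is nearest the azimuth of the tool
    tip is identified. Only that column's elements would then be fired
    to form the image.
    """
    return min(range(num_columns),
               key=lambda i: abs(column_angles_deg[i] - tip_angle_deg))
```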
If the generated image is determined to be unsatisfactory, for example if only a small portion of the tool or target tissue is displayed, the controller may select a different subset of imaging elements. This selection of a different subset may be initiated automatically based on processing of the image data or in response to a user input.
Optionally, if the generated image indicates the tool has veered away from the target tissue or toward tissue to be avoided, the method 1050 may further include retracting the tool, repositioning the imaging probe, and repeating the processes 1052-1062. If the generated image indicates the tool is on an acceptable path to the target tissue, the processes 1054-1062 may be repeated.
In some examples, the techniques of this disclosure, such as those discussed in relation to
Robot-assisted medical system 1100 also includes a display system 1110 (which may display image 100 of
In some examples, medical instrument system 1104 may include components for use in surgery, biopsy, ablation, illumination, irrigation, or suction. Medical instrument system 1104, together with sensor system 1108, may be used to gather (i.e., measure) a set of data points corresponding to locations within anatomic passageways of a patient, such as patient P. In some examples, medical instrument system 1104 may include components of the endoscopic imaging system 1109, which may include an imaging scope assembly or imaging instrument (such as imaging probe 104) that records a concurrent or real-time image of a surgical site and provides the image to the operator (e.g., operator O) through the display system 1110. The concurrent image may be, for example, a two- or three-dimensional image captured by an imaging instrument positioned within the surgical site. In some examples, the endoscopic imaging system components may be integrally or removably coupled to medical instrument system 1104. However, in some examples, a separate endoscope, attached to a separate manipulator assembly, may be used with medical instrument system 1104 to image the surgical site. The endoscopic imaging system 1109 may be implemented as hardware, firmware, software, or a combination thereof which interact with or are otherwise executed by one or more computer processors, which may include the processors of the control system 1112.
The sensor system 1108 may include a position/location sensor system (e.g., an electromagnetic (EM) sensor system) and/or a shape sensor system for determining the position, orientation, speed, velocity, pose, and/or shape of the medical instrument system 1104.
Robot-assisted medical system 1100 may also include control system 1112. Control system 1112 includes at least one memory 1116 and at least one computer processor 1114 for effecting control between medical instrument system 1104, master assembly 1106, sensor system 1108, endoscopic imaging system 1109, intra-operative imaging system 1118, and display system 1110. Control system 1112 (which may include a controller in operative communication with the imaging array 114 and/or localization sensor 116) also includes programmed instructions (e.g., a non-transitory machine-readable medium storing the instructions) to implement some or all of the methods described in accordance with aspects disclosed herein, including instructions for providing information to display system 1110.
Control system 1112 may further include a virtual visualization system to provide navigation assistance to operator O when controlling medical instrument system 1104 during an image-guided surgical procedure. Virtual navigation using the virtual visualization system may be based upon reference to an acquired pre-operative or intra-operative dataset of anatomic passageways. The virtual visualization system processes images of the surgical site imaged using imaging technology such as computerized tomography (CT), magnetic resonance imaging (MRI), fluoroscopy, thermography, ultrasound, optical coherence tomography (OCT), thermal imaging, impedance imaging, laser imaging, nanotube X-ray imaging, and/or the like.
An intra-operative imaging system 1118 may be arranged in the surgical environment 1101 near the patient P to obtain images of the anatomy of the patient P during a medical procedure. The intra-operative imaging system 1118 may provide real-time or near real-time images of the patient P. In some examples, the intra-operative imaging system 1118 may comprise an ultrasound imaging system for generating two-dimensional and/or three-dimensional images. For example, the intra-operative imaging system 1118 may be at least partially incorporated into an imaging probe such as imaging probe 104. In this regard, the intra-operative imaging system 1118 may be partially or fully incorporated into the medical instrument system 1104.
As shown in
Elongate instrument 1202 includes a channel (not shown) sized and shaped to receive a medical instrument 1210. In some examples, medical instrument 1210 may be used for procedures such as surgery, biopsy, ablation, illumination, irrigation, or suction. Medical instrument 1210 can be deployed through elongate instrument 1202 and used at a target location within the anatomy. In an example in which elongate instrument 1202 comprises a catheter (e.g., catheter 124), medical instrument 1210 may include, for example, an imaging probe. In an example in which elongate instrument 1202 comprises an imaging probe (e.g., imaging probe 104), medical instrument 1210 may include a biopsy instrument, laser ablation fiber, and/or other surgical, diagnostic, or therapeutic tool. Medical instrument 1210 may be advanced from the distal end 1218 of the elongate instrument 1202 to perform the procedure and then retracted back into the channel when the procedure is complete. Medical instrument 1210 may be removed from the proximal end of elongate instrument 1202 or from another instrument port (not shown) along elongate instrument 1202.
Elongate instrument 1202 may also house cables, linkages, or other steering controls (not shown) to controllably bend distal end 1218. In some examples, at least four cables are used to provide independent “up-down” steering to control a pitch of distal end 1218 and “left-right” steering to control a yaw of distal end 1218.
A position measuring device 1220 provides information about the position of instrument body 1212 as it moves on insertion stage 1208 along an insertion axis A. Position measuring device 1220 may include resolvers, encoders, potentiometers, and/or other sensors that determine the rotation and/or orientation of the actuators controlling the motion of instrument carriage 1206 and consequently the motion of instrument body 1212. In some examples, insertion stage 1208 is linear, while in other examples, the insertion stage 1208 may be curved or have a combination of curved and linear sections.
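The conversion performed by a position measuring device such as device 1220 can be sketched for the rotary-encoder case. This is an illustrative sketch only; the actual sensors, gearing, and calibration are device-specific, and the function name and parameters are hypothetical.

```python
def insertion_distance_mm(encoder_counts, counts_per_rev, mm_per_rev):
    """Convert actuator encoder counts into insertion travel along axis A.

    Illustrative conversion: rotary encoder counts on the insertion
    actuator map to linear carriage travel through the drive's
    millimeters-per-revolution ratio.
    """
    return encoder_counts / counts_per_rev * mm_per_rev
```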
Advantages of the present disclosure include direct visualization that confirms a tool, such as a needle, is being inserted into a target tissue and is avoiding a hazard such as vasculature, even as the tool bends out of an imaging plane. In this regard, if the tip of a needle veers out of an imaging plane during insertion, a different imaging plane can be selected and a new image can be generated. Further, the systems and methods described herein provide assurance that a portion of a tool that is visible in an image is the distal tip or is at least near the distal tip. Moreover, the described techniques provide reduced image processing latency. Rather than firing all of the imaging elements through all imaging planes in an attempt to locate the distal tip of the tool, the position is determined and only the imaging elements needed to produce the optimal imaging plane may be fired. It should also be appreciated that annular imaging arrays for a forward-facing approach may provide direct alignment of the imaging probe to the target tissue, and there may be less frictional resistance to deployment of the tool as compared to a side-facing approach in which the tool may be substantially bent to exit through a side port.
In the description, specific details have been set forth describing some examples. Numerous specific details are set forth in order to provide a thorough understanding of the examples. It will be apparent, however, to one skilled in the art that some examples may be practiced without some or all of these specific details. The specific examples disclosed herein are meant to be illustrative but not limiting. One skilled in the art may realize other elements that, although not specifically described here, are within the scope and the spirit of this disclosure.
Elements described in detail with reference to one example, implementation, or application optionally may be included, whenever practical, in other examples, implementations, or applications in which they are not specifically shown or described. For example, if an element is described in detail with reference to one example and is not described with reference to a second example, the element may nevertheless be claimed as included in the second example. Thus, to avoid unnecessary repetition in the following description, one or more elements shown and described in association with one example, implementation, or application may be incorporated into other examples, implementations, or aspects unless specifically described otherwise, unless the one or more elements would make an example or implementation non-functional, or unless two or more of the elements provide conflicting functions.
Any alterations and further modifications to the described devices, instruments, methods, and any further application of the principles of the present disclosure are fully contemplated as would normally occur to one skilled in the art to which the disclosure relates. In particular, it is fully contemplated that the features, components, and/or steps described with respect to one example may be combined with the features, components, and/or steps described with respect to other examples of the present disclosure. In addition, dimensions provided herein are for specific examples and it is contemplated that different sizes, dimensions, and/or ratios may be utilized to implement the concepts of the present disclosure. To avoid needless descriptive repetition, one or more components or actions described in accordance with one illustrative example can be used or omitted as applicable from other illustrative examples. For the sake of brevity, the numerous iterations of these combinations will not be described separately. For simplicity, in some instances the same reference numbers are used throughout the drawings to refer to the same or like parts.
While some examples are provided herein with respect to medical procedures, any reference to medical or surgical instruments and medical or surgical methods is non-limiting. For example, the instruments, systems, and methods described herein may be used for non-medical purposes including industrial uses, general robotic uses, and sensing or manipulating non-tissue work pieces. Other example applications involve cosmetic improvements, imaging of human or animal anatomy, gathering data from human or animal anatomy, and training medical or non-medical personnel. Additional example applications include use for procedures on tissue removed from human or animal anatomies (without return to a human or animal anatomy) and performing procedures on human or animal cadavers. Further, these techniques can also be used for surgical and nonsurgical medical treatment or diagnosis procedures.
The methods described herein are illustrated as a set of operations or processes. Not all the illustrated processes may be performed in all examples of the methods. Additionally, one or more processes that are not expressly illustrated or described may be included before, after, in between, or as part of the example processes. In some examples, one or more of the processes may be performed by the control system (e.g., control system 1112) or may be implemented, at least in part, in the form of executable code stored on non-transitory, tangible, machine-readable media that when run by one or more processors (e.g., the processors 1114 of control system 1112) may cause the one or more processors to perform one or more of the processes.
Any described “imaging device” herein may include an ultrasound array, optical imaging device, or any other suitable imaging hardware. Any described “imaging probe” may include an ultrasound probe, an optical imaging probe, or a probe incorporating any other suitable imaging modality. Additionally, any “ultrasound array,” “imaging array,” or “imaging device” as described herein may comprise a single imaging element (e.g., transducer) or a plurality of such devices.
One or more elements in examples of this disclosure may be implemented in software to execute on a processor of a computer system such as control processing system. When implemented in software, the elements of the examples of the invention are essentially the code segments to perform the necessary tasks. The program or code segments can be stored in a processor readable storage medium or device that may have been downloaded by way of a computer data signal embodied in a carrier wave over a transmission medium or a communication link. The processor readable storage device may include any medium that can store information including an optical medium, semiconductor medium, and magnetic medium. Processor readable storage device examples include an electronic circuit, a semiconductor device, a semiconductor memory device, a read-only memory (ROM), a flash memory, an erasable programmable read-only memory (EPROM), a floppy diskette, a CD-ROM, an optical disk, a hard disk, or another storage device. The code segments may be downloaded via computer networks such as the Internet, Intranet, etc. Any of a wide variety of centralized or distributed data processing architectures may be employed. Programmed instructions may be implemented as a number of separate programs or subroutines, or they may be integrated into a number of other aspects of the systems described herein. In one example, the control system supports wireless communication protocols such as Bluetooth, IrDA, HomeRF, IEEE 802.11, DECT, and Wireless Telemetry.
Note that the processes and displays presented may not inherently be related to any particular computer or other apparatus. Various general-purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct a more specialized apparatus to perform the operations described. The required structure for a variety of these systems will appear as elements in the claims. In addition, the examples of the invention are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the invention as described herein.
In some instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the examples. This disclosure describes various instruments, portions of instruments, and anatomic structures in terms of their state in three-dimensional space. As used herein, the term “position” refers to the location of an object or a portion of an object in a three-dimensional space (e.g., three degrees of translational freedom along Cartesian x-, y-, and z-coordinates). As used herein, the term “orientation” refers to the rotational placement of an object or a portion of an object (three degrees of rotational freedom—e.g., roll, pitch, and yaw). As used herein, the term “pose” refers to the position of an object or a portion of an object in at least one degree of translational freedom and to the orientation of that object or portion of the object in at least one degree of rotational freedom (up to six total degrees of freedom). As used herein, the term “shape” refers to a set of poses, positions, or orientations measured along an object.
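The definitions above can be summarized in a small data structure. This representation is hypothetical and only illustrates the terminology: three translational degrees of freedom plus three rotational ones give a pose of up to six degrees of freedom, and a shape is a sequence of such poses measured along an instrument.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Pose = position (x, y, z) plus orientation (roll, pitch, yaw).

    Illustrative representation of the definitions above: up to six
    total degrees of freedom per pose.
    """
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    roll: float = 0.0
    pitch: float = 0.0
    yaw: float = 0.0

# A "shape" is then a list of Pose samples along the instrument.
Shape = list
```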
While certain exemplary examples of the invention have been described and shown in the accompanying drawings, it is to be understood that such examples are merely illustrative of and not restrictive on the broad invention, and that the examples of the invention are not limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art.
This application claims priority to and benefit of U.S. Provisional Application No. 63/325,082, filed Mar. 29, 2022 and entitled “Needle Sensor Derived Image Plane,” which is incorporated by reference herein in its entirety.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2023/065060 | 3/28/2023 | WO |

Number | Date | Country
---|---|---
63325082 | Mar 2022 | US