Vascular passages are defined by delicate vascular walls. Endovascular devices such as guidewires and catheters are passed through vascular passages in endovascular interventional medical procedures, and present a constant risk of perforating the delicate vascular walls. In some endovascular interventional medical procedures, endovascular devices are navigated to treatment sites. A guidewire may be inserted endovascularly before a coaxial catheter is endovascularly guided along the guidewire to the treatment site. Medical imaging such as 2D perspective fluoroscopy may be used to provide image feedback to a skilled professional to assist in guiding the endovascular devices.
A typical endovascular interventional medical procedure using endovascular devices involves a set of discrete maneuvers. Each discrete maneuver may involve coordinating the motion of two endovascular devices to first deposit the guidewire in the target vascular branch, and then to advance the catheter over the guidewire.
Precise manipulation of multiple coaxial endovascular devices such as guidewire/catheter combinations requires superb image interpretation ability and hand-eye coordination, both of which are typically the product of years of experience. Experienced professionals obtain these skills through observation and trial and error over many years. However, endovascular navigation is not routine for novices, and presents difficulties even for experienced professionals in challenging cases involving, for example, tortuous anatomy. Skilled professionals are sometimes unavailable for emergencies such as stroke treatments, and the absence of required skills presents heightened risks of perforating the delicate vascular walls, which may sometimes prove fatal.
A first aspect of the invention is a controller for controlling a robotic device according to claim 1, with optional technical features of claims 2 to 21 each considered alone or in combination.
A second aspect of the invention is a system for controlling a robotic device according to claim 22, optionally comprising the subject-matter of claim 23.
A third aspect of the invention is a method for controlling a robotic device according to claim 24, with optional technical features of claims 25 to 42 each considered alone or in combination.
According to a more detailed aspect of the present disclosure, a robotic device is configured to operate an interventional device for insertion into an anatomical structure of a subject. The interventional device comprises an outer device and an inner device movably positioned within the outer device. A method for controlling the robotic device includes receiving, from an imaging device, image data from an image of a portion of the interventional device and a branched intersection of a plurality of branches of the anatomical structure. The plurality of branches includes a main branch and a target branch which is branched from the main branch. The method also includes analyzing the image data to measure at least one of a location or an orientation of a distal portion of the outer device and of a distal portion of the inner device in the image, with respect to vasculature including the main branch and the target branch and an intersection of the main branch and the target branch in the image. The method further includes determining, based on analyzing the image data, a path for the interventional device through the anatomical structure using a plurality of modules of predefined motions stored in a memory; and controlling, based on determining the path for the interventional device, the robotic device to operate the interventional device through the path by a sequence of the predefined motions. The method further includes displaying the path on a display.
According to another aspect of the present disclosure, a robotic device is configured to operate an interventional device for insertion into an anatomical structure of a subject. The interventional device comprises an outer device and an inner device movably positioned within the outer device. A system for controlling the robotic device includes an imaging interface, a user interface, a robotic device, a robotic device controller and a display. The imaging interface is configured to receive image data corresponding to an image from an imaging device. The image includes a portion of the interventional device and a branched intersection of a plurality of branches of the anatomical structure. The plurality of branches includes a main branch and a target branch which is branched from the main branch. The user interface is configured to receive input from a user. The robotic device is connected to the interventional device for operating the interventional device for insertion into the anatomical structure of the subject. The robotic device controller comprises a memory that stores instructions and a plurality of modules of predefined motions, and a processor that executes the instructions. When the instructions are executed by the processor, the robotic device controller is configured to analyze the image data to measure at least one of a location or an orientation of a distal portion of the outer device and of a distal portion of the inner device in the image, with respect to vasculature including the main branch and the target branch and an intersection of the main branch and the target branch in the image. The robotic device controller is also configured to determine, based on analyzing the image data, a path for the interventional device through the anatomical structure using modules of predefined motions stored in the memory. The robotic device controller is also configured to control the robotic device in response to the input from the user interface and based on determining the path for the interventional device, to operate the interventional device through the path by a sequence of the predefined motions. The display is configured to display at least the branched intersection of the plurality of branches of the anatomical structure, the path, the distal portion of the outer device and the distal portion of the inner device, while the robotic device controller controls the robotic device.
According to another aspect of the present disclosure, a robotic device is configured to operate an interventional device for insertion into an anatomical structure of a subject. The interventional device comprises an outer device and an inner device movably positioned within the outer device. A controller for controlling the robotic device includes a memory that stores instructions and a library of a plurality of modules of predefined motions for navigating through anatomical structures, and a processor that executes the instructions. When executed by the processor, the instructions cause the controller to receive, from an imaging device, image data from an image of a portion of the interventional device and a branched intersection of a plurality of branches of the anatomical structure. The plurality of branches includes a main branch and a target branch which is branched from the main branch. The instructions also cause the controller to analyze the image data to measure at least one of a location or an orientation of a distal portion of the outer device and of a distal portion of the inner device in the image, with respect to vasculature including the main branch and the target branch and an intersection of the main branch and the target branch in the image. The instructions also cause the controller to determine, based on analyzing the image data, a path for the interventional device through the anatomical structure using a plurality of modules of predefined motions stored in the memory. The instructions further cause the controller to control, based on a plurality of the predefined motions, the robotic device to operate the interventional device through the path by a sequence of the predefined motions. The instructions also cause the controller to display the path on a display.
In particular, an optional embodiment provides that the instructions further cause the controller to display a representation of different predefined motions as a part or the entirety of the sequence. Another optional embodiment provides a method further comprising displaying at least a portion of the path on a display (180) such that the predefined motions, as a part or the entirety of the sequence, are represented.
The example embodiments are best understood from the following detailed description when read with the accompanying drawing figures. It is emphasized that the various features are not necessarily drawn to scale. In fact, the dimensions may be arbitrarily increased or decreased for clarity of discussion. Wherever applicable and practical, like reference numerals refer to like elements.
In the following detailed description, for the purposes of explanation and not limitation, representative embodiments disclosing specific details are set forth in order to provide a thorough understanding of an embodiment according to the present teachings. Descriptions of known systems, devices, materials, methods of operation and methods of manufacture may be omitted so as to avoid obscuring the description of the representative embodiments. Nonetheless, systems, devices, materials and methods that are within the purview of one of ordinary skill in the art are within the scope of the present teachings and may be used in accordance with the representative embodiments. It is to be understood that the terminology used herein is for purposes of describing particular embodiments only and is not intended to be limiting. The defined terms are in addition to the technical and scientific meanings of the defined terms as commonly understood and accepted in the technical field of the present teachings.
It will be understood that, although the terms first, second, third etc. may be used herein to describe various elements or components, these elements or components should not be limited by these terms. These terms are only used to distinguish one element or component from another element or component. Thus, a first element or component discussed below could be termed a second element or component without departing from the teachings of the inventive concept.
The terminology used herein is for purposes of describing particular embodiments only and is not intended to be limiting. As used in the specification and appended claims, the singular forms of terms ‘a’, ‘an’ and ‘the’ are intended to include both singular and plural forms, unless the context clearly dictates otherwise. Additionally, the terms “comprises” and/or “comprising”, and similar terms, when used in this specification, specify the presence of stated features, elements, and/or components, but do not preclude the presence or addition of one or more other features, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
Unless otherwise noted, when an element or component is said to be “connected to”, “coupled to”, or “adjacent to” another element or component, it will be understood that the element or component can be directly connected or coupled to the other element or component, or intervening elements or components may be present. That is, these and similar terms encompass cases where one or more intermediate elements or components may be employed to connect two elements or components. However, when an element or component is said to be “directly connected” to another element or component, this encompasses only cases where the two elements or components are connected to each other without any intermediate or intervening elements or components.
The present disclosure, through one or more of its various aspects, embodiments and/or specific features or sub-components, is thus intended to bring out one or more of the advantages as specifically noted below. For purposes of explanation and not limitation, example embodiments disclosing specific details are set forth in order to provide a thorough understanding of an embodiment according to the present teachings. However, other embodiments consistent with the present disclosure that depart from specific details disclosed herein remain within the scope of the appended claims. Moreover, descriptions of well-known apparatuses and methods may be omitted so as to not obscure the description of the example embodiments. Such methods and apparatuses are within the scope of the present disclosure.
As described herein, fluoroscopic imaging may be synchronized with servo control of endovascular navigation devices to guide the endovascular devices to anatomical targets.
The system 100 includes the interventional devices 101, a controller 150, a robotic device 160, an imaging system 170 and a display 180.
The controller 150 is described in further detail below.
The controller 150 is configured to analyze images from the imaging system 170 and parametrize features of the interventional devices 101 in the images from the imaging system 170. For example, the controller 150 may use artificial intelligence (AI) to segment the interventional devices 101 in images. The controller 150 is also configured to detect operator input for a control mode. The controller 150 may create one or more goal reference metrics relative to one of the parameterized features of the interventional devices 101, or different goal reference metrics for several of the parameterized features of the interventional devices 101. The controller 150 may servo-drive the robotic device 160 so that a metric of a parameterized feature is minimized (or maximized) in the next image, or at least reduced (or increased) in the next image. The controller 150 may stop driving the robotic device 160 when the metric is within a tolerance value or the operator deactivates control. An example of a metric is 5 pixels, such as when a fluoroscopic image shows 5 pixels or more of the exposed inner device (the guidewire) protruding beyond the tip of the outer device (the catheter).
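The servo behavior described above can be summarized as a simple feedback loop. The following is a minimal sketch, not the actual implementation: it assumes the imaging and robot interfaces are reduced to three callables (`read_metric`, `send_velocity`, `stop`), all of which are hypothetical names.

```python
def servo_until_goal(read_metric, send_velocity, stop,
                     goal=5.0, tolerance=1.0, gain=0.2, max_iter=200):
    """Drive the robot until the image-derived metric (e.g., exposed
    guidewire length in pixels) is within tolerance of the goal."""
    for _ in range(max_iter):
        error = read_metric() - goal      # metric from the latest frame
        if abs(error) <= tolerance:
            stop()                        # goal reached: stop driving
            return True
        send_velocity(-gain * error)      # proportional velocity command
    stop()                                # give up after max_iter frames
    return False

# Toy usage with a simulated 1-D exposed length changed by the command.
state = {"exposed_px": 12.0}
converged = servo_until_goal(
    read_metric=lambda: state["exposed_px"],
    send_velocity=lambda v: state.update(exposed_px=state["exposed_px"] + v),
    stop=lambda: None,
)
print(converged, round(state["exposed_px"], 2))  # True, close to 5 px
```

In practice the metric would be recomputed from a fresh fluoroscopic frame on each iteration, so the loop rate is bounded by the imaging frame rate.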
The robotic device 160 is controlled by the controller 150 to drive the interventional devices 101. The robotic device 160 is configured to drive the interventional devices in one or more degrees of freedom, such as in three dimensions and about one or more axes, for one or both of the interventional devices 101. The robotic device 160 may include a servo motor used to drive the interventional devices 101 under the control of the controller 150, and based on fluoroscopic feedback from the imaging system 170.
The imaging system 170 may be a fluoroscopic imaging system that captures fluoroscopic images of anatomy of a subject and the interventional devices 101 as the interventional devices 101 are inserted into the anatomy of the subject. The imaging system 170 may image the patient and the interventional devices 101 and may be movable directly by a user or under the control of the user. The imaging system 170 may be an interventional X-ray imaging system. An interventional X-ray imaging system may include an X-ray tube adapted to generate X-rays and an X-ray detector configured to acquire time-series X-ray images such as fluoroscopy images. Examples of such X-ray imaging systems include digital radiography-fluoroscopy systems such as ProxiDiagnost from Philips, fixed C-arm X-ray systems such as Azurion from Philips, and mobile C-arm X-ray systems such as Veradius from Philips.
The display 180 may be local to the controller 150 or may be remotely connected to the controller 150 via a standard web interface. The display 180 may be connected to the controller 150 via a local wired interface such as an Ethernet cable or via a local wireless interface such as a Wi-Fi connection. The display 180 is configured to display imaging content from the fluoroscopic images from the imaging system 170, along with supplementary depictions of the interventional devices 101 in the fluoroscopic images and one or more predefined motions used to generate a path through branches in the anatomy of the subject. In some embodiments, predefined motions used to generate a path through branches in the anatomy of the subject are not necessarily displayed, and the robotic device 160 controls the interventional devices 101 based on standing instructions. The display 180 may also be configured to display data and visual representations of a target anatomical structure, and data and visual representations of the interventional devices 101 relative to the target anatomical structure. The display 180 may be interfaced with other user input devices by which users can input instructions, including mice, keyboards, thumbwheels and so on.
In some embodiments, the display 180 may display motions for the interventional device 101 to travel along a suggested path to a target vessel. The suggested path through the anatomical structure, shown on the display 180 as a sequence of the motions, may be highlighted for confirmation by a user, so that the user may confirm the suggested path before the robotic device 160 controls the interventional device 101.
The display 180 may be a monitor such as a computer monitor, a display on a mobile device, an augmented reality display, a television, an electronic whiteboard, or another screen configured to display electronic imagery. The display 180 may also include one or more input interface(s) such as those noted above that may connect other elements or components to the controller 150, as well as an interactive touch screen configured to display prompts to users and collect touch input from users.
The controller 150 includes a memory 151, a processor 152, a first interface 156, a second interface 157, a third interface 158, and a fourth interface 159. The memory 151 stores instructions which are executed by the processor 152. The memory 151 also stores a library of controlling tools related to specific motions of the interventional devices 101. The controlling tools in the library stored in the memory 151 may comprise instructions for alignment, retraction, rotation, advancement or projections based on expected motions/maneuvers. The processor 152 executes the instructions. The processor 152 may execute instructions to measure distances and/or orientations of the interventional devices 101 in the images and to parametrize the features of the interventional devices 101 in images. The analysis and parameterization by the processor may be performed based on the branched anatomy surrounding the interventional devices 101 in the images, along with a predefined target in the anatomy in the images such as a target branch or intersection of branches in the images. The processor 152 may also process direct encoder information from the robotic device 160, or from the interventional devices 101, such as from shape-sensing sensors/devices which sense the shape of the interventional devices 101.
The interfaces of the controller include a first interface 156 to the robotic device 160, a second interface 157 to the imaging system 170, a third interface 158 to the display 180, and a fourth interface 159 to a user. The first interface 156, the second interface 157 and the third interface 158 may include ports, disk drives, wireless antennas, or other types of receiver circuitry. The first interface 156 may be a data interface that receives data from the robotic device 160 and that provides instructions to the robotic device 160. The second interface 157 may be an image interface that receives data of images and of the identified interventional devices 101 in the images from the imaging system 170. The third interface 158 may be a data interface and an image interface that provides data and images to the display 180. The fourth interface 159 may include one or more user interfaces, such as a mouse, a keyboard, a microphone, a video camera, a gesture tracking system, a touchscreen display, or other forms of interactive user interfaces. The fourth interface 159 is therefore a user interface that receives user inputs, including inputs to set an operation mode for the robotic device 160 and inputs to make selections such as a selection of a predefined motion among multiple selectable options provided on the display 180.
The controller 150 is configured to control the interventional devices 101 using fluoroscopic feedback from images from the imaging system 170. The controller 150 may be provided as a stand-alone component, or may be incorporated into a host device such as a workstation.
When the processor 152 executes the instructions in the memory 151, the controller 150 is configured to perform a variety of processes corresponding to specific predetermined motions from a library of predetermined motions stored in the memory 151.
As an initial example, the controller 150 may control the robotic device 160 to advance either or both of the interventional devices 101 to a location. Advancing is technically more complex than retraction due to the potential of interacting with tissue such as the vascular walls. Advancing under the control of the robotic device 160 may be limited to the entrance of a branch, or may be tolerated to a greater extent, such as when advancing inside a main branch rather than a smaller-diameter target branch which is branched from the main branch. Advancing past a target is explained further with respect to S252 below.
As another example, the controller 150 may control the robotic device 160 to align tips of the interventional devices 101 within a predetermined distance range (e.g., 5 pixels) of alignment. Alignment may be performed when a distance between the respective tips of the inner device and the outer device is outside of the predetermined distance range, such as by more than 5 pixels.
In some embodiments, an estimate of physical distances may be used as the metric. For example, detector pixels may be converted to millimeters using standard projection geometry principles and information or a definition of a C-arm geometry. An estimate may be generated based on assuming that the interventional devices 101 are at a particular elevation (between the X-ray detector and the X-ray source) in the projection space (e.g., an iso-center). Aligning is explained with respect to S254 below.
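As a rough illustration of this projection-geometry conversion, the sketch below scales a pixel count by the detector pixel pitch and divides by the geometric magnification implied by an assumed device elevation. All numeric defaults are illustrative assumptions, not values taken from the C-arm systems named above.

```python
def pixels_to_mm(n_pixels, pixel_pitch_mm=0.154,
                 source_to_detector_mm=1200.0, source_to_object_mm=900.0):
    """Estimate a physical length from a pixel count, assuming the devices
    lie at a known elevation (e.g., the iso-center) between the X-ray
    source and the detector.  All defaults are illustrative."""
    magnification = source_to_detector_mm / source_to_object_mm
    return n_pixels * pixel_pitch_mm / magnification

print(round(pixels_to_mm(5), 2))  # a 5-pixel metric is roughly 0.58 mm here
```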
As another example, the controller 150 may control the robotic device 160 to rotate the interventional devices 101 once the tips are aligned, such as when both of the interventional devices 101 are located in an intersection between at least two branches but are pointing to the wrong branch. Rotating is explained further with respect to S256 below.
As yet another example, the controller 150 may also control the robotic device 160 to retract the two interventional devices 101 once the tips are aligned. The interventional devices 101 may be retracted to a target point such as to an intersection between three branches. The fourth interface 159 may be a thumbwheel user interface used to allow the user to indicate the target point with a marking and direct the robotic device 160. Retracting is explained further with respect to S258 below.
As a further example, the inner device or the outer device among the interventional devices 101 may be controlled by the robotic device 160 to rotate alone, separately and independently of the other of the outer device or the inner device. The inner device or the outer device may be rotated to align the curvature of the ends of the interventional devices 101 based on the tangents of the tips. Separately rotating is explained further with respect to S260 below.
In some embodiments, a user alert may be issued to the user via a visual guidance system, and the user may be prompted to rotate one of the inner device or the outer device. Alternatively, the user may be provided guidance as to the direction in which the robotic device 160 is moving the interventional devices 101, and where the interventional devices 101 will be stopped by the robotic device 160 based on the controller 150 automatically interpreting images and the state of the robotic device 160.
As another example, the inner device (the guidewire) of the interventional devices 101 may be advanced by the robotic device 160 to a target point. Advancing the inner device (the guidewire) to a target point is explained with respect to S262 below.
The controller 150 may also advance the inner device by a distance past the outer device, while keeping the outer device static. The controller 150 may servo-control the robotic device 160 to retract the outer device by a distance relative to the inner device, keeping the inner device static. The controller 150 can also retract the inner device relative to the outer device. The controller 150 may actively advance and retract the inner device relative to the initial position of the tip of the inner device in the image. In this way, the controller 150 may anchor the inner device to the vessel/branch. Advancing the inner device past the outer device may be performed when the controller 150 is retracting or rotating the outer device.
The controller 150 may also provide suggestions on the display 180 to show the user expected motions in a graphical sequence. The motions may be suggested when a list of one or more common motions/maneuvers is presented to a physician based on, for example, geometry of a specified vessel, definition of a target, the specific type(s) of the interventional devices 101, and so on. The motions may be suggested using a trained prediction model using past sequences of motions and the shape of the anatomy of the current subject, as well as the location of the target point in the anatomy of the current subject.
The processor 152 may also provide servo-commands to the robotic device 160 based on the selection of one or more tools either automatically by the controller 150 or based on input from the user via the fourth interface 159. The servo-commands are communicated to the robotic device 160 via the first interface 156.
In one configuration, the overall system includes a user interface/control room 1502, a robot servo controller 1504, a system controller 1506 and an image system controller.
The user interface/control room 1502 is used by a user to input instructions to the overall system. The instructions may be input via a graphical user interface such as the display 180, or may be confirmed via the display 180 as the user inputs the instructions via another user input such as a mouse and/or keyboard.
The robot servo controller 1504 receives commands from the system controller 1506, and controls the robotic device 160 based on the commands from the system controller 1506.
The system controller 1506 is the closest functional element to the controller 150 described above, and coordinates the robot servo controller 1504 and the image system controller.
The image system controller is used to control the imaging system 170 based on instructions from the system controller 1506.
The method begins at S210, where image data corresponding to an image of a portion of the interventional devices 101 and a branched intersection of a plurality of branches of the anatomical structure is received from the imaging system 170.
At S220, the image data is analyzed. The image data may be analyzed by the controller 150, and specifically by the processor 152 executing instructions from the memory 151 to analyze the image data received at S210. The analyzing at S220 may include segmenting the interventional devices 101 in the image, and/or analyzing the image data to measure geometric characteristics of the outer device and the inner device in the image, based on the branched intersection including the main branch and the target branch in the image. The geometric characteristics may include dimensions and distances of the interventional devices 101 such as an exposed length of the interventional devices 101 visible in the image, width and/or heights of the interventional devices 101 visible in the image, and dimensions and distances of the plurality of branches. The geometric characteristics may include a location of a distal portion of the outer device and a distal portion of the inner device in the image. The geometric characteristics may also include orientations of the distal portion of the outer device and of the distal portion of the inner device in the image. The dimensions, distances and orientations may be parameterized in the analysis at S220, such as based on a count of pixels corresponding to the distances or by angles defining the difference between orientations and predetermined axes. Accordingly, the analyzing at S220 may include measuring at least one of a location or an orientation of a distal portion of the outer device of the interventional devices 101 and of a distal portion of the inner device of the interventional devices 101 in the image, with respect to vasculature including the main branch and the target branch and an intersection of the main branch and the target branch in the image.
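A minimal sketch of this parameterization is shown below, assuming the segmentation step yields each device centerline as ordered pixel coordinates (proximal to distal). The helper names and toy coordinates are hypothetical.

```python
import numpy as np

def tip_location(centerline):
    """Distal tip of a device centerline (ordered proximal -> distal)."""
    return centerline[-1]

def tip_orientation_deg(centerline):
    """Orientation of the distal segment, in degrees from the image x-axis."""
    dx, dy = centerline[-1] - centerline[-2]
    return np.degrees(np.arctan2(dy, dx))

def exposed_length_px(inner, outer):
    """Arc length of the inner-device centerline protruding beyond the
    outer-device tip, approximated as the inner arc length past the
    inner point closest to the outer tip."""
    dists = np.linalg.norm(inner - tip_location(outer), axis=1)
    k = int(np.argmin(dists))
    segs = np.linalg.norm(np.diff(inner[k:], axis=0), axis=1)
    return float(segs.sum())

# Toy centerlines in pixel coordinates (hypothetical segmentation output).
outer = np.array([[0, 0], [10, 0], [20, 0]], dtype=float)
inner = np.array([[0, 0], [10, 0], [20, 0], [25, 0], [28, 2]], dtype=float)
print(round(exposed_length_px(inner, outer), 2))   # 5 + sqrt(13) ~ 8.61 px
print(round(tip_orientation_deg(inner), 1))        # ~ 33.7 degrees
```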
At S230, a path for the interventional devices 101 through the anatomical structure is determined based on the analysis at S220, using the plurality of modules of predefined motions stored in the memory 151.
At S240, the path is displayed on the image. In some embodiments the entire path may be displayed on the image, and in other embodiments less than the entire path may be displayed on the image so that at least a portion of the path is displayed on a display. The path may be superimposed on the branches of the anatomy of the current subject in the images from the imaging system 170 displayed on the display 180. For example, the path may be color-coded to delineate different motions to be made.
At S250, the interventional devices 101 are controlled. The interventional devices 101 may be controlled by the robotic device 160 under instructions from the controller 150, and the control may be one or more of the actions described with respect to S252 through S262 below.
The control at S250 may include direct control by the controller 150, or indirect control such as by defining and sending a command to control the robotic device 160 to operate the interventional devices 101. The control is based on the sequence of motions defining a path and/or by a selection by a user of a displayed predefined motion of the sequence. Individual motions of the sequence may be displayed and made selectable by a user, such as via an interactive touch interface or a mouse or keyboard or voice-command user interface.
The predefined motions that may make up the sequence are detailed in the following steps.
At S252, the method includes advancing either or both of the interventional devices 101 past a target.
At S254, the method includes aligning the tips of the interventional devices 101 within the predetermined distance range.
At S256, the method includes rotating the interventional devices 101 once the tips are aligned.
At S258, the method includes retracting the interventional devices 101 once the tips are aligned.
At S260, the method includes separately rotating the inner device or the outer device independently of the other.
At S262, the method includes advancing the inner device (the guidewire) to a target point.
In another embodiment, the method begins by obtaining a planning image of the anatomy of the subject.
At S225, the interventional devices 101 are located in the planning image, and then segmented and parameterized. The vessel boundaries are segmented from angiography, or the user may define the vessel boundaries directly on the image such as on an interactive display displayed on the display 180.
At S230, the parameterized device is co-registered. The planning image is registered with an image that contains the interventional devices 101. The co-registering may be performed from the same X-ray position. In embodiments using a 3D patient (vessel) model, a 2D to 3D registration may be performed.
At S232, the sequence of motions/maneuvers is computed. The sequence may be computed by the controller 150.
At S233, the next motion/maneuver is selected, and an X-ray image is again taken at S236. At S237, the interventional devices 101 are again segmented and parameterized, and at S238 the parameterization of the interventional devices 101 is co-registered.
At S239, a metric is calculated, and at S251 the robotic device moves the interventional devices 101 via a velocity servo loop.
At S252, a determination is made whether the metric being used is below (or above) the relevant threshold (goal). If the error metric is not below (or above) the relevant threshold (S252=No), the process returns to S237. If the error metric is below (or above) the relevant threshold (S252=Yes), a determination is made at S253 as to whether the last motion/maneuver was the final motion/maneuver. If the last motion/maneuver was the final motion/maneuver (S253=Yes), the process stops or the guidewire is projected at S255. If the last motion/maneuver was not the final motion/maneuver (S253=No), the process returns to S232.
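The maneuver-sequencing loop of S232 through S253 can be sketched as an outer loop over maneuvers and an inner servo loop per maneuver, as below. The tuple layout and helper callables are assumptions for illustration, not the actual implementation.

```python
def run_sequence(maneuvers, servo_step, max_iter=500):
    """Execute maneuvers in order.  Each maneuver is a
    (name, read_metric, goal, tol) tuple; it is complete when its metric
    is within tol of its goal (the S252 threshold check)."""
    for name, read_metric, goal, tol in maneuvers:
        for _ in range(max_iter):
            value = read_metric()          # image -> segment -> parameterize
            if abs(value - goal) <= tol:   # goal reached (S252=Yes)
                break                      # select the next maneuver (S233)
            servo_step(name, value, goal)  # velocity servo move (S251)
        else:
            raise RuntimeError(f"maneuver {name!r} did not converge")

# Toy usage: one "align tips" maneuver on a simulated pixel metric.
state = {"align_px": 9.0}
run_sequence(
    [("align tips", lambda: state["align_px"], 1.0, 0.5)],
    servo_step=lambda name, v, g: state.update(align_px=g + 0.2 * (v - g)),
)
print(round(state["align_px"], 2))  # within 0.5 px of the 1-px goal
```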
In a related process, the motions/maneuvers to be applied are selected and normalized as follows.
At S271, a parametric representation of the interventional devices 101 in the image is obtained. At S272, the parameters for the interventional devices 101 are normalized. For example, image adjustments used for normalizing the parameters may include rotating the interventional devices 101, correcting for pixel size, and so on.
At S275, the motions/maneuvers are loaded. The motions/maneuvers may be loaded by a selection of task at S273 and identification of the type of the interventional devices 101 at S274.
At S276, a selection is made among approaches 1, 2 and 3. At S277, a graphical user interface selection for parameter adjustment is received, and used to identify the parametric representation of the interventional devices 101 at S271.
In some embodiments, a prescribed set of operations may be provided for an endovascular interventional medical procedure.
As one example, a prescribed set of operations may include A catheterize a renal vessel, B cross an aortic aneurysm, C cross a calcific partial occlusion, D catheterize the supra-aortic vessel, and so on. Some of the operations may be automatically suggested based on the location of the interventional devices 101 and the desired target location in the vasculature. For example, if the interventional devices 101 are in the iliac artery and the target location is in the renal artery, operations may be automatically suggested for a given type of the interventional devices 101. As an example, measuring the current arterial vessel width and target arterial vessel width may be taken to imply task A. An example of the arterial vessel width might be 100 pixels, and an example of the target arterial vessel width might be 60 pixels.
The motions described herein may be sequenced to perform the maneuvers in a prescribed set of operations.
The general control process for each maneuver in A through E of the prescribed set of operations follows the pattern described above: an image is obtained, the interventional devices 101 are segmented and parameterized, a metric is calculated, and the robotic device 160 is servo-driven until the metric reaches its goal.
In one servoing example, the exposed length of the guidewire beyond the tip of the catheter is controlled as follows.
At S510, X-ray images are collected. The interventional devices 101 are segmented in each fluoroscopic image at S520. Parametric representations of each of the interventional devices 101 are provided as a centerline in each image.
At S530, the exposed length of the guidewire beyond the tip of the catheter is calculated from the parametric representations, and at S540 the exposed length is compared with a threshold.
If the exposed length of the guidewire is within the threshold (S540=Yes), the maneuver is complete. If the exposed length of the guidewire is not within the threshold (S540=No), the robotic device 160 is driven to reduce the difference and the process returns to S510.
In another servoing example, the distance from the interventional devices 101 to a target point is controlled as follows.
At S610, a new X-ray image is obtained. At S620, the interventional devices 101 are segmented in the image obtained at S610. The parametric representation of the interventional devices 101 is again provided as a device centerline, and the distance from the interventional devices 101 to the target is again calculated at S630 as a distance to target.
In another servoing example, the relative rotation of the interventional devices 101 is controlled to align their curvatures, as follows.
At S710, an X-ray image is obtained, and the interventional devices 101 in the image are segmented at S720. At S730, an angle between the tangents at the distal portions of each of the interventional devices 101 is calculated. At S740, the calculated angle is compared with a target reference angle, and the robotic device 160 rotates one of the interventional devices 101 until the angle is within a tolerance of the target reference angle.
When a curved catheter appears straight in the X-ray image, such that the curve is perpendicular to the X-ray plane, the operator may be prompted to change the X-ray perspective, or rotate the catheter to expose the curvature. When the catheter is straight, the objective may be a straight alignment or mirroring of the guidewire about the catheter. Additionally, the target reference angle may also be tuned to uniquely shaped catheters. In some cases, a more refined alignment is desired, in which case the C-arm of the imaging system 170 is moved to a new pose to provide another perspective for servo alignment.
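A sketch of the angle calculation at S730 follows, assuming 2D centerlines in pixel coordinates; the signed angle between the distal tangents is the metric to be driven toward the target reference angle. The toy coordinates are hypothetical.

```python
import numpy as np

def distal_tangent(centerline):
    """Unit tangent at the distal tip of a centerline (proximal -> distal)."""
    d = centerline[-1] - centerline[-2]
    return d / np.linalg.norm(d)

def tip_angle_deg(inner, outer):
    """Signed angle in degrees between the distal tangents of the inner
    and outer devices (the S730 metric)."""
    ti, to = distal_tangent(inner), distal_tangent(outer)
    cross_z = ti[0] * to[1] - ti[1] * to[0]   # 2D cross product (z-component)
    return np.degrees(np.arctan2(cross_z, ti @ to))

# Toy example: guidewire curving away from a straight catheter tip.
outer = np.array([[0.0, 0.0], [10.0, 0.0]])
inner = np.array([[0.0, 0.0], [10.0, 0.0], [18.0, 6.0]])
print(round(tip_angle_deg(inner, outer), 1))  # ~ -36.9 degrees
```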
In some other embodiments, a guidewire may be actively anchored to a location. For example, when the guidewire is in a side branch while the catheter is in the proximal larger branch, the motion of the catheter may begin to retract the guidewire. This presents a challenge in manual manipulation because even a skilled professional needs to manipulate both of the interventional devices 101 in four degrees of freedom simultaneously with only two hands. Using locations of the interventional devices 101 in images, the robotic device 160 may counteract this by advancing the guidewire relative to the catheter such that the guidewire maintains its depth in the side branch. This is accomplished by active guidewire servoing, with the feedback metric being the location of the tip of the guidewire inside the side vessel. When the tip of the guidewire is retracting due to external forces such as catheter motion, the guidewire may be automatically advanced in proportion to the retraction amount from the initial position. This minimizes the risk of the guidewire escaping the cannulated side branch. In other embodiments, the guidewire may be further extended into the side branch to provide additional stabilization.
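The anchoring feedback can be sketched as a proportional correction on the guidewire axis, where depth is the guidewire tip position measured along the side-branch centerline in the image. The function name, gain and deadband are illustrative assumptions.

```python
def anchoring_correction(initial_depth_px, current_depth_px,
                         gain=1.0, deadband_px=2.0):
    """Return the guidewire advance command (positive = advance) that
    restores the tip depth in the cannulated side branch when external
    forces (e.g., catheter motion) retract it."""
    retraction = initial_depth_px - current_depth_px
    if retraction <= deadband_px:    # small jitter: leave the wire alone
        return 0.0
    return gain * retraction         # proportional re-advance

# Evaluated each frame while the catheter is retracted or rotated:
print(anchoring_correction(40.0, 33.0))  # 7 px of retraction -> advance 7.0
```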
In some other embodiments, a guidewire position in a cannulated vessel may be slipping, and when this is recognized, the movement of the catheter may be stopped automatically and a new maneuver suggested. Additionally, when the catheter is not rotated in the correct orientation to be facing the opposite direction of the cannulated vessel, the incorrect orientation may be detected and communicated to the user along with a suggested correction. Alternatively, the correction may be automatically implemented once the incorrect orientation is detected. As a result, the guidewire may be actively anchored to the vessel by automatically correcting the catheter orientation.
Input from a user may be provided during control, such as to select, confirm or adjust the predefined motions via the fourth interface 159.
Input from a user may also be provided in the form of planning. For example, ring landmarks may be provided in a vessel navigation tool to define which ostia to cannulate. Planning may also be provided in the form of data from prior procedures, such as in interventional oncology where procedures may repeatedly involve the same location in the body.
The controller 150 may provide a servo-command based on the execution or selection of one of the tools described herein.
In embodiments with X-ray biplane imaging, the target reference point can be defined in 3D using triangulation. In this case the tip of the interventional devices 101 is continuously triangulated using epipolar geometry techniques for stereo reconstruction borrowed from computer vision. In cases where the two X-ray imaging sources are not well calibrated, independent target references are used, and each must be met for the goal to be reached. These embodiments may be useful for rotating the interventional devices 101 to align their respective curvatures.
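A standard midpoint triangulation of two back-projection rays, one per X-ray plane, can serve as a sketch of how the tip might be placed in 3D. The geometry values in the example are arbitrary, and real use would require calibrated source and detector poses.

```python
import numpy as np

def triangulate_midpoint(p1, d1, p2, d2):
    """Closest point to two back-projection rays (origin p, direction d),
    one per X-ray plane: a standard stereo midpoint triangulation."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    # Solve for ray parameters minimizing |(p1 + t1*d1) - (p2 + t2*d2)|.
    a = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(p2 - p1) @ d1, (p2 - p1) @ d2])
    t1, t2 = np.linalg.solve(a, b)
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))

# Toy biplane geometry: two views of a tip located at (10, 5, 20).
tip = triangulate_midpoint(
    np.array([0.0, 0.0, 0.0]),   np.array([10.0, 5.0, 20.0]),
    np.array([100.0, 5.0, 20.0]), np.array([-1.0, 0.0, 0.0]))
print(np.round(tip, 2))  # [10.  5. 20.]
```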
In some embodiments, open-loop control may be incorporated. To minimize X-ray use, open-loop control using pre-planned motions can be interleaved with the image feedback maneuvers. Open-loop control may be used initially as a first pass, even when precision requires end-point sensing (e.g., X-ray) feedback. For example, to align ends of the guidewire and catheter, a pre-calibrated wire/catheter model may be used to retract the guidewire so the ends are coincident. However, when the estimated difference is within some tolerance such as 2 cm, the X-ray imaging resumes and is used for servoing to complete the maneuver.
In some embodiments, collimation around a guidewire may be synchronized. Since maneuvers are performed semi-automatically, the area around the interventional devices 101 is the region required for feedback. Synchronization of collimation during servo motions may be performed based on segmentation of the interventional devices 101 that are to be controlled. A bounding box of the distal portion to be controlled may be created. Adjustments of the collimation are performed based on expected motion of the interventional devices 101 per the current maneuver sequence.
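A sketch of deriving the collimation window from the segmented centerline is shown below. The distal-portion length, margin and expected-motion handling are illustrative assumptions.

```python
import numpy as np

def collimation_window(centerline_px, margin_px=30, motion_px=(0, 0)):
    """Bounding box (x0, y0, x1, y1) around the distal portion of the
    segmented device, padded by a margin and stretched along the expected
    motion for the current maneuver."""
    tip_region = centerline_px[-20:]        # distal portion only
    x0, y0 = tip_region.min(axis=0) - margin_px
    x1, y1 = tip_region.max(axis=0) + margin_px
    mx, my = motion_px                      # expected displacement (px)
    return (min(x0, x0 + mx), min(y0, y0 + my),
            max(x1, x1 + mx), max(y1, y1 + my))

# Toy horizontal centerline; expected motion of 15 px in +x.
line = np.column_stack([np.linspace(0, 200, 50), np.full(50, 100.0)])
print(collimation_window(line, motion_px=(15, 0)))
```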
In some embodiments, a guidewire may be loaded to the tip of catheter. For exchanging interventional devices 101 and for roughly calibrating lengths of the interventional devices 101, the guidewire (or any other device) may be advanced after installation in the catheter until the tip becomes visible in the X-ray view. The guidewire may then be retracted until both tips are aligned. The result may serve as a “homing” calibration which can be revisited and which can enable some open-loop coordinate motions between the interventional devices 101. The calibration may be useful in cases when the guidewire is difficult to visualize through an opaque catheter.
In some embodiments, artificial intelligence (AI) methods may be used for reinforcement learning. AI methods may be more tractable if they operate on a lower dimensional space, and this is particularly relevant in Reinforcement Learning techniques where the actions can be limited to a few operations such as the discrete maneuvers described herein (i.e., for output actions for a reinforcement learning agent to reach a goal). Although direct 4-degree-of-freedom control of the robotic device 160 as actions is theoretically possible, a reinforcement learning network may require learning a significant number of micro-steps in order to achieve a particular outcome, which in practice requires a large number of trials. As a result, composing actions of the reinforcement learning agent from relatively higher-level actions is preferable, and is useful in the context of the teachings herein.
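The following sketch illustrates how the discrete maneuvers could define a compact action space for a reinforcement learning agent. The maneuver names and the random stand-in policy are hypothetical.

```python
from enum import Enum, auto
import random

class Maneuver(Enum):
    """Discrete high-level actions built from the maneuver library,
    replacing raw 4-DOF micro-steps as the agent's action space."""
    ADVANCE_BOTH = auto()
    ALIGN_TIPS = auto()
    ROTATE_BOTH = auto()
    RETRACT_BOTH = auto()
    ROTATE_INNER = auto()
    ADVANCE_INNER = auto()

def random_policy(observation):
    """Stand-in policy; a trained agent would map the parameterized image
    state (tip locations, orientations, branch geometry) to a maneuver."""
    return random.choice(list(Maneuver))

print(random_policy(observation={"exposed_px": 5, "angle_deg": 12}))
```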
In some embodiments, the interventional devices 101 may be modified such as by adding unique properties of the interventional devices 101 to augment the target points for the servo methods. The interventional devices which may be appropriate for modification include stents and balloons, and the unique properties which may be added or modified include elasticity, geometry model (X-ray visible or not), the estimated interaction between the interventional devices 101 such as expected curvature, etc. For example, when the interventional devices 101 have a marker which is mostly opaque, the actual geometry may be registered with the visible marker and superimposed on the invisible sections of the image. The augmentation may be used as input into the maneuver control methods described herein.
In some embodiments, discrete markers may be provided for the interventional devices 101 for velocity control. When a guidewire has markers it is possible to estimate velocity of the markers along the guidewire to resolve out-of-plane ambiguity when approaching distance alignment. This may be performed by retracting at constant velocity when the markers on the guidewire appear in the X-ray image but are not moving.
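A sketch of this check is shown below: if tracked marker positions barely move between frames while a constant retraction velocity is commanded, the travel is likely out of the imaging plane. The stall threshold is an illustrative assumption.

```python
import numpy as np

def markers_stalled(marker_px_t0, marker_px_t1, commanded_speed_px,
                    stall_ratio=0.2):
    """During constant-velocity retraction, compare observed marker motion
    in the image with the commanded speed; near-zero observed motion while
    the robot is moving suggests out-of-plane travel."""
    observed = np.linalg.norm(marker_px_t1 - marker_px_t0, axis=1).mean()
    return observed < stall_ratio * abs(commanded_speed_px)

# Two markers tracked across consecutive frames, in pixel coordinates.
t0 = np.array([[100.0, 50.0], [120.0, 50.0]])
t1 = np.array([[100.3, 50.1], [120.2, 50.0]])
print(markers_stalled(t0, t1, commanded_speed_px=4.0))  # True: likely out of plane
```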
Referring now to an illustrative general computer system, the computer system 1000 includes a set of instructions that can be executed to cause the computer system 1000 to perform any one or more of the methods described herein.
In a networked deployment, the computer system 1000 operates in the capacity of a server or as a client user computer in a server-client user network environment, or as a peer computer system in a peer-to-peer (or distributed) network environment. The computer system 1000 can also be implemented as or incorporated into various devices, such as a workstation that includes the controller 150.
As illustrated, the computer system 1000 includes a processor 1010.
The term “processor” as used herein encompasses an electronic component able to execute a program or machine executable instruction. References to a computing device comprising “a processor” should be interpreted to include more than one processor or processing core, as in a multi-core processor. A processor may also refer to a collection of processors within a single computer system or distributed among multiple computer systems. The term computing device should also be interpreted to include a collection or network of computing devices each including a processor or processors. Programs have software instructions performed by one or multiple processors that may be within the same computing device or which may be distributed across multiple computing devices.
The computer system 1000 further includes a main memory 1020 and a static memory 1030, where memories in the computer system 1000 communicate with each other and the processor 1010 via a bus 1008. Either or both of the main memory 1020 and the static memory 1030 may be considered representative examples of the memory 151 of the controller 150 described above.
“Memory” is an example of a computer-readable storage medium. Computer memory is any memory which is directly accessible to a processor. Examples of computer memory include, but are not limited to RAM memory, registers, and register files. References to “computer memory” or “memory” should be interpreted as possibly being multiple memories. The memory may for instance be multiple memories within the same computer system. The memory may also be multiple memories distributed amongst multiple computer systems or computing devices.
As shown, the computer system 1000 further includes a video display unit 1050, such as a liquid crystal display (LCD), an organic light emitting diode (OLED), a flat panel display, a solid-state display, or a cathode ray tube (CRT), for example. Additionally, the computer system 1000 includes an input device 1060, such as a keyboard/virtual keyboard or touch-sensitive input screen or speech input with speech recognition, and a cursor control device 1070, such as a mouse or touch-sensitive input screen or pad. The computer system 1000 also optionally includes a disk drive unit 1080, a signal generation device 1090, such as a speaker or remote control, and/or a network interface device 1040.
In an embodiment, the disk drive unit 1080 may include a computer-readable medium in which one or more sets of instructions, such as software embodying one or more of the methods described herein, can be embedded.
In an embodiment, dedicated hardware implementations, such as application-specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), programmable logic arrays and other hardware components, are constructed to implement one or more of the methods described herein. One or more embodiments described herein may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that can be communicated between and through the modules. Accordingly, the present disclosure encompasses software, firmware, and hardware implementations. Nothing in the present application should be interpreted as being implemented or implementable solely with software and not hardware such as a tangible non-transitory processor and/or memory.
In accordance with various embodiments of the present disclosure, the methods described herein may be implemented using a hardware computer system that executes software programs. Further, in an exemplary, non-limited embodiment, implementations can include distributed processing, component/object distributed processing, and parallel processing. Virtual computer system processing may implement one or more of the methods or functionalities as described herein, and a processor described herein may be used to support a virtual processing environment.
Accordingly, control of robotic endovascular devices with fluoroscopic feedback enables automatic and consistent wire manipulation assistance. The wire manipulation assistance may reduce risks in medical interventions. Examples of the risks reduced by the subject matter described herein include the risk of perforating delicate vascular walls, which may result in fatal complications, and the risks presented by stroke treatment (thrombectomy), where timely treatment by a skilled professional is essential but not always available.
As set forth above, endovascular intervention workflow may be improved by enabling repetitive maneuvers of devices inside vessels to be executed by robotic means with high precision and speed. A library of motions that can be sequenced together to execute a higher level autonomous task may be provided for dynamic path determination when needed, such as for automatic navigation to deposit an intravascular device into a vascular branch.
Although control of robotic endovascular devices with fluoroscopic feedback has been described with reference to several exemplary embodiments, it is understood that the words that have been used are words of description and illustration, rather than words of limitation. Changes may be made within the purview of the appended claims, as presently stated and as amended, without departing from the scope and spirit of control of robotic endovascular devices with fluoroscopic feedback in its aspects. Although control of robotic endovascular devices with fluoroscopic feedback has been described with reference to particular means, materials and embodiments, control of robotic endovascular devices with fluoroscopic feedback is not intended to be limited to the particulars disclosed; rather, control of robotic endovascular devices with fluoroscopic feedback extends to all functionally equivalent structures, methods, and uses such as are within the scope of the appended claims.
The illustrations of the embodiments described herein are intended to provide a general understanding of the structure of the various embodiments. The illustrations are not intended to serve as a complete description of all of the elements and features of the disclosure described herein. Many other embodiments may be apparent to those of skill in the art upon reviewing the disclosure. Other embodiments may be utilized and derived from the disclosure, such that structural and logical substitutions and changes may be made without departing from the scope of the disclosure. Additionally, the illustrations are merely representational and may not be drawn to scale. Certain proportions within the illustrations may be exaggerated, while other proportions may be minimized. Accordingly, the disclosure and the figures are to be regarded as illustrative rather than restrictive.
One or more embodiments of the disclosure may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any particular invention or inventive concept. Moreover, although specific embodiments have been illustrated and described herein, it should be appreciated that any subsequent arrangement designed to achieve the same or similar purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all subsequent adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the description.
The Abstract of the Disclosure is provided to comply with 37 C.F.R. § 1.72 (b) and is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, various features may be grouped together or described in a single embodiment for the purpose of streamlining the disclosure. This disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter may be directed to less than all of the features of any of the disclosed embodiments. Thus, the following claims are incorporated into the Detailed Description, with each claim standing on its own as defining separately claimed subject matter.
The preceding description of the disclosed embodiments is provided to enable any person skilled in the art to practice the concepts described in the present disclosure. As such, the above disclosed subject matter is to be considered illustrative, and not restrictive, and the appended claims are intended to cover all such modifications, enhancements, and other embodiments which fall within the true spirit and scope of the present disclosure. Thus, to the maximum extent allowed by law, the scope of the present disclosure is to be determined by the broadest permissible interpretation of the following claims and their equivalents and shall not be restricted or limited by the foregoing detailed description.
Number | Date | Country | Kind
---|---|---|---
22154694.8 | Feb 2022 | EP | regional

Filing Document | Filing Date | Country
---|---|---
PCT/EP2022/082445 | 11/18/2022 | WO

Number | Date | Country
---|---|---
63281207 | Nov 2021 | US