The invention relates generally to medical instruments, such as elongate steerable instruments for minimally-invasive intervention or diagnosis, and more particularly to a method, system, and apparatus for sensing or measuring the shape or position and shape of one or more parts of a shapeable elongate medical instrument.
Currently known minimally invasive procedures for diagnosis and treatment of medical conditions use shapeable instruments, such as steerable devices, flexible catheters or more rigid arms or shafts, to approach and address various tissue structures within the body. Hereafter, such devices are referred to as “shapeable” instruments. Such a term can include steerable devices, flexible devices, and devices having one or more pre-determined shapes (such as an articulating device that locks into a particular shape). For various reasons, it is highly valuable to be able to determine the 3-dimensional spatial position of portions of such shapeable instruments relative to other structures, such as the operating table, other instruments, or pertinent anatomical tissue structures. Such information can be used for a variety of reasons, including, but not limited to: to improve device control; to improve mapping of the region; to adapt control system parameters (whether kinematic and/or solid mechanic parameters); to estimate, plan, and/or control reaction forces of the device upon the anatomy; and/or to monitor the system characteristics for determination of mechanical problems. Alternatively, or in combination, shape information can be useful simply to visualize the tool with respect to the anatomy or other regions, whether real or virtual.
Conventional systems can be improved by incorporating shape information into the control of the medical device. To better understand such improvements a discussion of the concept of shape might be useful. Most generally, shape can include geometric information about an object without information of location, scale or rotation. While the discussion focuses on the use of robotics to control a shapeable device, the concepts disclosed herein can be applied to any robotic, automated, or machine assisted control of a medical device.
Shape can be important for improved control of shapeable devices. In the field of discrete robotics, joint positions are used extensively to describe relative positions of connected articulating members. In the case of a shapeable instrument being advanced by a robotic or other system, there are effectively infinite joints with multiple degrees of freedom. Instead of just knowing the scalar or vector configuration of a joint, the shape of a shapeable section is needed and must be either inferred or measured.
A machine that controls a shapeable medical device carries force and flexure continuously through sections with some degree of smoothness in one or more path derivatives. The shape of these sections can provide valuable information. However, purely defined shape excludes location, rotation and scaling of a body. Shape as described hereafter generally includes shape with scale. Thus with the shape, the relative position and orientation of any two points on the shape are known. For example, as shown in
Without shape measurement, other information must be used to control a shapeable device. However, such control is subject to error by multiple sources.
Based upon the applied positioning element displacements, the actual (physical) catheter mechanics including any constraints and obstructions acting on the catheter determine the real configuration or shape that the shapeable device achieves. This is illustrated on the right (slave/actual) side of
To generate the control inputs, the system must calculate inverse kinematics and translate to configuration space. These mathematical operations are essentially inverted by the physical system in actuating the device, subject to disturbances such as interference with the environment.
In many conventional systems, the catheter (or other shapeable instrument) is controlled in an open-loop manner as shown in FIG. 1C. In this type of open loop control model, the shape configuration command comes into the beam mechanics, is translated to beam moments and forces, then is translated to tendon tensions given the actuator geometry, and finally into tendon displacement given the entire deformed geometry. However, there are numerous reasons why the assumed motion of the catheter will not match the actual motion of the catheter; one important factor is the presence of unanticipated or unmodeled constraints imposed by the patient's anatomy.
Clearly, the presence of unanticipated or unmodeled portions of the anatomy affects the behavior and therefore the kinematics of the shapeable instrument. This effect will often alter any mapping between configuration or shape and task space or endpoint for the instrument.
Accordingly, a control system that directs shapeable instruments can command joint configurations that can achieve a desired tip position. However, the presence of modeling inaccuracies and environment interaction causes a differential between the actual position from that intended. A simple tip position can quantify this error, but addressing the source of the error requires the additional information regarding the shapeable instrument. Data defining the actual or real shape of the instrument can provide much of this information.
Conventional technologies such as electromagnetic position sensors, available from providers such as the Biosense Webster division of Johnson & Johnson, Inc., can be utilized to measure 3-dimensional spatial position but may be limited in utility for elongate medical instrument applications due to hardware geometric constraints, electromagnetic interference issues, etc.
It is well known that by applying the Bragg equation (wavelength=2*d*sin(theta)) to detect wavelength changes in reflected light, elongation in a diffraction grating pattern positioned longitudinally along a fiber or other elongate structure may be determined. Further, with knowledge of the thermal expansion properties of fibers or other structures which carry a diffraction grating pattern, temperature readings at the site of the diffraction grating may be calculated.
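By way of a hedged illustration only, the following sketch shows how a measured shift in reflected Bragg wavelength might be converted to strain under the commonly used first-order sensitivity model; the coefficient values are typical published figures for silica fiber, and the function name and example numbers are purely illustrative rather than taken from any system described herein.

```python
# Minimal sketch: converting a measured FBG wavelength shift into strain,
# assuming the standard first-order sensitivity model
#   d_lambda / lambda_0 = (1 - p_e) * strain + (alpha + xi) * dT
# Coefficient values are typical for silica fiber and are illustrative only.

P_E = 0.22          # effective photo-elastic coefficient (typical silica value)
ALPHA = 0.55e-6     # thermal expansion coefficient, 1/degC
XI = 8.6e-6         # thermo-optic coefficient, 1/degC

def strain_from_shift(lambda_0_nm: float, d_lambda_nm: float, d_temp_c: float = 0.0) -> float:
    """Return axial strain (dimensionless) for a measured Bragg wavelength shift,
    optionally compensating for a known temperature change."""
    relative_shift = d_lambda_nm / lambda_0_nm
    thermal_term = (ALPHA + XI) * d_temp_c
    return (relative_shift - thermal_term) / (1.0 - P_E)

if __name__ == "__main__":
    # e.g. a 1550 nm grating shifted by +12 pm with no temperature change
    eps = strain_from_shift(1550.0, 0.012)
    print(f"estimated strain: {eps * 1e6:.1f} microstrain")
```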
“Fiberoptic Bragg grating” (“FBG”) sensors or components thereof, available from suppliers such as Luna Innovations, Inc., of Blacksburg, Virginia, Micron Optics, Inc., of Atlanta, Georgia, LxSix Photonics, Inc., of Quebec, Canada, and Ibsen Photonics A/S, of Denmark, have been used in various applications to measure strain in structures such as highway bridges and aircraft wings, and temperatures in structures such as supply cabinets.
The use of such technology in shapeable instruments is disclosed in commonly assigned U.S. patent application Ser. No. 11/690,116, published as U.S. Pub. No. 2007/0265503 on Nov. 15, 2007, now abandoned; Ser. No. 11/176,598, published as U.S. Pub. No. 2006/0100610 on May 11, 2006, now abandoned; Ser. No. 12/012,795, published as U.S. Pub. No. 2008/0218770 on Sep. 11, 2008, now abandoned; Ser. No. 12/106,254, issued as U.S. Pat. No. 8,050,523 on Nov. 1, 2011; and Ser. No. 12/507,727, published as U.S. Pub. No. 2010/0114115 on May 6, 2010, now abandoned. Such technology is also described in U.S. Provisional Application Nos. 60/785,001; 60/788,176; 60/678,097; 60/677,580; 60/600,869; 60/553,029; 60/550,961; and 60/644,505. The entirety of each of the above applications is incorporated by reference herein. Related disclosures of systems and methods for controlling a shapeable instrument can be found in U.S. Ser. No. 12/822,876, filed Jun. 24, 2010, issued as U.S. Pat. No. 8,460,236 on Jun. 11, 2013, the entirety of which is incorporated by reference.
There remains a need to apply the spatial or shape information so gained to produce improved device control or improved modeling when directing a robotic or similar device. There also remains a need to apply such controls to medical procedures and equipment.
The systems, methods, and devices described herein include a robotic medical system for controlling a shapeable instrument within an anatomical region. The systems, methods, and devices described herein incorporate shape measurement and apply the measured information as feedback for controlling the shapeable member or for performing other tasks related to controlling the instrument (e.g., improving a map or a model of the anatomy or region).
The shapeable medical instruments, in most variations described herein, include any steerable devices, flexible catheters or more rigid arms or shafts, whether such devices are used to access a region for advancement of a treatment device or are themselves shapeable treatment devices. A shapeable device as used herein includes flexible, steerable, or otherwise positionable devices that are advanced to various tissue structures within the body. Such devices can assume a shaped configuration via manipulation or steering. Moreover, shapeable devices include those flexible devices that conform to anatomic or other obstructions. In many variations, shapeable instruments include a working end and one or more positioning elements that move the shapeable instrument. In one example, the positioning elements comprise control elements such as tendons, wires, or other mechanical structures that are moved by one or more actuators to affect a shape of or reposition the shapeable instrument. Unless specifically used to indicate a particular device, the term catheter is used as one example of a shapeable instrument.
In a first variation, the robotic medical system comprises a medical system for controlling a shapeable instrument within an anatomical region, where the shapeable instrument includes at least a working section and one or more positioning elements that move the shapeable instrument.
One variation of the system includes a controller including a master input device, where the controller generates a position control signal in response to the master input device to position the working section at a desired position; one or more actuators operatively coupleable to the one or more positioning elements, where the actuators manipulate the positioning elements based on the position control signal to drive at least a first portion of the shapeable instrument to position the working section toward the desired position; a localization system configured to obtain a plurality of localized shape data from the first portion of the shapeable instrument; and where the controller generates a signal based upon a differential between the localized shape data and a desired configuration of the first portion of the shapeable instrument. The desired configuration of the first portion can include a desired position of the first portion or the desired position of the working section. Alternatively, or in combination, the desired configuration of the first portion comprises a desired shape of the first portion.
The localization system can determine a position of the working section from the plurality of localized shape data. In another variation, the desired configuration of the first portion comprises a desired position of the first portion, and the controller generates the signal based upon the differential between the position of the working section and the desired position of the working section. The controller of the robotic medical system can be configured to derive a position of the working section from a kinematic model of the shapeable instrument.
A variation of the robotic medical system includes a localization system that determines a shape of the first portion of the shapeable instrument from the plurality of localized shape data. The desired configuration of the first portion can comprise a desired shape of the first portion, where the controller generates the signal based upon the differential between the shape of the first portion and the desired shape of the first portion. In another variation, the localization system also determines a position of the working section from the plurality of localized shape data, and the desired configuration of the first portion also includes a desired position of the first portion.
In another variation, the controller generates the signal also based upon the differential between a desired position of the first portion and the position of the first portion.
The robotic medical system can also include a controller that is configured to feed the signal to the actuators such that the actuators manipulate one or more of the positioning elements using the signal to position the working section or the first portion of the shapeable instrument.
In one variation, the localization system comprises a fiber optic localization system configured to supply the plurality of localization data. Furthermore, the shapeable instrument can include at least one optic fiber, and the localization system can be configured to measure a plurality of Rayleigh scatter data of the optic fiber. The Rayleigh scatter data can be used to supplement or supply the localization data.
The localization system can comprise a system selected from the group consisting of a plurality of positioning sensors, a vision system, and a plurality of strain sensors. In another variation, the localization system can comprise an electromagnetic localization system, where the shapeable instrument includes at least one electromagnetic coil. In yet another variation, the localization system can comprise an impedance based localization system, where the shapeable instrument includes at least one sensor and the system further includes at least one electrode, and where the impedance based localization system determines a voltage gradient between the sensor and the electrode. However, any number of localization systems can be employed with the robotic medical system as described herein.
The robotic medical system described herein can also include a controller configured to generate the position control signal using an inverse kinematic model of the shapeable instrument. The controller can generate the position control signal to maximize a probability of achieving a prescribed shape or position by optimizing a cost function subject to a set of constraints based upon a model and a measurement estimate. For example, the model can be selected from the group consisting of a kinematic model and a solid mechanic model, and the measurement estimate can be selected from the group consisting of a shape, a strain, and a projection (e.g., a fluoroscopic projection, etc.).
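As a purely illustrative sketch of the cost-function idea described above, the following example chooses actuator increments by minimizing a quadratic cost subject to simple actuator-travel constraints; the Jacobian-style linear model, the travel limits, and the numbers are hypothetical placeholders rather than parameters of any instrument described herein.

```python
# Minimal sketch of choosing actuator increments by optimizing a cost function
# subject to constraints, in the spirit described above. The linear model J and
# the limits are hypothetical placeholders, not parameters from the source.
import numpy as np
from scipy.optimize import lsq_linear

J = np.array([[1.0, 0.2, 0.0],      # hypothetical 3x3 map from actuator
              [0.0, 0.9, 0.1],      # increments to tip displacement (mm per mm)
              [0.1, 0.0, 1.1]])
desired_tip_step = np.array([0.5, -0.2, 0.1])   # mm, from the master input

# Solve min ||J dq - desired||^2 subject to per-step actuator travel limits.
result = lsq_linear(J, desired_tip_step, bounds=(-0.3, 0.3))
print("actuator increments (mm):", np.round(result.x, 4))
print("predicted tip step (mm): ", np.round(J @ result.x, 4))
```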
In one variation, the controller can use the signal to alter at least one parameter of the inverse kinematic model of the shapeable instrument to produce an improved inverse kinematic model of the shapeable instrument. Furthermore, the controller can modify the position control signal using the improved kinematic model.
The robotic medical system described herein can also be configured such that the actuators alter a force applied to one or more of the positioning elements based on the signal to reposition the working section or the first portion of the shapeable instrument.
In yet another variation, the robotic medical system can be further configured to measure an axial deformation of the shapeable member, and where the controller further generates the signal based on the axial deformation of the shapeable member.
In variations of the robotic medical system the controller can be configured to determine an applied force on the first portion of the shapeable instrument using a shape of the first portion of the shapeable instrument, the position control signal and at least one characteristic of the shapeable instrument. The controller can further use one or more actuators to reposition the portion of the shapeable instrument to reduce the applied force.
In another variation, when the robotic medical system generates the signal, the controller can trigger an operator alert signal to the master input device. The operator alert signal can cause a haptic effect on the master input device for feedback. In some variations, the controller triggers the operator alert signal only if the signal is greater than a pre-determined level. Any number of safety measures can be employed when the controller triggers the operator alert signal. For example, the controller can be configured to stop movement of the shapeable instrument; the controller can be configured to reverse movement of the shapeable instrument; and/or the controller can be configured to increase a force required to operate the master input device.
The controller of the robotic medical system described herein can be further configured to determine a calculated curvature from the real shape and to compare the calculated curvature to a pre-determined curvature to assess a fracture of the shapeable element.
The present disclosure also includes methods for controlling a shapeable instrument within an anatomical region using a robotic medical system. For example, the method can include operatively coupling one or more actuators to one or more positioning elements of a shapeable instrument, where the one or more positioning elements are adapted to move the shapeable instrument and where the actuators manipulate the positioning elements; advancing the shapeable instrument to the anatomical region, where the shapeable instrument includes a working section; generating a position control signal in response to a master input device to position the working section at a desired position; obtaining a plurality of localized shape data of a first portion of the shapeable instrument using a localization system; and controlling the actuators using the position control signal to manipulate the positioning elements to drive at least the first portion of the shapeable instrument to position the working section toward the desired position, where the controller generates a signal based upon a differential between the localized shape data and a desired configuration of the first portion of the shapeable instrument.
The methods described herein can also permit tracking of a device through an anatomic path using shape information of the device. For instance, one method includes controlling advancement of a shapeable medical device within an anatomic path. Such variation includes identifying a reference shape of one or more portions of the shapeable medical device; advancing the shapeable medical device along the anatomic path; obtaining a plurality of localization data to determine a real shape of at least the one or more portions of the shapeable instrument when advanced along the anatomic path; and monitoring advancement of the shapeable medical device by determining a differential between the real shape and the reference shape of the one or more portions.
In addition, the method of controlling advancement of the medical device can further comprise controlling advancement of the shapeable medical device if the differential between the real shape and the reference shape of the one or more portions is greater than a threshold value.
In one variation, controlling the advancement of the shapeable medical device comprises reversing the shapeable medical device along the anatomic path until the differential between the real shape and the reference shape decreases. In another variation, controlling the advancement of the shapeable medical device comprises slowing advancement of the shapeable medical device along the anatomic path until the differential between the real shape and the reference shape decreases. In another variation, controlling the advancement of the shapeable medical device comprises advancing a guide device from the shapeable medical device within the anatomic path and subsequently advancing the shapeable medical device along the guide device. In additional variations, controlling the advancement of the shapeable medical device comprises stopping the shapeable medical device and withdrawing a proximal end of the shapeable medical device until the differential between the real shape and the reference shape decreases.
The disclosure also includes methods for reduced model control. One such method includes altering a data model of an anatomical region. For example, such a method can include advancing a shapeable instrument relative to the anatomical region, the shapeable instrument comprising one or more positioning elements that alter a shape of a first portion of the shapeable instrument; obtaining a plurality of localization data to determine a real shape of the first portion of the shapeable instrument; correlating the real shape of the first portion of the shapeable instrument against a desired shape of the first portion to determine a data model of an anatomic feature affecting the real shape of at least the first portion of the shapeable instrument; and updating the data model of the anatomical region with the data model of the anatomic feature.
The method described above can further comprise measuring at least one force on at least one positioning element and where correlating the real shape of the first portion of the shapeable instrument includes assessing the force on the at least one positioning element to determine the data model of the anatomic feature affecting the real shape of at least the first portion of the shapeable instrument. The methods can further include cycling movement of the shapeable instrument by advancing and retracting the shapeable instrument, and where obtaining the localization data occurs after advancing the shapeable instrument.
The methods described herein can also include repositioning the shapeable instrument to maintain a historical database of real shapes, where the historical database comprises a plurality of active spaces through which the shapeable instrument moved and a plurality of void spaces through which the shapeable instrument did not move, and determining a location of an anatomic feature using the plurality of void spaces.
The present disclosure also includes methods of preparing a robotic medical system for use with a shapeable instrument, where the shapeable instrument includes a working end and one or more positioning elements that move the shapeable instrument. A variation of this method includes obtaining a plurality of localization data to determine a real shape of at least a first portion of the shapeable instrument; pretensioning the shapeable instrument by incrementally actuating at least one of the actuators to determine a zero displacement point of the actuator after which the shapeable instrument moves from the real shape; and providing the zero displacement point to a controller including a master input device, where the controller adds the zero displacement point to at least one actuation command, where the actuation command manipulates one or more of the positioning elements to reposition the working end or the first portion of the shapeable instrument, and where the zero displacement point compensates for slack in the shapeable element.
In one example, a shapeable instrument comprises an elongate instrument body; an optical fiber coupled in a constrained manner to the elongate instrument body, the optical fiber being in communication with one or more optical gratings; and a detector operably coupled to a proximal end of the optical fiber and configured to detect respective light signals reflected by the one or more optical gratings. The system further includes a controller operatively coupled to the detector, wherein the controller is configured to determine a geometric configuration of at least a portion of the shapeable instrument based on a spectral analysis of the detected reflected portions of the light signals. Variations of the devices, systems and methods described herein can employ fiber Bragg gratings as mentioned above. However, additional variations of the devices, systems and methods contained in this disclosure can employ any number of optical gratings.
The systems, methods, and devices described herein can also employ alternate means to obtain information regarding the shape of the device. For example, such alternate means include, but are not limited to, positioning sensors, a vision system, and a plurality of strain sensors.
By way of non-limiting example, a shapeable instrument can be robotically controlled, or manually controlled with automated assistance. In some variations, the shapeable instrument includes a reference reflector coupled to the optical fiber in an operable relationship with the one or more optical gratings. In yet additional embodiments, the detector comprises a frequency domain reflectometer. The optical fiber can include multiple fiber cores, each core including one or more optical gratings. The optical fiber (or each fiber core of a multi-core optical fiber) can optionally comprise a plurality of spaced apart optical gratings.
In another variation, a localization system as described herein can use measurement of Rayleigh scatter in the optical fiber. Measurement of Rayleigh scatter can be used to measure strain in the fiber. Such information can be used as an alternate mode of obtaining shape data. Alternatively, Rayleigh scatter can be combined with other localization systems to supplement or improve the localized shape data.
When single mode optical fiber is drawn there can be slight imperfections that result in index of refraction variations along the fiber core. These variations result in a small amount of backscatter that is called Rayleigh scatter. Changes in strain or temperature of the optical fiber cause changes to the effective length of the optical fiber. This change in the effective length results in variation or change of the spatial position of the Rayleigh scatter points. Cross correlation techniques can measure this change in the Rayleigh scattering and can extract information regarding the strain. These techniques can include using optical frequency domain reflectometer techniques in a manner that is very similar to that associated with low reflectivity fiber gratings. A more complete discussion of these methods can be found in M. Froggatt and J. Moore, “High-spatial-resolution distributed strain measurement in optical fiber with Rayleigh scatter”, Applied Optics, Vol. 37, p. 1735, 1998 the entirety of which is incorporated by reference herein.
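The following is a minimal, assumption-laden sketch of the cross-correlation idea referenced above: a reference and a measured Rayleigh backscatter spectrum for one fiber segment are cross-correlated, the lag of the correlation peak is taken as the spectral shift, and the shift is scaled to strain. The data are synthetic, and the shift-to-strain factor is illustrative rather than a calibrated value.

```python
# Minimal sketch of the cross-correlation idea described above: estimate the
# spectral shift between a reference and a strained Rayleigh backscatter
# spectrum for one fiber segment. The data are synthetic and the
# shift-to-strain factor is illustrative, not a calibrated value.
import numpy as np

rng = np.random.default_rng(0)
n = 512
freq_step_ghz = 1.0                       # spacing of the spectral samples
reference = rng.standard_normal(n)        # stand-in for a segment's Rayleigh spectrum
true_shift_bins = 7                       # shift introduced by strain/temperature
measured = np.roll(reference, true_shift_bins) + 0.05 * rng.standard_normal(n)

# Cross-correlate and take the lag of the peak as the spectral shift estimate.
corr = np.correlate(measured - measured.mean(), reference - reference.mean(), mode="full")
lag_bins = np.argmax(corr) - (n - 1)
shift_ghz = lag_bins * freq_step_ghz

GHZ_PER_MICROSTRAIN = -0.15               # illustrative sensitivity for silica fiber
strain_microstrain = shift_ghz / GHZ_PER_MICROSTRAIN
print(f"estimated shift: {shift_ghz:.1f} GHz -> {strain_microstrain:.1f} microstrain")
```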
Methods and devices for calculating birefringence in an optical fiber based on Rayleigh scatter, as well as apparatus and methods for measuring strain in an optical fiber using the spectral shift of Rayleigh scatter, can be found in PCT Publication No. WO 2006/099056, filed on Mar. 9, 2006, and U.S. Pat. No. 6,545,760, filed on Mar. 24, 2000, both of which are incorporated by reference herein. Birefringence can be used to measure axial strain and/or temperature in a waveguide. Using Rayleigh scatter to determine birefringence rather than Bragg gratings offers several advantages. First, the cost of using Rayleigh scatter measurement is less than when using Bragg gratings. Rayleigh scatter measurement also permits birefringence measurements at every location in the fiber, not just at predetermined locations. Since Bragg gratings require insertion at specific measurement points along a fiber, measurement of Rayleigh scatter allows for many more measurement points. Also, the process of physically “writing” a Bragg grating into an optical fiber can be time consuming and can compromise the strength and integrity of the fiber. Such drawbacks do not occur when using Rayleigh scatter measurement.
In various embodiments, the optical fiber may be substantially encapsulated in a wall of the elongate instrument body. Alternatively, the elongate instrument body may define an interior lumen, wherein the optical fiber is disposed in the lumen. Further alternatively, the optical fiber may be disposed in an embedded lumen in a wall of the elongate instrument body.
In various embodiments, the elongate instrument body has a neutral axis of bending, and the optical fiber is coupled to the elongate instrument body so as to be substantially aligned with the neutral axis of bending when the elongate instrument body is in a substantially unbent configuration, and to move relative to the neutral axis of bending as the elongate instrument body undergoes bending. In other embodiments, the optical fiber is coupled to the elongate instrument body so as to be substantially aligned with the neutral axis of bending regardless of bending of the elongate instrument body. In still further embodiments, the optical fiber is coupled to the elongate instrument body so as to remain substantially parallel to, but not aligned with, the neutral axis of bending regardless of bending of the elongate instrument body.
Shape feedback can be used directly, along with system models, in control of both task space (e.g., the distal end) and/or configuration space (the elongate portion). The configuration can be extended over time to plan for the environment, for example to track the shape inside a vessel. Alternatively, a control architecture can rely less on device models and instead rely on the information-rich shape feedback.
Shape feedback can also be used in device kinematics. Shape provides a measurement of the real kinematics of a device. Kinematic parameters may be estimated using the shape measurement. Extending that concept, shape measurement can be used to adapt the kinematic model by several methods as described herein. Moreover, a real measured shape may be displayed to a system operator in addition to or in lieu of an idealized virtual shape.
Moving beyond the geometry of the device, the physical properties of the device materials, its solid mechanics, make up a fourth area. Addressing a specific challenge for elongate flexible devices, shape can be used to measure axial deformation. Shape can also be used to pretension actuating tendons or control elements. More generally, real device shape may be compared with model expectation to adapt real model parameters or estimate a state of health based on degradation of material properties.
Shape data can further assist in estimation and control of reaction force between the device and environment. Deflection of the measured shape from the predicted free shape indicates the application of external forces. Estimates of these forces may be used to navigate the environment or, if an environment model is available, to plan a navigation path.
In another variation, use of data from shape feedback can be used to detect mechanical failures of the shapeable instrument. Such feedback allows a mechanism for detecting, displaying, and handling mechanical fractures. Basic diagnosis is extended with an active secondary diagnostic to test potential fractures, redundant sensors for model-based diagnosis and shape sensor diagnostics.
Other and further embodiments, objects and advantages of the invention will become apparent from the following detailed description when read in view of the accompanying figures.
Referring to
Referring to
Each of the embodiments depicted in
Referring to
Constraints (30) may be provided to prohibit axial or longitudinal motion of the fiber (12) at the location of each constraint (30). Alternatively, the constraints (30) may only constrain the position of the fiber (12) relative to the lumen (31) in the location of the constraints (30). For example, in one variation of the embodiment depicted in
The embodiment of
Referring to
Referring to
Indeed, various configurations may be employed, depending upon the particular application, such as those depicted in
In essence, the 3-dimensional position of an elongate member may be determined by determining the incremental curvature experienced along various longitudinal sections of such elongate member. In other words, if you know how much an elongate member has curved in space at several points longitudinally down the length of the elongate member, you can determine the position of the distal portion and more proximal portions in three-dimensional space by virtue of knowing that the sections are connected, and where they are longitudinally relative to each other. Towards this end, variations of embodiments such as those depicted in
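A minimal sketch of this incremental-curvature idea is shown below, assuming the member is approximated as a chain of short constant-curvature segments whose curvature and bend-plane angle are known for each segment; the segment data here are synthetic, and a real implementation would derive them from the fiber measurements.

```python
# Minimal sketch: reconstruct a 3-D centerline by chaining short segments,
# each described by a measured curvature and a bend-plane angle. Segment data
# here are synthetic; a real system would derive them from fiber strain data.
import numpy as np

def reconstruct_shape(curvatures, bend_angles, seg_len):
    """Return an array of 3-D points along the member.

    curvatures  : 1/m, one value per segment
    bend_angles : rad, orientation of each segment's bending plane about the local axis
    seg_len     : m, length of each segment
    """
    pos = np.zeros(3)
    R = np.eye(3)                       # local frame: z is the member's tangent
    points = [pos.copy()]
    for kappa, phi in zip(curvatures, bend_angles):
        theta = kappa * seg_len         # bend angle of this segment
        # In-plane displacement of a constant-curvature arc (straight if kappa ~ 0)
        if abs(theta) < 1e-9:
            local_disp = np.array([0.0, 0.0, seg_len])
        else:
            r = 1.0 / kappa
            local_disp = np.array([r * (1 - np.cos(theta)), 0.0, r * np.sin(theta)])
        # Rotate the bending plane about the local tangent, then step forward.
        c, s = np.cos(phi), np.sin(phi)
        Rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
        Ry = np.array([[np.cos(theta), 0.0, np.sin(theta)],
                       [0.0, 1.0, 0.0],
                       [-np.sin(theta), 0.0, np.cos(theta)]])
        pos = pos + R @ (Rz @ local_disp)
        R = R @ Rz @ Ry                 # carry orientation to the next segment
        points.append(pos.copy())
    return np.array(points)

# Example: a member made of ten 10 mm segments with gradually increasing curvature
pts = reconstruct_shape(np.linspace(0.0, 20.0, 10), np.zeros(10), 0.01)
print(np.round(pts[-1], 4))             # estimated distal tip position (m)
```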
Referring to
In the single fiber embodiment depicted in
In another embodiment of a single sensing fiber, depicted in
Referring to
As will be apparent to those skilled in the art, the fibers in the embodiments depicted herein will provide accurate measurements of localized length changes in portions of the associated catheter or elongate instrument only if such fiber portions are indeed coupled in some manner to the nearby portions of the catheter or elongate instrument. In one embodiment, it is desirable to have the fiber or fibers intimately coupled with or constrained by the surrounding instrument body along the entire length of the instrument, with the exception that one or more fibers may also be utilized to sense temperature distally, and may have an unconstrained portion, as in the two scenarios described in reference to
Referring to
Referring to
Referring to
Tension and compression loads on an elongate instrument may be detected with common mode deflection in radially-outwardly positioned fibers, or with a single fiber along the neutral bending axis. Torque may be detected by sensing common mode additional tension (in addition, for example, to tension and/or compression sensed by, for example, a single fiber coaxial with the neutral bending axis) in outwardly-positioned fibers in configurations such as those depicted in
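As a hedged illustration of the common-mode versus differential sensing described above, the following sketch separates axial (common-mode) strain from bending using three fibers assumed to lie 120 degrees apart at a known radius from the neutral axis; the geometry and strain values are illustrative only.

```python
# Minimal sketch: separate bending from axial (common-mode) loading using
# three fibers placed 120 degrees apart at radius r from the neutral axis.
# The geometry and strain values are illustrative.
import numpy as np

r = 0.5e-3                                  # fiber offset from neutral axis, m
angles = np.deg2rad([0.0, 120.0, 240.0])    # angular positions of the three fibers

def decompose(strains):
    """Return (axial_strain, curvature_magnitude_1_per_m, bend_direction_rad)."""
    strains = np.asarray(strains, dtype=float)
    axial = strains.mean()                  # common mode: tension/compression
    # Bending contributes  -kappa * r * cos(angle - bend_dir)  at each fiber.
    residual = strains - axial
    kx = -2.0 / (3.0 * r) * np.sum(residual * np.cos(angles))
    ky = -2.0 / (3.0 * r) * np.sum(residual * np.sin(angles))
    return axial, np.hypot(kx, ky), np.arctan2(ky, kx)

# Example: 100 microstrain of pure tension plus a bend of 2 1/m toward angle 0
kappa_true, bend_dir_true, eps_axial = 2.0, 0.0, 100e-6
measured = eps_axial - kappa_true * r * np.cos(angles - bend_dir_true)
print(decompose(measured))
```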
In another embodiment, the tension elements utilized to actuate bending, steering, and/or compression of an elongate instrument, such as a steerable catheter, may comprise optical fibers with gratings, as compared with more conventional metal wires or other structures, and these fiber optic tension elements may be monitored for deflection as they are loaded to induce bending/steering to the instrument. Such monitoring may be used to prevent overstraining of the tension elements, and may also be utilized to detect the position of the instrument as a whole, as per the description above.
Referring to
Referring to
In
The drive schema for the four guide instrument interface sockets (270) is more complicated, due in part to the fact that they are coupled to a carriage (240) configured to move linearly along a linear bearing interface (250) to provide for motor-driven insertion of a guide instrument toward the patient relative to the instrument driver, hospital table, and sheath instrument. Various conventional cable termination and routing techniques are utilized to accomplish a preferably high-density instrument driver structure with the carriage (240) mounted forward of the motors for a lower profile patient-side interface.
Still referring to
Referring to
Referring to
Another embodiment of a master input device (12) is depicted in
Referring to
Referring to
Referring to
The above-described instrument embodiments present various techniques for managing tension control in various guide instrument systems having between two and four control elements. For example, in one set of embodiments, tension may be controlled with active independent tensioning of each control element in the pertinent guide catheter via independent control element interface assemblies (132) associated with independently-controlled guide instrument interface sockets (270) on the instrument driver (16). Thus, tension may be managed by independently actuating each of the control element interface assemblies (132) in a four-control-element embodiment, a three-control-element embodiment, or a two-control-element embodiment.
In another set of embodiments, tension may be controlled with active independent tensioning with a split carriage design. For example, a split carriage with two independent linearly movable portions may be utilized to actively and independently tension each of the two control element interface assemblies (132), each of which is associated with two dimensions of a given degree of freedom. For example, one interface assembly can include + and − pitch, with + and − yaw on the other interface assembly, with slack or tension control provided for pitch by one of the linearly movable portions (302) of the split carriage (296), and slack or tension control provided for yaw by the other linearly movable portion (302) of the split carriage (296).
Similarly, slack or tension control for a single degree of freedom, such as yaw or pitch, may be provided by a single-sided split carriage design, with the exception that only one linearly movable portion would be required to actively tension the single control element interface assembly of an instrument.
In another set of embodiments, tensioning may be controlled with spring-loaded idlers configured to keep the associated control elements out of slack. The control elements preferably are pre-tensioned in each embodiment to prevent slack and provide predictable performance. Indeed, in yet another set of embodiments, pre-tensioning may form the main source of tension management. In the case of embodiments only having pre-tensioning or spring-loaded idler tensioning, the control system may need to be configured to reel in bits of slack at certain transition points in catheter bending, such as described above in relation to
To accurately coordinate and control actuations of various motors within an instrument driver from a remote operator control station such as that depicted in
Referring to
The term “localization” is used in the art in reference to systems for determining and/or monitoring the position of objects, such as medical instruments, in a reference coordinate system. In one embodiment, the instrument localization software is a proprietary module packaged with an off-the-shelf or custom instrument position tracking system, such as those available from Ascension Technology Corporation, Biosense Webster, Inc., Endocardial Solutions, Inc., Boston Scientific (EP Technologies), Medtronic, Inc., and others. Such systems may be capable of providing not only real-time or near real-time positional information, such as X-Y-Z coordinates in a Cartesian coordinate system, but also orientation information relative to a given coordinate axis or system. For example, such systems can employ an electromagnetic based system (e.g., using electromagnetic coils inside a device or catheter body). Information regarding one electromagnetic based system can be found at http://www.biosensewebster.com/products/navigation/carto3.aspx, the relevant portions of which are incorporated by reference.
Some of the commercially-available localization systems use electromagnetic relationships to determine position and/or orientation, while others, such as some of those available from Endocardial Solutions, Inc.—St Jude Medical, utilize potential difference or voltage, as measured between a conductive sensor located on the pertinent instrument and conductive portions of sets of patches placed against the skin, to determine position and/or orientation. Referring to
(See http://www.sjmprofessional.com/Products/US/Mapping-and-Visualization/EnSite-Velocity.aspx, the relevant portions of which are incorporated by reference.)
As shown in
In another similar embodiment (not shown), one or more conductive rings may be electronically connected to a potential-difference-based localization/orientation system, along with multiple sets, preferably three sets, of conductive skin patches, to provide localization and/or orientation data utilizing a system such as those available from Endocardial Solutions—St. Jude Medical. The one or more conductive rings may be integrated into the walls of the instrument at various longitudinal locations along the instrument, or set of instruments. For example, a guide instrument may have several conductive rings longitudinally displaced from each other toward the distal end of the guide instrument, while a coaxially-coupled sheath instrument may similarly have one or more conductive rings longitudinally displaced from each other toward the distal end of the sheath instrument—to provide precise data regarding the location and/or orientation of the distal ends of each of such instruments.
Referring back to
Using the operation of an automobile as an example, if the master input device is a steering wheel and the operator desires to drive a car in a forward direction using one or more views, his first priority is likely to have a view straight out the windshield, as opposed to a view out the back window, out one of the side windows, or from a car in front of the car that he is operating. The operator might prefer to have the forward windshield view as his primary display view, such that a right turn on the steering wheel takes him right as he observes his primary display, a left turn on the steering wheel takes him left, and so forth. If the operator of the automobile is trying to park the car adjacent another car parked directly in front of him, it might be preferable to also have a view from a camera positioned, for example, upon the sidewalk aimed perpendicularly through the space between the two cars (one driven by the operator and one parked in front of the driven car), so the operator can see the gap closing between his car and the car in front of him as he parks. While the driver might not prefer to have to completely operate his vehicle with the sidewalk perpendicular camera view as his sole visualization for navigation purposes, this view is helpful as a secondary view.
Referring still to
In one embodiment, subsequent to development and display of a digital model of pertinent tissue structures, an operator may select one primary and at least one secondary view to facilitate navigation of the instrumentation. By selecting which view is the primary view, the user can automatically toggle a master input device (12) coordinate system to synchronize with the selected primary view. In an embodiment with the leftmost depicted view (410) selected as the primary view, to navigate toward the targeted tissue site (418), the operator should manipulate the master input device (12) forward, to the right, and down. The right view will provide valuable navigation information, but will not be as instinctive from a “driving” perspective.
To illustrate: if the operator wishes to insert the catheter tip toward the targeted tissue site (418) watching only the rightmost view (412) without the master input device (12) coordinate system synchronized with such view, the operator would have to remember that pushing straight ahead on the master input device will make the distal tip representation (416) move to the right on the rightmost display (412). Should the operator decide to toggle the system to use the rightmost view (412) as the primary navigation view, the coordinate system of the master input device (12) is then synchronized with that of the rightmost view (412), enabling the operator to move the catheter tip (416) closer to the desired targeted tissue location (418) by manipulating the master input device (12) down and to the right.
The synchronization of coordinate systems described herein may be conducted using fairly conventional mathematic relationships. For example, in one embodiment, the orientation of the distal tip of the catheter may be measured using a 6-axis position sensor system such as those available from Ascension Technology Corporation, Biosense Webster, Inc., Endocardial Solutions, Inc., Boston Scientific (EP Technologies), and others. A 3-axis coordinate frame, C, for locating the distal tip of the catheter, is constructed from this orientation information. The orientation information is used to construct the homogeneous transformation matrix $T_{C_0}^{G_0}$, which transforms a vector in the catheter coordinate frame "C" to the fixed global coordinate frame "G" in which the sensor measurements are done (the subscript $C_0$ and superscript $G_0$ are used to represent the 0'th, or initial, step). As a registration step, the computer graphics view of the catheter is rotated until the master input and the computer graphics view of the catheter distal tip motion are coordinated and aligned with the camera view of the graphics scene. The 3-axis coordinate frame transformation matrix $T_{G_{ref}}^{G_0}$ for the camera position of this initial view is stored (the subscript $G_{ref}$ denotes the global camera "reference" view). The corresponding catheter "reference view" matrix for the catheter coordinates is obtained as:

$$T_{C_{ref}}^{C_0} = T_{G_0}^{C_0}\,T_{G_{ref}}^{G_0}\,T_{C_{ref}}^{G_{ref}} = \left(T_{C_0}^{G_0}\right)^{-1}T_{G_{ref}}^{G_0}\,T_{C_{ref}}^{G_{ref}}$$

Also note that the catheter's coordinate frame is fixed in the global reference frame G; thus the transformation matrix between the global frame and the catheter frame is the same in all views, i.e., $T_{C_0}^{G_0} = T_{C_{ref}}^{G_{ref}} = T_{C_i}^{G_i}$ for any arbitrary view $i$. The coordination between the primary view and the master input device coordinate systems is achieved by transforming the master input as follows: given any arbitrary computer graphics view of the representation, e.g. the $i$'th view, the 3-axis coordinate frame transformation matrix $T_{G_i}^{G_0}$ of the camera view of the computer graphics scene is obtained from the computer graphics software. The corresponding catheter transformation matrix is computed in a similar manner as above:

$$T_{C_i}^{C_0} = T_{G_0}^{C_0}\,T_{G_i}^{G_0}\,T_{C_i}^{G_i} = \left(T_{C_0}^{G_0}\right)^{-1}T_{G_i}^{G_0}\,T_{C_i}^{G_i}$$

The transformation that needs to be applied to the master input to achieve the view coordination is the one that transforms from the reference view that was registered above to the current $i$'th view, i.e., $T_{C_{ref}}^{C_i}$. Using the previously computed quantities above, this transform is computed as:

$$T_{C_{ref}}^{C_i} = T_{C_0}^{C_i}\,T_{C_{ref}}^{C_0} = \left(T_{C_i}^{C_0}\right)^{-1}T_{C_{ref}}^{C_0}$$

The master input is transformed into the commanded catheter input by application of the transformation $T_{C_{ref}}^{C_i}$. Given a command input
Under such relationships, coordinate systems of the primary view and master input device may be aligned for instinctive operation.
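A minimal numerical sketch of these relationships is given below, using plain rotation matrices and synthetic example frames; the variable names follow the subscript-to-superscript convention of the equations above and are not drawn from any particular implementation.

```python
# Minimal sketch of the view-synchronization relations above, written with
# plain rotation matrices (the homogeneous-transform bookkeeping is omitted).
# The example frames are synthetic; a real system would obtain them from the
# localization sensor and from the computer graphics software.
import numpy as np

def rot_z(angle_rad):
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

t_C0_to_G0 = rot_z(np.deg2rad(30))     # catheter frame -> global frame (initial step)
t_Gref_to_G0 = rot_z(np.deg2rad(10))   # registered "reference" camera view
t_Gi_to_G0 = rot_z(np.deg2rad(55))     # current (i-th) camera view
# The catheter frame is fixed in G, so the catheter/global transform is the
# same in every view.
t_Cref_to_Gref = t_C0_to_G0
t_Ci_to_Gi = t_C0_to_G0

# Reference-view and i-th-view catheter matrices, per the relations above.
t_Cref_to_C0 = np.linalg.inv(t_C0_to_G0) @ t_Gref_to_G0 @ t_Cref_to_Gref
t_Ci_to_C0 = np.linalg.inv(t_C0_to_G0) @ t_Gi_to_G0 @ t_Ci_to_Gi

# Transform applied to the master input: from the reference view to view i.
t_Cref_to_Ci = np.linalg.inv(t_Ci_to_C0) @ t_Cref_to_C0

master_input = np.array([1.0, 0.0, 0.0])       # "push forward" on the master device
commanded_input = t_Cref_to_Ci @ master_input  # input expressed for the current view
print(np.round(commanded_input, 3))
```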
Referring back to embodiment of
In one embodiment, the safety signal commands represent a simple signal repeated at very short intervals, such as every 10 milliseconds, such signal chain being logically read as “system is ok, amplifiers stay active”. If there is any interruption in the safety signal chain, the amplifiers are logically toggled to inactive status and the instrument cannot be moved by the control system until the safety signal chain is restored. Also shown in the signal flow overview of
In master following mode (442), the control system receives signals from the master input device, and in a closed loop embodiment from both a master input device and a localization system, and forwards drive signals to the primary servo loop (436) to actuate the instrument in accordance with the forwarded commands. Aspects of this embodiment of the master following mode (442) are depicted in further detail in
Referring to
Referring back to
The kinematic relationships for many catheter instrument embodiments may be modeled by applying conventional mechanics relationships. In summary, a control-element-steered catheter instrument is controlled through a set of actuated inputs. In a four-control-element catheter instrument, for example, there are two degrees of motion actuation, pitch and yaw, which both have + and − directions. Other motorized tension relationships may drive other instruments, active tensioning, or insertion or roll of the catheter instrument. The relationship between actuated inputs and the catheter's end point position as a function of the actuated inputs is referred to as the "kinematics" of the catheter.
Referring to
An inverse kinematic model translates intended device motion into the commands that will adjust the actuator and/or control element to position the shapeable instrument as desired. Referring back to
These inverse kinematic algorithms are derived based upon certain assumptions about how the shapeable instrument moves. Examples of these assumptions include but are not limited to: 1) Each catheter segment bends in a constant curvature arc; 2) Each catheter segment bends within a single plane; 3) Some catheter segments have fixed (constant) lengths; 4) Some catheter segments have variable (controllable) lengths.
The development of the catheter's kinematics model is derived using a few essential assumptions. Included are assumptions that the catheter structure is approximated as a simple beam in bending from a mechanics perspective, and that control elements, such as thin tension wires, remain at a fixed distance from the neutral axis and thus impart a uniform moment along the length of the catheter.
In addition to the above assumptions, the geometry and variables shown in
The actuator forward kinematics, relating the joint coordinates (φpitch, φyaw, L) to the actuator coordinates (ΔLx, ΔLz, L), is given as follows:
As illustrated in
Calculation of the catheter's actuated inputs as a function of end-point position, referred to as the inverse kinematics, can be performed numerically, using a nonlinear equation solver such as Newton-Raphson. A more desirable approach, and the one used in this illustrative embodiment, is to develop a closed-form solution which can be used to calculate the required actuated inputs directly from the desired end-point positions.
As with the forward kinematics, we separate the inverse kinematics into the basic inverse kinematics, which relates joint coordinates to the task coordinates, and the actuation inverse kinematics, which relates the actuation coordinates to the joint coordinates. The basic inverse kinematics, relating the joint coordinates (φpitch, φyaw, L) to the catheter task coordinates (Xc, Yc, Zc), is given as follows:
The actuator inverse kinematics, relating the actuator coordinates (ΔLx, ΔLz, L) to the joint coordinates (φpitch, φyaw, L), is given as follows:
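The closed-form relations referenced above are not reproduced here; as a hedged illustration only, the following sketch implements one common constant-curvature formulation consistent with the assumptions listed earlier (a single segment bending in a plane with constant curvature), with an assumed control-element offset r_c and illustrative function names. It shows forward task kinematics, a simple joint-to-actuator mapping, and a closed-form style inverse.

```python
# Minimal sketch of one common constant-curvature kinematics formulation for a
# single bending segment, consistent with the assumptions listed above. It is
# illustrative and does not reproduce the specific relations referenced in the
# text; r_c (control element offset from the neutral axis) is a placeholder.
import numpy as np

r_c = 1.0e-3     # distance of the pull wires from the neutral axis, m (assumed)

def forward_task(phi_pitch, phi_yaw, L):
    """Joint coordinates (bend angles about two axes, arc length) -> tip position."""
    theta = np.hypot(phi_pitch, phi_yaw)          # total bend angle
    if theta < 1e-9:
        return np.array([0.0, 0.0, L])
    r = L / theta                                 # arc radius
    # In-plane arc endpoint, then rotate the bend plane about the base axis.
    plane = np.arctan2(phi_yaw, phi_pitch)
    x_plane, z = r * (1.0 - np.cos(theta)), r * np.sin(theta)
    return np.array([x_plane * np.cos(plane), x_plane * np.sin(plane), z])

def actuator_from_joint(phi_pitch, phi_yaw, L):
    """Joint coordinates -> wire length changes (one pair per bending axis)."""
    return np.array([-r_c * phi_pitch, -r_c * phi_yaw, L])   # (dLx, dLz, L)

def joint_from_task(x, y, z, tol=1e-9):
    """Closed-form style inverse for this simple single-arc model."""
    plane = np.arctan2(y, x)
    d = np.hypot(x, y)
    if d < tol:
        return 0.0, 0.0, z
    theta = 2.0 * np.arctan2(d, z)                # bend angle from the chord direction
    r = d / (1.0 - np.cos(theta))
    L = r * theta
    return theta * np.cos(plane), theta * np.sin(plane), L

tip = forward_task(np.deg2rad(40), np.deg2rad(10), 0.05)
print("tip:", np.round(tip, 4), " joints back:", np.round(joint_from_task(*tip), 4))
```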
Referring back to
This functional block is depicted in further detail in
In one embodiment, the roll correction angle is determined through experimental experience with a particular instrument and path of navigation. In another embodiment, the roll correction angle may be determined experimentally in-situ using the accurate orientation data available from the preferred localization systems. In other words, with such an embodiment, a command to, for example, bend straight up can be executed, and a localization system can be utilized to determine the angle at which the deflection actually occurred, thereby determining the in-situ roll correction angle.
Referring briefly back to
Tension within control elements may be managed depending upon the particular instrument embodiment, as described above in reference to the various instrument embodiments and tension control mechanisms. As an example,
Referring back to
Referring to
In particular, the lead filter embodiment in
In an embodiment where a tuned master following mode is paired with a tuned primary servo loop, an instrument and instrument driver, such as those described above, may be “driven” accurately in three-dimensions with a remotely located master input device. Other preferred embodiments incorporate related functionalities, such as haptic feedback to the operator, active tensioning with a split carriage instrument driver, navigation utilizing direct visualization and/or tissue models acquired in-situ and tissue contact sensing, and enhanced navigation logic.
Referring to
Referring to
A conventional Jacobian may be utilized to convert a desired force vector (352) to torques desirably applied at the various motors comprising the master input device, to give the operator a desired signal pattern at the master input device. Given this embodiment of a suitable signal and execution pathway, feedback to the operator in the form of haptics, or touch sensations, may be utilized in various ways to provide added safety and instinctiveness to the navigation features of the system, as discussed in further detail below.
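A minimal sketch of the Jacobian-based mapping described above follows: a desired force vector at the master device handle is converted to joint torques via the transpose of the device Jacobian; the Jacobian values here are synthetic examples, not parameters of any particular master input device.

```python
# Minimal sketch of the Jacobian-transpose mapping described above: a desired
# force vector at the master device handle is converted into joint torques for
# the master's motors via tau = J^T f. The Jacobian here is a synthetic example.
import numpy as np

J = np.array([[0.08, 0.00, 0.02],    # hypothetical 3x3 master-device Jacobian
              [0.00, 0.07, 0.01],    # (handle velocity per joint velocity, m/rad)
              [0.01, 0.02, 0.09]])

desired_force = np.array([0.0, -1.5, 0.5])   # N, e.g. a workspace-boundary cue
motor_torques = J.T @ desired_force          # N*m at each master joint
print(np.round(motor_torques, 4))
```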
The transformed desired instrument position (359) may then be sent down one or more control pathways to, for example, provide haptic feedback (360) regarding workspace boundaries or navigation issues, and provide a catheter instrument position control loop (361) with requisite catheter desired position values, as transformed utilizing inverse kinematics relationships for the particular instrument (362) into yaw, pitch, and extension, or "insertion", terms (363) pertinent to operating the particular catheter instrument with open or closed loop control.
As discussed above, a system that controls a shapeable instrument can be improved using a shape measurement. The shape measurement relies upon a localization system as described above. In most cases the localization system generates a plurality of data defining real-time or near real-time positional information, such as X-Y-Z coordinates in a Cartesian coordinate system, as well as orientation information relative to a given coordinate axis or system. Typically, the reference of the coordinate system can be taken from one or more points along the shapeable instrument. In additional variations, the reference of the coordinate system can be taken from one or more points on the anatomy, on the robotic control system, and/or on any other point as required by the particular application. As noted herein, the methods, systems, and devices described herein are useful for device shape sensing that uses the shape data to improve catheter control, estimation, visualization, and diagnosis.
Shape sensing is one observation into the state of a flexible device.
The following disclosure includes combining different pieces of information (e.g., position) with shape to estimate other unknowns (e.g., tissue contact). Shape is a very important piece of information for a flexible robot. Moreover, the value of shape increases when it is combined with other information channels. For example, shape combined with a solid mechanics model of the flexible section can be used to calculate possible forces acting on the device. More simply, with the position of the base and the shape, the position of the tip is known. This is described further below.
For simple flexible devices, shape can be estimated, albeit correctly only in near-ideal scenarios, from tip position and orientation, base position and orientation, and a model of the device. To estimate shape effectively, position and orientation information is needed at multiple points in the device. Just as shape is complex to estimate, it is correspondingly useful for estimating other quantities. If the device can be assumed to follow a perfect arc (a simple model), knowing the tip and base orientations and the path length (a subset of full position information) is sufficient information to estimate shape from simple geometric principles.
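For the perfect-arc case mentioned above, the following sketch shows the simple geometric principle: the bend angle is the angle between the base and tip tangents, the radius follows from the path length, and the tip offset is computed in the bend plane; the example vectors are arbitrary, and the function name is illustrative.

```python
# Minimal sketch of the simple geometric principle mentioned above: if a device
# is assumed to follow a single circular arc, the base tangent, tip tangent and
# path length determine the arc. Vectors here are examples.
import numpy as np

def arc_from_tangents(t_base, t_tip, length):
    """Return (bend_angle_rad, radius_m, tip_offset_from_base_m) for a circular arc."""
    t_base = t_base / np.linalg.norm(t_base)
    t_tip = t_tip / np.linalg.norm(t_tip)
    theta = np.arccos(np.clip(np.dot(t_base, t_tip), -1.0, 1.0))
    if theta < 1e-9:
        return 0.0, np.inf, t_base * length
    radius = length / theta
    # Tip position in the bend plane spanned by t_base and the turn direction.
    normal = t_tip - np.dot(t_tip, t_base) * t_base
    normal /= np.linalg.norm(normal)
    offset = radius * np.sin(theta) * t_base + radius * (1.0 - np.cos(theta)) * normal
    return theta, radius, offset

theta, radius, tip_offset = arc_from_tangents(np.array([0.0, 0.0, 1.0]),
                                              np.array([0.0, 1.0, 1.0]), 0.08)
print(f"bend {np.degrees(theta):.1f} deg, radius {radius:.4f} m, tip offset {np.round(tip_offset, 4)}")
```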
Tissue contact is useful for constructing geometric models of the environment or safely traversing constrained volumes. Contact can be inferred from shape deflection from the intended shape. Analogously, forces may also be inferred from this difference in shape. We would like to simplify the estimate requirements to the most straightforward or smallest subset. Thus, we can say tip contact can be estimated from distal forces (which might be measured with local strain gauges or also inferred from IntelliSense measurements). If a three-dimensional model already exists, measured position registered to the model should indicate whether the device is in contact near the measurement.
If shape, base position, and base orientation are known, tip position and orientation are straightforward to calculate. Considering that tip position and orientation may be measured directly with off-the-shelf products (NavX, Carto), it may be more useful to consider the case with shape and tip position and orientation measurements. Base position and orientation can be calculated with shape and tip information. Further, position and orientation at any point along the path of the device are also known. With a geometric model of the environment, knowledge of the entire device position can be used to prevent or reduce contact or to plan paths.
Since a device model (along with known actuator inputs) should allow estimation of ideal device shape, real shape could be used to adapt the model to achieve closer agreement between expected and measured shape. Device contact with the environment is important because it will cause its own shape deflection and thus the model adaptation could ignore data when in contact or include contact forces in the estimation (as discussed further below).
Distal forces can be estimated from shape, a solid mechanics device model, and knowledge of contact. If the device is inserted straight but the measured shape indicates a bend, the amount and point of application of force must be estimated. The device will bend differently if force is applied at the midpoint of the length than if force is applied at the tip. The shape of the bends, along with their degree and the solid mechanics model, will indicate external forces applied to the device. More detailed descriptions of the estimation of force and the use of force data are provided below.
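As a hedged illustration of how a solid mechanics model turns a measured deflection into a force estimate, the sketch below uses a simple cantilever point-load model; the stiffness parameters are placeholders rather than properties of any particular instrument, and the example also shows why the estimated force depends strongly on where along the length the load is assumed to act.

```python
# Minimal sketch of estimating a point force from measured deflection using a
# simple cantilever beam model, illustrating the idea above. The stiffness
# parameters are placeholders, not properties of any particular instrument.
import numpy as np

E = 2.0e9            # Pa, assumed effective modulus of the flexible section
I = 1.0e-13          # m^4, assumed area moment of inertia
L = 0.10             # m, unsupported length

def force_from_tip_deflection(delta):
    """Point load applied at the tip of a cantilever: delta = F L^3 / (3 E I)."""
    return 3.0 * E * I * delta / L**3

def force_from_midspan_deflection(delta, a=L / 2):
    """Point load applied at distance a from the base, with deflection measured
    at the load point: delta(a) = F a^3 / (3 E I)."""
    return 3.0 * E * I * delta / a**3

print(f"{force_from_tip_deflection(0.005):.3f} N if the load acts at the tip")
print(f"{force_from_midspan_deflection(0.005):.3f} N if the load acts at mid-length")
```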
Knowledge of contact can be used to create a model of the environment in the internal reference frame. Position information registered to an external reference frame allows such a model to also be registered. Shape is not indicated in the table for estimating a geometric environment model, but shape is important for estimating contact.
Finally, positioning element tension (and other internal device forces) can be estimated using shape and the solid-mechanics model of the device (assuming no contact). Positioning element tension may also be used in estimating or improving estimates of other items of this set, but shape is more important or useful.
Task Space Feedback:
A task space application involves those situations where a robotic system attempts to position a working end of a shapeable device at a target region or goal. Applying shape feedback information to a task space application can assist in producing the desired task space positions and motions.
The shape of a catheter, when given some reference, can be integrated to yield position or orientation, such as tip position estimation (530) represented in
The task space can be specified in terms of the distal tip motion; however, a given task may also be concerned with intermediate proximal points, for example to traverse a path. Therefore, task space control can include the explicit control of one or more positions and orientations along the length of the flexible device. In the simplest case, the model assumes free motion of all sections (such as in an open heart chamber) where the system controls the distal tip position to apply some therapy such as ablation. In this case, the system feeds back an estimated tip position and orientation and compares it against an input reference position as seen in
This error signal can then go through some compensator (550), such as a time derivative and gain, to yield a command such as a tip velocity. The forward kinematics can then be inverted differentially (552) (e.g., via a Jacobian pseudo-inverse or other non-linear constrained optimization technique) to translate the velocity command into an actuator command (such as displacement or tensioning of a positioning element). These actuator commands in turn put a force on the instrument, and as it interacts with the environment (captured together as the plant (554)), the sensor will read the new position/orientation for further feedback. If the sensor were not available, the feedback could simply be the model forward kinematics (556), which would at least prevent integration error in this scheme.
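A minimal sketch of one iteration of this feedback scheme is shown below. The gain, Jacobian, and dimensions are assumptions chosen for illustration; the sketch is not the actual controller comprising elements (550)-(556).

```python
import numpy as np

def task_space_step(p_ref, p_meas, jacobian, gain=1.0, dt=0.01):
    """One iteration of tip-position feedback: error -> compensator ->
    differential inverse kinematics (Jacobian pseudo-inverse) -> incremental
    actuator displacement command (a sketch with assumed units and gain)."""
    error = p_ref - p_meas                  # task-space error signal
    v_cmd = gain * error                    # simple proportional compensator
    dq = np.linalg.pinv(jacobian) @ v_cmd   # differential inverse kinematics
    return dq * dt                          # incremental actuator command

# example with an assumed 3x4 Jacobian (3 tip DOF, 4 positioning elements)
J = np.array([[1.0, 0.0, -1.0, 0.0],
              [0.0, 1.0, 0.0, -1.0],
              [0.2, 0.2, 0.2, 0.2]])
dq = task_space_step(np.array([10.0, 5.0, 30.0]),
                     np.array([9.0, 5.5, 29.0]), J)
print(dq)
```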
One example of task space feedback control using the scheme shown in
The preceding example only uses the measured tip position as a means of generating an error signal. However, the sensor feedback can be used for other purposes. For example, the Jacobian (or inverse kinematics) can be adaptive based on shape or angle, or other feedback as described previously. This can be important because the inverse kinematics themselves assume a known configuration of the catheter and updated information would be beneficial. The adaptive component could be as simple as updating configuration parameters such as angle and curvature, or could be as complex as a learning algorithm that adapts over time to learn the mapping from commanded to achieved velocity.
An even simpler form of distal tip feedback control is simply to run the normal inverse kinematics as in
These methods for controlling the distal tip position and orientation could also be applied to more intermediate points of interest, provided sufficient degrees of freedom are available. In some variations, it may be necessary to specify weightings or some other criteria on specific task goals if the dimension of the task space is greater than the dimension of the actuator space. In variations where the system controls multiple points simultaneously, the system might select directions that align axially with the local point so as to move along the present path. Alternatively, the system can place emphasis on distal degrees of freedom, since proximal motions have large effects on distal points due to the moment arm from proximal constraints. These trade-offs motivate lower level control of the device in configuration space.
Configuration Space Feedback:
Shape can describe the configuration of a device better than feedback of a single position or orientation. Having the capability to obtain shape information allows for the current shape of a device (including angle, curvature, profile, torsion, etc.) to be fed back into a controller. The controller can then process a command to actuate the device into a desired shape.
Because both pure feed forward and pure feedback topologies have clearly identifiable advantages and disadvantages, the topologies can be combined as shown in
Tracking with Shape Information:
In an additional variation, systems, methods and devices using shape information can be used to track advancement of a shapeable instrument through an anatomic path. The anatomic path can include a path through a vessel, organ, or between organs. For example, the anatomic path can include a path through one or more vessels to access the heart. Alternatively, the path can include navigation through bronchial passages or the digestive tract. The robotic system then monitors the shape of the instrument for any changes in shape that would indicate the need for a corrective action.
To protect against this situation, the robotic system (52) monitors the shape of one or more portions of the instrument (50) to confirm the expected shape or to detect unexpected shapes such as the sharp bend (56) shown in
Although tracking of shape is beneficial when the instrument is advanced using a robotic system, tracking of shape can also benefit procedures in which the instrument is manually advanced or in which advancement is only assisted by a robotic system. In such cases, the instrument will be coupled to a system that can provide information to the operator regarding the shape of the instrument as described above.
Reduced Model Control:
One of the primary benefits of feedback control is avoiding reliance on models and their associated errors. The less data that is available, the more important a model becomes. With full state feedback of some combination of shape, position, orientation, or deformation over a sufficient history, the relationships between actuator and position or shape can be learned. For example, during a procedure the robotic system can use shape data to correlate a map between force on one or more positioning elements and multi-section bending of the instrument. This map can be inferred from many measurements. For a particular region, this map may depend on the anatomical constraints and may need to be refreshed (or continuously updated) if conditions are significantly altered. To supplement the data, a model could be used for the anatomy to estimate its presence and characteristics as it interacts with the catheter. This mapping could at one extreme serve as a black box within the controller (508) in
A simple use of reduced model control could be to monitor the mapping between proximal instrument insertion and distal tip motion. A distal tip of the instrument can move in almost any direction under proximal insertion depending on distal contact with anatomy. Therefore, knowing this mapping between proximal insertion and distal motion is of critical importance in maintaining intuitive robotic control. In the simplest form, the user could request a calibration in which the proximal insertion would be dithered and the distal tip motion measured. Subsequent movements of the master input device in the measured direction would then map to distal insertion and preserve intuitive driving.
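One way to picture this calibration is the following sketch (hypothetical interfaces and data): the proximal insertion is dithered, the resulting distal tip displacements are measured, and the dominant response direction is stored so later master motions along that direction can be mapped to insertion.

```python
import numpy as np

def calibrate_insertion_direction(insert_dither, tip_positions):
    """Estimate the distal direction of motion produced by proximal insertion.

    insert_dither : array of proximal insertion values during the dither [mm]
    tip_positions : (N, 3) array of measured tip positions during the dither
    Returns a unit vector giving the distal response direction (least-squares
    fit of tip motion against insertion; a sketch under assumed sensing)."""
    d_insert = np.diff(np.asarray(insert_dither, dtype=float))
    d_tip = np.diff(np.asarray(tip_positions, dtype=float), axis=0)
    # least-squares slope of tip motion per unit insertion, per axis
    slope = (d_insert @ d_tip) / (d_insert @ d_insert)
    return slope / np.linalg.norm(slope)

def master_motion_to_insertion(master_delta, response_dir):
    """Project a commanded master motion onto the calibrated response
    direction to obtain an insertion command component."""
    return float(np.asarray(master_delta, dtype=float) @ response_dir)

# assumed dither log: insertion values and the corresponding tip positions
insert = [0.0, 1.0, 2.0, 1.0, 0.0]
tips = [[0, 0, 0], [0.8, 0.1, 0.5], [1.6, 0.2, 1.0],
        [0.8, 0.1, 0.5], [0, 0, 0]]
direction = calibrate_insertion_direction(insert, tips)
print(direction, master_motion_to_insertion([1.0, 0.0, 0.0], direction))
```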
To further reduce the model, it might be possible to instrument a non-robotic catheter with shape sensing for a given procedure or sub-routine of a procedure. The specific goals of each part of the procedure could be stated and associated with the measured catheter state. This association could provide the basis for a mapping between user intent and catheter result that could be applied to a robotic system. The user could then simply operate the robotic system in a supervisory fashion, where the user provides a command such as "cannulate the carotid" and this command is then translated to a catheter shape and then to actuator commands. This process could be largely based on data history and not be overly reliant on complex models.
Different device and control architectures, shape sensing accuracies, and device configurations require different modeling trade-offs. A less accurate shape sensor may require more device modeling to achieve accurate control. Thus, models are useful in many scenarios. Kinematic models of robotic devices are very useful, and their strong geometric basis is well informed by shape. As discussed below, systems and methods use shape information to adjust or inform kinematic models.
Applying Shape Information to Kinematic Models:
Due in large part to the open-loop nature of the existing instrument control, the instrument mechanics algorithms can be dependent upon an accurate kinematic model that represents the mechanical and physical characteristics and parameters of the instrument. Some of the parameters in this model are "tuned" at development/design time, while others are characterized on an individual device basis or lot-by-lot basis as part of the manufacturing process. A partial list of these model parameters includes: segment length, overall length, diameter, bending stiffness, axial stiffness, and control positioning element stiffness.
In one variation, systems and methods disclosed herein can use an instrument's measured shape data in combination with commanded shape, commanded tendon displacements, measured tendon displacements, and any other available sensor data (e.g., measured tendon tensions) to estimate improved parameters for the instrument model used by the instrument mechanics algorithms. This combination can modify an instrument's configuration control algorithms as discussed below. Alternatively or in combination, the combination can allow for new parameter values that make the control model more accurately match the specific instrument being manipulated. The estimation of these improved parameters can be accomplished with existing model-fitting, machine learning, or adaptive control techniques.
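One such model-fitting step might be sketched as below. The forward model, logged data, and single lumped stiffness parameter are assumptions for illustration; the actual instrument mechanics algorithms are not reproduced here.

```python
import numpy as np
from scipy.optimize import least_squares

def predict_tip_angle(tendon_disp, stiffness, moment_arm=1.0):
    """Hypothetical forward model: tip bend angle predicted from tendon
    displacement using a single lumped stiffness parameter."""
    return moment_arm * tendon_disp / stiffness

def fit_bending_stiffness(tendon_log, angle_log, initial_stiffness=1.0):
    """Fit the lumped stiffness so predictions best match measured angles."""
    def residuals(k):
        return predict_tip_angle(tendon_log, k[0]) - angle_log
    result = least_squares(residuals, [initial_stiffness],
                           bounds=(1e-6, np.inf))
    return result.x[0]

# assumed logged data: tendon displacements [mm] and measured bend angles [rad]
tendon_log = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
angle_log = np.array([0.0, 0.21, 0.40, 0.61, 0.79])
print(fit_bending_stiffness(tendon_log, angle_log))
```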
Another application of shape sensing data involves improving the performance of instrument driving when a mismatch occurs between an assumed and a real kinematic relationship. For example, as shown in
In a general implementation, the adaptive kinematics module will likely combine measured shape information with knowledge about the commanded shape, actuator commands, and any other sensory information that may be available (e.g. measured pullwire or positioning element tensions). Several more specific algorithmic implementations are listed below.
One variation of an adaptation scheme includes a "trial and error" type approach. This approach requires an initial guess as to instrument position with a subsequent collection of the shape data to determine the actual result. The control system then compares the initial guessed position or shape with the actual measured resulting shape and uses the measured error signal to learn from the error and improve the correlation. In this sense, the control system uses an original or idealized inverse kinematic model to compute the initial robot control commands whenever the user starts driving in a new portion of the anatomy or driving in a previously unexplored direction. Because of this, the instrument's response to the user's first motion commands should contain the greatest error. As the user makes repeated efforts to access a target, the response of the control system in manipulating the instrument should increase in accuracy.
In another variation, an adaptation scheme could include building a lookup table of commanded catheter configurations versus actually achieved catheter shapes and their corresponding tip positions relative to the base coordinate frame.
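Such a lookup table could be as simple as the following sketch (hypothetical keys and stored values), which records achieved results keyed by the commanded configuration and, at query time, returns the nearest previously achieved result.

```python
import numpy as np

class ConfigurationLookup:
    """Store commanded configurations vs. achieved shapes/tip positions and
    answer nearest-neighbor queries (a minimal illustrative sketch)."""

    def __init__(self):
        self.commands = []   # commanded configurations (e.g. angle, curvature)
        self.achieved = []   # measured shapes or tip positions in base frame

    def record(self, command, achieved):
        self.commands.append(np.asarray(command, dtype=float))
        self.achieved.append(achieved)

    def nearest(self, command):
        """Return the achieved result whose command is closest to the query."""
        command = np.asarray(command, dtype=float)
        dists = [np.linalg.norm(c - command) for c in self.commands]
        return self.achieved[int(np.argmin(dists))]

table = ConfigurationLookup()
table.record([0.5, 0.01], {"tip": (10.0, 2.0, 30.0)})   # assumed units
table.record([1.0, 0.02], {"tip": (12.0, 5.0, 28.0)})
print(table.nearest([0.9, 0.02]))
```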
In one example, adapting a kinematic model can be beneficial on a temporary basis. For example, if the shapeable instrument encounters an anatomic constraint during navigation, the behavior of the instrument will be affected as long as it encounters the constraint. For instance, a common occurrence is where a shapeable instrument comprises a catheter in which a proximal portion of the catheter is largely constrained by a blood vessel and only a distal portion of the catheter is free to articulate. In this case, adapting the kinematic model can be as simple as estimating a new effective articulation length by sorting the length of the catheter into a section that, upon measuring shape, is observed to move freely and a section that generally is not changing shape (or encounters significant resistive force when changing shape).
In another example of adapting a kinematic model, a measured shape of an instrument can be used to compute curvature as a function of arc length along the instrument. By analyzing the shape data for sharp transitions or discontinuities, the assumed articulating length of the catheter can be broken up into several subsections that may be behaving differently from each other, but within each subsection the behavior more accurately matches the assumed constant curvature arcs. Based upon this auto-segmentation, the inverse kinematics can be computed as a chain of several smaller subsections, each of which can be accurately computed with the traditional kinematic models.
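The auto-segmentation might be sketched as follows (assumed sampling density and jump threshold): compute discrete curvature from the measured shape, then split the articulating length wherever the curvature changes sharply between adjacent samples.

```python
import numpy as np

def curvature_from_shape(points):
    """Approximate curvature at interior samples of a measured shape using
    the discrete turning angle divided by local arc length (a sketch)."""
    points = np.asarray(points, dtype=float)
    v = np.diff(points, axis=0)
    seg_len = np.linalg.norm(v, axis=1)
    unit = v / seg_len[:, None]
    cos_angle = np.clip(np.einsum("ij,ij->i", unit[:-1], unit[1:]), -1.0, 1.0)
    turn = np.arccos(cos_angle)
    ds = 0.5 * (seg_len[:-1] + seg_len[1:])
    return turn / ds

def segment_by_curvature(points, jump_threshold):
    """Split the shape into subsections wherever curvature jumps sharply;
    each subsection can then be treated as its own constant-curvature arc."""
    kappa = curvature_from_shape(points)
    breaks = np.where(np.abs(np.diff(kappa)) > jump_threshold)[0] + 1
    return np.split(kappa, breaks)   # curvature samples per subsection
```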
In another variation, shape data can be overlaid, as shown in
Improved Kinematic Models
An initial or existing inverse kinematic model is typically based upon a parametric model that describes the number and nature (length, degrees of freedom, coupling, etc.) of bending segments of the shapeable instrument. One variation of improving the kinematic model is to adapt the model via a model fitting exercise. The system or an operator can engage the robotic system to perform a training routine in which the shapeable instrument moves through a series of locations and shapes. Once sufficient training data is collected on the observed correlation between instrument configuration (shape) and tip position, one or more model fitting techniques can find the parameter set for the kinematic model that produces predictions that best match observations or that produces the predictions desired for the particular application. This best fit model is then used to compute the inverse kinematics for future iterations.
In yet another variation of adapting the model, an implementation of an adaptive kinematics module could use some intelligent combination of several of the concepts described herein. Keep in mind that while all of these approaches deal with the translation of commands from task space to configuration space, they do not necessarily all employ the exact same description of the catheter in joint space.
Adaptive Jacobian
Another approach to adapting kinematics is to solve the inverse kinematics based on velocity rather than position. Typically, at each time step the system solves for an instrument configuration that achieves a desired tip position. Instead, the system can solve for the change in shape of the instrument, relative to its current configuration or shape, where the change produces a desired incremental movement or velocity of the instrument's tip. Mathematically, the general relationship between these velocities is the derivative of the kinematic relationship between positions; this relationship is referred to as the Jacobian. One significant benefit of an adaptive Jacobian model is that exact kinematic relationships between positions are not needed. The model could simply infer, estimate, or learn the Jacobian from the sensed shape information. However, when combined with other methods, determining those relationships would be required.
One example of a very simple adaptive Jacobian approach is a modified control architecture in which a configuration space command is a measure of how hard the system is trying to bend or direct the shapeable instrument in a particular direction. Using a measurement of the shape of the instrument and its associated tip orientation, the adaptive Jacobian approach translates a tip velocity command into configuration commands as follows (see the sketch following this list):
1) To move the tip laterally (relative to the tip orientation) in the direction of the catheter bend, apply more bending effort to the catheter segment.
2) To move the tip laterally (relative to the tip orientation) in the direction opposite the catheter bend, apply less bending effort to the catheter segment.
3) To move the tip in the direction it is pointed, insert the catheter segment.
4) To move the tip in the opposite direction than it is pointed, retract the catheter segment.
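These four rules can be sketched as a bare decomposition of the tip velocity command (hypothetical command names and scaling; a practical mapping would blend and scale these components rather than use them directly).

```python
import numpy as np

def tip_velocity_to_config_command(v_tip, tip_axis, bend_dir,
                                   k_bend=1.0, k_insert=1.0):
    """Decompose a commanded tip velocity into a bend-effort change and an
    insertion change using the measured tip axis and bend direction (sketch).

    v_tip    : desired tip velocity vector
    tip_axis : unit vector along which the tip is pointing
    bend_dir : unit vector in the lateral direction of the current bend
    """
    v_tip = np.asarray(v_tip, dtype=float)
    insert_rate = k_insert * float(v_tip @ tip_axis)   # rules 3 and 4
    bend_rate = k_bend * float(v_tip @ bend_dir)       # rules 1 and 2
    return {"delta_bend_effort": bend_rate, "delta_insertion": insert_rate}

cmd = tip_velocity_to_config_command([1.0, 0.5, 0.0],
                                     tip_axis=np.array([1.0, 0.0, 0.0]),
                                     bend_dir=np.array([0.0, 1.0, 0.0]))
print(cmd)  # positive values: more bend effort / insert; negative: less / retract
```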
As noted above, one advantage of this control model is reduced dependence upon accurately modeling all aspects of the catheter and the environment with which it interacts. The model simply relies on moving the shapeable instrument and its tip from its current position to a desired position.
Real Shape Display
Estimation of an instrument's mechanical parameters can be performed post-manufacturing and set statically for the procedure. Alternatively or in combination, mechanical and other parameters can be updated intraoperatively to improve control. When the properties of a flexible section are used to determine control inputs, system feedback can be used to estimate those properties. As a basic example, the stiffness of a section of the instrument can be used to decide how much to displace positioning elements (e.g., pull tension wires) to produce a desired configuration. If the bulk bending stiffness of a section is different from the existing stiffness parameters, then the section will bend too little or too much and will not produce the desired configuration. Using shape data to assess the shape of the instrument, a new bending stiffness can be continuously updated based on the degree to which the positioning elements are displaced and the resulting actual bending. A recursive-least-squares technique can be used to quickly recalculate a new bending stiffness as new data is measured; however, any number of other methods of adapting parameters can also be used.
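A scalar recursive-least-squares update of this kind might look like the following sketch. The linear relation between positioning element displacement and resulting bend, the forgetting factor, and the example data are assumptions for illustration.

```python
class ScalarRLS:
    """Recursive least squares for a single parameter in y = theta * u
    (e.g. bend angle per unit positioning-element displacement)."""

    def __init__(self, theta0=1.0, p0=100.0, forgetting=0.98):
        self.theta = theta0      # current parameter estimate
        self.p = p0              # estimate covariance
        self.lam = forgetting    # forgetting factor for slowly varying parameters

    def update(self, u, y):
        """Incorporate one measurement pair (input u, output y)."""
        gain = self.p * u / (self.lam + u * self.p * u)
        self.theta += gain * (y - self.theta * u)
        self.p = (self.p - gain * u * self.p) / self.lam
        return self.theta

# assumed data: tendon displacements [mm] vs. measured bend angles [rad]
rls = ScalarRLS()
for u, y in [(1.0, 0.19), (2.0, 0.41), (3.0, 0.62), (4.0, 0.78)]:
    estimate = rls.update(u, y)
print("bend per unit displacement:", estimate)
print("effective bending stiffness ~ 1 / estimate:", 1.0 / estimate)
```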
However, in adapting the actual instrument parameters, it will be useful to determine if the instrument is in contact with the environment. When the instrument is operated in free space, the parameters can be adapted to reduce the difference between the desired configuration and the predicted configuration. However, if the instrument (1) is not bending as far as expected due to contact with an external object (49), as shown in
One instrument state that is difficult to observe in practice is compression or extension along a longitudinal axis of the shapeable instrument. Knowledge of the compression or extension of the instrument is important in the control of the instrument because the actuators often act in series with this axial mode. For example, positioning elements or tendon actuation can be used to control an instrument by routing the positioning elements through a conduit along a length of the instrument. The control elements can be terminated at the distal end and actuated proximally to alter a shape of the instrument. In this case, compression of the instrument also affects the positioning elements and could even lead to slack. Therefore, the axial compression or extension should be known or estimated to maintain improved control authority. In addition, the axial deformation can be important if some other member is routed co-axially with the shapeable instrument. For example, an ablation catheter can be routed down a central through lumen. If the instrument compresses, the ablation catheter becomes further exposed. Clinically, this exposure, or protrusion, can cause difficulty in performing articulations in small spaces since the uncontrolled ablation catheter will be sticking out further and occupying significant volume.
Ideally, an observer or a localization system could provide complete deformation information for the entire length of the device (similar to Finite Element Method results), including the axial deformation. In reality, measurements only of select states are practical, each potentially requiring its own unique sensor. In the case of axial deformation, there are a number of potential methods to sense the desired information. For example in
Alternatively, as shown in
This linear motion of the fiber (17) can be measured by a linear encoder or similar method. Since the bending can be calculated using the Bragg Gratings or another localization system, the manipulator's axial compression can be found from this proximal linear motion of the fiber. Additional variations of this configuration allow for any other flexible element to be used instead of the freely floating fiber. Such flexible elements include, but are not limited to: a wire, coil tube, or actuation element so long as its own axial deformation is observable or reasonably assumed to be negligible. If such axial deformation is predicted to be significant, the force on the elongate element could be measured to estimate its own axial deformation. Similarly, the force of all actuation elements could be measured or modeled to estimate the total axial deformation of the flexible manipulator itself.
Given information of axial deformation, there are various steps that can be taken to make use of this information. For example, as shown in
As previously mentioned, the axial deformation can be important simply for maintaining control authority, as it affects the actuators. For example, if the axial deformation were accurately known, that quantity could be adjusted for via the actuators that drive the positioning elements, in addition to the amount due to bending and deformation of the positioning elements themselves. Furthermore, with accurate knowledge of the axial deformation combined with some knowledge of bending, the absolute angle of a point on the catheter may be identifiable with respect to the robot. For example, if the total force on the catheter is measured or modeled, the axial deformation can be inferred based on stiffness.
An alternative way to measure the axial deformation would be to obtain localization data (from a sensor, image processing, etc.) of the ablation catheter differenced from the inner guide catheter. Cumulative bending and axial deformation can also be obtained from the amount by which a coil tube (60) (which is axially rigid and allowed to float in the catheter) displaces out of the proximal end of the instrument (1). The coil tube (60) displacement less the axial deformation estimate yields pure bend information that indicates the angle of deflection A of the distal coil tube termination with respect to the robot as shown in
Pre-Tensioning
When a shapeable instrument is coupled to a robotic control mechanism, there is typically an unknown amount of slack in the positioning elements. If the catheter is to be effectively controlled by position control (rather than tension control) of each of the positioning elements, this unknown positioning element slack must be removed by finding appropriate position offsets for each positioning element. Pre-tensioning is the process of finding out how much slack is in each tendon and removing it.
Several sources of initial slack in the tendons are illustrated in
The first assumption is important because pre-tensioning is intended to find the "zero point" for the control element displacements. It is from this "zero point" that control element commands are added to displace the control element to control the shapeable device. If the articulating portion of the catheter is in fact bent during pre-tensioning, this will introduce an un-modeled disturbance to the catheter control algorithms and result in degraded tracking of user commands. In theory, this assumption could be relaxed to require only that the articulating portion of the catheter be in any known configuration during pre-tensioning. However, without shape sensing, a straight catheter (held there by the catheter's internal bending stiffness) is the most practical configuration to assume.
The second assumption comes from the fact that any motion in the non-articulating portion of the instrument (1) introduces additional offsets to the positions of the positioning elements (62). Because the configuration of the non-articulating portion of the instrument (1) is neither modeled nor (without shape sensing) sensed, the instrument (1) control algorithms are unaware of these changes in the positioning element (62) displacements, therefore resulting in degraded tracking of user commands. Conventionally, pre-tensioning the instrument (1) occurs after it has been inserted into the patient and taken the shape of the patient's anatomy. If subsequent significant changes to the shape of the non-articulating portion of the instrument (1) occur, the operator must straighten out the articulating portion and re-execute pre-tensioning to account for these changes.
Shape sensing provides an opportunity to relax both of these assumptions. The articulating portion of the instrument (1) could be pre-tensioned in any configuration by using the shape sensing system to measure the configuration of the catheter at the completion of pre-tensioning. The measured pretension offsets are then corrected by computing the deviation in position of the positioning element (62) due to the measured configuration. This computation itself is very similar to portions of the existing instrument (1) control algorithms.
The non-articulating portion of the instrument (1) can also be allowed to move after pre-tensioning by continually or periodically measuring the shape of the non-articulating portion. The pretension offsets can then be corrected by computing the deviation in tendon position due to the measured change in the shape of the non-articulating portion of the instrument (1).
Reaction Force
While contact deflection can disturb catheter parameter adaptation, use of shape data allows contact deflection to be used to estimate the contact force. If the bending stiffness of a section is known, or has been adapted before coming in contact with an obstruction (49), the deviation between a predicted shape and the actual shape as measured can reveal where the instrument (1) makes contact with the environment or obstruction (49) and the degree of force. If the estimate of bending stiffness, calculated simply from tendon positions and shape, were to rapidly change in conjunction with commanded movement, the device could be assumed to have come into contact with its environment. The new bending stiffness should be ignored in favor of that calculated prior to the rapid change. Since the shape depends on the control inputs, the device properties, and the force from the environment, the force from the environment can be determined or estimated once the control inputs and device properties are known. For example, as shown in
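A contact-detection sketch along these lines (hypothetical threshold and units) could watch the apparent stiffness computed from tendon displacement and measured bend, flag contact when it jumps, and hold the last pre-contact estimate for use in the force calculation.

```python
def monitor_apparent_stiffness(tendon_disp, bend_angle, history,
                               jump_fraction=0.3):
    """Track the apparent stiffness (tendon displacement per unit bend) and
    flag contact when it changes rapidly relative to the recent history
    (a sketch; units and threshold are assumed).

    Returns (in_contact, stiffness_to_use)."""
    if abs(bend_angle) < 1e-6:
        return False, history[-1] if history else None
    apparent = tendon_disp / bend_angle
    if history:
        reference = sum(history) / len(history)
        if abs(apparent - reference) > jump_fraction * abs(reference):
            # rapid change: assume environmental contact; keep prior estimate
            return True, reference
    history.append(apparent)
    return False, apparent

history = []
for disp, bend in [(1.0, 0.20), (2.0, 0.41), (3.0, 0.60), (4.0, 0.50)]:
    contact, stiffness = monitor_apparent_stiffness(disp, bend, history)
    print(contact, round(stiffness, 2))
```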
Estimating Force Applied to Anatomy with Shape Information
With no interaction with the environment, a flexible device will have a default shape for a given sequence of actuator inputs. If the flexible device contacts the environment, the device (or a portion of it) will deflect in shape along portions of the device, again as shown in
Also, if there are multiple possible shapes for a given set of input configurations (i.e., path dependence), the previous configuration may also need to be considered in solving for the forces.
If the shape is not tracking the command, the device could be in contact with tissue. Constantly observing the error between commanded shape and actual shape allows rapid detection of contact. An obvious metric for this error is the norm of the distance between measured and commanded positions along the path of the catheter (any number of norms could be used). However, this metric would amplify disturbances in curvature at one point (since such a disturbance is integrated into position). Another metric is the mean square error between commanded and measured curvature at each point. This weights the points more evenly and attributes error to the afflicted location, rather than to the effect of the error on other points as the distance metric does.
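The two metrics can be sketched directly, assuming the commanded and measured shapes are sampled at the same arc-length stations.

```python
import numpy as np

def position_error_norm(commanded_pts, measured_pts):
    """Norm of point-to-point distances along the path; a local curvature
    disturbance is integrated into position error further along the device."""
    d = np.linalg.norm(np.asarray(measured_pts, dtype=float)
                       - np.asarray(commanded_pts, dtype=float), axis=1)
    return float(np.linalg.norm(d))

def curvature_mse(commanded_kappa, measured_kappa):
    """Mean square error between commanded and measured curvature at each
    point; weights points evenly and localizes the error where it occurs."""
    diff = (np.asarray(measured_kappa, dtype=float)
            - np.asarray(commanded_kappa, dtype=float))
    return float(np.mean(diff ** 2))

# example with assumed sampled shapes and curvature profiles
print(position_error_norm([[0, 0, 0], [1, 0, 0]], [[0, 0, 0], [1, 0.2, 0]]))
print(curvature_mse([0.1, 0.1, 0.1], [0.1, 0.4, 0.1]))
```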
Once contact is made, there are several possible actions depending on the severity of tissue contact. If no tissue contact is tolerable, the controller generates a signal or stops the instrument from moving and an alarm notice may be posted for the operator. In another variation, the control system may issue commands to automatically retrace motion to move away from the contact by a set distance.
If contact should merely be limited, there are more possibilities. For example, haptic feedback such as a vibration or force can be applied to the input device to indicate contact. Vibration is useful if the location or direction of contact is unknown. However, a reasonable guess of direction is the vector of motion when contact occurred. Force could be added to resist motion in that direction on the input device. Also, slave movement could be de-scaled when in contact.
If force is estimated as previously described, a haptic force could be applied to the input device proportional to the estimated force. The direction of the master haptic force should be that which will reduce the reaction force in the slave.
Use of Force Data for Path Planning
If the environment geometry is known, the forces can be estimated for all actual instrument configurations. Two sub-cases are described where the goal is either known or unknown.
If a goal position in the environment is known, the path to that point can be planned given knowledge of the environment geometry. The best path will depend on the cost associated with forces applied to the instrument (1). For example, one constraint could be that distal tip forces must be below one threshold and body forces must be below another (this could effectively limit the pressure on tissue). These two conditions constrain the possible path to the goal. In fact, there may be no path, and this could be communicated to the operator, allowing the operator to change the thresholds or reposition the starting point. If there is one path that meets the constraints, it is likely there are many. Haptic forces or visual shading may be used to guide the operator to stay in the set of acceptable paths. Also, the path which minimizes peak forces (or sum-of-square forces) may be calculated and light haptic forces could be used to guide the operator to drive directly along that path. Similarly, that path may be automatically driven by the robot.
If the goal is unknown, there may still be a predictable set of acceptable paths through the environment. If the operator is interested in two distinct modes, maneuvering to a specific area and then applying treatment in that area, the user interface can provide a means to switch between the two modes. In the maneuvering mode, the path planner will try to keep the treatment tip and catheter body away from tissue and minimize forces along potential paths. As shown in
Mechanical Failure and Structural Integrity of the Instrument:
The use of shape information also allows for determining different types of mechanical failure.
In the case of a fracture in the structure, bending with non-zero force around the point of fracture results in higher than normal curvature. The curvature may be higher than expected for the device and lead to an immediate failure diagnosis simply by monitoring each point along the path of the device. An example of a block diagram to assess such a failure is shown in
For a more subtle partial fracture, the curvature could be compared to an expected curvature generated by a model and a diagnosis made based on the cumulative residual (e.g., the integral of the norm of the distance between expected and measured shapes), as diagrammed in
A simple model might be a minimum smoothness of curvature, such that a discontinuity at any point indicates a mechanical failure. The expected shape of a device could also be generated by a solid mechanics model taking into account the actuator inputs. The expected curvature for each point would be compared to the measurement of curvature at that point, iterating along the path of the device.
In the case of fracture of actuating tendons, the curvature would be less than expected for a given level of actuation. If the flexible section defaults to a zero-force position, the curvature must be compared to a modeled curvature. The flowchart in
As discussed above, contact with the environment will alter the shape from that expected when the instrument is actuated in free space. Such environmental factors can be considered when comparing measured curvature to expected curvature to prevent a false indication of failure. The diagnostic algorithm can simply ignore shape measurements when the shapeable instrument (1) contacts an obstruction. If distal force is measured, it can be included in models that estimate shape.
Dynamics can be used to assess the instrument for failure modes while it is in contact with the environment, that is, when there is an error between the desired and measured shapes. To detect a quick mechanical fracture (such as a break or tear), the measured curvature at each point along the device could be compared with a low-pass filtered version of itself. If the two differ greatly, a fracture may have occurred. The bandwidth of the filter would have to be faster than the bandwidth of actuation.
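A per-point sketch of this comparison is shown below. The first-order filter coefficient and deviation threshold are assumptions for illustration; in practice they would be chosen relative to the actuation bandwidth as noted above.

```python
def detect_sudden_curvature_change(kappa_stream, alpha=0.2, threshold=0.5):
    """Compare each new curvature sample at a point against a first-order
    low-pass filtered history of that point (a sketch with assumed values).

    kappa_stream : iterable of curvature samples at one point over time
    alpha        : low-pass filter coefficient (0 < alpha <= 1)
    threshold    : allowed deviation before flagging a possible fracture
    """
    filtered = None
    for k in kappa_stream:
        if filtered is None:
            filtered = k
        elif abs(k - filtered) > threshold:
            return True                      # sudden change: possible fracture
        filtered += alpha * (k - filtered)   # update the filtered estimate
    return False

print(detect_sudden_curvature_change([0.10, 0.11, 0.12, 0.11, 0.85]))
```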
Subsequently, the actuator inputs could be considered and applied to a current estimate of catheter shape. The shape estimate would filter the measured shape to accommodate contact deflection and apply the actuation relative to the estimate. A beneficial by-product of this shape estimate would be an estimate of when the device makes contact with its environment.
Display of Fracture Information
When a fracture is detected, it is important to convey the failure to the automated system and to the user. When the control system is notified of a mechanical fracture, tension on the tendons should be configured to freeze the actuator state or to reduce passively applied force, which would likely mean the relaxed tendon positions. In an additional variation, the user interface could provide haptic feedback to the operator by changing the force characteristics on the master device (e.g., turning off force-feedback, centering, gravity compensation, etc.). Also, the system could display a visual indicator of device failure. Visual notification could be simply textual. Alternatively, the visual notification can be information rich. If the system has a visual depiction of the instrument, the depiction of the instrument could be altered in shape, color, and/or other representative features to reflect the failure. Thus, if the instrument structure fractures, the operator interface display could show a displacement of material at the suspected point of fracture on the corresponding depiction. If the actual fracture mode is not visually compelling enough to demonstrate the degree of the mechanical problem that will result from the failure, the main display could show an icon near the area, magnify the area and show the fracture, or highlight the area with contrasting colors. Simulating a colored semi-transparent light illuminating the fractured section would imitate other alarms, such as police revolving lights, to give operators an intuitive reaction.
Communicating the point of failure and, if possible, the type of failure is important in those cases where stopping treatment or operation of the device is not the chosen fail-safe response to a failure. If the instrument remains functional (albeit impaired), then the system could present the user with the option of continuing with a reduced workspace or reduced functionality. In such a case, showing the old and new workspaces as well as the fracture point will aid an operator in safely performing their tasks or exiting the workspace.
Active Secondary Diagnostic
Because of variation in devices and operating conditions, there is often some overlap between acceptable and unacceptable values of a given diagnostic metric. Thus, there may be what are statistically called Type I or Type II errors for the hypothesis that there is a failure (false failures/positives or false passes/negatives, respectively). To avoid false passes in critical diagnostics, the pass-fail criteria may be biased to report failures when the behavior is questionable due to operating conditions. As one example, if a positioning element fractures while a default-straight device is nearly straight, the fracture will be difficult to diagnose given the small change in movement from nearly straight to straight. The system may request that an operator perform a specific maneuver or input in order to verify the failure behavior. In the positioning element fracture situation, the system will request that the operator move in the direction the positioning element in question would normally pull. A lack of motion in response to such an intentional command would clearly indicate a fracture, which could be reported with confidence to the user.
Structural Integrity of the Instrument:
One aspect of using any flexing material is that the bulk properties of flexible sections change over time as the section is repeatedly flexed. Materials may fatigue, slightly surpass their elastic regions, or, in the case of composites, experience slip on the micro-scale between materials. In extreme cases, these behaviors qualify as a fracture of the material. However, in many cases, less significant changes allow continued use of the flexible section but alter the flexural properties of the flexing material and therefore of the shapeable device. If the degradation is fairly continuous and progresses smoothly in time, knowledge of the level of degradation could be useful to the system operator. This "state of health" of the flexible section, as well as other adapted properties, is useful information for users and extra-control subsystems.
The state of health could be conveyed to the user as a numerical percentage of original life or with an icon such as a battery-level type indicator. The state of health can be calculated by comparing measured behavior to behavior predicted from a model of the device populated with parameters from the initial manufactured properties. It could also be calculated by comparing the behavior to models populated with parameters of known fatigued devices. This second technique would also provide an estimate of the device properties, which may be used in place of the initial parameters to improve control of the device. Finally, the device properties may be adapted directly by varying the value of the target property to minimize the residual.
Sensor Integrity
In order to diagnose failures of a flexible robotic mechanism, the integrity of the shape sensor must be known. Differentiating between a failure of the sensor and a failure of that which is sensed can be complicated depending on the type of sensor used for the measurement.
The first diagnostic of sensor integrity can be based on the physics of the sensor itself. First, the range of reasonable outputs of the sensor should be bounded to enable detection of disconnected wiring. For example, if curvature at point x along the flexible path S is measured as an electrical potential between 1 and 4 Volts, the electrical system should be capable of measuring between 0 and 5 Volts. Thus, a measurement of 0.5 Volts would lead to an obvious diagnosis of a sensor fault. The physics of the sensor and measuring system can be considered in designing this level of diagnostic. When the sensor is measuring outside the realm of the physically reasonable, information about the robot cannot be known.
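The out-of-range check described here is simple enough to sketch directly; the voltage limits below are the assumed values from the example above, not a real device specification.

```python
def check_sensor_range(voltage, valid_low=1.0, valid_high=4.0,
                       adc_low=0.0, adc_high=5.0):
    """Classify a curvature-sensor reading: 'ok' when inside the sensor's
    specified output range, 'sensor_fault' when the electronics report a
    value outside it (e.g. a disconnected wire), using assumed limits."""
    if not (adc_low <= voltage <= adc_high):
        return "measurement_system_fault"    # reading outside the ADC itself
    if valid_low <= voltage <= valid_high:
        return "ok"
    return "sensor_fault"                    # e.g. 0.5 V as in the example above

print(check_sensor_range(0.5))    # -> 'sensor_fault'
print(check_sensor_range(2.7))    # -> 'ok'
```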
If more subtle failures of the sensor are possible, for example a scalar error, a lack of sensor integrity could easily be confused with a mechanical failure. If the sensor registers an extremely large curvature at one point, that would seem to indicate a fractured structure. If such errors are possible in the sensor, though, other information may be considered to differentiate sensor integrity from mechanical integrity. This is important in order to properly inform the user which component has failed, for repair or replacement, and for the control system to implement a fail-safe response.
In this case, model based diagnostics may be used to combine other sources of information to decide which component has failed.
If tip position is measured, a sudden movement of the tip position could indicate a real fracture. If the shape sensor measures a change that should move the tip position, but no tip motion is detected, the sensor may be faulted.
The current shape and tip position of the device can be predicted based on a model of the device, the previous shape measurement and the previous tip position measurement. Thus, the measured shape and tip position may be compared to their predicted values to produce residuals. These residuals may be the output of a Kalman filter or other such estimator. The residuals may then be analyzed to decide which failure to report. They may each simply be compared to a threshold, the residual may be integrated then compared to a threshold, or the residuals may be incorporated with operating conditions in a Bayesian network trained with operating and failed components.
Tip orientation can be used analogously to, or combined with, tip position.
If a measured shape has been registered to a hollow geometric environment model, the device should not pass through the model surface boundaries. If the sensor measurement leaves the model, the sensor is probably erroneous.
If the measured device shape changes rapidly but there is no change in actuating wire-tensions, the device probably did not fracture and thus the sensor is probably erroneous.
Similarly, if the device is held in a state with natural potential energy by actuators connected through a back-drivable drive train, and the measured shape changes but the actuator effort does not, the device probably did not fracture and thus the sensor is probably erroneous.
The systems described herein can predict the current shape and other properties of the device and actuators based on a model of the device, the previous shape measurement, and other estimates of previous properties. Thus, comparing measured shape and other measurements to their predicted values provides residuals. These residuals may be the output of a Kalman filter or other such estimator. The residuals may then be analyzed to decide which failure to report. They may each simply be compared to a threshold, the residual may be dynamically transformed and then compared to a threshold, or the residuals may be incorporated with operating conditions in a Bayesian network trained with operating and failed components.
While multiple embodiments and variations of the many aspects of the invention have been disclosed and described herein, such disclosure is provided for purposes of illustration only. Many combinations and permutations of the disclosed system are useful in minimally invasive medical intervention and diagnosis, and the system is configured to be flexible. The foregoing illustrated and described embodiments of the invention are susceptible to various modifications and alternative forms, and it should be understood that the invention generally, as well as the specific embodiments described herein, are not limited to the particular forms or methods disclosed, but also cover all modifications, equivalents and alternatives falling within the scope of the appended claims. Further, the various features and aspects of the illustrated embodiments may be incorporated into other embodiments, even if not so described herein, as will be apparent to those skilled in the art.
This application is a continuation of U.S. patent application Ser. No. 16/198,602, filed Nov. 21, 2018, and issued as U.S. Pat. No. 11,051,681, entitled "METHODS AND DEVICES FOR CONTROLLING A SHAPEABLE MEDICAL DEVICE," which is a divisional of U.S. patent application Ser. No. 14/164,961, filed Jan. 27, 2014, and issued as U.S. Pat. No. 10,143,360, entitled "METHODS AND DEVICES FOR CONTROLLING A SHAPEABLE MEDICAL DEVICE," which is a divisional of U.S. patent application Ser. No. 12/823,032, filed Jun. 24, 2010, and issued as U.S. Pat. No. 8,672,837, entitled "METHODS AND DEVICES FOR CONTROLLING A SHAPEABLE MEDICAL DEVICE," the entire disclosures of all of which are expressly incorporated by reference herein for all purposes.
Number | Name | Date | Kind |
---|---|---|---|
4745908 | Wardle | May 1988 | A |
5273025 | Sakiyam et al. | Dec 1993 | A |
5526812 | Dumoulin et al. | Jun 1996 | A |
5550953 | Seraji | Aug 1996 | A |
5831614 | Tognazzini et al. | Nov 1998 | A |
5935075 | Casscells | Aug 1999 | A |
6038467 | De Bliek et al. | Mar 2000 | A |
6047080 | Chen | Apr 2000 | A |
6059718 | Taniguchi et al. | May 2000 | A |
6063095 | Wang et al. | May 2000 | A |
6167292 | Badano | Dec 2000 | A |
6203493 | Ben-Haim | Mar 2001 | B1 |
6246784 | Summers | Jun 2001 | B1 |
6246898 | Vesely | Jun 2001 | B1 |
6332089 | Acker | Dec 2001 | B1 |
6425865 | Salcudean et al. | Jul 2002 | B1 |
6466198 | Feinstein | Oct 2002 | B1 |
6490467 | Bucholz | Dec 2002 | B1 |
6545760 | Froggatt et al. | Apr 2003 | B1 |
6553251 | Landesmaki | Apr 2003 | B1 |
6665554 | Charles | Dec 2003 | B1 |
6690963 | Ben-Haim | Feb 2004 | B2 |
6690964 | Beiger et al. | Feb 2004 | B2 |
6755797 | Stouffer | Jun 2004 | B1 |
6812842 | Dimmer | Nov 2004 | B2 |
6899672 | Chin | May 2005 | B2 |
6926709 | Beiger et al. | Aug 2005 | B2 |
7180976 | Wink | Feb 2007 | B2 |
7206627 | Abovitz | Apr 2007 | B2 |
7233820 | Gilboa | Jun 2007 | B2 |
7386339 | Strommer et al. | Jun 2008 | B2 |
7618371 | Younge et al. | Nov 2009 | B2 |
7697972 | Verard | Apr 2010 | B2 |
7756563 | Higgins | Jul 2010 | B2 |
7850642 | Moll et al. | Dec 2010 | B2 |
7901348 | Soper | Mar 2011 | B2 |
7935059 | Younge et al. | May 2011 | B2 |
7972298 | Wallace et al. | Jul 2011 | B2 |
7974681 | Wallace et al. | Jul 2011 | B2 |
7976539 | Hlavka et al. | Jul 2011 | B2 |
8021326 | Moll et al. | Sep 2011 | B2 |
8041413 | Barbagli et al. | Oct 2011 | B2 |
8050523 | Younge et al. | Nov 2011 | B2 |
8052621 | Wallace et al. | Nov 2011 | B2 |
8052636 | Moll et al. | Nov 2011 | B2 |
8092397 | Wallace et al. | Jan 2012 | B2 |
8155403 | Tschirren | Apr 2012 | B2 |
8190238 | Moll et al. | May 2012 | B2 |
8257303 | Moll et al. | Sep 2012 | B2 |
8285364 | Barbagli et al. | Oct 2012 | B2 |
8290571 | Younge et al. | Oct 2012 | B2 |
8298135 | Ito et al. | Oct 2012 | B2 |
8317746 | Sewell et al. | Nov 2012 | B2 |
8388538 | Younge et al. | Mar 2013 | B2 |
8388556 | Wallace et al. | Mar 2013 | B2 |
8391957 | Carlson et al. | Mar 2013 | B2 |
8394054 | Wallace et al. | Mar 2013 | B2 |
8409136 | Wallace et al. | Apr 2013 | B2 |
8409172 | Moll et al. | Apr 2013 | B2 |
8409234 | Stahler et al. | Apr 2013 | B2 |
8460236 | Roelle et al. | Jun 2013 | B2 |
8498691 | Moll et al. | Jul 2013 | B2 |
8515215 | Younge et al. | Aug 2013 | B2 |
8617102 | Moll et al. | Dec 2013 | B2 |
8672837 | Roelle et al. | Mar 2014 | B2 |
8705903 | Younge et al. | Apr 2014 | B2 |
8801661 | Moll et al. | Aug 2014 | B2 |
8811777 | Younge et al. | Aug 2014 | B2 |
8818143 | Younge et al. | Aug 2014 | B2 |
8821376 | Tolkowsky | Sep 2014 | B2 |
8858424 | Hasegawa | Oct 2014 | B2 |
8864655 | Ramamurthy et al. | Oct 2014 | B2 |
8926603 | Hlavka et al. | Jan 2015 | B2 |
8929631 | Pfister et al. | Jan 2015 | B2 |
8974408 | Wallace et al. | Mar 2015 | B2 |
9014851 | Wong et al. | Apr 2015 | B2 |
9039685 | Larkin et al. | May 2015 | B2 |
9066740 | Carlson et al. | Jun 2015 | B2 |
9084623 | Gomez et al. | Jul 2015 | B2 |
9125639 | Mathis | Sep 2015 | B2 |
9138129 | Diolaiti | Sep 2015 | B2 |
9173713 | Hart et al. | Nov 2015 | B2 |
9183354 | Baker et al. | Nov 2015 | B2 |
9186046 | Ramamurthy et al. | Nov 2015 | B2 |
9186047 | Ramamurthy et al. | Nov 2015 | B2 |
9271663 | Walker et al. | Mar 2016 | B2 |
9272416 | Hourtash et al. | Mar 2016 | B2 |
9289578 | Walker et al. | Mar 2016 | B2 |
9404734 | Ramamurthy et al. | Aug 2016 | B2 |
9441954 | Ramamurthy et al. | Sep 2016 | B2 |
9457168 | Moll et al. | Oct 2016 | B2 |
9459087 | Dunbar | Oct 2016 | B2 |
9498601 | Tanner et al. | Nov 2016 | B2 |
9500472 | Ramamurthy et al. | Nov 2016 | B2 |
9500473 | Ramamurthy et al. | Nov 2016 | B2 |
9504604 | Alvarez | Nov 2016 | B2 |
9561083 | Yu et al. | Feb 2017 | B2 |
9603668 | Weingarten et al. | Mar 2017 | B2 |
9622827 | Yu et al. | Apr 2017 | B2 |
9629595 | Walker et al. | Apr 2017 | B2 |
9629682 | Wallace et al. | Apr 2017 | B2 |
9636184 | Lee et al. | May 2017 | B2 |
9710921 | Wong et al. | Jul 2017 | B2 |
9713509 | Schuh et al. | Jul 2017 | B2 |
9717563 | Tognaccini | Aug 2017 | B2 |
9726476 | Ramamurthy et al. | Aug 2017 | B2 |
9727963 | Mintz et al. | Aug 2017 | B2 |
9737371 | Romo et al. | Aug 2017 | B2 |
9737373 | Schuh | Aug 2017 | B2 |
9744335 | Jiang | Aug 2017 | B2 |
9763741 | Alvarez et al. | Sep 2017 | B2 |
9788910 | Schuh | Oct 2017 | B2 |
9844412 | Bogusky et al. | Dec 2017 | B2 |
9867635 | Alvarez et al. | Jan 2018 | B2 |
9918681 | Wallace et al. | Mar 2018 | B2 |
9931025 | Graetzel et al. | Apr 2018 | B1 |
9949749 | Noonan et al. | Apr 2018 | B2 |
9955986 | Shah | May 2018 | B2 |
9962228 | Schuh et al. | May 2018 | B2 |
9980785 | Schuh | May 2018 | B2 |
9993313 | Schuh et al. | Jun 2018 | B2 |
10016900 | Meyer et al. | Jul 2018 | B1 |
10022192 | Ummalaneni | Jul 2018 | B1 |
10046140 | Kokish et al. | Aug 2018 | B2 |
10080576 | Romo et al. | Sep 2018 | B2 |
10123755 | Walker et al. | Nov 2018 | B2 |
10130345 | Wong et al. | Nov 2018 | B2 |
10136950 | Schoenefeld | Nov 2018 | B2 |
10136959 | Mintz et al. | Nov 2018 | B2 |
10143360 | Roelle et al. | Dec 2018 | B2 |
10143526 | Walker et al. | Dec 2018 | B2 |
10145747 | Lin et al. | Dec 2018 | B1 |
10149720 | Romo | Dec 2018 | B2 |
10159532 | Ummalaneni et al. | Dec 2018 | B1 |
10159533 | Moll et al. | Dec 2018 | B2 |
10169875 | Mintz et al. | Jan 2019 | B2 |
10278778 | State | May 2019 | B2 |
10434660 | Meyer | Oct 2019 | B2 |
10492741 | Walker et al. | Oct 2019 | B2 |
10464209 | Ho et al. | Nov 2019 | B2 |
10470830 | Hill | Nov 2019 | B2 |
10482599 | Mintz et al. | Nov 2019 | B2 |
10517692 | Eyre et al. | Dec 2019 | B2 |
10524866 | Srinivasan | Jan 2020 | B2 |
10531864 | Wong et al. | Jan 2020 | B2 |
10539478 | Lin | Jan 2020 | B2 |
10555778 | Ummalaneni et al. | Feb 2020 | B2 |
10639114 | Schuh | May 2020 | B2 |
10667875 | DeFonzo | Jun 2020 | B2 |
10702348 | Moll et al. | Jul 2020 | B2 |
10743751 | Landey et al. | Aug 2020 | B2 |
10751140 | Wallace et al. | Aug 2020 | B2 |
10765303 | Graetzel et al. | Sep 2020 | B2 |
10765487 | Ho | Sep 2020 | B2 |
10820954 | Marsot et al. | Nov 2020 | B2 |
10835153 | Rafii-Tari et al. | Nov 2020 | B2 |
10850013 | Hsu | Dec 2020 | B2 |
11051681 | Roelle | Jul 2021 | B2 |
20010021843 | Bosselmann et al. | Sep 2001 | A1 |
20010039421 | Heilbrun | Nov 2001 | A1 |
20020065455 | Ben-Haim et al. | May 2002 | A1 |
20020077533 | Bieger et al. | Jun 2002 | A1 |
20020120188 | Brock et al. | Aug 2002 | A1 |
20030105603 | Hardesty | Jun 2003 | A1 |
20030125622 | Schweikard | Jul 2003 | A1 |
20030181809 | Hall et al. | Sep 2003 | A1 |
20030195664 | Nowlin et al. | Oct 2003 | A1 |
20040047044 | Dalton | Mar 2004 | A1 |
20040072066 | Cho et al. | Apr 2004 | A1 |
20040097806 | Hunter et al. | May 2004 | A1 |
20040186349 | Ewers | Sep 2004 | A1 |
20040249267 | Gilboa | Dec 2004 | A1 |
20040263535 | Birkenbach et al. | Dec 2004 | A1 |
20050027397 | Niemeyer | Feb 2005 | A1 |
20050060006 | Pflueger | Mar 2005 | A1 |
20050085714 | Foley et al. | Apr 2005 | A1 |
20050107679 | Geiger | May 2005 | A1 |
20050143649 | Minai et al. | Jun 2005 | A1 |
20050143655 | Satoh | Jun 2005 | A1 |
20050182295 | Soper et al. | Aug 2005 | A1 |
20050182319 | Glossop | Aug 2005 | A1 |
20050193451 | Quistgaard et al. | Sep 2005 | A1 |
20050197557 | Strommer et al. | Sep 2005 | A1 |
20050256398 | Hastings | Nov 2005 | A1 |
20050272975 | McWeeney et al. | Dec 2005 | A1 |
20060004286 | Chang | Jan 2006 | A1 |
20060013523 | Childers et al. | Jan 2006 | A1 |
20060015096 | Hauck et al. | Jan 2006 | A1 |
20060025668 | Peterson | Feb 2006 | A1 |
20060058643 | Florent | Mar 2006 | A1 |
20060084860 | Geiger | Apr 2006 | A1 |
20060095066 | Chang | May 2006 | A1 |
20060098851 | Shoham et al. | May 2006 | A1 |
20060100610 | Wallace et al. | May 2006 | A1 |
20060149134 | Soper et al. | Jul 2006 | A1 |
20060173290 | Lavallee et al. | Aug 2006 | A1 |
20060184016 | Glossop | Aug 2006 | A1 |
20060200026 | Wallace et al. | Sep 2006 | A1 |
20060209019 | Hu | Sep 2006 | A1 |
20060258935 | Pile-Spellman et al. | Nov 2006 | A1 |
20060258938 | Hoffman et al. | Nov 2006 | A1 |
20070013336 | Nowlin et al. | Jan 2007 | A1 |
20070032826 | Schwartz | Feb 2007 | A1 |
20070055128 | Glossop | Mar 2007 | A1 |
20070055144 | Neustadter | Mar 2007 | A1 |
20070073136 | Metzger | Mar 2007 | A1 |
20070083193 | Werneth | Apr 2007 | A1 |
20070123748 | Meglan | May 2007 | A1 |
20070135803 | Belson | Jun 2007 | A1 |
20070135886 | Maschke | Jun 2007 | A1 |
20070156019 | Larkin et al. | Jul 2007 | A1 |
20070161857 | Durant et al. | Jul 2007 | A1 |
20070167743 | Honda | Jul 2007 | A1 |
20070167801 | Webler et al. | Jul 2007 | A1 |
20070208252 | Makower | Sep 2007 | A1 |
20070253599 | White et al. | Nov 2007 | A1 |
20070265503 | Schlesinger et al. | Nov 2007 | A1 |
20070269001 | Maschke | Nov 2007 | A1 |
20070293721 | Gilboa | Dec 2007 | A1 |
20070299353 | Harley et al. | Dec 2007 | A1 |
20080027464 | Moll et al. | Jan 2008 | A1 |
20080071140 | Gattani | Mar 2008 | A1 |
20080079421 | Jensen | Apr 2008 | A1 |
20080082109 | Moll et al. | Apr 2008 | A1 |
20080103389 | Begelman et al. | May 2008 | A1 |
20080118118 | Berger | May 2008 | A1 |
20080118135 | Averbach | May 2008 | A1 |
20080123921 | Gielen et al. | May 2008 | A1 |
20080140087 | Barbagli et al. | Jun 2008 | A1 |
20080147089 | Loh | Jun 2008 | A1 |
20080161681 | Hauck | Jul 2008 | A1 |
20080183064 | Chandonnet | Jul 2008 | A1 |
20080183068 | Carls et al. | Jul 2008 | A1 |
20080183073 | Higgins et al. | Jul 2008 | A1 |
20080183188 | Carls et al. | Jul 2008 | A1 |
20080201016 | Finlay | Aug 2008 | A1 |
20080207997 | Higgins et al. | Aug 2008 | A1 |
20080212082 | Froggatt et al. | Sep 2008 | A1 |
20080218770 | Moll et al. | Sep 2008 | A1 |
20080221425 | Olson et al. | Sep 2008 | A1 |
20080243142 | Gildenberg | Oct 2008 | A1 |
20080255505 | Carlson et al. | Oct 2008 | A1 |
20080262297 | Gilboa | Oct 2008 | A1 |
20080275349 | Halperin | Nov 2008 | A1 |
20080275367 | Barbagli et al. | Nov 2008 | A1 |
20080287963 | Rogers et al. | Nov 2008 | A1 |
20080300478 | Zuhars et al. | Dec 2008 | A1 |
20080306490 | Lakin et al. | Dec 2008 | A1 |
20080312501 | Hasegawa et al. | Dec 2008 | A1 |
20090012533 | Barbagli et al. | Jan 2009 | A1 |
20090024141 | Stahler et al. | Jan 2009 | A1 |
20090030307 | Govari | Jan 2009 | A1 |
20090054729 | Mori | Feb 2009 | A1 |
20090076476 | Barbagli et al. | Mar 2009 | A1 |
20090137952 | Ramamurthy | May 2009 | A1 |
20090138025 | Stahler et al. | May 2009 | A1 |
20090149867 | Glozman | Jun 2009 | A1 |
20090209817 | Averbuch | Aug 2009 | A1 |
20090227861 | Ganatra | Sep 2009 | A1 |
20090228020 | Wallace et al. | Sep 2009 | A1 |
20090248036 | Hoffman et al. | Oct 2009 | A1 |
20090259230 | Khadem | Oct 2009 | A1 |
20090262109 | Markowitz et al. | Oct 2009 | A1 |
20090292166 | Ito | Nov 2009 | A1 |
20090295797 | Sakaguchi | Dec 2009 | A1 |
20100008555 | Trumer | Jan 2010 | A1 |
20100030061 | Canfield | Feb 2010 | A1 |
20100039506 | Sarvestani et al. | Feb 2010 | A1 |
20100041949 | Tolkowsky | Feb 2010 | A1 |
20100054536 | Huang | Mar 2010 | A1 |
20100113852 | Sydora | May 2010 | A1 |
20100114115 | Schlesinger et al. | May 2010 | A1 |
20100121139 | OuYang | May 2010 | A1 |
20100160733 | Gilboa | Jun 2010 | A1 |
20100161022 | Tolkowsky | Jun 2010 | A1 |
20100161129 | Costa et al. | Jun 2010 | A1 |
20100225209 | Goldberg | Sep 2010 | A1 |
20100240989 | Stoianovici | Sep 2010 | A1 |
20100290530 | Huang et al. | Nov 2010 | A1 |
20100292565 | Meyer | Nov 2010 | A1 |
20100298641 | Tanaka | Nov 2010 | A1 |
20100328455 | Nam et al. | Dec 2010 | A1 |
20100331856 | Carlson et al. | Dec 2010 | A1 |
20110015648 | Alvarez et al. | Jan 2011 | A1 |
20110054303 | Barrick | Mar 2011 | A1 |
20110092808 | Shachar | Apr 2011 | A1 |
20110152880 | Alvarez et al. | Jun 2011 | A1 |
20110184238 | Higgins | Jul 2011 | A1 |
20110234780 | Ito | Sep 2011 | A1 |
20110238082 | Wenderow | Sep 2011 | A1 |
20110238083 | Moll et al. | Sep 2011 | A1 |
20110245665 | Nentwick | Oct 2011 | A1 |
20110248987 | Mitchell | Oct 2011 | A1 |
20110249016 | Zhang | Oct 2011 | A1 |
20110257480 | Takahashi | Oct 2011 | A1 |
20110270273 | Moll et al. | Nov 2011 | A1 |
20110276179 | Banks et al. | Nov 2011 | A1 |
20110295247 | Schlesinger et al. | Dec 2011 | A1 |
20110295248 | Wallace et al. | Dec 2011 | A1 |
20110295267 | Tanner et al. | Dec 2011 | A1 |
20110295268 | Roelle et al. | Dec 2011 | A1 |
20110319910 | Roelle et al. | Dec 2011 | A1 |
20120046521 | Hunter et al. | Feb 2012 | A1 |
20120056986 | Popovic | Mar 2012 | A1 |
20120059248 | Holsing | Mar 2012 | A1 |
20120062714 | Liu | Mar 2012 | A1 |
20120065481 | Hunter | Mar 2012 | A1 |
20120069167 | Liu et al. | Mar 2012 | A1 |
20120071782 | Patil et al. | Mar 2012 | A1 |
20120082351 | Higgins | Apr 2012 | A1 |
20120116253 | Wallace et al. | May 2012 | A1 |
20120120305 | Takahashi | May 2012 | A1 |
20120165656 | Montag | Jun 2012 | A1 |
20120172712 | Bar-Tal | Jul 2012 | A1 |
20120191079 | Moll et al. | Jul 2012 | A1 |
20120209069 | Popovic | Aug 2012 | A1 |
20120209293 | Carlson | Aug 2012 | A1 |
20120215094 | Rahimian et al. | Aug 2012 | A1 |
20120219185 | Hu | Aug 2012 | A1 |
20120289777 | Chopra | Nov 2012 | A1 |
20120289783 | Duindam et al. | Nov 2012 | A1 |
20120302869 | Koyrakh | Nov 2012 | A1 |
20130060146 | Yang et al. | Mar 2013 | A1 |
20130085330 | Ramamurthy et al. | Apr 2013 | A1 |
20130085331 | Ramamurthy et al. | Apr 2013 | A1 |
20130085333 | Ramamurthy et al. | Apr 2013 | A1 |
20130090528 | Ramamurthy et al. | Apr 2013 | A1 |
20130090530 | Ramamurthy | Apr 2013 | A1 |
20130090552 | Ramamurthy et al. | Apr 2013 | A1 |
20130144116 | Cooper et al. | Jun 2013 | A1 |
20130165945 | Roelle | Jun 2013 | A9 |
20130190741 | Moll et al. | Jul 2013 | A1 |
20130204124 | Duindam | Aug 2013 | A1 |
20130225942 | Holsing | Aug 2013 | A1 |
20130243153 | Sra | Sep 2013 | A1 |
20130246334 | Ahuja | Sep 2013 | A1 |
20130259315 | Angot et al. | Oct 2013 | A1 |
20130303892 | Zhao | Nov 2013 | A1 |
20130345718 | Crawford | Dec 2013 | A1 |
20140058406 | Tsekos | Feb 2014 | A1 |
20140072192 | Reiner | Mar 2014 | A1 |
20140114180 | Jain | Apr 2014 | A1 |
20140148808 | Inkpen et al. | Apr 2014 | A1 |
20140142591 | Alvarez et al. | May 2014 | A1 |
20140148673 | Bogusky | May 2014 | A1 |
20140180063 | Zhao | Jun 2014 | A1 |
20140235943 | Paris | Aug 2014 | A1 |
20140243849 | Saglam | Aug 2014 | A1 |
20140257746 | Dunbar et al. | Sep 2014 | A1 |
20140261453 | Carlson | Sep 2014 | A1 |
20140264081 | Walker et al. | Sep 2014 | A1 |
20140275988 | Walker et al. | Sep 2014 | A1 |
20140276033 | Brannan | Sep 2014 | A1 |
20140276594 | Tanner et al. | Sep 2014 | A1 |
20140276937 | Wong et al. | Sep 2014 | A1 |
20140296655 | Akhbardeh et al. | Oct 2014 | A1 |
20140296657 | Izmirli | Oct 2014 | A1 |
20140309527 | Namati et al. | Oct 2014 | A1 |
20140309649 | Alvarez et al. | Oct 2014 | A1 |
20140343416 | Panescu | Nov 2014 | A1 |
20140350391 | Prisco et al. | Nov 2014 | A1 |
20140357984 | Wallace et al. | Dec 2014 | A1 |
20140364739 | Liu | Dec 2014 | A1 |
20140364870 | Alvarez et al. | Dec 2014 | A1 |
20150051482 | Liu et al. | Feb 2015 | A1 |
20150051592 | Kintz | Feb 2015 | A1 |
20150054929 | Ito et al. | Feb 2015 | A1 |
20150057498 | Akimoto | Feb 2015 | A1 |
20150073266 | Brannan | Mar 2015 | A1 |
20150119638 | Yu et al. | Apr 2015 | A1 |
20150133963 | Barbagli | May 2015 | A1 |
20150141808 | Elhawary | May 2015 | A1 |
20150141858 | Razavi | May 2015 | A1 |
20150142013 | Tanner et al. | May 2015 | A1 |
20150164594 | Romo et al. | Jun 2015 | A1 |
20150164596 | Romo | Jun 2015 | A1 |
20150223725 | Engel | Aug 2015 | A1 |
20150223765 | Chopra | Aug 2015 | A1 |
20150223897 | Kostrzewski et al. | Aug 2015 | A1 |
20150223902 | Walker et al. | Aug 2015 | A1 |
20150255782 | Kim et al. | Sep 2015 | A1 |
20150265087 | Park | Sep 2015 | A1 |
20150265359 | Camarillo | Sep 2015 | A1 |
20150265368 | Chopra | Sep 2015 | A1 |
20150275986 | Cooper | Oct 2015 | A1 |
20150287192 | Sasaki | Oct 2015 | A1 |
20150297133 | Jouanique-Dubuis et al. | Oct 2015 | A1 |
20150305650 | Hunter | Oct 2015 | A1 |
20150313503 | Seibel et al. | Nov 2015 | A1 |
20150335480 | Alvarez et al. | Nov 2015 | A1 |
20150374956 | Bogusky | Dec 2015 | A1 |
20160000302 | Brown | Jan 2016 | A1 |
20160000414 | Brown | Jan 2016 | A1 |
20160000520 | Lachmanovich | Jan 2016 | A1 |
20160001038 | Romo et al. | Jan 2016 | A1 |
20160008033 | Hawkins et al. | Jan 2016 | A1 |
20160067009 | Ramamurthy et al. | Mar 2016 | A1 |
20160111192 | Suzara | Apr 2016 | A1 |
20160128992 | Hudson | May 2016 | A1 |
20160183841 | Duindam et al. | Jun 2016 | A1 |
20160199134 | Brown et al. | Jul 2016 | A1 |
20160206389 | Miller | Jul 2016 | A1 |
20160213432 | Flexman | Jul 2016 | A1 |
20160228032 | Walker et al. | Aug 2016 | A1 |
20160270865 | Landey et al. | Sep 2016 | A1 |
20160287279 | Bovay et al. | Oct 2016 | A1 |
20160287346 | Hyodo et al. | Oct 2016 | A1 |
20160314710 | Jarc | Oct 2016 | A1 |
20160331469 | Hall et al. | Nov 2016 | A1 |
20160360947 | Iida | Dec 2016 | A1 |
20160372743 | Cho et al. | Dec 2016 | A1 |
20160374541 | Agrawal et al. | Dec 2016 | A1 |
20170007337 | Dan | Jan 2017 | A1 |
20170023423 | Jackson | Jan 2017 | A1 |
20170055851 | Al-Ali | Mar 2017 | A1 |
20170079725 | Hoffman | Mar 2017 | A1 |
20170079726 | Hoffman | Mar 2017 | A1 |
20170086929 | Moll et al. | Mar 2017 | A1 |
20170100199 | Yu et al. | Apr 2017 | A1 |
20170119413 | Romo | May 2017 | A1 |
20170119481 | Romo et al. | May 2017 | A1 |
20170119484 | Tanner et al. | May 2017 | A1 |
20170165011 | Bovay et al. | Jun 2017 | A1 |
20170172673 | Yu et al. | Jun 2017 | A1 |
20170189118 | Chopra | Jul 2017 | A1 |
20170202627 | Sramek et al. | Jul 2017 | A1 |
20170209073 | Sramek et al. | Jul 2017 | A1 |
20170209224 | Walker et al. | Jul 2017 | A1 |
20170215808 | Shimol et al. | Aug 2017 | A1 |
20170215969 | Zhai et al. | Aug 2017 | A1 |
20170215978 | Wallace et al. | Aug 2017 | A1 |
20170238807 | Vertikov et al. | Aug 2017 | A9 |
20170258366 | Tupin | Sep 2017 | A1 |
20170290631 | Lee et al. | Oct 2017 | A1 |
20170296032 | Li | Oct 2017 | A1 |
20170296202 | Brown | Oct 2017 | A1 |
20170303941 | Eisner | Oct 2017 | A1 |
20170325896 | Donhowe | Nov 2017 | A1 |
20170333679 | Jiang | Nov 2017 | A1 |
20170340241 | Yamada | Nov 2017 | A1 |
20170340396 | Romo et al. | Nov 2017 | A1 |
20170348067 | Krimsky | Dec 2017 | A1 |
20170360508 | Germain et al. | Dec 2017 | A1 |
20170367782 | Schuh et al. | Dec 2017 | A1 |
20180025666 | Ho et al. | Jan 2018 | A1 |
20180055576 | Koyrakh | Mar 2018 | A1 |
20180055582 | Krimsky | Mar 2018 | A1 |
20180098690 | Iwaki | Apr 2018 | A1 |
20180177383 | Noonan et al. | Jun 2018 | A1 |
20180177556 | Noonan et al. | Jun 2018 | A1 |
20180214011 | Graetzel et al. | Aug 2018 | A1 |
20180217734 | Koenig et al. | Aug 2018 | A1 |
20180221038 | Noonan et al. | Aug 2018 | A1 |
20180221039 | Shah | Aug 2018 | A1 |
20180240237 | Donhowe et al. | Aug 2018 | A1 |
20180250083 | Schuh et al. | Sep 2018 | A1 |
20180263714 | Kostrzewski | Sep 2018 | A1 |
20180271616 | Schuh et al. | Sep 2018 | A1 |
20180279852 | Rafii-Tari et al. | Oct 2018 | A1 |
20180280660 | Landey et al. | Oct 2018 | A1 |
20180286108 | Hirakawa | Oct 2018 | A1 |
20180289243 | Landey et al. | Oct 2018 | A1 |
20180289431 | Draper et al. | Oct 2018 | A1 |
20180308232 | Gliner | Oct 2018 | A1 |
20180308247 | Gupta | Oct 2018 | A1 |
20180325499 | Landey et al. | Nov 2018 | A1 |
20180333044 | Jenkins | Nov 2018 | A1 |
20180360435 | Romo | Dec 2018 | A1 |
20180368920 | Ummalaneni | Dec 2018 | A1 |
20190000559 | Berman et al. | Jan 2019 | A1 |
20190000560 | Berman et al. | Jan 2019 | A1 |
20190000566 | Graetzel et al. | Jan 2019 | A1 |
20190000568 | Connolly et al. | Jan 2019 | A1 |
20190000576 | Mintz et al. | Jan 2019 | A1 |
20190046814 | Senden et al. | Feb 2019 | A1 |
20190066314 | Abhari | Feb 2019 | A1 |
20190086349 | Nelson | Mar 2019 | A1 |
20190110839 | Rafii-Tari et al. | Apr 2019 | A1 |
20190151148 | Alvarez et al. | Apr 2019 | A1 |
20190142519 | Siemionow et al. | May 2019 | A1 |
20190167366 | Ummalaneni | Jun 2019 | A1 |
20190175009 | Mintz | Jun 2019 | A1 |
20190175062 | Rafii-Tari et al. | Jun 2019 | A1 |
20190175799 | Hsu | Jun 2019 | A1 |
20190183585 | Rafii-Tari et al. | Jun 2019 | A1 |
20190183587 | Rafii-Tari et al. | Jun 2019 | A1 |
20190216548 | Ummalaneni | Jul 2019 | A1 |
20190216576 | Eyre | Jul 2019 | A1 |
20190223974 | Romo | Jul 2019 | A1 |
20190228525 | Mintz et al. | Jul 2019 | A1 |
20190262086 | Connolly et al. | Aug 2019 | A1 |
20190269468 | Hsu et al. | Sep 2019 | A1 |
20190274764 | Romo | Sep 2019 | A1 |
20190287673 | Michihata | Sep 2019 | A1 |
20190290109 | Agrawal et al. | Sep 2019 | A1 |
20190298160 | Ummalaneni et al. | Oct 2019 | A1 |
20190298460 | Al-Jadda | Oct 2019 | A1 |
20190298465 | Chin | Oct 2019 | A1 |
20190336238 | Yu | Nov 2019 | A1 |
20190365201 | Noonan et al. | Dec 2019 | A1 |
20190365209 | Ye et al. | Dec 2019 | A1 |
20190365479 | Rafii-Tari | Dec 2019 | A1 |
20190365486 | Srinivasan et al. | Dec 2019 | A1 |
20190375383 | Alvarez | Dec 2019 | A1 |
20190380787 | Ye | Dec 2019 | A1 |
20190380797 | Yu | Dec 2019 | A1 |
20200000533 | Schuh | Jan 2020 | A1 |
20200022767 | Hill | Jan 2020 | A1 |
20200038123 | Graetzel | Feb 2020 | A1 |
20200039086 | Meyer | Feb 2020 | A1 |
20200046434 | Graetzel | Feb 2020 | A1 |
20200054408 | Schuh et al. | Feb 2020 | A1 |
20200060516 | Baez | Feb 2020 | A1 |
20200078103 | Duindam | Mar 2020 | A1 |
20200085516 | DeFonzo | Mar 2020 | A1 |
20200093549 | Chin | Mar 2020 | A1 |
20200093554 | Schuh | Mar 2020 | A1 |
20200100845 | Julian | Apr 2020 | A1 |
20200100855 | Leparmentier | Apr 2020 | A1 |
20200101264 | Jiang | Apr 2020 | A1 |
20200107894 | Wallace | Apr 2020 | A1 |
20200121502 | Kintz | Apr 2020 | A1 |
20200146769 | Eyre | May 2020 | A1 |
20200155084 | Walker | May 2020 | A1 |
20200170630 | Wong | Jun 2020 | A1 |
20200170720 | Ummalaneni | Jun 2020 | A1 |
20200171660 | Ho | Jun 2020 | A1 |
20200188043 | Yu | Jun 2020 | A1 |
20200197112 | Chin | Jun 2020 | A1 |
20200206472 | Ma | Jul 2020 | A1 |
20200217733 | Lin | Jul 2020 | A1 |
20200222134 | Schuh | Jul 2020 | A1 |
20200237458 | DeFonzo | Jul 2020 | A1 |
20200261172 | Romo | Aug 2020 | A1 |
20200268459 | Noonan et al. | Aug 2020 | A1 |
20200268460 | Tse | Aug 2020 | A1 |
20200281787 | Ruiz | Sep 2020 | A1 |
20200297437 | Schuh | Sep 2020 | A1 |
20200297444 | Camarillo | Sep 2020 | A1 |
20200305983 | Yampolsky | Oct 2020 | A1 |
20200305989 | Schuh | Oct 2020 | A1 |
20200305992 | Schuh | Oct 2020 | A1 |
20200315717 | Bovay | Oct 2020 | A1 |
20200315723 | Hassan | Oct 2020 | A1 |
20200323596 | Moll | Oct 2020 | A1 |
20200330167 | Romo | Oct 2020 | A1 |
20200345216 | Jenkins | Nov 2020 | A1 |
20200352420 | Graetzel | Nov 2020 | A1 |
20200360183 | Alvarez | Nov 2020 | A1 |
20200367726 | Landey et al. | Nov 2020 | A1 |
20200367981 | Ho et al. | Nov 2020 | A1 |
20200375678 | Wallace | Dec 2020 | A1 |
20200405317 | Wallace | Dec 2020 | A1 |
20200405411 | Draper et al. | Dec 2020 | A1 |
20200405419 | Mao | Dec 2020 | A1 |
20200405420 | Purohit | Dec 2020 | A1 |
20200405423 | Schuh | Dec 2020 | A1 |
20200405424 | Schuh | Dec 2020 | A1 |
20200405434 | Schuh | Dec 2020 | A1 |
20200406002 | Romo | Dec 2020 | A1 |
20210007819 | Schuh | Jan 2021 | A1 |
20210008341 | Landey et al. | Jan 2021 | A1 |
Number | Date | Country |
---|---|---|
101147676 | Mar 2008 | CN |
101222882 | Jul 2008 | CN |
102316817 | Jan 2012 | CN |
102946801 | Feb 2013 | CN |
102973317 | Mar 2013 | CN |
103705307 | Apr 2014 | CN |
103735313 | Apr 2014 | CN |
103813748 | May 2014 | CN |
104758066 | Jul 2015 | CN |
105559850 | May 2016 | CN |
105559886 | May 2016 | CN |
105611881 | May 2016 | CN |
106455908 | Feb 2017 | CN |
106821498 | Jun 2017 | CN |
104931059 | Sep 2018 | CN |
3025630 | Jun 2016 | EP |
20140009359 | Jan 2014 | KR |
101713676 | Mar 2017 | KR |
2569699 | Nov 2015 | RU |
WO 2005087128 | Sep 2005 | WO |
WO 2006051523 | May 2006 | WO |
WO 2006099056 | Sep 2006 | WO |
WO 2009097461 | Jun 2007 | WO |
102458295 | May 2012 | WO |
WO 2013116140 | Aug 2013 | WO |
WO 2014058838 | Apr 2014 | WO |
WO 2015089013 | Jun 2015 | WO |
WO 2016077419 | May 2016 | WO |
WO 2016203727 | Dec 2016 | WO |
WO 2017030916 | Feb 2017 | WO |
WO 2017036774 | Mar 2017 | WO |
WO 2017048194 | Mar 2017 | WO |
WO 2017066108 | Apr 2017 | WO |
WO 2017146890 | Aug 2017 | WO |
WO 2017167754 | Oct 2017 | WO |
Entry |
---|
Al-Ahmad, Amin, Jessica D. Grossman, and Paul J. Wang. “Early experience with a computerized robotically controlled catheter system.” Journal of Interventional Cardiac Electrophysiology 12 (2005): 199-202. |
Bell, Charreau S., et al. “Six DOF motion estimation for teleoperated flexible endoscopes using optical flow: A comparative study.” 2014 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2014. |
Ciuti, Gastone, et al. “Intra-operative monocular 3D reconstruction for image-guided navigation in active locomotion capsule endoscopy.” 2012 4th IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob). IEEE, 2012. |
Duncan, Roger. “Sensing Shape: Fiber-Bragg-grating sensor arrays monitor shape at a high resolution.” SPIE's OE Magazine (2005): 18-21. |
Fallavollita, Pascal. “Acquiring multiview c-arm images to assist cardiac ablation procedures.” EURASIP Journal on Image and Video Processing 2010 (2010): 1-10. |
Froggatt, Mark, and Jason Moore. “High-spatial-resolution distributed strain measurement in optical fiber with Rayleigh scatter.” Applied optics 37.10 (1998): 1735-1740. |
Gutiérrez, Luis F., et al. “A practical global distortion correction method for an image intensifier based x-ray fluoroscopy system.” Medical physics 35.3 (2008): 997-1007. |
Haigron, Pascal, et al. “Depth-map-based scene analysis for active navigation in virtual angioscopy.” IEEE Transactions on Medical Imaging 23.11 (2004): 1380-1390. |
Hansen Medical, Inc. 2005, System Overview, product brochure, 2 pp., dated as available at http://hansenmedical.com/system.aspx on Jul. 14, 2006 (accessed Jun. 25, 2019, using the internet archive way back machine). |
Hansen Medical, Inc. Bibliography, product brochure, 1 p., dated as available at http://hansenmedical.com/bibliography.aspx on Jul. 14, 2006 (accessed Jun. 25, 2019, using the internet archive way back machine). |
Hansen Medical, Inc. dated 2007, Introducing the Sensei Robotic Catheter System, product brochure, 10 pages. |
Hansen Medical, Inc. dated 2009, Sensei X Robotic Catheter System, product brochure, 5 pp. |
Hansen Medical, Inc. Technology Advantages, product brochure, 1 p., dated as available at http://hansenmedical.com/advantages.aspx on Jul. 13, 2006 (accessed Jun. 25, 2019, using the internet archive way back machine). |
http://www.sjmprofessional.com-Products-US-Mapping-and-Visualization-EnSite-Velocity.aspx. |
Kiraly, Atilla P., et al. “Three-dimensional human airway segmentation methods for clinical virtual bronchoscopy.” Academic radiology 9.10 (2002): 1153-1168. |
Kiraly, Atilla P., et al. “Three-dimensional path planning for virtual bronchoscopy.” IEEE Transactions on Medical Imaging 23.11 (2004): 1365-1379. |
Konen, W., M. Scholz, and S. Tombrock. “The VN project: endoscopic image processing for neurosurgery.” Computer Aided Surgery 3.3 (1998): 144-148. |
Kumar, Atul, et al. “Stereoscopic visualization of laparoscope image using depth information from 3D model.” Computer methods and programs in biomedicine 113.3 (2014): 862-868. |
Livatino, Salvatore, et al. “Stereoscopic visualization and 3-D technologies in medical endoscopic teleoperation.” IEEE Transactions on Industrial Electronics 62.1 (2014): 525-535. |
Luó, Xióngbiāo, et al. “Modified hybrid bronchoscope tracking based on sequential monte carlo sampler: Dynamic phantom validation.” Computer Vision—ACCV 2010: 10th Asian Conference on Computer Vision, Queenstown, New Zealand, Nov. 8-12, 2010, Revised Selected Papers, Part III 10. Springer Berlin Heidelberg, 2011. |
Marrouche, Nassir F., et al. “Preliminary human experience using a novel robotic catheter remote control.” Heart Rhythm 2.5 (2005): S63. |
Mayo Clinic, Robotic Surgery, https://www.mayoclinic.org/tests-procedures/robotic-surgery/about/pac-20394974?p=1, downloaded from the internet on Jul. 12, 2018, 2 pages. |
Mourgues, Fabien, Eve Coste-Maniere, and CHIR Team www.inria.fr/chir. “Flexible calibration of actuated stereoscopic endoscope for overlay in robot assisted surgery.” Medical Image Computing and Computer-Assisted Intervention—MICCAI 2002: 5th International Conference Tokyo, Japan, Sep. 25-28, 2002 Proceedings, Part I 5. Springer Berlin Heidelberg, 2002. |
Nadeem, Saad, and Arie Kaufman. “Depth reconstruction and computer-aided polyp detection in optical colonoscopy video frames.” arXiv preprint arXiv:1609.01329 (2016). |
Oh, Seil, et al. “Novel robotic catheter remote control system: Safety and accuracy in delivering RF lesions in all 4 cardiac chambers.” Heart Rhythm 2.5 (2005): S277-S278. |
Point Cloud, Sep. 10, 2010, Wikipedia, 2 pages. |
Racadio, John M., et al. “Live 3D guidance in the interventional radiology suite.” American Journal of Roentgenology 189.6 (2007): W357-W364. |
Reddy, Vivek Y., et al. “Porcine pulmonary vein ablation using a novel robotic catheter control system and real-time integration of CT imaging with electroanatomical mapping.” Heart Rhythm 2.5 (2005): S121. |
Ren, Hongliang, et al. “Multisensor data fusion in an integrated tracking system for endoscopic surgery.” IEEE Transactions on Information Technology in Biomedicine 16.1 (2011): 106-111. |
Sato, Masaaki, Tomonori Murayama, and Jun Nakajima. “Techniques of stapler-based navigational thoracoscopic segmentectomy using virtual assisted lung mapping (VAL-MAP).” Journal of thoracic disease 8.Suppl 9 (2016): S716. |
Shen, Mali, Stamatia Giannarou, and Guang-Zhong Yang. “Robust camera localisation with depth reconstruction for bronchoscopic navigation.” International journal of computer assisted radiology and surgery 10 (2015): 801-813. |
Shi, Chaoyang, et al. “Simultaneous catheter and environment modeling for trans-catheter aortic valve implantation.” 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 2014. |
Slepian, dated 2010, Robotic Catheter Intervention: the Hansen Medical Sensei Robot Catheter System, PowerPoint presentation, 28 pages. |
Solheim, Ole, et al. “Navigated resection of giant intracranial meningiomas based on intraoperative 3D ultrasound.” Acta neurochirurgica 151 (2009): 1143-1151. |
Solomon, Stephen B., et al. “Three-dimensional CT-guided bronchoscopy with a real-time electromagnetic position sensor: a comparison of two image registration methods.” Chest 118.6 (2000): 1783-1787. |
Song, Kai-Tai, and Chun-Ju Chen. “Autonomous and stable tracking of endoscope instrument tools with monocular camera.” 2012 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM). IEEE, 2012. |
Vemuri, Anant Suraj, et al. “Interoperative biopsy site relocalization in endoluminal surgery.” IEEE Transactions on Biomedical Engineering 63.9 (2015): 1862-1873. |
Verdaasdonk, R. M., et al. “Effect of microsecond pulse length and tip shape on explosive bubble formation of 2.78 μm Er,Cr:YSGG and 2.94 μm Er:YAG laser.” Proceedings of SPIE, vol. 8221, 12. |
Wilson, Emmanuel, et al. “A buyer's guide to electromagnetic tracking systems for clinical applications.” Medical imaging 2008: visualization, image-guided procedures, and modeling. vol. 6918. SPIE, 2008. |
Yip, Michael C., et al. “Tissue tracking and registration for image-guided surgery.” IEEE transactions on medical imaging 31.11 (2012): 2169-2182. |
Zhou, Jin, et al. “Synthesis of stereoscopic views from monocular endoscopic videos.” 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition—Workshops. IEEE, 2010. |
Extended European Search Report dated Jul. 28, 2017, for Application No. 11799017.6. |
Number | Date | Country | |
---|---|---|---|
20210353129 A1 | Nov 2021 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 14164961 | Jan 2014 | US |
Child | 16198602 | US | |
Parent | 12823032 | Jun 2010 | US |
Child | 14164961 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16198602 | Nov 2018 | US |
Child | 17340641 | US |