Flexible instrument localization from both remote and elongation sensors

Information

  • Patent Grant
  • 11426095
  • Patent Number
    11,426,095
  • Date Filed
    Monday, February 29, 2016
  • Date Issued
    Tuesday, August 30, 2022
Abstract
A system and method of tracking a flexible elongate instrument within a patient is disclosed herein. The system is configured to obtain remote localization measurement data of the flexible elongate instrument and obtain elongation measurement data of the flexible elongate instrument. This data is combined and transformed to a coordinate reference frame to produce a localization of the flexible elongate instrument that is more accurate than the remote localization measurements or elongation measurement data alone. The combined localization is then provided to a localization consumer.
Description
TECHNICAL FIELD

The present disclosure describes techniques to determine the location, orientation, and shape of a flexible device by combining remote localization techniques (such as fluoroscopy, electromagnetic sensors, etc.) with elongation measurements (such as drive wire displacements, tendon tension, etc.) to produce effective localization of the flexible device in a target coordinate frame.


BACKGROUND

Currently known minimally invasive procedures for diagnosis and treatment of medical conditions use shapeable instruments, such as steerable devices, flexible catheters, or more rigid arms or shafts, to approach and address various tissue structures within the body. For various reasons, it is highly valuable to be able to determine the 3-dimensional spatial position of portions of such shapeable instruments relative to other structures, such as the operating table, other instruments, or pertinent anatomical tissue structures. Such information can be used for a variety of reasons, including, but not limited to: to improve device control; to improve mapping of the region; to adapt control system parameters (whether kinematic and/or solid mechanic parameters); to estimate, plan, and/or control reaction forces of the device upon the anatomy; and/or to monitor the system characteristics for determination of mechanical problems. Alternatively, or in combination, shape information can be useful simply to visualize the tool with respect to the anatomy or other regions, whether real or virtual.


However, a primary difficulty in using flexible devices is the inability to determine the location and/or shape of the flexible device within the body. In non-flexible, discrete devices, such detection and monitoring tasks may be accomplished with encoders or other techniques (such as visual tracking of a portion of the device external to the body) that utilize the rigid nature of linkages. Such conventional techniques are not practical with flexible devices, as such devices contain non-rigid linkages or too many individual linkages to effectively determine location and/or shape of the flexible device.


There remains a need to apply the information gained by the localization techniques to determine the location, orientation, and shape of a flexible device, and to apply this information to produce improved device control or improved modeling when directing a robotic or similar device. There also remains a need to apply such controls to medical procedures and equipment.


SUMMARY

A system and method of tracking a flexible elongate instrument within a patient is disclosed herein. The system is configured to obtain remote localization measurement data of the flexible elongate instrument and obtain elongation measurement data of the flexible elongate instrument. This data is combined and transformed to a coordinate reference frame to produce a localization of the flexible elongate instrument that is more accurate than the remote localization measurements or elongation measurement data alone. The combined localization is then provided to a localization consumer.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is an illustration of various coordinate frames of an elongation element and a global reference frame;



FIG. 2 is a schematic illustration of a system for localization of a flexible instrument using a remote localization measurement and an elongation measurement;



FIG. 3 is a map of remote localization technologies;



FIG. 4A is an illustration of a flexible device with actuation elements terminated distally, in an unarticulated configuration;



FIG. 4B is an illustration of the flexible device of FIG. 4A in an articulated configuration;



FIG. 5 is a schematic illustration of the flexible device of FIG. 4B with respect to an imaging plane;



FIG. 6 illustrates how separate orientation frames may be registered to each other by correlating motion of a common object in each;



FIG. 7 is a schematic illustration of a system for localization of a flexible instrument using data from a 5 degrees of freedom electromagnetic sensor with measured pullwire position data;



FIG. 8 is a schematic illustration of a system for localization of a flexible instrument using fluoroscopy imaging data with measured pullwire position data;



FIG. 9 is a schematic illustration of a system for localization of a flexible instrument using active contour tracking and template matching with linear wire model data;



FIGS. 10A-10C are schematic illustrations of ambiguities that may occur in and out of an imaging plane;



FIG. 11 is a schematic illustration of a particle filter for filtering image and elongation measurements;



FIG. 12 is a schematic illustration of a system for localization of a flexible instrument using electromagnetic localization with linear wire model data;



FIG. 13 is a schematic illustration of a system for using elongation measurement with electromagnetic localization data to determine roll estimation of a flexible instrument;



FIG. 14A illustrates elongation measurements of a flexible device taken over time;



FIG. 14B is an illustration of an anatomical map of a vessel; and



FIG. 14C is an illustration of localization of a device within a map.





DETAILED DESCRIPTION OF THE INVENTION

Overview of Flexible Device Localization


With reference to FIG. 1, localization of a flexible device 10 will be described. Localization of a flexible device is the determination of a mapping from one or more device coordinate frames for a flexible device 10 to a global target frame. More specifically, the flexible device 10 of FIG. 1 is defined in terms of localization by one or more Cartesian device reference frames, such as, for example, coordinate frames CDF1, CDF2, CDF3, CDF4. These reference frames are typically defined by three dimensions of position and three dimensions of orientation. Thus, the goal of localization is to determine the relationship, or mapping, between the local device reference frames (i.e., CDF1, CDF2, CDF3, CDF4) and a fixed, global or target reference frame GF in relation to the task to be performed. The mapping is performed using a combination of one or more remote localization measurements LM (which will be referenced to the global reference frame) and an elongation measurement EM (which will be referenced to the device itself).
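For illustration only (this sketch is not part of the disclosed system), the mapping from a device reference frame such as CDF1 to the global frame GF can be represented in software as a homogeneous transform; the rotation order and numerical values below are assumptions:

    import numpy as np

    def rotation_zyx(roll, pitch, yaw):
        # Rotation matrix from roll/pitch/yaw (radians), composed in z-y-x order.
        cr, sr = np.cos(roll), np.sin(roll)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cy, sy = np.cos(yaw), np.sin(yaw)
        Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
        Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
        return Rz @ Ry @ Rx

    def device_to_global(roll, pitch, yaw, translation):
        # Homogeneous transform T such that p_GF = T @ p_CDF for homogeneous points.
        T = np.eye(4)
        T[:3, :3] = rotation_zyx(roll, pitch, yaw)
        T[:3, 3] = translation
        return T

    # Example: a point expressed in CDF1 mapped into the global frame GF.
    T = device_to_global(0.0, np.deg2rad(10), np.deg2rad(30), [5.0, 2.0, 1.0])
    p_cdf1 = np.array([0.0, 0.0, 1.0, 1.0])
    p_gf = T @ p_cdf1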


An exemplary technique of utilizing a remote localization measurement with elongation measurements is particularly useful when the remote localization measurements are insufficient to fully describe the position, orientation, and shape of the flexible device (i.e., the mapping for each device reference frame) or when the remote localization measurements are not accurate enough as a standalone localization technique.


The basic structure for the proposed localization technique 100 is illustrated in FIG. 2. The proposed localization technique 100 utilizes both remote localization measurements 110 and elongation measurements 112. Elements of technique 100 are discussed in further detail below.


The flexible device 10 may be any device that contains continuously articulating joints (including, but not limited to, a manual catheter, robotic catheter, etc.). Alternatively, the flexible device 10 may be configured as a device that includes a substantial number of rigid linkages that makes modeling the kinematics of the device 10 difficult and unmanageable. The device 10 may include one or more sensors 12. The device 10 may also include one or more elongation elements 14. Sensors 12 and elongation elements 14 will be described below in further detail.


A remote localization device provides a signal from a sensor 12 that is positioned outside of the flexible device 10 and registered to the global reference frame GRF. An example of a remote localization device is a fluoroscopy system 16, shown in FIG. 1. The fluoroscopy view provides the shape and location of the flexible device 10, although certain information, such as depth, cannot be measured. The remote localization device can accurately sense the shape of the flexible device 10, usually through the use of markers or sensors mounted on the flexible device 10, as well as sensing the flexible device 10 in relation to a reference frame outside of the flexible device 10. Various technologies that may be used for remote localization are discussed in further detail below.


As depicted in the schematic of FIG. 1, an elongation element 14 is an axially disposed element located within the flexible device 10. Elongation element 14 may take various forms, but its primary purpose is to measure differences in axial path length APL between a reference path RP and a secondary path within the flexible device 10. In one exemplary configuration, the reference path RP is a center axis of the flexible device 10 and the secondary path is positioned off the center axis. One exemplary configuration of an elongation element 14 is a tendon that may be used to articulate the flexible device 10. However, it is understood that any element that allows a differential measurement between two axial paths of the flexible device 10 may be used to calculate an elongation measurement. The primary use of the differential measurement from an elongation element is to determine the approximate shape and heading of the flexible device 10.


It is also understood that an elongation measurement is not limited to displacement measurements from stretching. Alternatively, elongation measurements such as compression, tension in an element fixed at either the proximal or distal end of the flexible device 10, or displacement of fluid in a tube may be used. Elongation measurements are discussed in further detail below.


Because the information gleaned from the sensors 12 and the elongation elements 14 comprises disparate streams of data, a signal processing system 114 that can combine the two data streams into a single localization signal is proposed. There are a variety of methods that may be used with the signal processing system 114. However, a primary element of the signal processing system 114 is a method of representing the possible shapes of the device in the global reference frame and using sensor information to reduce the number of possibilities. Filters, estimation, and modeling are suitable techniques for accomplishing effective mapping between the position and orientation of multiple points on the device and the global reference frame GRF. In addition, commands for a flexible robot may be incorporated into the signal processing system to increase accuracy.


The signal processing system 114 is coupled to a localization consumer 116. The localization consumer 116 is the eventual user of the relationship between the flexible device 10 and the global reference frame GRF. Non-limiting examples include a robot control system, a sensing system for recording a surgical procedure, or a graphical display utilized by a human operator.


Remote Localization Overview


As discussed above, a number of technologies may be used to determine the shape and position of a flexible device 10 with respect to a fixed reference frame GRF using sensors 12. In the context of medical devices, including, but not limited to medical catheters, the reference frame may be defined by a part of the patient's anatomy, the table upon which the patient lies, or the world. Complete configuration of a rigid three-dimensional object may be specified by six degrees of freedom. These degrees of freedom are described by three positional coordinates (e.g., the x, y, and z coordinates of a Cartesian frame) and three rotational coordinates (e.g., roll, pitch, and yaw).


Depending on the type and number of sensors 12, a localization system may measure the configuration of a flexible device 10 with respect to all six degrees of freedom, or a subset thereof. For example, a 2-dimensional sensor may determine the position of a given point on the flexible device 10 as projected onto a fixed plane. A 3-dimensional sensor may determine the complete 3D position of a given point on a flexible device 10, but not the rotation of the flexible device 10. A 2 degrees of freedom directional sensor may measure the heading direction (pitch and yaw) of the flexible device 10, but not its roll about that axis. A five degrees of freedom sensor may measure both the position and heading of an object. A five degrees of freedom localization system can be constructed from two 3D positional sensors, with the heading direction determined by the vector between them. In a sufficiently constrained anatomy, a 3 degrees of freedom position sensor may be combined with a 3D map (generated from a preoperative imaging scan, such as a CT scan), registered to the same reference frame as the sensor, in order to estimate the heading direction of the flexible device 10 (modulo a 180 degree rotation) under the assumption that the flexible device 10 must be in the direction allowed locally by the anatomical map. A six degrees of freedom sensor may fully specify the position, roll, pitch, and yaw of the flexible device 10.
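As a minimal sketch (not taken from the patent; the sensor layout is assumed), the five degrees of freedom construction from two 3D positional sensors described above can be computed as follows, with the heading taken as the unit vector between the two sensed points:

    import numpy as np

    def five_dof_from_two_points(p_proximal, p_distal):
        # Returns 3 positional DOF (distal point) and a unit heading vector (2 orientation DOF).
        p_proximal = np.asarray(p_proximal, dtype=float)
        p_distal = np.asarray(p_distal, dtype=float)
        heading = p_distal - p_proximal
        heading /= np.linalg.norm(heading)   # pitch/yaw direction; roll is unobserved
        return p_distal, heading

    position, heading = five_dof_from_two_points([0.0, 0.0, 0.0], [1.0, 0.5, 2.0])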


As demonstrated in FIG. 3, a variety of physical phenomena may form the basis for a localization system. Many systems operate by determining the distance of one or more sensors from reference locations and triangulating a position based on that data. For example, some systems determine the distance by measuring impedance of an electric field. In one known system, electrodes on a flexible catheter and a plurality of electrical patches placed on the patient's body are used to compute the 3D position of at least two electrodes, thereby resolving five degrees of freedom. In another known system, six degrees of freedom are resolved by using magnets and a sensor that measures the strength of the magnetic field. In still another known system, the time of flight of waves between emitters and receivers is measured. For example, in GPS, radio waves may be used. In ultrasound-based systems, pressure waves may be used.


An imaging system (such as a fluoroscopy system 16) may also be used to localize a flexible device 10 with respect to its camera frame when coupled with an algorithm that can detect one or more specified points of the flexible device 10 within the image. Certain imageable materials (such as radio-opaque markers) may be used to enable the specified points to stand out on the image. If only a single point is tracked by the algorithm, a 2D position of the object as projected onto the imaging plane may be determined. If two points are tracked, one rotational degree of freedom may also be computed. More specifically, the heading direction as projected onto the imaging plane may be determined. The distance between the two points, distinct texturing of the markers, and/or simultaneous imaging and tracking in another image plane can be used to determine the heading direction in or out of the original imaging plane. Simultaneous imaging and tracking in different planes may also be used to resolve the third positional coordinate. Further, distinctive texturing of the markers may also be used to estimate the final rotational degree of freedom, i.e., roll.
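The two-point tracking described above can be sketched as follows; this is an illustrative example only, with pixel coordinates and marker spacing assumed rather than taken from the patent:

    import numpy as np

    def projected_heading(marker_a_px, marker_b_px):
        # In-plane heading angle (radians) of the segment from marker A to marker B.
        d = np.asarray(marker_b_px, dtype=float) - np.asarray(marker_a_px, dtype=float)
        return np.arctan2(d[1], d[0])

    def out_of_plane_tilt(apparent_length, true_length):
        # Magnitude of the tilt out of the imaging plane inferred from foreshortening;
        # the sign remains ambiguous (cf. FIGS. 10B-10C).
        ratio = np.clip(apparent_length / true_length, -1.0, 1.0)
        return np.arccos(ratio)

    angle_in_plane = projected_heading([120.0, 240.0], [180.0, 260.0])
    tilt = out_of_plane_tilt(apparent_length=8.5, true_length=10.0)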


While remote localization sensors may aid in determining some information about the location of a flexible device 10 disposed within a body, often the sensors are unable to capture important information. For example, as discussed above, a fluoroscopy image is usually a single, planar projection; thus, depth information is not present. As another example, coil electromagnetic sensors disposed within the flexible device may not be able to sense roll about a principal axis. Thus, a supplemental source of information to complement the information obtainable by remote localization sensors may lead to greater accuracy in localization.


Elongation Measurements Overview


As illustrated in FIG. 2, information about deformation of a flexible device 10, which may be obtained from elongation sensors, may be used to supplement information obtained by remote localization sensors. The deformation information may be transmitted along the length of the flexible device 10 by a member that is configured to elongate or contract, based on its placement relative to the centroid of the flexible device 10, as the flexible device 10 articulates.


In a catheter, for example, one exemplary source of elongation information would be the actuation elements 14 that move to selectively articulate the catheter. Referring to FIGS. 4A-4B, a catheter 100 includes at least a pair of tendons 114 that each include a first end 116 that is terminated distally. As the first end 116 is displaced proximally, as illustrated in FIG. 4B, a distal motion results. Thus, for the catheter 100 to bend, the material at the circumference of the catheter 100 must either elongate or compress. The motion of the tendon 114 gives an indication of the degree of elongation, and therefore the extent of the articulation. This information can assist in predicting the location and/or orientation of the catheter 100 within the body, especially if the bending motion of the flexible device 10 is out of a fluoroscopy plane, as illustrated in FIG. 5. Indeed, as may be seen in FIG. 5, for a catheter motion out of the fluoroscopy plane 120, only a small catheter tip motion will be observed on an image panel 122 as the bulk of the motion is toward the fluoroscopy plane 120.


In the case of a catheter 100 being oriented in a straight line within a patient, the pitch and yaw angles of the catheter 100 can be related linearly to the tendon 114 displacements, as represented by the following formula:

α = f(y1, y2).   [1]
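A minimal sketch of such a linear relation, with purely assumed gains and a simple proportional form of f(y1, y2) that is not specified by the patent:

    def pitch_yaw_from_tendons(y1, y2, k_pitch=0.8, k_yaw=0.8):
        # y1, y2: differential tendon displacements (mm); k_pitch, k_yaw: assumed gains (rad/mm).
        return k_pitch * y1, k_yaw * y2

    pitch, yaw = pitch_yaw_from_tendons(1.5, -0.5)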


For more complicated anatomy where the catheter 100 will curve and twist, external remote localization may assist with a global reference frame GRF, whereas elongation measurement may be desirable for local differential measurements.


Displacement of a tendon 114 may be indicative of elongation on the circumference of the catheter 100. However, there may be some components that must be compensated for. As a non-limiting example, if the tendon 114 is under load and stretches, an indication of its force may be used to remove its own strain. Similarly, the total force on the catheter 100 may indicate compression of the catheter 100 that must be compensated for, since the differential elongation of the catheter 100 from one side to the other side is the quantity of interest. If multiple tendons 114 are used, their respective differential displacement information may be sufficient to indicate orientation.


Although tendons 114 are one example of a source of elongation information, there are other potential sources that are contemplated by the present discussion. One non-limiting example includes an electrical lead that may be routed circumferentially, but configured to float within a lumen. The electrical lead may be used for obtaining elongation information as it displaces.


Instrument channels may also provide a mechanism for measuring elongation if such channels are positioned off center of the flexible device 10. In one exemplary configuration, the instrument channel is fixed distally. In another exemplary configuration, the instrument channel is fixed proximally and measured distally by either internal or external sensing. In yet another exemplary configuration, the instrument channel may be fixed at either end, and elongation is measured at both the distal and proximal ends.


In yet another exemplary configuration, a pressure-filled circumferential lumen may be utilized. More specifically, the lumen is filled with either gas or fluid, which may be used to generate signals as the pressure and/or displaced volume changes with articulation. Further, the pressure-filled lumens may also be used to exert positive pressure or negative pressure to articulate the flexible device 10.


In a further exemplary configuration, pushrods may be utilized. More specifically, pushrods may be pushed or pulled to articulate the flexible member. Displacement/force may be measured to estimate circumferential elongation on the flexible device.


When using a tension element, such as a cable, use of a Bowden Coil (also known as a coil tube) that can support the tension distally and decouple the flexible shaft motion from tendon actuation may be useful. The Bowden Coil itself could then be measured for device elongation information, as the Bowden Coil must displace back and forth as the device shaft is articulated. However, the Bowden Coil is most likely to be under significant load and therefore compress during articulation. Thus, the load should be estimated and compensated for in calculating the device elongation.


The Bowden Coil may be used in tandem with a tendon 114. The elongation measurement from the Bowden Coil may give articulation information at a point along the length of the flexible device where the Bowden Coils are terminated, and the tendons may provide articulation information about the segment distal to the Bowden Coil termination.


In another exemplary arrangement, a fiber optic line may be used to deliver energy or estimate shape/strain of a flexible device 10. The proximal motion of the fiber optic line, if fixed distally, would indicate circumferential elongation of the flexible device 10.


Elongation information of interest is found at specific known locations about the circumference of the flexible device 10. Elongation at multiple points about the circumference of the flexible device 10 provides value, since differential elongation provides an indication of device articulation. Thus, in one exemplary arrangement, a plurality of elongate elements may be disposed about the circumference to improve knowledge about the elongation of the flexible device 10. In one exemplary configuration, the plurality of elongate elements may all be terminated at the distal tip of the flexible device 10 to give fine articulation information. In another exemplary arrangement, the plurality of elongate elements may be terminated (or simply measured) at multiple points along the length of the flexible device 10 to give more complete shape information. One such exemplary configuration is described above, i.e., utilizing the Bowden Coil with tendons, in which two points along the length of the catheter will generate articulation information. A continuous strain-measuring instrument could be embedded and fixed in the flexible device that would provide elongation information at multiple points along the length of the flexible device. If a plurality of these strain measurements are routed in parallel along the length of the flexible device 10, the differential measurements could provide complete articulation information about any point along the length of the flexible device.


Motion Based Orientation Registration


The remote localization measurement and elongation time history data are useful for registering orientation in a localization frame, such as the global reference frame GRF. FIG. 6 illustrates two views of an object 150 in motion in a pair of reference frames, an elongation reference frame ERF and a localization reference frame LRF. Correlating each measured motion in time allows a combination of the signals and improvement of a localization estimate. For example, separate orientation frames may be registered to each other by correlating motion of a common object in each frame. In certain exemplary arrangements that actuate a flexible device, a rich, but low amplitude, identification signal may be embedded in an actuation command to aid in the correlation.


If two reference frames are not registered in orientation, objects appearing in each reference frame will be interpreted as rotated differently to an observer of both, such as shown in FIG. 6. Rotation values for each translational degree of freedom are thus required to register the elongation reference frame ERF and the localization reference frame LRF.


Thus, to register the orientation, the signal processing system 114 illustrated in FIG. 2 translates elongation measurements 112 into an expected motion in the elongation reference frame ERF of those measurements. Next, the signal processing system 114 determines the orientation (rotation angles) which will best match the expected motion to motion that is measured by the remote localization system 110. This determination may be accomplished by analyzing the time history of measurements (to infer motion), or by keeping an estimate of the state of the system and updating the estimate based on new measurements. In one exemplary configuration, a level of uncertainty of some states may be accounted for by a Kalman filter.


While the above proposed technique does not register position, in some instances orientation is sufficient for a control system to register its commands to an operator selected reference frame. With certain systems, the proposed orientation registration technique may be able to also register positions over time if the two localization data streams include position measurements. Such systems will require correspondingly more data measurements to register the extra degrees of freedom.


Exemplary configurations of the proposed orientation registration technique are described below. In a first exemplary configuration, a 5 degree of freedom measurement is combined with motion of a flexible device 100 having four pullwires that actuate near the point of measurement and whose positions are measured proximally. In a second exemplary configuration, the same flexible device 100 with the four pullwires is combined with a 2 degree of freedom fluoroscopic measurement of a projected motion. Each will be discussed in turn.


First Exemplary Configuration


Known five degrees of freedom remote localization systems 200 that are marketed for medical use do not measure the roll of a flexible device 100. However, roll may be estimated by combining a 5 degree of freedom measurement taken from an electromagnetic sensor 202 with measured pullwire positions 204 in the presence of a multi-dimensional motion path, as illustrated in FIG. 7. The flexible device 100 may be moved with the pullwires to follow a path so as to involve multiple pullwires. In one example, the path is a circle. During this movement, the time history of the pullwire positions and the remote localization measurements may be captured. The localization measurements are recorded as a matrix y. The pullwire positions are translated into the expected motion path, x, with a roll of zero. A roll matrix A(θ) about the y-axis in the registered frame will translate x into the reference frame of y:










A(θ) = [  cos(θ)   0   sin(θ)
          0        1   0
         −sin(θ)   0   cos(θ) ]   (1)







The roll will satisfy the optimization as follows:













minimize over θ:  ‖y − A(θ)x‖₂².   (2)







In another exemplary configuration, the roll optimization may be formulated as a state-update estimation.
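A minimal sketch of the batch optimization of Equations (1) and (2) is given below. It is illustrative only: a dense grid search stands in for whatever solver an actual system would employ, and the synthetic 3×N motion histories are assumptions.

    import numpy as np

    def roll_matrix(theta):
        # Rotation about the y-axis of the registered frame, Equation (1).
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, 0.0, s],
                         [0.0, 1.0, 0.0],
                         [-s, 0.0, c]])

    def estimate_roll(x, y, n_grid=3600):
        # Return the roll angle minimizing ||y - A(theta) x||^2, Equation (2).
        thetas = np.linspace(-np.pi, np.pi, n_grid, endpoint=False)
        costs = [np.sum((y - roll_matrix(t) @ x) ** 2) for t in thetas]
        return thetas[int(np.argmin(costs))]

    # Synthetic check: recover a known roll of 30 degrees from a random motion path.
    rng = np.random.default_rng(0)
    x = rng.standard_normal((3, 50))            # expected motion from pullwire positions
    y = roll_matrix(np.deg2rad(30)) @ x         # motion as seen by the remote localizer
    theta_hat = estimate_roll(x, y)             # approximately 0.524 rad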


Second Exemplary Configuration


A second exemplary configuration 300 is illustrated schematically in FIG. 8. In this proposed technique, registration of flexible device 100 is accomplished by using position data 302 from an actuated pullwire device moving in three dimensions with a projected motion measured in two dimensions 304. All three rotation angles must be estimated to register orientation of the flexible device 100. A control system 306 combines the position data 302 from the pullwires with the position data 304 from the fluoroscopy system to register 308 an orientation of the device 100.


As with the first exemplary configuration discussed above, the pullwire motions are translated to expected device motions. The expected device motions in this exemplary configuration are projected onto a fluoroscopy plane 304. The pullwires further have a very small amount of motion overlaid constantly.


This technique begins by first assuming that the pullwires lie at cardinal angle positions, λ, where λ is represented by the following:









λ = [0, π/2, π, 3π/2]   (3)







Next, a rotation matrix for motion in the plane 304 is determined:










P(ψ) = [ cos(ψ)   −sin(ψ)
         sin(ψ)    cos(ψ) ]   (4)







In the rotation matrix, ψ represents the yaw in the fluoroscopy reference frame 304.


The expected motion is calculated as follows:










A(ϕ, θ) = [ sin(λ − ϕ)                    0
            sin(λ − ϕ − π/2)·sin(θ)       cos(θ) ]   (5)







where ϕ and θ represent roll and pitch, respectively, both in the fluoroscopy reference frame 304. Since λ is a four-element vector, sin(λ − ϕ) and sin(λ − ϕ − π/2)·sin(θ) each form a row of four entries, so that A(ϕ, θ) is 2×5. The transformation matrix to apply to motion in each tendon and insert axis is as follows:

B(ψ, ϕ, θ) = P(ψ) A(ϕ, θ)   (6)


where B ∈ IR^(2×5).


Small motions may be created and measured using the elongation tendons. Such motions should be translated into motion in a Cartesian direction at the distal section, such as, for example:









x = [ 1  0  0  0  0  1
      0  0  1  1  0  0
      0  1  0  0  0  1
      0  0  0  0  1  0
      0  0  1  1  0  1 ]   (7)







where the columns in the above matrix each represent a time step and each row represents a direction to be projected onto the measurement plane; x ∈ IR^(5×N) for N time samples. Here x(1) is a sideways motion of the tip caused by actuation of a single tendon.


The measurements in the fluoroscopy plane, y ∈ IR^(N×2), represent two-dimensional motion over the same time steps used for the input matrix. However, mechanical systems have bandwidth limitations such that the time steps will be much slower than the sampling rate of the system electronics (on the order of 1 to 10 Hz).


Finally, the difference between the time histories of the estimated motion projection and the fluoroscopy measurement should be minimized by properly selecting the rotation angles, as represented by the below formula:













minimize over ψ, ϕ, θ:  ‖y − B(ψ, ϕ, θ)x‖₂².   (8)







With constraints on ψ, ϕ, and θ, the minimization performed by the control system 306 will reveal the proper orientation to register the flexible device frame (i.e., through the use of the proximal pullwire position 302) to the fluoroscopy frame (through the use of the fluoroscopy projection 304). In another exemplary configuration, the orientation may be formulated as a state-update estimation.
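A minimal sketch of the registration of Equations (3) through (8) is shown below. It is illustrative only: the 5×N motion history x and 2×N fluoroscopy motion y are assumed shapes, and a general-purpose Nelder-Mead search stands in for whatever constrained solver the control system 306 would actually use.

    import numpy as np
    from scipy.optimize import minimize

    LAMBDA = np.array([0.0, np.pi / 2, np.pi, 3 * np.pi / 2])   # cardinal pullwire angles, Eq. (3)

    def P(psi):
        # In-plane rotation by the yaw angle psi, Equation (4).
        c, s = np.cos(psi), np.sin(psi)
        return np.array([[c, -s], [s, c]])

    def A(phi, theta):
        # 2x5 projection of tendon and insert motion, Equation (5).
        row1 = np.append(np.sin(LAMBDA - phi), 0.0)
        row2 = np.append(np.sin(LAMBDA - phi - np.pi / 2) * np.sin(theta), np.cos(theta))
        return np.vstack([row1, row2])

    def register(x, y):
        # Find (psi, phi, theta) minimizing ||y - B(psi, phi, theta) x||^2, Equation (8).
        cost = lambda p: np.sum((y - P(p[0]) @ A(p[1], p[2]) @ x) ** 2)
        result = minimize(cost, x0=np.zeros(3), method="Nelder-Mead")
        return result.x    # [psi, phi, theta] in radians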


It is understood that the present disclosure is not limited to the first and second exemplary configuration described above. Indeed, other combinations of motion measured via a remote localization system and elongation measurements may be combined with a signal processing system to register orientation in an elongation reference frame to the localization reference frame and improve localization.


Combining Remote Localization and Elongation Measurement


There are a number of different techniques that may be used to combine remote localization data and elongation data to register a tool or surgical device 100 to a target frame. Several exemplary techniques will now be discussed in turn.


Localization with Fluoroscopy and Elongation Measurements


There are a number of imaging techniques that may be used to project a 3D view of a flexible medical device, such as flexible device 100, to a 2D planar view. One commonly known and used technique is fluoroscopy, although the techniques presented herein may also be used with any camera view, including visible light and unobstructed views. For ease of explanation, the exemplary proposed technique will be described in the context of a fluoroscopy system.


With reference to FIG. 9, there are three main elements to a system 400 that augments an imaging technique with elongation measurements to register a flexible device 100 to a target frame 402. A first element is a computer vision technique that tracks the flexible device 100 within a fluoroscopy view to produce a computer vision model 404. While it is understood that there are many ways to track the flexible device 100, the primary goal is to process an image (and optionally, device commands) and prior information about the flexible device 100 to produce two-dimensional estimations of the position and heading of known positions on the flexible device 100. A second element of the system 400 is a model of the elongation elements 406, such as, for example, control wires, that is used to process tension and/or displacement information so as to estimate a heading and position. A third element of the system 400 is a signal processing element 408 that combines information from the computer vision model 404 and the linear wire model 406 to produce a six degrees of freedom mapping of the end of the flexible device 100 to the coordinate frames of a proximal portion of the flexible device 100 and an imaging frame. Each element of the system 400 will be discussed below in further detail.


Computer Vision Technique


A primary goal of the computer vision technique for remote localization measurement is to obtain estimates of distinguishable features on the flexible device 100 within a coordinate frame of an imaging device. In most known surgical suite setups, the imaging device coordinate frame is registered within the global frame GF, so once the flexible device 100 is localized or registered to the imaging device coordinate frame, it may effectively be registered within the global frame GF.


However, the localization from an imaging device is limited to two dimensions, as the images are two dimensional. More specifically, the position of each distinguishable feature of the flexible device 100 is typically represented as two-dimensional and with an unknown depth. Moreover, orientation from the imaging system may only provide a single degree of freedom, i.e., the orientation angle within the plane. Thus, localization from an image gives no indication of the orientation in and out of the plane. Nor is roll around an axis of the flexible device 100 represented. Orientation and roll information can provide valuable information to effect proper localization and registration of a flexible medical device 100 within a target frame.


Tracking multiple features on a flexible device can improve depth and orientation estimates if the distances between each of the features are known. In typical 3D object tracking with computer vision techniques, all six degrees of freedom may be determined using the multiple features, as well as knowledge about occlusions. Thus, for flexible devices 100, it is possible to estimate the displacement of two reference points in or out of the imaging plane, but not possible to determine if the reference points are positioned in or out of the plane because of the inability to visualize depth.


More specifically, referring to FIG. 10A, a projection to the x-y image plane is shown. However, the actual depth of the reference points along the z-axis is not possible to determine, even if the distance in the projected plane, d′, and the actual distance between the features, d, are known. For example, the reference point on the right side of the image could be lower or higher than the reference point on the left side of the image, as demonstrated in FIGS. 10B-10C.


There are many methods that may be utilized to track the flexible devices 100 in an image. Exemplary methods include placing specific markers or features, based on color, shape, location, or a combination thereof, on the flexible device 100 and designing computer vision methods to track those markers and/or features. Other techniques require segmentation of the flexible device 100 from the background using shape, color, density, or motion information and then locating features within the flexible device 100 by exploiting the segmentation.


Another exemplary method is active contour tracking, which is augmented with template matching tuned to the flexible device. Many flexible devices 100 show up in fluoroscopy images as a single solid or textured compound curve. Thus, in such cases it may be useful to track the shape of the flexible device 100 with an active contour and then locate the specific features on the flexible device 100 with template matching techniques. An active contour is a method of maintaining a set of points on a curve by minimizing an energy function. Terms of the energy function maintain spacing between the points, minimize curvature (since flexible devices 100 resist sharp bends), and maintain proximity to the curve in the image. The end result is a sequence of points that tracks the flexible device 100 and grows or shrinks as the projection of the flexible device 100 grows and shrinks. An active contour can be designed to track edges in the image (using an energy term that is minimized when on an edge) or, in the case of flexible devices 100, two edges with a darker area inside them. Active contour tracking is described in co-pending U.S. patent application Ser. No. 13/832,586, issued as U.S. Pat. No. 9,629,595 on Apr. 25, 2017, the contents of which are incorporated by reference in its entirety.


An active contour allows tracking of specific features of the flexible device 100, such as control rings, markers, or other distinctive features that show up in images, by reducing the search space for image template matching techniques, for example by defining a "local area". Techniques such as correlation coefficients may be used to match specific image features by generating a template image based on the heading and construction of the flexible device 100 in the local area of the active contour and finding the position where it best matches the image. This technique results in robust tracking of the shape of the flexible device 100 with a reduced search space for tracking specific features.
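A minimal sketch of correlation-coefficient template matching restricted to the local area around an active-contour point is shown below; the window size, image representation (grayscale numpy arrays), and helper names are assumptions of this sketch rather than details from the patent.

    import numpy as np

    def corr_coef(patch, template):
        # Normalized correlation coefficient between an image patch and the template.
        a = patch - patch.mean()
        b = template - template.mean()
        denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
        return (a * b).sum() / denom if denom > 0 else 0.0

    def match_near_contour_point(image, template, center_rc, search_radius=20):
        # Search only a small window around an active-contour point (row, col).
        th, tw = template.shape
        r0, c0 = center_rc
        best_score, best_rc = -1.0, center_rc
        for r in range(max(r0 - search_radius, 0), r0 + search_radius + 1):
            for c in range(max(c0 - search_radius, 0), c0 + search_radius + 1):
                patch = image[r:r + th, c:c + tw]
                if patch.shape != template.shape:
                    continue
                score = corr_coef(patch, template)
                if score > best_score:
                    best_score, best_rc = score, (r, c)
        return best_rc, best_score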


Active contour guided template matching is effective in that the active contour provides a good estimate of heading in two dimensions while the template matching provides a good estimate of feature position within two dimensions. Thus the overall system is much more robust to disturbances and can recover lost features faster. Initialization of the tracker can use a human interface or image search for specific features to seed the tracker.


Linear Model of Heading with Angle and Tension


The three to four degrees of freedom acquired from imaging (x and y position, orientation angle around the z axis, and a partial, or ambiguous, position and orientation in the z axis), when augmented with elongation measurements, provide full localization. One way of processing elongation measurements from a flexible device 100 is to fit a linear model to the relationship between the displacement and force on the elongation elements and the orientation of a sensor.


A simple linear model of a planar, two-pull wire flexible device is:

θ=a0x0+a1τ0+a2x1+a3τ1+a4   (9)

where ai represents the constants of the linear equation, xk represents wire displacements, τk represents wire tensions, and θ represents the estimated orientation of the flexible device 100. The ai can be determined experimentally using a least-squares fit.


The model may be improved if it is known that pairs of pull-wires are antagonistic, or located on opposite sides of the flexible device 100. In that case, the coefficients of the antagonistic displacements or tensions should be equal and opposite. Equation 9 may be simplified using this information to:

θ = a0(x0 − x1) + a1(τ0 − τ1) + a2   (10)


The primary assumption in Equation 10 is that bending will be entirely kinematic; there will be no torsion around the axis of the flexible device 100. In this way, it is possible to represent angle as a two-dimensional vector in the plane perpendicular to the flexible device 100 at the base of an articulating section: a pseudo-angle vector, θ⃗ = [θx, θy], where the magnitude of θ⃗ corresponds to the amount of rotation around the vector defined by unit(θ⃗). In this representation, it is possible to express θx and θy in two equations:

θx = a0(x0 − x2) + a1(x1 − x3) + a2(τ0 − τ2) + a3(τ1 − τ3) + a4   (11)
θy = b0(x0 − x2) + b1(x1 − x3) + b2(τ0 − τ2) + b3(τ1 − τ3) + b4   (12)


Note that each of θx and θy is expressed as a function of all pull-wire measurements because of the inability to decouple angle directions from the pull wire directions.


The linear model works relatively well, provided that the underlying system is linear. Thus, for systems incorporating metal pull-wires and limited friction, the linear model may be applicable. In systems with non-linear pull-wires or complicated friction characteristics (such as coil tubes), a non-linear model may be utilized. However, elongation elements cannot capture the roll angle from torsion. If there is limited torsion, however, a linear model is effective for computing the orientation of the distal tip of a flexible device 100 in relation to the proximal end.
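A minimal sketch of fitting the coefficients of Equations (11) and (12) by least squares is given below; the data layout (N samples of four wire displacements, four tensions, and a bench-measured bend angle) is an assumption of this sketch.

    import numpy as np

    def fit_linear_wire_model(x, tau, theta_xy):
        # x: Nx4 wire displacements (wires 0/2 and 1/3 antagonistic), tau: Nx4 tensions,
        # theta_xy: Nx2 measured pseudo-angle components (theta_x, theta_y).
        features = np.column_stack([
            x[:, 0] - x[:, 2],        # antagonistic displacement pair 0-2
            x[:, 1] - x[:, 3],        # antagonistic displacement pair 1-3
            tau[:, 0] - tau[:, 2],    # antagonistic tension pair 0-2
            tau[:, 1] - tau[:, 3],    # antagonistic tension pair 1-3
            np.ones(len(x)),          # constant offset term
        ])
        a, *_ = np.linalg.lstsq(features, theta_xy[:, 0], rcond=None)
        b, *_ = np.linalg.lstsq(features, theta_xy[:, 1], rcond=None)
        return a, b                   # coefficient vectors of Equations (11) and (12)

    def predict_heading(a, b, x_row, tau_row):
        f = np.array([x_row[0] - x_row[2], x_row[1] - x_row[3],
                      tau_row[0] - tau_row[2], tau_row[1] - tau_row[3], 1.0])
        return f @ a, f @ b           # (theta_x, theta_y)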


Filtering Image Information and Elongation Measurements


While image processing techniques can yield two dimensional position and orientation in a plane (plus some depth information) and a linear model of the flexible device 100 may estimate heading, the combination of the data yielded from the two techniques can produce a much stronger localization estimate. Indeed, the goal of localization estimation is to represent possible configurations of the flexible device 100 given the inputs of imaging data and elongation data and then correlate the two inputs to reduce the number of possibilities. While the elongation measurements map from a proximal device frame to a distal robot frame and the image processing maps from a global (image) frame GF to a distal robot frame, there is enough common information to provide a final mapping from the global frame to the distal device frame, thereby yielding an information filter using the image as one input and the elongation measurements as the second input.


Filtering techniques needed in this case require the ability to represent multi-modal hypotheses. An illustration for using multi-modal hypotheses is provided in FIG. 10. Because it is ambiguous whether the flexible device 100 is projected in or out of a given plane, two or more separated hypotheses are needed until more information is known. There are many ways of managing a filter with multiple hypotheses, including, but not limited to, the use of multiple Kalman filters, particle filters, and combinations of Gaussians.


In one exemplary configuration, particle filters are used. Particle filters maintain a sampling of a hypothesis space using particles, or discrete estimates. This sampling of a hypothesis space is updated using probability-based coefficients based on expected changes in state and future measurements. The result is a sequence of prediction-correction steps that causes the distribution to converge to the most likely hypothesis or hypotheses. Thus, particle filters are well suited to the problem of combining image localization information with elongation measurements. The basic steps of a particle filter technique are represented in FIG. 11.


More specifically, while a single set of static measurements SM (images and elongation displacements and tensions) can provide valuable information, additional information may be gleaned when the system moves. For example, referring to FIG. 11, initial measurements of a flexible catheter 100 may be represented by an initial sample distribution SD1. An input command IC may be directed to move the flexible catheter 100. Based on the initial data from the sample distribution SD1, an updated sample distribution SD2 is generated to predict the movement of the catheter. Next, the collected image data ID and elongation data ED may be combined together. The image data ID and elongation data ED may then be used to correct or update the sample distribution SD3 to provide a more accurate representation of the flexible device 100. The updated sample distribution SD3 may then be used as the initial data to repeat the process.
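A minimal particle filter skeleton for the prediction-correction loop of FIG. 11 is sketched below; the state vector, motion model, and noise levels are placeholders chosen for illustration, and predicted_meas_fn stands in for an unspecified sensor model mapping a hypothesis to expected image and elongation measurements.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 500
    particles = rng.normal(size=(N, 3))          # hypotheses, e.g., [x, y, roll]
    weights = np.full(N, 1.0 / N)

    def predict(particles, input_command, motion_noise=0.05):
        # Propagate each hypothesis through an assumed motion model, adding spread (SD1 -> SD2).
        return particles + input_command + rng.normal(scale=motion_noise, size=particles.shape)

    def correct(weights, particles, predicted_meas_fn, image_meas, elong_meas, sigma=0.1):
        # Re-weight hypotheses by how well they explain both measurement streams (-> SD3).
        z = np.concatenate([image_meas, elong_meas])
        errors = np.array([np.sum((predicted_meas_fn(p) - z) ** 2) for p in particles])
        w = weights * np.exp(-errors / (2 * sigma ** 2))
        return w / w.sum()

    def resample(particles, weights):
        # Draw a new particle set proportional to the weights before the next cycle.
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        return particles[idx], np.full(len(particles), 1.0 / len(particles))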


As a further example, if an initial roll orientation of the pull wires is known in relation to the tip of the flexible device 100, it is possible to recover all six degrees of freedom for localization, but if this roll is unknown or difficult to determine, only five degrees of freedom may be recovered. In contrast, if the system moves, the uncertainty about motion may be compensated for in a particle filter by spreading particles in the different possible roll directions. The next set of measurements will then reduce the possibilities related to the roll degree of freedom. This technique is similar to what is described above in connection with FIG. 8.


Bowden Coil Supported Tendons


Using fluoroscopy with any number of computer vision algorithms such as those described above, a planar position and heading of a flexible device 100 may be estimated. If the flexible device 100 is a catheter that has one or more articulating segments that are steered by tendons supported by Bowden Coils, information from these mechanical elements may also be used to improve a location estimation by convex optimization.


More specifically, in a vascular procedure there is a proximal portion of the vasculature through which a catheter 100 may be inserted without articulation while moving toward an anatomical location of interest. During this insertion, a robot can be used to position the tendons so that all tension is removed from the catheter to maintain maximum flexibility during this "passive" insertion. Once the anatomical location of interest has been reached, such as a calcified bifurcation for example, it may be necessary to begin controlled articulation of the catheter 100 to select a branch into which the catheter 100 may be directed. During controlled articulation, it can be helpful to know the location/orientation of the catheter 100 in the anatomy. This may be accomplished using the framework of FIG. 2, whereby the Remote Localization Measurement 110 is obtained from fluoroscopy processed with computer vision algorithms, and the "Elongation Measurement" 112 is obtained from tendon displacements and force.


Once passive insertion is completed, the tendon positions (while at zero force) can be recorded as the starting tendon positions for zero articulation. This will serve as a baseline for future tendon motions that articulate the catheter. The fluoroscopic image can, at that point, be processed to obtain a planar position and angle of the flexible device 100. The localization "signal processing" algorithm may be seeded by assuming that the articulating segment of the catheter 100 lies in the image plane and has zero roll. Therefore, at this first instant, a complete estimate for the orientation of the catheter tip, which would be in the plane of articulation with zero roll, may be determined. This technique assumes that the orientation of the distal termination of the Bowden Coils is coincident with the catheter tip to begin with.


The next step in the procedure is to apply controlled articulation to access the vessel of interest. At this point, force is applied to the tendons. The pitch, yaw, and roll of the distal tip relative to the reference frame of the distal termination of the Bowden Coils may be estimated based on the tendon displacements and forces using a method such as, for example, the linear model discussed above. If the initial starting point for the Bowden and tendon termination frames was incorrect, then the fluoroscopy measurements will not coincide with the prediction from the elongation measurements, generating an error signal. This error signal can be used in a convex optimization routine that will generate a new estimate for the Bowden frame that minimizes the error between the fluoroscopy measurement and the distal tip estimate from the elongation information. As the catheter 100 is exercised about its workspace, the estimate for the Bowden frame should converge so that the out-of-plane information may be presented to the localization consumer 116.


Localization with Electromagnetic Sensors and Elongation Measurements


With reference to FIG. 12, another proposed system 500 is illustrated that utilizes electromagnetic tracking as a remote localization measurement 504. The electromagnetic tracking is augmented with elongation measurements 501 by an estimator 506 to register a flexible device 100 to a target frame 508. In one exemplary configuration, 6 degrees of freedom of the flexible device 100 may be mapped to the global frame 508.


Electromagnetic (EM) trackers can produce up to six degrees of freedom of localization by embedding coils in the flexible device 100 and measuring induced currents from magnetic fields generated by transmitters. While it is possible for an EM tracker to provide all localization of a flexible device 100, in some situations it may be desirable to augment electromagnetic tracking with elongation measurements, as illustrated in FIG. 12.


Electromagnetic trackers are based on induced currents from a changing magnetic field. A current is induced in a receiver coil based on the change in magnetic flux through the coil. Typically, a transmitter turns on and off coils to produce magnetic fields with a known intensity and speed. Then, with knowledge of the size and orientation of the coil in relation to the flexible device 100, it is possible to estimate the location and/or orientation of the coil. For accuracy and more degrees of freedom, most EM systems use multiple transmitter coils and one or more receiver coils.


A single receiver coil coupled with a multi-coil transmitter can estimate five degrees of freedom, i.e., all localization except roll around the axis of the coil. Two or more receiver coils can estimate all six degrees of freedom if the coils are non-parallel, but often the accuracy of such systems is limited due to the size of the device 100. For instance, a typical flexible device requires very small coils if the device contains a center lumen, and the accuracy of the EM tracker is related to the amount of area enclosed within the coil. The result is that small coils are limited in range from the transmitter, and other equipment or ferromagnetic materials in the area can cause warping of the magnetic field and the localization measurements.


Single-Coil Roll Estimation


In the case where only a single coil is incorporated in the device (or multiple coils that are non-collocated and parallel) the EM tracking system will not be able to localize roll. In these situations, the elongation measurements would be useful to determine the roll. It is important to note that the roll of the flexible device cannot be recovered without motion in the flexible device 100 because only the correlation of EM tracker motion and elongation measurement motion allows recovery of roll.


Since the two sensing modalities, namely electromagnetic localization 504 and linear wire modeling 502, must be correlated with respect to motion, algorithms are needed that can maintain hypotheses of the current roll and update that distribution of hypotheses with each new piece of sensory information. Many types of filters or estimators 506 could be used to maintain this distribution of hypotheses, such as, for example, Kalman filters and particle filters. It is also possible to record recent motions of the flexible device 100 and solve for the current roll angle using regression techniques such as non-linear least squares. Visually, this technique is shown in FIG. 13.


As an example, a flexible device 100 with a single coil at the tip would allow all localization except for the roll axis. Moving the device (either manually or robotically) would cause both the EM sensor and elongation measurements to change, reflecting a change in position in each of their coordinate systems. Specifically, the EM sensor would have a new position and heading in the global coordinate system 610, while the elongation measurements would reflect a new heading and orientation in the proximal device coordinate system 612. If the device coordinate system was known in relation to the global coordinate system, the motion could be correlated between the two sensing modalities. For a given motion, there would be a specific roll angle that would be consistent with the motion. By combining multiple measurements over time, an estimator 606/506 could keep the roll angle updated, even as the roll angle changes as the flexible device 100 moves.
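One way to maintain and update the distribution of roll hypotheses described above is a simple grid (histogram) filter; the sketch below is illustrative only, and the device-frame motion model (elongation-predicted tip motion rotated about the device axis by the candidate roll) is an assumption.

    import numpy as np

    ROLL_GRID = np.linspace(-np.pi, np.pi, 360, endpoint=False)
    belief = np.full(ROLL_GRID.size, 1.0 / ROLL_GRID.size)     # uniform prior over roll

    def roll_rotation(theta):
        # Rotation about the device axis (taken here as z) by the candidate roll angle.
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

    def update_roll_belief(belief, motion_em, motion_elong, sigma=0.2, diffusion=1e-3):
        # One update: compare the EM-measured tip motion with the elongation-predicted
        # motion rotated by each roll hypothesis, then re-normalize the belief.
        belief = belief + diffusion                 # allow the roll estimate to drift slowly
        likelihood = np.array([
            np.exp(-np.sum((motion_em - roll_rotation(t) @ motion_elong) ** 2) / (2 * sigma ** 2))
            for t in ROLL_GRID
        ])
        belief = belief * likelihood
        return belief / belief.sum()

    def roll_estimate(belief):
        return ROLL_GRID[int(np.argmax(belief))]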


Multi-Coil Disturbance Rejection


The primary disturbances in EM tracker measurements are due to magnetic interference and electrical interference. Electrical interference causes noise in the measurements and often can be filtered out, but for faster motion, the elongation measurements could be used to provide higher frequency information than the EM tracker can provide on its own.


Another application for elongation measurements in conjunction with an EM tracker is in rejecting warping of the localization field due to changes in the magnetic field. Since EM trackers depend on magnetic fields in the area, and large bodies of ferromagnetic metal can warp this field, the types of equipment in an operating room can often disturb or change the localization as they move. For example, as a C-arm of a fluoroscopy system is rotated around a patient or moves closer to or further from the patient, the magnetic field (and hence the localization) is warped. By maintaining knowledge of elongation measurements, warping of the magnetic field may be detected even when the flexible device 100 is not moving. Thus, localization field warping may be compensated for in software.


Localization with Maps and Elongation Measurements


As discussed above, elongation measurements can provide information about the deformation of a flexible medical device. If the device is operating in a region of the anatomy whose walls are sufficiently rigid so as to constrain the device, then a map of that anatomy can also provide information about the possible shapes the device may assume. The map may be obtained from pre-operative imaging, for example, from a CT scan; or intra-operatively, for example, from multiple views of contrast-enhanced 2D images or by collecting surface points using EM sensors. By correlating elongation measurements with shapes allowed by the anatomical map at different locations, an estimate of the shape of the device and its position within the map may be determined.


An example use of elongation measurements is shown in FIG. 14. More specifically, FIG. 14A illustrates elongation measurements that continuously provide a 2 degree of freedom tip heading angle as a catheter 100 advances from time t1 to t4. FIG. 14B illustrates a map of the vessel in which the catheter is navigating. By finding the best match between the measured heading angles over time in FIG. 14A and the gradient of the center line of the map shown in FIG. 14B, the current tip position within the anatomy can be determined, as illustrated in FIG. 14C. Additionally, if the elongation measurements are noisy, imprecise, or reliable only within a plane, the shape estimate derived from the history of heading angles shown in FIG. 14A can be improved based on the corresponding gradients of the center line of the map shown in FIG. 14B.
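A minimal sketch of the matching step of FIGS. 14A-14C, in which the measured heading-angle history is slid along the center line of the anatomical map and the best-matching insertion offset is kept; the polyline map representation and window scoring are assumptions of this sketch.

    import numpy as np

    def centerline_headings(centerline_xy):
        # Heading angle of each segment of a polyline center line (Nx2 array of points).
        d = np.diff(centerline_xy, axis=0)
        return np.arctan2(d[:, 1], d[:, 0])

    def locate_tip(measured_headings, centerline_xy):
        # Return the center-line index where the measured heading history matches best.
        map_headings = centerline_headings(centerline_xy)
        measured_headings = np.asarray(measured_headings, dtype=float)
        n = len(measured_headings)
        best_idx, best_cost = None, np.inf
        for start in range(len(map_headings) - n + 1):
            window = map_headings[start:start + n]
            diff = np.angle(np.exp(1j * (window - measured_headings)))   # wrapped angle error
            cost = np.sum(diff ** 2)
            if cost < best_cost:
                best_idx, best_cost = start + n, cost
        return best_idx    # index of the center-line point at the estimated tip position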


It will be appreciated that the surgical instrument and methods described herein have broad applications. The foregoing embodiments were chosen and described in order to illustrate principles of the methods and apparatuses as well as some practical applications. The preceding description enables others skilled in the art to utilize methods and apparatuses in various embodiments and with various modifications as are suited to the particular use contemplated. In accordance with the provisions of the patent statutes, the principles and modes of operation of this disclosure have been explained and illustrated in exemplary embodiments.


It is intended that the scope of the present methods and apparatuses be defined by the following claims. However, it must be understood that this disclosure may be practiced otherwise than is specifically explained and illustrated without departing from its spirit or scope. It should be understood by those skilled in the art that various alternatives to the embodiments described herein may be employed in practicing the claims without departing from the spirit and scope as defined in the following claims. The scope of the disclosure should be determined, not with reference to the above description, but should instead be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. It is anticipated and intended that future developments will occur in the arts discussed herein, and that the disclosed systems and methods will be incorporated into such future examples. Furthermore, all terms used in the claims are intended to be given their broadest reasonable constructions and their ordinary meanings as understood by those skilled in the art unless an explicit indication to the contrary is made herein. In particular, use of the singular articles such as “a,” “the,” “said,” etc. should be read to recite one or more of the indicated elements unless a claim recites an explicit limitation to the contrary. It is intended that the following claims define the scope of the invention and that the method and apparatus within the scope of these claims and their equivalents be covered thereby. In sum, it should be understood that the invention is capable of modification and variation and is limited only by the following claims.

Claims
  • 1. A method of determining roll localization of a flexible elongate instrument as the flexible elongate instrument is navigated within a patient, wherein the flexible elongate instrument comprises at least one pullwire, the method comprising: obtaining (i) an initial position or heading measurement for the flexible elongate instrument from at least a first sensor of a plurality of sensors; obtaining (ii) a time history of measurements of motion in the at least one pullwire from at least a second sensor of the plurality of sensors as the flexible elongate instrument moves; obtaining (iii) a time history of subsequent position or heading measurements from a remote localization system for the flexible elongate instrument as the flexible elongate instrument moves, the subsequent position or heading measurements being obtained from at least the first sensor of the plurality of sensors after the initial position or heading measurement; and determining localization of a roll angle for the flexible elongate instrument from a comparison of (i) the initial position or heading measurement for the flexible elongate instrument, (ii) the time history of the measurements of motion in the at least one pullwire as the flexible elongate instrument moves, and (iii) the time history of the subsequent position or heading measurements from the remote localization system for the flexible elongate instrument as the flexible elongate instrument moves.
  • 2. The method of claim 1, wherein the first sensor includes a two-degree of freedom orientation sensor.
  • 3. The method of claim 2, wherein (i) the initial position or heading measurement for the flexible elongate instrument comprises measurements of pitch and yaw, and wherein (iii) the time history of the subsequent position or heading measurements from the remote localization system for the flexible elongate instrument as the flexible elongate instrument moves comprises measurements of pitch and yaw.
  • 4. The method of claim 1, wherein the first sensor includes a five-degree of freedom sensor.
  • 5. The method of claim 4, wherein the five-degree of freedom sensor is an electromagnetic sensor.
  • 6. The method of claim 1, wherein said obtaining (ii) the time history of the measurements of motion in the at least one pullwire as the flexible elongate instrument moves comprises receiving measurements of motion in multiple pullwires.
  • 7. The method of claim 1, wherein said obtaining (ii) the time history of the measurements of motion in the at least one pullwire as the flexible elongate instrument moves is determined based on a time history of pullwire position commands.
  • 8. The method of claim 1, wherein said obtaining (ii) the time history of the measurements of motion in the at least one pullwire as the flexible elongate instrument moves comprises receiving a measurement of pullwire displacement.
  • 9. The method of claim 1, wherein said obtaining (ii) the time history of the measurements of motion in the at least one pullwire as the flexible elongate instrument moves comprises receiving a measure of tension changes in the at least one pullwire.
  • 10. The method of claim 1, wherein said determining localization of the roll angle for the flexible elongate instrument comprises determining a value that satisfies an optimization equation.
  • 11. The method of claim 1, wherein said determining localization of the roll angle for the flexible elongate instrument comprises maintaining and filtering multiple hypotheses over successive motions of the flexible elongate instrument.
  • 12. The method of claim 11, wherein the multiple hypotheses are maintained and filtered using one or more maintenance and filter options selected from the group consisting of Kalman filters, particle filters, and Gaussian combinations.
  • 13. The method of claim 1, wherein said determining localization of the roll angle for the flexible elongate instrument results in full localization of the flexible elongate instrument in six degrees of freedom.
  • 14. The method of claim 1, further comprising receiving an input command to move the flexible elongate instrument prior to said obtaining (ii) the time history of the measurements of motion in the at least one pullwire as the flexible elongate instrument moves.
  • 15. The method of claim 1, wherein said obtaining (ii) the time history of the measurements of motion in the at least one pullwire as the flexible elongate instrument moves comprises predicting motion of the at least one pullwire based on an input command.
  • 16. The method of claim 1, further comprising predicting a movement of the flexible elongate instrument based on (ii) the time history of the measurements of motion in the at least one pullwire as the flexible elongate instrument moves.
  • 17. The method of claim 1, further comprising outputting the localization of the roll angle to a localization consumer.
  • 18. A method of determining roll localization of a flexible elongate instrument as the flexible elongate instrument is navigated within a patient, wherein the flexible elongate instrument comprises at least one pullwire, the method comprising: obtaining (i) an initial position or heading measurement for the flexible elongate instrument from at least a first sensor of a plurality of sensors; obtaining (ii) a time history of measurements of motion in the at least one pullwire from at least a second sensor of the plurality of sensors as the flexible elongate instrument moves; obtaining (iii) a time history of a plurality of subsequent position or heading measurements as the flexible elongate instrument moves, the plurality of subsequent position or heading measurements being obtained from at least the first sensor of the plurality of sensors after the initial position or heading measurement; and determining localization of a roll angle for the flexible elongate instrument from a comparison of (i) the initial position or heading measurement for the flexible elongate instrument, (ii) the time history of the measurements of motion in the at least one pullwire as the flexible elongate instrument moves, and (iii) the time history of the plurality of subsequent position or heading measurements for the flexible elongate instrument as the flexible elongate instrument moves.
  • 19. A method comprising: obtaining (i) an initial position or heading measurement for a flexible elongate instrument from at least a first sensor of a plurality of sensors; obtaining (ii) a time history of measurements of motion in one or more pullwires of the flexible elongate instrument from at least a second sensor of the plurality of sensors as the flexible elongate instrument moves within a patient; obtaining (iii) a time history of a plurality of subsequent position or heading measurements as the flexible elongate instrument moves within the patient, the plurality of subsequent position or heading measurements being obtained from at least the first sensor of the plurality of sensors after the initial position or heading measurement; and determining localization of a roll angle for the flexible elongate instrument from a comparison of (i) the initial position or heading measurement for the flexible elongate instrument, (ii) the time history of the measurements of motion in the one or more pullwires as the flexible elongate instrument moves within the patient, and (iii) the time history of the plurality of subsequent position or heading measurements for the flexible elongate instrument as the flexible elongate instrument moves within the patient.
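For illustration only, and without asserting any particular implementation of the claims, the following sketch shows one way the comparison recited in claims 1 and 10 might be carried out: a kinematic model converts the pullwire motion history into predicted heading increments in the instrument frame, and a grid search selects the roll angle that best aligns those predictions with the heading increments measured by the remote localization system. The mapping from pullwire motion to predicted increments, and all names and parameters below, are assumptions made for the example.

    import numpy as np

    def rot2d(phi):
        """Rotation of a (pitch, yaw) increment by a candidate roll angle phi."""
        c, s = np.cos(phi), np.sin(phi)
        return np.array([[c, -s], [s, c]])

    def estimate_roll(predicted_deltas, measured_deltas, candidates=None):
        """Return the roll angle (radians) minimizing the summed squared error between
        pullwire-predicted heading increments, rotated by the candidate roll, and the
        measured heading increments (one way of satisfying an optimization equation)."""
        predicted_deltas = np.asarray(predicted_deltas)   # (T, 2) model-predicted (pitch, yaw) steps
        measured_deltas = np.asarray(measured_deltas)     # (T, 2) measured (pitch, yaw) steps
        if candidates is None:
            candidates = np.linspace(-np.pi, np.pi, 721)  # 0.5-degree grid search
        errors = [np.sum((predicted_deltas @ rot2d(phi).T - measured_deltas) ** 2)
                  for phi in candidates]
        return float(candidates[int(np.argmin(errors))])

In place of a single grid search, claims 11 and 12 contemplate maintaining several roll hypotheses over successive motions and filtering them, for example with Kalman or particle filters, which amounts to updating a weight for each candidate roll angle as new heading measurements arrive.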
CROSS-REFERENCE TO RELATED APPLICATION

This application is a Continuation of application Ser. No. 13/833,733, filed Mar. 15, 2013, issued as U.S. Pat. No. 9,271,663 on Mar. 1, 2016, and entitled “FLEXIBLE INSTRUMENT LOCALIZATION FROM BOTH REMOTE AND ELONGATION SENSORS,” the entirety of which is herein incorporated by reference.

Related Publications (1)
Number Date Country
20160228032 A1 Aug 2016 US
Continuations (1)
Number Date Country
Parent 13833733 Mar 2013 US
Child 15056652 US