a. Field of the Invention
This invention relates to a robotic catheter system and method for automated control of a catheter and related components. In particular, the instant invention relates to a robotic catheter system for manipulating a catheter and related components, for example, for diagnostic, therapeutic, mapping and ablative procedures.
b. Background Art
Electrophysiology catheters are used in a variety of diagnostic and/or therapeutic medical procedures to correct conditions such as atrial arrhythmia, including for example, ectopic atrial tachycardia, atrial fibrillation, and atrial flutter. Arrhythmia can create a variety of dangerous conditions including irregular heart rates, loss of synchronous atrioventricular contractions and stasis of blood flow which can lead to a variety of ailments and even death.
Typically in a procedure, a catheter is manipulated through a patient's vasculature to, for example, a patient's heart, and carries one or more electrodes which may be used for mapping, ablation, diagnosis, or other treatments. Once at the intended site, treatment may include radio frequency (RF) ablation, cryoablation, lasers, chemicals, high-intensity focused ultrasound, etc. An ablation catheter imparts such ablative energy to cardiac tissue to create a lesion in the cardiac tissue. This lesion disrupts undesirable electrical pathways and thereby limits or prevents stray electrical signals that lead to arrhythmias. As is readily apparent, such treatment requires precise control of the catheter during manipulation to and at the treatment site, which is invariably a function of a user's skill level.
The inventors herein have thus recognized a need for a system and method for precise and dynamic automated control of a catheter and its related components, for example, for diagnostic, therapeutic, mapping and ablative procedures, that will minimize and/or eliminate procedural variability due to a user's skill level. The inventors herein have also recognized a need for a system and method for performing user-specified procedures at the patient site or from a remote location.
A robotic system for manipulating a catheter having a plurality of steering wires disposed longitudinally within a length of the catheter includes a user interface configured to display a view of an anatomical model and to receive one or more user inputs; a catheter manipulator assembly configured to linearly actuate one or more control members of a catheter; and a robotic controller configured to provide a view of an anatomical model to the user interface, accept one or more user inputs from the user interface, register the one or more user inputs to a coordinate system associated with the anatomical model, compute one or more actuator commands from the one or more registered inputs, and cause the catheter manipulator assembly to linearly actuate one or more control members of a catheter in accordance with the computed actuator commands. In an embodiment, the actuator commands may be computed, for example, by calculating an inverse Jacobian matrix.
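The overall control flow just described may be sketched in simplified form. The following Python fragment is only an illustrative outline, not the system's implementation; the class, its collaborators, and the inverse-Jacobian step are hypothetical placeholders for the components discussed in detail below.

```python
import numpy as np

class RoboticController:
    """Illustrative sketch of the control loop described above (hypothetical API)."""

    def __init__(self, model_registration, jacobian_solver, manipulator):
        self.registration = model_registration    # maps UI input -> model coordinates
        self.jacobian_solver = jacobian_solver    # returns inverse Jacobian at the current pose
        self.manipulator = manipulator            # drives the linear actuators

    def process_input(self, ui_input):
        # 1. Register the user input to the anatomical-model coordinate system.
        desired_motion = self.registration.to_model_coords(ui_input)

        # 2. Compute actuator (steering-wire) commands, e.g., via an inverse Jacobian.
        J_inv = self.jacobian_solver.inverse_at_current_pose()
        wire_commands = J_inv @ desired_motion

        # 3. Cause the manipulator assembly to linearly actuate the control members.
        self.manipulator.actuate(wire_commands)
```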
In an embodiment, the robotic system may additionally include a positioning system configured to provide an indication of a position of the catheter to the controller. The user interface may include an input device and a display device, and in an embodiment, may be a multi-touch display interface that can receive one or more touch-based inputs from a user. The display may include selectable on-screen menu buttons to activate functions such as pan, rotate, zoom, or direct catheter movement, or to place lesion markers, waypoints, virtual sensors, or automated movement targets, or to draw movement lines.
In an embodiment, the input device may include a detectable glove or stylus that can be located in three dimensional space using a magnetic field, electrostatic field, optical positioning system, or the like. Alternatively, the input device may be designed similar to a traditional catheter handle.
Referring now to the drawings wherein like reference numerals are used to identify identical components in the various views, an embodiment of robotic catheter system 10 (described in detail below), also referred to as “the system,” may be likened to power steering for a catheter system. The system may be used, for example, to manipulate the location and orientation of catheters and sheaths in a heart chamber or in another body cavity. As shown in
An embodiment of robotic catheter system 10 may involve automated catheter movement. A user, such as an EP, could identify locations (potentially forming a path) on a rendered computer model of the cardiac anatomy. The system can be configured to relate those digitally selected points to positions within a patient's actual/physical anatomy, and may command and control the movement of a catheter to defined positions. Once in position, either the user or system could then perform the desired treatment or therapy—which may further be in accordance with a defined algorithm. This system could enable full robotic control by using optimized path planning routines together with closed-loop position control. Furthermore, the system could automate certain “best-practices,” such as pulling the catheter across the surface, or making contact at an oblique angle.
Referring to
The input control system 100 may generally allow a user to control the movement and advancement of both the catheter and sheath. Generally, several types of input devices may be employed, including, without limitation, instrumented traditional catheter handle controls, oversized catheter models, instrumented user-wearable gloves, touch screen display monitors, 2-D input devices, 3-D input devices, spatially detected styluses, and traditional joysticks. The input device may be configured to directly control the movement of the catheter and sheath, or may be configured to, for example, manipulate a target or cursor on an associated display. In embodiments, for example and without limitation, the joystick may be spring centering so that any movement from the center position causes an incremental movement of the actual catheter tip, or the joystick may work in absolute terms. Haptic feedback may also be incorporated to provide a user with a sense of when contact has been made.
Referring to
Many additional features may be included with embodiments of the system to, for example, improve the accuracy or effectiveness of the system. Such features may include closed-loop feedback using an EnSite NavX system or gMPS system 14 for creating realistic cardiac chamber geometries or models, displaying activation timing and voltage data to identify arrhythmias, and guiding precise catheter movement, and/or optical force transducers; active tensioning of "passive" steering wires to reduce the system response time; cumulative ablation while the tip is following a front-to-back ironing motion; and/or reactive/resistive impedance monitoring.
Referring to
Visualization system 12 may provide a user with real-time or near-real-time positioning information concerning the catheter tip. In an exemplary embodiment, system 12 may include an EnSite NavX monitor 16 or other similar monitor for displaying cardiac chamber geometries or models, displaying activation timing and voltage data to identify arrhythmias, and facilitating guidance of catheter movement. A fluoroscopy monitor 18 may be provided for displaying a real-time x-ray image or for assisting a physician with catheter movement. Additional exemplary displays may include ICE and EP Pruka displays 20, 22, respectively.
Referring to
EnSite NavX system 14 (described in detail in U.S. Pat. No. 7,263,397, titled “Method and Apparatus for Catheter Navigation and Location and Mapping in the Heart,” incorporated by reference in its entirety) may be provided for creating realistic cardiac chamber geometries or models, displaying activation timing and voltage data to identify arrhythmias, and guiding precise catheter movement. EnSite NavX system 14 may collect electrical position data from catheters and use this information to track or navigate their movement and construct three-dimensional (3-D) models of the chamber.
In an embodiment, position data from the catheter may be obtained using a gMPS system, commercially available from Mediguide Ltd., and generally shown and described in U.S. Pat. No. 7,386,339 entitled “Medical Imaging and Navigation System,” which is incorporated herein by reference in its entirety.
Referring to
As generally shown in
Referring to
As generally shown in
With a configuration of robotic catheter system 10, such as shown in
Referring to
As generally shown in
As shown in
Referring to
Manipulator assembly 302 may be disposed in a vertical configuration (see
Referring to
Referring to
Referring to
As briefly discussed above, robotic catheter system 10 may include one or more cartridges 400, with manipulator 302 including at least two cartridges 402, 404, each of which may be respectively designed to control the distal movement of either the catheter or the sheath. With respect to catheter cartridge 402, catheter 406 may be substantially connected or affixed to cartridge 402, so that advancement of cartridge 402 correspondingly advances catheter 406, and retraction of the cartridge retracts the catheter. As further shown in
For some embodiments, the catheter and sheath cartridge can be designed to be substantially similar, and in that context a reference to either may relate to both. For example, as shown in
Referring to
In an embodiment, a user (e.g. an EP) may first manually position catheter 406 and sheath 410 (with catheter 406 inserted in sheath 410) within the vasculature of a patient. Once the devices are roughly positioned in relation to the heart, the user may then engage or connect (e.g., “snap-in”) the catheter cartridge into place on interconnecting/interlocking bases 308, 310 of manipulator assembly 302, for example, by inserting the locking/locating pins 432, 434 of the cartridges into mating holes 360, 364 of respective base 308, 310. When the cartridge is interconnected with the base, each of the plurality of fingers 316, 318, 320 or 322 may fit into recesses formed between the distal edge of slider blocks 412, 414, 416, 418 and a lower portion of the cartridge housing. Such recesses are shown in, for example,
Each finger may be designed to be actuated in a proximal direction to correspondingly push each respective slider block. The slider block can be configured to force the finger to self-center on its geometry when contact is first made. Such a centering feature may be facilitated by the contact surface of the slider block. For example, as shown in
With sufficiently rigid coupling between each slider block and a corresponding steering wire, pushing a slider block in a proximal direction may cause an attached steering wire to tension and thus laterally deflect the distal end of the catheter and sheath 406, 410. Moreover, in such an embodiment, because there is no rigid connection between each finger and its associated slider block, the manipulator assembly 302 cannot pull the steering wire in a forward direction. That is, when each block is actuated, it is only possible to tension the steering wire. Furthermore, because the push-actuation of each slider block occurs near that block's bottom surface, a moment may be imposed on the block. Because such a moment may increase the likelihood of the block binding during travel, the length of the block may be optimized to reduce or minimize contact forces between the block and the cartridge housing.
The aforementioned electrical handshake between manipulation bases 308, 310 and catheter and sheath cartridges 402, 404 will be described briefly.
Robotic catheter system 10 may be useful for a variety of procedures and in connection with a variety of tools and/or catheters. Such tools and/or catheters may include, without limitation, spiral catheters, ablation catheters, mapping catheters, balloon catheters, needle/dilator tools, cutting tools, cauterizing tools, and/or gripping tools. The system may additionally include a means of identifying the nature and/or type of catheter/tool cartridge that is installed for use, and/or position or connection related information. The system may automatically access/obtain additional information about the cartridge, such as, without limitation, its creation date, serial number, sterilization date, prior uses, etc.
Further, some embodiments of the system may include an ability to "read" or detect the type or nature of the connected cartridge through the use of memory included with the disposable cartridge together with some data/signal transmission means. By way of example, each cartridge may contain a chip (e.g., an EEPROM chip) that can be electrically interfaced by the manipulator head. Such a chip could, for instance, be programmed during the manufacturing process and may electronically store various data, such as the make, model, serial number, creation date, and/or other special features associated with the cartridge or tool. Additionally, the chip may contain other worthwhile information, such as an indication of previous use, catheter-specific calibration or model data, and/or any other information that may relate to the safety or performance of the particular device.
In an embodiment, upon interconnecting the cartridge (e.g. 402, 404) with the manipulator head (e.g. 302), a detection means, such as an optical or magnetic sensor, may initially detect the presence of the cartridge. Once presence is detected, the manipulator may energize a chip and initiate data/signal retrieval. Such retrieved data/signal may then be used by the system to control or alter various features and/or displays based on the type of device and/or information provided. While one embodiment may use a chip (e.g., EEPROM), due to its design flexibility, another embodiment may include a wireless transmission device, such as an RFID, which may be employed to facilitate the data storage/transfer instead of, or in addition to a chip.
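By way of illustration only, the cartridge data described above might be organized as a simple record such as the following Python sketch; the field names and layout are assumptions, not a defined EEPROM/RFID format.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CartridgeRecord:
    """Hypothetical layout of data stored on a cartridge EEPROM or RFID tag."""
    make: str
    model: str
    serial_number: str
    creation_date: str                        # e.g., an ISO-8601 date string
    sterilization_date: Optional[str] = None
    prior_uses: int = 0
    calibration_data: Optional[bytes] = None  # catheter-specific calibration/model data

# Example: the kind of record the manipulator head might read after detecting a cartridge.
cartridge = CartridgeRecord(make="ExampleCo", model="Ablation-4W",
                            serial_number="SN-0001", creation_date="2010-06-15")
```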
Referring to
Specifically, referring to
In an embodiment, as generally illustrated in
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Based on the discussion above, the aforementioned articulated support structures may hold manipulator assembly 300 in a position to better facilitate treatment or therapy (e.g., adjacent the femoral vein/artery to promote catheterization). Such support structures discussed in reference to
Referring to
As generally shown in
Mimicking traditional, manual catheter control, an embodiment of robotic catheter system 10 may be configured such that longitudinally translating the input handle may cause a respective longitudinal translation of the catheter/sheath distal tip. However, unlike the traditional, manual catheter, the automated catheter system would generally effectuate this translation by advancing or retracting the cartridge. Further, robotic catheter system 10 can be configured so that the rotation of either handle causes a virtual rotation of the catheter/sheath tip, and movement of a thumb tab causes a deflection in the current deflection plane.
In an embodiment of user interface device 1000, any or all motion controls of the device can be associated with/employ a spring centering feature that returns each control element to a set or “home” location after the element is released. Such a centering feature can allow for highly precise movement corrections of the distal tip by registering various input movements as incremental movement from the “home” location rather than by registering movement entirely in absolute terms.
In an embodiment, instead of thumb tab-type controls, user interface device 1000 may additionally include or substitute displacement dial controls. Furthermore, to suit the desires of the user, an embodiment of such a user interface device may permit the handles to be fully interchangeable so that various combinations of controls (e.g., dial and thumb tab handles) can be used for catheter/sheath input. In another embodiment, user interface device 1000 may further include safety buttons (e.g. “dead-man switches”) that must be pressed for any joystick movement to be registered by the system. This design would prevent inadvertent motion from affecting the position of the actual catheter tip. In yet another embodiment, user interface device 1000 may further include a virtual reality surgical system, wherein the physician could be positioned within a cardiac environment (see
As generally shown in
In other embodiments, the device may be constructed without a centering mechanism, where the absolute position of the device might instead be used to control the absolute position of the actual sheath and catheter. With such an absolute approach, the input device's physical limitations may be designed to mimic an actual catheter's and sheath's physical limitations (e.g., movement restrictions based on bend radius, catheter retracted into sheath, etc.).
To record user input, each degree of movement can generally be instrumented with either a potentiometer or motor/encoder. If a motor/encoder is used, the system may also provide haptic feedback upon certain events—such as a “feel” if the catheter were to contact a virtual wall. An embodiment of this invention may also include an ablation activation button on the distal end of the device.
As generally illustrated in
In an embodiment where the user input device 1000 includes a spatially detected glove, such as generally illustrated in
In an embodiment, the user may be presented with a three dimensional visualization of the catheter and/or heart anatomy, such as through holographic imagery. Using the spatially detectable stylus or glove, the user may manipulate or interact with a visualization of the catheter, for instance, by moving the actual device within the holographic image. In such an embodiment, the real-time catheter position may be configured to track the three dimensional position of the user's index finger or stylus. Alternatively, as illustrated in
The glove or stylus input device may be locatable in 3-D space through the use of a positioning system employing a magnetic field, an electrostatic field, or through the use of an optical positioning system. These systems may include, for example, the EnSite NavX system from St. Jude Medical, the gMPS system from Mediguide, the CARTO system from Biosense Webster, the Aurora system from Northern Digital, or the RMT system from Boston Scientific.
In an embodiment, the positioning system may be implemented within a liquid tank (e.g., a water tank), where field generators (such as those associated with the St. Jude Medical NavX™ control system) are externally attached. For such embodiments, an instrumented glove or stylus may extend into the tank while, for example, the user's finger (e.g., index finger) or a stylus may be instrumented with electrodes configured to measure parameters of the electric field. In an embodiment, the construction and/or placement of the sensors (e.g., NavX-type electrodes) may be similar to sensors on the distal portion of the catheter.
In another embodiment, the positioning system may be implemented using a magnetic positioning system. As generally illustrated in
A user interface device in the form of a touch screen monitor will now be discussed with reference to
An embodiment of user interface device may include a multi-touch display interface 1100 and related hardware and software that would allow a user to physically interact with the robotic catheter system without the need for a keyboard, mouse, or other input device. Such a display may be configured to recognize multiple finger or hand contacts with or along the screen, and would allow a user to directly interface with the objects, anatomy, or devices displayed on the screen.
As shown in
In an exemplary approach, when in rotate mode, a user may rotate a 3D cardiac geometry 1104 by touching the screen with a finger and dragging across the screen to spin the 3D model about an axis orthogonal to both the surface normal of the screen and the direction of the dragging motion. When in pan mode, a dragging motion across the screen may physically move the model across the screen. Additionally, the zoom may be controlled, for example, through a pinching (zoom out) or expanding (zoom in) motion of multiple fingers, or through the use of an on-screen slider.
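One plausible way to realize the rotate gesture described above is to form the rotation axis as the cross product of the screen normal and the drag direction and apply Rodrigues' formula; the numpy sketch below is illustrative, and the rotation-per-pixel gain is an assumed parameter.

```python
import numpy as np

def rotation_from_drag(drag_vec_px, gain_rad_per_px=0.01):
    """Return a 3x3 rotation matrix for a touch-drag rotate gesture.

    The rotation axis is orthogonal to both the screen normal (+z) and the
    in-plane drag direction, as described above.  gain_rad_per_px is an
    assumed scale factor relating drag length to rotation angle.
    """
    screen_normal = np.array([0.0, 0.0, 1.0])
    drag3 = np.array([drag_vec_px[0], drag_vec_px[1], 0.0])
    angle = gain_rad_per_px * np.linalg.norm(drag3)
    if angle == 0.0:
        return np.eye(3)
    axis = np.cross(screen_normal, drag3)
    axis /= np.linalg.norm(axis)
    # Rodrigues' rotation formula: R = I + sin(a) K + (1 - cos(a)) K^2.
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)
```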
As shown in
In an embodiment, as generally illustrated in
Once a user taps the screen in the desired location of the target point, the software may be configured to place the target point 1120 directly on the surface of the model 1104 as displayed. In such a configuration, the system may know the relative depth of each pixel or primitive on the display. By touching a displayed element, the system may map the target point directly to the anatomical surface. The software may further allow the user to specify a fixed or minimum distance from the displayed anatomical surface where the point should be located. For example, if the user specifies a distance of 10 mm prior to selecting a point, the software may locate the target point 10 mm off of the selected surface in a direction normal to the screen/viewing plane. Alternatively, the software may generate a virtual surface located 10 mm interior to the surface of the anatomical model and then map the point to the virtual surface (i.e., 10 mm normal to the anatomical model surface). In another embodiment, as shown in
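A minimal sketch of the depth-based target placement described above, assuming the rendering pipeline exposes a per-pixel depth buffer and an unprojection routine (both hypothetical helpers here):

```python
import numpy as np

def place_target(tap_px, depth_buffer, unproject, view_normal, offset_mm=0.0):
    """Map a screen tap to a target point on (or offset from) the model surface.

    depth_buffer holds the stored depth for each displayed pixel, unproject
    converts (pixel, depth) to model coordinates, and view_normal is the unit
    vector normal to the screen/viewing plane.  All three are assumed to be
    provided by the rendering pipeline.
    """
    u, v = tap_px
    depth = depth_buffer[v, u]                 # stored depth of the displayed element
    surface_point = unproject(u, v, depth)     # target point directly on the model surface
    # Optionally locate the point a fixed distance from the surface,
    # measured along the direction normal to the viewing plane.
    return surface_point + offset_mm * np.asarray(view_normal)
```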
Referring back to
In addition to setting individual target points, as illustrated in
In an embodiment, as shown in
In another embodiment, the user input may be obtained through a spatial operating environment that is configured to monitor hand or body gestures without any required direct contact with a screen or device. The interface may operate together with either a two dimensional display or a three dimensional holographic image, and may allow the user to selectively manipulate, for example, the catheter or sheath, the cardiac model, various markers or waypoints within the model, or the positioning of other informational windows or displays. Such a gestural interface may include, for example the “G-Speak” Spatial Operating Environment, developed by Oblong Industries, Inc.
Haptic feedback based on actual sensed forces on a distal catheter tip will now be discussed.
An embodiment of user interface device 1000 that incorporates movement of a physical input device may include touch-type feedback, often referred to as "haptic feedback." This type of feedback may involve forces generated by a motor connected to user interface device 1000 that the user can feel while holding the device. These forces may be based on actual or computed forces being applied to a physical catheter tip. In an embodiment, the unit may sense forces using a force and/or impedance sensor in the tip of the catheter and generate a corresponding force on an input handle. In other embodiments, the forces can be based on a computed geometric model of the cardiac anatomy, such as that associated with the St. Jude Medical, Inc. EnSite™ system.
In an embodiment, haptic feedback may be conveyed to a user by employing an input device instrumented with motors/encoders on each degree of freedom. Though the motors may operate in a passive mode for a majority of the procedure, if feedback is required by the system, the motors may be energized to produce a torque on the input controls capable of retarding the user's movement in particular degrees of freedom. While in a passive mode, the motor typically will not produce a significant retarding force; however, the attached encoder may record the input for use in visualization and control routines.
Prior to a haptic response being conveyed, the system may first calculate the appropriateness and magnitude of such a force. In an embodiment, such a force may attempt to replicate a contact between an actual catheter tip and a portion of the cardiac anatomy. In an embodiment, such contact may be either directly sensed through one or more force sensors on the distal tip of the catheter/sheath, or may be calculated based on a virtual catheter/sheath position within a rendered geometric computer model.
In an embodiment where haptic forces are based on actual catheter contact, the catheter's distal tip may be instrumented with a force sensor configured to provide an indication when physical contact is detected. Such a force sensor may include, without limitation, load cells, shape memory alloy based force sensors, piezoelectric force sensors, strain gauges, or optical-based or acoustic-based force sensors. One example of a contact sensor that may be used is described in detail in U.S. patent application Ser. No. 11/941,093 entitled “Optic-Based Contact Sensing Assembly and System,” which is incorporated by reference in its entirety. In other embodiments, a contact or proximity sensor may be used, such as those associated with detected electrical impedance. One example of a proximity sensor that may be used is described in detail in U.S. patent application Ser. No. 12/465,337, entitled “System and Method for Assessing the Proximity of an Electrode to Tissue in a Body,” which is incorporated by reference in its entirety.
In an embodiment employing actual contact sensing, the sensor may generate a signal representative of the actual physical or electrical contact. Based on the magnitude and direction of the sensed force, as well as the current position of the input device, the system may produce a corresponding torque or force on the input device that may resist further movement through the obstructing anatomy. The system can be configured so that the user would feel this reaction force as if the input device was impacting a “virtual wall.”
Based on the system calibration, the resistive force the user feels at the input joystick could be more or less "spongy." That is, the system could be tuned so that a tip impact with the cardiac wall feels either like a rigid impact with an immovable object or like contact with a soft sponge.
Haptic feedback based on virtual catheter tip proximity to virtual cardiac anatomy will now be discussed.
As discussed above, in an embodiment, haptic feedback forces may be conveyed to a user based on contact forces computed from the proximity between a virtual catheter model and a computer-generated representation of the cardiac anatomy. In an embodiment, the positioning of the virtual catheter model may be obtained through an impedance-based position detection system (e.g., such as associated with St. Jude Medical's NavX™ system), or through a magnetic-based position detection system (e.g., such as associated with Mediguide's gMPS positioning system). Further, such a computer-generated representation of the cardiac anatomy may be derived from prior CT or MRI data, or from a model (such as that created or maintained by St. Jude Medical's EnSite™ system).
With such embodiments/configurations, a user may have a previously obtained geometric model of the cardiac anatomy. This model may be visible to an electrophysiologist user through a visualization system (such as St. Jude Medical's EnSite™ system). This model may be assembled using, for example, previously captured CT or MRI images, and/or “skinned” geometry obtained by sensing actual position data of a mapping catheter (e.g., with St Jude Medical's NavX™ system or the gMPS system). Once the model is assembled, a catheter locating system (e.g., St. Jude Medical's NavX™ System or the gMPS system) could then place the working catheter inside the computed geometric model. In an embodiment, as the catheter is moved within the geometry, a haptic system could be used to compare the positioning of the catheter to that of the generated geometry. If the catheter is perceived to be in contact with the generated geometry, a resistive force could then be generated in connection with the associated input device—e.g., using attached motors.
In an embodiment, the geometric model may be registered to a repeating physiological signal such as, for example, the cardiac rhythm or respiration rhythm. As this signal is sensed in the actual procedure, the model geometry may dynamically change. This may then enable computed haptic feedback to provide a more accurate representation of the contact actually occurring within the patient.
A displayed orientation vector within the visualization software to show direction of planar, thumb switch deflection will now be discussed.
With some traditional, non-robotic catheter procedures, a thumb switch on the catheter handle causes catheter deflection by tensioning a corresponding steering wire. Such a switch typically allows the distal tip of a catheter to laterally deflect in one of two opposing directions in a single plane. If deflection is desired in more than one plane, a user commonly must physically rotate the catheter about its longitudinal axis to cause the deflection plane to rotate.
In an embodiment of robotic catheter system 10 incorporating instrumented traditional catheter handle input controls, as described above, an indicator may be provided within a computer visualization to give the user an idea of which direction the distal tip will deflect if the deflection thumb switch is actuated. In an embodiment, such a representation (e.g., deflection plane vector) may include an arrow superimposed near the tip of the virtual representation of a physical catheter. Such an arrow may indicate the direction the catheter would move if the thumb switch were pulled toward the user. Similarly, pushing a control (e.g., thumb switch) may cause the catheter to deflect in the opposite, arrow tail direction. The user may then cause a rotation of this vector by rotating an input handle, which may then be sensed by the attached motor/encoder or potentiometer. Similarly, a deflection vector could be associated with sheath visualization.
The general mechanics of the catheter and sheath movement will now be described with reference to
As generally illustrated in
As generally depicted in the illustrated embodiment, proximal portions of the steering wires 1218, 1220 may be respectively connected to control members 1228, 1230. Control members 1228, 1230 may be, for example, slider blocks such as those mentioned above, and may be used to interface or operatively connect control devices, such as the fingers of the manipulator assembly, to the steering wires 1218, 1220. For illustrative purposes, as generally shown in
As generally shown in
As further illustrated in
To cause catheter 1210 to move or retract back to an undeflected state along longitudinal axis L, a user could, for example, actively translate control member 1230 in a proximal direction. Such a motion could cause the distal portion 1212 to rotate and deflect toward steering wire 1220, while control member 1228 would be reactively translated in a distal direction. In an embodiment, due to memory effects of catheter 1210, such as caused by plastic deformation, upon restoring catheter 1210 to an undeflected state along longitudinal axis L, control members 1228, 1230 may not necessarily return to their original positions (e.g., on datum X).
It is noted that while
As generally illustrated in
In an embodiment, as illustrated in
In the embodiment illustrated by
Similar to the cartridges described above with respect to
Active tensioning of “passive” steering wires will now be briefly discussed with reference to
As described above, an embodiment of robotic catheter system 10 may provide for tensioning of the steering wires (e.g., by moving fingers/slider blocks in a proximal direction). As generally shown in
In an embodiment, to help prevent fingers 1460 from impeding passive steering wires 1456, each finger may be retracted to a “home” position when it is not controllably tensioning a steering wire. Such a return-to-home configuration can, at least in part, help ensure that each finger 1460 will not obstruct the distal motion of passive slider blocks 1458. It may be desirable, however, for such a configuration to include features to address issues associated with reduced system response time and potential step-wise distal tip motion, attributable to the time needed to move fingers 1460 back into contact with slider-blocks 1458 when the passive slider blocks must be tensioned to cause a desired movement.
It may be desirable, for example during a medical procedure, for the distal portion of a catheter to be capable of prompt dynamic, back and forth movements, such as those illustrated in
Pre-defined catheter “speed zones” will now be briefly discussed with reference to
To aid users in navigating a catheter safely, yet quickly, around a cardiac chamber, robotic catheter system 10 may employ pre-defined "speed zones" to optimize the movement of the catheter tip. In an embodiment of the robotic catheter system, the user may have the ability to configure the maximum allowable catheter speed, or alternatively to configure the scaling factor that relates the user input to the catheter motion, as a function of the orthogonal distance between the catheter and the nearest cardiac tissue. As described in relation to
If desired, the system may include a corresponding haptic response in the input joystick. For zones A, B, and C, such a haptic response may involve changing the dampening force on the handle (e.g., as the tip moves closer to the wall, the user might feel as if the tip is caught in an increasingly dense sludge). Once the tip starts to cross the barrier between zone C and zone D, this feeling may be accompanied by a force that prevents inadvertent continued motion.
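The zone-based speed limiting and haptic damping described above might be expressed as a simple function of the tip-to-tissue distance; the zone boundaries and values below are illustrative assumptions only.

```python
def speed_and_damping(distance_to_wall_mm):
    """Return (max_tip_speed_mm_s, haptic_damping) for a given wall distance.

    Zone boundaries and values are assumed for illustration: far from tissue
    (zone A) permits fast motion with little damping; approaching the wall
    (zones B and C) progressively slows the tip and increases the damping
    ("sludge") feel; at or inside the wall boundary (zone D) motion is stopped
    and a large resistive force is commanded.
    """
    if distance_to_wall_mm > 20.0:      # zone A
        return 30.0, 0.1
    elif distance_to_wall_mm > 10.0:    # zone B
        return 15.0, 0.4
    elif distance_to_wall_mm > 2.0:     # zone C
        return 5.0, 0.8
    else:                               # zone D: at or through the tissue boundary
        return 0.0, 1.0
```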
User guided robotic control will now be discussed with reference to
As schematically represented in
In an embodiment of the user interface 1510, the one or more input devices 1520 may be configured to receive input from a physician corresponding to both a continuous movement 1522 of the input device 1520 and a discrete actuation 1524 of the input device 1520. The user interface may further provide the physician with a means of selecting a particular viewing perspective 1526 of a three dimensional anatomical model 1542. As used herein, a continuous movement input is one that can be represented on a continuous spectrum, such as the movement of a joystick, mouse, or slider. While it is understood that current digital computing operates in discrete increments, the term “continuous movement” as herein used, is intended to only distinguish from a discrete actuation, such as a button press, which must be represented as a finite state. The input device 1520 is configured to provide the various forms of user input from the physician to the controller 1540 for processing.
The user interface 1510 may further include one or more visual displays 1510 that are capable of displaying one or more views 1532 of an anatomical model 1542. The display 1534 may further be configured to display one or more secondary features 1534 either together with, or apart from the displayed view of the model 1532. In an embodiment, secondary features may include markers, targets, sliders, menu buttons, patient vital data, or other useful visual information that may not be strictly representative of the anatomical model 1542. In an embodiment, the displayed view of the anatomical model may be selected 1526 by the user via the input device 1520.
As will be described in greater detail below, the controller 1540 may be configured to maintain a three dimensional anatomical model 1542 of the cardiac geometry, and execute both control logic 1544 and display logic 1546. In an embodiment, the control logic 1544 is configured to relate intended user actions into a controlled physical movement of the catheter and sheath. Such control logic may include the use of, for example, control algorithms, forward and/or inverse kinematic computations, and real-time feedback from the catheter, manipulator, or positioning system. In an embodiment, the display logic 1546 is configured to use three dimensional view rotation, translation, and/or projection techniques to present the user with a displayed representation 1532 of the anatomical model 1542 corresponding to the provided view selection input 1526. The display logic 1546 may further be configured to relate a user input 1522 made with respect to a presently displayed view 1532 into the coordinate system of the anatomical model.
The bedside system 1530 generally includes one or more manipulator assemblies 1532 configured to manipulate a catheter and sheath, and a positioning system 1534 configured to detect the real-time positioning of the catheter and sheath devices within the patient.
In an embodiment of the general control scheme, the controller 1540 may be configured to receive inputs from an input device 1520 configured to resemble a traditional catheter handle, as discussed above with reference to
In another embodiment of the general control scheme, the controller 1540 may be configured to register user inputs as they are made with respect to a displayed third-person view of the catheter and anatomic model 1542. The physician may therefore be able to use the input device 1520 to move the virtual catheter across the display 1530 in much the same manner as in a traditional computer experience, where a user can use a mouse to drag an object across a display screen. Said another way, a leftward motion of the input device 1520 would result in a leftward movement of the displayed catheter within the currently displayed view 1532 of the anatomical model 1542. The controller 1540 would then be configured to resolve the intended Cartesian distal catheter movements into deflection and translation manipulator actuation commands through control logic 1544 that may cause the actual catheter tip to follow the intended movements.
In another embodiment of the general control scheme, the controller 1540 may be configured to register user inputs as they are made with respect to a displayed third-person view of the catheter and anatomic model 1542 solely for the purpose of controlling directional bending of the catheter. In such an embodiment, translation of the catheter may be separately controlled through the use of a slider, wheel, unused device axis, or other similar device. The controller 1540 would therefore be configured to resolve the intended display-plane distal catheter movements into deflection-only manipulator actuation commands through control logic 1544, which would cause the actual catheter tip to follow the intended movements within the display plane, but would allow movement orthogonal to the display plane to be controlled by the mechanics of the catheter.
In another embodiment of the general control scheme, the controller 1540 may be configured to register user inputs as if the user were navigating the catheter from a first-person point of view. In such an embodiment, the display 1530 would represent the anatomic model 1542 as if the viewing camera were positioned on the tip of the catheter. The physician would therefore be able to use the input device 1520 to steer the catheter in much the same way a driver steers a car while looking out of the front windshield.
In yet a further embodiment of the general control scheme, as generally illustrated in
Referring back to
As illustrated in
Finally, as shown in
As illustrated in
An embodiment of the control scheme will now be discussed with regard to
As generally represented by the flowchart in
In an embodiment, a model of the operative site is first generated from the physical anatomy of the subject (1702). This generated model may serve as a basis or a reference for the physician's control, and should reflect the features of the subject's anatomy. The model may be generated by, for example, using pre-existing MRI or CT imagery, or may be generated by monitoring the real-time movement of an invasive probe, such as with the EnSite NavX system available from St. Jude Medical. In the case of a probe, axes of a coordinate system may be generated between pairs of patch electrodes located on the skin of the patient (such as described in detail in U.S. Pat. No. 7,263,397, titled "Method and Apparatus for Catheter Navigation and Location and Mapping in the Heart," incorporated by reference in its entirety). A catheter with a position sensing electrode may be swept around the internal geometry while it communicates its position relative to the pairs of electrodes to the controller. Through either contact sensing means or various skinning techniques, a shell may be constructed around the outermost points of the recorded three dimensional data cloud. This shell may then be the basis of the anatomical model maintained by the controller. Likewise, other similar positioning/modeling systems may be used to generate the stored anatomical model. Such systems may include, for example, the Mediguide gMPS system or the Biosense Webster Carto system.
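As a simplified illustration of constructing a shell around the outermost points of the recorded data cloud, a convex hull can be fit to the collected positions; practical skinning techniques can also capture concave chamber features, so this is only a sketch with synthetic data.

```python
import numpy as np
from scipy.spatial import ConvexHull

# cloud: N x 3 array of catheter electrode positions recorded while sweeping
# the chamber (synthetic, roughly chamber-sized values in mm, for illustration only).
rng = np.random.default_rng(0)
cloud = rng.normal(size=(2000, 3)) * np.array([30.0, 25.0, 40.0])

hull = ConvexHull(cloud)
shell_vertices = cloud[hull.vertices]   # outermost points forming the shell
shell_faces = hull.simplices            # triangles of the shell surface

print(f"Shell built from {len(shell_vertices)} boundary points, "
      f"{len(shell_faces)} triangular faces")
```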
In an embodiment where the model is generated by a real-time positioning system such as EnSite NavX, the registration (1702) may be implicit (i.e., C3=C4), where no further registration is needed. If other real-time factors (e.g., breathing and/or respiration) are sensed by the positioning system, however, a registration may still be necessary. Alternatively, in an embodiment where the model is imported from previously acquired CT or MRI imagery, the model may be registered to the coordinate system of the real time positioning system through scaling and/or rotating techniques such as those provided by the EnSite Fusion dynamic registration system, commercialized by St. Jude Medical.
In a configuration where the physician makes input movements with respect to a third person view of a displayed catheter and anatomic model, the physician must first select a viewing perspective from which to perceive the model (1704). This may be accomplished through the use of a display controller. The display controller may allow the physician to manipulate the displayed view of the anatomic model, and may include, for example, a 3D mouse, or spaceball such as those commercially available from 3Dconnexion, or may include various on-screen controls that would allow the user to pan, zoom, and/or rotate the model.
In operation, as generally illustrated, the display controller may serve to manipulate a projection of the 3D model onto the 2D display by first rotating/translating the model in 3D space, and then projecting the model onto a 2D viewing plane. The rotation/translation may be accomplished using a homogeneous model view transformation matrix (TV). In an embodiment, the model view transformation matrix (TV) may be of the form shown in Equation 1, where the 3×3 matrix of R1-9 relates to a rotation of the model in three dimensional space, and the 3×1 matrix of T1-3 relates to a translation of the model in three dimensional space.
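Written out in the conventional homogeneous form, with R1-R9 forming the 3×3 rotation block and T1-T3 the translation column, Equation 1 may be represented as:

$T_V = \begin{bmatrix} R_1 & R_2 & R_3 & T_1 \\ R_4 & R_5 & R_6 & T_2 \\ R_7 & R_8 & R_9 & T_3 \\ 0 & 0 & 0 & 1 \end{bmatrix}$ (eq. 1)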
Such model view transformation matrices are commonly implemented through high-level commands in rendering applications, such as OpenGL, and ultimately have the effect of repositioning or rotating a model in front of a fixed camera. Once the model is positioned in three dimensional space, it may then be projected to a two dimensional viewing plane, as generally illustrated in
As described above, the user may indicate intended movements of the catheter to the system by using an input device. Potential input devices may include, for example, a two or three dimensional mouse or joystick, a spatially detected stylus, a touch screen, or other similar forms of input. As generally described above, the user may specify this intended movement in a manner that directly moves the catheter tip across the screen similar to controlling a computer pointer arrow with a computer mouse. Alternatively, the user may select a point along the catheter and drag it across the screen similar to dragging an icon across a computer desktop. In yet another embodiment, the user may use the input device to specify way-points or target points within the model for semi-automated or fully-automated movement.
Referring back to
As used in equation 2, (z1) represents the out-of-plane movement of the input device. While in some embodiments, the input device may only be capable of two dimensional movement, this third dimension may be directly obtained from the device if, for example, a three dimensional input device, such as a 3D joystick, is used. Alternatively, when using a two-dimensional input device, this third dimension may be obtained from another input such as the rotation of a wheel. In another embodiment, (z1) may be maintained as a constant that constrains the catheter's orthogonal motion to a plane that bisects the catheter's current position, and is parallel to the current viewing plane. If held as a constant, the catheter may be maneuvered in three dimensions by first rotating the view using the display controller, and then moving the catheter in the new viewing plane. In yet another embodiment, (z1) may be retrieved from the stored z2 buffer (i.e., the stored depth for each displayed point or primitive). In this manner, once a user selects a point on the display, the point may be immediately projected to the surface of the displayed anatomy. In an embodiment, the display may further provide an auxiliary view to aid the user in perceiving depth. It should also be understood that if the input device is configured to convey information regarding its orientation, equation 2 may be expanded to account for such rotation.
In still another embodiment, (z1) may be allowed to vary freely based on the bending mechanics of the catheter while the directional bending of the catheter is controlled by the user. In such an embodiment, the manipulator may be constrained against automatic translation, and a directional movement in, for example, a two-dimensional input space (e.g., C1) or two-dimensional display-space (e.g., C2) would cause an inherent bending motion in the catheter. As such, (z1) may be determined based on a knowledge of the current catheter pose, together with the direction of intended movement {x1, y1}, and an understanding of the bending mechanics of the catheter.
In an embodiment where the user appears to directly control a displayed catheter or sheath, the system may be configured so that the user is actually controlling a dynamic target point that is independent of the catheter. The system may be configured to cause the actual catheter to track this dynamic target point as closely as possible, though to the user, the point may either not be displayed or displayed as a marker. In an alternative embodiment, the target point may be displayed as a catheter, while the real-time catheter position may be displayed as a ghost-catheter (as generally illustrated in
As generally illustrated in
Using these “forward” relationships, the system may accurately predict how a particular manipulator actuation (often referred to as “joint variables”) would affect the catheter position registered within the model. This, however, is the direct opposite of the relationships needed from a control perspective. As generally illustrated in
Beginning with the forward kinematics as shown in
Once the steering wire lengths (e.g., {LA, LB, LC, LD}) within the bendable portion are known, the system may use known relationships to compute the deflection characteristics of the distal portion of the catheter. As illustrated in
In a four steering wire embodiment, as illustrated in the cross-sectional view shown in
As referenced in Step 1914 of
In another embodiment, instead of using closed-form analytical modeling to understand how inputs (e.g., steering wire lengths {LA, LB, LC, LD}) relate to local movement in the bendable section (e.g., {x6, y6, z6}), the system may employ empirical modeling techniques to model the catheter's behavior. These techniques may use actual observations to describe and predict how an object will behave, rather than relying on mathematically describable relationships. Examples of such empirical modeling techniques include neural network techniques such as without limitation, recurrent neural network modeling, or hysteretic recurrent neural network modeling. A hysteretic recurrent neural network model, for example, may accept the steering wire lengths and past local tip positions as inputs to the network, and may be configured to determine a resultant position from this information. The model may be trained prior to the actual procedure by experimentally manipulating the catheter throughout its full range of motion, or a portion of the full range of motion, and inputting the measured parameters and poses into the network to refine the model. These relationships may be determined from a catheter that is substantially similar in design or construction to the catheter that will be used in the procedure. The empirical model may reflect the kinematic properties of the catheter or sheath, and may be configured to account for material non-linearities, such as plastic deformation or axial compression, that may develop through use.
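A highly simplified numpy sketch of such an empirical model is shown below: a small recurrent cell maps the current steering wire lengths and the previous tip position to a predicted local tip position. The layer sizes and weights here are arbitrary assumptions; a practical hysteretic recurrent network would be trained on experimentally recorded wire lengths and poses as described above.

```python
import numpy as np

class RecurrentTipModel:
    """Toy recurrent model: (wire lengths, previous tip position) -> tip position."""

    def __init__(self, n_wires=4, hidden=16, seed=0):
        rng = np.random.default_rng(seed)
        n_in = n_wires + 3                       # wire lengths + previous {x, y, z}
        self.W_in = rng.normal(scale=0.1, size=(hidden, n_in))
        self.W_h = rng.normal(scale=0.1, size=(hidden, hidden))
        self.W_out = rng.normal(scale=0.1, size=(3, hidden))
        self.h = np.zeros(hidden)                # hidden state carries history/hysteresis

    def step(self, wire_lengths, prev_tip_xyz):
        x = np.concatenate([wire_lengths, prev_tip_xyz])
        self.h = np.tanh(self.W_in @ x + self.W_h @ self.h)
        return self.W_out @ self.h               # predicted local tip position {x6, y6, z6}

# Illustration only: the weights above are random; in practice they would be
# fit to measured catheter behavior before use.
model = RecurrentTipModel()
tip = np.zeros(3)
for lengths in np.random.default_rng(1).normal(size=(5, 4)):
    tip = model.step(lengths, tip)
```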
While local modeling, such as shown in equations 5-7, may provide a useful insight into the mechanics of the distal catheter bending, the motions (i.e., {x6, y6, z6} or {θ6, ϕ6, L6}) are computed in a catheter-centric relative coordinate frame. As described above, however, the user-desired catheter motions are specified in the coordinate system of the model/positioning system. Therefore, as referenced in Step 1916 of
In an embodiment, TC may be computed empirically by physically moving the catheter through a series of positions and recording the coordinates of the catheter in both the catheter reference frame and the positioning system reference frame. The recorded point pairs may then be used, for example, in a regression analysis, to determine the values for TC that would satisfy the relationship expressed in equation 9, where $\vec{C}_4$ represents the points recorded by the positioning system, and $\vec{C}_6$ represents the points in the local catheter-centric reference frame.
$\vec{C}_4 = T_C \cdot \vec{C}_6$ (eq. 9)
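One way to carry out the regression described above is a rigid-body (Kabsch-style) least-squares fit between the recorded point pairs, which yields the rotation and translation blocks of TC. The numpy sketch below is a generic illustration of that fit, not the system's specific procedure.

```python
import numpy as np

def fit_rigid_transform(points_cath, points_pos):
    """Fit R, t such that points_pos ≈ R @ points_cath + t (least squares).

    points_cath : N x 3 points in the local catheter-centric frame (C6).
    points_pos  : N x 3 corresponding points from the positioning system (C4).
    Returns the 4x4 homogeneous matrix TC.
    """
    pc = points_cath.mean(axis=0)
    pp = points_pos.mean(axis=0)
    H = (points_cath - pc).T @ (points_pos - pp)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = pp - R @ pc
    TC = np.eye(4)
    TC[:3, :3], TC[:3, 3] = R, t
    return TC
```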
In an embodiment, TC may be computed by recording the point pairs at a series of three points {S0, S1, S2} as shown in
As used in the table, LA, LB, LC, LD represent the lengths of four steering wires within the bendable section (in a four steering wire catheter embodiment), while L represents the axial translation of the catheter. Point S0 may be specified such that the catheter is positioned slightly beyond the sheath, though in an undeflected state. The motion from S0 to S1 is accomplished by translating the catheter distally an amount Δz, such as an amount equal to the bendable length of the catheter. The motion from S1 to S2 is then accomplished by displacing pull wire “A” a distance Δa, sufficient to, for example, bring the catheter to a deflection angle of between π/4 and π/2. A value of ‘auto,’ as used in the table, indicates that while pull wire “A” is being displaced, pull wire C should be moved in such a manner to not impede the deflection of the catheter, though should also be auto-tensioned to prevent slack from developing.
Once points {S0, S1, S2} are established, vectors $\{\vec{P}_0, \vec{P}_1, \vec{P}_2\}$ may be defined within the coordinate frame of the positioning system and used to create a set of orthogonal basis vectors represented by equations 10-12.
$\vec{K} = \vec{P}_0$ (eq. 10)
$\vec{J} = \vec{P}_1 \times \vec{P}_2$ (eq. 11)
$\vec{I} = \vec{K} \times \vec{J}$ (eq. 12)
These vectors may then be normalized, and used to assemble the rotation portion of the homogeneous catheter transformation matrix referenced in equation 8. This rotation matrix is shown explicitly in equation 13.
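Assuming, as one conventional choice, that the normalized basis vectors $\hat{I}$, $\hat{J}$, $\hat{K}$ form the columns of the rotation block, equation 13 may be represented as:

$R = \begin{bmatrix} \hat{I} & \hat{J} & \hat{K} \end{bmatrix} = \begin{bmatrix} \hat{I}_x & \hat{J}_x & \hat{K}_x \\ \hat{I}_y & \hat{J}_y & \hat{K}_y \\ \hat{I}_z & \hat{J}_z & \hat{K}_z \end{bmatrix}$ (eq. 13)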
Furthermore, if S0 is defined as the relative origin, for example, the transformation vector included in equation 8 may be determined through equation 14.
$\vec{t} = -R\,\vec{s}_0$ (eq. 14)
Once the homogeneous catheter transformation matrix is assembled, it may be used via equation 9 to relate the computed local motion of the catheter into the coordinate system of the positioning system, as referenced in step 1918 of
Using the relationships expressed above, and graphically illustrated in
While a closed solution to the partial derivatives expressed in equation 15 may be difficult to compute, the derivatives may be approximated at a given point by analyzing how small (delta) changes of the input motions affect the end-effector coordinates at that point according to the model. To calculate these approximations, the controller may numerically apply a small perturbation to the current position of each of the distal steering wires ($\vec{L}$) in both the positive and negative direction. These perturbed motions may be passed through the forward kinematic model (illustrated in
While the relationship expressed in equation 16 may be useful to predict a catheter motion for a given input, as explained above, the inverse of this function may be more useful from a control perspective. As shown in equation 18, the inverse Jacobian function may be used to relate a change in desired catheter movement into the motions needed to obtain that desired result.
$\dot{\vec{L}} = J^{-1}\,\dot{\vec{X}}_3$ (eq. 18)
In general, however, the Jacobian matrix (J) is not directly invertible. Therefore, in an embodiment, an approximation of $J^{-1}$ may be computed using linear algebra techniques. Such an approximation may rely on the pseudo-inverse methodology generally illustrated in equation 19, where λ is a regularization value.
$J^{-1} \approx J^{T}\left(J J^{T} + \lambda I\right)^{-1}$ (eq. 19)
When solving for $J^{-1}$, the controller may use the approximation of J (i.e., $J_{\text{approx}}$) calculated from equation 17. Since $J_{\text{approx}}$ is only valid at the point where it is computed, $J_{\text{approx}}^{-1}$ is also only valid for that same position. As the model catheter moves away from that position, $J_{\text{approx}}^{-1}$ may need to be recomputed to remain accurate. It should be recognized that $J_{\text{approx}}^{-1}$ may be calculated using various techniques, such as, for example, the singular value decomposition (SVD) technique. Once the matrix $J_{\text{approx}}^{-1}$ is calculated for a given catheter position, it may then be used, as shown in equation 20, to convert a desired movement within the model into the necessary actuator input (or distal steering wire movements) required to achieve that desired movement.
$\dot{\vec{L}} = J_{\text{approx}}^{-1}\,\dot{\vec{X}}_3$ (eq. 20)
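The numerical procedure of equations 15-20 might be sketched as follows: the Jacobian is approximated by central differences through a supplied forward kinematic model, the regularized pseudo-inverse of equation 19 is formed, and a desired tip velocity is converted into steering wire velocities. The forward_kinematics callable, step size, and regularization value are illustrative assumptions, not values from the source.

```python
import numpy as np

def approximate_jacobian(forward_kinematics, wire_lengths, delta=1e-3):
    """Central-difference approximation of J = d(tip position)/d(wire lengths).

    forward_kinematics: callable mapping wire lengths (n,) -> tip position (3,)
    (e.g., the analytical or empirical catheter model described above).
    """
    n = len(wire_lengths)
    J = np.zeros((3, n))
    for i in range(n):
        dL = np.zeros(n)
        dL[i] = delta
        # Perturb each steering wire in the positive and negative direction.
        J[:, i] = (forward_kinematics(wire_lengths + dL)
                   - forward_kinematics(wire_lengths - dL)) / (2.0 * delta)
    return J

def regularized_inverse(J, lam=1e-4):
    """Damped pseudo-inverse, J^T (J J^T + lam*I)^-1, per equation 19."""
    return J.T @ np.linalg.inv(J @ J.T + lam * np.eye(J.shape[0]))

def wire_velocities(forward_kinematics, wire_lengths, desired_tip_velocity):
    """Convert a desired tip velocity into steering wire velocities (eq. 20)."""
    J = approximate_jacobian(forward_kinematics, wire_lengths)
    return regularized_inverse(J) @ np.asarray(desired_tip_velocity)

# Toy usage with a linear stand-in model (illustration only).
A = np.array([[1.0, -1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, -1.0],
              [0.3, 0.3, 0.3, 0.3]])
def fk(L):
    return A @ L
print(wire_velocities(fk, np.zeros(4), [1.0, 0.0, 0.0]))
```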
Due to the inaccuracies caused by numerical approximations of the Jacobian and inverse Jacobian, in an embodiment where such approximations are used, a computed movement of the catheter may be made in a series of discrete steps, with the Jacobian approximation being recomputed at each discrete interval. In an embodiment where the catheter movement is configured to follow a constructed trajectory, as generally shown in
While the above description is made in terms of controlling the position of a point located at or near the pull ring of a catheter, it may likewise be possible to control the orientation of the catheter at that point. Furthermore, as described above, the system may comprise both an actively controlled catheter and an actively controlled sheath. In such a case, the controller may be configured to account for a greater number of input degrees of freedom, and the model may take into account the compound dynamics of the catheter/sheath combination, as generally shown in
Referring back to
In an embodiment where the controller specifies movement commands in terms of distal steering wire lengths (i.e., {LA, LB, LC, LD, L}), the controller may be configured to move the proximal actuators (e.g., fingers) while receiving feedback on the actual length change of the distal steering wires, as generally illustrated in
Referring back to
In an embodiment where the real-time position is used to enhance the catheter's ability to track a particular target or path, the monitored position and orientation may be fed back to the controller in a closed-loop manner to account for model inaccuracies, external disturbances, or drift. The controller may be configured such that the system is either critically damped or overdamped and may cause the actual position of the distal catheter tip to rapidly converge to the desired position, though not permit the catheter to overshoot the desired position. Additionally, in an embodiment where the predicted model moves in an open-loop manner (rather than path-tracking), positional feedback may be employed to dynamically compensate for inaccuracies in the kinematic model by periodically computing a model correction matrix. In an embodiment, the model correction matrix may be applied to the forward kinematic model, and may rotate and/or translate the position of the model catheter to reflect the sensed position/orientation of the actual catheter. This correction matrix may be maintained by the system and continuously adjusted and applied during control/movement iterations.
The catheter's actual position may also be used to infer contact with tissue by comparing the expected position with the actual position. If, during a movement, the system tensions one or more steering wires, as described above, the distal portion of the catheter is expected to bend in a predictable manner. If, during the process of tensioning, the catheter's observed movement does not correspond with the expected movement, it may be inferred that there is an obstruction preventing the expected movement. The system may likewise be configured to analyze the actual movement for changes or discontinuities in other relationships, such as, for example, the speed of the movement, or the rate of movement in view of the actuation inputs.
As the rate of movement decreases, potentially approaching zero, it may be inferred that the distal catheter tip has encountered an obstruction. If an obstruction is detected in this manner, the system may be configured to cease further movement or actuation. In an embodiment, the system may be configured to determine the contact direction by analyzing the heading of the catheter movement. If the heading (i.e., movement vector 2200) of the catheter 2202 unexpectedly changes direction (such as to new heading 2204), as shown in
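A hedged sketch of this kind of contact inference is shown below; the motion-ratio and heading-change thresholds are purely illustrative and would be tuned for a specific catheter.

```python
import numpy as np

def infer_contact(expected_disp, observed_disp,
                  min_motion_ratio=0.25, max_heading_change_deg=30.0):
    """Infer tissue contact by comparing the observed tip displacement with
    the displacement expected from the commanded steering-wire tensioning."""
    exp_n, obs_n = np.linalg.norm(expected_disp), np.linalg.norm(observed_disp)
    if exp_n == 0:
        return False
    # Rate of movement collapsing toward zero relative to the actuation input.
    if obs_n / exp_n < min_motion_ratio:
        return True
    # An unexpected change in heading suggests deflection off an obstruction.
    cos_angle = np.dot(expected_disp, observed_disp) / (exp_n * obs_n)
    angle = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return angle > max_heading_change_deg
```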
In an embodiment where the forward kinematic relationships are constructed through empirical modeling, such as, for example, hysteretic recurrent neural network modeling, the actual positional movement of the catheter in response to the steering wire inputs may be relied on to progressively train the model. Real-time feedback during a procedure may likewise be used to further refine the model if desired. Additionally, in an embodiment where the model is configured to account for past positions or hysteresis, the positional feedback may also be logged and provided to the model as an input.
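A minimal sketch of such progressive training, assuming a hypothetical `model.update()` incremental-learning interface and a log of (wire lengths, sensed position) pairs; appending recent positions to the input is one simple way to expose hysteresis to the model:

```python
def refine_model(model, log, history=3):
    """Progressively refine an empirical forward model from logged
    (wire_lengths, sensed_position) pairs. The model object and its
    update() method are hypothetical placeholders."""
    for i in range(history, len(log)):
        wire_lengths, position = log[i]
        past_positions = [p for _, p in log[i - history:i]]
        # Include past positions so a hysteretic/recurrent model can use them.
        features = list(wire_lengths) + [c for p in past_positions for c in p]
        model.update(features, position)   # one incremental training step
    return model
```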
As described above, the catheter used with the robotic catheter system may incorporate a sensor in the distal tip that is configured to provide an indication of physical contact between the catheter and an obstruction. Such sensors may include load cells, shape memory alloy based force sensors, piezoelectric force sensors, strain gauges, or optical-based or acoustic-based force sensors. If the catheter encounters tissue or another obstruction during operation, the contact or force sensor may be configured to provide an indication to the controller that such contact exists. Similar to contact sensing via position monitoring, if contact is detected, the controller may be configured to refrain from applying further force on the catheter in the direction of the sensed contact. Alternatively, the system may use the indication of force to maintain a controlled amount of force between the catheter and tissue, which may be pre-set by the physician.
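As an illustration only, such a physician-preset contact force could be regulated with something as simple as the proportional step below; the gain, tolerance, and units are placeholders rather than values from the system described here.

```python
def force_regulation_step(measured_force, target_force, gain=0.05, tol=0.1):
    """One iteration of a simple proportional scheme for holding a preset
    contact force. Returns an advance (+) or retract (-) command along the
    contact direction, or zero once the force is within tolerance."""
    error = target_force - measured_force
    if abs(error) <= tol:
        return 0.0
    return gain * error
```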
In an embodiment, the catheter may incorporate an electrode on its distal tip that is configured to provide an indication of the degree of electrical coupling between the catheter and tissue (described in detail in U.S. patent application Ser. No. 12/622,488, titled "System and Method for Assessing Lesions in Tissue," incorporated by reference in its entirety). Such an indication may be based on a measured impedance and/or phase of a signal transmitted through the tissue, and may allow the system to determine the nature of the electrical coupling that exists. If the catheter is in inadequate electrical contact with the tissue, the system may, for example, alert the user, or automatically refine the position until an adequate measure of electrical coupling exists.
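A rough sketch of such a coupling check from a measured impedance magnitude and phase follows; the thresholds are illustrative placeholders and are not taken from the referenced application.

```python
def coupling_adequate(impedance_ohms, phase_deg,
                      impedance_range=(90.0, 180.0), min_phase_shift_deg=3.0):
    """Crude check of catheter-tissue electrical coupling from a measured
    impedance magnitude and phase; all limits are illustrative."""
    in_range = impedance_range[0] <= impedance_ohms <= impedance_range[1]
    return in_range and abs(phase_deg) >= min_phase_shift_deg
```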
As illustrated in
Other forms of feedback that may be available to the controller include feedback from the manipulator about the status of each actuator within its workspace. As described above, each steering wire actuator and carriage may have a finite range of travel. As each is manipulated, it may draw closer to the limits of its range of travel. Therefore, the manipulator may be able to convey each actuator's current position with respect to that actuator's total range of motion. If an actuator nears or reaches a limit of its individual workspace or range of motion, the controller may be configured to prevent further attempted actuation and may alert the physician so that appropriate action may be taken. The manipulator may be configured to determine the full range of each actuator's motion through, for example, the use of linear encoders coupled with each actuator, or the use of sensors, such as Hall effect sensors, at or near the limits of the available travel. In an embodiment, the limits may be hard coded as an absolute encoder count, or may be detected through an initialization routine prior to use.
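For example, a workspace check of this kind might look like the sketch below, where the encoder counts and warning margin are illustrative.

```python
def check_travel(position_counts, min_counts, max_counts, margin=0.05):
    """Report how far along its travel an actuator is, as a fraction of the
    full range, and flag positions within `margin` of either limit."""
    span = max_counts - min_counts
    fraction = (position_counts - min_counts) / span
    near_limit = fraction <= margin or fraction >= 1.0 - margin
    return fraction, near_limit

# Illustrative usage: 95% of travel used on a 0-10000 count range.
print(check_travel(9500, 0, 10000))   # (0.95, True)
```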
In another embodiment, the manipulator may be configured to monitor the force exerted by each actuator. This indication of force may convey to the controller that the catheter or sheath has encountered an obstruction if the force becomes too great. Alternatively, if the force applied on an actuator is lower than an acceptable range, it may signify a loss of contact between, for example, the actuator finger and the slider block. It may also signify that, for example, a steering wire's integrity has been compromised in some manner, such as a break in the coupling between the steering wire and the pull ring.
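A simple classification of the monitored actuator force against an acceptable band might be sketched as follows; the limits are illustrative.

```python
def classify_actuator_force(force, low_limit, high_limit):
    """Classify a measured actuator force against an acceptable band.
    Too high may indicate the catheter or sheath has met an obstruction;
    too low may indicate lost contact with the slider block or a
    compromised steering wire."""
    if force > high_limit:
        return "possible_obstruction"
    if force < low_limit:
        return "possible_loss_of_contact_or_broken_wire"
    return "nominal"
```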
The robotic catheter system may be a useful tool for increasing the speed, precision, repeatability, and effectiveness of a particular procedure. It may allow the physician to control the catheter motion in intuitive ways that enable dynamic path planning, and may allow for certain automated motions or procedures. During any automated movement, the actual catheter must traverse the given space without unintentionally contacting or attempting to pass through tissue. Therefore, the system may be configured to use knowledge of the anatomical model geometry, knowledge of the catheter dynamics, and/or available real-time feedback from the actual catheter to navigate around any obstacles or anatomical features. Additionally, while it is important to prevent the robotic catheter tip from unintentionally passing through tissue, contact between the tissue and a proximal portion of the catheter or sheath may prevent the distal tip from reaching certain locations. In such a case, the system may be configured to account for proximal contact between the catheter or sheath and a particular anatomical feature.
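As a simplified illustration of using the anatomical model geometry during automated movement, the sketch below checks a planned tip path for a minimum clearance from sampled surface points; a real planner would consider the full catheter/sheath shape and dynamics rather than the tip path alone.

```python
import numpy as np

def path_is_clear(waypoints, surface_points, clearance=2.0):
    """Return True if every waypoint of a planned movement keeps at least
    `clearance` (e.g., mm) from sampled anatomical-model surface points.
    Brute-force nearest-point check, for illustration only."""
    surface = np.asarray(surface_points)
    for p in np.asarray(waypoints):
        if np.min(np.linalg.norm(surface - p, axis=1)) < clearance:
            return False
    return True
```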
Although several embodiments of this invention have been described above with a certain degree of particularity, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the scope of this invention. All directional references (e.g., upper, lower, upward, downward, left, right, leftward, rightward, top, bottom, above, below, vertical, horizontal, clockwise and counterclockwise) are only used for identification purposes to aid the reader's understanding of the present invention, and do not create limitations, particularly as to the position, orientation, or use of the invention. Joinder references (e.g., attached, coupled, connected, and the like) are to be construed broadly and may include intermediate members between a connection of elements and relative movement between elements. As such, joinder references do not necessarily infer that two elements are directly connected and in fixed relation to each other. It is intended that all matter contained in the above description or shown in the accompanying drawings shall be interpreted as illustrative only and not as limiting. Changes in detail or structure may be made without departing from the invention as defined in the appended claims.
This application relates to and is a continuation of and claims the benefit of and priority to U.S. patent application Ser. No. 12/751,843, filed 31 Mar. 2010, which is a continuation-in-part of and claims the benefit of and priority to U.S. patent application Ser. No. 12/347,811, filed 31 Dec. 2008 (the '811 application), which in turn claims the benefit of and priority to U.S. provisional patent application Nos. 61/040,143, filed 27 Mar. 2008 (the '143 application), and 61/099,904, filed 24 Sep. 2008 (the '904 application), the entire disclosure of each of the '811 application, the '143 application, and the '904 application being hereby incorporated by reference as though fully set forth herein.