a. Field of the Invention
This invention relates to a robotic catheter system and method for automated control of a catheter and related components, including a robotic catheter system for manipulating a catheter and related components, for example, for diagnostic, therapeutic, mapping and ablative procedures.
b. Background Art
Electrophysiology catheters are used in a variety of diagnostic and/or therapeutic medical procedures to correct conditions such as atrial arrhythmia, including, for example, ectopic atrial tachycardia, atrial fibrillation, and atrial flutter. Arrhythmia can create a variety of conditions, including irregular heart rates, loss of synchronous atrioventricular contractions, and stasis of blood flow, which can lead to a variety of ailments and even death.
Typically in a procedure, a catheter is manipulated through a patient's vasculature to, for example, a patient's heart, and carries one or more electrodes which may be used for mapping, ablation, diagnosis, or other treatments. Once at the intended site, treatment may include radio frequency (RF) ablation, cryoablation, lasers, chemicals, high-intensity focused ultrasound, etc. An ablation catheter imparts such ablative energy to cardiac tissue to create a lesion in the cardiac tissue. This lesion disrupts undesirable electrical pathways and thereby limits or prevents stray electrical signals that lead to arrhythmias. As is readily apparent, such treatment commonly requires precise control of the catheter during manipulation to and at the treatment site, which is invariably a function of the user's skill level.
The inventors herein have recognized a need for a system and method for more precise and dynamic automated control of a catheter and related components, for example, for diagnostic, therapeutic, mapping and ablative procedures, that will minimize and/or eliminate procedural variability associated with an individual user's skill level. The inventors herein have also recognized a need for a system and method for performing user-specified procedures at the patient site or from a remote location.
A robotic catheter system includes a robotic controller; a robotic manipulator; and an input controller. The input controller includes communication circuitry for receiving a signal from a user input device; memory having stored therein a device driver associated with a type of input device; and a processor configured to recognize an input device connected with the communication circuitry, load a device driver according to the recognized input device, and initialize the input device. In an embodiment, the input controller may receive an input from the input device and translate it into a standard data format, such as, for example, a format including a three-dimensional Cartesian movement. Similarly, the standard data format may include a plurality of discrete variables, and the memory of the input controller may include a program memory and a storage memory. Additionally, in an embodiment, the processor may be configured to allow a plurality of input devices to operate in a collaborative manner.
The input controller may further include an auxiliary display, to which the processor may be configured to provide fault messages or steering wire tension values.
FIGS. 3a-3e illustrate exemplary coordinate systems associated with robotic catheter systems;
FIG. 4a is a side view of an embodiment of an exemplary input device that may be used with a robotic catheter system;
FIG. 4b is an isometric view of an embodiment of an exemplary input device that may be used with a robotic catheter system;
FIGS. 5a-5c are views of an embodiment of an exemplary input device that may be used with a robotic catheter system;
FIG. 6a is an exemplary two dimensional input device that may be used with a robotic catheter system;
FIG. 6b is an exemplary three dimensional input device that may be used with a robotic catheter system;
FIGS. 7a-7b are exemplary illustrations of a three dimensional input device that employs non-contact position sensing, and may be used with a robotic catheter system;
FIGS. 8a-8b are exemplary embodiments of a touch-sensitive input device that may be used with a robotic catheter system;
Referring now to the drawings, wherein like reference numerals are used to identify identical components in the various views, an embodiment of robotic catheter system 10 (described in detail below), also referred to as “the system,” is illustrated. The system may be used, for example, to manipulate the location and orientation of catheters and sheaths in a heart chamber or in another body cavity. As shown in
An embodiment of robotic catheter system 10 may involve automated catheter movement. A user, such as an electrophysiologist (EP), can identify locations (potentially forming a path) on a rendered computer model of the cardiac anatomy. The system can be configured to relate those digitally selected points to positions within a patient's actual/physical anatomy, and may command and control the movement of a catheter to defined positions. Once in position, either the user or system could then initiate or perform the desired treatment or therapy—which may further be in accordance with a defined algorithm. Further, such systems may enable full robotic control by using optimized path planning routines together with closed-loop position control.
Referring to
In an embodiment, the control system 14 may include features that improve the accuracy or effectiveness of the system. Such features may include closed-loop feedback (for example, using an EnSite NavX system or gMPS system 18) for creating realistic cardiac chamber geometries or models, displaying activation timing and voltage data to identify arrhythmias, and guiding precise catheter movement; optical force transducers; active tensioning of “passive” steering wires to reduce the system response time; cumulative ablation while the tip follows a front-to-back ironing motion; and/or reactive/resistive impedance monitoring.
The visualization system 16 may provide a user with real-time or near-real-time positioning information concerning the catheter tip. In an exemplary embodiment, system 16 may include a monitor (e.g., an EnSite NavX monitor 26 or other similar monitor) for displaying cardiac chamber geometries or models, displaying activation timing and voltage data to identify arrhythmias, and/or for facilitating guidance of catheter movement. A fluoroscopy monitor 28 may be provided for displaying a real-time x-ray image or for assisting a physician with catheter movement. Additional exemplary displays may include diagnostic displays, ultrasound displays, or other reference displays (e.g., displays 30, 32).
As referenced above, EnSite NavX system 18 (described in detail in U.S. Pat. No. 7,263,397, titled “Method and Apparatus for Catheter Navigation and Location and Mapping in the Heart,” incorporated by reference in its entirety) may be provided for creating realistic cardiac chamber geometries or models, displaying activation timing and voltage data to identify arrhythmias, and guiding precise catheter movement. EnSite NavX system 18 may collect electrical position data from catheters and use this information to track or navigate their movement and construct three-dimensional (3-D) models of the chamber.
In an embodiment, position data from the catheter may also be obtained using a gMPS system, commercially available from Mediguide Ltd., and generally shown and described in U.S. Pat. No. 7,386,339 entitled “Medical Imaging and Navigation System,” which is incorporated herein by reference in its entirety.
As shown generally in
In an embodiment, a user (e.g., an EP) may first manually position a catheter and sheath in a coaxial arrangement within the vasculature of a patient. Once the devices are roughly positioned in relation to the heart, each device may be engaged or connected (e.g., “snapped-in”) to the manipulator assembly 20.
As schematically represented in
In an embodiment of the user interface 40, the one or more input devices 50 may be configured to receive input corresponding to a continuous movement 52 of the input device 50 and/or a discrete actuation 54 of the input device 50. The user interface may further provide a means of selecting a particular viewing perspective 56 of a three dimensional anatomical model 72. As used herein, a continuous movement input is one that can be represented on a continuous spectrum, such as the movement of a joystick, mouse, or slider. While it is understood that current digital computing operates in discrete increments, the term “continuous movement,” as used herein, is intended only to distinguish such input from a discrete actuation, such as a button press, which must be represented as a finite state. The input device 50 is configured to provide the various forms of user input from the physician to the controller 70 for processing.
The user interface 40 may further include one or more visual displays 60 that are capable of displaying one or more views 62 of an anatomical model 72. The display 60 may further be configured to display one or more secondary features 64 either together with, or apart from, the displayed view of the model 62. In an embodiment, secondary features may include, for example, markers, targets, sliders, menu buttons, patient vital data, or other useful visual information that may not be strictly representative of the anatomical model 72. In an embodiment, the displayed view of the anatomical model may be selected 56 via the input device 50.
The controller 70 may be configured to maintain a three dimensional anatomical model 72 of the cardiac geometry, and execute both control logic 74 and display logic 76. In an embodiment, the control logic 74 can be configured to relate intended user actions into a controlled physical movement of the catheter and sheath. Such control logic may include the use of, for example, control algorithms, forward and/or inverse kinematic computations, and real-time feedback from the catheter, manipulator, or positioning system. In an embodiment, the display logic 76 may be configured to use three dimensional view rotation, translation, and/or projection techniques to present the user with a displayed representation 62 of the anatomical model 72 corresponding to the provided view selection input 56. The display logic 76 may further be configured to relate a user input 52 made with respect to a presently displayed view 62 into the coordinate system of the anatomical model.
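The view-to-model mapping performed by display logic 76 can be sketched as follows. This is a minimal illustration, not the specification's implementation: all function names are assumptions, and a single yaw rotation stands in for an arbitrary 3-D view transform. The idea is that a drag made in the plane of the displayed view is rotated back into the anatomical model's coordinate system by the inverse (transpose) of the view rotation.

```python
# Sketch: rotate a 2-D on-screen drag back into the 3-D model frame.
# All names and the single-axis rotation are illustrative assumptions.
import math

def view_rotation(yaw_rad):
    """Rotation matrix for a view yawed about the model's z-axis."""
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def transpose(m):
    return [list(row) for row in zip(*m)]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def view_input_to_model(dx, dy, rot):
    """Map an on-screen (dx, dy) drag into model coordinates.

    The drag lies in the view plane (z = 0 in view space); applying the
    inverse (transpose) of the view rotation expresses it in model space.
    """
    return mat_vec(transpose(rot), [dx, dy, 0.0])

# With a 90-degree yawed view, dragging "right" on screen moves the
# catheter along a different axis of the model.
rot = view_rotation(math.pi / 2)
delta_model = view_input_to_model(1.0, 0.0, rot)
```

A fuller implementation would also account for view translation and projection, as the text notes.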
The bedside system 80 may generally include one or more manipulator assemblies 82 configured to manipulate a catheter and sheath, and a positioning system 84 configured to detect the real-time positioning of the catheter and sheath devices within a patient.
The ability to control the ultimate motion of the catheter (via manipulator actuation) may be analytically complex because each of the input device 50, the display 60, the anatomical model 72, the manipulator 82, the distal motion of the catheter resulting from manipulator actuation 82, and the positioning system 84, may reside in different domains, potentially having different coordinate systems. As used herein, a “coordinate system” or “coordinate frame” is intended to refer to a collection of variables representing controllable or perceivable qualities of an object or device. These variables may primarily include position and/or orientation, though are not necessarily defined in Cartesian space. Additionally, other temporal or environmental variables that are not strictly related to position or orientation may be included in a given coordinate system (e.g., time, breathing phase/magnitude, ECG phase/magnitude). It should also be noted that while a given variable may represent a physical distance, it may be expressed in various forms, such as, for example, inches, millimeters, volts, ohms, impedance, encoder counts, or other units or quantities.
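The "coordinate frame" notion above—a named collection of variables, not necessarily Cartesian, possibly carrying temporal or physiological quantities in mixed units—can be captured in a small data structure. The class and field names here are illustrative assumptions, not the specification's.

```python
# Minimal sketch of a coordinate frame as a named variable collection.
# Field names and example values are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class CoordinateFrame:
    name: str
    variables: dict = field(default_factory=dict)  # e.g. positions, ECG phase
    units: dict = field(default_factory=dict)      # mm, encoder counts, volts

# A manipulator-side frame mixing encoder counts with a physiological phase.
motor_frame = CoordinateFrame(
    name="manipulator",
    variables={"steering_wire_1": 1200, "ecg_phase": 0.35},
    units={"steering_wire_1": "encoder counts", "ecg_phase": "fraction"},
)
```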
FIGS. 3a-3e generally illustrate several forms of coordinate systems. As illustrated in
As illustrated in
Finally, as shown in
Referring again to
As generally shown in
In a manner that can, if desired, mimic traditional, manual catheter control, the system may be configured such that the catheter/sheath is commanded to translate in a longitudinal direction when there is a corresponding translation made with an input handle. Similarly, rotation of either handle may be configured to rotate the deflection plane of the catheter/sheath tip, and the movement of a thumb tab may cause a deflection of the catheter or sheath within the current deflection plane.
As generally shown in
As generally illustrated in
In still a further exemplary embodiment, the user input device 50 may include a spatially detected glove or stylus as generally illustrated in
The glove or stylus input device may be locatable in 3-D space through the use of a positioning system employing a magnetic field, an electrostatic field, or through the use of an optical positioning system. These systems may include, for example, the EnSite NavX system from St. Jude Medical, the gMPS system from Mediguide, the CARTO system from Biosense Webster, the Aurora system from Northern Digital, or the RMT system from Boston Scientific.
In still a further exemplary embodiment, the user interface device may be in the form of a touch screen monitor, such as generally illustrated in
Through each type of input device, the system may further be capable of providing tactile (i.e., “haptic”) feedback to the user. This type of feedback may involve forces generated by a motor connected to the user interface device that a user can feel while holding or interfacing with the device. These forces may be based on actual or computed forces being applied to a physical catheter tip, and may be conveyed to the user by, for example, providing motors/encoders on each degree of freedom. While the motors may operate in a passive mode for a majority of the procedure, if feedback is required by the system, the motors may be energized to produce a torque on the input controls capable of retarding movement in particular degrees of freedom.
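The passive-until-needed haptic behavior described above can be sketched with a simple proportional model. This is an assumption for illustration only—the function name, threshold, and linear relationship are not from the specification, which does not prescribe a control law.

```python
# Sketch (assumed names, simple proportional model) of haptic feedback:
# motors stay passive until the computed tip force crosses a threshold,
# then produce a torque that resists further movement.
def haptic_torque(tip_force_n, threshold_n=0.3, gain=10.0):
    """Return a retarding torque (arbitrary units) for a given tip force."""
    if tip_force_n <= threshold_n:
        return 0.0  # passive mode: no resistance felt at the input control
    # energize the motor proportionally to the excess force
    return gain * (tip_force_n - threshold_n)
```

In practice the retarding torque would be applied per degree of freedom, as the text notes.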
As generally illustrated in
In an embodiment, the input controller 110 may include a processor 120, input communications circuitry 122, and memory 124. The communications circuitry 122 may detect the presence of one or more input devices, retrieve make and/or model information from the respective devices, and facilitate communications between the devices and the processor 120 of the input controller. The memory 124 may maintain control algorithms and device drivers for various input devices for use by the processor 120 during system operation. The memory 124 may include both program memory, such as random access memory (RAM), and storage memory, such as flash memory. In an embodiment, the program memory may be used during the execution of algorithms, while the storage memory may be used to maintain data and code for longer periods of time.
In an embodiment, the input controller 110 may be physically separate from the controller 118 and connected via a wired or wireless connection. In another embodiment, the two controllers may be physically integrated, though may be distinguished by function. In a further embodiment, the input controller may be integrated or provided within the one or more input devices to allow the input device to intelligently communicate raw input movements directly to the controller 118.
In an embodiment, if one or more devices are connected to the input controller 110, the input controller 110 may then prompt the user (via the controller 118) to select an “active” device by first displaying a list of all connected devices to the user 156, and then by receiving an indication as to which device should be “active” 158. In another embodiment, the system may automatically determine the active device by detecting apparent movement of the device, or by monitoring hardware or software switches associated with each respective device.
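The automatic active-device determination by "detecting apparent movement of the device" can be sketched as follows. This is one plausible reading, with assumed names and a simple motion-energy heuristic; the specification does not prescribe the detection method.

```python
# Sketch: pick the "active" device as the one whose recent input deltas
# show motion above a noise floor. Names and threshold are assumptions.
def select_active_device(recent_deltas, noise_threshold=0.05):
    """recent_deltas maps device name -> list of per-sample movement magnitudes."""
    best_name, best_motion = None, noise_threshold
    for name, deltas in recent_deltas.items():
        motion = sum(abs(d) for d in deltas)
        if motion > best_motion:
            best_name, best_motion = name, motion
    return best_name  # None if nothing moved above the noise floor

active = select_active_device({
    "joystick": [0.0, 0.01, 0.0],  # idle jitter only
    "3d_mouse": [0.4, 0.6, 0.5],   # being handled by the user
})
```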
Once an input device is selected as being “active,” the input controller 110 may load device drivers and setup permissions that are specific to that particular device 160. The “loading” may include transferring the drivers and permissions from storage memory to program memory so they may be readily accessed during operation. For example, without limitation, the system may load either a rotary-based device driver and/or a linear-based input device driver. Some known examples of systems with rotary-based device drivers include U.S. application Ser. No. 12/150,110, filed 23 Apr. 2008 (the '110 application); and U.S. application Ser. No. 12/032,639, filed 15 Feb. 2008 (the '639 application). The '110 application and the '639 application are hereby incorporated by reference in their entirety as though fully set forth herein. Once the necessary drivers and permissions are loaded, the processor 120 of the input controller 110 may initialize one or more algorithms that are specific to the selected input device 162. The initialization may include, for example, performing startup routines for the device, performing calibration routines for the device, loading or determining device-specific parameters, or performing other initialization routines that are required for the device to operate correctly.
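The load-then-initialize flow described above—transferring a device-specific driver from storage memory to program memory, then running its startup routines—can be sketched schematically. The class, dictionaries standing in for the two memories, and routine names are all illustrative assumptions.

```python
# Hypothetical sketch of the driver-loading flow. Dicts stand in for
# storage and program memory; all names are illustrative assumptions.
class InputControllerSketch:
    def __init__(self):
        # storage memory: drivers for every supported device type
        self.storage = {
            "rotary": {"kind": "rotary", "init_routine": "calibrate_rotary"},
            "linear": {"kind": "linear", "init_routine": "calibrate_linear"},
        }
        self.program_memory = {}  # drivers loaded for the active device
        self.ready = False

    def load_driver(self, device_type):
        # "loading" = transferring from storage memory to program memory
        self.program_memory[device_type] = dict(self.storage[device_type])

    def initialize(self, device_type):
        driver = self.program_memory[device_type]
        driver["initialized"] = True  # stand-in for startup/calibration routines
        self.ready = True
        return driver["init_routine"]

ctrl = InputControllerSketch()
ctrl.load_driver("rotary")
routine = ctrl.initialize("rotary")
```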
Once the processor 120 has executed desired or necessary initialization routines, the input controller 110 (via the controller 118) may alert the user that the input device is ready to use 164. Once initialized, input controller 110 may operate to translate the actions performed at the active input device into a standard data format that the controller 118 is configured to accept. In an embodiment, the standard data format may be a three-dimensional Cartesian position plus a plurality of discrete state-defined variables. In embodiments where the standard data format includes a plurality of discrete variables, the format and variables may linearize a desired motion vector for an input device. In another embodiment, the input controller may translate input movements into the coordinate space of the computerized anatomical model (e.g., coordinate system C3). Other exemplary means of coordinate space translations are described in U.S. patent application Ser. No. 12/751,843, filed 31 Mar. 2010, and entitled “Robotic Catheter System,” which is herein incorporated by reference in its entirety.
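The "standard data format"—a 3-D Cartesian movement plus discrete state variables—can be sketched as below. The packet layout, the hypothetical rotary device, and its mapping of knob angle and advance distance to Cartesian motion are all assumptions for illustration.

```python
# Sketch: translate a hypothetical rotary device's raw input into a
# standard packet (3-D Cartesian movement + discrete state variables).
# Field names and the rotary mapping are illustrative assumptions.
import math

def to_standard_format(x, y, z, **discrete_states):
    packet = {"cartesian": (float(x), float(y), float(z))}
    packet.update(discrete_states)  # e.g. button presses, mode switches
    return packet

def from_rotary_input(knob_deg, advance_mm, ablate_pressed):
    """Map knob rotation to direction in the x-y plane, advance to magnitude."""
    x = advance_mm * math.cos(math.radians(knob_deg))
    y = advance_mm * math.sin(math.radians(knob_deg))
    return to_standard_format(x, y, 0.0, ablate=ablate_pressed)

pkt = from_rotary_input(0.0, 2.0, ablate_pressed=False)
```

A linear-based device driver would emit the same packet shape from different raw inputs, which is what lets the controller 118 accept any recognized device.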
In an embodiment, the input controller may be configured to allow multiple input devices to operate concurrently, either in a collaborative manner, or in a training/teaching environment. In a multiple input scenario, the input controller 110 may allow the user to select multiple devices and then initialize each independently. During operation, the input controller 110 may be configured to provide a single standard data signal to the controller 118, by for example intelligently merging the signals from a plurality of devices. In another embodiment, the input controller 110 may be configured to provide multiple data signals to the controller 118, where each data signal represents the input made at a respective input device. Where the input controller 110 provides multiple signals to the controller 118, each signal may be in a data format that is recognized by the controller, though need not be in an identical format as each other respective signal.
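One plausible form of "intelligently merging" concurrent inputs into a single standard signal is a weighted sum of the Cartesian components—for instance, damping a trainee's input while a supervising physician retains full authority. This is an assumption for illustration; the specification does not prescribe a merge rule.

```python
# Sketch: merge standard-format packets from several devices into one
# signal via a weighted sum. The weighting scheme is an assumption.
def merge_inputs(packets, weights=None):
    """packets: list of {'cartesian': (x, y, z)}; weights: optional per-packet."""
    if weights is None:
        weights = [1.0] * len(packets)
    merged = [0.0, 0.0, 0.0]
    for pkt, w in zip(packets, weights):
        for i, v in enumerate(pkt["cartesian"]):
            merged[i] += w * v
    return {"cartesian": tuple(merged)}

# Trainee's move damped to half scale; supervisor adds a correction.
out = merge_inputs(
    [{"cartesian": (2.0, 0.0, 0.0)}, {"cartesian": (0.0, 1.0, 0.0)}],
    weights=[0.5, 1.0],
)
```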
As further illustrated in
In an embodiment, the input controller 110 may record inputs to memory 124 in a sequential fashion. The input controller 110 may further be capable of replaying the movements in a visual manner on an auxiliary display 170. Additionally, if the user wishes to revert to a previous catheter position, the input controller 110 may play back the sequential movements from memory 124 in reverse until the desired previous pose is achieved. In an embodiment, the auxiliary display 170 may also provide the user with the ability to enter various primitive commands, such as rotation, deflection, translation, and relax. Finally, the auxiliary display 170 may be capable of displaying a corporate logo or trademark, such as when no fault messages exist.
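The sequential recording and reverse playback described above behaves like an undo log: each relative movement is appended, and reverting pops movements off the log and applies them negated until the earlier pose is reached. The class and method names below are assumptions for illustration.

```python
# Sketch (assumed names) of sequential movement recording with reverse
# playback: reverting undoes the most recent relative movements in order.
class MovementLog:
    def __init__(self, start_pose=(0.0, 0.0, 0.0)):
        self.pose = list(start_pose)
        self.log = []  # sequential record of relative movements

    def move(self, dx, dy, dz):
        self.log.append((dx, dy, dz))
        for i, d in enumerate((dx, dy, dz)):
            self.pose[i] += d

    def revert(self, steps=1):
        """Play back the last `steps` recorded movements in reverse."""
        for _ in range(min(steps, len(self.log))):
            dx, dy, dz = self.log.pop()
            for i, d in enumerate((dx, dy, dz)):
                self.pose[i] -= d
        return tuple(self.pose)

log = MovementLog()
log.move(1.0, 0.0, 0.0)
log.move(0.0, 2.0, 0.0)
reverted = log.revert(1)  # undo only the most recent movement
```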
In an embodiment, an auxiliary display 170 may be physically associated with each of the respective input devices connected with the system. In another embodiment, the auxiliary display 170 may be an independent display that is merely provided with the system in general.
Although several embodiments of this invention have been described above with a certain degree of particularity, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the scope of this invention. Joinder references (e.g., attached, coupled, connected, and the like) are to be construed broadly and may include intermediate members between a connection of elements and relative movement between elements. As such, joinder references do not necessarily infer that two elements are directly connected and in fixed relation to each other. It is intended that all matter contained in the above description or shown in the accompanying drawings shall be interpreted as illustrative only and not as limiting. Changes in detail or structure may be made without departing from the invention as defined in the appended claims.
This application is a continuation-in-part of and claims the benefit of and priority to U.S. patent application Ser. No. 12/751,843 filed 31 Mar. 2010 (the '843 application), which is a continuation-in-part of and claims the benefit of and priority to U.S. patent application Ser. No. 12/347,811 filed 31 Dec. 2008 (the '811 application), which in turn claims the benefit of and priority to U.S. provisional patent application Nos. 61/040,143 filed 27 Mar. 2008 (the '143 application) and 61/099,904 filed 24 Sep. 2008 (the '904 application), the entire disclosure of each of the '843 application, the '811 application, the '143 application, and the '904 application are hereby incorporated by reference as though fully set forth herein.
Number | Name | Date | Kind |
---|---|---|---|
3091130 | Payerle et al. | May 1963 | A |
3605725 | Bentov | Sep 1971 | A |
3893449 | Lee et al. | Jul 1975 | A |
4160508 | Frosch et al. | Jul 1979 | A |
4348556 | Gettig et al. | Sep 1982 | A |
4393728 | Larson et al. | Jul 1983 | A |
4543090 | McCoy | Sep 1985 | A |
4758222 | McCoy | Jul 1988 | A |
4784042 | Paynter | Nov 1988 | A |
4802487 | Martin et al. | Feb 1989 | A |
4884557 | Takehana et al. | Dec 1989 | A |
4962448 | DeMaio et al. | Oct 1990 | A |
4974151 | Advani et al. | Nov 1990 | A |
5078140 | Kwoh | Jan 1992 | A |
5107080 | Rosen et al. | Apr 1992 | A |
5170817 | Sunderland et al. | Dec 1992 | A |
5238005 | Imran | Aug 1993 | A |
5298930 | Asakura | Mar 1994 | A |
5303148 | Mattson | Apr 1994 | A |
5318525 | West et al. | Jun 1994 | A |
5339799 | Kami et al. | Aug 1994 | A |
5396266 | Brimhall et al. | Mar 1995 | A |
5410638 | Colgate et al. | Apr 1995 | A |
5441483 | Avitall | Aug 1995 | A |
5449345 | Taylor et al. | Sep 1995 | A |
5520644 | Imran | May 1996 | A |
5533967 | Imran | Jul 1996 | A |
5545200 | West et al. | Aug 1996 | A |
5579442 | Kimoto et al. | Nov 1996 | A |
5607158 | Chan | Mar 1997 | A |
5607462 | Imran | Mar 1997 | A |
5623582 | Rosenberg | Apr 1997 | A |
5630783 | Steinberg | May 1997 | A |
5661253 | Aoki | Aug 1997 | A |
5706827 | Ehr et al. | Jan 1998 | A |
5784542 | Ohm et al. | Jul 1998 | A |
5791908 | Gillio | Aug 1998 | A |
5800178 | Gillio | Sep 1998 | A |
5807377 | Madhani et al. | Sep 1998 | A |
5808665 | Green | Sep 1998 | A |
5828813 | Ohm | Oct 1998 | A |
5854622 | Brannon | Dec 1998 | A |
5861024 | Rashidi | Jan 1999 | A |
5876325 | Mizuno et al. | Mar 1999 | A |
5897488 | Ueda | Apr 1999 | A |
5913820 | Bladen | Jun 1999 | A |
6040758 | Sedor et al. | Mar 2000 | A |
6063095 | Wang et al. | May 2000 | A |
6113395 | Hon | Sep 2000 | A |
6201196 | Wergen | Mar 2001 | B1 |
6233476 | Strommer | May 2001 | B1 |
6233504 | Das et al. | May 2001 | B1 |
6290683 | Erez et al. | Sep 2001 | B1 |
6348911 | Rosenberg et al. | Feb 2002 | B1 |
6358207 | Lathbury | Mar 2002 | B1 |
6385509 | Das et al. | May 2002 | B2 |
6396232 | Haanpaa et al. | May 2002 | B2 |
6432112 | Brock et al. | Aug 2002 | B2 |
6498944 | Ben-Haim | Dec 2002 | B1 |
6500167 | Webster | Dec 2002 | B1 |
6522141 | Debbins | Feb 2003 | B2 |
6540685 | Rhoads | Apr 2003 | B1 |
6671533 | Chen | Dec 2003 | B2 |
6709667 | Lowe et al. | Mar 2004 | B1 |
6785358 | Johnson | Aug 2004 | B2 |
6850252 | Hoffberg | Feb 2005 | B1 |
6869390 | Elliott et al. | Mar 2005 | B2 |
6869396 | Belson | Mar 2005 | B2 |
6968223 | Hanover | Nov 2005 | B2 |
7016469 | Johnson | Mar 2006 | B2 |
7193521 | Moberg | Mar 2007 | B2 |
7197354 | Sobe | Mar 2007 | B2 |
7199790 | Rosenberg et al. | Apr 2007 | B2 |
7247139 | Yudkovitch | Jul 2007 | B2 |
7263397 | Hauck et al. | Aug 2007 | B2 |
7276044 | Ferry et al. | Oct 2007 | B2 |
7386339 | Strommer | Jun 2008 | B2 |
7465288 | Dudney | Dec 2008 | B2 |
7672849 | Yudkovitch | Mar 2010 | B2 |
7698966 | Gosselin | Apr 2010 | B2 |
7742803 | Viswanathan | Jun 2010 | B2 |
7850642 | Moll | Dec 2010 | B2 |
7880717 | Berkley et al. | Feb 2011 | B2 |
7945546 | Bliss | May 2011 | B2 |
7963288 | Rosenberg et al. | Jun 2011 | B2 |
8317744 | Kirschenman | Nov 2012 | B2 |
8317745 | Kirschenman et al. | Nov 2012 | B2 |
8390438 | Olson et al. | Mar 2013 | B2 |
8560118 | Greer et al. | Oct 2013 | B2 |
8926511 | Bar-Tal | Jan 2015 | B2 |
20010018591 | Brock et al. | Aug 2001 | A1 |
20010025183 | Shahidi | Sep 2001 | A1 |
20020068868 | Thompson et al. | Jun 2002 | A1 |
20020072704 | Mansouri-Ruiz | Jun 2002 | A1 |
20020087048 | Brock et al. | Jul 2002 | A1 |
20020184055 | Naghavi et al. | Dec 2002 | A1 |
20030018232 | Elliott | Jan 2003 | A1 |
20030050733 | Wang et al. | Mar 2003 | A1 |
20030114962 | Niemeyer | Jun 2003 | A1 |
20030121382 | Morson | Jul 2003 | A1 |
20040050247 | Topping | Mar 2004 | A1 |
20040068173 | Viswanathan | Apr 2004 | A1 |
20040133189 | Sakurai | Jul 2004 | A1 |
20040138530 | Kawai et al. | Jul 2004 | A1 |
20040146388 | Khajepour et al. | Jul 2004 | A1 |
20040193239 | Falwell et al. | Sep 2004 | A1 |
20040223636 | Edic et al. | Nov 2004 | A1 |
20040243147 | Lipow | Dec 2004 | A1 |
20050038333 | Sra et al. | Feb 2005 | A1 |
20050075538 | Banik et al. | Apr 2005 | A1 |
20050172405 | Menkedick et al. | Aug 2005 | A1 |
20050203382 | Govari et al. | Sep 2005 | A1 |
20050222554 | Wallace et al. | Oct 2005 | A1 |
20050234293 | Yamamoto et al. | Oct 2005 | A1 |
20050234320 | Balasubramanian | Oct 2005 | A1 |
20060052664 | Julian et al. | Mar 2006 | A1 |
20060089637 | Werneth et al. | Apr 2006 | A1 |
20060137476 | Bull et al. | Jun 2006 | A1 |
20060155321 | Bressler et al. | Jul 2006 | A1 |
20060276775 | Rosenberg et al. | Dec 2006 | A1 |
20060293643 | Wallace et al. | Dec 2006 | A1 |
20070016008 | Schoenefeld | Jan 2007 | A1 |
20070022384 | Abbott | Jan 2007 | A1 |
20070043338 | Moll et al. | Feb 2007 | A1 |
20070060833 | Hauck | Mar 2007 | A1 |
20070073137 | Schoenefeld et al. | Mar 2007 | A1 |
20070100254 | Murakami et al. | May 2007 | A1 |
20070120512 | Albu-Schaffer et al. | May 2007 | A1 |
20070135803 | Belson | Jun 2007 | A1 |
20070142726 | Carney et al. | Jun 2007 | A1 |
20070172803 | Hannaford et al. | Jul 2007 | A1 |
20070185404 | Hauck et al. | Aug 2007 | A1 |
20070185485 | Hauck et al. | Aug 2007 | A1 |
20070185486 | Hauck et al. | Aug 2007 | A1 |
20070197896 | Moll | Aug 2007 | A1 |
20070197939 | Wallace et al. | Aug 2007 | A1 |
20070198008 | Hauck et al. | Aug 2007 | A1 |
20070233044 | Wallace | Oct 2007 | A1 |
20070233045 | Weitzner et al. | Oct 2007 | A1 |
20070270685 | Kang et al. | Nov 2007 | A1 |
20070276214 | Dachille et al. | Nov 2007 | A1 |
20070298877 | Rosenberg | Dec 2007 | A1 |
20080009791 | Cohen et al. | Jan 2008 | A1 |
20080013809 | Zhu et al. | Jan 2008 | A1 |
20080112842 | Edwards | May 2008 | A1 |
20080201847 | Menkedick | Aug 2008 | A1 |
20080312536 | Dala-Krishna | Dec 2008 | A1 |
20090012533 | Barbagli | Jan 2009 | A1 |
20090033623 | Lin | Feb 2009 | A1 |
20090123111 | Udd | May 2009 | A1 |
20090137952 | Ramamurthy et al. | May 2009 | A1 |
20090177454 | Bronstein et al. | Jul 2009 | A1 |
20090192519 | Omori et al. | Jul 2009 | A1 |
20090195514 | Glynn | Aug 2009 | A1 |
20090247993 | Kirschenman et al. | Oct 2009 | A1 |
20090264156 | Burghardt | Oct 2009 | A1 |
20090322697 | Cao | Dec 2009 | A1 |
20100066676 | Kramer et al. | Mar 2010 | A1 |
20100073150 | Olson | Mar 2010 | A1 |
20100082039 | Mohr et al. | Apr 2010 | A1 |
20100256558 | Olson | Oct 2010 | A1 |
20100268067 | Razzaque et al. | Oct 2010 | A1 |
20100314031 | Heideman | Dec 2010 | A1 |
20110040547 | Gerber et al. | Feb 2011 | A1 |
20110128555 | Rotschild | Jun 2011 | A1 |
20110137156 | Razzaque | Jun 2011 | A1 |
20110152882 | Wenderow et al. | Jun 2011 | A1 |
20110289441 | Venon et al. | Nov 2011 | A1 |
20110306986 | Lee | Dec 2011 | A1 |
20120071891 | Itkowitz et al. | Mar 2012 | A1 |
20120133601 | Marshall et al. | May 2012 | A1 |
20120277663 | Millman et al. | Nov 2012 | A1 |
20130006268 | Swarup et al. | Jan 2013 | A1 |
20130154913 | Genc et al. | Jun 2013 | A1 |
20130165854 | Sandhu | Jun 2013 | A1 |
20130176220 | Merschon | Jul 2013 | A1 |
20130179162 | Merschon | Jul 2013 | A1 |
Number | Date | Country |
---|---|---|
0151479 | Aug 1985 | EP |
09094796 | Mar 1999 | EP |
2211280 | Jun 1989 | GB |
2397177 | Jul 2007 | GB |
H10216238 | Aug 1998 | JP |
2003024336 | Jan 2003 | JP |
2007325936 | Dec 2007 | JP |
9320535 | Oct 1993 | WO |
WO-9639944 | Dec 1996 | WO |
WO03049596 | Jun 2003 | WO |
WO-2006120666 | Nov 2006 | WO |
WO-2007088208 | Aug 2007 | WO |
WO-2007098494 | Aug 2007 | WO |
WO-2007120329 | Oct 2007 | WO |
WO-2007136803 | Nov 2007 | WO |
2007143859 | Dec 2007 | WO |
WO-2007146325 | Dec 2007 | WO |
WO2008045831 | Apr 2008 | WO |
2008103212 | Aug 2008 | WO |
WO-20081012258 | Aug 2008 | WO |
2009120940 | Oct 2009 | WO |
2009120992 | Oct 2009 | WO |
WO-2009120982 | Oct 2009 | WO |
WO-2009120992 | Oct 2009 | WO |
2010025338 | Mar 2010 | WO |
2010059179 | May 2010 | WO |
2010068783 | Jun 2010 | WO |
2010107916 | Sep 2010 | WO |
Entry |
---|
“International Search Report & Written Opinion”, PCT/US2009/069712 Feb. 25, 2010. |
“Supplementary European Search Report”, EP 09725131 Feb. 20, 2013. |
LaBelle, Kathryn, Evaluation of Kinect Joint Tracking for Clinical and In-Home Stroke Rehabilitation Tools, <http://netscale.cse.nd.edu/twiki/pub/Edu/KinectRehabilitation/Eval—of—Kinect—for—Rehab.pdf>, Dec. 2011 |
Padoy, Nicolas, Needle Insertion Revisited (tele-surgery in depth), (online), The Johns Hopkins University <URL: http://www.youtube.com/watch?v=YsY—A0kLh-g>, Jan. 2011. |
Supplementary European Search Report in EP Application No. 11763450.1 (Oct. 29, 2014). |
About the Kinect for Windows SDK—Microsoft Research (online), <URL: http://research.microsoft.com/en-us/um/redmond/projects/kinectsdk/about.aspx>, (actual publication date unknown). |
Apple Wins Strategic Multitouch and Music Tempo Workout Patents, Patently Apple <URL: http://www.patentlyapple.com/patently-apple/2010/04/apple-wins-strategic-multitouch-music-tempo-workout-patents.html>, (actual publication date unknown). |
Emotiv—Brain Computer Interface Technology (online), <URL: http://www.emotiv.com>, (actual publication date unknown). |
Emotiv EPOC Software Development Kit—EPOC neuroheadset (online), <URL: http://www.emotiv.com/store/hardware/epoc/bci/epoc-neuroheadset/>, (actual publication date unknown). |
International Search Report & Written Opinion, PCT/US2012/031008, Jul. 20, 2012. |
International Search Report and Written Opinion, PCT/US2011/030764, Jun. 15, 2011. |
Kinect—Wikipedia, the free encyclopedia (online), <URL: http://en.wikipedia.org/wiki/Kinect/>, (actual publication date unknown). |
Polaris Family of Optical Tracking Systems, polaris Vicra & Spectra—Optical Measurement Systems for Medical, Northern Digital Inc. <URL: http://www.ndigital.com/medical/polarisfamily.php?act=print>, (actual publication date unknown). |
The Aurora Electromagnetic Tracking System, Aurora Electromagnetic Measurement System—2D Tracking for Medical Guidance; Northern Digital Inc. <URL:http://www.ndigital.com/medical/aurora.pho?act=print>; (actual publication date unknown). |
Wii Remote—Wikipedia, the free encyclopedia (online), <URL: http://en.wikipedia.org/wiki/Wii—Remote>, (actual publication date unknown). |
International Search Report, PCT Application No. PCT/US2011/030656, Jun. 13, 2011, 8 pages. |
International Search Report, PCT Application No. PCT/US2009/038525, May 27, 2009, 2 pages. |
International Search Report, PCT Application No. PCT/US2009/038531, May 19, 2009, 3 pages. |
International Search Report, PCT Application No. PCT/US2009/038533, Jun. 17, 2009, 2 pages. |
International Search Report, PCT Application No. PCT/US2009/038618, May 22, 2009, 2 pages. |
International Search Report, PCT Application No. PCT/US2009/038597, May 18, 2009, 2 pages. |
International Search Report, PCT Application No. PCT/US2009/038534, May 27, 2009, 2 pages. |
International Search Report, PCT Application No. PCT/US2009/038536, May 28, 2009, 2 pages. |
International Search Report, PCT Application No. PCT/US2009/058121, Nov. 19, 2009, 2 pages. |
Supplemental European Search Report, EP Application No. 09724550.0, Jul. 10, 2012, 6 pages. |
Supplemental European Search Report, EP Application No. 09723739.0, Jul. 10, 2012, 6 pages. |
Supplemental European Search Report, EP Application No. 09726364.4, Jan. 22, 2013, 7 pages. |
Supplementary European Search Report for EP Application No. 11763140.5, dated Jun. 10, 2015, 7 pages. |
Number | Date | Country | |
---|---|---|---|
20110144806 A1 | Jun 2011 | US |
Number | Date | Country | |
---|---|---|---|
61040143 | Mar 2008 | US | |
61099904 | Sep 2008 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 12751843 | Mar 2010 | US |
Child | 12964407 | US | |
Parent | 12347811 | Dec 2008 | US |
Child | 12751843 | US |