This invention relates to the remote navigation of medical devices in the body, and in particular to the control of remote navigation systems.
Remote navigation systems have been developed that allow a user to remotely control the orientation of the distal end of a medical device to facilitate navigation of the device through the body. Examples of such systems include the magnetic navigation systems made by Stereotaxis, Inc., St. Louis, Mo., which create a magnetic field in a selected direction to orient the distal end of a medical device having one or more magnetically responsive elements. Other examples of remote navigation systems include robotic systems, such as systems using motors or mechanical devices such as pull wires or push wires to move articulated members. The technology of remote navigation systems has advanced to the point where they can quickly and easily orient the distal end of a medical device in a selected direction, but regardless of the method of movement, an obstacle to their widespread use is the difficulty of indicating to the remote navigation system the desired direction in which to orient the medical device.
A variety of interfaces have been created to facilitate the communication of the desired direction of orientation from the user to the remote navigation system. For example, the magnetic navigation systems available from Stereotaxis, Inc. have a number of tools to help the user select the direction for the medical device and cause the magnetic navigation system to align the device in the selected direction. These interfaces typically require the user to manipulate a cursor on a display or actuate a touch screen. This can be difficult where the user is also trying to manually advance the medical device, or is otherwise using his or her hands.
Embodiments of the methods and interfaces in accordance with the principles of the present invention provide a method of controlling a remote navigation system to orient a medical device in a selected direction. A preferred embodiment of the methods of this invention comprises displaying an external image of the body lumen; superimposing an indicator of the current position of the medical device in the body lumen; displaying a plurality of segment labels, each corresponding to a segment of the image and having a predetermined direction associated therewith; receiving oral commands and recognizing one of the oral commands as one of the displayed segment labels; and causing the remote navigation system to orient the distal end of the device in the predetermined direction associated with the segment corresponding to the displayed segment label.
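Purely as an illustrative sketch of the last two steps of this method, the following Python fragment maps a recognized segment label to its predetermined direction and issues the orientation command. The Segment structure, the label set, and the orient_distal_tip callback are hypothetical stand-ins, not any particular remote navigation system's interface.

```python
from dataclasses import dataclass

# Hypothetical model: each displayed segment label carries a predetermined
# direction (a unit vector in patient coordinates) for the distal end.
@dataclass
class Segment:
    label: str                             # label shown over the image segment, e.g. "P2"
    direction: tuple[float, float, float]  # predetermined orientation for this segment

SEGMENTS = {
    "P1": Segment("P1", (0.0, 0.0, 1.0)),
    "P2": Segment("P2", (0.0, 0.7, 0.7)),
    "M1": Segment("M1", (0.0, 1.0, 0.0)),
}

def handle_oral_command(recognized_word: str, orient_distal_tip) -> bool:
    """If the recognized word is one of the displayed segment labels, command
    the remote navigation system to orient the device in that segment's
    predetermined direction."""
    segment = SEGMENTS.get(recognized_word.upper())
    if segment is None:
        return False                          # not a displayed label; ignore it
    orient_distal_tip(segment.direction)      # stand-in for the navigation-system call
    return True

# Example with a stand-in for the navigation system's orientation command:
handle_oral_command("m1", lambda d: print("orienting distal tip toward", d))
```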
Corresponding reference numerals indicate corresponding parts throughout the several views of the drawings.
A preferred embodiment of an interface implementing methods in accordance with the principles of this invention is illustrated in
In accordance with the principles of this invention, the image data is processed, and the image is divided into a plurality of segments of similar direction. For example, as shown in
The system may alternatively allow the user to name the sections. In this case, the system might prompt the user to type in a name for each section and use voice recognition technology to recognize the assigned names when they are subsequently spoken. The system might alternatively prompt the user to speak the name of each section and either use voice recognition software to store the names or store the audio information for subsequent comparison.
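A minimal sketch of the typed-name variant follows, assuming a separate speech recognizer supplies a text transcript of what was spoken. The function names (name_section, match_spoken_name) and the normalized-string matching are illustrative assumptions; the audio-template alternative mentioned above is not shown.

```python
# User-assigned section names, assuming a separate speech recognizer turns
# audio into a text transcript. (Storing raw audio for waveform comparison,
# the alternative described above, is not shown.)
section_names: dict[str, str] = {}    # normalized spoken name -> section id

def name_section(section_id: str, typed_name: str) -> None:
    # The typed name joins the set of names the system will listen for.
    section_names[typed_name.strip().lower()] = section_id

def match_spoken_name(transcript: str) -> str | None:
    # Compare the recognizer's transcript against the stored names.
    return section_names.get(transcript.strip().lower())

name_section("seg-03", "mid vessel")
print(match_spoken_name("Mid Vessel"))   # -> "seg-03"
```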
The system assigns each of the sections at least one direction. This direction may be an average centerline direction for the section, the direction of the centerline at its midpoint, or the direction of the centerline at the proximal or the distal end. The direction could also be some sort of composite of these.
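The following sketch computes these candidate directions from a section's reconstructed centerline points. It is illustrative only; the point array and function name are assumptions.

```python
import numpy as np

def candidate_directions(centerline: np.ndarray) -> dict[str, np.ndarray]:
    """Candidate directions for one section from its centerline points (N x 3)."""
    chords = np.diff(centerline, axis=0)      # vectors between successive points
    unit = lambda v: v / np.linalg.norm(v)
    mid = len(chords) // 2
    return {
        # length-weighted average of the chord directions (the overall chord):
        "average":  unit(chords.sum(axis=0)),
        "midpoint": unit(chords[mid]),        # tangent near the midpoint
        "proximal": unit(chords[0]),          # tangent at the proximal end
        "distal":   unit(chords[-1]),         # tangent at the distal end
    }

# Example: a gently curving section.
points = np.array([[0, 0, 0], [0, 1, 2], [0, 3, 3], [0, 6, 4]], dtype=float)
for name, d in candidate_directions(points).items():
    print(name, np.round(d, 2))
```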
The system is adapted to receive and process oral commands, and in accordance with the principles of this invention, is adapted to receive an oral identification from the user of a particular section. Upon confirmation of the identified section, the system then operates the remote navigation system to orient the distal end of the device in the direction corresponding to the identified section. The current location of the medical device is preferably identified on the display, such as with marker 24, and as the distal end of the device moves from section to section, the user can properly orient the device for its current or for its next section simply by orally stating the name of the section that the device is in or to which the device is being moved. The system automatically causes the remote navigation system to change the orientation of the device to an orientation appropriate to the section specified by the user.
Validation of orally issued commands is usually important in a voice-controlled system. The system preferably tracks the position of the medical device, and therefore can be programmed to anticipate that the next direction command will correspond to the next section in the distal direction or, in the case of a branched lumen, to one of a limited number of next sections in the distal direction. Thus the system can discriminate voice commands more accurately than if the voice command could be any one of a larger number of commands. However, if validation is desired or required, a validation scheme can be provided to confirm the voice commands. For example, upon receipt of a voice command identifying a particular direction, the system can highlight the selection corresponding to the voice command and, before proceeding, wait for a validation command, such as “YES” or “CORRECT”. Alternatively, if the user is observing the device on an X-ray image, pressing the Fluoro pedal could be taken as confirmation of the voice command. In this case, either the action of pressing the Fluoro pedal, or the fact of the Fluoro pedal being pressed for a certain predetermined time interval, could be used as command confirmation.
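As a sketch of this discrimination idea, the fragment below restricts the active vocabulary to the tracked section's distal neighbors plus the validation words. The adjacency map (with a hypothetical branch at M1) and the section names are invented for illustration.

```python
# Restrict the active vocabulary to the sections reachable in the distal
# direction from the tracked position, plus the validation words.
DISTAL_NEIGHBORS = {"P1": ["P2"], "P2": ["M1"], "M1": ["D1", "D2"]}

def active_vocabulary(current_section: str) -> set[str]:
    expected = {current_section, *DISTAL_NEIGHBORS.get(current_section, [])}
    return expected | {"YES", "CORRECT"}      # validation words stay active

def accept_command(word: str, current_section: str) -> bool:
    # With few alternatives active, the recognizer discriminates more accurately.
    return word.upper() in active_vocabulary(current_section)

print(accept_command("D1", "M1"))   # True: one of the next sections at the branch
print(accept_command("P1", "M1"))   # False: not anticipated from M1
```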
It is extremely helpful to navigation in accordance with the principles of this invention that the user accurately understand the current position of the medical device. When navigating through a reconstructed body lumen or cavity, the current position might not be immediately apparent. Thus, as shown in
To facilitate the navigation of a medical device through body lumens and cavities, it is desirable to clearly indicate to the physician or other user where the distal end of the device is presently located. Thus, in accordance with one embodiment of the present invention, an external image 100 of a body lumen or cavity is displayed. The position of the medical device is determined by any conventional means of localization, including signal-based, electrostatic, optical, or image-processing localization, etc. In the case of navigating through a relatively constricted lumen, such as a blood vessel, the position in the vessel can be determined by measuring the extended length of the device, as advancement of a given length will substantially correspond to the same advancement along the centerline of the vessel. The advancement of the medical device can be measured in a number of ways. If the device is advanced by machine, for example by opposed rollers as disclosed in U.S. patent application Ser. No. 10/138,710, filed May 3, 2002, and U.S. patent application Ser. No. 10/858,485, filed Jun. 1, 2004 (the disclosures of which are incorporated by reference), then the rotation of the rollers can be used to measure the advancement of the device. Alternatively, marks can be provided on the device which can be physically, electrically, optically, or otherwise sensed to measure the advancement of the medical device.
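A minimal sketch of this length-to-position mapping follows, assuming the measured advancement substantially tracks arc length along the reconstructed centerline; the function name and example values are illustrative.

```python
import numpy as np

def position_from_advancement(centerline: np.ndarray, advanced_mm: float) -> np.ndarray:
    """Map measured device advancement to a point on the vessel centerline,
    assuming advancement substantially tracks centerline arc length."""
    chords = np.diff(centerline, axis=0)
    arc = np.concatenate(([0.0], np.cumsum(np.linalg.norm(chords, axis=1))))
    s = np.clip(advanced_mm, 0.0, arc[-1])          # stay within the reconstruction
    i = min(np.searchsorted(arc, s, side="right") - 1, len(chords) - 1)
    t = (s - arc[i]) / np.linalg.norm(chords[i])    # interpolate within one chord
    return centerline[i] + t * chords[i]

# Example: roller rotation (or sensed device markings) reports 7 mm advanced.
line = np.array([[0, 0, 0], [0, 0, 5], [0, 3, 9]], dtype=float)
print(np.round(position_from_advancement(line, 7.0), 2))   # -> [0.  1.2 6.6]
```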
As shown in
Operation
As shown in
Although the system might accept other commands from the user, knowing that the device is located in section P2, the system would not accept commands identifying sections other than P1, P2, and M1, because the user would not specify a section that was not the same as or adjacent to the section where the device is presently located. Alternatively, or in addition, the system could include a validation procedure such that after the user orally identifies a section, e.g., “M1”, the label on the display 20 is highlighted (for example by a color change) so that the user can confirm or reject the selection, such as by saying “ACCEPT” or “REJECT” or another appropriate command.
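The highlight-and-confirm flow could be sketched as a small state holder, as below. The display call is a stand-in for highlighting the label on display 20, and the class and method names are invented for illustration.

```python
# Highlight-and-confirm validation: a spoken section name is held pending
# until the user says "ACCEPT" or "REJECT".
class PendingSelection:
    def __init__(self):
        self.section = None

    def propose(self, section: str) -> None:
        self.section = section
        print(f"[display 20] label {section} highlighted")  # e.g. color change

    def resolve(self, word: str):
        if self.section is None:
            return None
        if word.upper() == "ACCEPT":
            chosen, self.section = self.section, None
            return chosen             # now safe to orient toward this section
        if word.upper() == "REJECT":
            self.section = None       # discard and await a new section name
        return None

pending = PendingSelection()
pending.propose("M1")
print(pending.resolve("ACCEPT"))      # -> "M1"
```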
The direction associated with each section can be determined in a number of ways. The direction may be the average or weighted-average direction of the centerline. The direction may alternatively be the direction at one of the two ends of the segment (and which end may vary depending on whether the device is being advanced or retracted). The direction may also be the direction of the centerline at its midpoint. A direction on the centerline is convenient, because it is usually known from the reconstruction of the imaging data. Of course, the method of determining the direction associated with each segment does not have to be the same for each segment, and this direction can be determined with different methods depending, for example, on the curvature or rate of change of curvature of the lumen, the direction of travel, or other factors. In some cases imaging data may be spotty, and the direction associated with a particular segment may be based upon the adjacent segments. For example, it could be based on the directions associated with the adjacent segments, or on the directions at the endpoints of the adjacent segments.
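A sketch of the adjacent-segment fallback follows. It assumes at least one neighboring segment has a known direction, and the simple averaging shown is just one of the composite methods this paragraph contemplates.

```python
import numpy as np

def direction_with_fallback(directions: list, i: int) -> np.ndarray:
    """Direction for segment i; where imaging data was too spotty to compute
    one (None), average the adjacent segments' directions instead."""
    if directions[i] is not None:
        return directions[i]
    neighbors = [d for d in (directions[i - 1] if i > 0 else None,
                             directions[i + 1] if i + 1 < len(directions) else None)
                 if d is not None]    # assumes at least one neighbor is known
    composite = np.sum(neighbors, axis=0)
    return composite / np.linalg.norm(composite)

dirs = [np.array([0.0, 0.0, 1.0]), None, np.array([0.0, 1.0, 0.0])]
print(np.round(direction_with_fallback(dirs, 1), 2))   # -> [0.   0.71 0.71]
```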
This application claims the benefit of U.S. Provisional Patent Application Ser. No. 60/678,320, filed May 6, 2005, the entire disclosure of which is incorporated herein by reference.