1. Field of the Invention
The present invention relates to a robotic system that moves a surgical instrument in response to voice commands from the user.
2. Description of Related Art
To reduce the invasiveness of surgery, endoscopes are commonly utilized to view the internal organs of a patient. One end of the endoscope contains a lens which is inserted into the patient through a small incision in the skin. The lens focuses an image that is transmitted by fiber optic cable to a camera located at the opposite end of the endoscope. The camera is coupled to a monitor that displays the image of the patient.
The endoscope can be used in conjunction with another surgical instrument that is inserted into the patient. An assistant typically holds the endoscope while the surgeon manipulates the surgical instrument. The assistant moves the endoscope in response to instructions from the surgeon. Any miscommunication between the surgeon and the assistant may result in an error in the movement of the endoscope, thereby requiring the surgeon to repeat the instruction. Additionally, holding the endoscope for a significant amount of time may cause the assistant to become fatigued.
U.S. application Ser. No. 07/927,801 discloses a robotic arm that holds and moves an endoscope. The surgeon can move the robotic arm by depressing a foot pedal. The foot pedal is connected to a computer which moves the arm and the scope. Although the '801 system effectively moves the endoscope, the surgeon must continually manipulate the foot pedal, a process which may detract from the surgical procedure. It would be desirable to provide a robotic endoscopic system that can be controlled by voice commands from the user.
The present invention is a robotic system which controls the movement of a surgical instrument in response to voice commands from the user. The robotic system has a computer controlled arm that holds the surgical instrument. The user provides voice commands to the computer through a microphone. The computer contains a phrase recognizer that matches the user's speech with words stored in the computer. Matched words are then processed to determine whether the user has spoken a robot command. If the user has spoken a recognized robot command, the computer will move the robotic arm in accordance with the command.
The objects and advantages of the present invention will become more readily apparent to those ordinarily skilled in the art after reviewing the following detailed description and accompanying drawings.
Referring to the drawings more particularly by reference numbers, the robotic arm assembly 16 is controlled by a computer 20. In the preferred embodiment, the robotic arm assembly 16 includes a linear actuator 24 fixed to the table 14. The linear actuator 24 is connected to a linkage arm assembly 26 and adapted to move the linkage assembly 26 along the z axis of a first coordinate system. The first coordinate system also has an x axis and a y axis.
The linkage arm assembly 26 includes a first linkage arm 28 attached to a first rotary actuator 30 and an end effector 32. The first rotary actuator 30 is adapted to rotate the first linkage arm 28 and end effector 32 in a plane perpendicular to the z axis (the x-y plane). The first rotary actuator 30 is connected to a second rotary actuator 34 by a second linkage arm 36. The second actuator 34 is adapted to rotate the first actuator 30 in the x-y plane. The second rotary actuator 34 is connected to the output shaft of the linear actuator 24. The actuators 24, 30 and 34 rotate in response to output signals provided by the computer 20.
The arm assembly may have a pair of passive joints that allow the end effector to be rotated in the direction indicated by the arrows. The actuators 24, 30 and 34, and joints of the arm may each have position sensors (not shown) that are connected to the computer 20. The sensors provide positional feedback signals of each corresponding arm component.
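The patent gives no kinematic equations for this arm, but the geometry just described, a prismatic z axis carrying a two-link planar arm, admits a conventional forward-kinematics sketch. The link lengths, angle names, and Python/numpy usage below are illustrative assumptions, not details from the patent.

```python
import numpy as np

# Illustrative link lengths (meters); the patent does not give dimensions.
L_ARM_2 = 0.30   # second linkage arm 36, driven by rotary actuator 34
L_ARM_1 = 0.25   # first linkage arm 28, driven by rotary actuator 30

def end_effector_position(z, theta34, theta30):
    """Forward kinematics for the arm as described: the linear actuator
    sets z, while the two rotary actuators position the end effector in
    the x-y plane. Angles are radians in the first coordinate system."""
    x = L_ARM_2 * np.cos(theta34) + L_ARM_1 * np.cos(theta34 + theta30)
    y = L_ARM_2 * np.sin(theta34) + L_ARM_1 * np.sin(theta34 + theta30)
    return np.array([x, y, z])
```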
The system has a microphone 40 that is connected to the computer 20. The system may also have a speaker 42 that is connected to the computer 20. The microphone 40 and speaker 42 may be mounted to a headset 44 that is worn by the user. Placing the microphone 40 in close proximity to the user reduces the amount of background noise provided to the computer and decreases the probability of an inadvertent input command.
The computer 20 includes a processor 78 coupled to a memory 80. Input signals from the foot pedal 50 are provided to the processor 78 through a multiplexer 74 and an analog to digital (A/D) converter 76.
The processor 78 is connected to an address decoder 82 and separate digital to analog (D/A) converters 84. Each D/A converter is connected to an actuator 24, 30 and 34. The D/A converters 84 provide analog output signals to the actuators in response to output signals received from the processor 78. The analog output signals have a sufficient voltage level to energize the electric motors and move the robotic arm assembly. The decoder 82 correlates the addresses provided by the processor with a corresponding D/A converter, so that the correct motor(s) is driven. The address decoder 82 also provides an address for the input data from the A/D converter 76 so that the data is associated with the correct input channel.
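The decoder's routing role can be pictured with a small sketch. The numeric addresses and the software framing below are invented for illustration; the patent describes this routing as dedicated hardware.

```python
# Hypothetical address map: the addresses are invented for illustration;
# the patent only states that the decoder pairs each processor address
# with one D/A converter.
ACTUATOR_CHANNELS = {
    0x10: "linear actuator 24",
    0x11: "first rotary actuator 30",
    0x12: "second rotary actuator 34",
}

def route_output(address, value):
    """Mimic the address decoder: select which actuator a command
    value is latched to before digital-to-analog conversion."""
    if address not in ACTUATOR_CHANNELS:
        raise ValueError(f"no D/A converter at address {address:#x}")
    return ACTUATOR_CHANNELS[address], value
```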
The computer 20 has a phrase recognizer 86 connected to the microphone 40 and the processor 78. The phrase recognizer 86 digitizes voice commands provided by the user through the microphone 40. The voice commands are then processed to convert the spoken words into electronic form. The electronic words are typically generated by matching the user's speech with words stored within the computer 20. In the preferred embodiment, the recognizer 86 is an electronic board with accompanying software that is marketed by Scott Instruments of Denton, Tex. under the trademark “Coretechs Technology”.
The electronic words are provided to the processor 78. The processor 78 compares a word, or a combination of words, to predefined robot commands that are stored within a library in the memory 80 of the computer 20. If a word, or combination of words, matches an entry in the library, the processor 78 provides output commands to the D/A converter 84 to move the robotic arm in accordance with the command.
If the spoken word is “AESOP”, the process continues to state 1. The process next determines whether the user has spoken a word that satisfies a condition to advance to states 2-6. These words include “move”, “step”, “save”, “return”, “speed”, “track instrument” and “track head”. The track instrument command is for a system that can move the endoscope to automatically track the movement of a second instrument inserted into the patient. The track head command may enable the system so that the endoscope movement tracks the user's eyes. For example, if the user looks to the right of the image displayed by the monitor, the robot will move the endoscope so that the image moves in a rightward direction. The move and step commands induce movement of the scope in a desired direction. The save command saves the position of the endoscope within the memory of the computer. The return command returns the scope to a saved position.
From states 2-6 the process determines whether the user has spoken words that meet each subsequent condition, and so on. When the required conditions have been met, the processor 78 provides an output command to the D/A converter 84 in accordance with the voice commands. For example, if the user says “AESOP move left”, the processor 78 will provide output commands to move the endoscope 12 so that the image displayed by the monitor moves in a leftward direction. The microphone 40, phrase recognizer 86 and grammar process essentially provide the same input function as the foot pedal 50, multiplexer 74 and A/D converter 76.
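The grammar process described above behaves like a small state machine keyed on the robot's name. Below is a minimal sketch assuming a name-verb-argument grammar; the direction vocabulary and the exact handling of states 2-6 are assumptions, since the patent names the states but not their full word lists.

```python
# Minimal sketch of the wake-word grammar "AESOP <verb> [<argument>]".
VERBS = {"move", "step", "save", "return", "speed",
         "track instrument", "track head"}
DIRECTIONS = {"left", "right", "up", "down", "in", "out"}  # assumed set

def parse_command(words):
    """Return a (verb, argument) tuple, or None if the utterance
    does not satisfy the grammar."""
    if not words or words[0].upper() != "AESOP":
        return None                         # state 0: await the name
    rest = [w.lower() for w in words[1:]]
    if not rest:
        return None                         # name heard, but no verb
    verb, args = rest[0], rest[1:]
    if verb == "track" and args:            # "track instrument"/"track head"
        verb, args = f"track {args[0]}", args[1:]
    if verb not in VERBS:
        return None
    if verb in ("move", "step"):            # movement requires a direction
        return (verb, args[0]) if args and args[0] in DIRECTIONS else None
    return (verb, args[0] if args else None)

# parse_command("AESOP move left".split()) -> ("move", "left")
```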
The processor 78 can also provide the user with feedback regarding the recognized command through the speaker 42 or the monitor 18. For example, when the user states “AESOP move right”, after processing the speech, the processor 78 can provide an audio message through the speaker 42, or a visual message on the monitor 18, “AESOP move right”. Additionally, the processor 78 can provide messages regarding system errors, or the present state of the system such as “speed is set for slow”.
To improve the effectiveness of the system 10, the system is constructed so that the desired movement of the surgical instrument correlates to a direction relative to the image displayed by the monitor. Thus when the surgeon commands the scope to move up, the scope always appears to move in the up direction. To accomplish this result, the processor 78 converts the desired movement of the end of the endoscope in the third coordinate system to coordinates in the second coordinate system, and then converts the coordinates of the second coordinate system into the coordinates of the first coordinate system.
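The two conversions can be sketched abstractly as composed rotations, with the actual matrices, which depend on the arm's joint angles, supplied by the caller. The function and frame names below are illustrative assumptions.

```python
import numpy as np

def camera_to_base(delta_image, R_cam_to_scope, R_scope_to_base):
    """Convert a desired motion expressed in the image (third)
    coordinate system into the fixed (first) coordinate system by
    passing through the scope (second) coordinate system."""
    delta_scope = R_cam_to_scope @ delta_image   # third -> second
    delta_base = R_scope_to_base @ delta_scope   # second -> first
    return delta_base

# Example: "move up" in the image is +y in the camera frame, so the
# base-frame motion would be camera_to_base(np.array([0., 1., 0.]), R1, R2)
# for rotation matrices R1, R2 derived from the arm's joint sensors.
```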
The desired movement of the endoscope is then converted from the second coordinate system to the first coordinate system by a second transformation matrix.
The surgical instrument is typically coupled to a camera and a viewing screen so that any spinning of the instrument about its own longitudinal axis will result in a corresponding rotation of the image on the viewing screen. Rotation of the instrument and viewing image may disorient the viewer. It is therefore desirable to maintain the orientation of the viewing image. In the preferred embodiment, the end effector has a worm gear (not shown) which rotates the surgical instrument about the longitudinal axis of the instrument. To insure proper orientation of the endoscope 12, the worm gear rotates the instrument 12 about its longitudinal axis an amount Δθ6 to insure that the y″ axis is oriented in the most vertical direction within the fixed coordinate system. Δθ6 is computed from the following cross-products.
Δθ6 = zi″ × (yo″ × yi″)
where:
zi″ = the unit vector of the z″ axis of the instrument in the second position;
yo″ = the unit vector of the y″ axis of the instrument in the original (first) position; and
yi″ = the unit vector of the y″ axis of the instrument in the second position.
The angles a5 and a6 are provided by position sensors. The vector yo″ is computed using the angles a5 and a6 of the instrument in the original or first position. For the computation of yi″ the angles a5 and a6 of the second position are used in the transformation matrix. After each arm movement yo″ is set to yi″ and a new yi″ vector and corresponding Δθ6 angle are computed and used to re-orient the endoscope. Using the above described algorithms, the worm gear continuously rotates the instrument about its longitudinal axis to insure that the pivotal movement of the endoscope does not cause a corresponding rotation of the viewing image.
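Read geometrically, the cross-product expression measures how far the present y″ axis has rolled about the instrument axis relative to the original y″ axis. The sketch below is one way to compute that signed roll angle; the projection-and-arctangent form is an assumption chosen for numerical robustness, not the patent's literal formula.

```python
import numpy as np

def roll_correction(z_i, y_o, y_i):
    """Signed roll angle (radians) that re-aligns the present y'' axis
    (y_i) with the original y'' axis (y_o) about the instrument's
    longitudinal axis (z_i, assumed to be a unit vector)."""
    # Project both y'' vectors onto the plane normal to the scope axis.
    y_o_p = y_o - np.dot(y_o, z_i) * z_i
    y_i_p = y_i - np.dot(y_i, z_i) * z_i
    # Signed angle between the projections, measured about z_i.
    sin_t = np.dot(z_i, np.cross(y_i_p, y_o_p))
    cos_t = np.dot(y_i_p, y_o_p)
    return np.arctan2(sin_t, cos_t)

# After each arm movement the caller would set y_o to the previous y_i,
# mirroring the update described in the text.
```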
The system may have a memory feature to store desired instrument positions within the patient. The memory feature may be enabled either by voice commands or through a button on an input device such as the foot pedal. When a save command is spoken, the coordinates of the end effector in the first coordinate system are saved in a dedicated address(es) of the computer memory. When a return command is spoken, the processor retrieves the data stored in memory and moves the end effector to the coordinates of the effector when the save command was enabled.
The memory feature allows the operator to store the coordinates of the end effector in a first position, move the end effector to a second position, and then return to the first position with a simple command. By way of example, the surgeon may take a wide eye view of the patient from a predetermined location and store the coordinates of that location in memory. Subsequently, the surgeon may manipulate the endoscope to enter cavities, etc., which provide a narrower view. The surgeon can rapidly move back to the wide eye view by merely stating “AESOP return to one”.
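A minimal sketch of the save/return bookkeeping follows; the slot names and the move_to callback are hypothetical stand-ins for the dedicated memory addresses and actuator commands described above.

```python
# Hypothetical memory-feature bookkeeping.
saved_positions = {}

def save_position(slot, current_coords):
    """Store the end effector's first-coordinate-system position."""
    saved_positions[slot] = tuple(current_coords)

def return_to(slot, move_to):
    """Command the arm back to a previously saved position.
    `move_to` stands in for the routine that drives the actuators."""
    move_to(saved_positions[slot])

# "AESOP save one"      -> save_position("one", (x, y, z))
# "AESOP return to one" -> return_to("one", move_to)
```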
In operation, the user provides spoken words to the microphone. The phrase recognizer 86 matches the user's speech with stored words and provides matched electronic words to the processor 78. The processor performs a grammar process to determine whether the spoken words are robot commands. If the words are commands, the computer energizes the actuators and moves the endoscope accordingly. The system also allows the user to control the movement of the endoscope with the foot pedal if voice commands are not desired.
While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that this invention not be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art.
This application is a continuation of application Ser. No. 08/310,665 filed Sep. 22, 1994, now U.S. Pat. No. 6,463,361.
U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---
977825 | Murphy | Dec 1910 | A |
3171549 | Orloff | Mar 1965 | A |
3280991 | Melton et al. | Oct 1966 | A |
4058001 | Waxman | Nov 1977 | A |
4128880 | Cray, Jr. | Dec 1978 | A |
4221997 | Flemming | Sep 1980 | A |
4367998 | Causer | Jan 1983 | A |
4401852 | Noso et al. | Aug 1983 | A |
4456961 | Price et al. | Jun 1984 | A |
4460302 | Moreau et al. | Jul 1984 | A |
4474174 | Petruzzi | Oct 1984 | A |
4491135 | Klein | Jan 1985 | A |
4503854 | Jako | Mar 1985 | A |
4517963 | Michel | May 1985 | A |
4523884 | Clement et al. | Jun 1985 | A |
4586398 | Yindra | May 1986 | A |
4604016 | Joyce | Aug 1986 | A |
4616637 | Caspari et al. | Oct 1986 | A |
4624011 | Watanabe et al. | Nov 1986 | A |
4633389 | Tanaka et al. | Dec 1986 | A |
4635292 | Mori et al. | Jan 1987 | A |
4641292 | Tunnell et al. | Feb 1987 | A |
4655257 | Iwashita | Apr 1987 | A |
4672963 | Barken | Jun 1987 | A |
4676243 | Clayman | Jun 1987 | A |
4728974 | Nio et al. | Mar 1988 | A |
4762455 | Coughlan et al. | Aug 1988 | A |
4791934 | Brunnett | Dec 1988 | A |
4791940 | Hirschfeld et al. | Dec 1988 | A |
4794912 | Lia | Jan 1989 | A |
4815006 | Andersson et al. | Mar 1989 | A |
4815450 | Patel | Mar 1989 | A |
4837734 | Ichikawa et al. | Jun 1989 | A |
4852083 | Niehaus et al. | Jul 1989 | A |
4853874 | Iwamoto et al. | Aug 1989 | A |
4854301 | Nakajima | Aug 1989 | A |
4860215 | Seraji | Aug 1989 | A |
4863133 | Bonnell | Sep 1989 | A |
4883400 | Kuban et al. | Nov 1989 | A |
4930494 | Takehana et al. | Jun 1990 | A |
4945479 | Rusterholz et al. | Jul 1990 | A |
4949717 | Shaw | Aug 1990 | A |
4954952 | Ubhayakar et al. | Sep 1990 | A |
4965417 | Massie | Oct 1990 | A |
4969709 | Sogawa et al. | Nov 1990 | A |
4969890 | Sugita et al. | Nov 1990 | A |
4979933 | Runge | Dec 1990 | A |
4979949 | Matsen, III et al. | Dec 1990 | A |
4980626 | Hess et al. | Dec 1990 | A |
4989253 | Liang et al. | Jan 1991 | A |
4996975 | Nakamura | Mar 1991 | A |
5019968 | Wang et al. | May 1991 | A |
5020001 | Yamamoto et al. | May 1991 | A |
5065741 | Uchiyama et al. | Nov 1991 | A |
5078140 | Kwoh | Jan 1992 | A |
5086401 | Glassman et al. | Feb 1992 | A |
5091656 | Gahn | Feb 1992 | A |
5097829 | Quisenberry | Mar 1992 | A |
5097839 | Allen | Mar 1992 | A |
5098426 | Sklar et al. | Mar 1992 | A |
5105367 | Tsuchihashi et al. | Apr 1992 | A |
5109499 | Inagami et al. | Apr 1992 | A |
5123095 | Papadopoulos et al. | Jun 1992 | A |
5131105 | Harrawood et al. | Jul 1992 | A |
5142930 | Allen et al. | Sep 1992 | A |
5145227 | Monford, Jr. | Sep 1992 | A |
5166513 | Keenan et al. | Nov 1992 | A |
5175694 | Amato | Dec 1992 | A |
5182641 | Diner et al. | Jan 1993 | A |
5184601 | Putman | Feb 1993 | A |
5187574 | Kosemura et al. | Feb 1993 | A |
5196688 | Hesse et al. | Mar 1993 | A |
5201325 | McEwen et al. | Apr 1993 | A |
5201743 | Haber et al. | Apr 1993 | A |
5217003 | Wilk | Jun 1993 | A |
5221283 | Chang | Jun 1993 | A |
5228429 | Hatano | Jul 1993 | A |
5230623 | Guthrie et al. | Jul 1993 | A |
5236432 | Matsen, III et al. | Aug 1993 | A |
5251127 | Raab | Oct 1993 | A |
5257999 | Slanetz, Jr. | Nov 1993 | A |
5271384 | McEwen et al. | Dec 1993 | A |
5279309 | Taylor et al. | Jan 1994 | A |
5282806 | Haber | Feb 1994 | A |
5289273 | Lang | Feb 1994 | A |
5289365 | Caldwell et al. | Feb 1994 | A |
5299288 | Glassman et al. | Mar 1994 | A |
5300926 | Stoeckl | Apr 1994 | A |
5303148 | Mattson et al. | Apr 1994 | A |
5304185 | Taylor | Apr 1994 | A |
5305203 | Raab | Apr 1994 | A |
5305427 | Nagata | Apr 1994 | A |
5309717 | Minch | May 1994 | A |
5313306 | Kuban et al. | May 1994 | A |
5320630 | Ahmed | Jun 1994 | A |
5337732 | Grundfest et al. | Aug 1994 | A |
5339799 | Kami et al. | Aug 1994 | A |
5343385 | Joskowicz et al. | Aug 1994 | A |
5343391 | Mushabac | Aug 1994 | A |
5345538 | Narayannan et al. | Sep 1994 | A |
5357962 | Green | Oct 1994 | A |
5368015 | Wilk | Nov 1994 | A |
5368428 | Hussey et al. | Nov 1994 | A |
5371536 | Yamaguchi | Dec 1994 | A |
5382885 | Salcudean et al. | Jan 1995 | A |
5388987 | Badoz et al. | Feb 1995 | A |
5395369 | McBrayer et al. | Mar 1995 | A |
5397323 | Taylor et al. | Mar 1995 | A |
5402801 | Taylor | Apr 1995 | A |
5403319 | Matsen, III et al. | Apr 1995 | A |
5408409 | Glassman et al. | Apr 1995 | A |
5410638 | Colgate et al. | Apr 1995 | A |
5417210 | Funda et al. | May 1995 | A |
5417701 | Holmes | May 1995 | A |
5422521 | Neer et al. | Jun 1995 | A |
5431645 | Smith et al. | Jul 1995 | A |
5434457 | Josephs et al. | Jul 1995 | A |
5436542 | Petelin et al. | Jul 1995 | A |
5442728 | Kaufman et al. | Aug 1995 | A |
5443484 | Kirsch et al. | Aug 1995 | A |
5445166 | Taylor | Aug 1995 | A |
5451924 | Massimino et al. | Sep 1995 | A |
5455766 | Scheller et al. | Oct 1995 | A |
5458547 | Teraoka et al. | Oct 1995 | A |
5458574 | Machold et al. | Oct 1995 | A |
5476010 | Fleming et al. | Dec 1995 | A |
5490117 | Oda et al. | Feb 1996 | A |
5490843 | Hildwein et al. | Feb 1996 | A |
5506912 | Nagasaki et al. | Apr 1996 | A |
5512919 | Araki | Apr 1996 | A |
5515478 | Wang | May 1996 | A |
5553198 | Wang et al. | Sep 1996 | A |
5571110 | Matsen, III et al. | Nov 1996 | A |
5572999 | Funda et al. | Nov 1996 | A |
5609560 | Ichikawa et al. | Mar 1997 | A |
5626595 | Sklar et al. | May 1997 | A |
5629594 | Jacobus et al. | May 1997 | A |
5630431 | Taylor | May 1997 | A |
5631973 | Green | May 1997 | A |
5657429 | Wang et al. | Aug 1997 | A |
5658250 | Blomquist et al. | Aug 1997 | A |
5695500 | Taylor et al. | Dec 1997 | A |
5696837 | Green | Dec 1997 | A |
5735290 | Sterman et al. | Apr 1998 | A |
5749362 | Funda et al. | May 1998 | A |
5754741 | Wang et al. | May 1998 | A |
5776126 | Wilk et al. | Jul 1998 | A |
5779623 | Bonnell | Jul 1998 | A |
5800423 | Jensen | Sep 1998 | A |
5807284 | Foxlin | Sep 1998 | A |
5808665 | Green | Sep 1998 | A |
5813813 | Daum et al. | Sep 1998 | A |
5817084 | Jensen | Oct 1998 | A |
5859934 | Green | Jan 1999 | A |
5878193 | Wang et al. | Mar 1999 | A |
5931832 | Jensen | Aug 1999 | A |
5950629 | Taylor et al. | Sep 1999 | A |
6024695 | Taylor et al. | Feb 2000 | A |
6201984 | Funda et al. | Mar 2001 | B1 |
Foreign Patent Documents

Number | Date | Country
---|---|---
U 9204118.3 | Jul 1992 | DE |
4310842 | Jan 1995 | DE |
0239409 | Sep 1987 | EP |
0424687 | May 1991 | EP |
0776738 | Jun 1997 | EP |
WO 9104711 | Apr 1991 | WO |
WO 9220295 | Nov 1992 | WO |
WO 9313916 | Jul 1993 | WO |
WO 9418881 | Sep 1994 | WO |
WO 9426167 | Nov 1994 | WO |
WO 9715240 | May 1997 | WO |
WO 9825666 | Jun 1998 | WO |
Prior Publication Data

Number | Date | Country
---|---|---
20020183894 A1 | Dec 2002 | US
Related U.S. Application Data

Relation | Number | Date | Country
---|---|---|---
Parent | 08310665 | Sep 1994 | US
Child | 10095488 | | US