Speech interface for an automated endoscopic system

Information

  • Patent Grant
  • Patent Number
    6,463,361
  • Date Filed
    Thursday, September 22, 1994
  • Date Issued
    Tuesday, October 8, 2002
Abstract
A robotic system which controls the movement of a surgical instrument in response to voice commands from the user. The robotic system has a computer controlled arm that holds the surgical instrument. The user provides voice commands to the computer through a microphone. The computer contains a phrase recognizer that matches the user's speech with words stored in the computer. Matched words are then processed to determine whether the user has spoken a robot command. If the user has spoken a recognized robot command, the computer will move the robotic arm in accordance with the command.
Description




BACKGROUND OF THE INVENTION




1. Field of the Invention




The present invention relates to a robotic system that moves a surgical instrument in response to voice commands from the user.




2. Description of Related Art




To reduce the invasiveness of surgery, endoscopes are commonly utilized to view the internal organs of a patient. One end of the endoscope contains a lens which is inserted into the patient through a small incision in the skin. The lens focuses an image that is transmitted by fiber optic cable to a camera located at the opposite end of the endoscope. The camera is coupled to a monitor that displays the image of the patient.




The endoscope can be used in conjunction with another surgical instrument that is inserted into the patient. An assistant typically holds the endoscope while the surgeon manipulates the surgical instrument. The assistant moves the endoscope in response to instructions from the surgeon. Any miscommunication between the surgeon and the assistant may result in an error in the movement of the endoscope, thereby requiring the surgeon to repeat the instruction. Additionally, holding the endoscope for a significant amount of time may cause the assistant to become fatigued.




U.S. application Ser. No. 07/927,801 discloses a robotic arm that holds and moves an endoscope. The surgeon can move the robotic arm by depressing a foot pedal. The foot pedal is connected to a computer which moves the arm and the scope. Although the '801 system effectively moves the endoscope, the surgeon must continually manipulate the foot pedal, a process which may detract from the surgical procedure. It would be desirable to provide a robotic endoscopic system that can be controlled by voice commands from the user.




SUMMARY OF THE INVENTION




The present invention is a robotic system which controls the movement of a surgical instrument in response to voice commands from the user. The robotic system has a computer controlled arm that holds the surgical instrument. The user provides voice commands to the computer through a microphone. The computer contains a phrase recognizer that matches the user's speech with words stored in the computer. Matched words are then processed to determine whether the user has spoken a robot command. If the user has spoken a recognized robot command, the computer will move the robotic arm in accordance with the command.











BRIEF DESCRIPTION OF THE DRAWINGS




The objects and advantages of the present invention will become more readily apparent to those ordinarily skilled in the art after reviewing the following detailed description and accompanying drawings, wherein:





FIG. 1 is a perspective view of a robotic endoscope system of the present invention;

FIG. 2 is a schematic of an endoscope within two separate coordinate systems;

FIG. 3 is a top view of a foot pedal;

FIG. 4 is a schematic of a computer system;

FIG. 5 is a schematic of a grammar process;

FIG. 6 is a schematic of a robotic arm.











DETAILED DESCRIPTION OF THE INVENTION




Referring to the drawings more particularly by reference numbers, FIG. 1 shows a robotic system 10 of the present invention. The system 10 is typically used in a sterile operating room where a surgeon performs a surgical procedure on a patient. The patient is placed on an operating table 12. Attached to the table 12 is a robotic arm assembly 14 which can move a surgical instrument 16 relative to the table 12 and the patient. The surgical instrument 16 is typically an endoscope which is inserted into the abdomen of the patient. The endoscope 16 enters the patient through a cannula, wherein the scope 16 rotates about a cannula pivot point. The endoscope is typically connected to a monitor 18 which allows the surgeon to view the organs, etc. of the patient. Although an endoscope is described and shown, it is to be understood that the present invention can be used with other surgical instruments.




The robotic arm assembly 14 is controlled by a computer 20. In the preferred embodiment, the robotic arm assembly 14 includes a linear actuator 24 fixed to the table 12. The linear actuator 24 is connected to a linkage arm assembly 26 and adapted to move the linkage assembly 26 along the z axis of a first coordinate system. The first coordinate system also has an x axis and a y axis.




The linkage arm assembly 26 includes a first linkage arm 28 attached to a first rotary actuator 30 and an end effector 32. The first rotary actuator 30 is adapted to rotate the first linkage arm 28 and end effector 32 in a plane perpendicular to the z axis (x-y plane). The first rotary actuator 30 is connected to a second rotary actuator 34 by a second linkage arm 36. The second actuator 34 is adapted to rotate the first actuator 30 in the x-y plane. The second rotary actuator 34 is connected to the output shaft of the linear actuator 24. The actuators 24, 30 and 34 rotate in response to output signals provided by the computer 20. As shown in FIG. 2, the junction of the endoscope 16 and the end effector 32 defines a second coordinate system which has an x′ axis, a y′ axis and a z′ axis. The junction of the end effector 32 and endoscope 16 also defines the origin of a third coordinate system which has an x″ axis, a y″ axis and a z″ axis. The z″ axis is parallel with the longitudinal axis of the endoscope 16.




The arm assembly may have a pair of passive joints that allow the end effector to be rotated in the direction indicated by the arrows. The actuators 24, 30 and 34, and the joints of the arm, may each have position sensors (not shown) that are connected to the computer 20. The sensors provide positional feedback signals of each corresponding arm component.




The system has a microphone 40 that is connected to the computer 20. The system may also have a speaker 42 that is connected to the computer 20. The microphone 40 and speaker 42 may be mounted to a headset 44 that is worn by the user. Placing the microphone 40 in close proximity to the user reduces the amount of background noise provided to the computer and decreases the probability of an inadvertent input command.




As shown in FIG. 3, the system may also have a foot pedal 50. The foot pedal 50 has a housing 56 that supports a pair of outer first foot switches 58 and a second foot switch 60. One outer foot switch 58 has a first pressure transducer 62 and the other switch has a second pressure transducer 64. The second foot switch 60 has third 66, fourth 68, fifth 70 and sixth 72 pressure transducers. The transducers are each connected to a corresponding operational amplifier that provides a voltage input to the computer 20. The pressure transducers 62-72 are preferably constructed so that the resistance of each transducer decreases as the surgeon increases the pressure on the foot switches. Such a transducer is sold by Interlink Electronics. The decreasing transducer resistance increases the input voltage provided to the computer 20 from the operational amplifier. Each transducer corresponds to a predetermined direction within the image displayed by the monitor. In the preferred embodiment, the first pressure transducer 62 corresponds to moving the endoscope toward the image viewed by the surgeon. The second transducer 64 moves the scope away from the image. The third 66 and fourth 68 transducers move the image “up” and “down”, respectively, and the fifth 70 and sixth 72 transducers move the image “left” and “right”, respectively. The pedal may have a button 73 that enables the foot pedal 50 and disables the voice command feature, or vice versa.
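
In software, the mapping from pedal transducers to scope motion described above might be sketched as follows. This is a minimal illustration rather than the patent's implementation; the channel numbering, the threshold value and the read_voltage helper are assumptions.

    # Sketch of the foot-pedal mapping: each pressure transducer (62-72)
    # selects one direction of scope/image motion, and a harder press
    # yields a higher op-amp voltage (i.e., faster motion).
    # Channel numbers, THRESHOLD and read_voltage() are illustrative only.

    DIRECTIONS = {
        0: "in",     # transducer 62: move the scope toward the image
        1: "out",    # transducer 64: move the scope away from the image
        2: "up",     # transducer 66
        3: "down",   # transducer 68
        4: "left",   # transducer 70
        5: "right",  # transducer 72
    }

    THRESHOLD = 0.5  # volts; channels below this are treated as unpressed

    def pedal_command(read_voltage):
        """Return (direction, voltage) for the hardest-pressed switch, or None."""
        best = None
        for channel, direction in DIRECTIONS.items():
            v = read_voltage(channel)  # op-amp voltage for this channel
            if v > THRESHOLD and (best is None or v > best[1]):
                best = (direction, v)
        return best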





FIG. 4 shows a schematic of the computer 20. The computer 20 has a multiplexer 74 which is connected to the pressure transducers of the foot pedal 50 and the position sensors of the arm. The multiplexer 74 is connected to a single analog to digital (A/D) converter 76. The computer 20 also has a processor 78 and memory 80.




The processor 78 is connected to an address decoder 82 and separate digital to analog (D/A) converters 84. Each D/A converter is connected to an actuator 24, 30 and 34. The D/A converters 84 provide analog output signals to the actuators in response to output signals received from the processor 78. The analog output signals have a sufficient voltage level to energize the electric motors and move the robotic arm assembly. The decoder 82 correlates the addresses provided by the processor with a corresponding D/A converter, so that the correct motor(s) is driven. The address decoder 82 also provides an address for the input data from the A/D converter 76 so that the data is associated with the correct input channel.
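
The polled I/O arrangement of FIG. 4 (one multiplexer feeding a single A/D converter, with an address decoder selecting among the D/A converters) can be modeled as in the sketch below. All class and method names are hypothetical; the patent specifies the hardware, not its driver code.

    # Illustrative model of the FIG. 4 I/O scheme: multiplexer 74 feeds the
    # single A/D converter 76, and addresses decoded by 82 route processor
    # output to the D/A converter 84 that drives the selected actuator.

    class RobotIO:
        def __init__(self, adc, dacs, num_inputs):
            self.adc = adc              # shared A/D converter (has .select/.read)
            self.dacs = dacs            # dict: actuator address -> D/A converter
            self.num_inputs = num_inputs

        def read_inputs(self):
            """Poll every multiplexed channel through the one A/D converter."""
            samples = {}
            for channel in range(self.num_inputs):
                self.adc.select(channel)      # set the multiplexer address
                samples[channel] = self.adc.read()
            return samples

        def drive_actuator(self, address, voltage):
            """Route an output voltage to the addressed D/A converter."""
            self.dacs[address].write(voltage)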




The computer 20 has a phrase recognizer 86 connected to the microphone 40 and the processor 78. The phrase recognizer 86 digitizes voice commands provided by the user through the microphone 40. The voice commands are then processed to convert the spoken words into electronic form. The electronic words are typically generated by matching the user's speech with words stored within the computer 20. In the preferred embodiment, the recognizer 86 is an electronic board with accompanying software that is marketed by Scott Instruments of Denton, Texas under the trademark “Coretechs Technology”.




The electronic words are provided to the processor 78. The processor 78 compares a word, or a combination of words, to predefined robot commands that are stored within a library in the memory 80 of the computer 20. If a word, or combination of words, matches a word or combination of words in the library, the processor 78 provides output commands to the D/A converter 84 to move the robotic arm in accordance with the command.





FIG. 5 shows exemplary words and combinations of words that provide robot commands. A grammar process is performed to determine whether the voice commands satisfy certain conditions. The process contains a number of states advanced by the satisfaction of a condition. If the voice command provided by the user satisfies a first condition, the process proceeds to the first state. If a condition of a next state is satisfied, the process proceeds to that state, and so on. For example, to prevent a robot command from being inadvertently spoken, it is desirable to predicate all voice commands with a qualifier. The qualifier may be a name given to the robot, such as “AESOP”. Therefore, when the user provides a voice command, the process initially determines whether the spoken word is AESOP. If the spoken word is not AESOP, the process ends. The term “stop” may be an exception to this rule, wherein the computer will stop arm movement when the user provides a simple “stop” voice command.




If the spoken word is AESOP, the process continues to state 1. The process next determines whether the user has spoken a word that satisfies a condition to advance to states 2-6. These words include “move”, “step”, “save”, “return”, “speed”, “track instrument” and “track head”. The track instrument command is for a system which has the ability to move an endoscope to automatically track the movement of a second instrument that is inserted into the patient. The track head command may enable the system so that the endoscope movement tracks the user's eyes. For example, if the user looks to the right of the image displayed by the monitor, the robot will move the endoscope to move the image in a rightward direction. The move and step commands induce movement of the scope in a desired direction. The save command saves the position of the endoscope within the memory of the computer. The return command will return the scope to a saved position.




From states 2-6 the process will determine whether the user has spoken words that meet the next condition, and so forth. When a certain number of conditions have been met, the processor 78 will provide an output command to the D/A converter 84 in accordance with the voice commands. For example, if the user says “AESOP move left”, the processor 78 will provide output commands to move the endoscope 16 so that the image displayed by the monitor moves in a leftward direction. The microphone 40, phrase recognizer 86 and grammar process essentially provide the same input function as the foot pedal 50, multiplexer 74 and A/D converter 76.
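
The grammar process can be modeled as a small state machine: qualifier first, then a command word, then an optional argument. The sketch below follows the behavior and example vocabulary given above; since the patent describes the states only by example, this is an illustration rather than the actual grammar.

    # Sketch of the FIG. 5 grammar process: commands must be prefaced by the
    # qualifier "AESOP", "stop" is accepted on its own, and recognized word
    # sequences such as "AESOP move left" yield a robot command.

    QUALIFIER = "aesop"
    COMMANDS = {"move", "step", "save", "return", "speed",
                "track instrument", "track head"}

    def grammar_process(words):
        """Return a (command, argument) tuple, "stop", or None if unrecognized."""
        words = [w.lower() for w in words]
        if words == ["stop"]:                  # exception: no qualifier needed
            return "stop"
        if not words or words[0] != QUALIFIER:
            return None                        # qualifier absent: process ends
        rest = words[1:]
        if len(rest) >= 2 and " ".join(rest[:2]) in COMMANDS:
            return (" ".join(rest[:2]), None)  # e.g. "track head"
        if rest and rest[0] in COMMANDS:
            return (rest[0], rest[1] if len(rest) > 1 else None)
        return None

    assert grammar_process(["AESOP", "move", "left"]) == ("move", "left")
    assert grammar_process(["move", "left"]) is None   # ignored: no qualifier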




The processor 78 can also provide the user with feedback regarding the recognized command through the speaker 42 or the monitor 18. For example, when the user states “AESOP move right”, after processing the speech, the processor 78 can provide an audio message through the speaker 42, or a visual message on the monitor 18, “AESOP move right”. Additionally, the processor 78 can provide messages regarding system errors, or the present state of the system, such as “speed is set for slow”.




Referring to FIG. 6, the processor 78 typically computes the movement of the robotic arm assembly 14 in accordance with the following equations:










a3 = π − cos⁻¹[(x² + y² − L1² − L2²) / (−2·L1·L2)]

Δ = cos⁻¹[(x² + y² + L1² − L2²) / (2·L1·√(x² + y²))]

a0 = tan⁻¹(y/x) (computed as the two-argument arctangent of y and x)

a2 = a0 ± Δ    (1)













where:

a2 = angle between the second linkage arm 36 and the x axis.

a3 = angle between the first linkage arm 28 and the longitudinal axis of the second linkage arm 36.

L1 = length of the second linkage arm.

L2 = length of the first linkage arm.

x = x coordinate of the end effector in the first coordinate system.

y = y coordinate of the end effector in the first coordinate system.




To move the end effector to a new location in the x-y plane, the processor 78 computes the change in the angles a2 and a3 and then provides output signals to move the actuators accordingly. The original angular position of the end effector is provided to the processor 78 by the position sensors. The processor moves the linkage arms through an angle that corresponds to the difference between the new location and the original location of the end effector. A differential angle Δa2 corresponds to the amount of angular displacement provided by the second actuator 34, and a differential angle Δa3 corresponds to the amount of angular displacement provided by the first actuator 30.
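
For illustration, equations (1) and the differential-angle step might be computed as below. This is a sketch under the definitions given above, with the elbow branch of a0 ± Δ made an explicit parameter; the function names are mine, not the patent's.

    import math

    def arm_angles(x, y, L1, L2, elbow=1):
        """Equations (1): end effector position (x, y) -> a2 (second linkage
        arm vs. the x axis) and a3 (first linkage arm vs. the longitudinal
        axis of the second linkage arm)."""
        r2 = x * x + y * y
        a3 = math.pi - math.acos((r2 - L1 * L1 - L2 * L2) / (-2.0 * L1 * L2))
        delta = math.acos((r2 + L1 * L1 - L2 * L2) / (2.0 * L1 * math.sqrt(r2)))
        a0 = math.atan2(y, x)
        a2 = a0 + elbow * delta          # one of the two elbow configurations
        return a2, a3

    def differential_angles(old_xy, new_xy, L1, L2):
        """da2 commanded to the second actuator 34 and da3 commanded to the
        first actuator 30, from the old and new end effector locations."""
        a2_old, a3_old = arm_angles(old_xy[0], old_xy[1], L1, L2)
        a2_new, a3_new = arm_angles(new_xy[0], new_xy[1], L1, L2)
        return a2_new - a2_old, a3_new - a3_old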




To improve the effectiveness of the system 10, the system is constructed so that the desired movement of the surgical instrument correlates to a direction relative to the image displayed by the monitor. Thus, when the surgeon commands the scope to move up, the scope always appears to move in the up direction. To accomplish this result, the processor 78 converts the desired movement of the end of the endoscope in the third coordinate system to coordinates in the second coordinate system, and then converts the coordinates of the second coordinate system into the coordinates of the first coordinate system.




Referring to FIG. 2, the desired movement of the endoscope is converted from the third coordinate system to the second coordinate system by using the following transformation matrix:

⎡Δx′⎤   ⎡ cos(a6)           0         −sin(a6)        ⎤ ⎡Δx″⎤
⎢Δy′⎥ = ⎢−sin(a5)·sin(a6)   cos(a5)   −sin(a5)·cos(a6)⎥ ⎢Δy″⎥    (2)
⎣Δz′⎦   ⎣ cos(a5)·sin(a6)   sin(a5)    cos(a5)·cos(a6)⎦ ⎣Δz″⎦













where:

Δx″ = the desired incremental movement of the scope along the x″ axis of the third coordinate system.

Δy″ = the desired incremental movement of the scope along the y″ axis of the third coordinate system.

Δz″ = the desired incremental movement of the scope along the z″ axis of the third coordinate system.

a5 = the angle between the z′ axis and the scope in the y′-z′ plane.

a6 = the angle between the z′ axis and the scope in the x′-z′ plane.

Δx′ = the computed incremental movement of the scope along the x′ axis of the second coordinate system.

Δy′ = the computed incremental movement of the scope along the y′ axis of the second coordinate system.

Δz′ = the computed incremental movement of the scope along the z′ axis of the second coordinate system.




The angles a5 and a6 are provided by position sensors located on the end effector 32. The angles a5 and a6 are shown in FIG. 2.




The desired movement of the endoscope is converted from the second coordinate system to the first coordinate system by using the following transformation matrix:

⎡Δx⎤   ⎡cos(π)   −sin(π)   0⎤ ⎡Δx′⎤
⎢Δy⎥ = ⎢sin(π)    cos(π)   0⎥ ⎢Δy′⎥    (3)
⎣Δz⎦   ⎣   0         0     1⎦ ⎣Δz′⎦













where:

Δx′ = the computed incremental movement of the scope along the x′ axis of the second coordinate system.

Δy′ = the computed incremental movement of the scope along the y′ axis of the second coordinate system.

Δz′ = the computed incremental movement of the scope along the z′ axis of the second coordinate system.

π = the angle between the first linkage arm and the x axis of the first coordinate system (here π denotes an angle variable, not the mathematical constant).

Δx = the computed incremental movement of the scope along the x axis of the first coordinate system.

Δy = the computed incremental movement of the scope along the y axis of the first coordinate system.

Δz = the computed incremental movement of the scope along the z axis of the first coordinate system.




The incremental movements Δx and Δy are inserted into the algorithms described above for computing the angular movements (Δa2 and Δa3) of the robotic arm assembly to determine the amount of rotation that is to be provided by each electric motor. The value Δz is used to determine the amount of linear movement provided by the linear actuator 24.
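
Taken together, equations (2) and (3) convert an image-relative motion request into base-frame increments. A sketch of that chain follows, using numpy; the parameter phi stands in for the angle that the patent writes as π in equation (3).

    import numpy as np

    def third_to_second(d3, a5, a6):
        """Equation (2): (dx'', dy'', dz'') in the scope-fixed third system
        -> (dx', dy', dz') in the second coordinate system."""
        c5, s5 = np.cos(a5), np.sin(a5)
        c6, s6 = np.cos(a6), np.sin(a6)
        R = np.array([[ c6,       0.0, -s6],
                      [-s5 * s6,  c5,  -s5 * c6],
                      [ c5 * s6,  s5,   c5 * c6]])
        return R @ d3

    def second_to_first(d2, phi):
        """Equation (3): rotate about the z axis by phi, the angle between
        the first linkage arm and the x axis (written as pi in the patent)."""
        c, s = np.cos(phi), np.sin(phi)
        R = np.array([[c,  -s,  0.0],
                      [s,   c,  0.0],
                      [0.0, 0.0, 1.0]])
        return R @ d2

    # dx and dy feed the equation (1) angle computation; dz sets the stroke
    # of the linear actuator 24.
    dx, dy, dz = second_to_first(
        third_to_second(np.array([0.0, 0.0, 1.0]), a5=0.1, a6=0.2), phi=0.3)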




The surgical instrument is typically coupled to a camera and a viewing screen so that any spinning of the instrument about its own longitudinal axis will result in a corresponding rotation of the image on the viewing screen. Rotation of the instrument and viewing image may disorient the viewer. It is therefore desirable to maintain the orientation of the viewing image. In the preferred embodiment, the end effector has a worm gear (not shown) which rotates the surgical instrument about the longitudinal axis of the instrument. To insure proper orientation of the endoscope 16, the worm gear rotates the instrument 16 about its longitudinal axis an amount Δθ6 to insure that the y″ axis is oriented in the most vertical direction within the fixed coordinate system. Δθ6 is computed from the following cross-products:






Δθ6 = zi″ · (yo″ × yi″)








where:

Δθ6 = the angle that the instrument is to be rotated about the z″ axis.

yo″ = the vector orientation of the y″ axis when the instrument is in the first position.

yi″ = the vector orientation of the y″ axis when the instrument is in the second position.

zi″ = the vector orientation of the z″ axis when the instrument is in the second position.




The vectors of the yi″ and zi″ axes are computed with the following algorithms:







zi″ = ⎡ cos(a6)           0         −sin(a6)        ⎤ ⎡0⎤
      ⎢−sin(a5)·sin(a6)   cos(a5)   −sin(a5)·cos(a6)⎥ ⎢0⎥
      ⎣ cos(a5)·sin(a6)   sin(a5)    cos(a5)·cos(a6)⎦ ⎣1⎦

xi″ = z × zi″

yi″ = zi″ × xi″













where:

a5 = the angle between the instrument and the z axis in the y-z plane.

a6 = the angle between the instrument and the z axis in the x-z plane.

z = the unit vector of the z axis in the first coordinate system.




The angles a5 and a6 are provided by position sensors. The vector yo″ is computed using the angles a5 and a6 of the instrument in the original or first position. For the computation of yi″, the angles a5 and a6 of the second position are used in the transformation matrix. After each arm movement, yo″ is set to yi″, and a new yi″ vector and corresponding Δθ6 angle are computed and used to re-orient the endoscope. Using the above described algorithms, the worm gear continuously rotates the instrument about its longitudinal axis to insure that the pivotal movement of the endoscope does not cause a corresponding rotation of the viewing image.
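
Written out, this orientation correction might look like the following sketch. It assumes the dot-product reading of the Δθ6 formula (a scalar triple product, which approximates the roll angle for the small incremental movements involved) and is illustrative only.

    import numpy as np

    Z = np.array([0.0, 0.0, 1.0])   # unit z vector of the first system

    def zi_vector(a5, a6):
        """z'' axis in base coordinates: the rotation of equation (2)
        applied to (0, 0, 1), per the zi'' algorithm above."""
        c5, s5 = np.cos(a5), np.sin(a5)
        c6, s6 = np.cos(a6), np.sin(a6)
        return np.array([-s6, -s5 * c6, c5 * c6])

    def yi_vector(a5, a6):
        """y'' axis via the cross products xi'' = z x zi'', yi'' = zi'' x xi''."""
        zi = zi_vector(a5, a6)
        xi = np.cross(Z, zi)
        return np.cross(zi, xi)

    def delta_theta6(a5_old, a6_old, a5_new, a6_new):
        """Roll correction applied by the worm gear after an arm movement."""
        yo = yi_vector(a5_old, a6_old)          # y'' in the first position
        yi = yi_vector(a5_new, a6_new)          # y'' in the second position
        zi = zi_vector(a5_new, a6_new)
        return float(np.dot(zi, np.cross(yo, yi)))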




The system may have a memory feature to store desired instrument positions within the patient. The memory feature may be enabled either by voice commands or through a button on an input device such as the foot pedal. When a save command is spoken, the coordinates of the end effector in the first coordinate system are saved in a dedicated address(es) of the computer memory. When a return command is spoken, the processor retrieves the data stored in memory and moves the end effector to the coordinates it occupied when the save command was given.




The memory feature allows the operator to store the coordinates of the end effector in a first position, move the end effector to a second position and then return to the first position with a simple command. By way of example, the surgeon may take a wide eye view of the patient from a predetermined location and store the coordinates of that location in memory. Subsequently, the surgeon may manipulate the endoscope to enter cavities, etc. which provide a more narrow view. The surgeon can rapidly move back to the wide eye view by merely stating “AESOP return to one”.
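
A minimal model of this save/return behavior is sketched below; the named-position dictionary and the function names are assumptions for illustration, since the patent specifies only that coordinates are stored at dedicated memory addresses.

    # Illustrative save/return memory: "AESOP save one" stores the current
    # end effector coordinates; "AESOP return to one" commands a move back.

    saved_positions = {}

    def save_position(name, current_xyz):
        saved_positions[name] = tuple(current_xyz)

    def return_to_position(name, move_end_effector):
        """Retrieve the stored coordinates and move the arm back to them."""
        if name in saved_positions:
            move_end_effector(saved_positions[name])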




In operation, the user provides spoken words to the microphone. The phrase recognizer 86 matches the user's speech with stored words and provides matched electronic words to the processor 78. The processor performs a grammar process to determine whether the spoken words are robot commands. If the words are commands, the computer energizes the actuators and moves the endoscope accordingly. The system also allows the user to control the movement of the endoscope with a foot pedal if voice commands are not desired.




While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that this invention not be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art.



Claims
  • 1. A robotic system that controls a surgical instrument, comprising: a mechanism that can move the surgical instrument; an audio input device that receives a voice qualifier and a voice command from a user to move the surgical instrument, and provides an output signal that corresponds to the voice command; and, a controller that receives said output signal from said audio input device and provides a movement output signal to said mechanism to move said mechanism and the surgical instrument in accordance with the voice command only when the voice qualifier accompanies the voice command.
  • 2. The system as recited in claim 1, wherein said controller provides said movement output signal only when a plurality of voice commands are provided in a predetermined sequence.
  • 3. The system as recited in claim 1, wherein said mechanism is a robotic arm assembly.
  • 4. The system as recited in claim 1, further comprising a foot pedal coupled to said controller.
  • 5. The system as recited in claim 1, further comprising a speaker coupled to said controller to generate an audio message.
  • 6. The system as recited in claim 1, further comprising a monitor coupled to said controller to generate a visual message.
  • 7. The system as recited in claim 1, wherein said controller provides said movement output signal only when a plurality of voice commands are provided in a predetermined sequence.
  • 8. The system as recited in claim 1, wherein said mechanism is a robotic arm assembly.
  • 9. The system as recited in claim 1, further comprising a foot pedal coupled to said controller.
  • 10. The system as recited in claim 1, further comprising a speaker coupled to said controller to generate an audio message.
  • 11. The system as recited in claim 1, further comprising a monitor coupled to said controller to generate a visual message.
  • 12. A robotic system that controls a surgical instrument, comprising: a mechanism that can move the surgical instrument; audio input means for receiving a voice qualifier and a voice command from a user to move the surgical instrument and providing an output signal that corresponds to the voice command; and, controller means for receiving said output signal from said audio input means and providing a movement output signal to said mechanism to move said mechanism and the surgical instrument in accordance with the voice command only when the voice qualifier accompanies the voice command.
  • 13. The system as recited in claim 12, wherein said controller means provides said movement output signal only when a plurality of voice commands are provided in a predetermined sequence.
  • 14. The system as recited in claim 12, wherein said mechanism is a robotic arm assembly.
  • 15. The system as recited in claim 12, further comprising a foot pedal coupled to said controller means.
  • 16. The system as recited in claim 12, further comprising speaker means for generating an audio message.
  • 17. The system as recited in claim 12, further comprising monitor means for generating a visual message.
  • 18. A method for moving a surgical instrument, comprising the steps of: a) generating a voice qualifier and a voice command to move a mechanism that holds the surgical instrument; b) determining whether the voice qualifier matches a first robot command; c) determining whether the voice command matches a second robot command if the voice qualifier matched the first robot command; and, d) moving the surgical instrument with said mechanism only if the voice qualifier matches the first robot command and the voice command matches the second robot command.
  • 19. The method as recited in claim 18, wherein the surgical instrument is moved only when a plurality of voice commands are provided in a predetermined sequence.
  • 20. A medical robotic system that controls a surgical instrument, comprising: an actuator; a linkage attached to said actuator and that can be coupled to the surgical instrument; an audio input device that receives a voice qualifier and a voice command from a user; and, a controller that is coupled to said audio input device and provides a movement output signal to said actuator to move said linkage in accordance with the voice command only when the voice qualifier accompanies the voice command.
  • 21. A robotic system that controls a surgical instrument, comprising: a mechanism that can move the surgical instrument; audio input means for receiving a voice qualifier and a voice command from a user to move the surgical instrument; and, controller means for moving said mechanism in accordance with the voice command only when the voice qualifier accompanies the voice command.
  • 22. The system as recited in claim 21, wherein said controller means provides said movement output signal only when a plurality of voice commands are provided in a predetermined sequence.
  • 23. The system as recited in claim 21, wherein said mechanism is a robotic arm assembly.
  • 24. The system as recited in claim 21, further comprising a foot pedal coupled to said controller means.
  • 25. The system as recited in claim 21, further comprising speaker means for generating an audio message.
  • 26. The system as recited in claim 21, further comprising monitor means for generating a visual message.
US Referenced Citations (158)
Number Name Date Kind
977825 Murphy Dec 1910 A
3171549 Orloff Mar 1965 A
3280991 Melton et al. Oct 1966 A
4058001 Waxman Nov 1977 A
4128880 Cray, Jr. Dec 1978 A
4221997 Flemming Sep 1980 A
4367998 Causer Jan 1983 A
4401852 Noso et al. Aug 1983 A
4456961 Price et al. Jun 1984 A
4460302 Moreau et al. Jul 1984 A
4474174 Petruzzi Oct 1984 A
4491135 Klein Jan 1985 A
4503854 Jako Mar 1985 A
4517963 Michel May 1985 A
4523884 Clement et al. Jun 1985 A
4586398 Yindra May 1986 A
4604016 Joyce Aug 1986 A
4616637 Caspari et al. Oct 1986 A
4624011 Watanabe et al. Nov 1986 A
4633389 Tanaka et al. Dec 1986 A
4635292 Mori et al. Jan 1987 A
4641292 Tunnell et al. Feb 1987 A
4655257 Iwashita Apr 1987 A
4672963 Barken Jun 1987 A
4676243 Clayman Jun 1987 A
4728974 Nio et al. Mar 1988 A
4762455 Coughlan et al. Aug 1988 A
4791934 Brunnett Dec 1988 A
4791940 Hirschfeld et al. Dec 1988 A
4794912 Lia Jan 1989 A
4815006 Anderson et al. Mar 1989 A
4815450 Patel Mar 1989 A
4837734 Ichikawa et al. Jun 1989 A
4852083 Niehaus et al. Jul 1989 A
4853874 Iwamoto et al. Aug 1989 A
4854301 Nakajima Aug 1989 A
4860215 Seraji Aug 1989 A
4863133 Bonnell Sep 1989 A
4883400 Kuban et al. Nov 1989 A
4930494 Takehana et al. Jun 1990 A
4945479 Rusterholz et al. Jul 1990 A
4949717 Shaw Aug 1990 A
4954952 Ubhayakar et al. Sep 1990 A
4965417 Massie Oct 1990 A
4969709 Sogawa et al. Nov 1990 A
4969890 Sugita et al. Nov 1990 A
4979933 Runge Dec 1990 A
4979949 Matsen, III et al. Dec 1990 A
4980626 Hess et al. Dec 1990 A
4989253 Liang et al. Jan 1991 A
4996975 Nakamura Mar 1991 A
5019968 Wang et al. May 1991 A
5020001 Yamamoto et al. May 1991 A
5065741 Uchiyama et al. Nov 1991 A
5078140 Kwoh Jan 1992 A
5086401 Glassman et al. Feb 1992 A
5091656 Gahn Feb 1992 A
5097829 Quisenberry Mar 1992 A
5097839 Allen Mar 1992 A
5098426 Sklar et al. Mar 1992 A
5105367 Tsuchihashi et al. Apr 1992 A
5109499 Inagami et al. Apr 1992 A
5123095 Papadopulos et al. Jun 1992 A
5131105 Harrawood et al. Jul 1992 A
5142930 Allen et al. Sep 1992 A
5145227 Monford, Jr. Sep 1992 A
5166513 Keenan et al. Nov 1992 A
5175694 Amato Dec 1992 A
5182641 Diner et al. Jan 1993 A
5184601 Putman Feb 1993 A
5187574 Kosemura et al. Feb 1993 A
5196688 Hesse et al. Mar 1993 A
5201325 McEwen et al. Apr 1993 A
5201743 Haber et al. Apr 1993 A
5217003 Wilk Jun 1993 A
5221283 Chang Jun 1993 A
5228429 Hatano Jul 1993 A
5230623 Guthrie et al. Jul 1993 A
5236432 Matsen, III et al. Aug 1993 A
5251127 Raab Oct 1993 A
5257999 Slanetz, Jr. Nov 1993 A
5271384 McEwen et al. Dec 1993 A
5279309 Taylor et al. Jan 1994 A
5282806 Haber Feb 1994 A
5289273 Lang Feb 1994 A
5289365 Caldwell et al. Feb 1994 A
5299288 Glassman et al. Mar 1994 A
5300926 Stoecki Apr 1994 A
5303148 Mattson et al. Apr 1994 A
5304185 Taylor Apr 1994 A
5305203 Raab Apr 1994 A
5305427 Nagata Apr 1994 A
5309717 Minch May 1994 A
5313306 Kuban et al. May 1994 A
5320630 Ahmed Jun 1994 A
5337732 Grundfest et al. Aug 1994 A
5339799 Kami et al. Aug 1994 A
5343385 Joskowicz et al. Aug 1994 A
5343391 Mushabac Aug 1994 A
5345538 Narayannan et al. Sep 1994 A
5357962 Green Oct 1994 A
5368015 Wilk Nov 1994 A
5368428 Hussey et al. Nov 1994 A
5371536 Yamaguchi Dec 1994 A
5382885 Salcudean et al. Jan 1995 A
5388987 Badoz et al. Feb 1995 A
5395369 McBrayer et al. Mar 1995 A
5397323 Taylor et al. Mar 1995 A
5402801 Taylor Apr 1995 A
5403319 Matsen, III et al. Apr 1995 A
5408409 Glassman et al. Apr 1995 A
5410638 Colgate et al. Apr 1995 A
5417210 Funda et al. May 1995 A
5417701 Holmes May 1995 A
5422521 Neer et al. Jun 1995 A
5431645 Smith et al. Jul 1995 A
5434457 Josephs et al. Jul 1995 A
5442728 Kaufman et al. Aug 1995 A
5443484 Kirsch et al. Aug 1995 A
5445166 Taylor Aug 1995 A
5451924 Massimino et al. Sep 1995 A
5455766 Scheller et al. Oct 1995 A
5458547 Teraoka et al. Oct 1995 A
5458574 Machold et al. Oct 1995 A
5476010 Fleming et al. Dec 1995 A
5490117 Oda et al. Feb 1996 A
5490843 Hildwein et al. Feb 1996 A
5506912 Nagasaki et al. Apr 1996 A
5512919 Araki Apr 1996 A
5515478 Wang May 1996 A
5553198 Wang et al. Sep 1996 A
5571110 Matsen, III et al. Nov 1996 A
5572999 Funda et al. Nov 1996 A
5609560 Ichikawa et al. Mar 1997 A
5626595 Sklar et al. May 1997 A
5629594 Jacobus et al. May 1997 A
5630431 Taylor May 1997 A
5631973 Green May 1997 A
5657429 Wang et al. Aug 1997 A
5658250 Blomquist et al. Aug 1997 A
5676673 Ferre et al. Oct 1997 A
5695500 Taylor et al. Dec 1997 A
5696837 Green Dec 1997 A
5735290 Sterman et al. Apr 1998 A
5749362 Funda et al. May 1998 A
5754741 Wang et al. May 1998 A
5776126 Wilk et al. Jul 1998 A
5779623 Bonnell Jul 1998 A
5800423 Jensen Sep 1998 A
5807284 Foxlin Sep 1998 A
5808665 Green Sep 1998 A
5813813 Daum et al. Sep 1998 A
5817084 Jensen Oct 1998 A
5859934 Green Jan 1999 A
5878193 Wang et al. Mar 1999 A
5931832 Jensen Aug 1999 A
5950629 Taylor et al. Sep 1999 A
6024695 Taylor et al. Feb 2000 A
Foreign Referenced Citations (12)
Number Date Country
9204118 Jul 1992 DE
4310842 Jan 1995 DE
0239409 Sep 1987 EP
0424687 May 1991 EP
0776738 Jun 1997 EP
WO 9104711 Apr 1991 WO
WO 9220295 Nov 1992 WO
WO 9313916 Jul 1993 WO
WO 9418881 Sep 1994 WO
WO 9426167 Nov 1994 WO
WO 9715240 May 1997 WO
WO 9825666 Jun 1998 WO
Non-Patent Literature Citations (41)
Entry
“Endocorporeal Surgery Using Remote Manipulators” (Ned. S. Rasor and J.W. Spickler) Remotely Manned Systems—Exploration and Operation in Space, California Institute of Technology 1973.
“A Survey Study of Teleoperators, Robotics, and Remote Systems Technology” (Arthur D. Alexander, III) Remotely Manned Systems—Exploration and Operation in Space, California Institute of Technology 1973.
“Impacts of Telemation on Modern Society” (Arthur D. Alexander, III), On the Theory and Practice of Robots and Manipulators vol. II, 1974.
Transcript of a video presented by SRI at the 3rd World Congress of Endoscopic Surgery in Bordeaux on Jun. 18-20, 1992, in Washington on Apr. 9, 1992, and in San Diego, CA on Jun. 4-7, 1992 entitled “Telepresence Surgery—The Future of Minimally Invasive Medicine”.
Statutory Declaration of Dr. Philip S. Green, presenter of the video entitled “Telepresence Surgery—The Future of Minimally Invasive Medicine”.
Abstract of a presentation “Telepresence: Advanced Teleoperator Technology for Minimally Invasive Surgery” (P. Green et al.) given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992.
Abstract of a presentation “Telepresence: Advanced Teleoperator Technology for Minimally Invasive Surgery” (P. Green et al.) given at “Medicine meets virtual reality”, symposium in San Diego, Jun. 4-7, 1992.
Abstract of a presentation “Camera Control for Laparoscopic Surgery by Speech-Recognizing Robot: Constant Attention and Better Use of Personnel” (Colin Besant et al.) given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992.
“A Literature Review: Robots in Medicine” (B. Preising et al.) IEEE Jun. 1991.
“Robots for the Operating Room” (Elizabeth Corcoran), The New York Times, Sunday Jul. 19, 1992, Section 3, p. 9, col. 1.
“Taming the Bull: Safety in a Precise Surgical Robot” (Russell H. Taylor et al.), IEEE 1991.
Abstract of a presentation “Design Considerations of a New Generation Endoscope Using Robotics and Computer Vision Technology” (S.M. Krishnan et al.) given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992.
Abstract of a presentation “3-D Vision Technology Applied to Advanced Minimally Invasive Surgery Systems” given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992.
“Analysis of the Surgeon's Grasp for Telerobotic Surgical Manipulation” (Frank Tendick and Lawrence Stark), IEEE 1989.
“Kinematic Control and Visual Display of Redundant Teleoperators” (Hardi Das et al.), IEEE 1989.
“A New System for Computer Assisted Neurosurgery” (S. Lavallee), IEEE 1989.
“An Advanced Control Micromanipulator for Surgical Applications” (Ben Gayed et al.), Systems Science vol. 13 1987.
“Force Feedback-Based Telemicromanipulation for Robot Surgery on Soft Tissues” (A.M. Sabatini et al.), IEEE 1989.
“Six-Axis Bilateral Control of an Articulated Slave Manipulator Using a Cartesian Master Manipulator” (Masao Inoue), Advanced Robotics 1990.
“On a Micro-Manipulator for Medical Application—Stability Consideration of its Bilateral Controller” (S. Majima et al.), Mechatronics 1991.
“Anthropomorphic Remote Manipulator”, NASA Tech Briefs 1991.
“Controlling Remote Manipulators through Kinesthetic Coupling” (A.K. Bejczy), Computers in Mechanical Engineering 1983.
“Design of a Surgeon-Machine Interface for Teleoperated Microsurgery” (Steve Charles M.D. et al.), IEEE 1989.
“A Robot in an Operating Room: A Bull in a China Shop” (J.M. Dolan et al.), IEEE 1987.
Abstract of a presentation “Concept and Experimental Application of a Surgical Robotic System the Steerable MIS Instrument SMI” given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992.
Abstract of a presentation given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992, entitled “Session 15/1”.
Abstract of a presentation “A Pneumatic Controlled Sewing Device for Endoscopic Application the MIS Sewing Instrument MSI” given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992.
Abstract of a presentation given at the 3rd World Congress of Endoscopic Surgery in Bordeaux (Jun. 18 to 20, 1992), entitled “Session 15/2”.
Abstract of a presentation given at the 3rd World Congress of Endoscopic Surgery in Bordeaux (Jun. 18 to 20, 1992), entitled Session 15/4.
Abstract of a presentation given at the 3rd World Congress of Endoscopic Surgery in Bordeaux (Jun. 18 to 20, 1992), entitled “Session 15/5”.
“Properties of Master-Slave Robots” (C. Vibet), Motor-con 1987.
“A New Microsurgical Robot System for Corneal Transplantation” (Noriyuki Tejima), Precision Machinery 1988.
“Human/Robot Interaction via the Transfer of Power and Information Signals—Part I: Dynamics and Control Analysis” (H. Kazerooni), IEEE 1989.
“Human/Robot Interaction via the Transfer of Power and Information Signals—Part II: An Experimental Analysis” (H. Kazerooni), IEEE 1989.
“Power and Impedance Scaling in Bilateral Manipulation” (J. Edward Colgate), IEEE 1991.
“S.M.O.S.: Stereotaxical Microtelemanipulator for Ocular Surgery” (Aicha Guerrouad and Pierre Vidal), IEEE 1989.
“Motion Control for a Sheep Shearing Robot” (James P. Trevelyan et al.), Proceedings of the 1st International Symposium on Robotics Research, MIT, Cambridge, Massachusetts, USA, 1983.
“Robots and Telechirs” (M.W. Thring), Wiley 1983.
Industrial Robotics (Gordon M. Mair), Prentice Hall 1988 (pp. 41-43, 49-50, 54, 203-209 enclosed).
“Student Reference Manual for Electronic Instrumentation Laboratories” (Wolf et al.), Prentice Hall, New Jersey 1990, pp. 498 and 499.
Fu, et al, “Robotics: Control, Sensing, Vision and Intelligence”, McGraw-Hill Book Company; 1987.