Head cursor control interface for an automated endoscope system for optimal positioning

Information

  • Patent Grant
  • Patent Number
    6,714,841
  • Date Filed
    Monday, October 26, 1998
  • Date Issued
    Tuesday, March 30, 2004
Abstract
A medical system that is coupled to an endoscope which provides a video image to a monitor. The system includes an electrical circuit to overlay a graphic image onto the video image provided by the endoscope. The endoscope is moved by a robotic arm.
Description




BACKGROUND OF THE INVENTION




1. Field of the Invention




The present invention relates to a graphical user interface that can be remotely controlled by a surgeon to control various devices and conditions of an operating room.




2. Description of Related Art




To reduce the invasiveness of surgery, endoscopes are commonly utilized to view the internal organs of a patient. One end of the endoscope contains a lens which is inserted into the patient through a small incision of the skin. The lens focuses an image that is transmitted by fiber optic cable to a camera located at the opposite end of the endoscope. The camera is coupled to a monitor that displays a video image of the patient.




The endoscope can be used in conjunction with another surgical instrument that is inserted into the patient. An assistant typically holds the endoscope while the surgeon manipulates the surgical instrument. The assistant moves the endoscope in response to instructions from the surgeon. Any mis-communication between the surgeon and the assistant may result in an error in the movement of the endoscope, thereby requiring the surgeon to repeat the instruction. Additionally, holding the endoscope for a significant amount of time may cause the assistant to become fatigued.




U.S. application Ser. No. 07/927,801 discloses a robotic arm that holds and moves an endoscope in response to commands from the surgeon. The commands are provided through a hand controller or a foot pedal. The controller and pedal require coordinated movements which may distract the surgeon from the surgical procedure. It would be desirable to provide an interface that manipulates a robotically controlled surgical device while requiring minimal physical coordination by the surgeon. Additionally, it would be desirable to provide a single interface that allows the surgeon to control a number of devices such as an operating table, laparoscopic camera, laser tool, etc.




SUMMARY OF THE INVENTION




The present invention is an interface that allows a surgeon to remotely control surgical devices and conditions of an operating room. The surgeon views a video image that is displayed by a monitor. The monitor may be coupled to a video device such as a laparoscopic camera that is attached to the end of an endoscope. Static graphic images and a dynamic graphic cursor are overlaid onto the video image. The graphic cursor has a pixel location on the monitor which corresponds to a spatial location of a pointer signal. The pointer signal is transmitted by a transmitter worn on the head of the surgeon. The pointer signal may be a laser which is directed to a screen that is located adjacent to a detection camera. The surgeon may move the graphic cursor relative to the video image by tilting his head and varying the spatial location of the pointer signal. The interface may have a controller which generates output signals in response to the movement of the pointer signal. The output signals may move a robotic arm which controls the position of the endoscope. The controller may also generate command signals when the graphic cursor is moved into a static graphic image. The command may vary a condition of the operating room such as the position of the operating table.











BRIEF DESCRIPTION OF THE DRAWINGS




The objects and advantages of the present invention will become more readily apparent to those ordinarily skilled in the art after reviewing the following detailed description and accompanying drawings, wherein:





FIG. 1 is a perspective view of a robotic system that controls an endoscope;

FIG. 2 is a schematic of an endoscope within two different coordinate systems;

FIG. 3 is a schematic of a head cursor interface electrical circuit;

FIG. 4 is a front view of a monitor which displays a video image and a plurality of graphical overlays;

FIG. 5 is a schematic of an endoscope within various coordinate frames; and

FIG. 6 is a schematic of a robotic arm.











DETAILED DESCRIPTION OF THE INVENTION




Referring to the drawings more particularly by reference numbers, FIG. 1 shows a robotic system 10 that controls a surgical instrument 12. The surgical instrument 12 is typically an endoscope that is inserted into a patient. The tip of the endoscope typically has a lens(es) that focuses an image of the patient. The endoscope 12 may also have fiber optic cable that transmits the image to a camera 16 located at the end of the scope. The camera 16 is typically a charge coupled device (CCD). The camera 16 is coupled to a monitor 18 which displays the image.

The instrument 12 is moved by a robotic arm assembly 20 that is coupled to a computer 22. In the preferred embodiment the robotic assembly 20 has a linear actuator 24 that is mounted to a surgical table 26. The linear actuator 24 moves a linkage arm assembly 28 in a linear manner relative to the table 26. The linear actuator 24 defines an origin of a fixed first coordinate system that has a first x axis, a first y axis and a first z axis.

The linkage arm assembly 28 contains a first linkage arm 30 attached to an end effector 32. The first linkage arm 30 is mounted to a first rotary actuator 34 which can rotate the arm. The first rotary actuator 34 is attached to a second linkage arm 36. The second linkage arm 36 is mounted to a second rotary actuator 38 that can rotate the arms. The rotary actuator 38 is attached to the output shaft of the linear actuator 24.

The end effector 32 is typically coupled to a pair of passive joints (not shown) that allow rotation of the instrument as indicated by the arrows in FIG. 1. The end effector 32 may also have a worm gear (not shown) that rotates the endoscope about the longitudinal axis of the instrument. As shown in FIG. 2, the junction of the instrument 12 and the end effector 32 defines the origin of a second coordinate system which has a second x axis (x′), a second y axis (y′) and a second z axis (z′). The junction of the end effector 32 and the instrument 12 also defines a third coordinate system which has a third x axis (x″), a third y axis (y″) and a third z axis (z″). The z″ axis is always parallel with the longitudinal axis of the instrument 12. The actuators receive input signals from the computer 22 to control the movement of the robotic arm assembly 20.




Referring to FIG. 1, the surgeon wears a transmitter unit 40 that transmits a pointer signal 42 which is received by a receiver unit 44. The transmitter unit 40 is preferably a laser pointer which emits a laser beam 42. The laser pointer may have a blow switch 46 that allows the surgeon to turn the laser on and off by blowing or drawing in air through a tube located adjacent to the surgeon's mouth. The transmitter 40 may be a laser switch sold by Point Source, Inc. of Germantown, Ohio. Although a laser transmitter is shown and described, the transmitter may be an acoustic or electromagnetic device that generates a wave that is detected by an appropriate detector(s). It is to be understood that any system that can detect a physical movement of the surgeon is encompassed by the present invention.

The receiver unit 44 preferably includes a screen 48 that is in the field of view of a camera 50. The laser beam 42 creates an illuminated dot on the screen 48 which is then detected by the camera 50. The camera 50 is preferably a charge coupled device (CCD). When the surgeon moves his head, the pointer signal 42 moves to a new spatial location on the screen 48. The surgeon can therefore control the position of the illuminated dot by tilting his head.




As shown in FIG. 3, the CCD camera 50 is coupled to an image digitizer 52 which digitizes the images provided by the camera 50. The digitizer 52 provides digitally based values that correspond to the light intensity detected by each pixel of the camera 50. The digitizer 52 is coupled to a position detector 54 which detects the spatial location of the pointer signal 42 relative to the screen 48. The detector 54 first compares the intensity values of each pixel with a threshold value. The detector 54 provides an associated value of 1 for each pixel that has an intensity which exceeds the threshold value, and a value of 0 for each pixel which is below the threshold value. The threshold value is selected to correspond to the intensity of an illuminated dot created by the laser beam 42 striking the screen 48. The threshold value is preferably large enough to filter out background light.




After each pixel is assigned a 1 or 0 value, the x and y spatial coordinates of the pointer signal 42 relative to the screen 48 are computed by determining the center of mass of the pixels which have an assigned value of 1 in accordance with the following equations:

$$M_x = \frac{\sum_{i=1,\,j=1}^{n,\,m} x_i \cdot O(i,j)}{\sum_{i=1,\,j=1}^{n,\,m} O(i,j)} \qquad\qquad M_y = \frac{\sum_{i=1,\,j=1}^{n,\,m} y_j \cdot O(i,j)}{\sum_{i=1,\,j=1}^{n,\,m} O(i,j)}$$
where:

Mx=the x coordinate of the center of mass.

My=the y coordinate of the center of mass.

O(i,j)=the assigned value (1 or 0) of the pixel at location (i,j).

xi=the x coordinate of pixel i, for i=1 through n.

yj=the y coordinate of pixel j, for j=1 through m.
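
A minimal sketch of this threshold-and-centroid computation, assuming the digitizer delivers each frame as a two-dimensional intensity array (the function and variable names are illustrative, not from the patent):

```python
import numpy as np

def pointer_centroid(frame, threshold):
    """Return (Mx, My), the center of mass of above-threshold pixels,
    or None when no illuminated dot is present (laser off)."""
    mask = (frame > threshold).astype(float)   # O(i,j): 1 if bright, else 0
    total = mask.sum()
    if total == 0:
        return None                            # no dot detected
    ys, xs = np.indices(frame.shape)           # pixel row/column coordinates
    mx = (xs * mask).sum() / total             # Mx
    my = (ys * mask).sum() / total             # My
    return mx, my
```

Thresholding before the centroid keeps dim background light from biasing the estimate, which is why the patent selects a threshold large enough to filter it out.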




The x and y spatial coordinates generated by the detector 54 are provided to an interface controller 56. The interface controller 56 maps the x and y spatial coordinates generated by the detector to corresponding pixel locations on the monitor 18. The interface controller 56 is coupled to a graphic overlay processor 58 and a robot controller 60. The graphic overlay processor 58 is coupled to the monitor 18. Although separate controllers are shown and described, it is to be understood that the blocks depicted are merely functional and that the operations may be performed by a single microprocessor or different combinations of processors.
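
The mapping from screen coordinates to monitor pixels is not spelled out in the patent; a simple proportional scaling, sketched below with hypothetical monitor dimensions, is one plausible reading:

```python
def to_monitor_pixels(mx, my, screen_w, screen_h, mon_w=640, mon_h=480):
    """Map the dot's (mx, my) location on the detection screen to a
    pixel location on the monitor by proportional scaling (assumed)."""
    return int(mx * mon_w / screen_w), int(my * mon_h / screen_h)
```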




As shown in FIG. 4, the monitor 18 displays a video image 62 provided by the camera 16 of the endoscope 12. The video image 62 is typically an internal organ of a patient. The graphic overlay processor 58 generates a series of static graphic images 64-70 that overlay onto the video image 62 displayed by the monitor 18. The overlay processor 58 also generates a dynamic graphic cursor 72 that can move across the monitor 18. The graphic cursor 72 may move in conjunction with any movement of the laser beam 42 emitted from the pointer 40 mounted to the surgeon's head.




To move the cursor 72, the surgeon may move his head and vary the spatial location of the pointer signal 42 on the screen 48. The new pointer location is detected by the CCD camera 50. The position detector 54 computes the x and y spatial coordinates which are then provided to the interface controller 56. The interface controller 56 maps the new x and y spatial coordinates to pixel locations on the video image 62. The controller 56 then provides the new pixel locations to the graphic overlay processor 58 which displays the cursor 72.




The interface controller 56 may also generate output signals to move the robotic arm assembly 20 in conjunction with the position of the cursor 72. For example, the interface controller 56 may generate output signals to move the robotic arm 20 and endoscope 12 and to move the video image in the direction of the cursor. In this manner, the surgeon can view a new location within the patient by merely moving his head. Although a cursor 72 is shown and described, it is to be understood that the surgeon may move the robotic arm 20 and the video image 62 without a cursor 72 by merely tilting his head and watching the displayed image on the monitor 18.




The static graphic images 64-70 may provide input commands to control various devices such as the robotic arm assembly 20. For example, the graphic images 64 and 66 provide ZOOM IN and ZOOM OUT commands for the video image. When the surgeon moves the cursor 72 into the area of the IN graphic image 64, the interface controller 56 generates output signals to move the robotic arm 20 so that the end of the endoscope moves closer to the object displayed by the monitor 18. Likewise, when the cursor 72 is moved into the OUT graphic 66, the controller 56 generates output signals to move the robotic arm 20 so that the endoscope moves away from the object shown on the monitor 18.




To determine the interaction between the cursor 72 and the graphic images 64-70, the interface controller 56 compares the pixel locations that correspond to the x and y coordinates provided by the detector 54 with a group of pixel locations associated with each graphic image. If the x and y pixel locations associated with the pointer signal equal a pixel location of a graphic image, the controller 56 generates a command associated with the graphic image. The graphic images 64-70 may be removed from the video image by drawing in air through the tube 46 and turning off the laser pointer 40.
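
This comparison is a point-in-region hit test. A sketch, assuming each graphic image occupies a rectangular block of pixels (the button names and coordinates below are hypothetical):

```python
# Hypothetical pixel bounds (x0, y0, x1, y1) for the static graphic images.
GRAPHIC_IMAGES = {
    "ZOOM_IN":  (20,  20, 80,  50),   # graphic image 64
    "ZOOM_OUT": (20,  60, 80,  90),   # graphic image 66
    "POINTER":  (20, 100, 80, 130),   # graphic image 68
    "PAN":      (20, 140, 80, 170),   # graphic image 70
}

def command_at(px, py):
    """Return the command whose graphic image contains the cursor pixel, if any."""
    for name, (x0, y0, x1, y1) in GRAPHIC_IMAGES.items():
        if x0 <= px <= x1 and y0 <= py <= y1:
            return name
    return None
```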




The graphical image 68 may generate a command to create a “pointer” out of the cursor 72 so that any subsequent movement of the cursor 72 will not generate a corresponding movement of the robotic arm 20. The surgeon may use the pointer as an instructional aid for other personnel viewing the monitor 18.




The robotic arm 20 can be manipulated by initially placing the cursor 72 in the PAN graphic 70 and then moving the cursor 72 about the monitor 18. The interface controller 56 generates new pixel locations associated with the cursor movement which are then provided to the robot controller 60 to move the robotic arm so that the video image moves in conjunction with the movement of the cursor and the spatial location of the laser beam on the screen.




The process of moving the endoscope is performed by initially subtracting the new pixel position from an arbitrary reference pixel position to determine a Δx and a Δy pixel movement of the cursor 72 within the video image 62. The computed movement (Δx and Δy) is multiplied by a weighted pseudoinverse of the following Jacobian matrix with reference to the coordinate system shown in FIG. 5:

$$J = \begin{pmatrix} -\dfrac{xy}{f}\sin\phi + y\cos\theta & -\dfrac{f\rho}{Z_c} - \left(f + \dfrac{x^2}{f}\right) & \dfrac{x}{Z_c} \\[2ex] -x\cos\theta - \left(f + \dfrac{y^2}{f}\right)\sin\phi & -\dfrac{f\rho\sin\phi}{Z_c} - \dfrac{xy}{f} & \dfrac{y}{Z_c} \end{pmatrix}$$
where:

θ, φ and ρ=the spherical coordinates of the endoscope within a scope frame coordinate system that has an origin at the pivot point of the instrument and the patient; the angles are measured by robotic position sensors (not shown).

x, y=the new pixel coordinates of the reference point.

Zc=a constant.

f=the focal length of the endoscope lens.

The product (Vθ, Vφ and Vρ) of the reference point movement (Δx and Δy) and the weighted pseudoinverse of the Jacobian matrix is the computed movement of the endoscope by the robotic arm assembly in a spherical coordinate frame. The spherical coordinates (Vθ, Vφ and Vρ) are converted into Cartesian coordinates (Vx, Vy and Vz) by a transformation. The movement of the endoscope within the scope frame is converted to the fixed first coordinate system by an additional transformation matrix or matrices.
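
A sketch of this step, assuming the Jacobian entries reconstructed above and using an unweighted pseudoinverse in place of the patent's weighted one (the function and parameter names are illustrative):

```python
import numpy as np

def scope_velocity(dx, dy, x, y, f, rho, zc, theta, phi):
    """Map cursor movement (dx, dy) to scope movement (V_theta, V_phi, V_rho)
    by multiplying with the pseudoinverse of the 2x3 image Jacobian."""
    J = np.array([
        [-(x * y / f) * np.sin(phi) + y * np.cos(theta),
         -f * rho / zc - (f + x**2 / f),
         x / zc],
        [-x * np.cos(theta) - (f + y**2 / f) * np.sin(phi),
         -(f * rho / zc) * np.sin(phi) - x * y / f,
         y / zc],
    ])
    return np.linalg.pinv(J) @ np.array([dx, dy])  # (V_theta, V_phi, V_rho)
```

The pseudoinverse picks the minimum-norm scope motion that reproduces the requested image motion, a natural choice when two image coordinates must drive three spherical coordinates.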




Referring to FIG. 6, the controller 60 typically computes the movement of the robotic arm assembly 20 in accordance with the following equations:






$$a3 = \pi - \cos^{-1}\left(\frac{x^2 + y^2 - L1^2 - L2^2}{-2 \cdot L1 \cdot L2}\right)$$

$$\Delta = \cos^{-1}\left(\frac{x^2 + y^2 + L1^2 - L2^2}{2 \cdot L1\sqrt{x^2 + y^2}}\right)$$

$$a0 = \tan^{-1}\left(\frac{y}{x}\right)$$


where:

a2=the angle between the second linkage arm 36 and the x axis.

a3=the angle between the first linkage arm 30 and the longitudinal axis of the second linkage arm 36.

L1=the length of the second linkage arm.

L2=the length of the first linkage arm.

x=the x coordinate of the end effector in the first coordinate system.

y=the y coordinate of the end effector in the first coordinate system.




To move the end effector to a new location in the x-y plane, the computer computes a change in the angles a2 and a3, and then provides output signals to move the actuators accordingly. The original angular position of the end effector is provided to the computer by the position sensors. The computer moves the linkage arms an angle that corresponds to the difference between the new location and the original location of the end effector. A differential angle Δa2 corresponds to the amount of angular displacement provided by the third actuator 38 and a differential angle Δa3 corresponds to the amount of angular displacement provided by the second actuator 34.
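
A sketch of this two-link inverse-kinematics step under the definitions above; forming the shoulder angle as a2 = a0 + Δ is an assumption here, since the patent lists a0 and Δ without showing how they combine:

```python
import math

def arm_angles(x, y, L1, L2):
    """Planar inverse kinematics for the linkage arm assembly.
    Returns (a2, a3) for an end effector target (x, y); assumes the
    target is reachable, i.e. |L1 - L2| <= sqrt(x^2 + y^2) <= L1 + L2."""
    r2 = x * x + y * y
    a3 = math.pi - math.acos((r2 - L1 * L1 - L2 * L2) / (-2.0 * L1 * L2))
    delta = math.acos((r2 + L1 * L1 - L2 * L2) / (2.0 * L1 * math.sqrt(r2)))
    a0 = math.atan2(y, x)          # angle of the reach line from the x axis
    a2 = a0 + delta                # assumed composition of the two angles
    return a2, a3
```

The controller would then command Δa2 and Δa3 as the differences between these angles and the values reported by the position sensors.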




To improve the effectiveness of the system 10, the system is constructed so that the desired movement of the surgical instrument correlates to a direction relative to the image displayed by the monitor. Thus when the robotic arm moves the endoscope 12 up, the scope always appears to move in the up direction relative to the image displayed by the monitor. To accomplish this result, the computer converts the desired movement of the end of the endoscope in the third coordinate system to coordinates in the second coordinate system, and then converts the coordinates of the second coordinate system into the coordinates of the first coordinate system.




Referring to FIG. 2, the desired movement of the endoscope is converted from the third coordinate system to the second coordinate system by using the following transformation matrix:

$$\begin{pmatrix} \Delta x' \\ \Delta y' \\ \Delta z' \end{pmatrix} = \begin{pmatrix} \cos(a6) & 0 & -\sin(a6) \\ -\sin(a5)\sin(a6) & \cos(a5) & -\sin(a5)\cos(a6) \\ \cos(a5)\sin(a6) & \sin(a5) & \cos(a5)\cos(a6) \end{pmatrix} \begin{pmatrix} \Delta x'' \\ \Delta y'' \\ \Delta z'' \end{pmatrix}$$
where;




Δx″=the desired incremental movement of the scope along the x″ axis of the third coordinate system.




Δy″=the desired incremental movement of the scope along the y″ axis of the third coordinate system.




Δz″=the desired incremental movement of the scope along the z″ axis of the third coordinate system.




a5=the angle between the z′ axis and the scope in the y′-z′ plane.

a6=the angle between the z′ axis and the scope in the x′-z′ plane.




Δx′=the computed incremental movement of the scope along the x′ axis of the second coordinate system.




Δy′=the computed incremental movement of the scope along the y′ axis of the second coordinate system.




Δz′=the computed incremental movement of the scope along the z′ axis of the second coordinate system.




The angles a5 and a6 are provided by position sensors coupled to the end effector 32.




The desired movement of the endoscope is converted from the second coordinate system to the first coordinate system by using the following transformation matrix:

$$\begin{pmatrix} \Delta x \\ \Delta y \\ \Delta z \end{pmatrix} = \begin{pmatrix} \cos(\pi) & -\sin(\pi) & 0 \\ \sin(\pi) & \cos(\pi) & 0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} \Delta x' \\ \Delta y' \\ \Delta z' \end{pmatrix}$$
where;




Δx′=the computed incremental movement of the scope along the x′ axis of the second coordinate system.




Δy′=the computed incremental movement of the scope along the y′ axis of the second coordinate system.




Δz′=the computed incremental movement of the scope along the z′ axis of the second coordinate system.




π=the angle between the first linkage arm and the x axis of the first coordinate system.




Δx=the computed incremental movement of the scope along the x axis of the first coordinate system.




Δy=the computed incremental movement of the scope along the y axis of the first coordinate system.




Δz=the computed incremental movement of the scope along the z axis of the first coordinate system.




The incremental movements Δx and Δy are inserted into the algorithms described above for computing the angular movements (Δa2 and Δa3) of the robotic arm assembly to determine the amount of rotation that is to be provided by each actuator. The value Δz is used to determine the amount of linear movement provided by the linear actuator 24.
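
The two conversions chain as a pair of matrix products. A sketch, with rotation(a5, a6) standing in for the first transformation matrix above (the function and parameter names are illustrative):

```python
import numpy as np

def rotation(a5, a6):
    """Transformation matrix from the third (x", y", z") coordinate system
    to the second (x', y', z') coordinate system, as given above."""
    return np.array([
        [np.cos(a6), 0.0, -np.sin(a6)],
        [-np.sin(a5) * np.sin(a6), np.cos(a5), -np.sin(a5) * np.cos(a6)],
        [np.cos(a5) * np.sin(a6), np.sin(a5), np.cos(a5) * np.cos(a6)],
    ])

def third_to_first(d3, a5, a6, pi_angle):
    """Chain both transforms: scope frame -> end effector frame -> fixed frame.
    d3 is the desired movement (dx", dy", dz"); pi_angle is the patent's
    angle between the first linkage arm and the fixed x axis."""
    d2 = rotation(a5, a6) @ np.asarray(d3)           # (dx', dy', dz')
    Rz = np.array([
        [np.cos(pi_angle), -np.sin(pi_angle), 0.0],
        [np.sin(pi_angle),  np.cos(pi_angle), 0.0],
        [0.0, 0.0, 1.0],
    ])
    return Rz @ d2                                   # (dx, dy, dz)
```

The resulting Δx and Δy feed the angular equations and Δz drives the linear actuator 24, as the paragraph above describes.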




The endoscope 12 is typically coupled to the camera 16 such that any spinning of the instrument about its own longitudinal axis will result in a corresponding rotation of the video image 62 on the monitor 18. Rotation of the instrument and video image may disorient the viewer. It is therefore desirable to maintain the orientation of the video image. In the preferred embodiment, the end effector has a worm gear which rotates the surgical instrument about the longitudinal axis of the instrument. To ensure proper orientation of the endoscope, the worm gear rotates the instrument about its longitudinal axis an amount Δθ6 so that the y″ axis is oriented in the most vertical direction within the fixed coordinate system. Δθ6 is computed from the following cross-products:






Δθ6=zi″×(yo″×yi″)






where:

Δθ6=the angle that the instrument is to be rotated about the z″ axis.

yo″=the vector orientation of the y″ axis when the instrument is in the first position.

yi″=the vector orientation of the y″ axis when the instrument is in the second position.

zi″=the vector orientation of the z″ axis when the instrument is in the second position.




The vectors of the yi″ and zi″ axes are computed with the following algorithms:

$$[zi''] = \begin{pmatrix} \cos(a6) & 0 & -\sin(a6) \\ -\sin(a5)\sin(a6) & \cos(a5) & -\sin(a5)\cos(a6) \\ \cos(a5)\sin(a6) & \sin(a5) & \cos(a5)\cos(a6) \end{pmatrix} \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}$$

xi″=z×zi″

yi″=zi″×xi″


where:

a6=the angle between the instrument and the z axis in the y-z plane.

a5=the angle between the instrument and the z axis in the x-z plane.

z=the unit vector of the z axis in the first coordinate system.




The angles a5 and a6 are provided by the joint position sensors of the end effector. The vector yo″ is computed using the angles a5 and a6 of the instrument in the original or first position. For the computation of yi″, the angles a5 and a6 of the second position are used in the transformation matrix. After each arm movement yo″ is set to yi″ and a new yi″ vector and corresponding Δθ6 angle are computed and used to re-orient the endoscope. Using the above described algorithms, the worm gear continuously rotates the instrument about its longitudinal axis to ensure that the pivotal movement of the endoscope does not cause a corresponding rotation of the viewing image.
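
A sketch of the re-orientation loop, reading the patent's outer cross-product as a projection onto zi″ that yields the signed roll angle (that reading, and the normalization of xi″, are assumptions; the patent's vector notation is terse):

```python
import numpy as np

Z = np.array([0.0, 0.0, 1.0])  # unit z vector of the first coordinate system

def scope_axes(a5, a6):
    """zi", xi" and yi" unit vectors for joint angles a5, a6, using the
    transformation matrix given above."""
    zi = np.array([-np.sin(a6),
                   -np.sin(a5) * np.cos(a6),
                    np.cos(a5) * np.cos(a6)])   # R(a5, a6) @ [0, 0, 1]
    xi = np.cross(Z, zi)                        # xi" = z x zi"
    xi = xi / np.linalg.norm(xi)                # assumes scope not exactly vertical
    yi = np.cross(zi, xi)                       # yi" = zi" x xi"
    return xi, yi, zi

def roll_increment(yo, a5, a6):
    """dtheta6 commanded to the worm gear; also returns yi" so the caller
    can set yo" = yi" after each arm movement, as the text describes."""
    _, yi, zi = scope_axes(a5, a6)
    return float(zi @ np.cross(yo, yi)), yi
```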




While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that this invention not be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art.




For example, although graphic images which provide commands to control a robotic arm are shown and described, it is to be understood that the graphics may generate commands that control other devices. The graphic overlay processor 58 may provide an entire menu that allows the surgeon to adjust the operating table or the lighting of the camera 16. Additionally, surgical instruments such as laser cutters or electrode coagulators may be controlled by the surgeon through the head activated graphical interface provided by the present invention. The present invention generally provides a remotely controlled graphically based interface that allows the surgeon to control various devices and conditions at a surgical site.



Claims
  • 1. A medical system adapted to be coupled to an endoscope that is coupled to a monitor which displays a video image provided by the endoscope, the medical system comprising: a robotic arm configured to be coupled to the endoscope; and an electrical circuit that is coupled to said robotic arm and being configured to overlay a graphic image onto the video image; wherein the overlaid graphic image is adjustable to manipulate the video image.
  • 2. The system of claim 1, wherein said electrical circuit overlays a dynamic graphic cursor onto the video image.
  • 3. The system of claim 2, wherein the dynamic graphic cursor can be moved into the graphic image to select a function.
  • 4. The system of claim 3, wherein the function is a movement of said robotic arm.
  • 5. The system of claim 1 further comprising a controller configured to adjust the endoscope to manipulate the video image in response to adjustments to the overlaid graphic image.
  • 6. The system of claim 1 wherein the graphic image comprises at least one of “in” for video zoom in, “out” for video zoom out, and “pan” for video pan movement.
  • 7. A medical system adapted to be coupled to an endoscope that is coupled to a monitor which displays a video image provided by the endoscope, the medical system comprising: movement means for moving the endoscope; and overlay means for overlaying a graphic image onto the video image; wherein the overlaid graphic image is adjustable to manipulate the video image.
  • 8. The system of claim 7, wherein said overlay means overlays a dynamic graphic cursor onto the video image.
  • 9. The system of claim 8, wherein the dynamic graphic cursor can be moved into the graphic image to select a function.
  • 10. The system of claim 9, wherein the function is a movement of said movement means.
  • 11. The system of claim 7 further comprising control means for adjusting the endoscope to manipulate the video image in response to adjustments to the overlaid graphic image.
  • 12. A method for operating a medical system, comprising: moving an endoscope within a patient; displaying a video image provided by the endoscope on a monitor coupled to the endoscope; and overlaying a graphic image onto the video image; wherein the overlaid graphic image is adjustable to manipulate the video image.
  • 13. The method of claim 12, further comprising selecting a function by moving a dynamic graphic cursor into the graphic image.
  • 14. The method of claim 13, further comprising adjusting the endoscope in response to the selection of the function.
  • 15. The method of claim 13 wherein the graphic image comprises at least one of an “in” function for video zoom in, an “out” function for video zoom out, and a “pan” function for video pan movement.
  • 16. The method of claim 12 further comprising adjusting the endoscope to manipulate the video image in response to adjustments to the overlaid graphic image.
  • 17. A medical system adapted to be coupled to an endoscope that is coupled to a monitor which displays a video image provided by the endoscope, the medical system comprising: a robotic arm configured to be coupled to the endoscope; an electrical circuit that is coupled to said robotic arm and being configured to overlay a dynamic graphic cursor and graphic image onto the video image; and a cursor input device coupled to said electrical circuit; wherein the overlaid graphic image is adjustable by the dynamic graphic cursor to manipulate the video image.
  • 18. The system of claim 17, wherein the dynamic graphic cursor can be moved into the graphic image to select a function.
  • 19. The system of claim 18, wherein the function is a movement of said robotic arm.
  • 20. The system of claim 17 further comprising a controller configured to adjust the endoscope to manipulate the video image in response to adjustments to the overlaid graphic image.
  • 21. The system of claim 17 wherein the graphic image comprises at least one of “in” for video zoom in, “out” for video zoom out, and “pan” for video pan movement; and wherein the “in,” “out,” and “pan” are selectable by the dynamic graphic cursor.
  • 22. A medical system adapted to be coupled to an endoscope that is coupled to a monitor which displays a video image provided by the endoscope, the medical system comprising: movement means for moving the endoscope; overlay means for overlaying a dynamic graphic cursor and a graphic image onto the video image; and input means for moving the dynamic graphic cursor; wherein the overlaid graphic image is adjustable by the dynamic graphic cursor to manipulate the video image.
  • 23. The system of claim 22, wherein the dynamic graphic cursor can be moved into the graphic image to select a function.
  • 24. The system of claim 23, wherein the function is a movement of said movement means.
  • 25. The system of claim 22 further comprising control means for adjusting the endoscope to manipulate the video image in response to adjustments to the overlaid graphic image.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation application of application Ser. No. 08/904,047, filed Jul. 31, 1997, U.S. Pat. No. 5,911,036, which is a continuation application of application Ser. No. 08/529,095, filed Sep. 15, 1995, now U.S. Pat. No. 5,825,982.

US Referenced Citations (177)
Number Name Date Kind
977825 Murphy Dec 1910 A
3171549 Orloff Mar 1965 A
3280991 Melton et al. Oct 1966 A
4058001 Waxman Nov 1977 A
4128880 Cray, Jr. Dec 1978 A
4221997 Flemming Sep 1980 A
4367998 Causer Jan 1983 A
4401852 Noso et al. Aug 1983 A
4456961 Price et al. Jun 1984 A
4460302 Moreau et al. Jul 1984 A
4474174 Petruzzi Oct 1984 A
4491135 Klein Jan 1985 A
4503854 Jako Mar 1985 A
4517963 Michel May 1985 A
4523884 Clement et al. Jun 1985 A
4586398 Yindra May 1986 A
4604016 Joyce Aug 1986 A
4616637 Caspari et al. Oct 1986 A
4624011 Watanabe et al. Nov 1986 A
4633389 Tanaka et al. Dec 1986 A
4635292 Mori et al. Jan 1987 A
4641292 Tunnell et al. Feb 1987 A
4655257 Iwashita Apr 1987 A
4672963 Barken Jun 1987 A
4676243 Clayman Jun 1987 A
4728974 Nio et al. Mar 1988 A
4762455 Coughlan et al. Aug 1988 A
4791934 Brunnett Dec 1988 A
4791940 Hirschfeld et al. Dec 1988 A
4794912 Lia Jan 1989 A
4815006 Andersson et al. Mar 1989 A
4815450 Patel Mar 1989 A
4837734 Ichikawa et al. Jun 1989 A
4852083 Niehaus et al. Jul 1989 A
4853874 Iwamoto et al. Aug 1989 A
4854301 Nakajima Aug 1989 A
4860215 Seraji Aug 1989 A
4863133 Bonnell Sep 1989 A
4883400 Kuban et al. Nov 1989 A
4930494 Takehana et al. Jun 1990 A
4945479 Rusterholz et al. Jul 1990 A
4949717 Shaw Aug 1990 A
4954952 Ubhayakar et al. Sep 1990 A
4965417 Massie Oct 1990 A
4969709 Sogawa et al. Nov 1990 A
4969890 Sugita et al. Nov 1990 A
4979933 Runge Dec 1990 A
4979949 Matsen, III et al. Dec 1990 A
4980626 Hess et al. Dec 1990 A
4989253 Liang et al. Jan 1991 A
4996975 Nakamura Mar 1991 A
5019968 Wang et al. May 1991 A
5020001 Yamamoto et al. May 1991 A
5065741 Uchiyama et al. Nov 1991 A
5078140 Kwoh Jan 1992 A
5086401 Glassman et al. Feb 1992 A
5091656 Gahn Feb 1992 A
5097829 Quisenberry Mar 1992 A
5097839 Allen Mar 1992 A
5098426 Sklar et al. Mar 1992 A
5105367 Tsuchihashi et al. Apr 1992 A
5109499 Inagami et al. Apr 1992 A
5123095 Papadopoulos et al. Jun 1992 A
5131105 Harrawood et al. Jul 1992 A
5142930 Allen et al. Sep 1992 A
5145227 Monford, Jr. Sep 1992 A
5166513 Keenan et al. Nov 1992 A
5175694 Amato Dec 1992 A
5182641 Diner et al. Jan 1993 A
5184601 Putman Feb 1993 A
5187574 Kosemura et al. Feb 1993 A
5196688 Hesse et al. Mar 1993 A
5201325 McEwen et al. Apr 1993 A
5201743 Haber et al. Apr 1993 A
5217003 Wilk Jun 1993 A
5221283 Chang Jun 1993 A
5228429 Hatano Jul 1993 A
5230623 Guthrie et al. Jul 1993 A
5236432 Matsen, III et al. Aug 1993 A
5251127 Raab Oct 1993 A
5257999 Slanetz, Jr. Nov 1993 A
5271384 McEwen et al. Dec 1993 A
5279309 Taylor et al. Jan 1994 A
5282806 Haber Feb 1994 A
5289273 Lang Feb 1994 A
5289365 Caldwell et al. Feb 1994 A
5299288 Glassman et al. Mar 1994 A
5300926 Stoeckl Apr 1994 A
5303148 Mattson et al. Apr 1994 A
5304185 Taylor Apr 1994 A
5305203 Raab Apr 1994 A
5305427 Nagata Apr 1994 A
5309717 Minch May 1994 A
5313306 Kuban et al. May 1994 A
5320630 Ahmed Jun 1994 A
5337732 Grundfest et al. Aug 1994 A
5339799 Kami et al. Aug 1994 A
5343385 Joskowicz et al. Aug 1994 A
5343391 Mushabac Aug 1994 A
5345538 Narayannan et al. Sep 1994 A
5357962 Green Oct 1994 A
5368015 Wilk Nov 1994 A
5368428 Hussey et al. Nov 1994 A
5371536 Yamaguchi Dec 1994 A
5382885 Salcudean et al. Jan 1995 A
5388987 Badoz et al. Feb 1995 A
5395369 McBrayer et al. Mar 1995 A
5397323 Taylor et al. Mar 1995 A
5402801 Taylor Apr 1995 A
5403319 Matsen, III et al. Apr 1995 A
5408409 Glassman et al. Apr 1995 A
5410638 Colgate et al. Apr 1995 A
5417210 Funda et al. May 1995 A
5417701 Holmes May 1995 A
5422521 Neer et al. Jun 1995 A
5431645 Smith et al. Jul 1995 A
5434457 Josephs et al. Jul 1995 A
5442728 Kaufman et al. Aug 1995 A
5443484 Kirsch et al. Aug 1995 A
5445166 Taylor Aug 1995 A
5451924 Massimino et al. Sep 1995 A
5455766 Scheller et al. Oct 1995 A
5458547 Teraoka et al. Oct 1995 A
5458574 Machold et al. Oct 1995 A
5476010 Fleming et al. Dec 1995 A
5490117 Oda et al. Feb 1996 A
5490843 Hildwein et al. Feb 1996 A
5506912 Nagasaki et al. Apr 1996 A
5512919 Araki Apr 1996 A
5515478 Wang May 1996 A
5524180 Wang et al. Jun 1996 A
5544654 Murphy et al. Aug 1996 A
5553198 Wang et al. Sep 1996 A
5562503 Ellman et al. Oct 1996 A
5571110 Matsen, III et al. Nov 1996 A
5572999 Funda et al. Nov 1996 A
5609560 Ichikawa et al. Mar 1997 A
5626595 Sklar et al. May 1997 A
5629594 Jacobus et al. May 1997 A
5630431 Taylor May 1997 A
5631973 Green May 1997 A
5636259 Khutoryansky et al. Jun 1997 A
5649956 Jensen et al. Jul 1997 A
5657429 Wang et al. Aug 1997 A
5658250 Blomquist et al. Aug 1997 A
5676673 Ferre et al. Oct 1997 A
5695500 Taylor et al. Dec 1997 A
5696837 Green Dec 1997 A
5718038 Takiar et al. Feb 1998 A
5735290 Sterman et al. Apr 1998 A
5749362 Funda et al. May 1998 A
5754741 Wang et al. May 1998 A
5776126 Wilk et al. Jul 1998 A
5779623 Bonnell Jul 1998 A
5800423 Jensen Sep 1998 A
5807284 Foxlin Sep 1998 A
5807378 Jensen et al. Sep 1998 A
5808665 Green Sep 1998 A
5810880 Jensen et al. Sep 1998 A
5813813 Daum et al. Sep 1998 A
5814038 Jensen et al. Sep 1998 A
5815640 Wang et al. Sep 1998 A
5817084 Jensen Oct 1998 A
5825982 Wright et al. Oct 1998 A
5841950 Wang et al. Nov 1998 A
5859934 Green Jan 1999 A
5876325 Mizuno et al. Mar 1999 A
5878193 Wang et al. Mar 1999 A
5882206 Gillio Mar 1999 A
5887121 Funda et al. Mar 1999 A
5907664 Wang et al. May 1999 A
5920395 Schulz Jul 1999 A
5931832 Jensen Aug 1999 A
5950629 Taylor et al. Sep 1999 A
6024695 Taylor et al. Feb 2000 A
6201984 Funda et al. Mar 2001 B1
6463361 Wang et al. Oct 2002 B1
Foreign Referenced Citations (12)
Number Date Country
U 9204118.3 Jul 1992 DE
4310842 Jan 1995 DE
0239409 Sep 1987 EP
0424687 May 1991 EP
0776738 Jun 1997 EP
WO 9104711 Apr 1991 WO
WO 9220295 Nov 1992 WO
WO 9313916 Jul 1993 WO
WO 9418881 Sep 1994 WO
WO 9426167 Nov 1994 WO
WO 9715240 May 1997 WO
WO 9825666 Jun 1998 WO
Non-Patent Literature Citations (41)
Entry
“Endocorporeal Surgery Using Remote Manipulators” (Ned S. Rasor and J.W. Spickler) Remotely Manned Systems—Exploration and Operation in Space, California Institute of Technology 1973.
“A Survey Study of Teleoperators, Robotics, and Remote Systems Technology” (Arthur D. Alexander, III) Remotely Manned Systems—Exploration and Operation in Space, California Institute of Technology 1973.
“Impacts of Telemation on Modern Society” (Arthur D. Alexander, III), On the Theory and Practice of Robots and Manipulators vol. II, 1974.
Transcript of a video presented by SRI at the 3rd World Congress of Endoscopic Surgery in Bordeaux on Jun. 18-20, 1992, in Washington on Apr. 9, 1992, and in San Diego, CA on Jun. 4-7, 1992 entitled “Telepresence Surgery—The Future of Minimally Invasive Medicine”.
Statutory Declaration of Dr. Philip S. Green, presenter of the video entitled “Telepresence Surgery—The Future of Minimally Invasive Medicine”.
Abstract of a presentation “Telepresence: Advanced Teleoperator Technology for Minimally Invasive Surgery” (P. Green et al.) given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992.
Abstract of a presentation “Telepresence: Advanced Teleoperator Technology for Minimally Invasive Surgery”, (P. Green et al.) given at “Medicine meets virtual reality” symposium in San Diego, Jun. 4-7, 1992.
Abstract of a presentation “Camera Control for Laparoscopic Surgery by Speech-Recognizing Robot: Constant Attention and Better Use of Personnel” (Colin Besant et al.) given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992.
“A Literature Review: Robots in Medicine” (B. Preising et al.) IEEE Jun. 1991.
“Robots for the Operating Room” (Elizabeth Corcoran), The New York Times, Sunday Jul. 19, 1992, Section 3, p. 9, col. 1.
“Taming the Bull: Safety in a Precise Surgical Robot” (Russell H. Taylor et al.), IEEE 1991.
Abstract of a presentation “Design Considerations of a New Generation Endoscope Using Robotics and Computer Vision Technology” (S.M. Krishnan et al.) given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992.
Abstract of a presentation “3-D Vision Technology Applied to Advanced Minimally Invasive Surgery Systems” given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992.
“Analysis of the Surgeon's Grasp for Telerobotic Surgical Manipulation” (Frank Tendick and Lawrence Stark), IEEE 1989.
“Kinematic Control and Visual Display of Redundant Teleoperators” (Hardi Das et al.), IEEE 1989.
“A New System for Computer Assisted Neurosurgery” (S. Lavallee), IEEE 1989.
“An Advanced Control Micromanipulator for Surgical Applications” (Ben Gayed et al.), Systems Science vol. 13 1987.
“Force Feedback-Based Telemicromanipulation for Robot Surgery on Soft Tissues” (A.M. Sabatini et al.), IEEE 1989.
“Six-Axis Bilateral Control of an Articulated Slave Manipulator Using a Cartesian Master Manipulator” (Masao Inoue), Advanced Robotics 1990.
“On a Micro-Manipulator for Medical Application—Stability Consideration of its Bilateral Controller” (S. Majima et al.), Mechatronics 1991.
“Anthropomorphic Remote Manipulator”, NASA Tech Briefs 1991.
“Controlling Remote Manipulators through Kinesthetic Coupling” (A.K. Bejczy), Computers in Mechanical Engineering 1983.
“Design of a Surgeon-Machine Interface for Teleoperated Microsurgery” (Steve Charles M.D. et al.), IEEE 1989.
“A Robot in an Operating Room: A Bull in a China Shop” (J.M. Dolan et al.), IEEE 1987.
Abstract of a presentation “Concept and Experimental Application of a Surgical Robotic System the Steerable MIS Instrument SMI” given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992.
Abstract of a presentation given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992, entitled “Session 15/1”.
Abstract of a presentation “A Pneumatic Controlled Sewing Device for Endoscopic Application the MIS Sewing Instrument MSI” given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992.
Abstract of a presentation given at the 3rd World Congress of Endoscopic Surgery in Bordeaux (Jun. 18 to 20, 1992), entitled “Session 15/2”.
Abstract of a presentation given at the 3rd World Congress of Endoscopic Surgery in Bordeaux (Jun. 18 to 20, 1992), entitled Session 15/4.
Abstract of a presentation given at the 3rd World Congress of Endoscopic Surgery in Bordeaux (Jun. 18 to 20, 1992), entitled “Session 15/5”.
“Properties of Master-Slave Robots” (C. Vibet), Motor-con 1987.
“A New Microsurgical Robot System for Corneal Transplantation” (Noriyuki Tejima), Precision Machinery 1988.
“Human/Robot Interaction via the Transfer of Power and Information Signals—Part I: Dynamics and Control Analysis” (H. Kazerooni), IEEE 1989.
“Human/Robot Interaction via the Transfer of Power and Information Signals—Part II: An Experimental Analysis” (H. Kazerooni), IEEE 1989.
“Power and Impedance Scaling in Bilateral Manipulation” (J. Edward Colgate), IEEE 1991.
“S.M.O.S.: Stereotaxical Microtelemanipulator for Ocular Surgery” (Aicha Guerrouad and Pierre Vidal), IEEE 1989.
“Motion Control for a Sheep Shearing Robot” (James P. Trevelyan et al.), Proceedings of the 1st International Symposium on Robotics Research, MIT, Cambridge, Massachusetts, USA, 1983.
“Robots and Telechirs” (M.W. Thring), Wiley 1983.
Industrial Robotics (Gordon M. Mair), Prentice Hall 1988 (pp. 41-43, 49-50, 54, 203-209 enclosed).
“Student Reference Manual for Electronic Instrumentation Laboratories” (Wolf et al.), Prentice Hall, New Jersey 1990, pp. 498 and 499.
“Surgery in Cyberspace” (Taubes), Discover Magazine, Dec. 1994.
Continuations (2)
Number Date Country
Parent 08/904047 Jul 1997 US
Child 09/179039 US
Parent 08/529095 Sep 1995 US
Child 08/904047 US