Modularity system for computer assisted surgery

Information

  • Patent Grant
  • Patent Number
    6,728,599
  • Date Filed
    Friday, September 7, 2001
  • Date Issued
    Tuesday, April 27, 2004
Abstract
A medical system that allows a medical device to be controlled by one of two input devices. The input devices may be consoles that contain handles and a screen. The medical devices may include robotic arms and instruments used to perform a medical procedure. The system may include an arbitrator that determines which console has priority to control one or more of the robotic arms/instruments.
Description




BACKGROUND OF THE INVENTION




1. Field of the Invention




The present invention relates to a medical robotic system.




2. Background Information




Blockage of a coronary artery may deprive the heart of the blood and oxygen required to sustain life. The blockage may be removed with medication or by an angioplasty. For severe blockage, a coronary artery bypass graft (CABG) is performed to bypass the blocked area of the artery. CABG procedures are typically performed by splitting the sternum and pulling open the chest cavity to provide access to the heart. An incision is made in the artery adjacent to the blocked area. The internal mammary artery is then severed and attached to the artery at the point of incision. The internal mammary artery bypasses the blocked area of the artery to again provide a full flow of blood to the heart. Splitting the sternum and opening the chest cavity can create tremendous trauma to the patient. Additionally, the cracked sternum prolongs the recovery period of the patient.




Computer Motion of Goleta, Calif. provides a system under the trademark ZEUS that allows a surgeon to perform a minimally invasive CABG procedure. The procedure is performed with instruments that are inserted through small incisions in the patient's chest. The instruments are controlled by robotic arms. Movement of the robotic arms and actuation of instrument end effectors are controlled by the surgeon through a pair of handles and a foot pedal that are coupled to an electronic controller. Alternatively, the surgeon can control the movement of an endoscope used to view the internal organs of the patient through voice commands.




The handles and a screen are typically integrated into a console that is operated by the surgeon to control the various robotic arms and medical instruments of a ZEUS system. Utilizing a robotic system to perform surgery requires a certain amount of training. It would be desirable to provide a system that would allow a second surgeon to assist another surgeon in controlling a robotic medical system. The second surgeon could both teach and assist a surgeon learning to perform a medical procedure with a ZEUS system. This would greatly reduce the time required to learn the operation of a robotically assisted medical system.




U.S. Pat. No. 5,217,003 issued to Wilk discloses a surgical system which allows a surgeon to remotely operate robotically controlled medical instruments through a telecommunication link. The Wilk system only allows for one surgeon to operate the robotic arms at a given time. Wilk does not disclose or contemplate a system which allows two different surgeons to operate the same set of robotic arms.




U.S. Pat. No. 5,609,560 issued to Ichikawa et al. and assigned to Olympus Optical Co. Ltd. discloses a system that allows an operator to control a plurality of different medical devices through a single interface. The Olympus patent does not disclose a system which allows multiple input devices to control a single medical device.




BRIEF SUMMARY OF THE INVENTION




A medical system that includes a single medical device that can be controlled by one of two input devices.











BRIEF DESCRIPTION OF THE DRAWINGS





FIG. 1 is a perspective view of a medical robotic system;

FIG. 2 is an exploded side view of an instrument of the robotic system;

FIG. 3 is an illustration of a network system;

FIG. 4 is an illustration of a “surgeon” side of the system;

FIG. 5 is an illustration of a “patient” side of the system;

FIG. 6 is a schematic showing various fields of a packet transmitted across a communication network;

FIG. 7 is an illustration showing an alternate embodiment of the network system.











DETAILED DESCRIPTION




Referring to the drawings more particularly by reference numbers, FIG. 1 shows a system 10 that can perform minimally invasive surgery. In one embodiment, the system 10 is used to perform a minimally invasive coronary artery bypass graft (MI-CABG) and other anastomotic procedures. Although a MI-CABG procedure is shown and described, it is to be understood that the system may be used for other surgical procedures. For example, the system can be used to suture any pair of vessels. The system 10 can be used to perform a procedure on a patient 12 that is typically lying on an operating table 14. Mounted to the operating table 14 is a first articulate arm 16, a second articulate arm 18 and a third articulate arm 20. The articulate arms 16, 18 and 20 are preferably mounted to the table 14 so that the arms are at the same reference plane as the patient. Although three articulate arms are shown and described, it is to be understood that the system may have any number of arms.




The first and second articulate arms 16 and 18 each have a surgical instrument 22 and 24, respectively, coupled to robotic arms 26 and 28, respectively. The third articulate arm 20 includes a robotic arm 30 that holds and moves an endoscope 32. The instruments 22 and 24, and endoscope 32, are inserted through incisions cut into the skin of the patient. The endoscope has a camera 34 that is coupled to a television monitor 36 which displays images of the internal organs of the patient.




The first 16, second 18, and third 20 articulate arms are coupled to a controller 38 which can control the movement of the arms. The controller 38 is connected to an input device 40 such as a foot pedal that can be operated by a surgeon to move the location of the endoscope 32. The controller 38 contains electrical circuits, such as a processor, to control the robotic arms 26, 28 and 30. The surgeon can view a different portion of the patient by depressing a corresponding button(s) of the pedal 40. The controller 38 receives the input signal(s) from the foot pedal 40 and moves the robotic arm 30 and endoscope 32 in accordance with the input commands of the surgeon. The robotic arm may be a device that is sold by the assignee of the present invention, Computer Motion, Inc. of Goleta, Calif., under the trademark AESOP. The system is also described in U.S. Pat. No. 5,657,429 issued to Wang et al., which is hereby incorporated by reference. Although a foot pedal 40 is shown and described, it is to be understood that the system may have other input means such as a hand controller or a speech recognition interface.




The instruments 22 and 24 of the first 16 and second 18 articulate arms, respectively, are controlled by a pair of master handles 42 and 44 that can be manipulated by the surgeon. The handles 42 and 44, and arms 16 and 18, have a master-slave relationship so that movement of the handles 42 and 44 produces a corresponding movement of the surgical instruments 22 and 24. The handles 42 and 44 may be mounted to a portable cabinet 46. The handles 42 and 44 are also coupled to the controller 38.




The controller 38 receives input signals from the handles 42 and 44, computes a corresponding movement of the surgical instruments, and provides output signals to move the robotic arms 26 and 28 and instruments 22 and 24. The entire system may be a product marketed by Computer Motion under the trademark ZEUS. The operation of the system is also described in U.S. Pat. No. 5,762,458 issued to Wang et al. and assigned to Computer Motion, which is hereby incorporated by reference.
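
The master-slave relationship described above lends itself to a short illustration. The sketch below is a hypothetical example of scaling handle displacements into instrument displacements; the function name and scale factor are assumptions for illustration, not the ZEUS controller's actual implementation.

```python
# Minimal sketch of a master-slave mapping: handle motion is scaled into
# instrument motion. Names and the scale factor are illustrative assumptions,
# not the actual controller implementation.

def scale_handle_motion(handle_delta, scale=0.2):
    """Map a handle displacement (x, y, z) to an instrument displacement."""
    return tuple(scale * axis for axis in handle_delta)

# Example: a 10 mm handle movement commands a 2 mm instrument movement.
print(scale_handle_motion((10.0, 0.0, 5.0)))  # -> (2.0, 0.0, 1.0)
```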





FIG. 2 shows one of the surgical instruments 22 or 24. The instrument 22 or 24 includes an end effector 48 that is coupled to an actuator rod 50. The actuator rod 50 is coupled to a motor 52 by an adapter 54. The motor 52 actuates the end effector 48 by moving the actuator rod 50. The actuator rod 50 is coupled to a force sensor 56 that can sense the force being applied by the end effector 48. The force sensor 56 provides an analog output signal that is sent to the controller shown in FIG. 1.




The adapter 54 is coupled to a gear assembly 58 located at the end of a robotic arm 26 or 28. The gear assembly 58 can rotate the adapter 54 and end effector 48. The actuator rod 50 and end effector 48 may be coupled to the force sensor 56 and motor 52 by a spring biased lever 60. The instrument 22 or 24 may be the same or similar to an instrument described in the '458 patent.





FIG. 3 shows a system 100 that allows two different input devices to control one medical device. The input devices may be a first console 102 and a second console 104. The consoles 102 and 104 may each include the screen 36, handles 42 and 44, foot pedal (not shown) and controller 38 shown in FIG. 1. The medical devices may include the robotic arms 26, 28 and 30 and/or instruments 22 and 24 shown in FIG. 1. In general, the system allows a surgeon at either console 102 or 104 to control a medical device 22, 24, 26, 28 and/or 30. For example, the surgeon at console 102 can move the robotic arms 26 and 28 through movement of the handles 42 and 44. The surgeon at console 104 can override the input from console 102 and control the movement of the robotic arms 26 and 28 through the movement of the console handles.




The consoles 102 and 104 are coupled to a network port 106 by a pair of interconnect devices 108 and 110. The network port 106 may be a computer that contains the necessary hardware and software to transmit and receive information through a communication link 112 in a communication network 114.




Consoles 102 and 104 provided by Computer Motion under the ZEUS mark provide output signals that may be incompatible with a computer. The interconnect devices 108 and 110 may provide an interface that conditions the signals for transmitting and receiving signals between the consoles 102 and 104 and the network computer 106.




It is to be understood that the computer and/or consoles 102 and 104 may be constructed so that the system does not require the interconnect devices 108 and 110. Additionally, the consoles 102 and 104 may be constructed so that the system does not require a separate networking computer 106. For example, the consoles 102 and 104 may be constructed and/or configured to directly transmit information through the communication network 114.




The system 100 may include a second network port 116 that is coupled to a device controller(s) 118 and the communication network 114. The device controller 118 controls the robotic arms 26, 28 and 30 and instruments 22 and 24. The second network port 116 may be a computer that is coupled to the controller 118 by an interconnect device 120. Although an interconnect device 120 and network computer 116 are shown and described, it is to be understood that the controller 118 can be constructed and configured to eliminate the device 120 and/or computer 116.




The communication network 114 may be any type of communication system including, but not limited to, the internet and other types of wide area networks (WANs), intranets, local area networks (LANs), public switched telephone networks (PSTN) and integrated services digital networks (ISDN). It is preferable to establish a communication link through a fiber optic network to reduce latency in the system. Depending upon the type of communication link selected, by way of example, the information can be transmitted in accordance with the user datagram protocol/internet protocol (UDP/IP) or ATM Adaptation Layer 1 (ATM/AAL1) protocols. The computers 106 and 116 may operate in accordance with an operating system sold under the designation VxWORKS by Wind River. By way of example, the computers 106 and 116 may be constructed and configured to operate with 100Base-T Ethernet and/or 155 Mbps fiber ATM systems.
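
As a rough sketch of the UDP/IP option mentioned above, the following snippet shows how one side might send a datagram of robotic data over UDP; the host address, port, and payload are hypothetical and not taken from the patent.

```python
# Rough sketch of sending robotic data as a UDP datagram, per the UDP/IP
# option mentioned above. Address, port, and payload are hypothetical.
import socket

def send_udp(payload: bytes, host="192.168.1.10", port=5000):
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))

send_udp(b"absolute-position-sample")
```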





FIG. 4 shows an embodiment of a “surgeon” side of the system. Each console 102 and 104 may be accompanied by a touchscreen computer 122 and an endoscope interface computer 124. The touchscreen computer 122 may be a device sold by Computer Motion under the trademark HERMES. The touchscreen 122 allows the surgeon to control and vary different functions and operations of the instruments 22 and 24. For example, the surgeon may vary the scale between movement of the handles 42 and 44 and movement of the instruments 22 and 24 through a graphical user interface (GUI) of the touchscreen 122. The touchscreen 122 may have another GUI that allows the surgeon to initiate an action such as closing the gripper of an instrument.




The endoscope computer 124 may allow the surgeon to control the movement of the robotic arm 30 and the endoscope 32 shown in FIG. 1. The endoscope computer 124 may be an alternate to, or in addition to, the foot pedal 40 shown in FIG. 1. The endoscope computer 124 may be a device sold by Computer Motion under the trademark SOCRATES. The touchscreen 122 and endoscope computers 124 may be coupled to the network computer 106 by RS232 interfaces.




A ZEUS console will transmit and receive information that is communicated as analog, digital or quadrature signals. The network computer 106 may have analog input/output (I/O) 126, digital I/O 128 and quadrature 130 interfaces that allow communication between the console 102 or 104 and the network 114. By way of example, the analog interface 126 may transceive data relating to handle position, tilt position, in/out position and foot pedal information (if used). The quadrature signals may relate to roll and pan position data. The digital I/O interface 128 may relate to cable wire sensing data, handle buttons, illuminators (LEDs) and audio feedback (buzzers). The position data is preferably absolute position information. By using absolute position information the robotic arms can still be moved even when some information is not successfully transmitted across the network 114. If incremental position information were provided, an error in the transmission would create a gap in the data and possibly inaccurate arm movement. The network computer 106 may further have a screen 132 that allows a user to operate the computer 106.
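
The preference for absolute position data can be illustrated with a small sketch: if one sample is lost in transit, the next absolute sample still places the arm correctly, whereas a lost increment leaves a permanent gap. The numbers below are hypothetical.

```python
# Illustration of why absolute position data tolerates lost samples better
# than incremental data (values and sampling are hypothetical).
absolute_samples = [10.0, 10.5, 11.0, 11.5]   # commanded joint positions
increments       = [10.0, 0.5, 0.5, 0.5]      # same motion expressed as deltas

received_abs = [absolute_samples[0], absolute_samples[1], absolute_samples[3]]  # sample 2 lost
position_abs = received_abs[-1]                # still ends at 11.5

received_inc = [increments[0], increments[1], increments[3]]                    # delta lost
position_inc = sum(received_inc)               # ends at 11.0: a permanent gap

print(position_abs, position_inc)  # 11.5 11.0
```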





FIG. 5 shows an embodiment of a “patient” side of the system 100. The controller 118 may include three separate controllers 134, 136 and 138. The controller 134 may receive input commands, perform kinematic computations based on the commands, and drive output signals to move the robotic arms 26 and 28 and accompanying instruments 22 and 24 to a desired position. The controller 136 may receive commands that are processed to both move and actuate the instruments 22 and 24. Controller 138 may receive input commands, perform kinematic computations based on the commands, and drive output signals to move the robotic arm 30 and accompanying endoscope 32.




Controllers 134 and 136 may be coupled to the network computer 116 by digital I/O 140 and analog I/O 142 interfaces. The computer 116 may be coupled to the controller 138 by an RS232 interface. Additionally, the computer 116 may be coupled to corresponding RS232 ports of the controllers 134 and 136. The RS232 ports of the controllers 134 and 136 may receive data such as movement scaling and end effector actuation.




The robotic arms and instruments contain sensors, encoders, etc. that provide feedback information. Some or all of this feedback information may be transmitted over the network 114 to the surgeon side of the system. By way of example, the analog feedback information may include handle feedback, tilt feedback, in/out feedback and foot pedal feedback. Digital feedback may include cable sensing, buttons, illumination and auditory feedback. The computer 116 may be coupled to a screen 142.




The computers 106 and 116 may packetize the information for transmission through the communication network 114. Each packet will contain two types of data, robotic data and RS232 data. Robotic data may include position information of the robots, including input commands to move the robots and position feedback from the robots. RS232 data may include functioning data such as instrument scaling and actuation.




Because the system transmits absolute position data, the packets of robotic data can be received out of sequence. This may occur when using a UDP/IP protocol, which uses a best-efforts methodology. The computers 106 and 116 are constructed and configured to disregard any “late” arriving packets with robotic data. For example, the computer 106 may transmit packets 1, 2 and 3. The computer 116 may receive the packets in the order of 1, 3 and 2. The computer 116 will disregard the second packet 2. Disregarding the packet instead of requesting a re-transmission of the data reduces the latency of the system. It is desirable to minimize latency to create a “real time” operation of the system.
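
A minimal sketch of the "disregard late packets" behavior described above, keyed on the SEQ # field; the packet representation and variable names are assumptions for illustration.

```python
# Minimal sketch of dropping out-of-order robotic-data packets based on a
# sequence number, as described above. The packet layout is hypothetical.
last_seq = -1

def accept(packet):
    """Return True if the packet is newer than anything already applied."""
    global last_seq
    if packet["seq"] <= last_seq:
        return False          # "late" packet: disregard, do not request resend
    last_seq = packet["seq"]
    return True

for pkt in [{"seq": 1}, {"seq": 3}, {"seq": 2}]:
    print(pkt["seq"], accept(pkt))   # packet 2 arrives after 3 and is dropped
```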




It is preferable to have the RS232 information received in strict sequential order. Therefore, the receiving computer will request a re-transmission of RS232 data from the transmitting computer if the data is not received without error. RS232 data such as motion scaling and instrument actuation must be accurately transmitted and processed to ensure that there is not an inadvertent command.
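
The in-order, acknowledged handling of RS232 data can be sketched as a simple acknowledge-or-retransmit loop. This is only a schematic of the idea; it is not the HNP protocol or the actual queue implementation, and the message contents are hypothetical.

```python
# Schematic of in-order, acknowledged delivery for RS232 data: the sender keeps
# a message queued until the receiver acknowledges it, retransmitting otherwise.
# This illustrates the idea only; message contents are hypothetical.
from collections import deque

queue = deque([(1, b"scale=0.2"), (2, b"actuate-gripper")])

def send(msg_id, payload) -> bool:
    """Transmit the message and wait for an acknowledgement (stubbed here)."""
    return True  # a real sender would return False on timeout, forcing a resend

while queue:
    msg_id, payload = queue[0]
    if send(msg_id, payload):
        queue.popleft()        # acknowledged: safe to discard
    # else: leave the message queued and retransmit on the next pass
```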




The computers 106 and 116 can multiplex the RS232 data from the various input sources. The computers 106 and 116 may have first-in first-out queues (FIFO) for transmitting information. Data transmitted between the computer 106 and the various components within the surgeon side of the system may be communicated through a protocol provided by Computer Motion under the name HERMES NETWORK PROTOCOL (HNP). Likewise, information may be transmitted between components on the patient side of the system in accordance with HNP.




In addition to the robotic and RS232 data, the patient side of the system will transmit video data from the endoscope camera 34. To reduce latency in the system, the computer 116 can multiplex the video data with the robotic/RS232 data onto the communication network. The video data may be compressed using conventional compression techniques, such as JPEG, for transmission to the surgeon side of the system.




Each packet 150 may have the fields shown in FIG. 6. The SOURCE ID field includes identification information of the input device or medical device from which the data originates. The DESTINATION ID field includes identification information identifying the input device or medical device that is to receive the data. The OPCODE field defines the type of commands being transmitted. The PRIORITY field defines the priority of the input device. The priority data may be utilized to determine which input device has control of the medical device. The SEQ # field provides a packet sequence number so that the receiving computer can determine whether the packet is out of sequence.




The TX Rate field is the average rate at which packets are being transmitted. The RX Rate field is the average rate at which packets are being received. The RS232 ACK field includes an acknowledgement count for RS232 data. RS232 data is typically maintained within the queue of a computer until an acknowledgement is received from the receiving computer that the data has been received.




The RS232 POS field is a counter relating to transmitted RS232 data. The RS232 ID field is an identification for RS232 data. The RS232 MESS SZ field contains the size of the packet. The RS232 BUFFER field contains the content length of the packet. The DATA field contains data being transmitted and may contain separate subfields for robotic and RS232 data. CS is a checksum field used to detect errors in the transmission of the packet.
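
The field layout described for FIG. 6 can be summarized as a packed header. The sketch below uses assumed byte widths, since the patent does not specify field sizes, and omits the variable-length RS232 BUFFER, DATA and CS fields.

```python
# Sketch of the FIG. 6 packet layout as a packed header. The field order follows
# the description above; the byte widths are assumptions, since the patent does
# not specify them.
import struct

HEADER_FMT = "<HHBBIHHHHHH"   # little-endian, fixed-width header fields

def pack_header(source_id, dest_id, opcode, priority, seq,
                tx_rate, rx_rate, rs232_ack, rs232_pos, rs232_id, rs232_sz):
    return struct.pack(HEADER_FMT, source_id, dest_id, opcode, priority, seq,
                       tx_rate, rx_rate, rs232_ack, rs232_pos, rs232_id, rs232_sz)

header = pack_header(source_id=104, dest_id=118, opcode=1, priority=2, seq=42,
                     tx_rate=100, rx_rate=98, rs232_ack=7, rs232_pos=12,
                     rs232_id=3, rs232_sz=16)
# The RS232 BUFFER, DATA, and checksum (CS) fields would follow the header.
print(len(header))  # 22 bytes with these assumed widths
```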




Either computer 106 or 116 can be used as an arbitrator between the input devices and the medical devices. For example, the computer 116 may receive data from both consoles 102 and 104. The packets of information from each console 102 and 104 may include priority data in the PRIORITY fields. The computer 116 will route the data to the relevant device (e.g., robot, instrument, etc.) in accordance with the priority data. For example, console 104 may have a higher priority than console 102. The computer 116 will route data to control a robot from console 104 to the exclusion of data from console 102, so that the surgeon at console 104 has control of the arm.
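
The priority-based routing just described can be sketched as follows: packets from both consoles arrive at the arbitrator, and for each destination device only the packet carrying the highest PRIORITY value is forwarded. The packet layout and priority values are illustrative assumptions.

```python
# Minimal sketch of PRIORITY-based arbitration: for each destination device,
# forward only the packet from the input device with the highest priority.
# Packet layout and priority values are illustrative assumptions.
def arbitrate(packets):
    selected = {}
    for pkt in packets:
        dest = pkt["dest_id"]
        if dest not in selected or pkt["priority"] > selected[dest]["priority"]:
            selected[dest] = pkt
    return list(selected.values())

packets = [
    {"source_id": 102, "dest_id": 26, "priority": 1, "data": "move A"},
    {"source_id": 104, "dest_id": 26, "priority": 2, "data": "move B"},
]
print(arbitrate(packets))  # only console 104's command reaches robotic arm 26
```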




As an alternate embodiment, the computer 116 may be constructed and configured to provide priority according to the data in the SOURCE ID field. For example, the computer 116 may be programmed to always provide priority for data that has the source ID from console 104. The computer 116 may have a hierarchical tree that assigns priority for a number of different input devices.
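
A sketch of the alternate SOURCE ID scheme: a fixed ranking table, keyed by source ID, decides which input device wins when two commands target the same medical device. The ranking values below are assumptions for illustration.

```python
# Sketch of arbitration by SOURCE ID: a fixed ranking (hypothetical values)
# decides which input device wins when two commands target the same device.
SOURCE_RANK = {104: 0, 102: 1, 122: 2}   # lower number = higher priority

def pick_winner(pkt_a, pkt_b):
    return min((pkt_a, pkt_b), key=lambda p: SOURCE_RANK.get(p["source_id"], 99))

print(pick_winner({"source_id": 102}, {"source_id": 104}))  # console 104 wins
```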




Alternatively, the computer 106 may function as the arbitrator, screening the data before transmission across the network 114. The computer 106 may have a priority scheme that always awards priority to one of the consoles 102 or 104. Additionally, or alternatively, one or more of the consoles 102 and 104 may have a mechanical and/or software switch that can be actuated to give the console priority. The switch may function as an override feature to allow a surgeon to assume control of a procedure.




In operation, the system initially performs a start-up routine. The ZEUS system is typically configured to start up with data from the consoles. The consoles, however, may not be in communication with the robotic arms, instruments, etc. during the start-up routine, so the system does not have the console data required for system boot. The computer 116 may automatically drive the missing console input data to default values. The default values allow the patient side of the system to complete the start-up routine. Likewise, the computer 106 may also drive missing incoming signals from the patient side of the system to default values to allow the consoles 102 and/or 104 to boot up. Driving missing signals to a default value may be part of a network local mode. The local mode allows one or more consoles to “hot plug” into the system without shutting the system down.
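
The default-value behavior during start-up and the "local mode" hot-plug case can be sketched as a simple substitution step: any expected console signal that has not arrived is replaced by a safe default. The signal names and default values are assumptions, not the actual ZEUS signal set.

```python
# Sketch of driving missing console signals to default values so the patient
# side can complete its start-up routine. Signal names and defaults are
# illustrative assumptions.
DEFAULTS = {"handle_x": 0.0, "handle_y": 0.0, "handle_z": 0.0, "pedal": 0}

def fill_missing(received: dict) -> dict:
    """Return a complete signal set, substituting defaults for absent values."""
    return {name: received.get(name, default) for name, default in DEFAULTS.items()}

# No console connected yet: every signal falls back to its quiescent default.
print(fill_missing({}))
# Console partially connected ("hot plug"): only missing signals are defaulted.
print(fill_missing({"handle_x": 1.5}))
```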




Additionally, if communication between the surgeon and patient sides of the system is interrupted during operation, the computer 106 will again force the missing data to default values. The default values may be quiescent signal values to prevent unsafe operation of the system. The components on the patient side will be left at the last known value so that the instruments and arms do not move.




Once the start-up routines have been completed and the communication link has been established, the surgeons can operate the consoles. The system is quite useful for medical procedures wherein one of the surgeons is a teacher and the other surgeon is a pupil. The arbitration function of the system allows the teacher to take control of robot movement and instrument actuation at any time during the procedure. This allows the teacher to instruct the pupil on the procedure and/or the use of a medical robotic system.




Additionally, the system may allow one surgeon to control one medical device and another surgeon to control the other device. For example, one surgeon may move the instruments 22 and 24 while the other surgeon moves the endoscope 32, or one surgeon may move one instrument 22 or 24 while the other surgeon moves the other instrument 24 or 22.





FIG. 7 shows an alternate embodiment, wherein one or more of the consoles 102 and 104 has an alternate communication link 160. The alternate link may be a telecommunication network that allows the console 102 to be located at a remote location while console 104 is in relatively close proximity to the robotic arms, etc. For example, console 102 may be connected to a public phone network, while console 104 is coupled to the controller 118 by a LAN. Such a system would allow telesurgery with the robotic arms, instruments, etc. The surgeon and patient sides of the system may be coupled to the link 160 by network computers 162 and 164.




While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that this invention not be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art.



Claims
  • 1. A medical system, comprising:a first medical device; a first input device that can control said first medical device; a second input device that can control said first medical device; and an arbitrator that is coupled to said first input device and allows either said first input device or said second input device to control said first medical device.
  • 2. The system of claim 1, wherein said first medical device includes a robotic arm.
  • 3. The system of claim 2, wherein said first medical device includes an instrument.
  • 4. The system of claim 1, wherein said first input device includes a console that has a handle.
  • 5. The system of claim 4, wherein said second input device includes a console that has a video screen.
  • 6. The system of claim 1, wherein said second input device includes a touchpad screen.
  • 7. The system of claim 1, wherein said first and second input devices each transmit a packet of information, each packet including a priority that is used by said arbitrator to allow either said first input device or second input device to control said first medical device.
  • 8. The system of claim 1, wherein said first and second input devices each transmit a packet of information, each packet including a source address that is used by said arbitrator to allow either said first input device or second input device to control said first medical device.
  • 9. The system of claim 1, wherein said arbitrator includes a computer.
  • 10. The system of claim 1, further comprising a communication link that couples said first and second input devices with said first medical device.
  • 11. The system of claim 10, further comprising a second medical device that is controlled by a third input device.
  • 12. The system of claim 11, wherein said first medical device includes a robotic arm that moves a medical instrument and said second medical device includes a robotic arm that moves an endoscope coupled to a camera.
  • 13. The system of claim 12, further comprising a communication link that couples said first and second medical devices with said first and second input devices, and a computer that multiplexes robotic data from said robotic arms with video data from said camera onto said communication link.
  • 14. A medical system, comprising:a first medical device; first input means for controlling said first medical device; second input means for controlling said first medical instrument; and, arbitrator means for allowing either said first input device or said second input device to control said first medical device.
  • 15. The system of claim 14, wherein said first medical device includes a robotic arm.
  • 16. The system of claim 15, wherein said first medical device includes an instrument.
  • 17. The system of claim 14, wherein said first input means includes a console that has a handle.
  • 18. The system of claim 17, wherein said second input means includes a console that has a video screen.
  • 19. The system of claim 14, wherein said second input means includes a touchpad screen.
  • 20. The system of claim 14, wherein said first and second input means each transmit a packet of information, each packet including a priority that is used by said arbitrator means to allow either said first input means or second input means to control said first medical device.
  • 21. The system of claim 14, wherein said first and second input means each transmit a packet of information, each packet including a source address that is used by said arbitrator means to allow either said first input means or second input means to control said first medical device.
  • 22. The system of claim 14, wherein said arbitrator means includes a computer.
  • 23. The system of claim 14, further comprising communication means for coupling said first and second input means with said first medical device.
  • 24. The system of claim 23, further comprising third input means for controlling a second medical device.
  • 25. The system of claim 24, wherein said first medical device includes a robotic arm that moves a medical instrument and said second medical device includes a robotic arm that moves an endoscope coupled to a camera.
  • 26. The system of claim 25, further comprising communication means for coupling said first and second medical devices with said first and second input means, and multiplexor means for multiplexing robotic data from said robotic arms with video data from said camera onto said communication means.
  • 27. A medical system, comprising:a first robotic arm; a first medical instrument coupled to said first robotic arm; a second robotic arm; a second medical instrument coupled to said second robotic arm; a third robotic arm that can hold an endoscope coupled to a camera; a first console that can control said first and second robotic arms; a second console that can control said first and second robotic arms; a first endoscopic input device that can control said third robotic arm; a second endoscopic input device that can control said third robotic arm; and, an arbitrator coupled to said first robotic arm, said second robotic arm, said third robotic arm, said first console, said second console, said first endoscopic input device and said second endoscopic input device.
  • 28. The system of claim 27, wherein said first and second consoles each transmit a packet of information including a priority to said arbitrator, said priority is used by said arbitrator to determine whether said first and second robotic arms are to be controlled by either said first console or said second console.
  • 29. The system of claim 27, wherein said first and second consoles each transmit a packet of information including a source address to said arbitrator, said source addresses are used by said arbitrator to determine whether said first and second robotic arms are to be controlled by said first console or said second console.
  • 30. The system of claim 27, further comprising a communication link coupled to said arbitrator and said first and second consoles, said arbitrator multiplexes robotic data from said first, second and third robotic arms with video data from the camera onto said communication link.
  • 31. A medical system, comprising: a first robotic arm; a first medical instrument coupled to said first robotic arm; a second robotic arm; a second medical instrument coupled to said second robotic arm; a third robotic arm that can hold an endoscope coupled to a camera; a first console that can control said first and second robotic arms; a second console that can control said first and second robotic arms; a first endoscopic input device that can control said third robotic arm; a second endoscopic input device that can control said third robotic arm; and, arbitrator means for allowing said first and second robotic arms to be controlled by either said first console or said second console, and said third robotic arm to be controlled by either said first endoscopic input device or said second endoscopic input device.
  • 32. The system of claim 31, wherein said first and second consoles each transmit a packet of information including a priority to said arbitrator means, said priority is used by said arbitrator means to determine whether said first and second robotic arms are to be controlled by either said first console or said second console.
  • 33. The system of claim 31, wherein said first and second consoles each transmit a packet of information including a source address to said arbitrator means, said source addresses are used by said arbitrator means to determine whether said first and second robotic arms are to be controlled by said first console or said second console.
  • 34. The system of claim 31, further comprising communication means for coupling said arbitrator means with said first and second consoles, said arbitrator means multiplexes robotic data from said first, second and third robotic arms with video data from the camera onto said communication means.
  • 35. A medical system, comprising: a first medical device; a second medical device; and an arbitrator coupled to said first and second medical devices.
  • 36. The system of claim 35, wherein said arbitrator allows control of said first medical device in accordance with a priority included in a packet of information transmitted to said arbitrator.
  • 37. The system of claim 35, wherein said arbitrator allows control of said first medical device in accordance with a source address included in a packet of information transmitted to said arbitrator.
  • 38. The system of claim 35, wherein said first medical device includes a robotic arm.
  • 39. A medical system, comprising:a first input device; a second input device; and, an arbitrator coupled to said first and second input devices.
  • 40. The system of claim 39, wherein said first input device transmits a first packet of information including a first priority and said second input device transmits a second packet of information including a second priority, said arbitrator transmits either said first or second packet of information based on said first and second priority.
  • 41. The system of claim 39, wherein said first and second input devices each include a console.
  • 42. The system of claim 41, wherein each console includes a pair of handles and a screen.
  • 43. A medical robotic system architecture, comprising:a communication network that includes an arbitrator; a robotic arm coupled to said communication network; a medical instrument coupled to said robotic arm; a first surgeon console coupled to said communication network; and, a second surgeon console coupled to said communication network.
  • 44. The architecture of claim 43, wherein said communication network includes an interconnect device.
  • 45. The architecture of claim 43, wherein said first and second consoles each transmit a packet of information, said packet containing a priority field.
  • 46. A medical robotic system architecture, comprising:a medical instrument coupled to said robotic arm; movement means for moving said medical instrument; first surgeon control means for controlling said medical instrument; second surgeon control means for controlling said medical instrument; and, communication means that includes an arbitrator for coupling said movement means and said medical instrument to said first and second surgeon control means.
  • 47. The architecture of claim 46, wherein said communication means includes an interconnect device.
  • 48. The architecture of claim 46, wherein said first and second control means each transmit a packet of information, said packet containing a priority field.
  • 49. A medical robotic system, comprising:a medical instrument; a robotic arm; a communication network coupled to said robotic arm and said medical instrument; an input device; and, a computer coupled to said input device, said computer transmits a packet of information over said communication network, said packet including a priority field and a data field that contains robotic data.
  • 50. The system of claim 49, wherein said data field includes RS232 data.
  • 51. The system of claim 49, wherein said input device includes a handle.
  • 52. The system of claim 49, further comprising a screen coupled to said computer.
  • 53. A medical robotic system, comprising:a medical instrument; a robotic arm; a communication network coupled to said robotic arm and said medical instrument; input means for receiving a command to control said medical instrument; and, transmission means for transmitting a packet of information over said communication network, said packet including a priority field and a data field that contains robotic data that corresponds to the command.
  • 54. The system of claim 53, wherein said data field includes RS232 data.
  • 55. The system of claim 53, wherein said input means includes a handle.
  • 56. The system of claim 53, further comprising a screen coupled to said transmission means.
  • 57. A medical robotic system architecture, comprising:a communication network; a robotic arm coupled to said communication network; a medical instrument coupled to said robotic arm; a first surgeon console that is coupled to said communication network and transmits a packet of information that includes a priority field; and, a second surgeon console that is coupled to said communication network and transmits a packet of information that includes a priority field.
  • 58. The architecture of claim 43, wherein said communication network includes an interconnect device.
  • 59. A medical robotic system architecture, comprising:a medical instrument coupled to said robotic arm; movement means for moving said medical instrument; first surgeon control means for controlling said medical instrument and transmitting a packet of information that includes a priority field; second surgeon control means for controlling said medical instrument and transmitting a packet of information that includes a priority field; and, communication means for coupling said movement means and said medical instrument to said first and second surgeon control means.
  • 60. The architecture of claim 46, wherein said communication means includes an interconnect device.
US Referenced Citations (221)
Number Name Date Kind
977825 Murphy Dec 1910 A
3171549 Orloff Mar 1965 A
3280991 Melton et al. Oct 1966 A
4058001 Waxman Nov 1977 A
4128880 Cray, Jr. Dec 1978 A
4221997 Flemming Sep 1980 A
4367998 Causer Jan 1983 A
4401852 Noso et al. Aug 1983 A
4456961 Price et al. Jun 1984 A
4460302 Moreau et al. Jul 1984 A
4474174 Petruzzi Oct 1984 A
4491135 Klein Jan 1985 A
4503854 Jako Mar 1985 A
4517963 Michel May 1985 A
4523884 Clement et al. Jun 1985 A
4586398 Yindra May 1986 A
4604016 Joyce Aug 1986 A
4616637 Caspari et al. Oct 1986 A
4624011 Watanabe et al. Nov 1986 A
4633389 Tanaka et al. Dec 1986 A
4635292 Mori et al. Jan 1987 A
4635479 Salisbury, Jr. et al. Jan 1987 A
4641292 Tunnell et al. Feb 1987 A
4655257 Iwashita Apr 1987 A
4672963 Barken Jun 1987 A
4676243 Clayman Jun 1987 A
4728974 Nio et al. Mar 1988 A
4762455 Coughlan et al. Aug 1988 A
4791934 Brunnett Dec 1988 A
4791940 Hirschfeld et al. Dec 1988 A
4794912 Lia Jan 1989 A
4815006 Andersson et al. Mar 1989 A
4815450 Patel Mar 1989 A
4837734 Ichikawa et al. Jun 1989 A
4852083 Niehaus et al. Jul 1989 A
4853874 Iwamoto et al. Aug 1989 A
4854301 Nakajima Aug 1989 A
4860215 Seraji Aug 1989 A
4863133 Bonnell Sep 1989 A
4883400 Kuban et al. Nov 1989 A
4930494 Takehana et al. Jun 1990 A
4945479 Rusterholz et al. Jul 1990 A
4949717 Shaw Aug 1990 A
4954952 Ubhayakar et al. Sep 1990 A
4965417 Massie Oct 1990 A
4969709 Sogawa et al. Nov 1990 A
4969890 Sugita et al. Nov 1990 A
4979933 Runge Dec 1990 A
4979949 Matsen, III et al. Dec 1990 A
4980626 Hess et al. Dec 1990 A
4989253 Liang et al. Jan 1991 A
4996975 Nakamura Mar 1991 A
5019968 Wang et al. May 1991 A
5020001 Yamamoto et al. May 1991 A
5046375 Salisbury, Jr. et al. Sep 1991 A
5065741 Uchiyama et al. Nov 1991 A
5078140 Kwoh Jan 1992 A
5086401 Glassman et al. Feb 1992 A
5091656 Gahn Feb 1992 A
5097829 Quisenberry Mar 1992 A
5097839 Allen Mar 1992 A
5098426 Sklar et al. Mar 1992 A
5105367 Tsuchihashi et al. Apr 1992 A
5109499 Inagami et al. Apr 1992 A
5123095 Papadopoulos et al. Jun 1992 A
5131105 Harrawood et al. Jul 1992 A
5142930 Allen et al. Sep 1992 A
5145227 Monford, Jr. Sep 1992 A
5166513 Keenan et al. Nov 1992 A
5175694 Amato Dec 1992 A
5182641 Diner et al. Jan 1993 A
5184601 Putman Feb 1993 A
5187574 Kosemura et al. Feb 1993 A
5196688 Hesse et al. Mar 1993 A
5201325 McEwen et al. Apr 1993 A
5201743 Haber et al. Apr 1993 A
5217003 Wilk Jun 1993 A
5221283 Chang Jun 1993 A
5228429 Hatano Jul 1993 A
5230623 Guthrie et al. Jul 1993 A
5236432 Matsen, III et al. Aug 1993 A
5251127 Raab Oct 1993 A
5257999 Slanetz, Jr. Nov 1993 A
5271384 McEwen et al. Dec 1993 A
5279309 Taylor et al. Jan 1994 A
5282806 Haber Feb 1994 A
5289273 Lang Feb 1994 A
5289365 Caldwell et al. Feb 1994 A
5299288 Glassman et al. Mar 1994 A
5300926 Stoeckl Apr 1994 A
5303148 Mattson et al. Apr 1994 A
5304185 Taylor Apr 1994 A
5305203 Raab Apr 1994 A
5305427 Nagata Apr 1994 A
5309717 Minch May 1994 A
5313306 Kuban et al. May 1994 A
5320630 Ahmed Jun 1994 A
5337732 Grundfest et al. Aug 1994 A
5339799 Kami et al. Aug 1994 A
5343385 Joskowicz et al. Aug 1994 A
5343391 Mushabac Aug 1994 A
5345538 Narayannan et al. Sep 1994 A
5357962 Green Oct 1994 A
5368015 Wilk Nov 1994 A
5368428 Hussey et al. Nov 1994 A
5371536 Yamaguchi Dec 1994 A
5382885 Salcudean et al. Jan 1995 A
5388987 Badoz et al. Feb 1995 A
5395369 McBrayer et al. Mar 1995 A
5397323 Taylor et al. Mar 1995 A
5402801 Taylor Apr 1995 A
5403319 Matsen, III et al. Apr 1995 A
5408409 Glassman et al. Apr 1995 A
5410638 Colgate et al. Apr 1995 A
5417210 Funda et al. May 1995 A
5417701 Holmes May 1995 A
5422521 Neer et al. Jun 1995 A
5431645 Smith et al. Jul 1995 A
5434457 Josephs et al. Jul 1995 A
5442728 Kaufman et al. Aug 1995 A
5443484 Kirsch et al. Aug 1995 A
5445166 Taylor Aug 1995 A
5451924 Massimino et al. Sep 1995 A
5455766 Hirschfeld et al. Oct 1995 A
5458547 Teraoka et al. Oct 1995 A
5458574 Machold et al. Oct 1995 A
5476010 Fleming et al. Dec 1995 A
5490117 Oda et al. Feb 1996 A
5490843 Hildwein et al. Feb 1996 A
5506912 Nagasaki et al. Apr 1996 A
5512919 Araki Apr 1996 A
5515478 Wang May 1996 A
5544654 Murphy et al. Aug 1996 A
5553198 Wang et al. Sep 1996 A
5562503 Ellman et al. Oct 1996 A
5571110 Matsen, III et al. Nov 1996 A
5572999 Funda et al. Nov 1996 A
5598269 Kitaevich et al. Jan 1997 A
5609560 Ichikawa et al. Mar 1997 A
5626595 Sklar et al. May 1997 A
5629594 Jacobus et al. May 1997 A
5630431 Taylor May 1997 A
5631973 Green May 1997 A
5636259 Khutoryansky et al. Jun 1997 A
5649956 Jensen et al. Jul 1997 A
5657429 Wang et al. Aug 1997 A
5658250 Blomquist et al. Aug 1997 A
5676673 Ferre et al. Oct 1997 A
5695500 Taylor et al. Dec 1997 A
5696574 Schwaegerle Dec 1997 A
5696837 Green Dec 1997 A
5718038 Takiar et al. Feb 1998 A
5727569 Benetti et al. Mar 1998 A
5735290 Sterman et al. Apr 1998 A
5737711 Abe Apr 1998 A
5749362 Funda et al. May 1998 A
5754741 Wang et al. May 1998 A
5762458 Wang et al. Jun 1998 A
5766126 Anderson Jun 1998 A
5776126 Wilk et al. Jul 1998 A
5779623 Bonnell Jul 1998 A
5792135 Madhani et al. Aug 1998 A
5792178 Welch et al. Aug 1998 A
5797900 Madhani et al. Aug 1998 A
5800423 Jensen Sep 1998 A
5807284 Foxlin Sep 1998 A
5807377 Madhani et al. Sep 1998 A
5807378 Jensen et al. Sep 1998 A
5808665 Green Sep 1998 A
5810880 Jensen et al. Sep 1998 A
5813813 Daum et al. Sep 1998 A
5814038 Jensen et al. Sep 1998 A
5817084 Jensen Oct 1998 A
5825982 Wright et al. Oct 1998 A
5827319 Carlson et al. Oct 1998 A
5836869 Kudo et al. Nov 1998 A
5844824 Newman et al. Dec 1998 A
5855583 Wang et al. Jan 1999 A
5859934 Green Jan 1999 A
5860995 Berkelaar Jan 1999 A
5871017 Mayer Feb 1999 A
5876325 Mizuno et al. Mar 1999 A
5878193 Wang et al. Mar 1999 A
5882206 Gillio Mar 1999 A
5887121 Funda et al. Mar 1999 A
5898599 Massie et al. Apr 1999 A
5904702 Ek et al. May 1999 A
5906630 Anderhub et al. May 1999 A
5911036 Wright et al. Jun 1999 A
5920395 Schultz Jul 1999 A
5931832 Jensen Aug 1999 A
5950629 Taylor et al. Sep 1999 A
5951475 Gueziec et al. Sep 1999 A
5951587 Qureshi et al. Sep 1999 A
5954731 Yoon Sep 1999 A
5957902 Teves Sep 1999 A
5967980 Ferre et al. Oct 1999 A
5971976 Wang et al. Oct 1999 A
5980782 Hershkowitz et al. Nov 1999 A
5984932 Yoon Nov 1999 A
6006127 Van Der Brug et al. Dec 1999 A
6024695 Taylor et al. Feb 2000 A
6080181 Jensen et al. Jun 2000 A
6106511 Jensen Aug 2000 A
6120433 Mizuno et al. Sep 2000 A
6132368 Cooper Oct 2000 A
6201984 Funda et al. Mar 2001 B1
6206903 Ramans Mar 2001 B1
6223100 Green Apr 2001 B1
6226566 Funda et al. May 2001 B1
6231526 Taylor et al. May 2001 B1
6246200 Blumenkranz et al. Jun 2001 B1
6259806 Green Jul 2001 B1
6309397 Julian et al. Oct 2001 B1
6312435 Wallace et al. Nov 2001 B1
6331181 Tierney et al. Dec 2001 B1
6346072 Cooper Feb 2002 B1
6364888 Nieneyer et al. Apr 2002 B1
6368332 Salcudean et al. Apr 2002 B1
6371952 Madhani et al. Apr 2002 B1
6490490 Uchikubo et al. Dec 2002 B1
Foreign Referenced Citations (12)
Number Date Country
U 9204118.3 Jul 1992 DE
4310842 Jan 1995 DE
0239409 Sep 1987 EP
0424687 May 1991 EP
0776738 Jun 1997 EP
WO 9104711 Apr 1991 WO
WO 9220295 Nov 1992 WO
WO 9313916 Jul 1993 WO
WO 9418881 Sep 1994 WO
WO 9426167 Nov 1994 WO
WO 9715240 May 1997 WO
WO 9825666 Jun 1998 WO
Non-Patent Literature Citations (54)
Entry
Mack, Minimally invasive and robotic, 2001, Internet, pp. 568-572.*
Schaaf, Robotic surgery: The future is now, 2001, Internet, pp. 1-13.*
Noonan, NBC Newsweek, The ultimate remote control, 1999, Internet, pp. 1-11.*
Rotmes et al., Digital trainer developed for robotic assisted cardiac surgery, no date, Internet, pp. 1-7.*
Butner et al., A real-time system for tele-surgery, 2001, Internet, pp. 236-243.*
Lai et al., Evaluating control modes for constrained robotic surgery, 2000, Internet, pp. 1-7.*
Cavusoglu et al., A laparoscopic telesurgical workstation, 1999, IEEE/Internet, pp. 728-739.*
Lapietra et al., Will surgeons of the “Computer-game generation” Have an advantage in developing robotic skills?, 2001, Internet, pp. 26-30.*
SVI, Minimally invasive surgery, 1998, Internet, pp. 1-4.*
Howe et al., Robotics for surgery, 1999, Internet, pp. 211-242.*
Willet, Telesurgery, 2001, Internet, pp. 1-3.*
Richard, Emerging technologies for surgery in the 21st century, 1999, Internet, pp. 1-9.*
Parsell, Surgeons in U.S. perform operation on France via robot, 2001, Internet, pp. 1-5.*
“Endocorporeal Surgery Using Remote Manipulators”(Ned S. Rasor and J.W. Spickler) Remotely Manned Systems—Exploration and Operation in Space, California Institute of Technology 1973.
“A Survey Study of Teleoperators, Robotics, and Remote Systems Technology” (Arthur D. Alexander, III) Remotely Manned Systems—Exploration and Operation in Space, California Institute of Technology 1973.
“Impacts of Telemation on Modern Society” (Arthur D. Alexander, III), On the Theory and Practice of Robots and Manipulators vol. II, 1974.
Transcript of a video presented by SRI at the 3rd World Congress of Endoscopic Surgery in Bordeaux on Jun. 18-20, 1992, in Washington on Apr. 9, 1992, and in San Diego, CA on Jun. 4-7, 1992 entitled “Telepresence Surgery—The Future of Minimally Invasive Medicine”.
Statutory Declaration of Dr. Phillip S. Green, presenter of the video entitled “Telepresence Surgery—The Future of Minimally Invasive Medicine”.
Abstract of a presentation “Telepresence: Advanced Teleoperator Technology for Minimally Invasive Surgery” (P. Green et al.) given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992.
Abstract of a presentation “Telepresence: Advanced Teleoperator Technology for Minimally Invasive Surgery”, (P. Green et al.) given at “Medicine meets virtual reality” symposium in San Diego, Jun. 4-7, 1992.
Abstract of a presentation “Camera Control for Laparoscopic Surgery by Speech-Recognizing Robot: Constant Attention and Better Use of Personnel” (Colin Besant et al.) given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992.
“A Literature Review: Robots in Medicine” (B. Preising et al.) IEEE June 1991.
“Robots for the Operating Room” (Elizabeth Corcoran), The New York Times, Sunday Jul. 19, 1992, Section 3, p. 9, Column 1.
“Taming the Bull: Safety in a Precise Surgical Robot”(Russell H. Taylor et al.), IEEE 1991.
Abstract of a presentation “Design Considerations of a New Generation Endoscope Using Robotics and Computer Vision Technology” (S.M. Krishnan et al.) given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992.
Abstract of a presentation “3-D vision Technology Applied to Advanced Minimally Invasive Surgery Systems” given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992.
“Analysis of the Surgeon's Grasp for Telerobotic Surgical Manipulation” (Frank Tendick and Lawrence Stark), IEEE 1989.
“Kinematic Control and Visual Display of Redundant Teleoperators”(Hardi Das et al.), IEEE 1989.
“A New System for Computer Assisted Neurosurgery” (S. Lavallee), IEEE 1989.
“An Advanced Control Micromanipulator for Surgical Applications” (Ben Gayed et al.), Systems Science vol. 13 1987.
“Force Feedback-Based Telemicromanipulation for Robot Surgery on Soft Tissue” (A.M. Sabatini et al.), IEEE 1989.
“Six-Axis Bilateral Control of an Articulated Slave Manipulator Using a Cartesian Master Manipulator” (Masao Inoue), Advanced Robotics 1990.
“On a Micro-Manipulator for Medical Application—Stability Consideration of its Bilateral Controller” (S. Majima et al.), Mechatronics 1991.
“Anthropomorphic Remote Manipulator”, NASA Tech Briefs 1991.
“Controlling Remote Manipulators through Kinesthetic Coupling” (A.K. Bejczy), Computers in Mechanical Engineering 1983.
“Design of a Surgeon-Machine Interface for Teleoperated Microsurgery” (Steve Charles M.D. et al.), IEEE 1989.
“A Robot in an Operating Room: A Bull in a China Shop” (J.M. Dolan et al.), IEEE 1987.
Abstract of a presentation “Concept and Experimental Application of a Surgical Robotic System the Steerable MIS Instrument SMI” given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992.
Abstract of a presentation given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992, entitled “Session 15/1”.
Abstract of a presentation “A Pneumatic Controlled Sewing Device for Endoscopic Application the MIS Sewing Instrument MSI” given at the 3rd World Congress of Endoscopic Surgery in Bordeaux, Jun. 18-20, 1992.
Abstract of a presentation given at the 3rd World Congress of Endoscopic Surgery in Bordeaux (Jun. 18-20, 1992), entitled “Session 15/2”.
Abstract of a presentation given at the 3rd World Congress of Endoscopic Surgery in Bordeaux (Jun. 18-20, 1992), entitled Session 15/4.
Abstract of a presentation given at the 3rd World Congress of Endoscopic Surgery in Bordeaux (Jun. 18-20, 1992), entitled “Session 15/5”.
“Properties of Master-Slave Robots” (C. Vibet), Motor-con 1987.
“A New Microsurgical Robot System for Corneal Transplantation” (Noriyuki Tejima), Precision Machinery 1988.
“Human/Robot Interaction via the Transfer of Power and Information Signals—Part I: Dynamics and Control Analysis” (H. Kazerooni), IEEE 1989.
“Human/Robot Interaction via the Transfer of Power and Information Signals—Part II: An Experimental Analysis” (H. Kazerooni), IEEE 1989.
“Power and Impedance Scaling in Bilateral Manipulation” (J. Edward Colgate), IEEE 1991.
“S.M.O.S.: Stereotaxical Microtelemanipulator for Ocular Surgery” (Aicha Guerrouad and Pierre Vidal), IEEE 1989.
“Motion Control for a Sheep Shearing Robot” (James P. Trevelyan et al.), Proceedings of the 1st International Symposium on Robotics Research, MIT, Cambridge, Massachusetts, USA, 1983.
“Robots and Telechirs” (M.W. Thring), Wiley 1983.
Industrial Robotics (Gordon M. Mair), Prentice Hall 1988 (pp. 41-43, 49-50, 54, 203-209 enclosed).
“Student Reference Manual for Electronic Instrumentation Laboratories” (Wolf et al.), Prentice Hall, New Jersey 1990, pp. 498 and 499.
“Surgery in Cyberspace” (Taubes), Discover Magazine, Dec. 1994.