Systems, methods, and computer-readable program products for controlling a robotically delivered manipulator

Abstract
Systems, methods, and computer-readable media are provided for controlling a robotically delivered manipulator. One system includes a robotic manipulator having a base and a surgical instrument holder configured to move relative to the base, a surgical instrument removably coupled to the surgical instrument holder, a user interface configured to present information related to at least one of the robotic manipulator or the surgical instrument, a gesture detection sensor configured to detect a gesture made by a user representing a desired movement of the robotic manipulator, and a controller configured to actuate the robotic manipulator in a predetermined manner corresponding to the detected gesture.
Description
BACKGROUND

Robotic surgical systems are increasingly being used in minimally invasive medical procedures. Typically, robotic surgical systems include a surgeon console located remote from one or more robotic arms to which surgical instruments and/or cameras are coupled. The surgeon console may be located on another side of the operating room from the robotic arms, in another room, or in another building, and includes input handles or other input devices for receiving inputs from a surgeon. The inputs are communicated to a central controller, which translates the inputs into commands for manipulating the robotic arms in the vicinity of the patient.


Multiple instruments may be used during the course of a surgical procedure. As such, the robotic surgical system may include different instruments coupled to each robotic arm thereby allowing the surgeon to select an input handle or input device corresponding to a desired instrument for use. In some instances, the surgeon may choose to employ an instrument that is not already coupled to one of the robotic arms. In such case, the surgeon may indicate verbally to a bedside assistant a desire to exchange one attached surgical instrument for another. In response to the verbal direction, the bedside assistant physically performs the instrument exchange. In many cases, communication from the surgeon to the bedside assistant is made over an intercommunication system.


Although the current systems and methods for performing robotic surgical procedures are functional, they may be improved. For example, due to various influences that may be present within the operating room, communication between the surgeon and the bedside assistant may become difficult from time to time. In some cases, noises from various status monitors present within the operating room, and/or other audible distractions may cause the bedside assistant to have difficulty hearing the surgeon's communication. The bedside assistant may inadvertently attach and then replace a tool on the robotic arm that was not requested by the surgeon, and the attachment and replacement may be counted toward and/or decremented from the tool's useful life. As a result, the hospital may incur an unwarranted usage fee, which may ultimately be passed through to patients. Additionally, the surgeon may prefer that an instrument be positioned in a particular manner, which may be challenging to convey verbally.


SUMMARY

Accordingly, there is a need for robotic surgical systems and methods that improve procedure efficiencies as well as safety. Additionally, there is a need for robotic surgical systems that allow the surgeon to have improved control over what actions are taken at the patient's bedside and how the actions are implemented.


In accordance with an aspect of the present disclosure, a robotic surgical system is provided that includes a robotic manipulator having a base and a surgical instrument holder configured to move relative to the base, a surgical instrument removably coupled to the surgical instrument holder, a user interface configured to present information related to at least one of the robotic manipulator or the surgical instrument, a gesture detection sensor configured to detect a gesture made by a user, and a controller in communication with the robotic manipulator, the surgical instrument, the user interface, and the gesture detection sensor. The controller includes one or more processors and one or more memories having instructions stored thereon which when executed by the one or more processors cause the one or more processors to provide one or more commands to actuate the robotic manipulator in a predetermined manner corresponding to the detected gesture.


In another aspect of the present disclosure, the user interface is a touch screen, the gesture detection sensor is a touch sensor, the presented information is a graphical representation of the surgical instrument, and the one or more memories have stored thereon, further instructions which when executed by the one or more processors cause the one or more processors to determine whether or not a gesture has been detected on the touch screen over the graphical representation of the surgical instrument presented thereon, and in response to a determination that a gesture has been detected, provide the one or more commands to actuate the robotic manipulator in the predetermined manner corresponding to the detected gesture.


In still another aspect of the present disclosure, the user interface is a touch screen, the gesture detection sensor is a touch sensor, the presented information is a graphical representation of the robotic manipulator, and the one or more memories have stored thereon, further instructions which when executed by the one or more processors cause the one or more processors to determine whether or not a gesture has been detected on the touch screen over the graphical representation of the robotic manipulator presented thereon, and in response to a determination that a gesture has been detected, provide the one or more commands to actuate the robotic manipulator in the predetermined manner corresponding to the detected gesture.


In another aspect of the present disclosure, the user interface is a display, the gesture detection sensor is a camera sensor, the information presented on the display is a graphical representation of the surgical instrument, and the one or more memories have stored thereon, further instructions which when executed by the one or more processors cause the one or more processors to determine whether or not a gesture has been detected by the camera sensor, and in response to a determination that a gesture has been detected, provide the one or more commands to actuate the robotic manipulator in the predetermined manner corresponding to the detected gesture.


In still another aspect of the present disclosure, the gesture detection sensor is an electric field-based sensor configured to generate an electric field. The one or more memories have stored thereon, further instructions which when executed by the one or more processors cause the one or more processors to determine whether or not a gesture has been detected by the electric field-based sensor by receiving a transmission of a change in the electric field generated by the electric field-based sensor, identify a gesture, based on the change in the electric field, and provide the one or more commands to actuate the robotic manipulator in a predetermined manner corresponding to the identified gesture.


In still another aspect of the present disclosure, the gesture detection sensor is a radar interaction sensor configured to transmit electromagnetic waves having a predetermined frequency. The one or more memories have stored thereon, further instructions which when executed by the one or more processors cause the one or more processors to determine whether or not a gesture has been detected by the radar interaction sensor by receiving a transmission of a change in one or more of an amplitude or a signal of the transmitted electromagnetic waves generated by the radar interaction sensor, identify a gesture, based on the change in the one or more of an amplitude or a signal of the transmitted electromagnetic waves, and provide the one or more commands to actuate the robotic manipulator in a predetermined manner corresponding to the identified gesture.


In still another aspect of the present disclosure, a user image capture device is coupled to the controller and configured to capture images of the user. The one or more memories have stored thereon, further instructions which when executed by the one or more processors cause the one or more processors to receive the captured images of the user, track one or both of the eyes of the user, identify a selection, based on the tracked one or both of the eyes of the user, and provide the one or more commands to actuate the robotic manipulator in a predetermined manner corresponding to the identified selection.


According to another aspect of the present disclosure, a method is provided for controlling a robotic surgical system. The method includes presenting information related to at least one of a robotic manipulator or a surgical instrument, the robotic manipulator having a base and a surgical instrument holder configured to move relative to the base, and the surgical instrument being removably coupled to the surgical instrument holder, detecting a gesture made by a user representing a desired movement of the robotic manipulator, and actuating the robotic manipulator in a predetermined manner corresponding to the detected gesture.


In another aspect of the present disclosure, the user interface is a touch screen, the gesture detection sensor is a touchscreen sensor, the presented information is a graphical representation of the surgical instrument, and the method further includes determining whether or not a gesture has been detected on the touch screen over the graphical representation of the surgical instrument presented thereon, and in response to a determination that a gesture has been detected, actuating the robotic manipulator in a predetermined manner corresponding to the detected gesture.


In still another aspect of the present disclosure, the user interface is a touch screen, the gesture detection sensor is a touchscreen sensor, the presented information is a graphical representation of the robotic manipulator, and the method further includes determining whether or not a gesture has been detected on the touch screen over the graphical representation of the robotic manipulator presented thereon, and in response to a determination that a gesture has been detected, actuating the robotic manipulator in the predetermined manner corresponding to the detected gesture.


In still another aspect of the present disclosure, the user interface is a display, the gesture detection sensor is a camera sensor, the information presented on the display is a graphical representation of the surgical instrument, and the method further includes determining whether or not a gesture has been detected by the camera sensor, and in response to a determination that a gesture has been detected, actuating the robotic manipulator in the predetermined manner corresponding to the detected gesture.


In another aspect of the present disclosure, the gesture detection sensor is an electric field-based sensor configured to generate an electric field, and the method further comprises determining whether or not a gesture has been detected by the electric field-based sensor by receiving a transmission of a change in the electric field generated by the electric field-based sensor, identifying a gesture, based on the change in the electric field, and actuating the robotic manipulator in a predetermined manner corresponding to the identified gesture.


In still another aspect of the present disclosure, the gesture detection sensor is a radar interaction sensor configured to transmit electromagnetic waves having a predetermined frequency, and the method further comprises determining whether or not a gesture has been detected by the radar interaction sensor by receiving a transmission of a change in one or more of an amplitude or a signal of the transmitted electromagnetic waves generated by the radar interaction sensor, identifying a gesture, based on the change in the one or more of an amplitude or a signal of the transmitted electromagnetic waves, and actuating the robotic manipulator in a predetermined manner corresponding to the identified gesture.


In still yet another aspect of the present disclosure, a user image capture device is coupled to the controller and configured to capture images of the user, and the method further comprises receiving the captured images of the user, tracking one or both of the eyes of the user, identifying a selection, based on the tracked one or both of the eyes of the user, and actuating the robotic manipulator in a predetermined manner corresponding to the identified selection.


According to another aspect of the present disclosure, a non-transitory computer-readable medium is included that stores instructions that, when executed by a computer, cause the computer to present information related to at least one of a robotic manipulator or a surgical instrument, the robotic manipulator having a base and a surgical instrument holder configured to move relative to the base, and the surgical instrument being removably coupled to the surgical instrument holder, detect a gesture made by a user representing a desired movement of the robotic manipulator, and actuate the robotic manipulator in a predetermined manner corresponding to the detected gesture.


In accordance with another aspect of the present disclosure, a robotic surgical system is provided including a console, a robotic arm, and a controller. The console includes a user interface configured to display one or more surgical instrument representations and to sense a gesture within a predetermined distance of one of the one or more surgical instrument representations. The robotic arm has a distal end configured to selectively couple to and decouple from a surgical instrument. The controller is in communication with the robotic arm and the console and includes one or more processors and one or more memories having instructions stored thereon which, when executed by the one or more processors, cause the one or more processors to, in response to detecting the gesture indicating the selection of one of the one or more surgical instrument representations, detect a location of a surgical instrument corresponding to the selected one of the one or more surgical instrument representations, and determine whether the corresponding surgical instrument is coupled to the distal end of the robotic arm.


In another aspect of the present disclosure, the robotic surgical system further includes the user interface having a touchscreen and a sensor.


In another aspect of the present disclosure, the robotic surgical system further includes the user interface having a display and a sensor.


In still another aspect of the present disclosure, the robotic surgical system further includes the memory of the controller having instructions that, when executed by the processor, cause the processor to, in response to a determination that the corresponding surgical instrument is not coupled to the distal end of the robotic arm, cause the robotic arm to move to the detected location.


In another aspect of the present disclosure, the memory of the controller further includes instructions that, when executed by the processor, cause the processor to, in response to a determination that the corresponding surgical instrument is coupled to the distal end of the robotic arm, cause the robotic arm to move to a selected location.


Further details and aspects of exemplary embodiments of the present disclosure are described in more detail below with reference to the appended figures.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the present disclosure are described herein with reference to the accompanying drawings, wherein:



FIG. 1 is a schematic illustration of a medical work station and operating console in accordance with the present disclosure;



FIG. 2 is a functional block diagram of the system architecture for controlling the multi-input robotic surgical system of FIG. 1;



FIG. 3 is a block diagram of control components, of the present disclosure, for controlling the robotic surgical system of FIG. 1;



FIG. 4 is a flow diagram of a method of controlling the multi-input robotic surgical system, in accordance with an embodiment;



FIG. 5 is a flow diagram of a method of moving a component of the multi-input robotic surgical system based on gesture, in accordance with an embodiment;



FIG. 6 is a flow diagram of a method of moving a component of the multi-input robotic surgical system based on gesture, in accordance with another embodiment;



FIG. 7 is a flow diagram of a method of moving a component of the multi-input robotic surgical system based on gesture, in accordance with yet another embodiment;



FIG. 8 is a flow diagram of a method of moving a component of the multi-input robotic surgical system based on gesture, in accordance with still another embodiment; and



FIG. 9 is a flow diagram of a method of moving a component of the multi-input robotic surgical system based on gesture, in accordance with still yet another embodiment.





DETAILED DESCRIPTION

Embodiments of the present disclosure are now described in detail with reference to the drawings in which like reference numerals designate identical or corresponding elements in each of the several views. As used herein, the term “clinician” refers to a doctor, a nurse, or any other care provider and may include support personnel. Throughout this description, the term “proximal” refers to the portion of the device or component thereof that is farthest from the patient and the term “distal” refers to the portion of the device or component thereof that is closest to the patient.


With reference to FIG. 1, a robotic surgical system 10 is provided, which is configured for use on a patient “P” lying on a patient table for the performance of a minimally invasive surgical operation. In accordance with an embodiment, the robotic surgical system 10 generally includes a plurality of robotic manipulators 12 configured to receive commands from a controller 30 for manipulating one or more of the robotic manipulators 12 in response to an input received at a remotely-located surgeon console 40.


Each of the robotic manipulators 12 is made up of a plurality of members connected through joints coupled to and extending from a base 18. Although the base 18 is illustrated schematically as a single location, it will be appreciated that the base 18 may be made up of a plurality of locations from which each robotic manipulator 12 extends. For example, the base 18 may be made up of a plurality of movable carts. In an embodiment, connected to a distal end of each robotic manipulator 12 is a surgical assembly 14, which includes a surgical instrument holder 16 that is configured to removably couple with a surgical instrument 20. Each robotic manipulator 12 may include a surgical instrument 20 configured for a different purpose. For example, one robotic manipulator 12 may include a surgical instrument including a grasping jaw instrument 20, while another robotic manipulator 12 may include a surgical instrument including scissors. Other suitable instruments 20 include, but are not limited to, staplers, clip appliers, suture passers, spatulas, and the like.


Although two robotic manipulators 12 are depicted, the surgical system 10 may include more than two robotic manipulators 12. In this regard, the additional robotic manipulators (not shown) are likewise connected to the controller 30 and are telemanipulatable via the console 40. Accordingly, one or more additional surgical assemblies 14, surgical instrument holders 16, and/or surgical instruments 20 may also be attached to the additional robotic manipulators. In another embodiment, one or more of the robotic manipulators 12 includes an imaging device 66 positioned over the surgical site “S”, in the surgical site “S” (not shown), or the like. The imaging devices 66 may capture visual images, infra-red images, ultrasound images, X-ray images, thermal images, and/or any other known real-time images of the surgical site “S”. The imaging devices 66 transmit captured imaging data to the controller 30, which creates three-dimensional images of the surgical site “S” in real-time from the imaging data and transmits the three-dimensional images to a display device 46 for display. In another embodiment, the displayed images are two-dimensional renderings of the data captured by the imaging devices.


The robotic manipulators 12 may be driven by electric drives (not shown) that are connected to the controller 30. According to an embodiment, the controller 30 is configured to activate drives, for example, via a computer program, such that the robotic manipulators 12 and the surgical assemblies 14, surgical instrument holders 16, and/or surgical instruments 20 corresponding to the robotic manipulators 12, execute a desired movement received through the console 40. The controller 30 may also be configured to regulate movement of the robotic manipulators 12 and/or of the drives.


In an embodiment, one or more gesture detection sensors 56 may be included on or adjacent one or more of the robotic manipulators 12 (such as in the form of a wearable for a bedside assistant or clinician or on a transportable device). For example, one or more of the gesture detection sensors 56 may include electric field-based gesture detection sensors configured to output a circumambient electric field, which provides a scanning region in which inputs or changes to the electric field may be detected. The detected inputs are transmitted to the controller 30, which may be trainable to recognize or may have access to a database of stored patterns associated with the detected inputs where the patterns each correspond to commands to move the robotic manipulators 12 in a predetermined manner. In another embodiment, one or more of the gesture detection sensors 56 may include radar interaction sensors, which may be configured to transmit waves in a particular spectrum including radio frequency at a target, and reflected waves are then received by the sensor and provided to the controller 30. Similar to the controller 30 configured to operate with electric field-based gesture detection sensors, the controller 30 here may be trained to recognize or have access to a database storing patterns associated with inputs, where the patterns correspond to commands. An example of an electric field-based gesture sensor includes the GestIC® technology available through Microchip Technology Inc. of Chandler, Ariz., and an example of a radar interaction sensor includes the Project Soli™ sensor available through Google, Inc. of Mountain View, Calif. In another embodiment, alternative sensors may be implemented.
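

By way of illustration only, the pattern matching described above might be organized as in the following Python sketch, in which a sampled change in the sensed field is compared against a small library of stored gesture templates. The template values, the names GESTURE_TEMPLATES and match_gesture, and the acceptance threshold are assumptions made for explanation and do not represent the actual sensor firmware or controller software.

```python
# Hypothetical sketch: matching a detected field-change signal against stored
# gesture templates, each associated with a manipulator command.
# All names and values here are illustrative assumptions.

import math

# Each template is a short, fixed-length sample of the expected change in the
# sensed electric field (or reflected radar amplitude) for a trained gesture.
GESTURE_TEMPLATES = {
    "swipe_right": [0.0, 0.2, 0.6, 1.0, 0.6, 0.2, 0.0],
    "swipe_left":  [0.0, -0.2, -0.6, -1.0, -0.6, -0.2, 0.0],
    "hold":        [0.0, 0.8, 0.8, 0.8, 0.8, 0.8, 0.0],
}

COMMANDS = {
    "swipe_right": "MOVE_MANIPULATOR_RIGHT",
    "swipe_left":  "MOVE_MANIPULATOR_LEFT",
    "hold":        "PAUSE_MANIPULATOR",
}


def _distance(a, b):
    """Euclidean distance between two equal-length signal samples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def match_gesture(signal, threshold=0.5):
    """Return the command for the closest stored template, or None if no
    template lies within the acceptance threshold."""
    best_name, best_dist = None, float("inf")
    for name, template in GESTURE_TEMPLATES.items():
        d = _distance(signal, template)
        if d < best_dist:
            best_name, best_dist = name, d
    return COMMANDS[best_name] if best_dist <= threshold else None


if __name__ == "__main__":
    sampled = [0.0, 0.25, 0.55, 0.95, 0.65, 0.15, 0.0]  # detected field change
    print(match_gesture(sampled))  # closest template is "swipe_right"
```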


The controller 30 may control a plurality of motors 32, with each motor configured to drive a pushing or a pulling of one or more cables (not shown) coupled to the surgical instrument 20. In use, as these cables are pushed and/or pulled, the one or more cables effect operation and/or movement of the surgical instruments 20. The controller 30 coordinates the activation of the various motors 32 to coordinate a pushing or a pulling motion of one or more cables in order to coordinate an operation and/or movement of one or more surgical instruments 20. In an embodiment, each motor 32 is configured to actuate a drive rod or a lever arm to effect operation and/or movement of the surgical instruments 20 in addition to, or instead of, one or more cables.
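

As a hedged sketch of such coordination, the following example assumes an instrument joint driven by an antagonistic cable pair, in which one motor pays out the same cable length that the opposing motor takes in; the CABLE_PER_DEGREE constant and the function name are illustrative assumptions rather than the actual drive design.

```python
# Assumed illustration only: converting a commanded joint rotation into paired
# cable displacements for two opposing motors. Values are not actual specs.

CABLE_PER_DEGREE = 0.05  # mm of cable travel per degree of joint rotation (assumed)


def joint_rotation_to_motor_commands(delta_degrees):
    """Return the cable displacement (mm) commanded to each motor of the pair.
    A positive joint rotation pulls cable A and releases cable B equally."""
    travel = delta_degrees * CABLE_PER_DEGREE
    return {"motor_a_pull_mm": travel, "motor_b_release_mm": travel}


print(joint_rotation_to_motor_commands(10.0))
# -> {'motor_a_pull_mm': 0.5, 'motor_b_release_mm': 0.5}
```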


The controller 30 includes any suitable logic control circuit adapted to perform calculations and/or operate according to a set of instructions. The controller 30 can be configured to communicate with a remote system (not shown) either via a wireless (e.g., Wi-Fi, Bluetooth, LTE, etc.) and/or wired connection. The remote system can include data, instructions and/or information related to the various components, algorithms, and/or operations of console 40. The remote system can include any suitable electronic service, database, platform, cloud, or the like. The controller 30 may include a central processing unit operably connected to memory. The memory may include transitory type memory (e.g., RAM) and/or non-transitory type memory (e.g., flash media, disk media, etc.). In some embodiments, the memory is part of, and/or operably coupled to, the remote system.


The controller 30 can include a plurality of inputs and outputs for interfacing with the components of the console 40 and/or the robotic arm 12, such as through a driver circuit. The controller 30 can be configured to receive input signals from the console 40, or in an embodiment, from the gesture detection sensors 56 disposed on or adjacent one or more of the robotic arms 12, and/or to generate output signals to control one or more of the various components (e.g., one or more motors) based on the input signals. The output signals can include, and/or can be based upon, algorithmic instructions which may be pre-programmed and/or input by a user. The controller 30 can be configured to accept a plurality of user inputs from a user interface (e.g., switches, buttons, touch screen, etc., of the console 40), which may be coupled to the remote system.


A memory 34 can be directly and/or indirectly coupled to the controller 30 to store instructions and/or databases including pre-operative data from living being(s) and/or anatomical atlas(es). The memory 34 can be part of, and/or operatively coupled to, the remote system.


To provide the input to the controller 30 via the surgeon console 40, the surgeon console 40 includes various input devices. In an embodiment, the surgeon console 40 includes input handles 42 or input pedals 45 configured to be manipulated by the clinician through actuation. In particular, the clinician uses his or her hands to grip and move the input handles 42, and the movement of the input handles 42 is translated via the controller 30 to thereby provide a corresponding movement to the robotic manipulators 12 and/or surgical instruments 20. The clinician steps on the input pedals 45 to provide selections for further control of the robotic manipulators 12.


The surgeon console 40 further includes a user interface system 44 to provide additional mechanisms by which the clinician can control the robotic manipulators 12. According to an embodiment, the user interface system 44 includes mechanisms to allow the clinician to select a robotic manipulator 12 to use, to more easily cause movement of the robotic manipulator 12, to identify surgical instruments 20 that are not already in use, and/or to select use of the robotic manipulators 12 to which they may be coupled. The user interface system 44 is also configured to simplify the exchange of surgical instruments 20 that may not already be coupled to a robotic manipulator 12, in an embodiment. In another embodiment, the user interface system 44 is configured to detect gestures and to translate the detected gestures into movement of the robotic manipulators 12. In this regard, the user interface system 44 includes one or more of a display 46, gesture detection sensors 48, a touch screen 50 including embedded sensors 52, a user image capture device 54, and/or a microphone 55.


The display 46 is set up to display two- or three-dimensional images such as information about at least one of the robotic manipulator 12 and the surgical instrument 20. In an embodiment in which three-dimensional images are provided, the display 46 is configured to provide the three-dimensional images for viewing either with or without specialized viewing lenses provided, for example, in the form of glasses, head sets or other suitable configuration. The display 46 operates with the one or more gesture detection sensors 48 to detect a gesture or movement provided by the clinician. To detect the gestures or movements, the gesture detection sensors 48 may include camera sensors, image sensors, motion sensors, optical sensors, heat sensors, infrared sensors, sensors similar to those described above with respect to the gesture detection sensors 56 of the manipulator 12, any other type of sensor capable of detecting movement, and/or any combination thereof. The gesture detection sensors 48 are mounted to or adjacent the display 46, such as along a top of the display 46 as illustrated in FIG. 1, in an embodiment. Alternatively, the gesture detection sensors 48 are mounted on or adjacent a side or on a bottom of the display 46. In an embodiment, more than one of the gesture detection sensors 48 are mounted at multiple locations around or on the display 46. In still another embodiment, the gesture detection sensors 48 may be disposed on a wearable device (not shown) that may be temporarily attached to the clinician, for example, on a hand, as a glove or other hand-fitting configuration.


The touch screen 50, if included, is disposed on the work station console 40 at a location that is relatively convenient for the clinician to access, for example, within arm's reach. Thus, when positioned to operate the input handles 42, the surgeon is also able to manipulate the touch screen 50. In an embodiment, the touch screen 50 is coupled to a frame 43 of the work station console 40, such as a portion of the frame 43 supporting an arm rest 45, as illustrated in FIG. 1, and may be adjustable in height or proximity to the clinician's position. In another embodiment, the touch screen 50 is mounted to a moveable stand that is separate from the frame 43 and is positioned adjacent the arm rest 45. Similarly, the touch screen 50 mounted to the moveable stand can be repositioned closer to or farther away from the clinician's position or adjusted in height, as desired. The touch screen 50 is configured to present information about at least one of the robotic manipulators 12 and the surgical instruments 20. The touch screen 50 includes sensors 52 that are embedded therewithin for detecting gestures made over or on the touch screen 50. Suitable sensors capable of detecting gestures include touch sensors, capacitive sensors, optical sensors, electromagnetic field change sensors, localized radar sensing, and the like. It will be appreciated that the touch screen 50 and sensors 52 are configured in any one of numerous suitable manners for detecting a gesture over or on a surface of the touch screen 50.


The user image capture device 54, if included, is configured to capture images of the clinician at the surgeon console 40 for the purposes of eye-tracking. In accordance with an embodiment, the user image capture device 54 is mounted to the display device 46 or at another location to allow the user image capture device 54 to be directed at the clinician during system operation. The user image capture device 54 may include one or more filters for the detection of the user's eyes, in an embodiment.


The microphone 55, if included, is configured to capture voice and/or other sounds at the surgeon console 40 and is mounted to the display device 46, or attached to another location, such as a stand (not shown), a headset (not shown), other wearable (not shown) and the like.



FIG. 2 is a simplified block diagram of the robotic surgical system 10 of FIG. 1. The robotic surgical system 10 includes a controller 220, a tower 230, and one or more consoles 240. The controller 220 is configured to communicate with the tower 230 to thereby provide instructions for operation, in response to input received from one of the consoles 240.


The controller 220 generally includes a processing unit 222, a memory 224, a tower interface 226, and a console interface 228. The processing unit 222, in particular by means of a computer program stored in the memory 224, functions in such a way as to cause components of the tower 230 to execute a desired movement according to a movement defined by input devices 242 of the console 240. In this regard, the processing unit 222 includes any suitable logic control circuit adapted to perform calculations and/or operate according to a set of instructions. The processing unit 222 may include one or more processing devices, such as a microprocessor-type of processing device or other physical device capable of executing instructions stored in the memory 224 and/or processing data. The memory 224 may include transitory type memory (e.g., RAM) and/or non-transitory type memory (e.g., flash media, disk media, etc.). The tower interface 226 and console interface 228 communicate with the tower 230 and console 240, respectively, either wirelessly (e.g., Wi-Fi, Bluetooth, LTE, etc.) and/or via wired configurations. Although depicted as separate modules, the interfaces 226, 228 may be a single component in other embodiments.


The tower 230 includes a communications interface 232 configured to receive communications and/or data from the tower interface 226 for manipulating motor mechanisms 234 to thereby move manipulators 236a, 236b. In accordance with an embodiment, the motor mechanisms 234 are configured to, in response to instructions from the processing unit 222, receive an application of current for mechanical manipulation of cables (not shown) which are attached to the manipulators 236a, 236b to cause a desired movement of a selected one of the manipulators 236a, 236b and/or an instrument coupled to a manipulator 236a, 236b. The tower 230 also includes an imaging device 238, which captures real-time images and transmits data representing the images to the controller 220 via the communications interface 232, and one or more sensors 250, which detect gestures made by a bedside assistant or user and transmit signals representing the detected gestures to the computer 248.


To further manipulate the devices of the tower 230, the console 240 has an input device 242, a user interface 244, an image capture device 245, sensors 246, and a computer 248. The input device 242 is coupled to the computer 248 and is used by the clinician to provide an input. In this regard, the input device 242 may be a handle or pedal, or other computer accessory, such as a keyboard, joystick, mouse, button, trackball, or other component. In addition to the aforementioned devices, the input device 242 may also include a microphone or other device configured to receive sound input. The user interface 244 displays images or other data received from the controller 220 to thereby communicate the data to the clinician and operates in conjunction with the sensors 246, which detect gestures made by the clinician and send signals representing the detected gestures to the computer 248. The image capture device 245, if included, captures images of the clinician at the console 240 and provides the captured images, either as still images or as a video stream, to the computer 248. The computer 248 processes the images or video for the purposes of eye-tracking. The computer 248 includes a processing unit and memory, which includes data, instructions and/or information related to the various components, algorithms, and/or operations of the tower 230 and can operate using any suitable electronic service, database, platform, cloud, or the like.



FIG. 3 is a simplified functional block diagram of a system architecture 300 of the robotic surgical system 10 of FIG. 1. The system architecture 300 includes a core module 320, a surgeon master module 330, a robot arm module 340, and an instrument module 350. The core module 320 serves as a central controller for the robotic surgical system 10 and coordinates operations of all of the other modules 330, 340, 350. For example, the core module 320 maps control devices to the manipulators 12, determines current status, performs all kinematics and frame transformations, and relays resulting movement commands. In this regard, the core module 320 receives and analyzes data from each of the other modules 330, 340, 350 in order to provide instructions or commands to the other modules 330, 340, 350 for execution within the robotic surgical system 10. Although depicted as separate modules, one or more of the modules 320, 330, 340, and 350 may be combined into a single component in other embodiments.
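

A minimal sketch of how such a modular arrangement might be organized in software is shown below; the CoreModule, RobotArmModule, and InstrumentModule class names, the device map, and the command strings are illustrative assumptions and do not represent the actual system interfaces.

```python
# Illustrative-only sketch of the modular architecture described above: a core
# module maps each surgeon control device to an arm module and an instrument
# module and relays resulting movement commands. Names are assumptions.

class RobotArmModule:
    """Stands in for the module coordinating one manipulator 12."""
    def __init__(self, arm_id):
        self.arm_id = arm_id

    def execute(self, command):
        print(f"arm {self.arm_id}: {command}")


class InstrumentModule:
    """Stands in for the module controlling one surgical instrument 20."""
    def __init__(self, name):
        self.name = name

    def execute(self, command):
        print(f"instrument {self.name}: {command}")


class CoreModule:
    """Central coordinator: resolves which arm/instrument a control device
    addresses and relays the movement command to the proper module."""

    def __init__(self):
        self.device_map = {}  # control-device id -> (arm module, instrument module)

    def register(self, device_id, arm, instrument):
        self.device_map[device_id] = (arm, instrument)

    def relay(self, device_id, arm_command, instrument_command=None):
        arm, instrument = self.device_map[device_id]
        arm.execute(arm_command)
        if instrument_command is not None:
            instrument.execute(instrument_command)


core = CoreModule()
core.register("left_handle", RobotArmModule(1), InstrumentModule("grasper"))
core.relay("left_handle", "translate +5 mm along x", "close jaws")
```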


The core module 320 includes models 322, observers 324, a collision manager 326, controllers 328, and a skeleton 329. The models 322 include units that provide abstracted representations (base classes) for controlled components, such as the motors 32 and/or the manipulators 12. The observers 324 create state estimates based on input and output signals received from the other modules 330, 340, 350. The collision manager 326 prevents collisions between components that have been registered within the system 10. The skeleton 329 tracks the system 10 from a kinematic and dynamics point of view. For example, the kinematics item may be implemented either as forward or inverse kinematics, in an embodiment. The dynamics item may be implemented as algorithms used to model dynamics of the system's components.
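

As one hedged illustration of the kinematics item, the skeleton could evaluate forward kinematics for a simple planar two-joint chain, as sketched below; the actual manipulators involve more joints and full three-dimensional frame transformations, and the forward_kinematics function and link lengths are assumptions for illustration only.

```python
# Assumed example of a forward-kinematics computation a skeleton component
# might perform, here for a planar two-joint chain.

import math


def forward_kinematics(joint_angles, link_lengths):
    """Return the (x, y) position of the chain tip for the given joint angles
    (radians) and link lengths, accumulating the angle link by link."""
    x = y = 0.0
    total_angle = 0.0
    for angle, length in zip(joint_angles, link_lengths):
        total_angle += angle
        x += length * math.cos(total_angle)
        y += length * math.sin(total_angle)
    return x, y


# Example: two 0.3 m links, first joint at 30 degrees, second at 45 degrees.
print(forward_kinematics([math.radians(30), math.radians(45)], [0.3, 0.3]))
```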


The surgeon master module 330 communicates with surgeon control devices at the console 40 and relays inputs received from the console 40 to the core module 320. In accordance with an embodiment, the surgeon master module 330 communicates button status and control device positions to the core module 320 and includes a node controller 332 that includes a state/mode manager 334, a fail-over controller 336, and a N degree-of-freedom (“DOF”) actuator 338.


The robot arm module 340 coordinates operation of a robot arm subsystem, an arm cart subsystem, a set up arm, and an instrument subsystem in order to control movement of a corresponding manipulator 12. Although a single robot arm module 340 is included, it will be appreciated that the robot arm module 340 corresponds to and controls a single manipulator 12. As such, additional modules 340 are included in configurations in which the system 10 includes multiple manipulators 12. The robot arm module 340 includes a node controller 342, a state/mode manager 344, a fail-over controller 346, and a N degree-of-freedom (“DOF”) actuator 348.


The instrument module 350 controls movement of the surgical instrument 20 (shown in FIG. 1) attached to the robotic manipulator 12. The instrument module 350 is configured to correspond to and control a single surgical instrument. Thus, in configurations in which multiple surgical instruments are included, additional instrument modules 350 are likewise included. In an embodiment, the instrument module 350 obtains and communicates data related to the position of the surgical instrument 20 relative to the one or more manipulators 12. The instrument module 350 has a node controller 352, a state/mode manager 354, a fail-over controller 356, and a N degree-of-freedom (“DOF”) actuator 358.


The position data collected by the instrument module 350 is used by the core module 320 to determine the location of each robotic manipulator 12 and where the surgical instrument 20 is within the surgical site “S”, within the operating room, and/or relative to one or more of the robotic manipulators 12. In an embodiment, the core module 320 determines whether or not to move a selected robotic manipulator 12 to the surgical site “S”. In particular, when a gesture is detected, the core module 320 determines whether the detected gesture indicates a selection of a robotic manipulator 12 and, if so, provides instructions to cause the selected robotic manipulator 12 to move to a selected location. The selected location may be the surgical site “S” or a location away from the surgical site “S”.


In another embodiment, the core module 320 receives inputs from the sensor 246 and determines whether to output instructions for exchanging one instrument 20 for another, based on an actual position of a selected surgical instrument 20 and a desired position of the selected surgical instrument and on the received inputs. For example, when position data from the instrument module 350 indicates that the position of a first surgical instrument 20 on a first manipulator 12 is within a surgical site “S” and a gesture is detected by the sensors 246 to move the first surgical instrument 20 to another location, instructions are outputted to cause the first manipulator 12 to move out of the surgical site “S”. When a gesture is detected by the sensors 246 to use a second instrument 20 coupled to a second manipulator 12, instructions are outputted to move the first manipulator 12 out of the surgical site “S” and outputted to move the second manipulator 12 in a manner to move the second instrument 20 into the surgical site “S”.
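

A hedged sketch of this exchange logic is given below, assuming a simple table that maps each instrument to its manipulator and location; the plan_instrument_exchange name, the location labels, and the command strings are illustrative assumptions rather than the actual control interface.

```python
# Assumed sketch of the exchange logic described above: when a gesture selects
# a second instrument while a first instrument occupies the surgical site,
# command the first manipulator out of the site and the second one in.

def plan_instrument_exchange(selected_instrument, instrument_positions):
    """instrument_positions maps instrument id -> (manipulator id, location).
    Returns an ordered list of (manipulator, command) pairs."""
    commands = []
    # Clear any non-selected instrument currently occupying the surgical site.
    for instrument, (manipulator, location) in instrument_positions.items():
        if location == "surgical_site" and instrument != selected_instrument:
            commands.append((manipulator, "move_out_of_surgical_site"))
    # Then bring in the manipulator carrying the selected instrument.
    sel_manipulator, sel_location = instrument_positions[selected_instrument]
    if sel_location != "surgical_site":
        commands.append((sel_manipulator, "move_into_surgical_site"))
    return commands


positions = {"grasper": ("arm_1", "surgical_site"), "scissors": ("arm_2", "parked")}
print(plan_instrument_exchange("scissors", positions))
# -> [('arm_1', 'move_out_of_surgical_site'), ('arm_2', 'move_into_surgical_site')]
```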


In another embodiment, the core module 320 receives inputs from the sensor 250 and determines whether to output instructions to move the manipulator 12, based on the received inputs. When a gesture is detected by the sensors 250, the core module 320 identifies the gesture, based on a signal from the sensor 250 (for example, a change in an electric field in the case of an electric field-based sensor or a change in signal or frequency in the case of a radar interaction sensor), and provides the commands to actuate the robotic manipulator 12 in a predetermined manner corresponding to the identified gesture.


In still another embodiment, the core module 320 receives inputs from the user image capture device 54 (e.g., images and/or video) and/or the microphone 55 (e.g., voice commands and/or other sounds) and processes the inputs using known algorithms. For example, the core module 320 identifies isolated features from the images, video, or sounds, and provides commands to actuate the robotic manipulator 12 or surgical instrument 20 in a predetermined manner corresponding to the identified isolated feature.



FIGS. 4-9 are flow diagrams depicting various embodiments of methods for controlling the robotic surgical system 10, based on the inputs received from the gesture detection sensors and/or the user image capture device described above.


In an embodiment, a gesture is received as an input at the surgeon console 40, for example, at the touch screen 50 or via the gesture detection sensor 48, and the gesture is translated into commands to move one or more selected robotic manipulators 12 and/or selected surgical instruments 20 in a desired manner. In this regard, turning now to FIG. 4, an exemplary method 400 of controlling the robotic surgical system 10 is provided. In step 402, information relating to one or more of the robotic manipulators 12 and/or the surgical instruments 20 is displayed. For example, the information is presented as visual representations, such as pictorial icons or other graphical representations that correspond to one or more of the robotic manipulators 12 and/or instruments 20, or as text boxes indicating one or more of the robotic manipulators 12 and/or instruments 20. In an embodiment, the information is presented in a map form indicating the positions of the robotic manipulators 12 and/or instruments 20 relative to each other. In another embodiment, a representation of the robotic surgical system 10 and/or of the operating room in which the robotic surgical system 10 is located is depicted, and the information including the one or more of the robotic manipulators 12 and/or instruments 20 is presented in the representation. In another embodiment, the information does not indicate the positioning, and simply lists or shows icons or text boxes representing the robotic manipulators 12 and/or instruments 20. The information is presented either on the touch screen 50 or the display 46.


The clinician then provides an input into the system 10 to effect a desired movement. In this regard, a gesture made by the clinician is detected at step 404. In an embodiment, the gesture from the clinician may be detected via the sensors 52 on the touch screen 50. For example, the clinician may tap an icon on the touch screen 50 to provide input indicating a desired selection. In another embodiment, the gesture from the clinician may be detected via the gesture detection sensor 48 on the display 46. In such an embodiment, a movement or gesture by the clinician over the display 46 is detected by the gesture detection sensor 48. As a result of the detection of the gesture over the display 46, commands may be provided to move an icon on the display 46 over which the gesture was made across the display 46, or commands may be provided to move a corresponding icon across the touch screen 50. In addition to detecting the gesture over the display 46, position data of the detected gesture may be obtained as the gesture is being made. For example, in accordance with an embodiment, the position data of the detected gesture may be represented as a set of coordinates and velocities, vectors, or other location and direction identifiers in space relative to a fixed location.
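

As an illustrative assumption of how such position data might be organized, the following sketch stores each gesture sample with a timestamp, a position relative to a fixed origin, and a velocity estimated by finite differences; the GestureSample structure, the units, and the function name are not specified by the disclosure and are shown only as one possibility.

```python
# Illustrative data structure for the position data mentioned above.

from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class GestureSample:
    t: float                               # seconds since gesture start
    position: Tuple[float, float, float]   # metres relative to a fixed origin
    velocity: Tuple[float, float, float]   # metres per second


def build_samples(times: List[float],
                  positions: List[Tuple[float, float, float]]) -> List[GestureSample]:
    """Build gesture samples, estimating velocity by finite differences."""
    samples = []
    for i, (t, p) in enumerate(zip(times, positions)):
        if i == 0:
            v = (0.0, 0.0, 0.0)
        else:
            dt = t - times[i - 1]
            v = tuple((p[j] - positions[i - 1][j]) / dt for j in range(3))
        samples.append(GestureSample(t, p, v))
    return samples


print(build_samples([0.0, 0.1], [(0.0, 0.0, 0.0), (0.02, 0.0, 0.0)]))
```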


At step 406, the robotic manipulator 12 is actuated according to the detected gesture. In an embodiment, a comparison is made over time between an initial detection by the gesture detection sensor 48 and subsequent detections by the gesture detection sensor 48. For example, the comparison determines whether a difference between one or more selected parameters, such as distance, dwelling/pause time, and the like, is outside of a predetermined threshold range of the corresponding parameter, which may, for example, indicate that the path of the detected gesture, or the end position of the detected gesture, is adjacent to or within the proximity of an icon representing presented information. If not, a determination is made that the detected gesture does not correspond to a selection, and the gesture detection sensor 48 continues to detect gestures made by the clinician. When the difference is outside of one or more of the predetermined threshold ranges, then a determination is made that the detected gesture corresponds to a selection. In an embodiment in which the gestures are detected by the sensors 52 on the touch screen 50, the gesture detection sensor 52 detects a presence of the clinician's hand over the touch screen 50 and may correlate the position with one of the manipulators 12 or surgical instruments 20. In a case in which the clinician's hand is maintained at the position for a duration outside of a predetermined threshold range and/or the movement of the clinician's hand is outside of a predetermined threshold range, commands are provided to actuate the robotic manipulator 12 in a predetermined manner corresponding to the detected gesture, more specifically, the selection. In another embodiment in which gestures are detected by the sensors 48 in front of the display 46, the gesture detection sensor 48 detects a presence of the clinician's hand at a position, which may correspond to one of the manipulators 12 or surgical instruments 20. Similar to the previous embodiment, in a case in which the clinician's hand is maintained at the position for a duration outside of a predetermined threshold range and/or movement of the clinician's hand is outside of a predetermined threshold range, commands are provided to actuate the robotic manipulator 12 in a predetermined manner corresponding to the detected gesture. In another embodiment in which the gestures are detected by the sensor 48, actuation occurs when the position of the detected gesture in space is adjacent to or within the proximity of another position in space which, when translated, causes the display 46 to appear to represent presented information corresponding to the detected gesture.
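

A minimal sketch of such a threshold test, assuming hypothetical distance and dwell-time thresholds and a fixed map of icon locations, is shown below; the ICONS table, the threshold values, and the is_selection name are illustrative assumptions and not part of the actual system.

```python
# Hedged sketch of the selection test described above: a detected gesture is
# treated as a selection only if the hand dwells near an icon for longer than
# a dwell-time threshold and within a distance threshold. Values are assumed.

import math

ICONS = {"arm_1": (0.10, 0.20), "arm_2": (0.40, 0.20)}  # icon centres (metres)


def is_selection(hand_position, dwell_time,
                 max_distance=0.05, min_dwell=0.75):
    """Return the id of the icon selected by the gesture, or None."""
    if dwell_time < min_dwell:
        return None
    for icon_id, centre in ICONS.items():
        distance = math.hypot(hand_position[0] - centre[0],
                              hand_position[1] - centre[1])
        if distance <= max_distance:
            return icon_id
    return None


print(is_selection((0.12, 0.21), dwell_time=1.0))  # -> 'arm_1'
print(is_selection((0.12, 0.21), dwell_time=0.2))  # -> None (dwell too brief)
```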


Movement of a component based on gesture may be implemented into any one or more of the control instructions for manipulating different aspects of the manipulators and/or instruments. Turning to FIG. 5, a flow diagram of a method 500 of moving a component based on gesture, according to an embodiment, is provided. Here, movement of the component based on gesture is applied to select and move a manipulator 12 from one location to the surgical site “S”. Method 500 is implemented in an embodiment in which the system 10 includes two or more manipulators 12. At step 502, the display 46 or touch screen 50 displays representations of both manipulators 12. A determination is then made as to whether a gesture is detected indicating selection of one of the manipulator representations at step 504. If so, selection of one of the manipulators 12 corresponding to the manipulator representation is indicated at step 506, and the selected manipulator 12 is moved into the surgical site “S” at step 508. In a case in which the non-selected manipulator 12 is already at the surgical site “S”, alternatively, the detected gesture may cause execution of a series of instructions that causes the non-selected manipulator 12 to move out of the surgical site “S” at step 510, and then the selected manipulator 12 is moved into the surgical site “S” at step 508.



FIG. 6 is a flow diagram of a method 600 of moving a component based on gesture, according to another embodiment. Here, movement of the component based on gesture is applied to moving a manipulator 12 from one location to another location that is not the surgical site “S”. The display 46 or touch screen 50 displays a representation of a manipulator 12 in use and an environment within which the manipulator 12 is positioned, at step 602. For example, the environment includes an operating room. The display 46 or touch screen 50 also displays various locations in the operating room represented by different representations, such as icons, numerals, letters, or other indicators. A determination is then made as to whether a gesture is detected indicating selection of one of the representations at step 604. If the selection by the detected gesture indicates one of the represented locations, the selection of the represented location is indicated at step 606, and the manipulator 12 is moved to the selected location at step 608. In a case in which a non-selected manipulator 12 is already at the selected location, alternatively, the detected gesture may cause execution of a series of instructions that causes the non-selected manipulator 12 to move out of the selected location at step 610, and then the selected manipulator 12 is moved into the selected location at step 608.


In another embodiment, moving of a component based on gesture can be used for an instrument exchange involving the selection and/or coupling/decoupling of a surgical instrument 20 from a manipulator 12. FIG. 7 is a flow diagram of a method 700 of moving a component based on gesture, according to yet another embodiment. Here, at step 702, the display 46 or touch screen 50 displays a plurality of representations representing different instruments. A determination is then made as to whether a gesture is detected indicating selection of one of the representations at step 704. If the selection by the detected gesture indicates selection of a representation, a selection of one of the instruments is indicated at step 706. A determination is then made as to whether the selected instrument 20 is coupled to a robotic manipulator 12 at the surgical site “S” at step 708. If not, the robotic manipulator 12 including the selected instrument 20 moves into the surgical site “S” at step 710. In an embodiment in which the non-selected instrument 20 is already in the surgical site “S”, the robotic manipulator 12 including the non-selected instrument 20 moves out of the surgical site “S” at step 712, and the robotic manipulator 12 including the selected instrument 20 moves into the surgical site “S” at step 710.


Returning to step 708, if a determination is made that the selected instrument 20 is not coupled to a robotic manipulator 12 at the surgical site “S”, for example, because the selected instrument 20 is located on a surgical instrument tray 70 (FIG. 1) within the operating room, a location of the selected instrument 20 is detected at step 714. The location of the instrument 20 in the operating room is tracked via radio-frequency identification tags or optical tracking of various types, such as, for example, shape recognition or optical codes along the lines of widely used QR codes. No matter the particular embodiment, the selected instrument 20 provides a notification at step 718, such as an audible or visible signal either emitted by the instrument or displayed at a display in the operating theater, or a tactile indication, which then may signal to the bedside technician to attach the selected instrument 20 to one of the manipulators 12. In another embodiment, when a gesture is detected to thereby indicate a selection of one of the instruments, a signal is sent to the manipulator 12 in use to conduct an exchange of instruments and to couple the selected instrument 20 to itself at step 720.
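

A hedged sketch of the overall selection flow of method 700, under the assumption of simple lookup tables for coupled instruments and tray locations, is shown below; the handle_instrument_selection name, the action strings, and the table contents are illustrative assumptions only.

```python
# Assumed sketch of the instrument-selection flow: if the selected instrument
# is already coupled to a manipulator, bring that manipulator to the surgical
# site; otherwise locate the instrument (e.g., on the instrument tray) and
# raise a notification for the bedside assistant.

def handle_instrument_selection(selected, coupled, tray_locations):
    """coupled maps instrument -> manipulator; tray_locations maps
    instrument -> tray slot. Returns a list of actions to perform."""
    actions = []
    if selected in coupled:
        manipulator = coupled[selected]
        actions.append((manipulator, "move_into_surgical_site"))
    else:
        location = tray_locations.get(selected, "unknown")
        actions.append(("notify_bedside_assistant",
                        f"attach {selected} located at {location}"))
    return actions


coupled = {"grasper": "arm_1"}
tray = {"stapler": "tray_slot_3"}
print(handle_instrument_selection("stapler", coupled, tray))
# -> [('notify_bedside_assistant', 'attach stapler located at tray_slot_3')]
```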


As discussed briefly above, rather than receiving an input using a hand gesture detected by a gesture detection sensor 48, the inputs may be detected using eye-tracking. As such, FIG. 8 is a flow diagram of an exemplary method 800 of controlling the robotic surgical system 10 using eye-tracking. In an embodiment, at step 802, information relating to one or more of the robotic manipulators 12 and/or the surgical instruments 20 is displayed. For example, the information is presented as visual representations, such as pictorial icons or other graphical representations that correspond to one or more of the robotic manipulators 12 and/or instruments 20, or as text boxes indicating one or more of the robotic manipulators 12 and/or instruments 20. Additionally, the user's eyes are tracked at step 804. In an embodiment, the user image capture device 54 continuously captures images of a clinician, and the captured images are processed using known algorithms for identifying and tracking the user's eyes. In accordance with an embodiment, the captured images are processed to provide a first position of the user's eyes, which indicates that the user is viewing a certain portion of the display 46. A determination is then made as to whether the user's tracked eyes have changed positions at step 806. For example, additional processing is performed to detect a change in one or more parameters associated with the tracking of the user's eyes, such as whether the change is outside of a corresponding predetermined threshold range for the parameter. If not, the method iterates at step 804. However, if a change is detected, a further determination is made as to whether the subsequent position of the user's eyes corresponds to a selection of a representation at step 808. If so, a selected representation is identified at step 810 and commands are provided to actuate the manipulator or surgical instrument in a manner corresponding to the selected representation at step 812. If not, the method 800 iterates at step 804.
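

As a hedged illustration of such gaze-based selection, the following sketch treats a representation as selected once the normalized gaze point has remained inside its on-screen region for a minimum number of samples; the REGIONS layout, the dwell length, and the function names are assumptions for illustration and not the actual eye-tracking algorithm.

```python
# Illustrative sketch: map a gaze estimate to on-screen regions and report a
# selection once the gaze has dwelled in one region long enough.

REGIONS = {                        # on-screen bounding boxes: (x0, y0, x1, y1)
    "grasper_icon": (0.00, 0.00, 0.25, 0.25),
    "scissors_icon": (0.30, 0.00, 0.55, 0.25),
}


def region_at(gaze):
    """Return the region containing the normalized gaze point, if any."""
    for name, (x0, y0, x1, y1) in REGIONS.items():
        if x0 <= gaze[0] <= x1 and y0 <= gaze[1] <= y1:
            return name
    return None


def detect_selection(gaze_history, min_dwell_samples=30):
    """A region counts as selected once the last N gaze samples all fall inside it."""
    if len(gaze_history) < min_dwell_samples:
        return None
    recent = [region_at(g) for g in gaze_history[-min_dwell_samples:]]
    return recent[0] if recent[0] is not None and len(set(recent)) == 1 else None


history = [(0.10, 0.10)] * 30
print(detect_selection(history))  # -> 'grasper_icon'
```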


Each of the embodiments above may be enhanced with the implementation of voice or sound recognition. For example, prior to or concurrently with any one of the gesture detection steps (i.e., step 404 of FIG. 4, step 504 of FIG. 5, step 604 of FIG. 6, step 704 of FIG. 7) or the eye-tracking steps (i.e., step 806 of FIG. 8), the system 10 may be configured such that the user may provide a voice command to either initiate or confirm execution of the gesture detection steps. In a similar manner, prior to or concurrently with any one of the gesture detection steps (i.e., step 404 of FIG. 4, step 504 of FIG. 5, step 604 of FIG. 6, step 704 of FIG. 7), the system 10 may be configured such that the clinician may initiate or confirm execution of the steps through eye-tracking. In such an embodiment, the controller 220 may determine from images and/or video received from the image capture device 54 that the clinician is looking at his or her hands, and in response to the determination, any one of the gesture detection steps (i.e., step 404 of FIG. 4, step 504 of FIG. 5, step 604 of FIG. 6, step 704 of FIG. 7) may execute.


In still another embodiment, a gesture may be received as an input at the robotic surgical system 10, rather than at the surgeon console 40. In this regard, the gesture detected by the gesture detection sensor 56 may be translated into commands to move one or more selected robotic manipulators 12 and/or selected surgical instruments 20 in a desired manner so that the bedside assistant or a clinician in the surgical theater can move the manipulator 12 and/or surgical instrument 20 without contact.


Turning now to FIG. 9, an exemplary method 900 of controlling the robotic surgical system 10 in such a manner is provided. Here, a manipulator 12 or surgical instrument 20 intended to be moved may initially be identified. For example, the gesture detection sensor 56 may be disposed directly on a manipulator 12 to be moved, and hence the user knows at the outset which manipulator 12 is intended to be moved and selects the manipulator 12 or surgical instrument 20 by virtue of proximity. In another example, the gesture detection sensor 56 may be in the vicinity of, or may be otherwise clearly associated with, a manipulator 12 or surgical instrument 20, in which case the user selects the manipulator 12 or instrument 20 in this manner. In any case, an electric field or electromagnetic waves in the radio frequency range are transmitted from the gesture detection sensor 56 at step 902. For example, the gesture detection sensor 56 may be an electric field-based sensor or a radar interaction sensor, which are configured to generate an electric field or electromagnetic waves in the radio frequency range, respectively. A change in the electric field or the outputted electromagnetic waves is detected at step 904. For example, the user's hand may enter the electric field, or may block or otherwise influence the outputted electromagnetic waves, such as by causing a change in one or more of an amplitude or a signal of the waves.
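As a non-limiting sketch of the change detection at step 904, a controller might compare a newly reported amplitude sample against a baseline captured with no hand present. The sample values and the relative threshold below are assumed for illustration only and are not measured characteristics of any particular electric field-based or radar interaction sensor.

# A minimal sketch of step 904, assuming the sensor reports periodic amplitude samples.
from statistics import mean


def field_changed(baseline_samples, new_sample, rel_threshold=0.15):
    """Flag a change when the new amplitude deviates from the baseline mean by more
    than the given relative threshold (e.g., a hand entering the field)."""
    baseline = mean(baseline_samples)
    return abs(new_sample - baseline) / baseline > rel_threshold


if __name__ == "__main__":
    idle = [1.00, 1.02, 0.99, 1.01]   # illustrative amplitudes with no hand present
    print(field_changed(idle, 1.01))  # False: within the threshold range
    print(field_changed(idle, 0.62))  # True: a hand attenuates the signal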


In response to the detected change, a gesture is identified at step 906. In an embodiment, a database includes data relating electric field or electromagnetic wave changes to corresponding gestures, and the identification is performed by matching the detected change to the corresponding gesture. Once the gesture is identified, the selected manipulator or surgical instrument is moved in a predetermined manner corresponding to the identified gesture at step 908. The database also includes data relating a plurality of identified gestures to corresponding manners by which to actuate a manipulator or surgical instrument. For example, a right-hand swipe may correspond to a movement of the manipulator in a right-hand direction, a left-hand swipe may correspond to a movement of the manipulator in a left-hand direction, and the like.
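A minimal sketch of the two lookups described above, matching a detected signal change to a gesture (step 906) and mapping the gesture to a predetermined motion (step 908), is given below. The signature keys, gesture names, and motion tuples are illustrative assumptions rather than contents of any actual database.

# A minimal sketch, assuming upstream processing reduces a field/wave change to a
# short signature string and motions are expressed as simple direction vectors.
GESTURE_BY_SIGNATURE = {
    "amplitude_rise_left_to_right": "right_hand_swipe",
    "amplitude_rise_right_to_left": "left_hand_swipe",
}

MOTION_BY_GESTURE = {
    "right_hand_swipe": ("translate", (+1, 0, 0)),  # move the manipulator rightward
    "left_hand_swipe":  ("translate", (-1, 0, 0)),  # move the manipulator leftward
}


def command_for_change(signature: str):
    """Return the actuation command for a detected field/wave change, if known."""
    gesture = GESTURE_BY_SIGNATURE.get(signature)   # step 906: identify the gesture
    return MOTION_BY_GESTURE.get(gesture) if gesture else None  # step 908: map to motion


if __name__ == "__main__":
    print(command_for_change("amplitude_rise_left_to_right"))  # ('translate', (1, 0, 0))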


It will further be appreciated that the embodiments disclosed herein are examples of the disclosure and may be embodied in various forms. For instance, although certain embodiments herein are described as separate embodiments, each of the embodiments herein may be combined with one or more of the other embodiments herein. Specific structural and functional details disclosed herein are not to be interpreted as limiting, but as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present disclosure in virtually any appropriately detailed structure. Like reference numerals may refer to similar or identical elements throughout the description of the figures.


The phrases “in an embodiment,” “in embodiments,” “in some embodiments,” or “in other embodiments” may each refer to one or more of the same or different embodiments in accordance with the present disclosure. A phrase in the form “A or B” means “(A), (B), or (A and B).” A phrase in the form “at least one of A, B, or C” means “(A); (B); (C); (A and B); (A and C); (B and C); or (A, B, and C).” The term “clinician” may refer to a clinician or any medical professional, such as a doctor, nurse, technician, medical assistant, or the like, performing a medical procedure.


The systems described herein may also utilize one or more controllers to receive various information and transform the received information to generate an output. The controller may include any type of computing device, computational circuit, or any type of processor or processing circuit capable of executing a series of instructions that are stored in a memory. The controller may include multiple processors and/or multicore central processing units (CPUs) and may include any type of processor, such as a microprocessor, digital signal processor, microcontroller, or the like. The controller may also include a memory to store data and/or algorithms to perform a series of instructions.


Any of the herein described methods, programs, algorithms or codes may be converted to, or expressed in, a programming language or computer program. The terms “Programming Language” and “Computer Program” include any language used to specify instructions to a computer, and include (but are not limited to) these languages and their derivatives: Assembler, Basic, Batch files, BCPL, C, C+, C++, Delphi, Fortran, Java, JavaScript, Machine code, operating system command languages, Pascal, Perl, PL1, scripting languages, Visual Basic, metalanguages which themselves specify programs, and all first, second, third, fourth, and fifth generation computer languages. Also included are database and other data schemas, and any other meta-languages. No distinction is made between languages which are interpreted, compiled, or use both compiled and interpreted approaches. Likewise, no distinction is made between compiled and source versions of a program. Thus, reference to a program, where the programming language could exist in more than one state (such as source, compiled, object, or linked), is a reference to any and all such states. Reference to a program may encompass the actual instructions and/or the intent of those instructions.


Any of the herein described methods, programs, algorithms or codes may be contained on one or more machine-readable media or memory. The term “memory” may include a mechanism that provides (e.g., stores and/or transmits) information in a form readable by a machine such as a processor, computer, or a digital processing device. For example, a memory may include a read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, or any other volatile or non-volatile memory storage device. Code or instructions contained thereon can be represented by carrier wave signals, infrared signals, digital signals, and by other like signals.


While several embodiments of the disclosure have been shown in the drawings, it is not intended that the disclosure be limited thereto, as it is intended that the disclosure be as broad in scope as the art will allow and that the specification be read likewise. Any combination of the above embodiments is also envisioned and is within the scope of the appended claims. Therefore, the above description should not be construed as limiting, but merely as exemplifications of particular embodiments. Those skilled in the art will envision other modifications within the scope of the claims appended hereto.

Claims
  • 1. A robotic surgical system comprising: one or more robotic manipulators, each having a base and a surgical instrument holder configured to move relative to the base; a surgical instrument removably coupled to the surgical instrument holder; a user interface configured to present information related to at least one of the robotic manipulator or the surgical instrument; a gesture detection sensor configured to detect a gesture made by a user; and a controller in communication with the robotic manipulator, the surgical instrument, the user interface, and the gesture detection sensor and including one or more processors and one or more memories having instructions stored thereon which, when executed by the one or more processors, cause the one or more processors to: determine whether or not a gesture indicating a selection of a first robotic manipulator of the one or more robotic manipulators has been detected; and provide one or more commands to actuate the one or more robotic manipulators in a predetermined manner corresponding to the detected gesture, wherein, when it is determined that the gesture indicating selection of the first robotic manipulator has been detected, the one or more commands actuate a second robotic manipulator, which is in an area of interest and is not selected, to move out of the area of interest and the first robotic manipulator to move into the area of interest.
  • 2. The robotic surgical system of claim 1, wherein: the user interface is a touch screen, the gesture detection sensor is a touch sensor, the presented information is a graphical representation of the surgical instrument, and the one or more memories have stored thereon further instructions which, when executed by the one or more processors, cause the one or more processors to determine whether or not a gesture has been detected on the touch screen over the graphical representation of the surgical instrument presented thereon, and in response to a determination that a gesture has been detected, provide the one or more commands to actuate the one or more robotic manipulators in the predetermined manner corresponding to the detected gesture.
  • 3. The robotic surgical system of claim 1, wherein: the user interface is a touch screen, the gesture detection sensor is a touch sensor, the presented information is a graphical representation of the robotic manipulator, and the one or more memories have stored thereon further instructions which, when executed by the one or more processors, cause the one or more processors to determine whether or not a gesture has been detected on the touch screen over the graphical representation of the surgical instrument presented thereon, and in response to a determination that a gesture has been detected, provide the one or more commands to actuate the one or more robotic manipulators in the predetermined manner corresponding to the detected gesture.
  • 4. The robotic surgical system of claim 1, wherein: the user interface is a display, the gesture detection sensor is a camera sensor, the presented information is a graphical representation of the surgical instrument, and the one or more memories have stored thereon further instructions which, when executed by the one or more processors, cause the one or more processors to determine whether or not a gesture has been detected by the camera sensor, and in response to a determination that a gesture has been detected, provide the one or more commands to actuate the one or more robotic manipulators in the predetermined manner corresponding to the detected gesture.
  • 5. The robotic surgical system of claim 1, wherein: the gesture detection sensor is an electric field-based sensor configured to generate an electric field, and the one or more memories have stored thereon further instructions which, when executed by the one or more processors, cause the one or more processors to determine whether or not a gesture has been detected by the electric field-based sensor by receiving a transmission of a change in the electric field generated by the electric field-based sensor, identify a gesture, based on the change in the electric field, and provide the one or more commands to actuate the one or more robotic manipulators in a predetermined manner corresponding to the identified gesture.
  • 6. The robotic surgical system of claim 1, wherein: the gesture detection sensor is a radar interaction sensor configured to transmit electromagnetic waves having a predetermined frequency, and the one or more memories have stored thereon further instructions which, when executed by the one or more processors, cause the one or more processors to determine whether or not a gesture has been detected by the radar interaction sensor by receiving a transmission of a change in one or more of an amplitude or a signal of the transmitted electromagnetic waves generated by the radar interaction sensor, identify a gesture, based on the change in the one or more of an amplitude or a signal of the transmitted electromagnetic waves, and provide the one or more commands to actuate the one or more robotic manipulators in a predetermined manner corresponding to the identified gesture.
  • 7. The robotic surgical system of claim 1, further comprising: a user image capture device coupled to the controller and configured to capture images of the user, wherein the one or more memories have stored thereon further instructions which, when executed by the one or more processors, cause the one or more processors to receive the captured images of the user, track one or both of the eyes of the user, identify a selection, based on the tracked one or both of the eyes of the user, and provide the one or more commands to actuate the one or more robotic manipulators in a predetermined manner corresponding to the identified selection.
  • 8. A method of controlling a robotic surgical system comprising: presenting on a user interface information related to one or more robotic manipulators, each having a base and a surgical instrument holder configured to move relative to the base, and the surgical instrument being removably coupled to the surgical instrument holder; detecting a gesture made by a user indicating a selection of a first robotic manipulator of the one or more robotic manipulators; actuating a second robotic manipulator of the one or more robotic manipulators, which is in an area of interest and is not selected, to move out of the area of interest; and actuating the first robotic manipulator of the one or more robotic manipulators to move into the area of interest.
  • 9. The method of claim 8, wherein: the user interface is a touch screen, the gesture is detected by a touchscreen sensor, the presented information is a graphical representation of the surgical instrument, and the method further comprises: determining whether or not a gesture has been detected on the touch screen over the graphical representation of the surgical instrument presented thereon, and in response to a determination that a gesture has been detected, actuating the one or more robotic manipulators in a predetermined manner corresponding to the detected gesture.
  • 10. The method of claim 8, wherein: the user interface is a touch screen, the gesture is detected by a touch sensor, the presented information is a graphical representation of the one or more robotic manipulators, and the method further comprises: determining whether or not a gesture has been detected on the touch screen over the graphical representation of the surgical instrument presented thereon, and in response to a determination that a gesture has been detected, actuating the one or more robotic manipulators in a predetermined manner corresponding to the detected gesture.
  • 11. The method of claim 8, wherein: the user interface is a display, the gesture is detected by a camera sensor, the presented information is a graphical representation of the surgical instrument, and the method further comprises: determining whether or not a gesture has been detected by the camera sensor, and in response to a determination that a gesture has been detected, actuating the one or more robotic manipulators in a predetermined manner corresponding to the detected gesture.
  • 12. The method of claim 8, wherein: the gesture is detected by an electric field-based sensor configured to generate an electric field, and the method further comprises: determining whether or not a gesture has been detected by the electric field-based sensor by receiving a transmission of a change in the electric field generated by the electric field-based sensor; identifying a gesture, based on the change in the electric field; and actuating the one or more robotic manipulators in a predetermined manner corresponding to the identified gesture.
  • 13. The method of claim 8, wherein: the gesture is detected by a radar interaction sensor configured to transmit electromagnetic waves having a predetermined frequency, and the method further comprises: determining whether or not a gesture has been detected by the radar interaction sensor by receiving a transmission of a change in one or more of an amplitude or a signal of the transmitted electromagnetic waves generated by the radar interaction sensor; identifying a gesture, based on the change in the one or more of an amplitude or a signal of the transmitted electromagnetic waves; and actuating the one or more robotic manipulators in a predetermined manner corresponding to the identified gesture.
  • 14. The method of claim 8, further comprising: capturing images of the user; receiving the captured images of the user; tracking one or both of the eyes of the user; identifying a selection, based on the tracked one or both of the eyes of the user; and actuating the one or more robotic manipulators in a predetermined manner corresponding to the identified selection.
  • 15. A robotic surgical system comprising: a console including a user interface configured to display one or more surgical instrument representations and to detect a gesture over one of the one or more surgical instrument representations; one or more robotic arms, each having a distal end configured to selectively couple to and decouple from a surgical instrument; and a controller in communication with the one or more robotic arms and the console, the controller including one or more processors and one or more memories having instructions stored thereon which, when executed by the one or more processors, cause the one or more processors to: in response to detecting the gesture indicating a selection of one of the one or more surgical instrument representations, detect a location of a surgical instrument corresponding to the selected one of the one or more surgical instrument representations; determine whether the corresponding surgical instrument is coupled to the distal end of a first robotic arm of the one or more robotic arms; actuate a second robotic arm of the one or more robotic arms, which is coupled to a non-selected surgical instrument in an area of interest, to move out of the area of interest; and actuate the first robotic arm, which is coupled to the selected surgical instrument, to move into the area of interest.
  • 16. The robotic surgical system of claim 15, wherein the user interface includes a touchscreen and a sensor.
  • 17. The robotic surgical system of claim 15, wherein the user interface includes a display and a sensor.
  • 18. The robotic surgical system of claim 15, wherein the one or more memories further include instructions that, when executed by the one or more processors, cause the one or more processors to: in response to a determination that the corresponding surgical instrument is not coupled to the distal end of the first robotic arm, cause the first robotic arm to move to the detected location.
  • 19. The robotic surgical system of claim 15, wherein the one or more memories further include instructions that, when executed by the one or more processors, cause the one or more processors to: in response to a determination that the corresponding surgical instrument is coupled to the distal end of the first robotic arm, cause the first robotic arm to move to a selected location.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a U.S. National Stage Application filed under 35 U.S.C. § 371(a) of International Patent Application Serial No. PCT/US2017/035576, filed Jun. 2, 2017, which claims the benefit of and priority to U.S. Provisional Patent Application Ser. No. 62/345,054, filed Jun. 3, 2016, the entire disclosure of which is incorporated by reference herein.

PCT Information
Filing Document: PCT/US2017/035576; Filing Date: Jun. 2, 2017; Country: WO
Publishing Document: WO 2017/210497; Publishing Date: Dec. 7, 2017; Country: WO; Kind: A
Related Publications (1)
US 2019/0290374 A1; Sep. 2019; US
Provisional Applications (1)
62/345,054; Jun. 2016; US