SURGICAL ROBOTIC SYSTEM AND METHOD FOR LAPAROSCOPIC INSTRUMENTS AND CAMERA NAVIGATION

Information

  • Patent Application
  • Publication Number
    20250041007
  • Date Filed
    July 01, 2024
  • Date Published
    February 06, 2025
Abstract
A surgical robotic system includes a surgical robotic arm having a surgical device and an instrument drive unit configured to actuate the surgical device. The system also includes a surgeon console configured to receive user input to control at least one of the surgical robotic arm or the surgical device. The system further includes a control tower coupled to the surgical robotic arm and the surgeon console. The control tower includes a storage device and a controller configured to store a plurality of checkpoints on the storage device. Each checkpoint of the plurality of checkpoints includes image data of a surgical site and position data corresponding to a prior position of the surgical robotic arm during a surgical procedure. The controller is also configured to output a graphical user interface (GUI) on a display screen. The GUI is configured to display the image data of at least one checkpoint of the plurality of checkpoints and to receive user input selecting the checkpoint. The controller is further configured to control the surgical robotic arm to move the surgical robotic arm to the prior position based on the selected checkpoint.
Description
BACKGROUND

Surgical robotic systems are currently being used in a variety of surgical procedures, including minimally invasive surgical procedures. Some surgical robotic systems include a surgeon console controlling a surgical robotic arm and a surgical instrument having an end effector (e.g., forceps or grasping instrument) coupled to and actuated by the robotic arm. In operation, the robotic arm is moved to a position over a patient and then guides the surgical instrument into a small incision via a surgical port or a natural orifice of a patient to position the end effector at a work site within the patient's body. A laparoscopic camera, which is also held by one of the robotic arms, is inserted into the patient to image the surgical site. During the robotic surgical procedure, the surgeon continuously monitors a display screen disposed on the surgeon console and guides the camera to the anatomy of interest in a stepwise manner while navigating around the other ports and their respective instruments. As a result, the instruments and camera may be moved many times (e.g., hundreds of times). This process makes robotic surgery complex and time-consuming. Furthermore, the camera may become fogged, in which case it must be removed from the patient cavity, cleaned, and replaced at the same position. Placing and removing the camera requires multiple movement steps that the surgeon needs to retrace. Multiple movements of the instruments and camera not only increase the procedure duration but also increase the anesthesia dose required for the patient, which, in turn, increases the recovery duration and the cost of the procedure.


SUMMARY

The present disclosure provides a surgical robotic system that is configured to store navigation position checkpoints of a laparoscopic camera as well as navigation position checkpoints of one or more instruments inserted into the patient. The checkpoints may be saved manually and/or automatically. The system provides a user interface for the surgeon and the staff, which may be used to save the positions of the instruments and the camera. The user interface may also be used at any time during the surgical procedure to retract the camera and the instruments, with the system automatically moving the camera and/or the instruments to the selected checkpoint.


In one embodiment, the checkpoint process may be used for extraction (e.g., for cleaning) and reinsertion. In this embodiment, an initial position of the camera while it is out of the patient cavity is recorded as a first checkpoint. A robotic arm then inserts the camera into the patient body to perform the surgical procedure. If the camera fogs up, the surgeon can recall the camera to its initial position (i.e., the first checkpoint). One or more subsequent checkpoints for the camera position may also be recorded after the insertion to generate a so-called breadcrumb navigation trail along which the system can automatically navigate the camera, in either the backward or forward direction. Thus, once the camera cleaning is completed, the surgeon can recall the latest checkpoint position, so that the arm automatically moves to a desired (e.g., the last known) recorded position. The same navigation trails may be used for insertion and navigation of instruments. The term "trail" denotes the path formed using the coordinates of instruments or the camera saved during the procedure. A minimal sketch of such a trail is provided below.
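
By way of illustration only, a breadcrumb trail of this kind can be modeled as an ordered list of saved poses that is walked backward (extraction) or forward (reinsertion). The following Python sketch is hypothetical and not part of the disclosure; the names Pose and CheckpointTrail are assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Pose:
    """Hypothetical camera/instrument pose: joint angles plus insertion depth."""
    joint_angles: List[float]
    insertion_depth_mm: float

@dataclass
class CheckpointTrail:
    """Ordered 'breadcrumb' trail of poses saved along an insertion path."""
    poses: List[Pose] = field(default_factory=list)
    cursor: int = 0  # index of the pose the arm currently occupies

    def save(self, pose: Pose) -> None:
        self.poses.append(pose)
        self.cursor = len(self.poses) - 1

    def step_backward(self) -> Pose:
        # Retrace toward the first checkpoint (e.g., camera outside the patient).
        self.cursor = max(self.cursor - 1, 0)
        return self.poses[self.cursor]

    def step_forward(self) -> Pose:
        # Re-advance toward the last recorded position after cleaning/reinsertion.
        self.cursor = min(self.cursor + 1, len(self.poses) - 1)
        return self.poses[self.cursor]
```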


The checkpoint system may also be used to restore the system after a power failure or a serious error in the system. In the case of a power failure, an emergency battery backup may not allow the surgeon to operate the arms. Checkpoints may be created for each of the robotic arms and instruments, allowing for an automated restoration process.


According to one embodiment of the present disclosure, a surgical robotic system is disclosed. The surgical robotic system includes a surgical robotic arm having a surgical device and an instrument drive unit configured to actuate the surgical device. The system also includes a surgeon console configured to receive user input to control at least one of the surgical robotic arm or the surgical device. The system further includes a control tower coupled to the surgical robotic arm and the surgeon console. The control tower includes a storage device and a controller configured to store a plurality of checkpoints on the storage device. Each checkpoint of the plurality of checkpoints includes image data of a surgical site and position data corresponding to a prior position of the surgical robotic arm during a surgical procedure. The controller is also configured to output a graphical user interface (GUI) on a display screen. The GUI is configured to display the image data of at least one checkpoint of the plurality of checkpoints and to receive user input selecting the checkpoint. The controller is further configured to control the surgical robotic arm to move the surgical robotic arm to the prior position based on the selected checkpoint.


Implementations of the above embodiment may include one or more of the following features. According to one aspect of the above embodiment, the controller may be further configured to store the checkpoint in response to activation of the surgical device. The controller may also be configured to store the checkpoint in response to a user command. The controller may additionally be configured to store the checkpoint based on an elapsed time of the surgical procedure. The controller may be further configured to detect a phase of the surgical procedure. The controller may also be configured to store the checkpoint based on the detected phase of the surgical procedure. The surgical device may be a surgical robotic instrument or a laparoscopic camera. The checkpoint may include the position data corresponding to the laparoscopic camera being outside a patient.


According to another embodiment of the present disclosure, a method for restoring a positional state of a surgical robotic system is disclosed. The method includes storing on a storage device a plurality of checkpoints including image data of a surgical site and position data corresponding to a prior position during a surgical procedure of a laparoscopic camera held by a surgical robotic arm. A first checkpoint of the plurality of checkpoints corresponds to a position of the laparoscopic camera being outside a patient and a second checkpoint of the plurality of checkpoints corresponds to a position of the laparoscopic camera being inside the patient. The method further includes outputting a graphical user interface (GUI) configured to display on a display screen the image data of the plurality of checkpoints. The method additionally includes receiving a first user input through the GUI selecting at least one of the first checkpoint or the second checkpoint and controlling the surgical robotic arm to move the surgical robotic arm to the prior position based on the selection of at least one of the first checkpoint or the second checkpoint.


Implementations of the above embodiment may include one or more of the following features. According to one aspect of the above embodiment, the method may further include storing the first checkpoint or the second checkpoint in response to activation of the laparoscopic camera. The method may further include storing the first checkpoint or the second checkpoint in response to a user command. The method may additionally include storing the first checkpoint or the second checkpoint based on an elapsed time of the surgical procedure. The method may also include detecting a phase of the surgical procedure and storing the first checkpoint or the second checkpoint based on the detected phase of the surgical procedure.


According to a further embodiment of the present disclosure, a method for restoring a positional state of a surgical robotic system is disclosed. The method includes storing a plurality of checkpoints on a storage device. Each checkpoint of the plurality of checkpoints includes image data of a surgical site and position data corresponding to a prior position of a surgical robotic arm during a surgical procedure. The method further includes outputting a graphical user interface (GUI) configured to display on a display screen the image data of at least one checkpoint of the plurality of checkpoints. The method additionally includes receiving user input through the GUI selecting the at least one checkpoint and controlling the surgical robotic arm to move the surgical robotic arm to the prior position based on the selected at least one checkpoint.


Implementations of the above embodiment may include one or more of the following features. According to one aspect of the above embodiment, the method may also include storing the checkpoint in response to activation of a surgical device held by the surgical robotic arm. The method may also include storing the checkpoint in response to a user command. The method may additionally include storing the checkpoint based on an elapsed time of the surgical procedure. The method may also include detecting a phase of the surgical procedure and storing the checkpoint based on the detected phase of the surgical procedure.





BRIEF DESCRIPTION OF THE DRAWINGS

Various embodiments of the present disclosure are described herein with reference to the drawings wherein:



FIG. 1 is a perspective view of a surgical robotic system including a control tower, a console, and one or more surgical robotic arms each disposed on a mobile cart according to an embodiment of the present disclosure;



FIG. 2 is a perspective view of a surgical robotic arm of the surgical robotic system of FIG. 1 according to an embodiment of the present disclosure;



FIG. 3 is a perspective view of a mobile cart having a setup arm with the surgical robotic arm of the surgical robotic system of FIG. 1 according to an embodiment of the present disclosure;



FIG. 4 is a schematic diagram of a computer architecture of the surgical robotic system of FIG. 1 according to an embodiment of the present disclosure;



FIG. 5 is a plan schematic view of the surgical robotic system of FIG. 1 positioned about a surgical table according to an embodiment of the present disclosure;



FIG. 6 is a schematic diagram of a system for determining phases of a surgical procedure according to an embodiment of the present disclosure;



FIG. 7 is a flow chart of a method for saving checkpoints according to an embodiment of the present disclosure;



FIGS. 8A and 8B show a flow chart of a method for restoring checkpoints according to an embodiment of the present disclosure; and



FIGS. 9-20 are screenshots of a graphical user interface (GUI) implementing the methods of FIGS. 7 and 8 according to an embodiment of the present disclosure.





DETAILED DESCRIPTION

Embodiments of the presently disclosed surgical robotic system are described in detail with reference to the drawings, in which like reference numerals designate identical or corresponding elements in each of the several views.


With reference to FIG. 1, a surgical robotic system 10 includes a control tower 20, which is connected to all the components of the surgical robotic system 10, including a surgeon console 30 and one or more mobile carts 60. Each of the mobile carts 60 includes a robotic arm 40 having a surgical instrument 50 removably coupled thereto. The robotic system 10 may include any number of mobile carts 60 and/or robotic arms 40.


The surgical instrument 50 is configured for use during minimally invasive surgical procedures. In embodiments, the surgical instrument 50 may be configured for open surgical procedures. In further embodiments, the surgical instrument 50 may be an electrosurgical forceps configured to seal tissue by compressing tissue between jaw members and applying electrosurgical current thereto. In yet further embodiments, the surgical instrument 50 may be a surgical stapler including a pair of jaws configured to grasp and clamp tissue while deploying a plurality of tissue fasteners, e.g., staples, and cutting stapled tissue. In yet further embodiments, the surgical instrument 50 may be a surgical clip applier including a pair of jaws configured to apply a surgical clip onto tissue.


One of the robotic arms 40 may include a laparoscopic camera 51 configured to capture video of the surgical site. The laparoscopic camera 51 may be a stereoscopic camera configured to capture two side-by-side (i.e., left and right) images of the surgical site to produce a video stream of the surgical scene. The laparoscopic camera 51 is coupled to an image processing device 56, which may be disposed within the control tower 20. The image processing device 56 may be any computing device configured to receive the video feed from the laparoscopic camera 51 and output the processed video stream.


The surgeon console 30 includes a first screen 32, which displays a video feed of the surgical site provided by camera 51 of the surgical instrument 50 disposed on the robotic arm 40, and a second screen 34, which displays a user interface for controlling the surgical robotic system 10. The first screen 32 and second screen 34 may be touchscreens allowing for displaying various graphical user inputs.


The surgeon console 30 also includes a plurality of user interface devices, such as foot pedals 36 and a pair of hand controllers 38a and 38b, which are used by a user to remotely control the robotic arms 40. The surgeon console 30 further includes an armrest 33 used to support the clinician's arms while operating the hand controllers 38a and 38b.


The control tower 20 includes a screen 23, which may be a touchscreen, on which graphical user interfaces (GUIs) are output. The control tower 20 also acts as an interface between the surgeon console 30 and one or more robotic arms 40. In particular, the control tower 20 is configured to control the robotic arms 40, such as to move the robotic arms 40 and the corresponding surgical instrument 50, based on a set of programmable instructions and/or input commands from the surgeon console 30, in such a way that the robotic arms 40 and the surgical instrument 50 execute a desired movement sequence in response to input from the foot pedals 36 and the hand controllers 38a and 38b. The foot pedals 36 may be used for enabling and locking the hand controllers 38a and 38b, repositioning the camera, and activating/deactivating electrosurgical energy. In particular, the foot pedals 36 may be used to perform a clutching action on the hand controllers 38a and 38b. Clutching is initiated by pressing one of the foot pedals 36, which disconnects (i.e., prevents movement inputs from) the hand controllers 38a and/or 38b from the robotic arm 40 and the corresponding instrument 50 or camera 51 attached thereto. This allows the user to reposition the hand controllers 38a and 38b without moving the robotic arm(s) 40 and the instrument 50 and/or camera 51, which is useful when reaching the control boundaries of the surgical space.


Each of the control tower 20, the surgeon console 30, and the robotic arm 40 includes a respective computer 21, 31, 41. The computers 21, 31, 41 are interconnected using any suitable communication network based on wired or wireless communication protocols. The term "network," whether plural or singular, as used herein, denotes a data network, including, but not limited to, the Internet, an intranet, a wide area network, or a local area network, and without limitation as to the full scope of the definition of communication networks as encompassed by the present disclosure. Suitable protocols include, but are not limited to, transmission control protocol/internet protocol (TCP/IP), user datagram protocol/internet protocol (UDP/IP), and/or datagram congestion control protocol (DCCP). Wireless communication may be achieved via one or more wireless configurations, e.g., radio frequency, optical, Wi-Fi, Bluetooth (an open wireless protocol for exchanging data over short distances, using short-wavelength radio waves, from fixed and mobile devices, creating personal area networks (PANs)), and ZigBee® (a specification for a suite of high-level communication protocols using small, low-power digital radios based on the IEEE 802.15.4-2003 standard for wireless personal area networks (WPANs)).


The computers 21, 31, 41 may include any suitable processor (not shown) operably connected to a memory (not shown), which may include one or more of volatile, non-volatile, magnetic, optical, or electrical media, such as read-only memory (ROM), random access memory (RAM), electrically-erasable programmable ROM (EEPROM), non-volatile RAM (NVRAM), or flash memory. The processor may be any suitable processor (e.g., control circuit) adapted to perform the operations, calculations, and/or set of instructions described in the present disclosure including, but not limited to, a hardware processor, a field programmable gate array (FPGA), a digital signal processor (DSP), a central processing unit (CPU), a microprocessor, and combinations thereof. Those skilled in the art will appreciate that the processor may be substituted with any logic processor (e.g., control circuit) adapted to execute the algorithms, calculations, and/or sets of instructions described herein.


With reference to FIG. 2, each of the robotic arms 40 may include a plurality of links 42a, 42b, 42c, which are interconnected at joints 44a, 44b, 44c, respectively. Other configurations of links and joints may be utilized as known by those skilled in the art. The joint 44a is configured to secure the robotic arm 40 to the mobile cart 60 and defines a first longitudinal axis. With reference to FIG. 3, the mobile cart 60 includes a lift 67 and a setup arm 61, which provides a base for mounting the robotic arm 40. The lift 67 allows for vertical movement of the setup arm 61. The mobile cart 60 also includes a screen 69 for displaying information pertaining to the robotic arm 40. In embodiments, the robotic arm 40 may include any type and/or number of joints.


The setup arm 61 includes a first link 62a, a second link 62b, and a third link 62c, which provide for lateral maneuverability of the robotic arm 40. The links 62a, 62b, 62c are interconnected at joints 63a and 63b, each of which may include an actuator (not shown) for rotating the links 62a and 62b relative to each other and to the link 62c. In particular, the links 62a, 62b, 62c are movable in their corresponding lateral planes that are parallel to each other, thereby allowing for extension of the robotic arm 40 relative to the patient (e.g., surgical table). In embodiments, the robotic arm 40 may be coupled to the surgical table (not shown). The setup arm 61 includes controls 65 for adjusting movement of the links 62a, 62b, 62c as well as the lift 67. In embodiments, the setup arm 61 may include any type and/or number of joints.


The third link 62c may include a rotatable base 64 having two degrees of freedom. In particular, the rotatable base 64 includes a first actuator 64a and a second actuator 64b. The first actuator 64a is rotatable about a first stationary arm axis which is perpendicular to a plane defined by the third link 62c and the second actuator 64b is rotatable about a second stationary arm axis which is transverse to the first stationary arm axis. The first and second actuators 64a and 64b allow for full three-dimensional orientation of the robotic arm 40.


The joints 44a and 44b include actuators 48a and 48b, respectively, configured to drive the joints 44a, 44b, 44c relative to each other through a series of belts 45a and 45b or other mechanical linkages such as a drive rod, a cable, a lever, and the like. In particular, the actuator 48a is configured to rotate the robotic arm 40 about a longitudinal axis defined by the link 42a.


The actuator 48b of the joint 44b is coupled to the joint 44c via the belt 45a, and the joint 44c is in turn coupled to the joint 46b via the belt 45b. The joint 44c may include a transfer case coupling the belts 45a and 45b, such that the actuator 48b is configured to rotate each of the links 42b, 42c and a holder 46 relative to each other. More specifically, the links 42b, 42c and the holder 46 are passively coupled to the actuator 48b, which enforces rotation about a pivot point "P" which lies at an intersection of the first axis defined by the link 42a and the second axis defined by the holder 46. In other words, the pivot point "P" is a remote center of motion (RCM) for the robotic arm 40. Thus, the actuator 48b controls the angle θ between the first and second axes, allowing for orientation of the surgical instrument 50. Due to the interlinking of the links 42a, 42b, 42c and the holder 46 via the belts 45a and 45b, the angles between the links 42a, 42b, 42c and the holder 46 are also adjusted to achieve the desired angle θ. In embodiments, some or all of the joints 44a, 44b, 44c may include an actuator to obviate the need for mechanical linkages.
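
As a rough illustration of the RCM constraint only (not taken from the disclosure), the instrument axis must always pass through the fixed pivot point "P," so a desired tip position inside the patient determines both the axis direction and the insertion depth. The helper below is a hypothetical sketch under that assumption.

```python
import numpy as np

def rcm_axis_and_depth(pivot: np.ndarray, target: np.ndarray):
    """Given a fixed pivot point P (the RCM at the port) and a desired tip
    position inside the patient, return the unit direction of the instrument
    axis and the insertion depth along that axis. Both inputs are 3-vectors
    expressed in the same world frame."""
    offset = target - pivot
    depth = float(np.linalg.norm(offset))
    if depth == 0.0:
        raise ValueError("target coincides with the pivot point")
    return offset / depth, depth

# Example: port pivot at the origin, target roughly 80 mm inside the cavity.
direction, depth = rcm_axis_and_depth(np.array([0.0, 0.0, 0.0]),
                                      np.array([0.02, 0.03, -0.07]))
```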


With reference to FIG. 2, the holder 46 defines a second longitudinal axis and is configured to receive an instrument drive unit (IDU) 52 (FIG. 1). The IDU 52 is configured to couple to an actuation mechanism of the surgical instrument 50 and the camera 51 and is configured to move (e.g., rotate) and actuate the instrument 50 and/or the camera 51. The IDU 52 transfers actuation forces from its actuators to the surgical instrument 50 to actuate components of an end effector 49 of the surgical instrument 50. The holder 46 includes a sliding mechanism 46a, which is configured to move the IDU 52 along the second longitudinal axis defined by the holder 46. The holder 46 also includes a joint 46b, which rotates the holder 46 relative to the link 42c. During laparoscopic procedures, the instrument 50 may be inserted through a laparoscopic access port 55 (FIG. 3) held by the holder 46. The holder 46 also includes a port latch 46c for securing the access port 55 to the holder 46 (FIG. 2).


The robotic arm 40 also includes a plurality of manual override buttons 53 (FIG. 1) disposed on the IDU 52 and the setup arm 61, which may be used in a manual mode. The user may press one or more of the buttons 53 to move the component associated with the button 53.


With reference to FIG. 4, each of the computers 21, 31, 41 of the surgical robotic system 10 may include a plurality of controllers, which may be embodied in hardware and/or software. The computer 21 of the control tower 20 includes a controller 21a and a safety observer 21b. The controller 21a receives data from the computer 31 of the surgeon console 30 about the current position and/or orientation of the hand controllers 38a and 38b and the state of the foot pedals 36 and other buttons. The controller 21a processes these input positions to determine desired drive commands for each joint of the robotic arm 40 and/or the IDU 52 and communicates these commands to the computer 41 of the robotic arm 40. The controller 21a also receives the actual joint angles measured by encoders of the actuators 48a and 48b and uses this information to determine force feedback commands that are transmitted back to the computer 31 of the surgeon console 30 to provide haptic feedback through the hand controllers 38a and 38b. The safety observer 21b performs validity checks on the data going into and out of the controller 21a and notifies a system fault handler if errors in the data transmission are detected to place the computer 21 and/or the surgical robotic system 10 into a safe state.
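
A highly simplified sketch of this command/feedback cycle is given below. It is purely illustrative: the callables (inverse_kinematics, send_joint_commands, send_haptics) and the constant gains stand in for undisclosed system interfaces, and the actual implementation of the controller 21a is not described in the source.

```python
def teleop_cycle(hand_pose, pedal_state, measured_joint_angles,
                 inverse_kinematics, send_joint_commands, send_haptics):
    """One iteration of a simplified console-to-arm control loop.

    hand_pose: sampled pose of a hand controller for this cycle.
    pedal_state: dict of foot pedal states, e.g., {"clutch": False}.
    measured_joint_angles: encoder readings from the arm actuators.
    """
    if pedal_state.get("clutch", False):
        # Clutched: hand controller motion is NOT forwarded to the arm.
        return
    desired_joints = inverse_kinematics(hand_pose)  # console input -> joint commands
    send_joint_commands(desired_joints)
    # Tracking error drives force feedback rendered at the hand controllers.
    error = [d - m for d, m in zip(desired_joints, measured_joint_angles)]
    send_haptics([2.0 * e for e in error])  # assumed constant feedback gain
```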


The controller 21a is coupled to a storage 22a, which may be a non-transitory computer-readable medium configured to store any suitable computer data, such as software instructions executable by the controller 21a. The controller 21a also includes transitory memory 22b for loading instructions and other computer-readable data during execution of the instructions. In embodiments, other controllers of the system 10 include similar configurations.


The computer 41 includes a plurality of controllers, namely, a main cart controller 41a, a setup arm controller 41b, a robotic arm controller 41c, and an instrument drive unit (IDU) controller 41d. The main cart controller 41a receives and processes joint commands from the controller 21a of the computer 21 and communicates them to the setup arm controller 41b, the robotic arm controller 41c, and the IDU controller 41d. The main cart controller 41a also manages instrument exchanges and the overall state of the mobile cart 60, the robotic arm 40, and the IDU 52. The main cart controller 41a also communicates actual joint angles back to the controller 21a.


Each of the joints 63a and 63b and the rotatable base 64 of the setup arm 61 is a passive joint (i.e., no actuators are present therein) allowing for manual adjustment thereof by a user. The joints 63a and 63b and the rotatable base 64 include brakes that are disengaged by the user to configure the setup arm 61. The setup arm controller 41b monitors slippage of each of the joints 63a and 63b and the rotatable base 64 of the setup arm 61 when the brakes are engaged; when the brakes are disengaged, the joints can be freely moved by the operator without impacting the controls of the other joints. The robotic arm controller 41c controls each of the joints 44a and 44b of the robotic arm 40 and calculates the desired motor torques required for gravity compensation, friction compensation, and closed-loop position control of the robotic arm 40. The robotic arm controller 41c calculates a movement command based on the calculated torque. The calculated motor commands are then communicated to one or more of the actuators 48a and 48b in the robotic arm 40. The actual joint positions are then transmitted by the actuators 48a and 48b back to the robotic arm controller 41c.
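
For illustration, closed-loop joint position control with gravity and friction compensation is commonly implemented as proportional-derivative feedback plus feedforward terms. The torque law below is a generic, hedged sketch of that standard technique, not the disclosed controller; the gains and compensation models are assumptions.

```python
def joint_torque_command(q, q_dot, q_desired, kp, kd,
                         gravity_torque, friction_torque):
    """Generic per-joint torque: PD feedback on the position error plus
    feedforward gravity and friction compensation terms.

    q, q_dot: measured joint position and velocity.
    gravity_torque, friction_torque: assumed compensation models (callables).
    """
    error = q_desired - q
    return kp * error - kd * q_dot + gravity_torque(q) + friction_torque(q_dot)
```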


The IDU controller 41d receives desired joint angles for the surgical instrument 50, such as wrist and jaw angles, and computes desired currents for the motors in the IDU 52. The IDU controller 41d calculates actual angles based on the motor positions and transmits the actual angles back to the main cart controller 41a.


With reference to FIG. 5, the surgical robotic system 10 is set up around a surgical table 90. The system 10 includes mobile carts 60a-d, which may be numbered "1" through "4." During setup, each of the carts 60a-d is positioned around the surgical table 90. The position and orientation of the carts 60a-d depend on a plurality of factors, such as the placement of a plurality of access ports 55a-d, which, in turn, depends on the surgery being performed. Once the port placement is determined, the access ports 55a-d are inserted into the patient, and the carts 60a-d are positioned to insert the instruments 50 and the laparoscopic camera 51 into the corresponding ports 55a-d.


During use, each of the robotic arms 40a-d is attached to one of the access ports 55a-d that is inserted into the patient by attaching the latch 46c (FIG. 2) to the access port 55 (FIG. 3). The IDU 52 is attached to the holder 46, followed by a sterile interface module (SIM) 43 being attached to a distal portion of the IDU 52. Thereafter, the instrument 50 is attached to the SIM 43. The instrument 50 is then inserted through the access port 55 by moving the IDU 52 along the holder 46. The SIM 43 includes a plurality of drive shafts configured to transmit rotation of individual motors of the IDU 52 to the instrument 50, thereby actuating the instrument 50. In addition, the SIM 43 provides a sterile barrier between the instrument 50 and the other components of the robotic arm 40, including the IDU 52. The SIM 43 is also configured to secure a sterile drape (not shown) to the IDU 52.


A surgical procedure may include multiple phases, and each phase may include one or more surgical actions. As used herein, the term "phase" represents a surgical event that is composed of a series of steps (e.g., closure). A "surgical action" may include an incision, a compression, a stapling, a clipping, a suturing, a cauterization, a sealing, or any other such action performed to complete a phase in the surgical procedure. A "step" refers to the completion of a named surgical objective (e.g., hemostasis). During each step, certain surgical instruments 50 (e.g., forceps) are used to achieve a specific objective by performing one or more surgical actions.
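
Purely as an illustration of the phase/step/action hierarchy just defined, the terminology could be captured as nested records; the field names below are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SurgicalAction:
    name: str            # e.g., "incision", "stapling", "suturing"

@dataclass
class Step:
    objective: str       # named surgical objective, e.g., "hemostasis"
    actions: List[SurgicalAction] = field(default_factory=list)

@dataclass
class Phase:
    name: str            # surgical event composed of steps, e.g., "closure"
    steps: List[Step] = field(default_factory=list)
```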


With reference to FIG. 6, the surgical robotic system 10 may include a machine learning (ML) processing system 310 that processes surgical data using one or more ML models to identify one or more features, such as a surgical phase, instrument, or anatomical structure, in the surgical data. The ML processing system 310 includes an ML training system 325, which may be a separate device (e.g., server) that stores its output as one or more trained ML models 330. The ML models 330 are accessible by an ML execution system 340. The ML execution system 340 may be separate from the ML training system 325, namely, the devices that "train" the models are separate from the devices that "infer," i.e., perform real-time processing of surgical data using the trained ML models 330.


System 10 includes a data reception system 305 that collects surgical data, including the video data and surgical instrumentation data. The data reception system 305 can include one or more devices (e.g., one or more user devices and/or servers) located within and/or associated with a surgical operating room and/or control center. The data reception system 305 can receive surgical data in real-time, i.e., as the surgical procedure is being performed.


The ML processing system 310, in some examples, may further include a data generator 315 configured to generate simulated surgical data, such as a set of virtual images, or to record the video data from the image processing device 56, for training the ML models 330, along with other sources of data, e.g., user input, arm movement, etc. The data generator 315 can access (read/write) a data store 320 to record data, including multiple images and/or multiple videos.


The ML processing system 310 also includes a phase detector 350 that uses the ML models 330 to identify a phase within the surgical procedure. The phase detector 350 uses a particular procedural tracking data structure 355 from a list of procedural tracking data structures. The phase detector 350 selects the procedural tracking data structure 355 based on the type of surgical procedure that is being performed. In one or more examples, the type of surgical procedure is predetermined or input by the user. The procedural tracking data structure 355 identifies a set of potential phases that may correspond to a part of the specific type of surgical procedure.


In some examples, the procedural tracking data structure 355 may be a graph that includes a set of nodes and a set of edges, with each node corresponding to a potential phase. The edges may provide directional connections between nodes that indicate (via the direction) an expected order in which the phases will be encountered throughout an iteration of the surgical procedure. The procedural tracking data structure 355 may include one or more branching nodes that feed to multiple next nodes and/or may include one or more points of divergence and/or convergence between the nodes. In some instances, a phase indicates a procedural action (e.g., surgical action) that is being performed or has been performed and/or indicates a combination of actions that have been performed. In some instances, a phase relates to a biological state of a patient undergoing a surgical procedure. For example, the biological state may indicate a complication (e.g., blood clots, clogged arteries/veins, etc.) or a pre-condition (e.g., lesions, polyps, etc.). In some examples, the ML models 330 are trained to detect an "abnormal condition," such as hemorrhaging, arrhythmias, blood vessel abnormality, etc.
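
One way to encode such a directed phase graph with branching and convergence is an adjacency map, where each entry lists the phases that may legitimately come next. This is an assumption for illustration; the disclosure does not specify the encoding, and the phase names below are invented placeholders.

```python
# Hypothetical procedural tracking structure: nodes are candidate phases,
# directed edges give the expected order, including a branch point.
PHASE_GRAPH = {
    "port_placement": ["dissection"],
    "dissection": ["vessel_sealing", "hemorrhage_control"],  # branching node
    "hemorrhage_control": ["vessel_sealing"],                # convergence
    "vessel_sealing": ["stapling"],
    "stapling": ["closure"],
    "closure": [],
}

def allowed_next_phases(current_phase: str) -> list:
    """Restrict the phase detector's candidates to the successors of the
    current node, narrowing the search space for the next prediction."""
    return PHASE_GRAPH.get(current_phase, [])
```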


The phase detector 350 outputs the phase prediction associated with a portion of the video data that is analyzed by the ML processing system 310. The phase prediction is associated with the portion of the video data by identifying a start time and an end time of the portion of the video that is analyzed by the ML execution system 340. The phase prediction that is output may include an identity of a surgical phase as detected by the phase detector 350 based on the output of the ML execution system 340. Further, the phase prediction, in one or more examples, may include identities of the structures (e.g., instrument, anatomy, etc.) that are identified by the ML execution system 340 in the portion of the video that is analyzed. The phase prediction may also include a confidence score of the prediction. Other examples may include various other types of information in the phase prediction that is output. The predicted phase may be used by the controller 21a to determine when to save checkpoints in the methods described below.
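
The prediction output enumerated above (phase identity, the analyzed video interval, detected structures, and a confidence score) might be bundled as a single record; the field names here are assumptions for illustration only.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PhasePrediction:
    phase_name: str        # identity of the detected surgical phase
    start_time_s: float    # start of the analyzed video portion
    end_time_s: float      # end of the analyzed video portion
    structures: List[str] = field(default_factory=list)  # e.g., instruments, anatomy
    confidence: float = 0.0  # confidence score of the prediction
```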



FIG. 7 shows a method for saving checkpoints for the robotic arm 40 controlling the camera 51. The method may be implemented as software instructions executable by the controller 21a or any other suitable processor of the system 10. While the method is described with respect to moving the camera 51 and its corresponding robotic arm 40 to saved checkpoints, the method may also be extended to control other robotic arms 40 and their instruments 50.


With reference to FIG. 7, the method is executed during operation of the system 10, i.e., during the surgical procedure. At step 100, the controller 21a determines that a surgeon is operating the robotic arms 40, e.g., by detecting movement, paddle, or button presses at one or both of the handle controllers 38a and 38b. At step 102, the controller 21a attempts to obtain the surgery phase name from the phase detector 350. At step 104, if no information is received, e.g., if the phase detector 350 is not present or there is no corresponding phase available, the controller 21a proceeds to step 106 to enable manual naming of phases. The surgeon or any other user may identify and name a checkpoint group by entering the phase name or another identifier, e.g., via speech input, typing, etc. The checkpoint group organizes one or more checkpoints based on their phase.


If the phase detector 350 is active and was able to identify the current phase, then the controller 21a uses the current phase to create a checkpoint group at step 108. At step 110, a checkpoint is generated and saved in the storage 22a as part of the checkpoint group. Each checkpoint includes data describing the orientation and position of the robotic arms 40, e.g., coordinates in a world frame, joint angles, arm height, angle relative to the table 90, etc. The checkpoint also includes positional information for the IDU 52, e.g., the angular motor position for one or more motors controlling the camera 51 and/or the instrument 50. The checkpoint further includes image data, such as video taken during checkpoint creation and/or still images, such as a screenshot (e.g., thumbnail) from the video. At step 112, the checkpoints may also be saved as part of a new checkpoint group, i.e., checkpoints that occur during a specific phase of the procedure.
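
A checkpoint carrying the data enumerated above (arm pose, IDU motor positions, image data, and group metadata) could be sketched as the following record. Everything here, including the field names and units, is an illustrative assumption rather than the stored format of the system 10.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class Checkpoint:
    # Arm state: world-frame coordinates, joint angles, height, table angle.
    world_position: List[float]
    joint_angles: List[float]
    arm_height_mm: float
    table_angle_deg: float
    # IDU state: angular position of each motor driving the camera/instrument.
    idu_motor_angles: Dict[str, float]
    # Image data captured at checkpoint creation (thumbnail and optional clip).
    thumbnail_png: bytes
    video_clip_path: Optional[str] = None
    # Grouping metadata, e.g., detected or manually entered phase name.
    phase_group: str = ""
```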


The checkpoints may be generated and stored automatically as in step 114, based on detection of a specific phase. In particular, at step 116, the checkpoint is generated in response to insertion or removal of the camera 51 and the instruments 50 into or from the access ports 55a-d. Another automatic checkpoint generation routine may be performed at step 118, which includes creating checkpoints after a set time period has elapsed, which may be done periodically, e.g., every minute. In addition to automatic checkpoint generation, the user may also generate a checkpoint manually at step 120 by commanding the surgeon console 30 to generate a checkpoint via a GUI 200 of FIGS. 9-20. In particular, FIG. 12 shows the GUI 200 including buttons 203a and 203b, which are used to generate manual checkpoints and to save a checkpoint manually, respectively. The system 10 may also continuously record and store checkpoints in a dynamic continuous manner, such that the user can choose to configure or restore the system 10 to any prior point in the surgical procedure.
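
The three trigger types described above (phase/instrument events, a periodic timer, and a manual command) could be combined in a simple polling loop. The sketch below is hypothetical: the callables stand in for undisclosed system hooks, and the 60-second period merely mirrors the "every minute" example.

```python
import time

def checkpoint_trigger_loop(save_checkpoint, phase_changed, instrument_event,
                            manual_requested, period_s=60.0):
    """Poll the checkpoint triggers: phase/instrument events, an elapsed-time
    period (e.g., every minute), and manual commands from the GUI."""
    last_periodic = time.monotonic()
    while True:
        now = time.monotonic()
        if phase_changed() or instrument_event():   # automatic, event-driven
            save_checkpoint(reason="event")
        if now - last_periodic >= period_s:         # automatic, periodic
            save_checkpoint(reason="periodic")
            last_periodic = now
        if manual_requested():                      # user-commanded via GUI
            save_checkpoint(reason="manual")
        time.sleep(0.1)                             # polling interval (assumed)
```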



FIGS. 8A and 8B show a method for restoring the checkpoints for the robotic arm 40 controlling the camera 51. The method may be implemented as software instructions executable by the controller 21a or any other suitable processor of the system 10. While the method is described with respect to restoring the camera 51 and its corresponding robotic arm 40 to a selected checkpoint, the method may also be extended to control other robotic arms 40 and their instruments 50.


The surgeon or the staff may browse and restore any desired checkpoint using the GUI 200 of FIGS. 9-20, which is displayed as an overlay on the screen 32 of the surgeon console 30 or any other suitable screen of the system 10. The screen 32 may overlay the GUI 200 over the video feed from the camera 51 or display it as a separate menu screen. At step 130, the user activates the GUI 200 by pressing a checkpoints button 202 via any suitable input method (e.g., touch input, pointer, etc.).


At step 132, once the button is pressed, the GUI 200 displays multiple types of checkpoints that have been previously saved. Exemplary categories include, but are not limited to, periodic checkpoints saved during step 118, manual checkpoints saved during step 120, instrument event checkpoints saved during step 116, phase checkpoints saved during step 114, and dynamic checkpoints. Each category is accessible via its own button 202a-e, respectively.


At step 134, the controller 21a receives the input selecting one of the buttons 202a-e. If the selected input is not the dynamic checkpoint option via the button 202e, then at step 136 the controller 21a checks whether the selected input is for phase checkpoints selected via the button 202d. If yes, then the controller 21a proceeds to step 138, during which the saved phase checkpoint groups 204a-f are displayed (e.g., overlaid) on the GUI 200 as shown in FIG. 15. Each of the groups 204a-f may be represented by an image (i.e., a thumbnail screenshot) of the video corresponding to the checkpoint group and may list the affected instruments 50 and/or the camera 51. At step 140, the user selects one of the checkpoint groups 204a-f to display the checkpoints captured during that phase. Each of the checkpoint groups may include one or more checkpoints.


Returning to step 136, if the selected button is one of the buttons 202a-c, selecting periodic, manual, or instrument checkpoints, respectively, then the controller 21a proceeds to step 142, during which the user selects one or more associated checkpoints as shown in FIGS. 11, 13, and 14.



FIG. 11 shows periodic checkpoints 205a-c, which are displayed as a submenu of the periodic checkpoints button 202a. Each checkpoint 205a-c may be represented by an image (i.e., a thumbnail screenshot) of the video corresponding to the checkpoint and may list the affected instruments 50 and/or the camera 51. The GUI 200 may include arrows 205d and 205e for scrolling through additional checkpoints. FIG. 13 shows manual checkpoints 206a and 206b, which are displayed as a submenu of the manual checkpoints button 202b. FIG. 14 shows instrument checkpoints 207a-c, which are displayed as a submenu of the instrument checkpoints button 202c.


Returning to step 134, if the user selects the button 202e, then dynamic checkpoint selection is enabled at step 146 as shown in FIGS. 16 and 17. The dynamic checkpoint selection includes a video stream 208a of the entire procedure, or a portion thereof, as well as playback controls 208b allowing the user to jump to a desired point and select a dynamic checkpoint via a button 208c. The video stream 208a may be displayed in any manner, e.g., in picture-in-picture mode or overlaid over the video feed of the camera 51. At step 148, the user can also select a previously saved dynamic checkpoint.
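
Selecting a dynamic checkpoint from a playback position amounts to mapping the chosen video timestamp to the nearest continuously recorded state. A hedged sketch of that mapping, assuming a sorted, non-empty list of checkpoint timestamps:

```python
from bisect import bisect_left

def nearest_dynamic_checkpoint(timestamps, paused_at_s):
    """Given sorted timestamps (seconds) of continuously recorded checkpoints,
    return the index of the one closest to the paused playback time."""
    i = bisect_left(timestamps, paused_at_s)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    before, after = timestamps[i - 1], timestamps[i]
    # Pick whichever neighbor is closer to the paused time.
    return i if after - paused_at_s < paused_at_s - before else i - 1
```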


Once the user selects a checkpoint, checkpoint navigation is enabled at step 144. FIG. 18 shows the GUI 200 when one of the checkpoints 207a-c, or any other checkpoint described above, is selected. The selected checkpoint, i.e., the checkpoint 207a, is enlarged as a window 207d, which is displayed along with a navigation button 207e and a back button 207f for reverting to checkpoint selection. At step 152, the controller 21a checks which button 207e or 207f was pressed. Pressing the button 207f removes the enlarged window 207d at step 154 and returns the GUI 200 back to step 142, where the available checkpoints for a given submenu (i.e., manual, instrument, periodic, phase, etc.) are shown. Pressing the navigate button 207e freezes input from the gimbals of the handle controllers 38a and 38b at step 156, in preparation for navigating through the selected checkpoint. Each of the handle controllers 38a and 38b includes a gimbal assembly (not shown) allowing for movement and rotation of the handle controllers 38a and 38b about three axes (x, y, z). Details of the handle controllers 38a and 38b, including gimbal operation, are provided in U.S. Patent Application Publication No. 2020/0315729, titled "Control arm assemblies for robotic surgical systems," filed on Nov. 30, 2018, the entire contents of which are incorporated by reference herein.


At step 158, the controller 21a checks whether the user selected to navigate backward or forward through the selected checkpoint. If the user selected to navigate forward, then at step 160 the GUI 200 clears the checkpoint video and/or thumbnail. The controller 21a also loads from the storage 22a coordinates or other data (e.g., kinematics) for placing the robot arms 40a-d and their corresponding instruments 50 and/or camera 51 into the position and configuration of the selected checkpoint. In addition to controlling the robotic arms 40a-d, the individual IDUs 52 are also actuated to achieve the desired checkpoint. The forward navigation may be used during reinsertion of the instruments 50 and/or camera 51 after extraction for instrument replacement or cleaning of the camera 51. This allows the system 10 to quickly resume operation after the interruption caused by removal of the instruments 50 and/or camera 51 from the surgical site.


At step 162, the controller 21a verifies whether the handle controllers 38a and 38b are being held by the user. If one of the handle controllers 38a or 38b is being moved, then at step 164 the controller 21a displays a prompt 209 asking the user to release the handle controllers 38a and 38b. At step 166, the controller 21a automatically moves the robotic arms 40a-d and the associated instruments 50/camera 51 to the loaded coordinates of the checkpoint. At step 168, the controller 21a checks if there are any other checkpoints that need to be loaded based on the selected forward direction of the navigation. At step 170, the controller 21a continues to load subsequent checkpoints until a desired or final checkpoint is reached, i.e., the camera 51 is returned to its position inside the patient prior to its extraction. During navigation, the video feed 210a of the camera 51 is shown with a message 210b that navigation is in progress, as shown in FIG. 20. In addition, the checkpoint to which the system 10 is being restored is shown in a thumbnail 210c. This provides the user with visual confirmation that the system 10 has reached the checkpoint, i.e., once the feed 210a matches the thumbnail 210c.
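
The restoration flow of steps 160-170, and its mirror-image backward flow described next, can be summarized as a loop that replays stored checkpoints until the target is reached, pausing whenever the handle controllers are not released. The sketch below is a hypothetical summary under those assumptions; the callables are stand-ins for undisclosed interfaces.

```python
def navigate_through_checkpoints(checkpoints, move_arm_to, controllers_idle,
                                 prompt_release):
    """Replay a sequence of stored checkpoints until the final one is reached.

    checkpoints: checkpoint poses ordered in the chosen direction of travel
                 (forward for reinsertion, reversed for extraction).
    move_arm_to: commands the arm and IDU to a stored pose, blocking until
                 the motion completes.
    controllers_idle: returns True when the handle controllers are released.
    """
    for pose in checkpoints:
        while not controllers_idle():
            prompt_release()   # ask the user to release the handle controllers
        move_arm_to(pose)      # automatic motion to the loaded coordinates
```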


Returning to step 158, if the user selected to navigate backward, then the GUI 200 clears the checkpoint video and/or thumbnail at step 172, and the controller 21a loads from the storage 22a coordinates or other data (e.g., kinematics) for placing the robot arms 40a-d and their corresponding instruments 50 and/or camera 51 into the position and configuration of the selected checkpoint. Backward navigation may be used for extraction of the instruments 50 and/or camera 51 for instrument replacement or cleaning of the camera 51.


At step 174, the controller 21a verifies whether the handle controllers 38a and 38b are being held by the user. If one of the handle controllers 38a or 38b is being moved, then at step 176 the controller 21a displays a prompt 209 asking the user to release the handle controllers 38a and 38b. At step 178, the controller 21a automatically moves the robotic arms 40a-d and the associated instruments 50/camera 51 to the loaded coordinates of the checkpoint. At step 180, the controller 21a checks if there are any other checkpoints that need to be loaded based on the selected backward direction of the navigation. At step 182, the controller 21a continues to load subsequent checkpoints until a desired or final checkpoint is reached, i.e., the camera 51 is returned to its initial position prior to its insertion into the patient. In further embodiments, the controller 21a may disable input from the handle controllers 38a and 38b during the movement of the robotic arms 40a-d to the checkpoint position.


It will be understood that various modifications may be made to the embodiments disclosed herein. Therefore, the above description should not be construed as limiting, but merely as exemplifications of various embodiments. Those skilled in the art will envision other modifications within the scope and spirit of the claims appended thereto.

Claims
  • 1. A surgical robotic system comprising: a surgical robotic arm including a surgical device and an instrument drive unit configured to actuate the surgical device; a surgeon console configured to receive user input to control at least one of the surgical robotic arm or the surgical device; and a control tower coupled to the surgical robotic arm and the surgeon console, the control tower including: a storage device; and a controller configured to: store a plurality of checkpoints on the storage device, each checkpoint of the plurality of checkpoints including image data of a surgical site and position data corresponding to a prior position of the surgical robotic arm during a surgical procedure; output a graphical user interface (GUI) on a display screen, the GUI configured to display the image data of at least one checkpoint of the plurality of checkpoints and to receive user input selecting the at least one checkpoint; and control the surgical robotic arm to move the surgical robotic arm to the prior position based on the selected at least one checkpoint.
  • 2. The surgical robotic system according to claim 1, wherein the controller is further configured to store the at least one checkpoint of the plurality of checkpoints in response to activation of the surgical device.
  • 3. The surgical robotic system according to claim 1, wherein the controller is further configured to store the at least one checkpoint of the plurality of checkpoints in response to a user command.
  • 4. The surgical robotic system according to claim 1, wherein the controller is further configured to store the at least one checkpoint of the plurality of checkpoints based on an elapsed time of the surgical procedure.
  • 5. The surgical robotic system according to claim 1, wherein the controller is further configured to detect a phase of the surgical procedure.
  • 6. The surgical robotic system according to claim 5, wherein the controller is further configured to store the at least one checkpoint of the plurality of checkpoints based on the detected phase of the surgical procedure.
  • 7. The surgical robotic system according to claim 1, wherein the surgical device is a surgical robotic instrument.
  • 8. The surgical robotic system according to claim 1, wherein the surgical device is a laparoscopic camera.
  • 9. The surgical robotic system according to claim 8, wherein the at least one checkpoint of the plurality of checkpoints includes the position data corresponding to the laparoscopic camera being outside a patient.
  • 10. A method for restoring a positional state of a surgical robotic system, the method comprising: storing on a storage device a plurality of checkpoints including image data of a surgical site and position data corresponding to a prior position during a surgical procedure of a laparoscopic camera held by a surgical robotic arm, wherein a first checkpoint of the plurality of checkpoints corresponds to a position of the laparoscopic camera being outside a patient and a second checkpoint of the plurality of checkpoints corresponds to a position of the laparoscopic camera being inside the patient; outputting a graphical user interface (GUI) configured to display on a display screen the image data of the plurality of checkpoints; receiving a first user input through the GUI selecting at least one of the first checkpoint or the second checkpoint; and controlling the surgical robotic arm to move the surgical robotic arm to the prior position based on selection of at least one of the first checkpoint or the second checkpoint.
  • 11. The method according to claim 10, further comprising storing at least one of the first checkpoint or the second checkpoint in response to activation of the laparoscopic camera.
  • 12. The method according to claim 10, further comprising storing at least one of the first checkpoint or the second checkpoint in response to a user command.
  • 13. The method according to claim 10, further comprising storing at least one of the first checkpoint or the second checkpoint based on an elapsed time of the surgical procedure.
  • 14. The method according to claim 10, further comprising: detecting a phase of the surgical procedure; andstoring at least one of the first checkpoint or the second checkpoint based on the detected phase of the surgical procedure.
  • 15. A method for restoring a positional state of a surgical robotic system, the method comprising: storing a plurality of checkpoints on a storage device, each checkpoint of the plurality of checkpoints including image data of a surgical site and position data corresponding to a prior position of a surgical robotic arm during a surgical procedure; outputting a graphical user interface (GUI) configured to display on a display screen the image data of at least one checkpoint of the plurality of checkpoints; receiving user input through the GUI selecting the at least one checkpoint; and controlling the surgical robotic arm to move the surgical robotic arm to the prior position based on the selected at least one checkpoint.
  • 16. The method according to claim 15, further comprising storing the at least one checkpoint of the plurality of checkpoints in response to activation of a surgical device held by the surgical robotic arm.
  • 17. The method according to claim 15, further comprising storing the at least one checkpoint of the plurality of checkpoints in response to a user command.
  • 18. The method according to claim 15, further comprising storing the at least one checkpoint of the plurality of checkpoints based on an elapsed time of the surgical procedure.
  • 19. The method according to claim 15, further comprising detecting a phase of the surgical procedure.
  • 20. The method according to claim 19, further comprising storing the at least one checkpoint of the plurality of checkpoints based on the detected phase of the surgical procedure.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of, and priority to, U.S. Provisional Patent Application Ser. No. 63/517,158, filed on Aug. 2, 2023. The entire contents of the foregoing application are incorporated by reference herein.

Provisional Applications (1)
Number Date Country
63517158 Aug 2023 US