There are various types of surgical robotic systems on the market or under development. Some surgical robotic systems use a plurality of robotic arms. Each arm carries a surgical instrument, or the camera used to capture images from within the body for display on a monitor. Other surgical robotic systems use a single arm that carries a plurality of instruments and a camera that extend into the body via a single incision. Each of these types of robotic systems uses motors to position and/or orient the camera and instruments and, where applicable, to actuate the instruments. Typical configurations allow two or three instruments and the camera to be supported and manipulated by the system. Input to the system is generated by a surgeon positioned at a master console, typically using input devices such as input handles and a foot pedal. Motion and actuation of the surgical instruments and the camera are controlled based on the user input. The image captured by the camera is shown on a display at the surgeon console. The console may be located patient-side, within the sterile field, or outside of the sterile field.
Although the inventions described herein may be used on a variety of robotic surgical systems, the embodiments will be described with reference to a system of the type shown in
One of the instruments 10a, 10b, 10c is a laparoscopic camera that captures images for display on a display 23 at the surgeon console 12. The camera may be moved by its corresponding robotic manipulator using input from an eye tracker 21.
The input devices at the console may be equipped to provide the surgeon with tactile feedback so that the surgeon can feel on the input devices 17, 18 the forces exerted by the instruments on the patient's tissues.
A control unit 30 is operationally connected to the robotic arms and to the user interface. The control unit receives user input from the input devices corresponding to the desired movement of the surgical instruments, and the robotic arms are caused to manipulate the surgical instruments accordingly.
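The mapping described above, from user-input motion to commanded instrument motion, can be illustrated with a minimal sketch. This is not the application's implementation; the names (`InputSample`, `motion_scale`) and the fixed scaling factor are assumptions chosen only to show the idea of scaling master-handle displacements down into instrument displacements.

```python
# Hypothetical sketch of the control-unit mapping: displacement samples from
# a master input device are scaled down and become instrument motion commands.
from dataclasses import dataclass


@dataclass
class InputSample:
    dx: float  # master handle displacement since last sample (mm)
    dy: float
    dz: float


def instrument_command(sample: InputSample, motion_scale: float = 0.3):
    """Scale master-handle motion down for fine instrument control."""
    return (sample.dx * motion_scale,
            sample.dy * motion_scale,
            sample.dz * motion_scale)


cmd = instrument_command(InputSample(10.0, -4.0, 2.0))
```

A real control unit would of course run this mapping in a closed loop with inverse kinematics for each robotic arm; the sketch shows only the motion-scaling step.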
New opportunities for control of the surgical instruments arise when the system is paired with other surgical implements such as a colpotomy ring, stomach bougie, stent or catheter. This application describes embodiments where the surgical robotic system is capable of identifying and responding to other surgical implements, intraoperatively.
This application describes modes and methods of operation for a surgical robotic system according to which the system may identify and respond to other surgical implements intraoperatively. While the modes and methods are not limited to any specific types of surgical procedures, the embodiments describe operation of the system in which a colpotomy ring/cup is used during a total laparoscopic hysterectomy, and one in which a stomach bougie is used during a sleeve gastrectomy.
Referring to
In the
The surgeon could pre-define a desired sleeve width, and the system would help to confirm the position of the stapler with respect to the bougie prior to firing. This confirmation of position could also include a haptic component that causes the user input device to apply force to the surgeon's hand. This force could restrain movement of the user input handle to restrict motion of the stapler along the path, or cause the surgeon to haptically feel as if the instrument is attracted to the path (like a magnet), thus compelling the surgeon to move the handle so as to guide the instrument along that path.
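The "magnet-like" attraction described above can be sketched as a spring force proportional to the instrument tip's perpendicular distance from a straight target path, clamped to a maximum. This is a minimal illustration, not the application's control law; the gain `k`, force cap `f_max`, and straight-line path model are all assumptions.

```python
# Illustrative sketch: a clamped spring force pulling the instrument tip
# toward the line path_point + t * path_dir (path_dir assumed unit length).
import math


def attraction_force(tip, path_point, path_dir, k=0.5, f_max=5.0):
    """Return a 3-D force vector pulling the tip toward the target path."""
    # vector from a point on the path to the tip
    v = [t - p for t, p in zip(tip, path_point)]
    # component of that vector along the path direction
    along = sum(a * b for a, b in zip(v, path_dir))
    # perpendicular offset of the tip from the path
    perp = [a - along * d for a, d in zip(v, path_dir)]
    dist = math.sqrt(sum(c * c for c in perp))
    if dist == 0.0:
        return [0.0, 0.0, 0.0]
    mag = min(k * dist, f_max)  # spring force, clamped to f_max
    return [-mag * c / dist for c in perp]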
In a modified version of the first embodiment, a memory of the system stores a computer program that includes a computer vision algorithm. A controller executes the computer vision algorithm to analyze endoscopic image data, 3D endoscopic image data or structured light system image data to detect shape characteristics of the stomach as shaped by the bougie. The algorithm is used to determine the location of the bougie based on topographical variations in the imaged region or, if the bougie is illuminated, light variations. The system can generate an overlay on the image display identifying the location of the bougie or a margin of predetermined distance from the detected longitudinal edge of the bougie. The surgeon can then guide the stapler to a target cut/staple pathway based on the region defined by the bougie. Alternatively, the system can generate a haptic boundary as described above, allowing the surgeon to advance the stapler along the haptic boundary to complete the stapling and cutting steps. Additionally, or as an alternative, the system may be configured so that the user cannot fire the stapler except when the stapler is in an appropriate position to create the pouch, such as a predetermined distance from the bougie, oriented along the target staple pathway, etc.
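The firing interlock just described can be sketched as a simple gating check: firing is permitted only when the stapler tip lies within a distance band from the detected bougie edge and the stapler axis is roughly aligned with the target staple path. The thresholds and function names here are illustrative assumptions, not values from the application.

```python
# Hedged sketch of a stapler firing interlock. Distances are in millimetres;
# stapler_axis and path_axis are assumed to be unit vectors.
import math


def can_fire(dist_to_bougie_mm, stapler_axis, path_axis,
             min_margin_mm=5.0, max_margin_mm=15.0, max_angle_deg=10.0):
    """Permit firing only within the margin band and near-alignment."""
    dot = sum(a * b for a, b in zip(stapler_axis, path_axis))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    in_margin = min_margin_mm <= dist_to_bougie_mm <= max_margin_mm
    return in_margin and angle <= max_angle_deg
```

In practice the distance and orientation inputs would come from the computer vision detection of the bougie described above.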
A second embodiment would enable the use of a surgical robotic system with a colpotomy ring and uterine manipulator. During a hysterectomy, it is necessary to cut the vaginal cuff circumferentially to detach the uterus from the vagina. As with the bougie, the colpotomy ring is not readily identifiable when inserted into the patient due to the layer of tissue between the device and the robotically controlled surgical instruments.
Much like the bougie example, the second embodiment would enable communication between the uterine manipulator, specifically the colpotomy ring 108, and the surgical system such that the surgical system could identify the location of the colpotomy ring and the proximity of the instruments to the ring. As in the bougie example, control of the user input devices can be used to deliver haptic feedback that causes the surgeon to feel as if the instruments are haptically attracted to a path defined by the circumference of the ring. Electrosurgical devices used for the procedure may be set up so that their energy-delivery features are enabled only when within a predetermined proximity of the ring, as a means to prevent undesired tissue damage.
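The proximity-gated energy delivery just described reduces to a simple distance check, sketched below. The enable radius and the assumption that the detected ring is represented by its centre point are illustrative choices, not details from the application.

```python
# Illustrative sketch: enable electrosurgical output only while the
# instrument tip is within a set distance of the detected ring centre.
import math


def energy_enabled(tip, ring_center, enable_radius_mm=20.0):
    """Gate energy delivery on tip-to-ring distance (mm)."""
    return math.dist(tip, ring_center) <= enable_radius_mm
```

A fuller model might test distance to the ring's circumference rather than its centre, but the gating principle is the same.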
These modes of operation could be turned on or off by a surgeon using input at the surgeon console, or enabled via procedural anticipation based on observed steps or motions (detected using kinematics or computer vision techniques) being carried out during the procedure.
These embodiments provide a number of advantages over existing technology, including:
Described concepts that are particularly unique include:
Concepts described in U.S. application Ser. No. 16/237,418, "Use of Eye Tracking for Tool Identification and Assignment in a Robotic Surgical System" (Ref: TRX-14210), U.S. application Ser. No. 16/237,444, "System and Method for Controlling a Robotic Surgical System Based on Identified Structures" (Ref: TRX-14410), and U.S. Provisional 62/787,250, entitled "Instrument Path Guidance Using Visualization and Fluorescence" (Ref: TRX-14000), may be combined with those discussed in the present application.
All patents and applications referenced herein, including for purposes of priority, are incorporated herein by reference.
Number | Date | Country
---|---|---
62787250 | Dec 2018 | US
Relation | Number | Date | Country
---|---|---|---
Parent | 16010388 | Jun 2018 | US
Child | 16733147 | | US