Surgical robots enable enhanced control of instruments via basic features such as tremor filtration and motion scaling. When paired with advanced vision systems, new opportunities for surgical instrument control arise. Current implementations pairing vision systems such as MRI with surgical robots such as the Mako™ robot from Stryker enable the definition of paths and boundaries in orthopedic procedures and other hard tissue interventions.
Although the concepts described herein may be used on a variety of robotic surgical systems, one robotic surgical system is shown in
One of the instruments 10a, 10b, 10c is a camera that captures images of the operative field in the body cavity. The camera may be moved by its corresponding robotic manipulator using input from a variety of types of input devices, including, without limitation, one of the handles 17, 18, additional controls on the console, a foot pedal, an eye tracker 21, voice controller, etc. The console may also include a display or monitor 23 configured to display the images captured by the camera, and for optionally displaying system information, patient information, etc.
A control unit 30 is operationally connected to the robotic arms and to the user interface. The control unit receives user input from the input devices corresponding to the desired movement of the surgical instruments, and the robotic arms are caused to manipulate the surgical instruments accordingly.
The input devices 17, 18 are configured to be manipulated by a user to generate signals that are processed by the system to generate instructions used to command motion of the manipulators in order to move the instruments in multiple degrees of freedom and to, as appropriate, control operation of electromechanical actuators/motors that drive motion and/or actuation of the instrument end effectors.
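The basic control features named earlier, motion scaling and tremor filtration, can be sketched as transformations applied between the user's raw input signals and the commanded manipulator motion. The following is an illustrative sketch only; the function names, scaling factor, and filter constant are assumptions, not taken from the actual system.

```python
# Hypothetical sketch of two instrument-control features: motion scaling
# (a hand motion is scaled down for finer instrument motion) and tremor
# filtration (a simple exponential low-pass filter). Parameters are
# illustrative placeholders.

def scale_motion(delta, factor=0.2):
    """Scale a raw hand-motion increment (mm) down for fine instrument motion."""
    return [d * factor for d in delta]

def filter_tremor(samples, alpha=0.3):
    """Exponential low-pass filter to attenuate high-frequency hand tremor."""
    filtered = []
    prev = samples[0]
    for s in samples:
        prev = alpha * s + (1 - alpha) * prev
        filtered.append(prev)
    return filtered

scaled = scale_motion([5.0, -2.0, 1.0])   # 5 mm of hand motion -> 1 mm of tool motion
smooth = filter_tremor([0.0, 1.0, 0.0, 1.0, 0.0])
```

In a real controller these steps would run at the servo loop rate on each input sample before inverse kinematics; the sketch only conveys the order of operations.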
The surgical system allows the operating room staff to remove and replace the surgical instruments 10a, b, c carried by the robotic manipulator, based on the surgical need. When an instrument exchange is necessary, surgical personnel remove an instrument from a manipulator arm and replace it with another.
This application details embodiments where the vision system used intraoperatively with the surgical robot is able to distinguish and identify tissue planes, paths, anatomical landmarks and structures and the surgical robot uses that information to enable advanced control of surgical instruments. The embodiments herein will focus on soft tissue interventions but could have applicability in other dynamic scenarios as well.
A first embodiment pairs a surgical robot system with a standard, off-the-shelf visualization system (such as those from Stryker or Olympus, etc.). The image processing equipment is designed to enable differentiation of objects within the field of view based on their RGB values. As an example, the
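The RGB-based differentiation described above can be illustrated with a minimal per-pixel classifier. This is an assumed sketch, not the application's actual image processing pipeline; the color ranges and labels are arbitrary placeholders.

```python
# Minimal sketch of differentiating regions in a video frame by RGB value.
# The thresholds below are illustrative placeholders, not clinical values.

def classify_pixel(rgb):
    """Crudely label a pixel by its dominant color channels."""
    r, g, b = rgb
    if r > 150 and r > g + 40 and r > b + 40:
        return "tissue_a"      # e.g. a reddish tissue
    if 200 > r > 100 and g > 120 and b < 100:
        return "tissue_b"      # e.g. a yellowish fatty plane
    return "background"

def segment(frame):
    """Label every pixel in a frame (a list of rows of RGB triples)."""
    return [[classify_pixel(px) for px in row] for row in frame]

frame = [[(200, 60, 50), (150, 150, 60)],
         [(30, 30, 30), (210, 40, 40)]]
labels = segment(frame)
```

A production system would segment contiguous regions rather than isolated pixels, but the labeled regions would feed the path and boundary definitions described below in the same way.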
A second mode would restrict use of an electrosurgical device by preventing monopolar energy from being activated from the instrument unless the instrument was positioned to deliver that energy within a region bordering the identified intersection between the two tissues. This could also prevent undesired damage to adjacent tissue.
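The gating logic of this mode reduces to a geometric check: energy activation is permitted only while the instrument tip lies within a band around the identified tissue intersection. The sketch below assumes a point intersection and a fixed tolerance; both are illustrative simplifications.

```python
# Hedged sketch of energy gating: monopolar activation is allowed only
# when the instrument tip is within band_mm of the identified tissue
# intersection. The band width and point geometry are assumptions.

import math

def energy_allowed(tip, intersection_point, band_mm=5.0):
    """Allow activation only within band_mm of the identified intersection."""
    return math.dist(tip, intersection_point) <= band_mm

allowed = energy_allowed((0.0, 0.0, 3.0), (0.0, 0.0, 0.0))    # tip 3 mm away: allowed
blocked = energy_allowed((0.0, 0.0, 12.0), (0.0, 0.0, 0.0))   # tip 12 mm away: blocked
```

In practice the identified intersection would be a curve or surface delivered by the image processor, so the distance test would run against the nearest point of that structure rather than a single point.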
A third operative mode may enable boundaries based on tissue color identification. These boundaries may either keep surgical instruments within a given area (keep-in), or outside of a given area (keep-out). The surgical system would haptically prevent the user from moving instruments out of/into those regions, as applicable.
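The keep-in and keep-out behaviors above can be sketched as constraints applied to each commanded instrument position. The axis-aligned box regions and clamping strategy here are illustrative assumptions; a real haptic controller would render forces against arbitrary region geometry.

```python
# Illustrative sketch (not the actual controller) of keep-in / keep-out
# enforcement: a commanded position is clamped into, or held out of, a
# region, which is how a haptic constraint resists the user's motion.

def inside_box(p, lo, hi):
    """True if point p lies within the axis-aligned box [lo, hi]."""
    return all(l <= c <= h for c, l, h in zip(p, lo, hi))

def enforce_keep_in(p, lo, hi):
    """Clamp a commanded point into an axis-aligned keep-in box."""
    return tuple(min(max(c, l), h) for c, l, h in zip(p, lo, hi))

def enforce_keep_out(p, lo, hi, fallback):
    """If the commanded point enters a keep-out box, hold at the fallback point."""
    return fallback if inside_box(p, lo, hi) else p

lo, hi = (0.0, 0.0, 0.0), (10.0, 10.0, 10.0)
clamped = enforce_keep_in((12.0, 5.0, -1.0), lo, hi)   # -> (10.0, 5.0, 0.0)
held = enforce_keep_out((5.0, 5.0, 5.0), lo, hi, fallback=(5.0, 5.0, 11.0))
```

The difference the user would feel is that a keep-in constraint stops outward motion at the region's edge, while a keep-out constraint refuses entry into the region.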
A second embodiment would enable the use of a surgical robotic system with an advanced visualization system (such as the one from Novadaq) equipped with fluorescence imaging technology. See the image below which illustrates a hidden structure that has been illuminated using a fluorescing dye, placed prior to the surgical intervention in the structure or blood stream. The image processing equipment would identify the presence of fluorescence and use the differentiation between the fluorescing object and surrounding tissue to identify paths or boundaries for the surgical robot. See
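Fluorescence-based differentiation can be sketched as thresholding the fluorescence (near-infrared) channel of each frame and collecting the pixels that exceed it. The channel layout and threshold are assumptions for illustration, not properties of any particular imaging system.

```python
# A hedged sketch of fluorescence-based differentiation: pixels whose
# NIR (fluorescence) intensity exceeds a threshold are treated as part
# of the dyed structure; everything else as surrounding tissue.
# The threshold value is an illustrative placeholder.

def fluorescing_mask(nir_frame, threshold=180):
    """Binary mask of pixels whose NIR intensity exceeds the threshold."""
    return [[1 if v > threshold else 0 for v in row] for row in nir_frame]

def structure_pixels(mask):
    """Coordinates of fluorescing pixels, usable to seed a path or boundary."""
    return [(r, c) for r, row in enumerate(mask)
            for c, v in enumerate(row) if v]

nir = [[10, 200, 220],
       [5, 190, 15]]
mask = fluorescing_mask(nir)
coords = structure_pixels(mask)   # [(0, 1), (0, 2), (1, 1)]
```

The resulting coordinates would then be handed to the same path and boundary machinery as the RGB-based embodiment, with the fluorescing structure typically defining a keep-out region.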
The disclosed concepts are advantageous in that they define a path or region using visualization and tissue differentiation based on color or fluorescence. Operative modes for a surgical robot use the paths or regions defined with the image processing equipment to, for example, prevent or allow certain types of activity, in some cases allowing the user to “feel” identified boundaries or paths via haptic constraints, attractions or repulsions. Some modes enable the use of instrument features when the instrument is near identified paths, objects or boundaries; others disable instrument features under those same conditions.
It will be appreciated that the concepts described here may be used in conjunction with systems and modes of operation described in co-pending U.S. application Ser. No. 16/237,444, entitled “System and Method for Controlling a Robotic Surgical System Based on Identified Structures” which is incorporated herein by reference.
Number | Date | Country
---|---|---
62787250 | Dec 2018 | US