Robotic surgical devices with tracking camera technology and related systems and methods

Information

  • Patent Grant
  • Patent Number
    11,974,824
  • Date Filed
    Tuesday, July 6, 2021
  • Date Issued
    Tuesday, May 7, 2024
Abstract
The various inventions relate to robotic surgical devices, consoles for operating such surgical devices, operating theaters in which the various devices can be used, insertion systems for inserting and using the surgical devices, and related methods. A positionable camera is disposed within the surgical device, and the system is configured to execute a tracking and positioning algorithm to re-position and re-orient the camera tip.
Description
FIELD OF THE INVENTION

The implementations disclosed herein relate to various medical devices and related components, including robotic and/or in vivo medical devices and related components. Certain implementations include various robotic medical devices, including robotic devices that are disposed within a body cavity and positioned using a support component disposed through an orifice or opening in the body cavity and further including a camera that is positioned through the support component and can be operated to manually or automatically track the arms or end effectors of the robotic device. Further implementations relate to methods and devices for operating the above devices.


BACKGROUND OF THE INVENTION

Invasive surgical procedures are essential for addressing various medical conditions. When possible, minimally invasive procedures such as laparoscopy are preferred.


However, known minimally invasive technologies such as laparoscopy are limited in scope and complexity due in part to 1) mobility restrictions resulting from using rigid tools inserted through access ports, and 2) limited visual feedback. Known robotic systems such as the da Vinci® Surgical System (available from Intuitive Surgical, Inc., located in Sunnyvale, Calif.) are also restricted by the access ports, as well as having the additional disadvantages of being very large, very expensive, unavailable in most hospitals, and having limited sensory and mobility capabilities.


There is a need in the art for improved surgical methods, systems, and devices.


BRIEF SUMMARY OF THE INVENTION

Discussed herein are various robotic surgical systems, including certain systems having camera lumens constructed and arranged to receive various camera systems, including tracking camera systems. Further implementations relate to surgical insertion devices constructed and arranged to be used to insert various surgical devices into a cavity of a patient while maintaining insufflation of the cavity.


In various Examples, a system of one or more computers can be configured to perform particular operations or actions through software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to perform the actions. One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions.


In Example 1, a robotic surgical system, comprising a device body constructed and arranged to be positioned at least partially within a body cavity of a patient through an incision, the device body comprising: a first robotic surgical arm operably coupled to the device body and comprising a first end effector; a second robotic surgical arm operably coupled to the device body and comprising a second end effector; a camera lumen defined in the device body; a positionable camera constructed and arranged to provide views of the first and second end effectors; and a surgical console comprising a processor constructed and arranged to execute an algorithm to position the positionable camera.


In Example 2, the system of Example 1, wherein the positionable camera comprises a tip constructed and arranged to be capable of both pitch and yaw.


In Example 3, the system of Example 1, wherein the processor is constructed and arranged to execute a control algorithm for positioning of the first and second robotic surgical arms.


In Example 4, the system of Example 3, wherein the control algorithm is constructed and arranged to establish a camera reference frame and a robot reference frame.


In Example 5, the system of Example 4, wherein the processor is configured to align the camera reference frame with the robot reference frame and re-position the positionable camera.


In Example 6, the system of Example 4, wherein the robot coordinate frame is established relative to the device body and is defined by orthogonal unit vectors xR, yR, and zR.


In Example 7, the system of Example 4, wherein the camera coordinate frame is defined by orthogonal unit vectors xC, yC, and zC.


In Example 8, the system of Example 4, wherein the processor is configured to define locations PL and PR for the first and second end effectors, respectively.


In Example 9, the system of Example 8, wherein the processor is configured to establish Midpoint PLPR between the end effectors via PL and PR.


In Example 10, the system of Example 9, wherein the camera reference frame has an origin and the processor is configured to align the origin with the Midpoint PLPR and reposition the positionable camera.


In Example 11, a robotic surgical system, comprising a robotic surgical device comprising a first robotic surgical arm operably coupled to the device body and comprising a first end effector; a second robotic surgical arm operably coupled to the device body and comprising a second end effector; and a camera lumen defined in the device body; a positionable camera comprising an articulating tip and constructed and arranged to be inserted into the robotic surgical device such that the tip is oriented to view the first and second end effectors; and a surgical console comprising a processor constructed and arranged to execute a control algorithm to position the positionable camera, wherein the control algorithm is constructed and arranged to establish a camera reference frame, establish a robot reference frame, and position the camera tip relative to the camera reference frame or robot reference frame.


In Example 12, the system of Example 11, wherein the robot coordinate frame is established relative to the device body and is defined by orthogonal unit vectors xR, yR, and zR.


In Example 13, the system of Example 11, wherein the camera coordinate frame is defined by orthogonal unit vectors xC, yC, and zC.


In Example 14, the system of Example 11, wherein the processor is configured to define locations PL and PR for the first and second end effectors, respectively.


In Example 15, the system of Example 14, wherein the processor is configured to establish Midpoint PLPR between the end effectors via PL and PR, and wherein the camera reference frame has an origin and the processor is configured to align the origin with the Midpoint PLPR and reposition the positionable camera.


In Example 16, a robotic surgical system, comprising: a robotic surgical device comprising: a first robotic surgical arm operably coupled to the device body and comprising a first end effector; and a second robotic surgical arm operably coupled to the device body and comprising a second end effector; a positionable camera comprising an articulating tip and constructed and arranged to be inserted into the robotic surgical device such that the tip is oriented to view the first and second end effectors; and a processor constructed and arranged to execute a control algorithm to position the positionable camera, wherein the control algorithm is constructed and arranged to: establish a camera reference frame defined by orthogonal unit vectors xC, yC, and zC, establish a robot reference frame relative to the device body defined by orthogonal unit vectors xR, yR, and zR, and position the camera tip relative to the camera reference frame or robot reference frame.


In Example 17, the system of Example 16, further comprising a robot clamp constructed and arranged to rotatably couple the robotic surgical device to a support arm.


In Example 18, the system of Example 16, wherein the robot clamp further comprises a release button and a clothespin member.


In Example 19, the system of Example 16, further comprising an interface pod.


In Example 20, the system of Example 16, further comprising an indicator light.


Other embodiments of these Examples include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.


While multiple implementations are disclosed, still other implementations of the present invention will become apparent to those skilled in the art from the following detailed description, which shows and describes illustrative implementations of the invention. As will be realized, the invention is capable of modifications in various obvious aspects, all without departing from the spirit and scope of the present invention. Accordingly, the drawings and detailed description are to be regarded as illustrative in nature and not restrictive.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1A is a schematic side view of the robotic surgical system, according to one embodiment.



FIG. 1B is a front view of the robotic surgical system showing the robotic device with an engaged positionable camera, according to one embodiment.



FIG. 2A is a three-quarters front view of the robotic device with an engaged positionable camera, according to one embodiment.



FIG. 2B is a three-quarters perspective view of the robot of the implementation of FIG. 2A without the camera.



FIG. 2C is a three-quarters perspective view of the camera of the implementation of FIG. 2A without the robot.



FIG. 3A is a close-up three-quarters front view of the robotic device with an engaged positionable camera, according to one embodiment.



FIG. 3B is a close-up three-quarters front view of the robotic device with an engaged positionable camera showing the degrees of freedom of the arms, according to one embodiment.



FIG. 4A is a perspective view of a surgical device showing various workspaces for one arm, according to one embodiment.



FIG. 4B is a further perspective view of the surgical device of FIG. 4A, showing the workspace of the other arm.



FIG. 5 is a perspective view of a surgical device showing various workspaces for the arms, according to one embodiment.



FIG. 6 is a further perspective view of the surgical device of FIG. 5, showing the workspace of one arm.



FIG. 7 is a zoomed-in view of the camera operations system showing the components on the camera handle, according to one embodiment.



FIG. 8A is a three-quarters perspective view of the surgical robotic device and positionable camera showing the camera field of view, according to one implementation.



FIG. 8B is a cutaway side view of the robotic surgical device comprising a positionable camera and showing a first degree of freedom, according to one embodiment.



FIG. 8C is a cutaway side view of the robotic surgical device comprising a positionable camera and showing a second degree of freedom, according to one embodiment.



FIG. 8D is a perspective three-quarters side view of the robotic surgical device comprising a positionable camera and showing coordinate reference frames, according to one embodiment.



FIG. 8E is a view from the perspective of the positionable camera inserted into the robotic surgical device and showing the end effectors within that field of view, according to one embodiment.



FIG. 8F is a view from the perspective of the positionable camera inserted into the robotic surgical device and showing the locations of the end effectors within that field of view and the generation of an origin, according to one embodiment.



FIG. 8G is a perspective three-quarters side view of the robotic surgical device comprising a positionable camera and showing coordinate reference frames and the generation of midpoint calculations, according to one embodiment.



FIG. 8H is a view from the perspective of the positionable camera inserted into the robotic surgical device and showing the end effectors within that field of view and midpoint, according to one embodiment.



FIG. 8I is a view from the perspective of the positionable camera inserted into the robotic surgical device and showing the re-positioning of the camera, according to one embodiment.



FIG. 9 is a front view of the robotic surgical system showing the robotic device with an engaged positionable camera, according to one embodiment.



FIG. 10A is a perspective view of the surgical console, according to one implementation.



FIG. 10B is a perspective view of the surgical console, according to another implementation.



FIG. 11A is a schematic view of the robot, pod and console, showing the schematic connection maps between the components, according to one implementation.



FIG. 11B is a perspective pop-out view of the interface pod on the support cart, according to one implementation.



FIG. 12 is a top view of several surgical tools, according to certain embodiments.



FIG. 13 is a perspective top view showing the installation of the surgical tools into the arms, according to one implementation.



FIG. 14 is a perspective top view showing the surgical robotic device showing the sleeves, according to one implementation.



FIG. 15 is a front view of the robotic surgical system affixed via a clamp attached to a support arm, according to one implementation.



FIG. 16 is a close-up perspective view of the robotic clamp, according to one implementation.





DETAILED DESCRIPTION

The various systems and devices disclosed herein relate to devices for use in medical procedures and systems. More specifically, various implementations relate to various medical devices, including robotic devices having tracking camera systems and related methods and systems, including, in some implementations, controlling consoles and other devices to provide complete systems.


It is understood that the various implementations of robotic devices and related methods and systems disclosed herein can be incorporated into or used with any other known medical devices, systems, and methods. For example, the various implementations disclosed herein may be incorporated into or used with any of the medical devices and systems disclosed in U.S. Pat. No. 7,492,116 (filed on Oct. 31, 2007 and entitled “Robot for Surgical Applications”), U.S. Pat. No. 7,772,796 (filed on Apr. 3, 2007 and entitled “Robot for Surgical Applications”), U.S. Pat. No. 8,179,073 (issued May 15, 2011, and entitled “Robotic Devices with Agent Delivery Components and Related Methods”), U.S. Pat. No. 8,343,171 (issued Jan. 1, 2013 and entitled “Methods and Systems of Actuation in Robotic Devices”), U.S. Pat. No. 8,679,096 (issued Mar. 25, 2014 and entitled “Multifunctional Operational Component for Robotic Devices”), U.S. Pat. No. 8,834,488 (issued Sep. 16, 2014 and entitled “Magnetically Coupleable Surgical Robotic Devices and Related Methods”), U.S. Pat. No. 8,894,633 (issued Nov. 25, 2014 and entitled “Modular and Cooperative Medical Devices and Related Systems and Methods”), U.S. Pat. No. 8,968,267 (issued Mar. 3, 2015 and entitled “Methods and Systems for Handling or Delivering Materials for Natural Orifice Surgery”), U.S. Pat. No. 8,968,332 (issued Mar. 3, 2015 and entitled “Magnetically Coupleable Robotic Devices and Related Methods”), U.S. Pat. No. 8,974,440 (issued Mar. 10, 2015 and entitled “Modular and Cooperative Medical Devices and Related Systems and Methods”), U.S. Pat. No. 9,010,214 (Apr. 21, 2015 and entitled “Local Control Robotic Surgical Devices and Related Methods”), U.S. Pat. No. 9,060,781 (issued Jun. 23, 2015 and entitled “Methods, Systems, and Devices Relating to Surgical End Effectors”), U.S. Pat. No. 9,089,353 (issued Jul. 28, 2015 and entitled “Robotic Surgical Devices, Systems, and Related Methods”), U.S. Pat. No. 9,498,292 (issued Nov. 22, 2016 and entitled “Single Site Robotic Devices and Related Systems and Methods”), U.S. Pat. No. 9,579,088 (issued Feb. 28, 2017 and entitled “Methods, Systems, and Devices for Surgical Visualization and Device Manipulation”), U.S. Pat. No. 9,743,987 (Aug. 29, 2017 and entitled “Methods, Systems, and Devices Relating to Robotic Surgical Devices, End Effectors, and Controllers”), U.S. Pat. No. 9,770,305 (issued Sep. 26, 2017 and entitled “Robotic Surgical Devices, Systems, and Related Methods”), and U.S. Pat. No. 9,888,966 (issued Feb. 13, 2018 and entitled “Methods, Systems, and Devices Relating to Force Control Surgical Systems), all of which are hereby incorporated herein by reference in their entireties.


Further, the various implementations disclosed herein may be incorporated into or used with any of the medical devices and systems disclosed in copending U.S. Published Applications 2014/0046340 (filed Mar. 15, 2013 and entitled “Robotic Surgical Devices, Systems, and Related Methods”), 2014/0058205 (filed Jan. 10, 2013 and entitled “Methods, Systems, and Devices for Surgical Access and Insertion”), 2014/0303434 (filed Mar. 14, 2014 and entitled “Robotic Surgical Devices, Systems, and Related Methods”), 2015/0051446 (filed Jul. 17, 2014 and entitled “Robotic Surgical Devices, Systems, and Related Methods”), 2016/0074120 (filed Sep. 14, 2015, and entitled “Quick-Release End Effectors and Related Systems and Methods”), 2016/0135898 (filed Nov. 11, 2015 entitled “Robotic Device with Compact Joint Design and Related Systems and Methods”), 2016/0157709 (filed Feb. 8, 2016 and entitled “Medical Inflation, Attachment, and Delivery Devices and Related Methods”), 2017/0035526 (filed Aug. 3, 2016 and entitled “Robotic Surgical Devices, Systems, and Related Methods”), 2017/0354470 (filed May 18, 2017 and entitled “Robotic Surgical Devices, Systems, and Related Methods”), 2018/0055584 (filed Aug. 30, 2017 and entitled “Robotic Device with Compact Joint Design and an Additional Degree of Freedom and Related Systems and Methods”), 2018/0056527 (filed Aug. 25, 2017 and entitled “Quick-Release End Effector Tool Interface”), 2018/0140377 (filed Nov. 22, 2017 and entitled “Gross Positioning Device and Related Systems and Methods”), 2018/0147019 (filed Nov. 29, 2017 and entitled “User Controller with User Presence Detection and Related Systems and Methods”), and 2018/0161122 (filed Dec. 14, 2017 and entitled “Releasable Attachment Device for Coupling to Medical Devices and Related Systems and Methods”), all of which are hereby incorporated herein by reference in their entireties. In addition, the various implementations disclosed herein may be incorporated into or used with any of the medical devices and systems disclosed in U.S. Application 62/614,127 (filed Jan. 5, 2018), which is hereby incorporated herein by reference in its entirety.


Certain device and system implementations disclosed in the patents and/or applications listed above can be positioned within a body cavity of a patient in combination with a support component similar to those disclosed herein. An “in vivo device” as used herein means any device that can be positioned, operated, or controlled at least in part by a user while being positioned within a body cavity of a patient, including any device that is coupled to a support component such as a rod or other such component that is disposed through an opening or orifice of the body cavity, also including any device positioned substantially against or adjacent to a wall of a body cavity of a patient, further including any such device that is internally actuated (having no external source of motive force), and additionally including any device that may be used laparoscopically or endoscopically during a surgical procedure. As used herein, the terms “robot,” and “robotic device” shall refer to any device that can perform a task either automatically or in response to a command.


Certain implementations provide for insertion of the present invention into the cavity while maintaining sufficient insufflation of the cavity. Further implementations minimize the physical contact of the surgeon or surgical users with the present invention during the insertion process. Other implementations enhance the safety of the insertion process for the patient and the present invention. For example, some implementations provide visualization of the present invention as it is being inserted into the patient's cavity to ensure that no damaging contact occurs between the system/device and the patient. In addition, certain implementations allow for minimization of the incision size/length. Further implementations reduce the complexity of the access/insertion procedure and/or the steps required for the procedure. Other implementations relate to devices that have minimal profiles, minimal size, or are generally minimal in function and appearance to enhance ease of handling and use.


As in manual laparoscopic procedures, a known insufflation system can be used to pump sterile carbon dioxide (or other gas) into the patient's abdominal cavity. This lifts the abdominal wall from the organs and creates space for the robot. In certain implementations, the system has no direct interface with the insufflation system. Alternatively, the system can have a direct interface to the insufflation system.


In certain implementations, the insertion port is a known, commercially-available flexible membrane placed transabdominally to seal and protect the abdominal incision. This off-the-shelf component is the same device used in the same way for Hand-Assisted Laparoscopic Surgery (HALS). The only difference is that the working arms of the robot are inserted into the abdominal cavity through the insertion port rather than the surgeon's hand. The robot body seals against the insertion port, thereby maintaining insufflation pressure. The port is single-use and disposable. Alternatively, any known port can be used.


Certain implementations disclosed herein relate to “combination” or “modular” medical devices that can be assembled in a variety of configurations. For purposes of this application, both “combination device” and “modular device” shall mean any medical device having modular or interchangeable components that can be arranged in a variety of different configurations, and the related systems. The modular components and combination devices disclosed herein also include segmented triangular or quadrangular-shaped combination devices. These devices, which are made up of modular components (also referred to herein as “segments”) that are connected to create the triangular or quadrangular configuration, can provide leverage and/or stability during use while also providing for substantial payload space within the device that can be used for larger components or more operational components. As with the various combination devices disclosed and discussed above, according to one implementation these triangular or quadrangular devices can be positioned inside the body cavity of a patient in the same fashion as those devices discussed and disclosed above.


The various system implementations described herein are used to perform robotic surgery. Further, the various implementations disclosed herein can be used in a minimally invasive approach to a variety of procedures that are typically performed “open” by known technologies, with the potential to improve clinical outcomes and health care costs, including, for example, general surgery applications in the abdominal cavity, such as, for example, colon resection and other known procedures. Further, the various implementations disclosed herein can be used in place of the known mainframe-like laparoscopic surgical robots that reach into the body from outside the patient. That is, the less-invasive robotic systems, methods, and devices according to the implementations disclosed herein feature small, self-contained surgical devices that are inserted in their entireties through a single incision in the patient's abdomen. Designed to utilize existing tools and techniques familiar to surgeons, the devices disclosed herein will not require a dedicated operating room or specialized infrastructure, and, because of their much smaller size, are expected to be significantly less expensive than existing robotic alternatives for laparoscopic surgery. Due to these technological advances, the various implementations herein could enable a minimally invasive approach to procedures performed in open surgery today. In certain implementations, the various systems described herein are based on and/or utilize techniques used in manual laparoscopic surgery including insufflation of the abdominal cavity and the use of ports to insert tools into the abdominal cavity.


As will be described in additional detail below, components of the various system implementations disclosed or contemplated herein include a control console and a robot having a tracking camera system. The robot implementations are constructed and arranged to be inserted into the insufflated abdominal cavity. The tracking camera system can be an integrated camera system that captures a view of the surgical target and can be manually or automatically controlled to track and capture an ongoing view of the arms and/or end effectors of the robotic device. The surgeon can then use that view on a display to help control the robot's movements. In certain implementations, the camera is designed so that it can be removed so it can be cleaned and used in other applications.


In other implementations as will be discussed in further detail herein, the system can include disposable or permanent sleeves positioned on or attached to the robotic device, an electro-surgery cautery generator, an insertion port, a support arm/structure, a camera, remote surgical displays, end-effectors (tools), an interface pod, a light source, and other system components.


The various implementations are disclosed in additional detail in the attached figures, which may include some written description therein.


According to one implementation, the Robotically Assisted Surgical Device (RASD) system 1 has several components. In one such implementation, and as shown in FIG. 1A and FIG. 1B, a surgical robotic device 10 having a robotically articulated camera 12 disposed therein and an external surgeon control console 100 are provided. In the implementation of FIG. 1A, the robotic device 10 and the camera 12 are shown mounted to the operating table 2 using a robot support arm 4, in accordance with one implementation. The system 1 can be, in certain implementations, operated by the surgeon and one surgical assistant.



FIG. 1B and FIG. 2A depict exemplary implementations of the robotic device 10 having a body 10A (or torso) having a distal end 10B and proximal end 10C, with the camera 12 disposed therein, as has been previously described. Briefly, the robotic device 10 has two robotic arms 14, 16 operably coupled thereto and a camera component or "camera" 12 disposed between the two arms 14, 16 and positionable therein. That is, device 10 has a first (or "right") arm 14 and a second (or "left") arm 16, both of which are operably coupled to the device 10 as discussed in additional detail below. The device 10 as shown has a casing (also referred to as a "cover" or "enclosure") 11. The device 10 is also referred to as a "device body" 10A and has two rotatable cylindrical components (also referred to as "shoulders" or "turrets"): a first (or "right") shoulder 14A and a second (or "left") shoulder 16A. Each arm 14, 16 also has an upper arm (also referred to herein as an "inner arm," "inner arm assembly," "inner link," "inner link assembly," "upper arm assembly," "first link," or "first link assembly") 14B, 16B, and a forearm (also referred to herein as an "outer arm," "outer arm assembly," "outer link," "outer link assembly," "forearm assembly," "second link," or "second link assembly") 14C, 16C. The right upper arm 14B is operably coupled to the right shoulder 14A of the body 10A at the right shoulder joint 14D and the left upper arm 16B is operably coupled to the left shoulder 16A of the body 10A at the left shoulder joint 16D. Further, for each arm 14, 16, the forearm 14C, 16C is rotatably coupled to the upper arm 14B, 16B at the elbow joint 14E, 16E.


In various implementations, the device 10 and each of the links of the arms 14, 16 contain a variety of actuators or motors. In one embodiment, any of the motors discussed and depicted herein can be brush or brushless motors. Further, the motors can be, for example, 6 mm, 8 mm, or 10 mm diameter motors. Alternatively, any known size that can be integrated into a medical device can be used. In a further alternative, the actuators can be any known actuators used in medical devices to actuate movement or action of a component. Examples of motors that could be used for the motors described herein include the EC 10 BLDC+GP10A Planetary Gearhead, EC 8 BLDC+GP8A Planetary Gearhead, or EC 6 BLDC+GP6A Planetary Gearhead, all of which are commercially available from Maxon Motors, located in Fall River, Mass. There are many ways to actuate these motions, such as with DC motors, AC motors, permanent magnet DC motors, brushless motors, pneumatics, cables to remote motors, hydraulics, and the like.


In these implementations, the robotic device 10 and camera 12 are both connected to the surgeon console using a cable: the robot cable 8A and camera cable 8B. Alternatively, any connection configuration can be used. In certain implementations, the system can also interact with other devices during use such as an electrosurgical generator, an insertion port, and auxiliary monitors.


As shown in FIG. 1B, the camera 12 comprises a camera latch 32 and insertion 34 and retraction 36 controls or buttons. The robotic device 10 is supported by a support arm 4 that is clamped to the operating table (shown in FIG. 1A at 2). In these implementations, a robot clamp 150 is used to connect the support arm 4 to an acceptance ring 154 on the robot handle or body 10A.


According to the implementations of FIG. 1B and FIG. 2A, the arms 14, 16 each have active degrees of freedom and an additional active joint 14F, 16F to actuate the end effectors, or tools 18, 20. It is understood that more or fewer degrees of freedom could be included. The device in this implementation has a connection line 8 (also referred to as a "pigtail cable") (partially shown) that includes electrical power, electrocautery, and information/communication signals. In certain implementations, the device has distributed control electronics and software to help control the device 10. Some buttons can be included to support insertion and extraction of the device into and out of the abdominal cavity. In this implementation, the integrated camera 12 is also shown inserted in the device body 10A. When inserted into the body 10A, the camera 12 has a handle or body 12A that extends proximally from the proximal body end 10C and a flexible camera imager 12B extending from the distal body end 10B.



FIGS. 2B and 2C depict the robotic device 10 with the camera assembly 12 removed, according to one implementation. In these implementations, and as shown in FIG. 2 and FIGS. 3-4, the camera imager 12B is designed to be positioned between the two arms 14, 16 and capture that view between the two arms 14, 16. In these implementations, the camera 12 extends through the robot body 10A such that the camera imager 12B exits near the joints between the body and the robotic arms (the “shoulder” joints 14A, 16A). The camera 12 has a flexible, steerable tip 12C to allow the user to adjust the viewing direction. The end effectors 18, 20 on the distal end of the arms 14, 16 can include various tools 18, 20 (scissors, graspers, needle drivers and the like). In certain implementations, the tools 18, 20 are designed to be removable by a small twist of the tool knob that couples the end effector to the arm 14, 16.


As is shown in FIGS. 2B and 2C, the camera assembly 12 has a handle 12A and a long shaft 12D with the camera imager 12B at the distal tip 12C. In various implementations, the flexible tip 12C and therefore camera imager 12B can be steered or otherwise moved in two independent directions in relation to the shaft 12D at a flexible section 12E (black section on shaft) to change the direction of view. In certain implementations, the camera 12 has some control buttons 12F as shown. In some implementations, the camera assembly 12 can be used independently of the robotic device 10 as shown in FIG. 2C.


Alternatively, the assembly can be inserted into the robotic device 10 through a lumen 10D defined through the body 10A of the robotic device 10 as shown. In certain implementations, the lumen 10D includes a seal/port 10E to ensure that the patient's cavity remains insufflated (as shown in relation to FIG. 1B). According to one implementation, the robotic device 10 can have a sensor to determine if the camera is positioned in the camera lumen 10D of the device 10.


In use, the distal portion of the robotic device 10 is inserted inside the body of the patient. Thereafter, the robot and camera can both be controlled by the surgeon via the surgeon console sitting outside the sterile field. The surgeon console has user input devices (e.g., joysticks) that allow the surgeon to control the motion of the robot, as described in detail below. There are also pedal inputs and a touchscreen that control device 10 functions in certain implementations, as shown in FIGS. 11A-11B. The console can have a main display that provides images of the surgical environment via the robot camera.


It is understood that in the described implementations, the robotic device 10 has a pair of miniaturized human-like arms 14, 16 attached to a central body or handle 10A, as shown in FIG. 1B, FIG. 2A, FIG. 2B and FIG. 3A and FIG. 3B. Alternatively, any in vivo robot can be utilized with the system implementations disclosed or contemplated herein.


The robot handle 10A in the implementation of FIGS. 1B-3B has a lumen 10D (shown in FIG. 2B) and docking feature that allows the camera 12 to be inserted and removed from the body 10A while maintaining abdominal insufflation. When inserted (as shown in FIGS. 1B and 2A), the camera 12 has an articulating tip 12B that can include a light source and allows the surgeon to view the surgical tools 18, 20 and the surgical environment.


In these implementations, the camera 12 can be locked in place and can be removed using a latch button 32 on the camera handle 12A or elsewhere. In these implementations, the surgical robotic device is supported by a support arm 4 that is clamped to the operating table 2. As described in relation to FIGS. 15 and 16, a robot clamp is used to connect the support arm to an acceptance ring on the robot handle. Alternatively, the robotic device 10 can be supported via any known support component.


As shown in FIG. 3A and FIG. 3B, in use, after the camera 12 is inserted into the robot body 10A, the distal tip of the camera 12 passes through a lumen in the robot and extends into the surgical environment. The distal tip 12B of the camera 12 can then be actuated to provide views of the surgical tools and surgical target. It is understood that the camera 12 can be used with any similar robotic device having a camera lumen defined therethrough.


Each robot arm 14, 16 in this implementation has six degrees of freedom, including the open/close function of the tool, as shown in FIG. 3B. The robot shoulder is approximately a spherical joint similar to a human shoulder. The shoulder can yaw (J1), pitch (J2), and roll about the upper arm segment (J3). These first three axes of rotation roughly intersect at the shoulder joint. The robot elbow (J4) allows rotation of the forearm with respect to the upper arm. Finally, the tool can roll (J5) about the long axis of the tool and some tools have an open/close actuation function. In contrast, it is understood that a hook cautery tool does not open/close.


The surgical robot in this implementation has significant dexterity. As shown in FIG. 4A and FIG. 4B, the six degrees of freedom described above allow the robot's arms 14, 16 to reach into the confined spaces of the abdominal cavity.



FIGS. 4A, 4B, 5 and 6 schematically depict the entire workspace 30 as well as the individual reachable workspaces 30A, 30B of each of the arms 14, 16 of a robotic device 10, according to certain implementations. In these implementations, "workspace" 30 means the space 30 around the robotic device 10 in which either arm and/or end effector 18, 20 can move, access, and perform its function within that space.



FIG. 5 shows the regions that can be reached by the left arm and by the right arm. More specifically, FIG. 5 depicts a perspective view of the device body 10A and further schematically shows the entire workspace 30 as well as the individual workspaces 30A, 30B of the first arm 14 and second arm 16, respectively. Note that each arm 14, 16 has a range of motion and corresponding workspace 30A, 30B that extends from the front 22 of the device to the back 24 of the device 10. Thus, each arm 14, 16 can reach equally to the front 22 and the back 24, through about 180° of space relative to the axis of the device body 10A. This workspace 30 allows the robotic device to work to the front 22 and back 24 equally well without having to reposition the body 10A. The overlap of these volumes represents a region that is reachable by both the left and right arms and is defined as the bi-manual robot workspace. The surgeon will have full robot dexterity when working in this bi-manual region.


As best shown in FIG. 6, the overlap of the ranges of motion for the individual arms in these implementations also enables an intersecting, or bi-manual workspace 30C (as is also shown in FIG. 5). It is understood that the intersecting workspace 30C in these implementations encompasses the workspace 30C reachable by both arms 14, 16 and end effectors 18, 20 in any individual device 10 position. Again, in these implementations, the intersecting workspace 30C includes a range of about 180° of space relative to the axis of the device body 10A.


The bi-manual workspace 30C is approximated by an ellipse that is rotated 180 degrees about the shoulder pitch joint (J2 in FIG. 3B) and is shown in FIG. 6. For one design, the ellipse is approximately 4.5″ (11.5 cm) on the major axis and 3.25″ (8.25 cm) on the minor axis. The bi-manual workspace 30C extends from in front of the robotic device 10 to below the robot and also behind the back of the robot. This dexterity of the robotic arms 14, 16 allows the surgeon to operate the arms 14, 16 to work equally well anywhere inside this bi-manual workspace 30C.


In addition, according to this implementation, the surgical robotic device 10 can reach any area of the abdominal cavity because it can be easily repositioned during the procedure via "gross positioning." That is, the device 10 can quickly, in a matter of seconds, be moved by adjusting the external support arm 4 and robot clamp 150. The combination of gross positioning of the robotic device 10 and the dexterity of the robot arms 14, 16 allow the surgeon to place the device 10 so it can work anywhere in the abdominal cavity with the arms 14, 16 well triangulated for the given procedure, as discussed below.


Turning to the insertion of the device 10 and camera 12 in greater detail, FIG. 7 depicts a detailed view of the handle 12A according to certain implementations. In FIG. 7, the camera 12 has a camera latch 32 and insertion 34 and retraction 36 controls or buttons. The robotic device 10 is supported by a support arm 4 that is clamped to the operating table (shown in FIG. 1A at 2). In these implementations, a robot clamp 150 is used to connect the support arm 4 to an acceptance ring 154 on the robot handle or body 10A.


In various implementations of the system 1, the device 10 is inserted into the abdomen of the patient by executing a series of configurations and/or arm positions. In certain implementations, the insertion 34 and retraction 36 controls or buttons allow the physician or user to execute the respective insertion and retraction steps/positions, as would be understood. Further, in certain implementations, the camera latch 32 toggles the internal components of the device 10 and/or camera 12 into "locked" or "unlocked" positions, thereby securing the camera 12 within the device 10 or allowing it to be freely removed from the camera lumen, as would be understood.


Various implementations of the surgical robotic device 10 have an indicator light 38 or lights 38 disposed at the proximal end 10C of the device 10 and constructed and arranged to indicate any state of the device; the light 38 can be any color, any intensity, or of varying intensity. In certain implementations, LED lights or similar lighting components can be used, as would be appreciated by those of skill in the art.


In various implementations, the robotically articulated camera 12 is part of a system 1 to provide visual feedback to the surgeon from the perspective of the camera 12. In one specific implementation, the camera provides 1080p, 60 Hz digital video. Alternatively, the camera can provide any known video quality.


As is shown in the implementation of FIG. 8A, the camera 12 is constructed and arranged to be inserted into a lumen in the robot base link as shown in FIG. 1B so that the tip 12B of the camera is always positioned between the two robot arms 14, 16 and so that the camera 12 has a field of view (shown with reference letter C in FIG. 8A).


It is likewise understood that when the robotic device 10 is repositioned during surgery, the camera 12 and robotic device 10 can move together or in a coordinated fashion in this configuration. This results in coordinated triangulation between the robot and tools 18, 20 for any configuration, positioning, and use of the device 10.


In accordance with certain implementations, the camera 12 is designed to visualize all possible positions of the robot's tools 18, 20. Accordingly, the camera tip 12B can be robotically articulated so as to reposition the field of view (C). It is understood that in certain implementations, the surgeon controls this movement via the surgeon console 100 (described in detail in relation to FIG. 11A and FIG. 11B).


As shown in the implementations of FIG. 8B and FIG. 8C, the camera 12 can move in pitch (screen up/down) and/or yaw (screen left/right), respectively, which may also be referred to as tilt and pan, respectively. In certain implementations, the system uses an articulating camera 12, as has been previously described. Briefly, in these implementations, the camera 12 articulates to ensure the surgeon can view all possible locations of the robot arms 14, 16 as well as the desired areas of the surgical theater.


As mentioned above, the approximate camera field of view (C) for a given location of the camera is shown in the implementation of FIG. 8A. The camera field of view (C) is about 100 degrees in this implementation, as defined by the angle θ1 measured along the diagonal of the rectangular cross section of the view. Any other known field of view angle can be used, and it is appreciated that many other angles are possible. In these implementations, it is understood that the surgeon/user is able to view both robot end effectors 18, 20 over a wide range of working distances.
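For illustration only (this pinhole-model relation and the 16:9 aspect ratio below are editorial assumptions, not part of the original disclosure), a diagonal field of view θ1 on a rectangular imager of width w, height h, and diagonal d corresponds to horizontal and vertical fields of view θh and θv given by:

$$\tan\frac{\theta_h}{2}=\frac{w}{d}\tan\frac{\theta_1}{2},\qquad \tan\frac{\theta_v}{2}=\frac{h}{d}\tan\frac{\theta_1}{2},\qquad d=\sqrt{w^{2}+h^{2}}.$$

For a hypothetical 16:9 imager with θ1 = 100 degrees, this works out to roughly a 92-degree horizontal and 61-degree vertical field of view.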


Further, as the robotic device 10 makes large motions with its arms 14, 16—like those described in FIGS. 5 & 6—the robot camera tip 12B can be moved using active joints in coordination with the large arm movements to view the entire robot workspace. In certain implementations, the joints of the camera are actively controlled using motors and sensors and a control algorithm implemented on a processor.


The system 1 according to certain implementations has a processor constructed and arranged to execute such a control algorithm. The control algorithm can be provided on computer-readable medium on a processor optionally having an operating system, memory, an input/output interface and the like, as would be appreciated by one of skill in the art. The processor in various implementations can be disposed in the camera handle 12A, device body 10A, in the surgical console 100 or elsewhere, as would be appreciated by those of skill in the art. For purposes of the discussed implementations, the processor is located inside the surgical console 100 as would be readily appreciated.


In these implementations, the control algorithm allows for automated and/or semi-automated positioning and re-positioning of the camera 12 about the pitch (α) and/or yaw (β) rotations shown in FIGS. 8B and 8C, relative to the robotic device 10. This two-degree-of-freedom (2-DOF) system can also be constructed and arranged to translate the camera tip 12B as the robotic device 10 articulates. It is understood that alternative designs are possible.


In the implementation of FIG. 8B and FIG. 8C, the system 1 executes a control algorithm such as the algorithm discussed above. According to these implementations, the camera 12 is capable of rotating relative to the robot body 10A so as to direct or "point" the camera 12 in various directions to alter the field of view. In this implementation, a robot coordinate frame {R} is affixed to the robot body 10A and is defined by orthogonal unit vectors xR, yR, and zR. A camera coordinate frame {C} is defined with relation to the location of the imaging tip 12B of the camera. In this implementation, the {C} frame is defined by the orthogonal unit vectors xC, yC, and zC, as shown in FIGS. 8B-8D.


In this implementation, the xC axis is located so as to extend outward from the imaging tip 12B as an extension of the longitudinal axis of the camera 12 and thus point directly in line with the field of view of the camera 12 (as shown in FIG. 8A at C). The yC axis points directly to the left of the camera image and the zC axis is vertical when viewed by the camera imager. The {C} frame is shown from the perspective of the camera in FIG. 8E.


In this implementation, two angles are defined to describe the 2-DOF rotation of the camera frame {C} relative to the robot frame {R}: a first angle α and a second angle β. Many angles can be used, but in this representative implementation, fixed angles are used and are described by rotations about the xR and yR axes.


The first angle α is defined as a rotation of the camera tip 12B (xC axis) relative to the xR axis about the yR axis, as is shown in FIG. 8B. The second angle β is defined as a rotation of the camera tip 12B (xC axis) relative to the yR axis about the xR axis, as is shown in FIG. 8C.
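As an editorial illustration of one possible parameterization (the composition order and sign conventions below are assumptions; the description does not specify them), the orientation of the camera frame {C} relative to the robot frame {R} can be written as the composition of a rotation by β about xR and a rotation by α about yR:

$$R^{R}_{C}=R_{x}(\beta)\,R_{y}(\alpha),\qquad R_{x}(\beta)=\begin{bmatrix}1&0&0\\0&\cos\beta&-\sin\beta\\0&\sin\beta&\cos\beta\end{bmatrix},\qquad R_{y}(\alpha)=\begin{bmatrix}\cos\alpha&0&\sin\alpha\\0&1&0\\-\sin\alpha&0&\cos\alpha\end{bmatrix}.$$

Under this assumed convention, the camera boresight xC expressed in {R} is the first column of $R^{R}_{C}$, namely $(\cos\alpha,\ \sin\alpha\sin\beta,\ -\sin\alpha\cos\beta)^{T}$, which reduces to xR when α = β = 0.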


In these implementations, the system can generate coordinate transformations from one of the camera frame {C} and/or the robot frame {R} to the other—or to any other coordinate frame.


As shown in the implementations of FIGS. 8D-8I, the system 1 according to certain implementations can be constructed and arranged to execute a control algorithm and move the camera 12 and arms 14, 16 in response to the defined camera frame {C} and/or the robot frame {R}. That is, it is understood that in certain implementations, the surgeon or user commands robotic device 10 motion based on images returned by the camera 12, and that the system 1 is constructed and arranged to adjust the locations of various reference frames and components, as described herein.


According to certain of these implementations, the camera frame {C} is fixed to the camera tip 12B so that it does not move relative to the view provided to the surgeon.


As shown in FIG. 8E, the system 1 according to these implementations establishes an origin (shown at X) of the camera frame {C} at the intersection of the xC, yC, and zC axes. Likewise, the robot frame {R} establishes a reference point or origin relative to the position of the device components for coordinated translation between the frames {C}, {R}, as would be understood.


Continuing with the implementation of FIG. 8E, the locations PL and PR of the end effectors 18, 20 can then be located within the camera frame {C}. The location of the end effectors 18, 20 is known in the robot frame {R} as that is what is controlled to operate the robot. Then a coordinate transformation is established between the {R} frame and the {C} frame to locate the position of the end effectors 18, 20 in the camera frame.
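A minimal sketch of how such a frame-to-frame transformation might be expressed in code is given below; it is an editorial illustration rather than the patent's implementation, and the camera pose, point values, and names such as make_transform and to_camera_frame are hypothetical assumptions:

import numpy as np

def make_transform(R, t):
    # Build a 4x4 homogeneous transform from a 3x3 rotation R and a translation t.
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def to_camera_frame(p_robot, T_robot_to_camera):
    # Express a point given in the robot frame {R} in the camera frame {C}.
    p_h = np.append(p_robot, 1.0)  # homogeneous coordinates
    return (T_robot_to_camera @ p_h)[:3]

# Hypothetical camera pose in {R}: pitched by alpha about yR and offset 5 cm along zR.
alpha = np.deg2rad(20.0)
R_cam = np.array([[ np.cos(alpha), 0.0, np.sin(alpha)],
                  [ 0.0,           1.0, 0.0          ],
                  [-np.sin(alpha), 0.0, np.cos(alpha)]])
t_cam = np.array([0.0, 0.0, 0.05])
T_robot_to_camera = np.linalg.inv(make_transform(R_cam, t_cam))

P_L = np.array([0.10,  0.05, 0.0])   # left end effector location in {R} (illustrative, meters)
P_R = np.array([0.10, -0.05, 0.0])   # right end effector location in {R} (illustrative, meters)
P_L_cam = to_camera_frame(P_L, T_robot_to_camera)
P_R_cam = to_camera_frame(P_R, T_robot_to_camera)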


It is understood that the positioning of the camera 12 according to these implementations can be controlled and/or planned using several approaches. One approach is to allow the user to control the position of the camera 12 via an input device operably coupled to the console 100, and as described in detail in relation to FIG. 11A and FIG. 11B. Some non-limiting examples of the input device include, for example, a hand or foot controlled joystick. Further implementations have independent joystick-like devices that control the various motions—for example pitch α and yaw β—of the camera. A further approach includes toggling the function of one of the robot hand controllers and/or pedal to then temporarily use the hand controller to command the motion of the camera 12.


In further alternate implementations, additional data relating to the position of the camera 12 and other components such as the arms 14, 16 can be used to establish the reference frames {R}, {C} to choose the direction of the camera 12. These implementations can include end effector 18, 20 positions and velocities as well as many factors associated with the motion of the tools, as would be appreciated by those of skill in the art.


A further approach according to certain implementations is to control the movement of the camera 12 to be fixed on the end effectors 18, 20. When viewed from the camera perspective C according to these implementations, the end effector 18, 20 locations are defined as PL and PR, where PL and PR are vectors containing the x, y, and z coordinates of the location of the respective points. These can be detected via the camera 12 and their position can be established in the camera frame, as is shown in FIG. 8D, FIG. 8E and FIG. 8F.


In various of these implementations, it is therefore possible to calculate the midpoint, Midpoint PLPR, between the end effectors in the camera frame, as shown in FIG. 8G. In these implementations, a line is created between the left 20 and right 18 end effector locations PL and PR, as is shown in FIG. 8G. The midpoint of that line, CP, can then be located in the camera coordinate frame (or in any other frame, using known coordinate transformation matrices), where:







$${}^{C}\overline{P} \;=\; \operatorname{Midpoint}\;\overline{P_{R}P_{L}} \;=\; \begin{bmatrix} P_{x} \\ P_{y} \\ P_{z} \end{bmatrix}$$






Using these reference frames, it is possible to re-position an initial camera view C1 to a second camera view C2 via coordinate transformations to ensure the camera 12 remains centered on the tools 18, 20. For example, as is shown in FIG. 8H, when viewed from the initial camera view C1, the {R} midpoint PRPL can be observed relative to the camera coordinate frame {C}. It is understood that the {C} reference frame origin Xc is not aligned with the midpoint PRPL established by the {R} reference frame.


The camera 12 can then be re-positioned so as to zero the origin point Xc of the camera to the midpoint PRPL of the two tools 18, 20 via coordinate transformations, as is shown in FIG. 8I at XC1→XC2. This motion can also be damped. In these implementations, the system 1 retards the motion of the camera tip 12B by reducing the motion of the tip with a term proportional to the velocity of the tip, as would be understood.
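One way such centering with velocity-proportional damping might be expressed is sketched below; the gains, sign conventions, and names such as camera_centering_step are illustrative assumptions and not the patent's implementation:

import numpy as np

def camera_centering_step(midpoint_cam, tip_velocity, gain=0.5, damping=0.2):
    # midpoint_cam: midpoint of PL and PR expressed in the camera frame {C},
    #               with xC along the camera boresight.
    # tip_velocity: current (pitch, yaw) rates of the camera tip.
    # Returns incremental (pitch, yaw) commands that drive the view center toward
    # the midpoint, reduced by a term proportional to tip velocity (the damping above).
    yaw_error = np.arctan2(midpoint_cam[1], midpoint_cam[0])    # left/right offset
    pitch_error = np.arctan2(midpoint_cam[2], midpoint_cam[0])  # up/down offset
    return gain * np.array([pitch_error, yaw_error]) - damping * np.asarray(tip_velocity)

# Hypothetical example: midpoint 2 cm left of and 1 cm above the boresight, 10 cm ahead.
delta_pitch, delta_yaw = camera_centering_step(
    midpoint_cam=np.array([0.10, 0.02, 0.01]), tip_velocity=(0.0, 0.0))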


In further implementations involving the control of the camera 12, a running average position of each right 18 and left 20 end effector is calculated. In these implementations, the difference between the average position and the actual position is calculated for each arm 14, 16. If the difference is greater than a threshold value, the arm is considered to be moving. In these implementations, camera actuation outputs are calculated via the kinematics of the camera as compared with a target position. When only one arm is moving, the target position is the position of only the moving arm. If both arms are moving, the midpoint between the two end-effector positions is used as the target position, as would be understood.


Implementations such as these, using running-average kinematic control, can execute pseudo-code such as the following:














algorithm kinematics is


 input: point at which to aim camera, pos


 output: camera angles to point camera at pos, theta1, theta2, . . .


 set theta1, theta2, . . . based on camera kinematics and pos


 return theta1, theta2, . . .


algorithm cameraTracking is


 input: left and right end effector positions, posL & posR


 output: camera actuation angles, theta1, theta2, . . .


 enqueue posL into FIFO array of fixed size, arrayL


 set avgL to average of arrayL


 set diffL to difference of avgL & posL


 enqueue posR into FIFO array of fixed size, arrayR


 set avgR to average of arrayR


 set diffR to difference of avgR & posR


 if diffL is greater than movementThreshold


  set movingL to true


 else


  set movingL to false


 if diffR is greater than movementThreshold


  set movingR to true


 else


  set movingR to false


 if movingR is true and movingL is false


  return kinematics(posR)


 else if movingL is true and movingR is false


  return kinematics(posL)


 else if movingL is true and movingR is true


  set midPos to average of posR & posL


  return kinematics(midPos)
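For readers who prefer working code, the pseudo-code above can be rendered in Python roughly as follows; this is an editorial sketch, with the window size, movement threshold, and the placeholder kinematics routine all assumed rather than taken from the disclosure:

from collections import deque
import numpy as np

WINDOW = 20                 # FIFO size for the running average (assumed)
MOVEMENT_THRESHOLD = 0.005  # meters (assumed)

array_l = deque(maxlen=WINDOW)   # arrayL in the pseudo-code
array_r = deque(maxlen=WINDOW)   # arrayR in the pseudo-code

def kinematics(pos):
    # Placeholder inverse kinematics: camera angles (theta1, theta2) aiming at pos.
    # A real implementation would use the camera's joint geometry; here the target
    # point is simply converted to yaw and pitch angles about the camera base.
    theta1 = np.arctan2(pos[1], pos[0])   # yaw toward the target
    theta2 = np.arctan2(pos[2], pos[0])   # pitch toward the target
    return theta1, theta2

def camera_tracking(pos_l, pos_r):
    # Return camera actuation angles tracking whichever end effector(s) are moving.
    pos_l, pos_r = np.asarray(pos_l), np.asarray(pos_r)
    array_l.append(pos_l)
    array_r.append(pos_r)

    moving_l = np.linalg.norm(np.mean(array_l, axis=0) - pos_l) > MOVEMENT_THRESHOLD
    moving_r = np.linalg.norm(np.mean(array_r, axis=0) - pos_r) > MOVEMENT_THRESHOLD

    if moving_r and not moving_l:
        return kinematics(pos_r)
    if moving_l and not moving_r:
        return kinematics(pos_l)
    if moving_l and moving_r:
        return kinematics(0.5 * (pos_l + pos_r))
    return None  # neither arm is moving: hold the current camera pose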









Alternatively, other clinical and robotic factors can be used to determine the camera location. For example, the velocity/position and/or the velocity/position history can be considered in the commanded camera position. In constructing and arranging the system, it is understood that a tool that moves quickly, often, or constantly, or other factors could “pull” the camera toward that tool, and that a more stationary tip may not hold the camera as close.
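As a brief, hedged sketch of that idea (the weighting scheme and names below are editorial assumptions, not the patent's method), a camera target can be biased toward the faster-moving tool by weighting each end-effector position by its recent speed:

import numpy as np

def weighted_camera_target(pos_l, pos_r, speed_l, speed_r, eps=1e-6):
    # Each end effector is weighted by its recent speed, so a tool that moves quickly
    # "pulls" the camera target toward it, while a stationary tool holds it less strongly.
    w_l, w_r = speed_l + eps, speed_r + eps
    return (w_l * np.asarray(pos_l) + w_r * np.asarray(pos_r)) / (w_l + w_r)

# Hypothetical example: the right tool moving twice as fast pulls the target rightward.
target = weighted_camera_target([0.10, 0.05, 0.0], [0.10, -0.05, 0.0],
                                speed_l=0.01, speed_r=0.02)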


Further, it is well appreciated that various machine learning techniques or other algorithms can be used to determine the orientation of the camera 12. This could include neural networks, genetic algorithms, or many other machine learning algorithms known and appreciated in the art.


Alternatively, the surgeon may also choose to remove the camera 12 from the robotic device 10 and use it in another, known laparoscopic port 8 like a standard manual laparoscope as shown in FIG. 9. It is understood that this perspective may be useful to visualize the robotic device 10 to ensure safe insertion and extraction via the main port 6. The camera 12 according to these implementations can also be removed from the robotic device 10 so the optics can be cleaned.


In certain implementations, the robotic device is piloted from the surgeon console 100 as shown in FIG. 10A. This exemplary implementation of the surgeon console 100 contains a main computer 102 that performs robot control functions and system monitoring. In these implementations, the surgeon views the surgical environment using the output of the robotically articulated camera shown on a high-definition real-time display 104. Several functions of the console and robot are controlled through a touch screen interface 106. The touch screen 106 is also used to display some information about the state of the robot. Alternatively, any known console can be used with the various implementations of the system disclosed or contemplated herein.


The device 10 and camera 12 motion are controlled in this implementation via the surgeon console 100 with left and right hand input devices 108. The input devices 108 interface with the surgeon's hands and monitor the surgeon's movement. As has been previously described, the input devices 108 have a surgeon presence sensor to indicate the surgeon's hands are properly engaged. The devices can also provide haptic feedback by pushing on the surgeon's hands to indicate things such as workspace boundaries and to prevent collisions between the robot arms, as was also described in the incorporated references. These input devices 108 also control open/close functions of the robot's surgical tools.


The surgeon console 100 according to these implementations can also have foot pedals 110 that are used to control various robot functions including clutching, camera movements, and various electrocautery functions. Alternatively, other input devices on the console can be used to control those various functions.


The surgeon console 100 according to these implementations is constructed and arranged to be used in either a sitting (similar to Intuitive's da Vinci) or standing position (similar to manual laparoscopy). The console 100 is designed to be easily transported between operating rooms using castors and a transport handle 112.


A further implementation of the surgeon console 100 is shown in FIG. 10B. In these implementations, additional, alternative support equipment is provided, here, a remote display 120 and a companion cart 122. It is understood that the space around a patient during a surgery is valuable, and that certain wired or otherwise connected components have limited range.


The remote display 120 according to these implementations is operably coupled to the other components and can be wireless or wired. This display 120 can be used to show the view from the robot camera or any other video.


In the implementation of FIG. 10B, a companion cart 122 is also provided. The cart 122 can be used to hold the robot interface pod 124 or an electro surgical generator or other equipment.


In certain implementations, one 110A of the foot pedals 110 or another input device can be used as a clutch that separates coordinated motion of the hand input devices from the motion of the robot. In certain implementations, the foot pedals 110 can be configured to allow the user to move the hand input devices 108 to a more desirable location in their own workspace. Then the coordinated motion can be reengaged. Alternatively, in other implementations, the clutch function might separate the coordinated motion of the hand input devices from the motion of the robot, and the hand input devices might then automatically move to a desired position. Then the coordinated motion can be reengaged.
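
A clutch of this general kind is commonly implemented by freezing the hand-to-robot offset while the clutch is engaged and re-anchoring that offset when coordinated motion resumes. The sketch below illustrates only that idea; the class name, the vector handling, and the absence of orientation handling are simplifying assumptions and do not describe the console's actual implementation.

class TeleopClutch:
    """Illustrative clutched-teleoperation sketch: while the clutch is held,
    hand motion does not move the robot; on release, the hand-to-robot offset
    is re-anchored so coordinated motion resumes from the new hand position."""

    def __init__(self, hand_pos, robot_pos):
        self.offset = tuple(r - h for r, h in zip(robot_pos, hand_pos))
        self.clutched = False
        self.last_robot_pos = tuple(robot_pos)

    def update(self, hand_pos, clutch_pressed):
        if clutch_pressed:
            self.clutched = True
            return self.last_robot_pos  # robot holds its position while clutched
        if self.clutched:
            # Clutch just released: re-anchor the offset at the new hand position.
            self.offset = tuple(r - h for r, h in zip(self.last_robot_pos, hand_pos))
            self.clutched = False
        self.last_robot_pos = tuple(h + o for h, o in zip(hand_pos, self.offset))
        return self.last_robot_pos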


In certain system implementations, various cables 126 are used to connect the robot, camera, electrosurgical generator, and the surgeon console, as is shown in FIG. 11A.


According to one implementation, all connections of the cables 126 to and from the various system 1 components are made through a connection pod 124, shown in FIG. 10B, FIG. 11A and FIG. 11B. The cables and connectors are shown schematically in FIG. 11A.


In these implementations, the pod 124 is permanently connected to the surgeon console 100 via an approximately 20-foot (6 meter) cable 126, giving flexibility in the placement of the surgeon console within the operating room. Other lengths are of course possible. It is understood that the pod 124 and cable 126 can be hung from the back of the console 100 for transport. When in use, the pod 124 can be placed near the electrosurgical generator and/or near the operating table.


In various implementations, the robotic device 10 and camera 12 both have pigtails 126A, 126B that are permanently attached to the robot and camera and then have connectors at the pod. The robot pigtail 126A carries electrical power and control signals as well as cautery energy. The camera pigtail 126B carries electrical power and control signals as well as a fiber optic cable for the video signal.


The pod 124 according to these implementations can also be constructed and arranged to interface with an electrosurgical generator (ESG) 128. On/Off control signals from the user at the surgeon console 100 are directly connected to the ESG 128 control inputs. The mono-polar return pad 130 is first connected to the pod 124 and then the cautery energy is routed from the ESG 128 to the appropriate surgical tools via the pod 124. In various implementations, each connection contains a sensor that allows the surgeon console to determine if connections are made correctly. This system 1 has been designed to ensure safety and simplicity of setup.


One interface pod 124 design is shown in FIG. 11B. In this implementation, the companion cart 122 is used to house the interface pod 124 and ESG 128. The interface pod 124 connects to the surgeon console and the electrosurgical unit. The interface pod 124 then has connections for the robotic device 10 and camera 12.


In various implementations, a known, commercially-available ESG 128 can interface with the system. For example, in one specific implementation, the surgeon console can have two (IPX7) foot pedals 110 that open and close an electrical circuit that activates and deactivates the ESG 128. The pedals 110 are directly connected to the ESG 128. As a safety measure, the surgeon console 100 can disconnect the pedals from the ESG 128, but cannot activate the ESG 128. Activation of the ESG 128 requires the surgeon to also depress the pedals 110. Mono-polar cautery energy is delivered to the right arm of the robot and bi-polar energy is delivered to the left arm. The electrocautery energy is delivered to the surgical target through the specifically designed surgical tools, such as a grasper for bi-polar energy and scissors or a hook for mono-polar energy. Verification testing, including creepage, clearance, and impedance testing, has been performed to ensure proper interoperability between the electrosurgical generator and the system.
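
The activation behavior described above amounts to a simple interlock: the console may enable or disable the pedal circuit but cannot itself activate the generator, and energy is delivered only while the enabled pedal is depressed, with mono-polar energy routed to the right arm and bi-polar energy to the left arm. The sketch below is an illustrative model of that logic only and is not the device firmware.

ENERGY_ROUTING = {
    "monopolar": "right_arm",  # e.g., scissors or hook
    "bipolar": "left_arm",     # e.g., fenestrated grasper
}

def cautery_output(mode, console_enables_pedals, pedal_pressed):
    """Illustrative interlock and routing model: energy flows only when the
    console has left the pedal circuit connected AND the pedal is depressed;
    the return value names the arm that receives the energy, or None."""
    if not (console_enables_pedals and pedal_pressed):
        return None
    return ENERGY_ROUTING[mode]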


Alternatively, the ESG 128 can interface with the system 1 through input devices other than the foot pedals. Alternatively, the system has no pod 124. In addition to these specialized subsystems, certain implementations of the system can utilize one or more of the many standard general surgical and laparoscopic systems and techniques that are commonly available and provided by the users, as described below.


Further aspects of the system 1 are described herein.



FIG. 12 depicts views of various surgical tools (also referred to above as end effectors 18, 20), which are the "hands" of the system and are shown generally at 130. Four tools are shown in FIG. 12, including a fenestrated grasper 132 that is capable of bi-polar cautery, a scissors 134 that delivers mono-polar cautery, a hook 136 that delivers mono-polar cautery, and a left/right needle driver set 138. Alternatively, other end effectors can be used with the implementations disclosed or contemplated herein.


In certain implementations, these surgical instruments 130 are designed to be single-use disposable accessories to the robot system 1. They can be chosen based on clinical need for the specific surgical task.


The tools 130 are inserted into the distal end of the robot forearm 14, 16 and then are locked in place using a ¼-turn bayonet-style connection as end effectors 18, 20, as shown in FIG. 13. The tools 130 are removed by reversing the process. When the tools 130 are inserted they interact with connections inside the forearm to deliver cautery energy to the tool tip. Alternatively, any coupling mechanism can be used to couple any of the end effectors with the robotic device.


According to certain implementations, the surgical robotic device 10 is intended to be cleaned and sterilized for reuse. The robotic device 10 has a molded silicone protective sleeve (not shown) that covers the areas between the robot base link and the forearms. This enables the robot to be cleaned and fully exposed during the sterilization process.


In certain implementations, protective and fitted sleeves are provided that are tailored to cover the robot arms 14, 16. One such sleeve 140 is shown in FIG. 14 prior to installation onto the robot arms 14, 16. The sleeve 140 is flexible so it does not restrict motion of the robot arms 14, 16 and is resistant to tearing and puncture during normal robot operation. The sleeve 140 serves as a barrier to fluid ingress into the robot. It is made of biocompatible material and, like all other tissue-contact materials in the system, is compliant with ISO 10993. The robot sleeve 140 can be factory installed and stays on the robot throughout the useful life of the device 10.


The robot sleeve 140 also makes the device easy to clean post-surgery and ensures that all patient contact surfaces are properly exposed during the sterilization process. Alternatively, any known sleeves or protective components can be used.


In certain implementations, a robot clamp 150 is provided to support the device 10 during the procedure. In these implementations, a known, commercially-available support arm 4 can be used to anchor the device 10 to the operating table 2, as shown in FIG. 15. It is understood that the support arm 4 has several adjustment features so it can provide stability while allowing significant repositioning of the robot. In certain examples, the support arm adjustment features are controlled using one adjustment knob 142.


One clamp 150 implementation is depicted in FIG. 16. The device 10 according to these implementations interfaces with the support arm 4 through a robot clamp 150 as shown in FIG. 16. The clamp 150 has a safety release button 152 that must be pressed prior to clamping or unclamping the device 10. The robot body 10A has a robot clamp interface ring 154 defined in the housing 11 to provide an interface between the clamp 150 and the device 10. After the release button 152 is pressed the robotic device 10 can be inserted or removed from the clamp 150 using the release lever 156.


In implementations such as these, the clamp 150 has a clothespin member 158 that is optionally V-grooved. The clothespin member 158 permits the smooth and controlled rotation of the device 10. In these implementations, a clasping member 160 is disposed opposite the clothespin member 158 and is urged inward to secure the device at the interface ring 154, as would be appreciated.


Although the present invention has been described with reference to preferred implementations, persons skilled in the art will recognize that changes may be made in form and detail without departing from the spirit and scope of the invention.

Claims
  • 1. A robotic surgical system, comprising: a) a device body constructed and arranged to be positioned at least partially within a body cavity of a patient through an incision, the device body comprising: i) a first robotic surgical arm operably coupled to the device body and comprising a first end effector; ii) a second robotic surgical arm operably coupled to the device body and comprising a second end effector; and iii) a camera lumen defined in the device body; b) a positionable camera constructed and arranged to provide views of the first and second end effectors; and c) a surgical console comprising a processor constructed and arranged to execute an algorithm to position the positionable camera, wherein the processor is configured to define a location PL for the first end effector and a location PR for the second end effector, and to establish a Midpoint PLPR between the first and second end effectors via the locations PL and PR, and wherein the algorithm, when executed, positions the positionable camera such that the positionable camera is aligned with the Midpoint PLPR.
  • 2. The robotic surgical system of claim 1, wherein the algorithm is configured to establish a camera reference frame and a robot reference frame, the camera reference frame having an origin, and wherein the algorithm, when executed, positions the positionable camera to align the origin of the camera reference frame with the Midpoint PLPR.
  • 3. The robotic surgical system of claim 2, wherein the processor is configured to align the camera reference frame with the robot reference frame and re-position the positionable camera.
  • 4. The robotic surgical system of claim 2, wherein the robot reference frame is established relative to the device body and is defined by orthogonal unit vectors xR, yR, and zR.
  • 5. The robotic surgical system of claim 2, wherein the camera reference frame is defined by orthogonal unit vectors xC, yC, and zC.
  • 6. The robotic surgical system of claim 1, wherein the positioning of the positionable camera to align with the Midpoint PLPR comprises a damped motion.
  • 7. The robotic surgical system of claim 6, wherein the damped motion of the positionable camera comprises a term proportional to a velocity of a tip of the positionable camera.
  • 8. The robotic surgical system of claim 1, wherein the positionable camera comprises a tip configured to extend from a distal end of the camera lumen.
  • 9. The robotic surgical system of claim 8, wherein the tip of the positionable camera is constructed and arranged to be capable of both pitch and yaw.
  • 10. The robotic surgical system of claim 1, wherein the processor is further configured to execute a control algorithm for positioning of the first and second robotic surgical arms.
  • 11. A robotic surgical system, comprising: a) a robotic surgical device comprising: i) a first robotic surgical arm operably coupled to a device body and comprising a first end effector; ii) a second robotic surgical arm operably coupled to the device body and comprising a second end effector; and iii) a camera lumen defined in the device body; b) a positionable camera constructed and arranged to provide views of the first end effector and the second end effector; and c) a surgical console comprising a processor constructed and arranged to execute a first algorithm to position the first and second robotic surgical arms, and a second algorithm to i) establish a camera reference frame having an origin, ii) establish a robot reference frame, and iii) position the positionable camera, wherein the processor is configured to define a location PL for the first end effector and a location PR for the second end effector, and to establish a Midpoint PLPR between the first and second end effectors via the locations PL and PR, and wherein the second algorithm positions the positionable camera such that the origin of the camera reference frame is aligned with the Midpoint PLPR.
  • 12. The robotic surgical system of claim 11, wherein the processor is configured to align the camera reference frame with the robot reference frame and re-position the positionable camera.
  • 13. The robotic surgical system of claim 11, wherein the positioning of the positionable camera to align the origin of the camera reference frame with the Midpoint PLPR comprises a damped motion.
  • 14. The robotic surgical system of claim 13, wherein the damped motion of the positionable camera comprises reducing a velocity of a tip of the positionable camera with a term proportional to the velocity of the tip.
  • 15. The robotic surgical system of claim 11, wherein the positionable camera comprises a tip, and wherein the positionable camera is constructed and arranged to be inserted into the robotic surgical device such that the tip is configured to extend from the camera lumen to view the first and second end effectors.
  • 16. The robotic surgical system of claim 15, wherein the tip of the positionable camera comprises an articulating tip constructed and arranged to be capable of both pitch and yaw.
  • 17. The robotic surgical system of claim 16, wherein the articulating tip comprises a light source.
  • 18. The robotic surgical system of claim 11, further comprising a robot clamp constructed and arranged to rotatably couple the robotic surgical device to a support arm.
CROSS-REFERENCE TO RELATED APPLICATION(S)

This application claims priority as a continuation of U.S. application Ser. No. 16/144,807, filed Sep. 27, 2018 and entitled "Robotic Surgical Devices with Tracking Camera Technology and Related Systems and Methods," which issued as U.S. Pat. No. 11,051,894 on Jul. 6, 2021, which claims the benefit under 35 U.S.C. § 119(e) to U.S. Provisional Application 62/564,076, filed Sep. 27, 2017 and entitled "Robotic Surgical Devices with Camera Tracking and Related Systems and Methods," which is hereby incorporated herein by reference in its entirety.

US Referenced Citations (671)
Number Name Date Kind
2858947 Chapman Nov 1958 A
3817403 Glachet et al. Jun 1974 A
3870264 Robinson Mar 1975 A
3922930 Fletcher et al. Dec 1975 A
3971266 Inakura et al. Jul 1976 A
3989952 Timberlake et al. Nov 1976 A
4246661 Pinson Jan 1981 A
4258716 Sutherland Mar 1981 A
4278077 Mizumoto Jul 1981 A
4353677 Susnjara et al. Oct 1982 A
4538594 Boebel et al. Sep 1985 A
4568311 Miyaki Feb 1986 A
4576545 Maeda Mar 1986 A
4623183 Aomori Nov 1986 A
4636138 Gorman Jan 1987 A
4645409 Gorman Feb 1987 A
4684313 Minematsu et al. Aug 1987 A
4736645 Zimmer Apr 1988 A
4762455 Coughlan et al. Aug 1988 A
4771652 Zimmer Sep 1988 A
4852391 Ruch Aug 1989 A
4854808 Bisiach Aug 1989 A
4896015 Taboada et al. Jan 1990 A
4897014 Tietze Jan 1990 A
4922755 Oshiro et al. May 1990 A
4922782 Kawai May 1990 A
4984959 Kato Jan 1991 A
4990050 Tsuge et al. Feb 1991 A
5019968 Wang et al. May 1991 A
5036724 Rosheim Aug 1991 A
5108140 Bartholet Apr 1992 A
5172639 Wiesman et al. Dec 1992 A
5176649 Wakabayashi Jan 1993 A
5178032 Zona et al. Jan 1993 A
5187032 Sasaki et al. Feb 1993 A
5187796 Wang et al. Feb 1993 A
5195388 Zona et al. Mar 1993 A
5201325 McEwen et al. Apr 1993 A
5217003 Wilk Jun 1993 A
5263382 Brooks et al. Nov 1993 A
5271384 McEwen et al. Dec 1993 A
5284096 Pelrine et al. Feb 1994 A
5297443 Wentz Mar 1994 A
5297536 Wilk Mar 1994 A
5304899 Sasaki et al. Apr 1994 A
5307447 Asano et al. Apr 1994 A
5353807 DeMarco Oct 1994 A
5363935 Schempf et al. Nov 1994 A
5372147 Lathrop et al. Dec 1994 A
5382885 Salcudean et al. Jan 1995 A
5441494 Oritz Jan 1995 A
5388528 Pelrine et al. Feb 1995 A
5397323 Taylor et al. Mar 1995 A
5436542 Petelin et al. Jul 1995 A
5456673 Ziegler et al. Oct 1995 A
5458131 Wilk Oct 1995 A
5458583 McNeely et al. Oct 1995 A
5458598 Feinberg et al. Oct 1995 A
5471515 Fossum et al. Nov 1995 A
5515478 Wang May 1996 A
5524180 Wang et al. Jun 1996 A
5553198 Wang et al. Sep 1996 A
5562448 Mushabac Oct 1996 A
5588442 Scovil et al. Dec 1996 A
5620417 Jang et al. Apr 1997 A
5623582 Rosenberg Apr 1997 A
5624380 Takayama et al. Apr 1997 A
5624398 Smith et al. Apr 1997 A
5632761 Smith et al. May 1997 A
5645520 Nakamura et al. Jul 1997 A
5657429 Wang et al. Aug 1997 A
5657584 Hamlin Aug 1997 A
5667354 Nakazawa Sep 1997 A
5672168 de la Torre et al. Sep 1997 A
5674030 Sigel Oct 1997 A
5728599 Rosteker et al. Mar 1998 A
5736821 Suyama et al. Apr 1998 A
5754741 Wang et al. May 1998 A
5762458 Wang et al. Jun 1998 A
5769640 Jacobus et al. Jun 1998 A
5791231 Cohn et al. Aug 1998 A
5792135 Madhani et al. Aug 1998 A
5797538 Heaton et al. Aug 1998 A
5797900 Madhani et al. Aug 1998 A
5807377 Madhani et al. Sep 1998 A
5808665 Green Sep 1998 A
5815640 Wang et al. Sep 1998 A
5825982 Wright et al. Oct 1998 A
5833656 Smith et al. Nov 1998 A
5841950 Wang et al. Nov 1998 A
5845646 Lemelson Dec 1998 A
5855583 Wang et al. Jan 1999 A
5876325 Mizuno Mar 1999 A
5878193 Wang et al. Mar 1999 A
5878783 Smart Mar 1999 A
5895377 Smith et al. Apr 1999 A
5895417 Pomeranz et al. Apr 1999 A
5906591 Dario et al. May 1999 A
5907664 Wang et al. May 1999 A
5910129 Koblish et al. Jun 1999 A
5911036 Wright et al. Jun 1999 A
5954692 Smith et al. Sep 1999 A
5971976 Wang et al. Oct 1999 A
5993467 Yoon Nov 1999 A
6001108 Wang et al. Dec 1999 A
6007550 Wang et al. Dec 1999 A
6030365 Laufer Feb 2000 A
6031371 Smart Feb 2000 A
6058323 Lemelson May 2000 A
6063095 Wang et al. May 2000 A
6066090 Yoon May 2000 A
6086529 Arndt Jul 2000 A
6102850 Wang et al. Aug 2000 A
6106521 Blewett et al. Aug 2000 A
6107795 Smart Aug 2000 A
6132368 Cooper Oct 2000 A
6132441 Grace Oct 2000 A
6139563 Cosgrove, III et al. Oct 2000 A
6156006 Brosens et al. Dec 2000 A
6159146 El Gazayerli Dec 2000 A
6162171 Ng et al. Dec 2000 A
D438617 Cooper et al. Mar 2001 S
6206903 Ramans Mar 2001 B1
D441076 Cooper et al. Apr 2001 S
6223100 Green Apr 2001 B1
D441862 Cooper et al. May 2001 S
6238415 Sepetka et al. May 2001 B1
6240312 Alfano et al. May 2001 B1
6241730 Alby Jun 2001 B1
6244809 Wang et al. Jun 2001 B1
6246200 Blumenkranz et al. Jun 2001 B1
D444555 Cooper et al. Jul 2001 S
6286514 Lemelson Sep 2001 B1
6292678 Hall et al. Sep 2001 B1
6293282 Lemelson Sep 2001 B1
6296635 Smith et al. Oct 2001 B1
6309397 Julian et al. Oct 2001 B1
6309403 Minoret et al. Oct 2001 B1
6312435 Wallace et al. Nov 2001 B1
6321106 Lemelson Nov 2001 B1
6327492 Lemelson Dec 2001 B1
6331181 Tierney et al. Dec 2001 B1
6346072 Cooper Feb 2002 B1
6352503 Matsui et al. Mar 2002 B1
6364888 Niemeyer et al. Apr 2002 B1
6371952 Madhani et al. Apr 2002 B1
6394998 Wallace et al. May 2002 B1
6398726 Ramans et al. Jun 2002 B1
6400980 Lemelson Jun 2002 B1
6408224 Lemelson Jun 2002 B1
6424885 Niemeyer et al. Jul 2002 B1
6432112 Brock et al. Aug 2002 B2
6436107 Wang et al. Aug 2002 B1
6441577 Blumenkranz et al. Aug 2002 B2
6450104 Grant et al. Sep 2002 B1
6450992 Cassidy Sep 2002 B1
6451027 Cooper et al. Sep 2002 B1
6454758 Thompson et al. Sep 2002 B1
6459926 Nowlin et al. Oct 2002 B1
6463361 Wang et al. Oct 2002 B1
6468203 Belson Oct 2002 B2
6468265 Evans et al. Oct 2002 B1
6470236 Ohtsuki Oct 2002 B2
6491691 Morley et al. Dec 2002 B1
6491701 Nemeyer et al. Dec 2002 B2
6493608 Niemeyer et al. Dec 2002 B1
6496099 Wang et al. Dec 2002 B2
6497651 Kan et al. Dec 2002 B1
6508413 Bauer et al. Jan 2003 B2
6512345 Borenstein Jan 2003 B2
6522906 Salisbury, Jr. et al. Feb 2003 B1
6544276 Azizi Apr 2003 B1
6548982 Papanikolopoulos et al. Apr 2003 B1
6554790 Moll Apr 2003 B1
6565554 Niemeyer May 2003 B1
6574355 Green Jun 2003 B2
6587750 Gerbi et al. Jul 2003 B2
6591239 McCall et al. Jul 2003 B1
6594552 Nowlin et al. Jul 2003 B1
6610007 Belson et al. Aug 2003 B2
6620173 Gerbi et al. Sep 2003 B2
6642836 Wang et al. Nov 2003 B1
6645196 Nixon et al. Nov 2003 B1
6646541 Wang et al. Nov 2003 B1
6648814 Kim et al. Nov 2003 B2
6659939 Moll et al. Dec 2003 B2
6661571 Shioda et al. Dec 2003 B1
6671581 Niemeyer et al. Dec 2003 B2
6676684 Morley et al. Jan 2004 B1
6684129 Salisbury, Jr. et al. Jan 2004 B2
6685648 Flaherty et al. Feb 2004 B2
6685698 Morley et al. Feb 2004 B2
6687571 Byme et al. Feb 2004 B1
6692485 Brock et al. Feb 2004 B1
6699177 Wang et al. Mar 2004 B1
6699235 Wallace et al. Mar 2004 B2
6702734 Kim et al. Mar 2004 B2
6702805 Stuart Mar 2004 B1
6714839 Salisbury, Jr. et al. Mar 2004 B2
6714841 Wright et al. Mar 2004 B1
6719684 Kim et al. Apr 2004 B2
6720988 Gere et al. Apr 2004 B1
6726699 Wright et al. Apr 2004 B1
6728599 Wright et al. Apr 2004 B2
6730021 Vassiliades, Jr. et al. May 2004 B2
6731988 Green May 2004 B1
6746443 Morley et al. Jun 2004 B1
6764441 Chiel et al. Jul 2004 B2
6764445 Ramans et al. Jul 2004 B2
6766204 Niemeyer et al. Jul 2004 B2
6770081 Cooper et al. Aug 2004 B1
6774597 Borenstein Aug 2004 B1
6776165 Jin Aug 2004 B2
6780184 Tanrisever Aug 2004 B2
6783524 Anderson et al. Aug 2004 B2
6785593 Wang et al. Aug 2004 B2
6788018 Blumenkranz Sep 2004 B1
6792663 Krzyzanowski Sep 2004 B2
6793653 Sanchez et al. Sep 2004 B2
6799065 Niemeyer Sep 2004 B1
6799088 Wang et al. Sep 2004 B2
6801325 Farr et al. Oct 2004 B2
6804581 Wang et al. Oct 2004 B2
6810281 Brock et al. Oct 2004 B2
6817972 Snow Nov 2004 B2
6817974 Cooper et al. Nov 2004 B2
6817975 Farr et al. Nov 2004 B1
6820653 Schempf et al. Nov 2004 B1
6824508 Kim et al. Nov 2004 B2
6824510 Kim et al. Nov 2004 B2
6826977 Grover et al. Dec 2004 B2
6832988 Sprout Dec 2004 B2
6832996 Woloszko et al. Dec 2004 B2
6836703 Wang et al. Dec 2004 B2
6837846 Jaffe et al. Jan 2005 B2
6837883 Moll et al. Jan 2005 B2
6839612 Sanchez et al. Jan 2005 B2
6840938 Morley et al. Jan 2005 B1
6843793 Brock et al. Jan 2005 B2
6852107 Wang et al. Feb 2005 B2
6853879 Sunaoshi Feb 2005 B2
6858003 Evans et al. Feb 2005 B2
6860346 Burt et al. Mar 2005 B2
6860877 Sanchez et al. Mar 2005 B1
6866671 Tierney et al. Mar 2005 B2
6870343 Borenstein et al. Mar 2005 B2
6871117 Wang et al. Mar 2005 B2
6871563 Choset et al. Mar 2005 B2
6879880 Nowlin et al. Apr 2005 B2
6892112 Wang et al. May 2005 B2
6899705 Niemeyer May 2005 B2
6902560 Morley et al. Jun 2005 B1
6905460 Wang et al. Jun 2005 B2
6905491 Wang et al. Jun 2005 B1
6911916 Wang et al. Jun 2005 B1
6917176 Schempf et al. Jul 2005 B2
6933695 Blumenkranz Aug 2005 B2
6936001 Snow Aug 2005 B1
6936003 Iddan Aug 2005 B2
6936042 Wallace et al. Aug 2005 B2
6943663 Wang et al. Sep 2005 B2
6949096 Davison et al. Sep 2005 B2
6951535 Ghodoussi et al. Oct 2005 B2
6963792 Green Nov 2005 B1
6965812 Wang et al. Nov 2005 B2
6974411 Belson Dec 2005 B2
6974449 Niemeyer Dec 2005 B2
6979423 Moll Dec 2005 B2
6984203 Tartaglia et al. Jan 2006 B2
6984205 Gazdzinski Jan 2006 B2
6991627 Madhani et al. Jan 2006 B2
6993413 Sunaoshi Jan 2006 B2
6994703 Wang et al. Feb 2006 B2
6994708 Manzo Feb 2006 B2
6997908 Carrillo, Jr. et al. Feb 2006 B2
6999852 Green Feb 2006 B2
7025064 Wang et al. Apr 2006 B2
7027892 Wang et al. Apr 2006 B2
7033344 Imran Apr 2006 B2
7039453 Mullick May 2006 B2
7042184 Oleynikov et al. May 2006 B2
7048745 Tierney et al. May 2006 B2
7053752 Wang et al. May 2006 B2
7063682 Whayne et al. Jun 2006 B1
7066879 Fowler et al. Jun 2006 B2
7066926 Wallace et al. Jun 2006 B2
7074179 Wang et al. Jul 2006 B2
7077446 Kameda et al. Jul 2006 B2
7083571 Wang et al. Aug 2006 B2
7083615 Peterson et al. Aug 2006 B2
7087049 Nowlin et al. Aug 2006 B2
7090683 Brock et al. Aug 2006 B2
7097640 Wang et al. Aug 2006 B2
7105000 McBrayer Sep 2006 B2
7107090 Salisbury, Jr. et al. Sep 2006 B2
7109678 Kraus et al. Sep 2006 B2
7118582 Wang et al. Oct 2006 B1
7121781 Sanchez et al. Oct 2006 B2
7125403 Julian et al. Oct 2006 B2
7126303 Farritor et al. Oct 2006 B2
7147650 Lee Dec 2006 B2
7155315 Niemeyer et al. Dec 2006 B2
7155316 Sutherland et al. Dec 2006 B2
7163525 Franer Jan 2007 B2
7169141 Brock et al. Jan 2007 B2
7182025 Ghorbel et al. Feb 2007 B2
7182089 Ries Feb 2007 B2
7199545 Oleynikov et al. Apr 2007 B2
7206626 Quaid, III Apr 2007 B2
7206627 Abovitz et al. Apr 2007 B2
7210364 Ghorbel et al. May 2007 B2
7214230 Brock et al. May 2007 B2
7217240 Snow May 2007 B2
7239940 Wang et al. Jul 2007 B2
7250028 Julian et al. Jul 2007 B2
7259652 Wang et al. Aug 2007 B2
7273488 Nakamura et al. Sep 2007 B2
7311107 Harel et al. Dec 2007 B2
7339341 Oleynikov et al. Mar 2008 B2
7372229 Farritor et al. May 2008 B2
7403836 Aoyama Jul 2008 B2
7438702 Hart et al. Oct 2008 B2
7447537 Funda et al. Nov 2008 B1
7492116 Oleynikov et al. Feb 2009 B2
7566300 Devierre et al. Jul 2009 B2
7574250 Niemeyer Aug 2009 B2
7637905 Saadat et al. Dec 2009 B2
7645230 Mikkaichi et al. Jan 2010 B2
7655004 Long Feb 2010 B2
7670329 Flaherty et al. Mar 2010 B2
7678043 Gilad Mar 2010 B2
7731727 Sauer Jun 2010 B2
7734375 Buehler et al. Jun 2010 B2
7762825 Burbank et al. Jul 2010 B2
7772796 Farritor et al. Aug 2010 B2
7785251 Wilk Aug 2010 B2
7785294 Hueil et al. Aug 2010 B2
7785333 Miyamoto et al. Aug 2010 B2
7789825 Nobis et al. Sep 2010 B2
7789861 Franer Sep 2010 B2
7794494 Sahatjian et al. Sep 2010 B2
7865266 Moll et al. Jan 2011 B2
7960935 Farritor et al. Jun 2011 B2
7979157 Anvari Jul 2011 B2
3021358 Doyle et al. Sep 2011 A1
8179073 Farritor et al. May 2012 B2
8231610 Jo et al. Jul 2012 B2
8343171 Farritor et al. Jan 2013 B2
8353897 Doyle et al. Jan 2013 B2
8377045 Schena Feb 2013 B2
8430851 Mcginley et al. Apr 2013 B2
8604742 Farritor et al. Dec 2013 B2
8636686 Minnelli et al. Jan 2014 B2
8679096 Farritor et al. Mar 2014 B2
8827337 Murata et al. Sep 2014 B2
8828024 Farritor et al. Sep 2014 B2
8834488 Farritor et al. Sep 2014 B2
8864652 Diolaiti et al. Oct 2014 B2
8888687 Ostrovsky et al. Nov 2014 B2
8968332 Farritor et al. Mar 2015 B2
8974440 Farritor et al. Mar 2015 B2
8986196 Larkin et al. Mar 2015 B2
9010214 Markvicka et al. Apr 2015 B2
9060781 Farritor et al. Jun 2015 B2
9089256 Tognaccini et al. Jul 2015 B2
9089353 Farritor et al. Jul 2015 B2
9138129 Diolaiti Sep 2015 B2
9198728 Wang Dec 2015 B2
9516996 Diolaiti et al. Dec 2016 B2
9649020 Finlay May 2017 B2
9717563 Tognaccini et al. Aug 2017 B2
9743987 Farritor et al. Aug 2017 B2
9757187 Farritor et al. Sep 2017 B2
9770305 Farritor et al. Sep 2017 B2
9789608 Itkowitz et al. Oct 2017 B2
9814640 Khaligh Nov 2017 B1
9816641 Bock-Aronson et al. Nov 2017 B2
9849586 Rosheim Dec 2017 B2
9857786 Cristiano Jan 2018 B2
9888966 Farritor et al. Feb 2018 B2
9956043 Farritor et al. May 2018 B2
10008017 Itkowitz et al. Jun 2018 B2
10111711 Farritor et al. Oct 2018 B2
10137575 Itkowitz et al. Nov 2018 B2
10159533 Moll et al. Dec 2018 B2
10220522 Rockrohr Mar 2019 B2
10258425 Mustufa et al. Apr 2019 B2
10307199 Farritor et al. Jun 2019 B2
10342561 Farritor et al. Jul 2019 B2
10368952 Tognaccini et al. Aug 2019 B2
10507066 DiMaio et al. Dec 2019 B2
10555775 Hoffman et al. Feb 2020 B2
10582973 Wilson et al. Mar 2020 B2
10695137 Farritor et al. Jun 2020 B2
10729503 Cameron Aug 2020 B2
10737394 Itkowitz et al. Aug 2020 B2
10751136 Farritor et al. Aug 2020 B2
10751883 Nahum Aug 2020 B2
10806538 Farritor et al. Oct 2020 B2
10966700 Farritor Apr 2021 B2
11032125 Farritor et al. Jun 2021 B2
11298195 Ye Apr 2022 B2
11382702 Tognaccini et al. Jul 2022 B2
11529201 Mondry et al. Dec 2022 B2
11595242 Farritor et al. Feb 2023 B2
20010018591 Brock et al. Aug 2001 A1
20010049497 Kalloo et al. Dec 2001 A1
20020003173 Bauer et al. Jan 2002 A1
20020013601 Nobles et al. Jan 2002 A1
20020026186 Woloszko et al. Feb 2002 A1
20020038077 de la Torre et al. Mar 2002 A1
20020065507 Zando-Azizi May 2002 A1
20020091374 Cooper Jun 2002 A1
20020103417 Gazdzinski Aug 2002 A1
20020111535 Kim et al. Aug 2002 A1
20020120254 Julian et al. Aug 2002 A1
20020128552 Nowlin et al. Sep 2002 A1
20020140392 Borenstein et al. Oct 2002 A1
20020147487 Sundquist et al. Oct 2002 A1
20020151906 Demarais et al. Oct 2002 A1
20020156347 Kim et al. Oct 2002 A1
20020171385 Kim et al. Nov 2002 A1
20020173700 Kim et al. Nov 2002 A1
20020190682 Schempf et al. Dec 2002 A1
20030020810 Takizawa et al. Jan 2003 A1
20030045888 Brock et al. Mar 2003 A1
20030065250 Chiel et al. Apr 2003 A1
20030089267 Ghorbel et al. May 2003 A1
20030092964 Kim et al. May 2003 A1
20030097129 Davison et al. May 2003 A1
20030100817 Wang et al. May 2003 A1
20030109780 Coste-Maniere et al. Jun 2003 A1
20030114731 Cadeddu et al. Jun 2003 A1
20030135203 Wang et al. Jun 2003 A1
20030139742 Wampler et al. Jul 2003 A1
20030144656 Ocel et al. Jul 2003 A1
20030159535 Grover et al. Aug 2003 A1
20030167000 Mullick Sep 2003 A1
20030172871 Scherer Sep 2003 A1
20030179308 Zamorano et al. Sep 2003 A1
20030181788 Yokoi et al. Sep 2003 A1
20030225479 Waled Dec 2003 A1
20030229268 Uchiyama et al. Dec 2003 A1
20030229338 Irion et al. Dec 2003 A1
20030230372 Schmidt Dec 2003 A1
20040024311 Quaid Feb 2004 A1
20040034282 Quaid Feb 2004 A1
20040034283 Quaid Feb 2004 A1
20040034302 Abovitz et al. Feb 2004 A1
20040050394 Jin Mar 2004 A1
20040070822 Shioda et al. Apr 2004 A1
20040099175 Perrot et al. May 2004 A1
20040102772 Baxter et al. May 2004 A1
20040106916 Quaid et al. Jun 2004 A1
20040111113 Nakamura et al. Jun 2004 A1
20040117032 Roth Jun 2004 A1
20040138525 Saadat et al. Jul 2004 A1
20040138552 Harel et al. Jul 2004 A1
20040140786 Borenstein Jul 2004 A1
20040153057 Davison Aug 2004 A1
20040173116 Ghorbel et al. Sep 2004 A1
20040176664 Iddan Sep 2004 A1
20040215331 Chew et al. Oct 2004 A1
20040225229 Viola Nov 2004 A1
20040254680 Sunaoshi Dec 2004 A1
20040267326 Ocel et al. Dec 2004 A1
20050014994 Fowler et al. Jan 2005 A1
20050021069 Feuer et al. Jan 2005 A1
20050029978 Oleynikov et al. Feb 2005 A1
20050043583 Killmann et al. Feb 2005 A1
20050049462 Kanazawa Mar 2005 A1
20050054901 Yoshino Mar 2005 A1
20050054902 Konno Mar 2005 A1
20050064378 Toly Mar 2005 A1
20050065400 Banik et al. Mar 2005 A1
20050070850 Albrecht Mar 2005 A1
20050083460 Hattori et al. Apr 2005 A1
20050095650 Julius et al. May 2005 A1
20050096502 Khalili May 2005 A1
20050143644 Gilad et al. Jun 2005 A1
20050154376 Riviere et al. Jul 2005 A1
20050165449 Cadeddu et al. Jul 2005 A1
20050177026 Hoeg et al. Aug 2005 A1
20050234294 Saadat et al. Oct 2005 A1
20050234435 Layer Oct 2005 A1
20050272977 Saadat et al. Dec 2005 A1
20050283137 Doyle et al. Dec 2005 A1
20050288555 Binmoeller Dec 2005 A1
20050288665 Woloszko Dec 2005 A1
20060020272 Gildenberg Jan 2006 A1
20060046226 Bergler et al. Mar 2006 A1
20060079889 Scott Apr 2006 A1
20060100501 Berkelman et al. May 2006 A1
20060119304 Farritor et al. Jun 2006 A1
20060149135 Paz Jul 2006 A1
20060152591 Lin Jul 2006 A1
20060155263 Lipow Jul 2006 A1
20060189845 Maahs et al. Aug 2006 A1
20060195015 Mullick et al. Aug 2006 A1
20060196301 Oleynikov et al. Sep 2006 A1
20060198619 Oleynikov et al. Sep 2006 A1
20060241570 Wilk Oct 2006 A1
20060241732 Denker et al. Oct 2006 A1
20060253109 Chu Nov 2006 A1
20060258938 Hoffman et al. Nov 2006 A1
20060258954 Timberlake et al. Nov 2006 A1
20060261770 Kishi et al. Nov 2006 A1
20070032701 Fowler et al. Feb 2007 A1
20070043397 Ocel et al. Feb 2007 A1
20070055342 Wu et al. Mar 2007 A1
20070080658 Farritor et al. Apr 2007 A1
20070088277 Mcginley et al. Apr 2007 A1
20070088340 Brock et al. Apr 2007 A1
20070106113 Ravo May 2007 A1
20070106317 Shelton et al. May 2007 A1
20070123748 Meglan May 2007 A1
20070135803 Belson Jun 2007 A1
20070142725 Hardin et al. Jun 2007 A1
20070156019 Larkin Jul 2007 A1
20070156211 Ferren et al. Jul 2007 A1
20070167955 De La Menardiere et al. Jul 2007 A1
20070225633 Ferren et al. Sep 2007 A1
20070225634 Ferren et al. Sep 2007 A1
20070241714 Oleynikov et al. Oct 2007 A1
20070244520 Ferren et al. Oct 2007 A1
20070250064 Darois et al. Oct 2007 A1
20070255273 Fernandez et al. Nov 2007 A1
20070287884 Schena Dec 2007 A1
20080004634 Farritor et al. Jan 2008 A1
20080015565 Davison Jan 2008 A1
20080015566 Livneh Jan 2008 A1
20080021440 Solomon Jan 2008 A1
20080033569 Ferren et al. Feb 2008 A1
20080045803 Williams et al. Feb 2008 A1
20080058835 Farritor et al. Mar 2008 A1
20080058989 Oleynikov et al. Mar 2008 A1
20080071289 Cooper Mar 2008 A1
20080071290 Larkin et al. Mar 2008 A1
20080103440 Ferren et al. May 2008 A1
20080109014 de la Pena May 2008 A1
20080111513 Farritor et al. May 2008 A1
20080119870 Williams et al. May 2008 A1
20080132890 Woloszko et al. Jun 2008 A1
20080161804 Rioux et al. Jun 2008 A1
20080164079 Ferren et al. Jul 2008 A1
20080168639 Otake et al. Jul 2008 A1
20080183033 Bern et al. Jul 2008 A1
20080221591 Farritor et al. Sep 2008 A1
20080269557 Marescaux et al. Oct 2008 A1
20080269562 Marescaux et al. Oct 2008 A1
20090002414 Shibata et al. Jan 2009 A1
20090012532 Quaid et al. Jan 2009 A1
20090020724 Paffrath Jan 2009 A1
20090024142 Ruiz Morales Jan 2009 A1
20090048612 Farritor Feb 2009 A1
20090054909 Farritor et al. Feb 2009 A1
20090069821 Farritor et al. Mar 2009 A1
20090076536 Rentschler et al. Mar 2009 A1
20090137952 Ramamurthy et al. May 2009 A1
20090143787 De La Pena Jun 2009 A9
20090163929 Yeung Jun 2009 A1
20090171373 Farritor Jul 2009 A1
20090234369 Bax Sep 2009 A1
20090236400 Cole Sep 2009 A1
20090240246 Devill et al. Sep 2009 A1
20090247821 Rogers Oct 2009 A1
20090248038 Blumenkranz et al. Oct 2009 A1
20090281377 Newell et al. Nov 2009 A1
20090299143 Conlon et al. Dec 2009 A1
20090305210 Guru et al. Dec 2009 A1
20090326322 Diolaiti Dec 2009 A1
20100010294 Conlon et al. Jan 2010 A1
20100016659 Weitzner et al. Jan 2010 A1
20100016853 Burbank Jan 2010 A1
20100026347 Tizuka Feb 2010 A1
20100042097 Newton et al. Feb 2010 A1
20100056863 Dejima et al. Mar 2010 A1
20100069710 Yamatani et al. Mar 2010 A1
20100069940 Miller et al. Mar 2010 A1
20100081875 Fowler et al. Apr 2010 A1
20100101346 Johnson et al. Apr 2010 A1
20100130986 Mailloux et al. May 2010 A1
20100139436 Kawashima et al. Jun 2010 A1
20100185212 Sholev Jul 2010 A1
20100198231 Manzo et al. Aug 2010 A1
20100204713 Ruiz Morales Aug 2010 A1
20100245549 Allen et al. Sep 2010 A1
20100250000 Blumenkranz et al. Sep 2010 A1
20100262162 Omori Oct 2010 A1
20100263470 Bannasch et al. Oct 2010 A1
20100274079 Kim et al. Oct 2010 A1
20100292691 Brogna Nov 2010 A1
20100301095 Shelton, IV et al. Dec 2010 A1
20100318059 Farritor et al. Dec 2010 A1
20100331856 Carlson et al. Dec 2010 A1
20110015569 Kirschenman et al. Jan 2011 A1
20110020779 Hannaford et al. Jan 2011 A1
20110071347 Rogers Mar 2011 A1
20110071544 Steger et al. Mar 2011 A1
20110075693 Kuramochi et al. Mar 2011 A1
20110077478 Freeman et al. Mar 2011 A1
20110082365 Mcgrogan et al. Apr 2011 A1
20110098529 Ostrovsky et al. Apr 2011 A1
20110107866 Oka et al. May 2011 A1
20110152615 Schostek et al. Jun 2011 A1
20110224605 Farritor et al. Sep 2011 A1
20110230894 Simaan Sep 2011 A1
20110237890 Farritor et al. Sep 2011 A1
20110238079 Hannaford et al. Sep 2011 A1
20110238080 Ranjit et al. Sep 2011 A1
20110264078 Lipow et al. Oct 2011 A1
20110270443 Kamiya et al. Nov 2011 A1
20110276046 Heimbecker et al. Nov 2011 A1
20120016175 Roberts et al. Jan 2012 A1
20120029727 Sholev Feb 2012 A1
20120035582 Nelson et al. Feb 2012 A1
20120059392 Diolaiti Mar 2012 A1
20120078053 Phee et al. Mar 2012 A1
20120109150 Quaid et al. May 2012 A1
20120116362 Kieturakis May 2012 A1
20120179168 Farritor et al. Jul 2012 A1
20120221147 Goldberg et al. Aug 2012 A1
20120253515 Coste-Maniere et al. Oct 2012 A1
20130001970 Suyama et al. Jan 2013 A1
20130041360 Farritor et al. Feb 2013 A1
20130055560 Nakasugi et al. Mar 2013 A1
20130125696 Long May 2013 A1
20130131695 Scarfogliero May 2013 A1
20130178867 Farritor et al. Jul 2013 A1
20130282023 Burbank et al. Oct 2013 A1
20130304084 Beira et al. Nov 2013 A1
20130325030 Hourtash et al. Dec 2013 A1
20130325181 Moore Dec 2013 A1
20130345717 Markvicka et al. Dec 2013 A1
20130345718 Crawford et al. Dec 2013 A1
20140039515 Mondry et al. Feb 2014 A1
20140046340 Wilson et al. Feb 2014 A1
20140055489 Itkowitz et al. Feb 2014 A1
20140058205 Frederick et al. Feb 2014 A1
20140100587 Farritor et al. Apr 2014 A1
20140137687 Nogami et al. May 2014 A1
20140221749 Grant et al. Aug 2014 A1
20140232824 DiMaio et al. Aug 2014 A1
20140276944 Farritor et al. Sep 2014 A1
20140303434 Farritor et al. Oct 2014 A1
20140371762 Farritor et al. Dec 2014 A1
20150051446 Farritor Feb 2015 A1
20150057537 Dillon et al. Feb 2015 A1
20150157191 Phee et al. Jun 2015 A1
20150223896 Farritor et al. Aug 2015 A1
20150297299 Yeung et al. Oct 2015 A1
20160066999 Forgione et al. Mar 2016 A1
20160135898 Frederick et al. May 2016 A1
20160291571 Cristiano Oct 2016 A1
20160303745 Rockrohr Oct 2016 A1
20170014197 Mccrea et al. Jan 2017 A1
20170035526 Farritor et al. Feb 2017 A1
20170078583 Haggerty et al. Mar 2017 A1
20170252096 Felder et al. Sep 2017 A1
20170354470 Farritor et al. Dec 2017 A1
20180132956 Cameron May 2018 A1
20180153578 Cooper et al. Jun 2018 A1
20180338777 Bonadio et al. Nov 2018 A1
20190059983 Germain et al. Feb 2019 A1
20190090965 Farritor et al. Mar 2019 A1
20190209262 Mustufa et al. Jul 2019 A1
20190327394 Ramirez et al. Oct 2019 A1
20200138534 Garcia Kilroy et al. May 2020 A1
20200214775 Farritor et al. Jul 2020 A1
20200330175 Cameron Oct 2020 A1
20200368915 Itkowitz et al. Nov 2020 A1
Foreign Referenced Citations (96)
Number Date Country
2918531 Jan 2015 CA
102499759 Jun 2012 CN
102821918 Dec 2012 CN
104523309 Apr 2015 CN
104582600 Apr 2015 CN
104622528 May 2015 CN
204337044 May 2015 CN
105025826 Nov 2015 CN
102010040405 Mar 2012 DE
0105656 Apr 1984 EP
0279591 Aug 1988 EP
1354670 Oct 2003 EP
2286756 Feb 2011 EP
2286756 Feb 2011 EP
2329787 Jun 2011 EP
2563261 Mar 2013 EP
2684528 Jan 2014 EP
2123225 Dec 2014 EP
2815705 Dec 2014 EP
2881046 Oct 2015 EP
2937047 Oct 2015 EP
5959371 Apr 1984 JP
61165061 Jul 1986 JP
S6268293 Mar 1987 JP
04144533 May 1992 JP
05-115425 May 1993 JP
06507809 Sep 1994 JP
06508049 Sep 1994 JP
2006508049 Sep 1994 JP
07-016235 Jan 1995 JP
07-136173 May 1995 JP
7306155 Nov 1995 JP
08-224248 Sep 1996 JP
2001500510 Jan 2001 JP
2001505810 May 2001 JP
2002000524 Jan 2002 JP
2003220065 Aug 2003 JP
2004144533 May 2004 JP
2004-180781 Jul 2004 JP
2004283940 Oct 2004 JP
2004322310 Nov 2004 JP
2004329292 Nov 2004 JP
2006507809 Mar 2006 JP
2009106606 May 2009 JP
2009297809 Dec 2009 JP
2010533045 Oct 2010 JP
2010536436 Dec 2010 JP
2011504794 Feb 2011 JP
2011045500 Mar 2011 JP
2011115591 Jun 2011 JP
2012176489 Sep 2012 JP
2012504017 Feb 2015 JP
2015526171 Sep 2015 JP
2016213937 Dec 2016 JP
2017113837 Jun 2017 JP
199221291 May 1991 WO
2001089405 Nov 2001 WO
2002082979 Oct 2002 WO
2002100256 Dec 2002 WO
2005009211 Jul 2004 WO
2005044095 May 2005 WO
2006052927 Aug 2005 WO
2006005075 Jan 2006 WO
2006079108 Jan 2006 WO
2006079108 Jul 2006 WO
2007011654 Jan 2007 WO
2007111571 Oct 2007 WO
2007149559 Dec 2007 WO
2009014917 Jan 2009 WO
2009023851 Feb 2009 WO
2009144729 Dec 2009 WO
2009158164 Dec 2009 WO
2010039394 Apr 2010 WO
2010042611 Apr 2010 WO
2010046823 Apr 2010 WO
2010050771 May 2010 WO
2010083480 Jul 2010 WO
2011075693 Jun 2011 WO
2011118646 Sep 2011 WO
2011135503 Nov 2011 WO
2011163520 Dec 2011 WO
2013009887 Jan 2013 WO
2013052137 Apr 2013 WO
2013106569 Jul 2013 WO
2014011238 Jan 2014 WO
2014025399 Feb 2014 WO
2014144220 Sep 2014 WO
2014146090 Sep 2014 WO
2015009949 Jan 2015 WO
2015031777 Mar 2015 WO
2015088655 Jun 2015 WO
2016077478 May 2016 WO
2017024081 Feb 2017 WO
2017064303 Apr 2017 WO
2017201310 Nov 2017 WO
2018045036 Mar 2018 WO
Non-Patent Literature Citations (141)
Entry
Abbott et al., “Design of an Endoluminal NOTES Robotic System,” from the Proceedings of the 2007 IEEE/RSJ Int'l Conf. on Intelligent Robot Systems, San Diego, CA, Oct. 29-Nov. 2, 2007, pp. 410-416.
Allendorf et al., “Postoperative Immune Function Varies Inversely with the Degree of Surgical Trauma in a Murine Model,” Surgical Endoscopy 1997; 11:427-430.
Ang, “Active Tremor Compensation in Handheld Instrument for Microsurgery,” Doctoral Dissertation, tech report CMU-RI-TR-04-28, Robotics Institute, Carnegie Mellon Unviersity, May 2004, 167pp.
Atmel 80C5X2 Core, http://www.atmel.com, 2006, 186pp.
Bailey et al., “Complications of Laparoscopic Surgery,” Quality Medical Publishers, Inc., 1995, 25pp.
Ballantyne, “Robotic Surgery, Telerobotic Surgery, Telepresence, and Telementoring,” Surgical Endoscopy, 2002; 16: 1389-1402.
Bauer et al., “Case Report: Remote Percutaneous Renal Percutaneous Renal Access Using a New Automated Telesurgical Robotic System,” Telemedicine Journal and e-Health 2001; (4): 341-347.
Begos et al., “Laparoscopic Cholecystectomy: From Gimmick to Gold Standard,” J Clin Gastroenterol, 1994; 19(4): 325-330.
Berg et al., “Surgery with Cooperative Robots,” Medicine Meets Virtual Reality, Feb. 2007, 1 pg.
Breda et al., “Future developments and perspectives in laparoscopy,” Eur. Urology 2001; 40(1): 84-91.
Breedveld et al., “Design of Steerable Endoscopes to Improve the Visual Perception of Depth During Laparoscopic Surgery,” ASME, Jan. 2004; vol. 126, pp. 1-5.
Breedveld et al., “Locomotion through the Intestine by means of Rolling Stents,” Proceedings of the ASME Design Engineering Technical Conferences, 2004, pp. 1-7.
Calafiore et al., Multiple Arterial Conduits Without Cardiopulmonary Bypass: Early Angiographic Results,: Ann Thorac Surg, 1999; 67: 450-456.
Camarillo et al., “Robotic Technology in Surgery: Past, Present and Future,” The American Journal of Surgery, 2004; 188: 2S-15.
Cavusoglu et al., “Telesurgery and Surgical Simulation: Haptic Interfaces to Real and Virtual Surgical Environments,” In McLaughliin, M.L., Hespanha, J.P., and Sukhatme, G., editors. Touch in virtual environments, IMSC Series in Multimedia 2001, 28pp.
Dumpert et al., “Stereoscopic In Vivo Surgical Robots,” IEEE Sensors Special Issue on In Vivo Sensors for Medicine, Jan. 2007, 10 pp.
Green, “Telepresence Surgery”, Jan. 1, 1995, Publisher: IEEE Engineering in Medicine and Biology.
Cleary et al., “State of the Art in Surgical Rooties: Clinical Applications and Technology Challenges”, “Computer Aided Surgery”, Jan. 1, 2002, pp. 312-328, vol. 6.
Stoianovici et al., “Robotic Tools for Minimally Invasive Urologic Surgery”, Jan. 1, 2002, pp. 1-17.
Franzino, “The Laprotek Surgical System and the Next Generation of Robotics,” Surg Clin North Am, 2003 83(6): 1317-1320.
Franklin et al., “Prospective Comparison of Open vs. Laparoscopic Colon Surgery for Carcinoma: Five-Year Results,” Dis Colon Rectum, 1996; 39: S35-S46.
Flynn et al, “Tomorrow's surgery: micromotors and microrobots for minimally invasive procedures,” Minimally Invasive Surgery & Allied Technologies, 1998; 7(4): 343-352.
Fireman et al., “Diagnosing small bowel Crohn's desease with wireless capsule endoscopy,” Gut 2003; 52: 390-392.
Fearing et al., “Wing Transmission for a Micromechanical Flying Insect,” Proceedings of the 2000 IEEE International Conference to Robotics & Automation, Apr. 2000; 1509-1516.
Faraz et al., “Engineering Approaches to Mechanical and Robotic Design for Minimaly Invasive Surgery (MIS),” Kluwer Academic Publishers (Boston), 2000, 13pp.
Falcone et al., “Robotic Surgery,” Clin. Obstet. Gynecol. 2003, 46(1): 37-13.
Fraulob et al., “Miniature assistance module for robot-assisted heart surgery,” Biomed. Tech. 2002, 47 Suppl. 1, Pt. 1: 12-15.
Fukuda et al., “Mechanism and Swimming Experiment of Micro Mobile Robot in Water,” Proceedings of the 1994 IEEE International Conference on Robotics and Automation, 1994: 814-819.
Fukuda et al., “Micro Active Catheter System with Multi Degrees of Freedom,” Proceedings of the IEEE International Conference on Robotics and Automation, May 1994, pp. 2290-2295.
Fuller et al., “Laparoscopic Trocar Injuries: A Report from a U.S. Food and Drug Administration (FDA) Center for Devices and Radiological Health (CDRH) Systematic Technology Assessment of Medical Products (STAMP) Committe,” U.S. Food and Drug Adminstration, available at http://www.fdaJ:?;ov, Finalized: Nov. 7, 2003; Updated: Jun. 24, 2005, 11 pp.
Dumpert et al., “Improving in Vivo Robot Visioin Quality,” from the Proceedings of Medicine Meets Virtual Realtiy, Long Beach, CA, Jan. 26-29, 2005. 1 pg.
Dakin et al., “Comparison of laparoscopic skills performance between standard instruments and two surgical robotic systems,” Surg Endosc., 2003; 17: 574-579.
Cuschieri, “Technology for Minimal Access Surgery,” BMJ, 1999, 319: 1-6.
Grady, “Doctors Try New Surgery for Gallbladder Removal,” The New York Times, Apr. 20, 2007, 3 pp.
Choi et al., “Flexure-based Manipulator for Active Handheld Microsurgical Instrument,” Proceedings of the 27th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBS), Sep. 2005, 4pp.
Chanthasopeephan et al., (2003), “Measuring Forces in Liver Cutting: New Equipment and Experimenal Results,” Annals of Biomedical Engineering 31:1372-1382.
Cavusoglu et al., “Robotics for Telesurgery: Second Generation Berkeley/UCSF Laparoscopic Telesurgical Workstation and Looking Towards the Future Applications,” Industrial Robot: An International Journal, 2003; 30(1): 22-29.
Guber et al., “Miniaturized Instrument Systems for Minimally Invasive Diagnosis and Therapy,” Biomedizinische Technic. 2002, Band 47, Erganmngsband 1: 198-201.
Southern Surgeons Club (1991), “A prospective analysis of 1518 laparoscopic cholecystectomies,” N. Eng. 1 Med. 324 (16): 1073-1078.
Suzumori et al., “Development of Flexible Microactuator and its Applications to Robotics Mechanisms,” Proceedings of the IEEE International Conference on Robotics and Automation, 1991: 1622-1627.
Wolfe et al. (1991), Endoscopic Cholecystectomy: An analysis of Complications, Arch. Surg. 1991; 126: 1192-1196.
Mack et al., “Present Role of Thoracoscopy in the Diagnosis and Treatment of Diseases of the Chest,” Ann Thorac Surgery, 1992; 54: 403-409.
Peters, “Minimally Invasive Colectomy: Are the Potential Benefits Realized?” Dis Colon Rectum 1993; 36: 751-756.
Tendick et al. (1993), “Sensing and Manipulation Problems in Endoscopic Surgery: Experiment, Analysis, and Observation,” Presence 2(1): 66-81.
Sackier et al., “Robotically assisted laparoscopic surgery,” Surgical Endoscopy, 1994; 8:63-6.
Stiff et al., “Long-term Pain: Less Common After Laparoscopic than Open Cholecystectomy,” British Journal of Surgery, 1994; 81: 1368-1370.
Slatkin et al., “The Development of a Robotic Endoscope,” Proceedings of the 1995 IEEE International Conference on Robotics and Automation, pp. 162-171, 1995.
Taylor et al., “A Telerobotic Assistant for Laparoscopic Surgery,” IEEE Eng Med Biol, 1995; 279-87.
Way et al., EDITORS, “Fundamentals of Laparoscopic Surgery,” Churchill Livingstone Inc., 1995; 14 pp.
Guo et al., “Micro Active Guide Wire Catheter System—Characteristic Evaluation, Electrical Model* and Operability Evaluation of Micro Active Catheter,” Proceedings of the 1996 IEEE International Conference on Robotics and Automation, Apr. 1996; 2226-2231.
Schippers et al. (1996), “Requirements and Possibilities of Computer-Assisted Endoscopic Surgery,” In: Computer Integrated Surgery: Technology and Clinical Applications, pp. 561-565.
Liem et al., “Comparison of Conventional Anterior Surgery and Laparoscopic Surgery for Inguinal-hernia Repair,” New England Journal of Medicine, 1997; 336 (22):1541-1547.
Kazemier et al. (1998), “Vascular Injuries During Laparoscopy,” J. Am. Coli. Surg. 186(5): 604-5.
Palm. William. “Rapid Prototyping Primer” May 1998 (revised Jul. 30, 2002) (http://www.me.psu.edu/lamancusa/ rapidpro/primer/chapter2.htm), 12 pages.
Tendick et al., “Applications of Micromechatronics in Minimally Invasive Surgery,” IEEE/ASME Transactions on Mechatronics, 1998; 3(1): 34-42.
Worn et al., “Espirit Project No. 33915: Miniaturised Robot for Micro Manipulation (MINIMAN),” Nov. 1998, http://www.ipr.ira.ujka.de/-microbot/miniman.
Macfarlane et al., “Force-Feedback Grasper Helps Restore the Sense of Touch in Minimally Invasive Surgery,” Journal of Gastrointestinal Surgery, 1999; 3: 278-285.
Rosen et al., “Force Controlled and Teleoperated Endoscopic, Grasper for Minimally Invasive Surgery- Experimental Performance Evaluation,” IEEE Transactions of Biomedical Engineering, Oct. 1999; 46(10): 1212-1221.
Gong et al., “Wireless endoscopy,” Gastrointestinal Endoscopy 2000; 51 (6): 725-729.
Heikkinen et al., “Comparison of laparoscopic and open Nissen fundoplication two years after operation: A prospective randomized trial,” Surgical Endoscopy, 2000; 14:1019-1023.
Li et al. (2000), “Microvascular Anastomoses Performed in Rats Using a Microsurgical Telemanipulator,” Comp. Aid. Surg., 5: 326-332.
Ishiyama et al., “Spiral-type Micro-machine for Medical Applications,” 2000 International Symposium on Micromechatronics and Human Science, 2000; 65-69.
Meron, “The development of the swallowable video capsule (M2A),” Gastrointestinal Endoscopy 2000; 52 6: 817-819.
Salky, “What is the Penetration of Endoscopic Techniques into Surgical Practice?” Digestive Surgery 2000; 17:422-426.
Schurr et al., “Robotics and Telemanipulation Technologies for Endoscopic Surgery,” Surgical Endoscopy, 2000; 14:375-381.
Abbou et al., “Laparoscopic Radical Prostatectomy with a Remote Controlled Robot,” The Journal of Urology, Jun. 2001; 165: 1964-1966.
Horgan et al., “Technical Report: Robots in Laparoscopic Surgery,” Journal of Laparoendoscopic & Advanced Surgical Techniques, 2001; 11(6): 415-419.
Kang et al., “Robotic Assistants Aid Surgeons During Minimally Invasive Procedures,” IEEE Engineering in Medicine and Biology, Jan .- Feb. 2001: 94-104.
Lafullarde et al., “Laparoscopic Nissen Fundoplication: Five-year Results and Beyond,” Arch/Surg, Feb. 2001; 136: 180-184.
Mack, “Minimally Invasive and Robotic Surgery,” JAMA, Feb. 2001; 285(5): 568-572.
Peirs et al., “A miniature manipulator for integration in a self-propelling endoscope,” Sensors and Actuators A, 2001, 92: 343-349.
Yu et al., “Microrobotic Cell Injection,” Proceedings of the 2001 IEEE International Conference on Robotics and Automation, May 2001: 620-625.
Yu, Bsn, Rn, “M2ATM Capsule Endoscopy A Breakthrough Diagnostic Tool for Small Intestine Imagining, ” vol. 25, No. 1, 2001, Gastroenterology Nursing, pp. 24-27.
Guo et al., “Fish-like Underwater Microrobot with 3 DOF,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, May 2002; 738-743.
Leggett et al. (2002), “Aortic injury during laparoscopic Fundoplication,” Surg. Endoscopy 16(2): 362.
Mei et al., “Wireless Drive and Control of a Swimming Microrobot,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, May 2002: 1131-1136.
Melvin et al., “Computer-Enhanced vs. Standard Laparoscopic Antireflux Surgery,” J Gastrointest Surg 2002; 6: 11-16.
Menciassi et al., “Robotic Solutions and Mechanisms for a Semi-Autonomous Endoscope,” Proceedings of the 2002 IEEE/RSJ Intl. Conference on Intelligent Robots and Systems, Oct. 2002; 1379-1384.
Munro (2002), “Laparoscopic access: complications, technologies, and techniques,” Curro Opin. Obstet. Gynecol., 14(4): 365-74.
Nio et al., “Efficiency of manual vs robotical (Zeus) assisted laparoscopic surgery in the performance of standardized tasks,” Surg Endosc, 2002; 16: 412-415.
Phee et al., “Analysis and Development of Locomotion Devices for the Gastrointestinal Tract,” IEEE Transactions on Biomedical Engineering, vol. 49, No. 6, Jun. 2002: 613-616.
Rosen et al., “Task Decomposition of Laparoscopic Surgery for Objective Evaluation of Surgical Residents' Learning Curve Using Hidden Markov Model,” Computer Aided Surgery, vol. 7, pp. 49-61, 2002.
Rosen et al., “The Blue DRAGON - A System of Measuring the Kinematics and the Dynamics of Minimally Invasive Surgical Tools In-Vivo,” Proc. of the 2002 IEEE International Conference on Robotics and Automation, Washington, DC, pp. 1876-1881, May 2002.
Ruurda et al., “Robot-Assisted surgical systems: a new era in laparoscopic surgery,” Ann R. Coll Surg Engl. 2002; 84: 223-226.
Rosen et al., Objective Evaluation of Laparoscopic Skills Based on Haptic Information and Tool/Tissue Interactions, Computer Aided Surgery, vol. 7, Issue 1, pp. 49-61, Jul. 2002.
Satava, “Surgical Robotics: The Early Chronicles,” Surgical Laparoscopy, Endoscopy & Percutaneous Techniques, 2002; 12(1):6-16.
Thomann et al., “The Design of a new type of Micro Robot for the Intestinal Inspection,” Proceedings of the 2002 IEEE Intl. Conference on Intelligent Robots and Systems, Oct. 2002: 1385-1390.
Orlando et al. (2003), “Needle and Trocar Injuries in Diagnostic Laparoscopy under Local Anesthesia: What Is the True Incidence of These Complications?” Journal of Laparoendoscopic & Advanced Surgical Techniques, 13(3): 181-184.
Lehman et al., Dexterous miniature in vivo robot for NOTES, 2009, IEEE, p. 244-249.
Mihelj et al., ARMin II—7 DoF rehabilitation robot: mechanics and kinematics, 2007, IEEE, p. 4120-4125.
Zhang et al., Cooperative robotic assistant for laparoscopic surgery: CoBRASurge, 2009, IEEE, p. 5540-5545.
Riviere et al., “Toward Active Tremor Canceling in Handheld Microsurgical Instruments,” IEEE Transactions on Robotics and Automation, Oct. 2003, 19(5): 793-800.
Albers et al., Design and development process of a humanoid robot upper body through experimentation, 2004, IEEE, p. 77-92 (Year: 2004).
Glukhovsky et al., “The development and application of wireless capsule endoscopy,” Int. J. Med. Robot. Comput. Assist. Surgery, 2004; 1(1): 114-123.
Hanly et al., “Robotic Abdominal Surgery,” The American Journal of Surgery, 2004; 188 (Suppl. to Oct. 1994); 19S-26S.
Hanly et al., “Value of the SAGES Learning Center in introducing new technology,” Surgical Endoscopy, 2004; 19 (4): 477-483.
Hissink, “Olympus Medical develops capsule camera technology,” Dec. 2004, accessed Aug. 29, 2007, http://www.letsgodigital.org, 3 pp.
Kalloo et al., “Flexible transgastric peritoneoscopy: a novel approach to diagnostic and therapeutic interventions in the peritoneal cavity,” Gastrointestinal Endoscopy, 2004; 60(1): 114-117.
Menciassi et al., “Locomotion of a Leffed Capsule in the Gastrointestinal Tract: Theoretical Study and Preliminary Technological Results,” IEEE Int. Conf. on Engineering in Medicine and Biology, San Francisco, CA, pp. 2767-2770, Sep. 2004.
Miller, Ph.D., et al., “In-Vivo Stereoscopic Imaging System with 5 Degrees-of-Freedom for Minimal Access Surgery,” Dept. of Computer Science and Dept. of Surgery, Columbia University, New York, NY, 7 p. , 2004.
Oleynikov et al., “In Vivo Camera Robots Provide Improved Vision for Laparoscopic Surgery,” Computer Assisted Radiology and Surgery (CARS), Chicago, IL, Jun. 23 - 26, 2004b.
Patronik et al., “Crawling on the Heart: A Mobile Robotic Device for Minimally Invasive Cardiac Interventions,” MICCAI, 2004, pp. 9-16.
Patronik et al., “Development of a Tethered Epicardial Crawler for Minimally Invasive Cardiac Therapies,” IEEE, pp. 239-240, 2004.
Rentschler et al., “In Vivo Robots for Laparoscopic Surgery,” Studies in Health Technology and Infonnatics - Medicine Meets Virtual Reality, ISO Press, Newport Beach, CA, 2004a, 98: 316-322.
Rentschler et al., “Theoretical and Experimental Analysis of In Vivo Wheeled Mobility,” ASME Design Engineering Technical Conferences: 28th Biennial Mechanisms and Robotics Conference, Salt Lake City, Utah, Sep. 28 - Oct. 2, 2004; pp. 1-9.
Jagannath et al., “Peroral transgastric endoscopic ligation of fallopian tubes with long-term survival in a porcine model,” Gastrointestinal Endoscopy, 2005; 61 (3): 449-453.
Kantsevoy et al., “Endoscopic gastrojejunostomy with survival in a porcine model,” Gastrointestinal Endoscopy, 2005; 62(2): 287-292.
Menciassi et al., “Shape memory alloy clamping devices of a capsule for monitoring tasks in the gastrointestinal tract,” J. Micromech. Microeng, 2005; 15: 2045-2055.
Oleynikov et al., “In Vivo Robotic Laparoscopy,” Surgical Innovation, Jun. 2005, 12(2): 177-181.
Oleynikov et al., “Miniature Robots Can Assist in Laparoscopic Cholecystectomy,” Journal of Surgical Endoscopy, 19-4: 473-476, 2005.
Park et al., “Experimental studies of transgastric gallbladder surgery: cholecystectomy and cholecystogastric anastomosis (videos),” Gastrointestinal Endoscopy, 2005; 61 (4): 601-606.
Patronik et al., “Preliminary evaluation of a mobile robotic device for navigation and intervention on the beating heart,” Computer Aided Surgery, 10(4): 225-232, Jul. 2005.
Platt et al., “In Vivo Robotic Cameras can Enhance Imaging Capability During Laparoscopic Surgery,” from the Proceedings of the Society of American Gastrointestinal Endoscopic Surgeons (SAGES) Scientific Conference, Ft. Lauderdale, FL, Apr. 13-16, 2005; 1 pg.
Rentschler et al., “Mobile In Vivo Robots Can Assist in Abdominal Exploration,” from the Proceedings of the Society of American Gastrointestinal Endoscopic Surgeons (SAGES) Scientific Conference, Ft. Lauderdale, FL, Apr. 13-16, 2005b.
Rentschler et al., “Modeling, Analysis, and Experimental Study of In Vivo Wheeled Robotic Mobility,” IEEE Transactions on Robotics, 22 (2): 308-321, 2005c.
Rentschler et al., “Toward In Vivo Mobility,” Studies in Health Technology and Informatics - Medicine Meets Virtual Reality, IOS Press, Long Beach, CA, 2005a, 111: 397-403.
Rosen et al., “Spherical Mechanism Analysis of a Surgical Robot for Minimally Invasive Surgery - Analytical and Experimental Approaches,” Studies in Health Technology and Informatics - Medicine Meets Virtual Reality, pp. 442-448, Jan. 2005.
Smart Pill, “Fantastic Voyage: Smart Pill to Expand Testing,” http://www.smartpilldiagnostics.com, Apr. 13, 2005, 1 pg.
Strong et al., “Efficacy of Novel Robotic Camera vs. a Standard Laparoscopic Camera,” Surgical Innovation, vol. 12, No. 4, Dec. 2005, Westminster Publications, Inc., pp. 315-318.
Kantsevoy et al., “Transgastric endoscopic splenectomy,” Surgical Endoscopy, 2006; 20: 522-525.
Ko et al., “Per-Oral transgastric abdominal surgery,” Chinese Journal of Digestive Diseases, 2006; 7: 67-70.
Micron, http://www.micron.com, 2006, 1/4-inch VGA NTSC/PAL CMOS Digital Image Sensor, 98 pp.
Rentschler et al., “Mechanical Design of Robotic In Vivo Wheeled Mobility,” ASME Journal of Mechanical Design, 2006a; pp. 1-11, Accepted.
Rentschler et al., “Miniature in vivo robots for remote and harsh environments,” IEEE Transactions on Information Technology in Biomedicine, Jan. 2006; 12(1): pp. 66-75.
Rentschler et al., “Mobile In Vivo Biopsy Robot,” IEEE International Conference on Robotics and Automation, Orlando, Florida, May 2006; 4155-4160.
Rentschler et al., “Mobile In Vivo Biopsy and Camera Robot,” Studies in Health Technology and Informatics - Medicine Meets Virtual Reality, vol. 119: 449-454, IOS Press, Long Beach, CA, 2006e.
Rentschler et al., “Mobile In Vivo Camera Robots Provide Sole Visual Feedback for Abdominal Exploration and Cholecystectomy,” Journal of Surgical Endoscopy, 20-1: 135-138, 2006b.
Rentschler et al., “Natural Orifice Surgery with an Endoluminal Mobile Robot,” The Society of American Gastrointestinal Endoscopic Surgeons, Dallas, TX, April 2006d.
Stefanini et al., “Modeling and Experiments on a Legged Microrobot Locomoting in a Tubular Compliant and Slippery Environment,” Int. Journal of Robotics Research, vol. 25, No. 5-6, pp. 551-560, May-Jun. 2006.
Sharp LL-151-3D, http://www.sharp3d.com, 2006, 2 pp.
Crystal Eyes, http://www.reald.com, 2007 (Stereo 3D visualization for CAVEs, theaters and immersive environments), 1 pg.
O'Neill, “Surgeon takes new route to gallbladder,” The Oregonian, Jun. 2007; 2 pp.
Park et al., “Trocar-less Instrumentation for Laparoscopy: Magnetic Positioning of Intra-abdominal Camera and Retractor,” Ann Surg, Mar. 2007; 245(3): 379-384.
Rentschler et al., “An In Vivo Mobile Robot for Surgical Vision and Task Assistance,” Journal of Medical Devices, Mar. 2007; vol. 1: 23-29.
Rentschler et al., “In vivo Robotics during the NEEMO 9 Mission,” Medicine Meets Virtual Reality, Feb. 2007; 1 pg.
Schwartz, “In the Lab: Robots that Slink and Squirm,” The New York Times, Mar. 27, 2007, 4 pp.
Gopura et al., Mechanical designs of active upper-limb exoskeleton robots: State-of-the-art and design difficulties, 2009, IEEE, p. 178-187 (Year: 2009).
Xu et al., “System Design of an Insertable Robotic Effector Platform for Single Port Access (SPA) Surgery,” The 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, Oct. 11-15, 2009, St. Louis, MO, USA, pp. 5546-5552.
Gopura et al., A brief review on upper extremity robotic exoskeleton systems, 2011, IEEE, p. 346-351 (Year: 2011).
Midday et al., “Material Handling System for Robotic Natural Orifice Surgery,” Proceedings of the 2011 Design of Medical Devices Conference, Apr. 12-14, 2011, Minneapolis, MN, 4 pp.
Keller et al., Design of the pediatric arm rehabilitation robot ChARMin, 2014, IEEE, p. 530-535 (Year: 2014).
Related Publications (1)
  • Number: 20210330404 A1; Date: Oct 2021; Country: US
Provisional Applications (1)
  • Number: 62564076; Date: Sep 2017; Country: US
Continuations (1)
  • Parent: 16144807; Date: Sep 2018; Country: US
  • Child: 17367915; Country: US