Robotically assisted surgical system and related devices and methods

Information

  • Patent Grant
  • Patent Number
    11,903,658
  • Date Filed
    Tuesday, January 7, 2020
  • Date Issued
    Tuesday, February 20, 2024
Abstract
Disclosed herein are various robotic surgical systems having various robotic devices. Further, disclosed herein are removable, coupleable connection ports, each of which can be coupled to a robotic device and a camera assembly that is disposed into and through the robotic device. Also disclosed herein are removable connection ports having at least one of an elongate device body coupling mechanism, a camera assembly coupling mechanism, and/or a presence detection mechanism. Further discussed herein is a camera assembly with at least one actuation mechanism for actuating movement of the steerable distal tip thereof.
Description
FIELD

The embodiments disclosed herein relate to various medical devices and related components that can make up a surgical system, including robotic and/or in vivo medical devices and related components. Certain embodiments include various robotic medical devices, including robotic devices that are disposed within a body cavity and positioned using a body or support component disposed through an orifice or opening in the body cavity. Other embodiments relate to various systems that have a robotic surgical device and a controller, wherein the device has one or more sensors and the controller has one or more motors such that the sensors transmit information that is used at the controller to actuate the motors to provide haptic feedback to a user.


BACKGROUND

Invasive surgical procedures are essential for addressing various medical conditions. When possible, minimally invasive procedures such as laparoscopy are preferred.


However, known minimally invasive technologies such as laparoscopy are limited in scope and complexity due in part to 1) mobility restrictions resulting from using rigid tools inserted through access ports, and 2) limited visual feedback. Known robotic systems such as the da Vinci® Surgical System (available from Intuitive Surgical, Inc., located in Sunnyvale, CA) are also restricted by the access ports, as well as having the additional disadvantages of being very large, very expensive, unavailable in most hospitals, and having limited sensory and mobility capabilities.


There is a need in the art for improved surgical methods, systems, and devices.


BRIEF SUMMARY

Discussed herein are various robotic surgical systems having various robotic devices. Certain of the robotic devices have coupleable connection ports (also referred to as “nests”) that receive and couple to various camera assemblies. The various connection ports herein can have coupling mechanisms therein for coupling to the robotic device and/or the camera assembly. In addition, certain ports have presence detection mechanisms. Further discussed herein are camera assemblies with actuation mechanisms for actuating movement of the steerable distal tip thereof.


In Example 1, a robotic surgical system comprises a robotic surgical device and a removable camera component. The robotic surgical device comprises an elongate device body comprising a distal end and a proximal end, a removable connection port disposed at the proximal end of the device body, and first and second robotic arms operably coupled to the distal end of the device body. The connection port comprises a device body coupling mechanism disposed within the connection port; a camera receiving opening defined in a proximal end of the connection port; a seal package disposed in the removable connection port, the seal package comprising at least two seals; and a camera coupling mechanism disposed within the removable connection port. The removable camera component is removably disposable in the camera receiving opening and through the seal package, the removable camera component comprising a camera body, an elongate camera tube, a flexible section, and a distal imager.


Example 2 relates to the robotic surgical system according to Example 1, wherein the device body coupling mechanism comprises first and second hinged coupling mechanisms hingedly coupled to the connection port.


Example 3 relates to the robotic surgical system according to Example 2, wherein each of the first and second hinged coupling mechanisms comprises a coupling mechanism body, a tensioned hinge at a proximal end of the coupling mechanism body, wherein the tensioned hinge is hingedly coupled to the connection port, and a coupleable structure at a distal end of the coupling mechanism, wherein the coupleable structure comprises at least one coupling feature configured to be coupleable with a matching coupling feature on the proximal end of the device body and an actuable button.


Example 4 relates to the robotic surgical system according to Example 1, wherein the elongate device body comprises a male connector disposed at a proximal end of the elongate device body, wherein the male connector is coupleable with the connection port.


Example 5 relates to the robotic surgical system according to Example 1, wherein the removable connection port further comprises a presence detection mechanism operably coupled to the camera coupling mechanism.


Example 6 relates to the robotic surgical system according to Example 1, wherein the camera coupling mechanism comprises a slidable body disposed within the connection port, a camera receiving opening defined within the slidable body, an actuable camera release button attached to a first end of the slidable body, and a tensioned spring operably coupled to a second end of the slidable body.


Example 7 relates to the robotic surgical system according to Example 6, wherein the slidable body is slidable along a plane substantially transverse to a longitudinal axis of the elongate device body.


Example 8 relates to the robotic surgical system according to Example 1, further comprising a presence detection mechanism comprising a rotatable lever operably coupled to the camera coupling mechanism at a pivot point, wherein the rotatable lever rotates around the pivot point, a first sensing component disposed on the rotatable lever, and a second sensing component disposed on the elongate body, wherein the second sensing component is configured to sense the presence or absence of the first sensing component.


Example 9 relates to the robotic surgical system according to Example 8, wherein the first sensing component is a magnet.


In Example 10, a removable connection port for a robotic surgical device comprises a connection port body, a distal opening defined at a distal end of the port body, wherein the distal opening is sized and shaped to receive a proximal end of an elongate device body, a proximal opening defined at a proximal end of the port body, wherein the proximal opening is sized and shaped to receive a camera assembly, a seal package disposed in the connection port body, the seal package comprising at least two seals configured to receive a shaft of a camera assembly, a device body coupling mechanism disposed within the connection port body, the device body coupling mechanism comprising first and second hinged coupling mechanisms hingedly coupled to the connection port body, and a camera coupling mechanism disposed within the connection port body. The camera coupling mechanism comprises a slidable body disposed within the connection port body, and a camera receiving opening defined within the slidable body.


Example 11 relates to the removable connection port according to Example 10, wherein each of the first and second hinged coupling mechanisms comprises a coupling mechanism body, a tensioned hinge at a proximal end of the coupling mechanism body, wherein the tensioned hinge is hingedly coupled to the connection port body, and a coupleable structure at a distal end of the coupling mechanism, wherein the coupleable structure comprises at least one coupling feature configured to be coupleable with a matching coupling feature on the proximal end of the elongate device body and an actuable button.


Example 12 relates to the removable connection port according to Example 10, wherein the distal opening is sized and shaped to receive a male connector disposed at the proximal end of the elongate device body.


Example 13 relates to the removable connection port according to Example 10, further comprising a presence detection mechanism operably coupled to the camera coupling mechanism.


Example 14 relates to the removable connection port according to Example 10, wherein the camera coupling mechanism further comprises an actuable camera release button attached to a first end of the slidable body and a tensioned spring operably coupled to a second end of the slidable body.


Example 15 relates to the removable connection port according to Example 10, wherein the slidable body is slidable along a plane substantially transverse to a longitudinal axis of a lumen of the seal package.


Example 16 relates to the removable connection port according to Example 10, further comprising a presence detection mechanism comprising a rotatable lever operably coupled to the camera coupling mechanism at a pivot point, wherein the rotatable lever rotates around the pivot point, and a first sensing component disposed on the rotatable lever, wherein the first sensing component is configured to interact with a second sensing component disposed on the elongate device body when the removable connection port is coupled to the elongate device body.


Example 17 relates to the removable connection port according to Example 16, wherein the first sensing component is a magnet.


In Example 18, a robotic surgical system comprises a robotic surgical device and a removable camera component. The robotic surgical device comprises an elongate device body comprising a distal end and a proximal end, a removable connection port disposed at the proximal end of the device body, and first and second robotic arms operably coupled to the distal end of the device body. The connection port comprises a device body coupling mechanism disposed within the connection port, the device body coupling mechanism comprising first and second hinged coupling mechanisms hingedly coupled to the connection port, a camera receiving opening defined in a proximal end of the connection port, a seal package disposed in the removable connection port, the seal package comprising at least two seals, a camera coupling mechanism disposed within the removable connection port, and a presence detection mechanism operably coupled to the camera coupling mechanism. The camera coupling mechanism comprises a slidable body slidably disposed within the connection port, a camera receiving opening defined within the slidable body, an actuable camera release button attached to a first end of the slidable body, and a tensioned spring operably coupled to a second end of the slidable body. The presence detection mechanism comprises a rotatable lever operably coupled to the camera coupling mechanism at a pivot point, wherein the rotatable lever rotates around the pivot point, a first sensing component disposed on the rotatable lever, and a second sensing component disposed on the elongate body, wherein the second sensing component is configured to sense the presence or absence of the first sensing component. The removable camera component is removably disposable in the camera receiving opening and through the seal package, and the removable camera component comprises a camera body, an elongate camera tube, a flexible section, and a distal imager.


Example 19 relates to the robotic surgical system according to Example 18, wherein each of the first and second hinged coupling mechanisms comprises a coupling mechanism body, a tensioned hinge at a proximal end of the coupling mechanism body, wherein the tensioned hinge is hingedly coupled to the connection port, and a coupleable structure at a distal end of the coupling mechanism, wherein the coupleable structure comprises at least one coupling feature configured to be coupleable with a matching coupling feature on the proximal end of the device body and an actuable button.


Example 20 relates to the robotic surgical system according to Example 18, wherein the slidable body is slidable along a plane substantially transverse to a longitudinal axis of a lumen defined by the at least two seals in the seal package.


In Example 21, a camera assembly for a robotic surgical system comprises an elongate camera shaft, a camera body coupled to a proximal end of the elongate camera shaft, a steerable tip disposed at the distal end of the elongate camera shaft, a first cable coupled at a first end to the first drive carriage and coupled at a second end to the steerable tip, and a second cable coupled at a first end to the second drive carriage and coupled at a second end to the steerable tip. The camera body comprises a distal end configured to be positionable within a robotic device, and at least one actuation mechanism disposed within the camera body. The distal end comprises a distal nose cone disposed around the elongate shaft, and a coupling mechanism acceptance slot defined proximal to the distal nose cone. The at least one actuation mechanism comprises a rotatable shaft, a first drive carriage threadably coupled to the rotatable shaft, and a second drive carriage threadably coupled to the rotatable shaft. The steerable tip comprises a steerable tip body comprising a camera imager and an illumination component, and a flexible section coupled to the elongate camera shaft and the steerable tip body, wherein the steerable tip body is movable in relation to the elongate camera shaft via the flexible section. Further, actuation of the actuation mechanism causes linear movement of the first and second drive carriages in opposite directions, whereby the first and second cables steer the steerable tip.


Example 22 relates to the camera assembly according to Example 21, wherein the camera body further comprises an external housing and a cylindrical heat sink structure, wherein the cylindrical heat sink structure is disposed within the external housing.


While multiple embodiments are disclosed, still other embodiments will become apparent to those skilled in the art from the following detailed description, which shows and describes illustrative embodiments. As will be realized, the various embodiments are capable of modifications in various obvious aspects, all without departing from the spirit and scope thereof. Accordingly, the drawings and detailed description are to be regarded as illustrative in nature and not restrictive.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a perspective view of a robotic surgical system in an operating room, according to one embodiment.



FIG. 2 is a perspective view of a robotic device, according to one embodiment.



FIG. 3A is another perspective view of the robotic device of FIG. 2, according to one embodiment.



FIG. 3B is a perspective view of a camera that is insertable into the robotic device of FIG. 3A, according to one embodiment.



FIG. 3C is an expanded perspective view of the distal end and robotic arms of the robotic device of FIG. 2, according to one embodiment.



FIG. 4A is an expanded perspective view of the distal end and robotic arms of another robotic device, according to another embodiment.



FIG. 4B is another expanded perspective view of the distal end and robotic arms of the robotic device of FIG. 4A, according to one embodiment.



FIG. 5A is a perspective view of a robotic device attached to a support arm coupled to an operating table, according to one embodiment.



FIG. 5B is an expanded perspective view of the device clamp and robotic device of FIG. 5A, according to one embodiment.



FIG. 6 is a schematic perspective view of a robotic device and a camera component that is not positioned in the device and is instead being operated via a separate port in a patient, according to one embodiment.



FIG. 7 is an expanded perspective view of the console of the surgical system of FIG. 1, according to one embodiment.



FIG. 8A is a side view of a camera assembly, according to one embodiment.



FIG. 8B is a front view of the camera assembly of FIG. 8A, according to one embodiment.



FIG. 9A is an expanded side view of the camera body of the camera assembly of FIG. 8A, according to one embodiment.



FIG. 9B is an expanded cutaway side view of the camera body of FIG. 9A, according to one embodiment.



FIG. 9C is an expanded cross-sectional view of the distal end of the camera body of FIG. 9A, according to one embodiment.



FIG. 10A is a cross-sectional side view of the internal components of the camera body of FIG. 9A, according to one embodiment.



FIG. 10B is a cross-sectional front view of the internal components of the camera body of FIG. 9A, according to one embodiment.



FIG. 10C is a perspective cutaway view of an actuation mechanism for the camera body of FIG. 9A, according to one embodiment.



FIG. 10D is a perspective view of the lead screw of the actuation mechanism of FIG. 10C, according to one embodiment.



FIG. 10E is a cutaway view of certain components of the actuation mechanism of FIG. 10C, according to one embodiment.



FIG. 10F is a cutaway view of certain components of the camera body of FIG. 9A, according to one embodiment.



FIG. 11A is a perspective view of an elongate body and robotic arms of a robotic device and a nest that is coupleable thereto, according to one embodiment.



FIG. 11B is a perspective view depicting the robotic device of FIG. 11A with the nest and a camera coupled thereto, along with an expanded view of the nest itself, according to one embodiment.



FIG. 12A is a perspective cutaway view of the nest of FIG. 11B and a seal package disposed therein, according to one embodiment.



FIG. 12B is a cross-sectional cutaway view of the seal package of FIG. 12A, along with an expanded view of the two seals disposed therein, according to one embodiment.



FIG. 13 is a cross-sectional cutaway view of a proximal end of a device body with the nest of FIG. 11B attached thereto and a camera disposed therethrough, according to one embodiment.



FIG. 14A is a cutaway view of the nest of FIG. 11B and two latches disposed therein, according to one embodiment.



FIG. 14B is a perspective view of one of the latches of FIG. 14A, according to one embodiment.



FIG. 15A is a side cutaway view of the nest of FIG. 11B disposed adjacent to the proximal end of the robotic device body of FIG. 11A, according to one embodiment.



FIG. 15B is a side cutaway view of the nest and device body of FIG. 15A coupled together, according to one embodiment.



FIG. 16A is a side cutaway view of the nest of FIG. 11B disposed adjacent to the male connector of the device body of FIG. 11A, according to one embodiment.



FIG. 16B is a perspective expanded view of one latch of the nest of FIG. 16A being coupled with the male connector of FIG. 16A, according to one embodiment.



FIG. 17A is a side cutaway view of the nest of FIG. 11B disposed adjacent to the male connector of the device body of FIG. 11A, according to one embodiment.



FIG. 17B is a side cutaway view of the nest and male connector of FIG. 17A coupled together, according to one embodiment.



FIG. 18 is a perspective cutaway view of the nest of FIG. 11B with a camera coupling mechanism disposed therein, according to one embodiment.



FIG. 19A is a side cross-sectional cutaway view of the nest and camera coupling mechanism of FIG. 18 in which the camera is not yet coupled to the nest, according to one embodiment.



FIG. 19B is a side cross-sectional cutaway view of the nest and camera coupling mechanism of FIG. 18 in which the camera is being urged into its fully coupled position in the nest, but is not yet fully coupled, according to one embodiment.



FIG. 19C is a side cross-sectional cutaway view of the nest and camera coupling mechanism of FIG. 18 in which the camera is fully coupled to the nest, according to one embodiment.



FIG. 20 is a perspective cutaway view of the nest of FIG. 11B with a presence detection mechanism disposed therein, according to one embodiment.



FIG. 21 is a side cross-sectional cutaway view of the nest and presence detection mechanism of FIG. 20 in which the camera is not yet coupled to the nest, according to one embodiment.



FIG. 22 is a side cross-sectional cutaway view of the nest and presence detection mechanism of FIG. 20 in which the camera is coupled to the nest, according to one embodiment.



FIG. 23 is a side cross-sectional cutaway view of the nest and presence detection mechanism of FIG. 20 in which the camera coupling mechanism button has been depressed, according to one embodiment.



FIG. 24 is a cross-sectional cutaway view of a forearm, according to one embodiment.



FIG. 25 is a perspective view of the forearm of FIG. 24, according to one embodiment.



FIG. 26 is a cross-sectional cutaway view of an upper arm coupled to a device body at the shoulder joint, according to one embodiment.



FIG. 27 is a cross-sectional cutaway view of the proximal end of the device body of FIG. 26, according to one embodiment.





DETAILED DESCRIPTION

The various systems and devices disclosed herein relate to devices for use in medical procedures and systems. More specifically, various embodiments relate to various medical devices, including robotic devices and related methods and systems.


It is understood that the various embodiments of robotic devices and related methods and systems disclosed herein can be incorporated into or used with any other known medical devices, systems, and methods. For example, the various embodiments disclosed herein may be incorporated into or used with any of the medical devices and systems disclosed in U.S. Pat. No. 8,968,332 (issued on Mar. 3, 2015 and entitled “Magnetically Coupleable Robotic Devices and Related Methods”), U.S. Pat. No. 8,834,488 (issued on Sep. 16, 2014 and entitled “Magnetically Coupleable Surgical Robotic Devices and Related Methods”), U.S. Pat. No. 10,307,199 (issued on Jun. 4, 2019 and entitled “Robotic Surgical Devices and Related Methods”), U.S. Pat. No. 9,579,088 (issued on Feb. 28, 2017 and entitled “Methods, Systems, and Devices for Surgical Visualization and Device Manipulation”), U.S. Patent Application 61/030,588 (filed on Feb. 22, 2008), U.S. Pat. No. 8,343,171 (issued on Jan. 1, 2013 and entitled “Methods and Systems of Actuation in Robotic Devices”), U.S. Pat. No. 8,828,024 (issued on Sep. 9, 2014 and entitled “Methods and Systems of Actuation in Robotic Devices”), U.S. Pat. No. 9,956,043 (issued on May 1, 2018 and entitled “Methods and Systems of Actuation in Robotic Devices”), U.S. patent application Ser. No. 15/966,606 (filed on Apr. 30, 2018 and entitled “Methods, Systems, and Devices for Surgical Access and Procedures”), U.S. patent application Ser. No. 12/192,663 (filed on Aug. 15, 2008 and entitled “Medical Inflation, Attachment, and Delivery Devices and Related Methods”), U.S. patent application Ser. No. 15/018,530 (filed on Feb. 8, 2016 and entitled “Medical Inflation, Attachment, and Delivery Devices and Related Methods”), U.S. Pat. No. 8,974,440 (issued on Mar. 10, 2015 and entitled “Modular and Cooperative Medical Devices and Related Systems and Methods”), U.S. Pat. No. 8,679,096 (issued on Mar. 25, 2014 and entitled “Multifunctional Operational Component for Robotic Devices”), U.S. Pat. No. 9,179,981 (issued on Nov. 10, 2015 and entitled “Multifunctional Operational Component for Robotic Devices”), U.S. Pat. No. 9,883,911 (issued on Feb. 6, 2018 and entitled “Multifunctional Operational Component for Robotic Devices”), U.S. patent application Ser. No. 15/888,723 (filed on Feb. 5, 2018 and entitled “Multifunctional Operational Component for Robotic Devices”), U.S. Pat. No. 8,894,633 (issued on Nov. 25, 2014 and entitled “Modular and Cooperative Medical Devices and Related Systems and Methods”), U.S. Pat. No. 8,968,267 (issued on Mar. 3, 2015 and entitled “Methods and Systems for Handling or Delivering Materials for Natural Orifice Surgery”), U.S. Pat. No. 9,060,781 (issued on Jun. 23, 2015 and entitled “Methods, Systems, and Devices Relating to Surgical End Effectors”), U.S. Pat. No. 9,757,187 (issued on Sep. 12, 2017 and entitled “Methods, Systems, and Devices Relating to Surgical End Effectors”), U.S. Pat. No. 10,350,000 (issued on Jul. 16, 2019 and entitled “Methods, systems, and devices relating to surgical end effectors”), U.S. patent application Ser. No. 16/512,510 (filed on Jul. 16, 2019 and entitled “Methods, Systems, and Devices Relating to Surgical End Effectors”), U.S. Pat. No. 9,089,353 (issued on Jul. 28, 2015 and entitled “Robotic Surgical Devices, Systems, and Related Methods”), U.S. Pat. No. 10,111,711 (issued on Oct. 30, 2018 and entitled “Robotic Surgical Devices, Systems, and Related Methods”), U.S. patent application Ser. No. 16/123,619 (filed on Sep. 
6, 2018 and entitled “Robotic Surgical Devices, Systems and Related Methods”), U.S. Pat. No. 9,770,305 (issued on Sep. 26, 2017 and entitled “Robotic Surgical Devices, Systems, and Related Methods”), U.S. patent application Ser. No. 15/661,147 (filed on Jul. 27, 2017 and entitled “Robotic Devices with On Board Control & Related Systems & Devices”), U.S. patent application Ser. No. 13/833,605 (filed on Mar. 15, 2013 and entitled “Robotic Surgical Devices, Systems, and Related Methods”), U.S. patent application Ser. No. 13/738,706 (filed on Jan. 10, 2013 and entitled “Methods, Systems, and Devices for Surgical Access and Insertion”), U.S. patent application Ser. No. 14/661,465 (filed on Mar. 18, 2015 and entitled “Methods, Systems, and Devices for Surgical Access and Insertion”), U.S. patent application Ser. No. 15/890,860 (filed on Feb. 7, 2018 and entitled “Methods, Systems, and Devices for Surgical Access and Insertion”), U.S. Pat. No. 9,498,292 (issued on Nov. 22, 2016 and entitled “Single Site Robotic Devices and Related Systems and Methods”), U.S. Pat. No. 10,219,870 (issued on Mar. 5, 2019 and entitled “Single site robotic device and related systems and methods”), U.S. patent application Ser. No. 16/293,135 (filed Mar. 3, 2019 and entitled “Single Site Robotic Device and Related Systems and Methods”), U.S. Pat. No. 9,010,214 (issued on Apr. 21, 2015 and entitled “Local Control Robotic Surgical Devices and Related Methods”), U.S. Pat. No. 10,470,828 (issued on Nov. 12, 2019 and entitled “Local Control Robotic Surgical Devices and Related Methods”), U.S. patent application Ser. No. 16/596,034 (filed on Oct. 8, 2019 and entitled “Local Control Robotic Surgical Devices and Related Methods”), U.S. Pat. No. 9,743,987 (issued on Aug. 29, 2017 and entitled “Methods, Systems, and Devices Relating to Robotic Surgical Devices, End Effectors, and Controllers”), U.S. patent application Ser. No. 15/687,787 (filed on Aug. 28, 2017 and entitled “Methods, Systems, and Devices Relating to Robotic Surgical Devices, End Effectors, and Controllers”), U.S. Pat. No. 9,888,966 (issued on Feb. 13, 2018 and entitled “Methods, Systems, and Devices Relating to Force Control Surgical Systems”), U.S. patent application Ser. No. 15/894,489 (filed on Feb. 12, 2018 and entitled “Methods, Systems, and Devices Relating to Force Control Surgical Systems”), U.S. patent application Ser. No. 14/212,686 (filed on Mar. 14, 2014 and entitled “Robotic Surgical Devices, Systems, and Related Methods”), U.S. patent application Ser. No. 14/334,383 (filed on Jul. 17, 2014 and entitled “Robotic Surgical Devices, Systems, and Related Methods”), U.S. patent application Ser. No. 14/853,477 (filed on Sep. 14, 2015 and entitled “Quick-Release End Effectors and Related Systems and Methods”), U.S. patent application Ser. No. 16/504,793 (filed on Jul. 8, 2019 and entitled “Quick-Release End Effectors and Related Systems and Methods”), U.S. Pat. No. 10,376,322 (issued on Aug. 13, 2019 and entitled “Robotic Device with Compact Joint Design and Related Systems and Methods”), U.S. patent application Ser. No. 16/538,902 (filed on Aug. 13, 2019 and entitled “Robotic Device with Compact Joint Design and Related Systems and Methods”), U.S. patent application Ser. No. 15/227,813 (filed on Aug. 3, 2016 and entitled Robotic Surgical Devices, System and Related Methods”) U.S. patent application Ser. No. 15/599,231 (filed on May 18, 2017 and entitled “Robotic Surgical Devices, Systems, and Related Methods”), U.S. patent application Ser. No. 
15/687,113 (filed on Aug. 25, 2017 and entitled “Quick-Release End Effector Tool Interface”), U.S. patent application Ser. No. 15/691,087 (filed on Aug. 30, 2017 and entitled “Robotic Device with Compact Joint Design and an Additional Degree of Freedom and Related Systems and Methods”), U.S. patent application Ser. No. 15/821,169 (filed on Nov. 22, 2017 and entitled “Gross Positioning Device and Related Systems and Methods”), U.S. patent application Ser. No. 15/826,166 (filed on Nov. 29, 2017 and entitled “User controller with user presence detection and related systems and methods”), U.S. patent application Ser. No. 15/842,230 (filed on Dec. 14, 2017 and entitled “Releasable Attachment Device for Coupling to Medical Devices and Related Systems and Methods”), U.S. patent application Ser. No. 16/144,807 (filed on Sep. 27, 2018 and entitled “Robotic Surgical Devices with Tracking Camera Technology and Related Systems and Methods”), U.S. patent application Ser. No. 16/241,263 (filed on Jan. 7, 2019 and entitled “Single-Manipulator Robotic Device With Compact Joint Design and Related Systems and Methods”), U.S. Pat. No. 7,492,116 (filed on Oct. 31, 2007 and entitled “Robot for Surgical Applications”), U.S. Pat. No. 7,772,796 (filed on Apr. 3, 2007 and entitled “Robot for Surgical Applications”), and U.S. Pat. No. 8,179,073 (issued on May 15, 2011, and entitled “Robotic Devices with Agent Delivery Components and Related Methods”), all of which are hereby incorporated herein by reference in their entireties.


Certain device and system implementations disclosed in the applications listed above can be positioned within a body cavity of a patient, or a portion of the device can be placed within the body cavity, in combination with a support component similar to those disclosed herein. An “in vivo device” as used herein means any device that can be positioned, operated, or controlled at least in part by a user while being positioned within a body cavity of a patient, including any device that is coupled to a support component such as a rod or other such component that is disposed through an opening or orifice of the body cavity, also including any device positioned substantially against or adjacent to a wall of a body cavity of a patient, further including any such device that is internally actuated (having no external source of motive force), and additionally including any device that may be used laparoscopically or endoscopically during a surgical procedure. As used herein, the terms “robot,” and “robotic device” shall refer to any device that can perform a task either automatically or in response to a command.


Certain embodiments provide for insertion of the present invention into the cavity while maintaining sufficient insufflation of the cavity. Further embodiments minimize the physical contact of the surgeon or surgical users with the present invention during the insertion process. Other implementations enhance the safety of the insertion process for the patient and the present invention. For example, some embodiments provide visualization of the present invention as it is being inserted into the patient's cavity to ensure that no damaging contact occurs between the system/device and the patient. In addition, certain embodiments allow for minimization of the incision size/length. Other implementations include devices that can be inserted into the body via an incision or a natural orifice. Further implementations reduce the complexity of the access/insertion procedure and/or the steps required for the procedure. Other embodiments relate to devices that have minimal profiles, minimal size, or are generally minimal in function and appearance to enhance ease of handling and use.


As in manual laparoscopic procedures, a known insufflation system can be used to pump sterile carbon dioxide (or other gas) into the patient's abdominal cavity. This lifts the abdominal wall from the organs and creates space for the robot. In certain implementations, the system has no direct interface with the insufflation system. Alternatively, the system can have a direct interface to the insufflation system.


In certain implementations in which the device is inserted through an insertion port, the insertion port is a known, commercially-available flexible membrane placed transabdominally to seal and protect the abdominal incision. This off-the-shelf component is the same device or substantially the same device that is used in substantially the same way for Hand-Assisted Laparoscopic Surgery (HALS). The only difference is that the arms of the robotic device according to the various embodiments herein are inserted into the abdominal cavity through the insertion port rather than the surgeon's hand. The robotic device body seals against the insertion port when it is positioned therethrough, thereby maintaining insufflation pressure. The port is single-use and disposable. Alternatively, any known port can be used. In further alternatives, the device can be inserted through an incision without a port or through a natural orifice.


Certain implementations disclosed herein relate to “combination” or “modular” medical devices that can be assembled in a variety of configurations. For purposes of this application, both “combination device” and “modular device” shall mean any medical device having modular or interchangeable components that can be arranged in a variety of different configurations.


Certain embodiments disclosed or contemplated herein can be used for colon resection, a surgical procedure performed to treat patients with lower gastrointestinal diseases such as diverticulitis, Crohn's disease, inflammatory bowel disease and colon cancer. Approximately two-thirds of known colon resection procedures are performed via a completely open surgical procedure involving an 8- to 12-inch incision and up to six weeks of recovery time. Because of the complicated nature of the procedure, existing robot-assisted surgical devices are rarely used for colon resection surgeries, and manual laparoscopic approaches are only used in one-third of cases. In contrast, the various implementations disclosed herein can be used in a minimally invasive approach to a variety of procedures that are typically performed ‘open’ by known technologies, with the potential to improve clinical outcomes and health care costs. Further, the various implementations disclosed herein can be used for any laparoscopic surgical procedure in place of the known mainframe-like laparoscopic surgical robots that reach into the body from outside the patient. That is, the less-invasive robotic systems, methods, and devices disclosed herein feature small, self-contained surgical devices that are inserted in their entireties through a single incision in the patient's abdomen. Designed to utilize existing tools and techniques familiar to surgeons, the devices disclosed herein will not require a dedicated operating room or specialized infrastructure, and, because of their much smaller size, are expected to be significantly less expensive than existing robotic alternatives for laparoscopic surgery. Due to these technological advances, the various embodiments herein could enable a minimally invasive approach to procedures performed in open surgery today.



FIG. 1 depicts one embodiment of a robotic surgical system 10 having several components that will be described in additional detail below. The components of the various system implementations disclosed or contemplated herein can include an external control console 16 and a robotic device 12 having a removable camera 14 as will also be described in additional detail below. In accordance with the implementation of FIG. 1, the robotic device 12 is shown mounted to the operating table 18 via a known, commercially available support arm 20. The system 10 can be, in certain implementations, operated by the surgeon 22 at the console 16 and one surgical assistant 24 positioned at the operating table 18. Alternatively, one surgeon 22 can operate the entire system 10. In a further alternative, three or more people can be involved in the operation of the system 10. It is further understood that the surgeon (or user) 22 can be located at a remote location in relation to the operating table 18 such that the surgeon 22 can be in a different city or country or on a different continent from the patient on the operating table 18.


In this specific implementation, the robotic device 12 and the camera 14 are both connected to the surgeon console 16 via cables: a device cable 24A and a camera cable 24B, both of which will be described in additional detail below. Alternatively, any connection configuration can be used. In certain implementations, the system can also interact with other devices during use, such as an electrosurgical generator, an insertion port, and auxiliary monitors.



FIG. 2 depicts one exemplary implementation of a robotic device 40 that can be incorporated into the exemplary system 10 discussed above or any other system disclosed or contemplated herein. The device 40 has a body (or “torso”) 42 having a distal end 42A and proximal end 42B, with the imaging device (or “camera”) 44 disposed therethrough, as mentioned above and as will be described in additional detail below. Briefly, the robotic device 40 has two robotic arms 46, 48 operably coupled thereto and the camera 44 is removably positionable through the body 42 and disposed between the two arms 46, 48. That is, the device 40 has a first (or “right”) arm 46 and a second (or “left”) arm 48, both of which are operably coupled to the device 40 as discussed in additional detail below. In this embodiment, the body 42 of the device 40 as shown has an enclosure (also referred to as a “cover” or “casing”) 52 such that the internal components and lumens of the body 42 are disposed within the enclosure 52. The device body 42 has two rotatable cylindrical bodies (also referred to as “shoulders” or “turrets”) 54A, 54B: a first (or “right”) shoulder 54A and a second (or “left”) shoulder 54B. Each arm 46, 48 in this implementation also has an upper arm (also referred to herein as an “inner arm,” “inner arm assembly,” “inner link,” “inner link assembly,” “upper arm assembly,” “first link,” or “first link assembly”) 46A, 48A, and a forearm (also referred to herein as an “outer arm,” “outer arm assembly,” “outer link,” “outer link assembly,” “forearm assembly,” “second link,” or “second link assembly”) 46B, 48B. The right upper arm 46A is operably coupled to the right shoulder 54A of the body 42 at the right shoulder joint 46C and the left upper arm 48A is operably coupled to the left shoulder 54B of the body 42 at the left shoulder joint 48C. Further, for each arm 46, 48, the forearm 46B, 48B is rotatably coupled to the upper arm 46A, 48A at the elbow joint 46D, 48D. In various embodiments, the forearms 46B, 48B are configured to receive various removable, interchangeable end effectors 56A, 56B.


The end effectors 56A, 56B on the distal end of the arms 46, 48 can be various tools 56A, 56B (scissors, graspers, needle drivers and the like), as will be described in additional detail below. In certain implementations, the tools 56A, 56B are designed to be removable, including in some instances by a small twist of the tool knob that couples the end effector 56A, 56B to the arm 46, 48. In certain implementations, at least two single-use, interchangeable, disposable surgical end effectors can be used with any of the robotic device embodiments herein (including device 40). Such end effectors can include, but are not limited to, a fenestrated grasper capable of bi-polar cautery, scissors that deliver mono-polar cautery, a hook that delivers mono-polar cautery, and a left/right needle driver set. The tools can be selected for the specific surgical task. Certain forearm and end effector configurations that allow for the removability and interchangeability of the end effectors are disclosed in detail in U.S. application Ser. No. 14/853,477, which is incorporated by reference above. Further, it is understood that any known forearm and end effector combinations can be used in any of the robotic device embodiments disclosed or contemplated herein.


In various implementations, the body 42 and each of the links of the arms 46, 48 can contain a variety of actuators or motors. In certain implementations, the body 42 has no motors disposed therein, while there is at least one motor in each of the arms 46, 48. In one embodiment, any of the motors discussed and depicted herein can be brushed or brushless motors. Further, the motors can be, for example, 6 mm, 8 mm, or 10 mm diameter motors. Alternatively, any known size that can be integrated into a medical device can be used. In a further alternative, the actuators can be any known actuators used in medical devices to actuate movement or action of a component. Examples of motors that could be used for the motors described herein include the EC 10 BLDC+GP10A Planetary Gearhead, EC 8 BLDC+GP8A Planetary Gearhead, or EC 6 BLDC+GP6A Planetary Gearhead, all of which are commercially available from Maxon Motors, located in Fall River, MA. There are many ways to actuate these motions, such as with DC motors, AC motors, permanent magnet DC motors, brushless motors, pneumatics, cables to remote motors, hydraulics, and the like. As such, the actuation source can be at least one motor, hydraulic pressure source, pneumatic pressure source, or any other actuation source disposed remotely from or proximally to the device 40 such that an appropriate coupling or transmission mechanism (such as at least one cable, at least one hydraulic transmission hose, at least one pneumatic transmission hose, or any other transmission mechanism) is disposed through the body 42.
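

By way of a non-limiting illustration only, the following Python sketch models the idea described above that a joint can be driven by an on-board motor or by a remote source acting through a transmission, behind the same interface. All class names, the gear ratio, and the cable-efficiency value are assumptions introduced here for illustration and are not taken from the disclosed embodiments.

```python
# Illustrative sketch only: a minimal abstraction for interchangeable
# actuation sources (an on-board motor, or a remote source driving the
# joint through a transmission). Names and values are hypothetical.
from abc import ABC, abstractmethod


class ActuationSource(ABC):
    """Anything that can apply effort to a joint."""

    @abstractmethod
    def apply(self, effort: float) -> None:
        ...


class BrushlessMotor(ActuationSource):
    """Stand-in for a small BLDC motor with a planetary gearhead."""

    def __init__(self, gear_ratio: float):
        self.gear_ratio = gear_ratio  # assumed reduction ratio

    def apply(self, effort: float) -> None:
        # In a real device this would command a motor driver; here we
        # just report the post-gearhead output in arbitrary units.
        print(f"motor output torque: {effort * self.gear_ratio:.2f}")


class CableTransmission(ActuationSource):
    """A remote source whose effort reaches the joint through a cable run."""

    def __init__(self, remote: ActuationSource, efficiency: float = 0.85):
        self.remote = remote
        self.efficiency = efficiency  # assumed cable friction losses

    def apply(self, effort: float) -> None:
        # Ask the remote source for more effort to compensate for losses.
        self.remote.apply(effort / self.efficiency)


if __name__ == "__main__":
    # A joint can be driven directly or through a transmission without
    # the surrounding control code changing.
    direct = BrushlessMotor(gear_ratio=16.0)
    routed = CableTransmission(BrushlessMotor(gear_ratio=16.0))
    for source in (direct, routed):
        source.apply(0.05)
```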


In one embodiment, the various joints discussed above in accordance with any of the embodiments disclosed or contemplated herein can be driven by electrical motors disposed within the device and, in some implementations, near each joint. Other embodiments include the incorporation of pneumatic or hydraulic actuators in any of the device implementations herein. In additional alternative embodiments, the driving actuators are disposed outside the device and/or body cavity and power transmission mechanisms are provided to transmit the energy from the external source to the various joints of any device herein. Such a transmission mechanism could, for example, take the form of gears, drive shafts, cables, pulleys, or other known mechanisms, or any combination thereof.



FIGS. 3A and 3B depict one embodiment of the robotic device 40 with the camera assembly 44 removed, according to one implementation. That is, FIG. 3A depicts the device 40 without the camera positioned through the body 42, and FIG. 3B depicts one embodiment of the camera 44. In certain implementations, and as best shown in FIG. 3B, the camera 44 has a handle (or “camera body”) 60 with an elongate shaft 62 coupled thereto such that the shaft 62 extends distally from the distal end of the handle 60. In addition, the camera 44 has a steerable tip 64 coupled to the distal end of the shaft 62 via a flexible section 68 such that the steerability allows the user to adjust the viewing direction, as will be discussed in further detail below. Further, the tip 64 also includes a camera imager 66 at the distal end of the tip 64 that is configured to capture the desired images. Further, the tip 64 in certain implementations has an illumination light (not shown) disposed thereon, such that the light can illuminate the objects in the field of view. In one specific implementation, the camera 44 provides 1080p 60 Hz. digital video. Alternatively, the camera 44 can provide any known video quality.


As best shown in FIGS. 3A and 3C, the camera assembly 44 can be inserted into the body 42 of the robotic device 40 by positioning the distal end of the shaft 62 through a lumen (not shown) defined through the body 42 of the robotic device 40 as shown by the arrow A in FIG. 3A. As will be described in further detail below, certain implementations of the device 40 include a removable nest (or “dock”) (not shown) disposed near the proximal end of the body 42 that includes a seal (not shown) that operates to ensure that the patient's cavity remains insufflated. When the shaft 62 is inserted through the lumen of the body 42 as desired, according to certain embodiments as best shown in FIGS. 2 and 3C, the distal end of the shaft 62, including the flexible section 68 and the steerable tip 64 (containing the imager 66) extends out of an opening at the distal end of the body 42 such that the tip 64 is positioned between the two arms 46, 48 in the surgical environment as shown. Thus, the imager 66 is positioned to capture the view between the two arms 46, 48 and the steerable tip 64 can be actuated to provide views of the surgical tools and surgical target. That is, the tip 64 can be moved such that the surgical tools and/or surgical target are captured within the field of view of the imager 66. It is understood that this camera 44 embodiment and any other such camera embodiment disclosed or contemplated herein can be used with any similar robotic device having a camera lumen defined therethrough.


In various implementations, as best shown in FIG. 3C, the steerable tip 64 and therefore also the camera imager 66 can be steered or otherwise moved in two independent directions in relation to the shaft 62 at a flexible section 68 disposed between the shaft 62 and the steerable tip 64 to change the direction of view. That is, FIG. 3C shows that the steerable tip 64 can be robotically articulated in the yaw direction (left and right in relation to the device 40) as represented by arrow B or pitch direction (up and down in relation to the device 40) as represented by arrow C. In various implementations, the camera 44 can be controlled via a console (such as console 16 discussed above, for example) or via control buttons (not shown) as will be discussed in additional detail below. In one embodiment, the features and operation (including articulation) of the steerable tip are substantially similar to the steerable tip as described in U.S. application Ser. Nos. 14/334,383 and 15/227,813, both of which are incorporated by reference above. Alternatively, any known robotic articulation mechanism for cameras or similar apparatuses can be incorporated into any camera embodiment utilized in any device or system disclosed or contemplated herein.
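

As a rough, non-limiting illustration of how such two-axis articulation can be commanded, the Python sketch below maps a desired tip pitch and yaw to antagonistic cable-length changes under a simple constant-curvature assumption. The cable radius, the angle limits, and the model itself are placeholders introduced here for illustration; they are not values or methods specified in the embodiments above.

```python
import math

# Minimal sketch: commanded pitch/yaw -> per-cable length changes for a
# cable-driven flexible section. All constants are assumptions.

CABLE_RADIUS_M = 0.003            # distance of each cable from the neutral axis (assumed)
MAX_ANGLE_RAD = math.radians(80)  # articulation limit per axis (assumed)


def cable_displacements(pitch_rad: float, yaw_rad: float) -> dict:
    """Return per-cable length changes in meters; positive means 'pull'."""
    pitch = max(-MAX_ANGLE_RAD, min(MAX_ANGLE_RAD, pitch_rad))
    yaw = max(-MAX_ANGLE_RAD, min(MAX_ANGLE_RAD, yaw_rad))
    # Constant-curvature approximation: a cable offset by r from the
    # neutral axis changes length by r * bend_angle, and the opposing
    # cable changes by the same amount in the other direction.
    return {
        "up": CABLE_RADIUS_M * pitch,
        "down": -CABLE_RADIUS_M * pitch,
        "right": CABLE_RADIUS_M * yaw,
        "left": -CABLE_RADIUS_M * yaw,
    }


if __name__ == "__main__":
    # Tilt the tip 30 degrees up and 10 degrees right.
    for cable, delta in cable_displacements(math.radians(30), math.radians(10)).items():
        print(f"{cable:>5}: {delta * 1000:+.2f} mm")
```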


In various implementations, the camera 44 can be re-sterilized for multiple uses. In one specific embodiment, the camera 44 can be reused up to one hundred times or more. Alternatively, it is understood that any known endoscopic camera that can fit through a device body according to any implementation herein can be utilized.


Focusing now on the robotic arms 82, 84 of a robotic device 80 according to one embodiment as shown in FIGS. 4A-4B, each robot arm 82, 84 in this implementation has six degrees of freedom, including the open/close function of the tool, as best shown in FIG. 4A. For purposes of this discussion, the various degrees of freedom will be discussed in the context of the right arm 82 as shown in FIG. 4A, but it is understood that both arms have the same degrees of freedom. The right shoulder joint 86 is approximately a spherical joint similar to a human shoulder. The upper arm 88 can yaw (J1), pitch (J2), and roll about the shoulder joint 86 (J3). These first three axes of rotation roughly intersect at the shoulder joint 86. The robot elbow 90 (J4) allows rotation of the forearm 92 with respect to the upper arm 88. Finally, the end effector 94 can roll (J5) about the long axis of the tool 94, and some tools that can be replaceably attached to the forearm 92 have an open/close actuation function. On the other hand, it is understood that a hook cautery tool, for example, does not open/close.
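

For illustration only, the sketch below enumerates the per-arm degrees of freedom described above (J1 shoulder yaw, J2 shoulder pitch, J3 shoulder roll, J4 elbow, J5 tool roll, plus the tool open/close function counted as the sixth degree of freedom) and clamps commanded values to joint limits. The limit values are placeholders and are not taken from the embodiments above.

```python
from dataclasses import dataclass
from enum import Enum

# Minimal sketch of the per-arm joint set described above. Limit values
# are placeholders for illustration only.


class Joint(Enum):
    SHOULDER_YAW = "J1"
    SHOULDER_PITCH = "J2"
    SHOULDER_ROLL = "J3"
    ELBOW = "J4"
    TOOL_ROLL = "J5"
    TOOL_OPEN_CLOSE = "open/close"  # sixth degree of freedom, unlabeled in the text


@dataclass
class JointLimit:
    lo: float  # radians (or normalized jaw travel for open/close)
    hi: float


LIMITS = {  # placeholder limits (assumed)
    Joint.SHOULDER_YAW: JointLimit(-1.57, 1.57),
    Joint.SHOULDER_PITCH: JointLimit(-1.57, 1.57),
    Joint.SHOULDER_ROLL: JointLimit(-3.14, 3.14),
    Joint.ELBOW: JointLimit(0.0, 2.6),
    Joint.TOOL_ROLL: JointLimit(-3.14, 3.14),
    Joint.TOOL_OPEN_CLOSE: JointLimit(0.0, 1.0),
}


def clamp_command(joint: Joint, value: float) -> float:
    """Clamp a commanded joint value to its limit range."""
    limit = LIMITS[joint]
    return max(limit.lo, min(limit.hi, value))


if __name__ == "__main__":
    print(clamp_command(Joint.ELBOW, 3.0))  # clipped to the placeholder limit of 2.6
```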


The robotic arms 82, 84 in this implementation have significant dexterity. As shown in FIG. 4B, the six degrees of freedom described above allow the arms 82, 84 to reach into the confined spaces of the abdominal cavity. FIG. 4B schematically depicts the entire workspace 110 of the arms 82, 84 of the robotic device 80, according to certain implementations. In these implementations, “workspace” 110 means the space 110 around the robotic device 80 in which either arm 82, 84 (and/or end effector thereof) can move, access, and perform its function within that space 110. According to one embodiment, the arms 82, 84 herein are substantially the same as the arms, the degrees of freedom, and the overall workspace and the individual workspaces of each arm as disclosed in U.S. Published Application 2019/0090965, which is hereby incorporated herein by reference in its entirety.



FIG. 4B depicts a perspective view of the device 80 and further schematically shows the collective workspace 110 of the first and second arms 82, 84. Note that each arm 82, 84 has a range of motion and corresponding workspace that extends from the front 112 of the device 80 to the back 114 of the device 80. Thus, each arm 82, 84 moves equally to the front 112 and the back 114, through about 180° of space relative to the axis of the device body 80A. This overall workspace 110, which constitutes an intersecting or collective workspace 110 based on the separate workspaces of the two arms 82, 84, allows the robotic device 80 to work to the front 112 and back 114 equally well without having to reposition the body 80A. Thus, the workspace 110 represents a region that is reachable by both the left and right arms 82, 84 and is defined as the bi-manual robot workspace 110. The surgeon will have full robot dexterity when working in this bi-manual region 110.


The bi-manual workspace 110 is approximated by an ellipse that is rotated 180 degrees about the shoulder pitch joint (J2 in FIG. 4A) and is shown in FIG. 4B. For one specific design implementation, the ellipse is approximately 4.5″ (11.5 cm) on the long axis and 3.25″ (8.25 cm) on the minor axis. The bi-manual workspace 110 extends from in front of the robotic device 80 to below the device 80 and also extends behind the device 80. This dexterity of the robotic arms 82, 84 allows the surgeon to operate the arms 82, 84 equally well anywhere inside this bi-manual workspace 110.
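

The bi-manual workspace described above can be approximated in software, for example to check whether a commanded target lies within the reachable region. The Python sketch below treats the swept ellipse as half of an ellipsoid of revolution centered at the shoulder pitch joint; the coordinate frame, the centering, and the reading of the quoted dimensions as full axis lengths are assumptions introduced here for illustration, not a geometry specified in the patent.

```python
import math

# Rough geometric sketch of the bi-manual workspace: an ellipse of about
# 4.5 in by 3.25 in revolved 180 degrees about the shoulder pitch axis,
# modeled here as half of an ellipsoid of revolution. Frame and axis
# assignments are assumptions for illustration.

LONG_SEMI_AXIS_M = 0.115 / 2    # 4.5 in (11.5 cm) long axis, read as a full axis length (assumed)
MINOR_SEMI_AXIS_M = 0.0825 / 2  # 3.25 in (8.25 cm) minor axis, read as a full axis length (assumed)


def in_bimanual_workspace(x: float, y: float, z: float) -> bool:
    """Check a point against the simplified workspace model.

    Frame (assumed): origin at the shoulder pitch joint, y along the
    pitch (left-right) axis, x forward, z up. The 180-degree sweep is
    modeled by excluding points above the device (z > 0).
    """
    if z > 0:
        return False
    r = math.hypot(x, z)  # radial distance from the pitch axis
    return (r / LONG_SEMI_AXIS_M) ** 2 + (y / MINOR_SEMI_AXIS_M) ** 2 <= 1.0


if __name__ == "__main__":
    print(in_bimanual_workspace(0.03, 0.01, -0.02))  # a point in front of and below the shoulder
    print(in_bimanual_workspace(0.10, 0.0, 0.0))     # too far forward for this model
```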


As can be seen in FIG. 4A, the arms 82, 84 in this exemplary implementation have a molded silicone protective sleeve 96 that is disposed over the arms 82, 84 and shoulder turrets 98A, 98B. In one embodiment, the sleeve 96 is fluidically sealed such that it protects the arms 82, 84 and the robotic device 80 from fluid ingress and also helps to simplify post-surgery cleaning and sterilization. The fluidically sealed sleeve 96 is substantially similar to any of the sleeve embodiments disclosed or contemplated in U.S. applications Ser. Nos. 14/334,383, 15/227,813, and 16/144,807, all of which are incorporated by reference above.


Additional features and components of the robotic device include those disclosed in U.S. applications Ser. Nos. 14/334,383, 15/227,813, and 16/144,807, all of which are incorporated by reference above, along with all of the other patents and applications incorporated by reference above. It is understood that any robotic device embodiment disclosed or contemplated herein (including, for example, the robotic devices 12, 40, 80 discussed above), can be incorporated into not only the system embodiments disclosed herein, but any other known robotic surgical system. It is further understood that, according to certain implementations, any robotic device disclosed or contemplated herein can be configured such that it can be cleaned and sterilized for multiple uses. In some embodiments, the device can be reused up to ten times or more.


As shown in FIGS. 5A and 5B (and as discussed above with respect to FIG. 1), the device 12 is attached to the operating table 18 via support arm 20. More specifically, the support arm 20 is coupled at one end to the operating table 18 as shown. Further, the support arm 20 has a robotic device body clamp 116 at the other end of the arm 20 such that the clamp 116 can be coupled to a clamp interface ring 118 defined in an outer surface of the enclosure 52 of the device body 42, as best shown in FIG. 5B. Certain embodiments of the clamp 116 are disclosed in additional detail in U.S. Published Application 2017/0035526, which is hereby incorporated herein by reference in its entirety. In addition, the support arm 20 has an adjustment knob 119 that allows for the arm 20 to be adjusted such that the attached device 12 can be repositioned to numerous different positions as needed. As such, the device 12 is adjustably attached to the table 18 via the arm 20. Alternatively, any known support arm can be used to support the various robotic device embodiments herein. Thus, in addition to the positioning of the device 12 and arms 46, 48 as discussed above, in various implementations, the robotic device 12 can reach any area of the surgical cavity because it can be easily repositioned during the procedure via “gross positioning.” That is, the device 12 can be moved quickly, in a matter of seconds, by manually adjusting the external support arm 20 and robot clamp 116. The combination of gross positioning of the robotic device 12 and the dexterity of the robot arms 46, 48 as described above allows the surgeon to place the device 12 so it can work anywhere in the target cavity of the patient with, in certain embodiments, the arms 46, 48 well triangulated for the given procedure, as discussed elsewhere herein.


The various camera embodiments herein (including cameras 14 and 44, for example) can, in certain implementations, be coordinated with the device to which they are coupled to create coordinated triangulation between the camera and the arms and end effectors for any configuration, positioning, and use of the device. Further, the steerable tip of any such camera can be robotically articulated so as to reposition the field of view, either automatically or via control by the surgeon using the system console. That is, the camera articulates to ensure the surgeon can view all possible locations of the robotic arms as well as the desired areas of the surgical theater. Further, as the robotic arms move, the steerable camera tip can be moved in coordination with the arm movements using its active joints to view the entire robot workspace. In certain implementations, the joints of the camera are actively controlled using motors and sensors and a processor (and, in some implementations, a control algorithm contained therein). In these implementations, the processor allows for automated and/or semi-automated positioning and re-positioning of the camera about the pitch (α) and/or yaw (β) rotations relative to the robotic device. It is understood that the various embodiments of systems and devices having such a coordination between the camera and the device (and arms) and the resulting features thereof are disclosed in detail in U.S. Published Application 2019/0090965, which is incorporated by reference above.
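

As a hedged illustration of the coordination idea only (not the control algorithm of the referenced application), the sketch below computes the camera tip pitch and yaw that would aim the view axis at the midpoint between the two end effectors. The coordinate frame, the aiming rule, and all names are assumptions introduced here for illustration.

```python
import math

# Minimal sketch: steer the camera tip so its view stays centered
# between the two end effectors. Frame and aiming rule are assumed.


def camera_pitch_yaw_toward_midpoint(left_tip, right_tip):
    """Return (pitch, yaw) in radians that point the camera's view axis
    at the midpoint of the two end-effector positions.

    Positions are (x, y, z) in a frame at the camera's flexible section
    (assumed): x forward along the camera shaft, y to the right, z up.
    """
    mx = (left_tip[0] + right_tip[0]) / 2.0
    my = (left_tip[1] + right_tip[1]) / 2.0
    mz = (left_tip[2] + right_tip[2]) / 2.0
    yaw = math.atan2(my, mx)                    # left/right rotation of the tip
    pitch = math.atan2(mz, math.hypot(mx, my))  # up/down rotation of the tip
    return pitch, yaw


if __name__ == "__main__":
    # End effectors slightly below and to either side of the camera axis.
    pitch, yaw = camera_pitch_yaw_toward_midpoint((0.08, -0.03, -0.04), (0.08, 0.03, -0.05))
    print(f"pitch: {math.degrees(pitch):.1f} deg, yaw: {math.degrees(yaw):.1f} deg")
```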


Alternatively, in certain implementations as shown in FIG. 6, the camera 44 can be removed from the robotic device 40 and positioned through another, known laparoscopic port 120 typically used with a standard manual laparoscope. As such, in this embodiment, the device 40 is disposed through a main port (also known as an “insertion port”) 122 and the camera 44 is positioned through the known laparoscopic port 120 as shown. It is understood that this arrangement may be useful to visualize the robotic device 40 to ensure safe insertion and extraction via the main port 122. According to various embodiments, the camera 44 can also be removed from the robotic device 40 so the optics can be cleaned, the camera 44 can be repaired, or for any other reason in which it is beneficial to remove the camera 44. It is understood that while the device 40 and camera 44 are depicted and discussed herein, any device or camera according to any implementation disclosed or contemplated herein can also be used in a similar arrangement and any such camera can also be removed from the device for any reason as discussed herein.


It is understood that the insertion port 122 also can represent the port 122 through which any robotic device embodiment disclosed or contemplated herein is positioned for any procedure as contemplated herein (including those procedures in which the camera 44 is disposed through the device 40). In one embodiment, the insertion port 122 can be a single use commercially available flexible membrane disposed transabdominally to seal and protect the abdominal incision and allow for positioning the body 42 of the device 40 therethrough. In specific implementations, the insertion port 122 is the same device used in Hand-Assisted Laparoscopic Surgery (HALS), including the exemplary port 122 depicted in FIG. 6, which, according to one embodiment, is a GelPort™ 122. The device body 42 seals against the insertion port 122, thereby establishing a fluidic seal and thus maintaining insufflation pressure. Alternatively, any known insertion port (or incision) that is configured to receive a device similar to that disclosed herein can be used.


Returning to the overall system embodiments, such as the system 10 depicted in FIG. 1 and discussed above, the robotic device (such as device 12 or device 40) can be piloted, in accordance with certain embodiments, via the surgeon console 16 as shown in FIGS. 1 and 7. This exemplary implementation of the surgeon console 16 contains a main processor (not shown) that performs robot control functions, system monitoring, and any other known processes or functionalities necessary or beneficial for controlling a system according to any embodiment herein. In these implementations, the console 16 also has a real-time display 130 that can display real-time images of the surgical environment using the output of the camera (such as camera 14 or 44). In one embodiment, the display 130 is a high definition display 130. Alternatively, any known display can be used. In addition, the console 16 can also have a touchscreen interface 132 that can be used to control several functions of the console 16 and device 12 (or device 40). In certain implementations, the touch screen 132 can also display various types of information about the state of the robotic device 12 (or 40) or any other component of the system 10. Alternatively, any known console can be used with the various implementations of the system disclosed or contemplated herein.


The console 16 in this implementation also has right and left hand controllers (or "input devices") 134A, 134B that can be used to control various aspects of the device 12, 40 and/or camera 14, 44, including movement thereof. The surgeon can interface with the input devices 134A, 134B using the surgeon's hands such that the input devices 134A, 134B can track the movement of the surgeon's hands. In certain embodiments, each of the input devices 134A, 134B can have a surgeon presence sensor to track whether the surgeon's hands are properly engaged. In one exemplary embodiment, the surgeon presence sensor is any of the embodiments disclosed or contemplated in U.S. patent application Ser. No. 15/826,166, which is incorporated by reference above. In certain implementations, the input devices 134A, 134B can also be configured to provide haptic feedback by pushing on the surgeon's hands to indicate conditions such as workspace boundaries and collisions between the robot arms, as is described in detail in U.S. patent application Ser. No. 15/227,813 and U.S. Pat. No. 9,888,966, both of which are incorporated by reference above. According to various embodiments, the input devices 134A, 134B can also control open/close functions of the robot's end effectors.
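
As a simplified, non-limiting illustration of boundary-type haptic feedback, the following sketch computes a restoring force when the commanded hand position leaves a box-shaped workspace. The box limits, stiffness value, and spring-like force model are hypothetical placeholders and are not the haptic algorithm of the incorporated applications.

```python
def boundary_feedback_force(position, lower, upper, stiffness=200.0):
    """Return a per-axis force (in newtons) that pushes the input device
    handle back inside a box-shaped workspace when the commanded position
    exceeds the box limits (all quantities in meters)."""
    force = []
    for p, lo, hi in zip(position, lower, upper):
        if p < lo:
            force.append(stiffness * (lo - p))   # push back toward the boundary
        elif p > hi:
            force.append(stiffness * (hi - p))   # negative force pushes back
        else:
            force.append(0.0)                    # no feedback inside the workspace
    return force

# Example: hand position slightly past the +x limit of an assumed 10 cm cube;
# the first component of the returned force pushes the hand back inside.
f = boundary_feedback_force((0.06, 0.0, 0.02),
                            (-0.05, -0.05, -0.05),
                            (0.05, 0.05, 0.05))
```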


In accordance with some implementations, the surgeon console 16 can also have foot pedals 136 that are configured to be operable by the surgeon's feet to control various robot functions, including, for example, clutching, camera movements, and various electrocautery functions. Alternatively, the pedals 136 can be used to operate any known functions of the robotic device 12, 40 or any other component of the system 10. In a further alternative, any other input devices on the console 16 can be used to control those various functions.


The surgeon console 16 according to certain implementations can be configured such that it can be operated by a surgeon positioned in either a sitting position (similar to Intuitive's da Vinci console) or a standing position (similar to manual laparoscopy). The console 16 in this exemplary embodiment is designed to be easily transported between operating rooms using castors 138 and a transport handle 140. In certain embodiments, the height of the console 16 is adjustable.


Other console and system embodiments that can be incorporated into any system disclosed or contemplated herein are disclosed in U.S. applications Ser. Nos. 14/334,383, 15/227,813, and 16/144,807, all of which are incorporated by reference above, along with any of the other relevant patents and applications incorporated by reference above. The various components in said applications include companion carts, an interface pod, an electrosurgical generator, and the appropriate cables and connections, for example. Further, it is understood that any other known console or controller can be utilized with any robotic device or system disclosed or contemplated herein.


Another camera embodiment is depicted in FIGS. 8A and 8B (with FIG. 8A depicting a side view and FIG. 8B depicting a front view), in which the camera assembly 150 has a handle (or “camera body”) 152 with an elongate shaft 154 coupled thereto such that the shaft 154 extends distally from the distal end of the handle 152. In addition, the camera 150 has a steerable tip 156 coupled to the distal end of the shaft 154 via a flexible section 158 that couples the tip 156 to the shaft 154 such that the steerability allows the user to adjust the viewing direction, as will be discussed in further detail below. The tip 156 includes a camera imager (also referred to as an “imaging sensor”) 160 at the distal end of the tip 156 that is configured to capture the desired images, along with optics and support electronics (not shown). In this embodiment, the camera 150 also has light fibers (not shown) that are disposed through the shaft 154, flexible section 158, and tip 156 such that the light fibers provide light output at the tip 156 so as to light the surgical target for imaging. In addition, the assembly 150 has a cable 162 that is coupled to the handle 152 and extends therefrom to an external controller (such as the console 16 discussed above or any other controller) such that the cable 162 can provide electrical signals to and from the camera 150, including a video signal and any power and other signals or information necessary to operate the camera 150.


The steerable tip 156 can be robotically articulated in two independent directions, according to one embodiment. More specifically, as discussed above in additional detail with respect to the camera assembly 44 embodiment and depicted in FIG. 3C, the steerable tip 156 can be articulated to move in both the pitch and yaw directions.



FIGS. 9A, 9B, and 9C are expanded views of the camera handle 152 according to one embodiment. In this exemplary implementation, the camera handle 152 has an outer enclosure (also referred to as "exterior housing," "housing," or "enclosure") 180, as best shown in FIG. 9A, with an inner housing 182 disposed within the enclosure 180, as best shown in FIG. 9B. In certain embodiments, the inner housing 182 is an inner heat distribution sink 182 that can function to distribute heat inside the camera handle exterior housing 180. More specifically, the heat distribution sink 182 is a cylinder 182 or structure 182 of any appropriate shape that is disposed within the enclosure 180 and around the internal components of the handle 152. The sink 182 is made of a material that distributes heat across the structure 182 and has sufficient thermal capacity to absorb heat. In one specific embodiment, the material is aluminum. Alternatively, the sink 182 can be made of any known heat sink material. The outer enclosure 180 helps to enclose the heat sink 182 and the internal mechanical and electrical components of the handle 152.


In addition, the handle 152 according to various implementations also has a cable strain relief assembly 184 extending from the proximal end of the handle 152 such that the cable 162 is disposed through the relief assembly 184. In one embodiment, the relief assembly 184 is a tube 184 or other elongate structure 184 through which the cable 162 can be disposed such that the relief assembly 184 has a reinforced structure to reduce the strain applied on the cable 162 when force is applied to the cable 162. Alternatively, the cable strain relief assembly 184 can be any known structure for providing strain relief.


The camera handle 152 also has, in this specific implementation, two O-ring seals 186A, 186B disposed around the cable connection 188. The seals 186A, 186B are disposed between the connection 188 and the housing 180, thereby establishing a fluidic seal therebetween and thus helping to prevent fluid ingress. Alternatively, there can be one seal, three seals, or any number of seals disposed around the connection 188. In a further alternative, the seal(s) need not be an O-ring seal, and instead can be any known type of seal.


The distal end of the camera handle 152 has a nose cone 190 extending therefrom, according to one embodiment. As best shown in FIG. 9C, the nose cone 190 is a distal structure 190 that extends distally from the handle 152. The nose cone 190 includes a nose cone tip 192 at the distal end of the nose cone 190 and a slot 194 defined in the nose cone proximal to the tip 192. In one embodiment, the nose cone slot 194 is a camera coupling mechanism acceptance slot 194 such that the camera latch 420 discussed in further detail below can be received in the slot 194. As such, the camera latch acceptance slot 194 can work in conjunction with the camera latch 420 to assist with attaching the camera to the elongate body 42 of the elongate device 40 (or any other device embodiment herein) via the nest structure 346 as also discussed in detail below.


In addition, the nose cone 190 has a protrusion (or “collar”) 196 that is disposed around the cone 190 and has both an outer O-ring seal 198A and an inner O-ring seal 198B attached to the protrusion 196 as shown. The outer O-ring seal 198A is disposed on an outer circumference of the collar 196 such that the seal 198A is disposed between the collar 196 and the camera body enclosure 180. Further, the inner O-ring seal 198B is disposed on an inner circumference of the collar 196 such that the seal 198B is disposed between the collar 196 and the cone 190. As such, the seals 198A, 198B help to establish a fluidic seal between the nose cone 190 and the camera housing 180, thereby creating a sealed camera handle 152 that helps to protect the internal mechanical and electrical components during use and cleaning.


In use, the camera assembly 150 is typically held by a user (such as a surgical assistant and/or surgeon) via the camera handle/body 152 when the camera 150 is moved around and inserted into or removed from the device body 42 (or any other device body embodiment herein) (or when used independently of any robotic device as discussed above). Further, the nose cone 190 of the handle 152 is sized and shaped to couple with and assist with attachment to the elongate body 42 (or any other device body embodiment herein) as discussed in additional detail below.


In accordance with one embodiment as shown in FIGS. 10A and 10B, the camera body/handle 152 can contain several internal components. FIG. 10A is a side view of the internal components of the handle 152 (the same view as that provided in FIGS. 8A, 9A, and 9B), while FIG. 10B is a front view (the same view as FIG. 8B). As best shown in FIG. 10A, the handle 152 in this implementation has an LED light source 220 with two light transmission fibers 222A, 222B coupled thereto that extend distally from the light source 220 through the interior of the handle 152 and through the camera shaft 154 to the distal tip of the shaft 154. Thus, the fibers 222A, 222B transmit light from the source 220 along the fibers 222A, 222B to the field of view of the imager 160 as discussed above. As such, the light from the light source 220 ultimately shines on the imaging target. It is understood that the light source 220 can be adjusted via a controller (such as the console 16), including via software in the controller or console 16, in a known fashion to adjust the light intensity produced therefrom. Alternatively, the handle 152 can have two LED light sources, with each separate source coupled to a different one of the two transmission fibers. In a further alternative, one transmission fiber can be used, or three or more fibers can be used. In yet another alternative, any configuration for transmitting light to the steerable tip 156 can be incorporated into any of the camera embodiments disclosed or contemplated herein.
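
As one hypothetical illustration of software-based intensity adjustment, the following sketch clamps a requested brightness percentage and converts it to a drive level for the LED source. The function name, drive range, and linear mapping are assumptions for illustration only; the actual control interface is implementation-specific.

```python
def led_drive_level(requested_percent, max_drive=255):
    """Map a requested brightness (0-100%) to an integer drive level.

    A linear mapping is assumed here for simplicity; a real controller might
    apply a calibration curve or limit output based on thermal conditions."""
    clamped = max(0.0, min(100.0, requested_percent))
    return round(clamped / 100.0 * max_drive)

# Example: a 75% brightness request maps to a drive level of 191 out of 255.
level = led_drive_level(75)
```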


In addition, the camera body 152 also has at least one circuit board 224 disposed therein. In one embodiment, the at least one circuit board 224 can be used to control the camera assembly 150 in any number of known ways. In some non-limiting examples, the at least one circuit board 224 can control the light source 220 or any facet of the lighting, the video signal, the image sensor 160, the actuation mechanisms 226A, 226B, or any sensors present anywhere on or in the camera 150. Alternatively, the camera body 152 can have two or more circuit boards for controlling various components and/or features of the camera 150. Further, the camera body 152 also has an actuator unit (also referred to herein as an "articulation unit") 226 that actuates the steerable tip 156. That is, the actuator unit 226 is operably coupled to the steerable tip 156 such that the actuator unit 226 actuates the tip 156 to move in both directions as described in additional detail herein: pitch and yaw. In this specific embodiment, the actuator unit 226 is made up of two actuator mechanisms: a first (or yaw or left/right) actuation mechanism 226A and a second (or pitch or north/south) actuation mechanism 226B. As will be described in additional detail below, each of the two mechanisms is coupled to the steerable tip 156 via cables that can be used to transfer the motive force used to move the steerable tip 156 as described herein. In one embodiment as best shown in FIG. 10B, each of the two mechanisms is coupled to the steerable tip 156 via a pair of cables, with the first mechanism 226A coupled to the tip 156 via cable pair 228 and the second mechanism 226B coupled to the tip 156 via cable pair 230. That is, as best shown in the side view of FIG. 10A, the first pair of cables 228A, 228B is coupled at the proximal ends of the pair 228A, 228B to the first mechanism 226A and further is coupled at the distal ends of the pair 228A, 228B to the steerable tip 156. Similarly, as shown in profile in FIG. 10B, the second pair of cables 230 is coupled at the proximal ends thereof to the second mechanism 226B and at the distal ends of the pair 230 to the steerable tip 156. In one embodiment, the cable pairs 228, 230 are known Bowden cables 228, 230, as will be described in additional detail below.



FIG. 10C depicts one of the two actuation mechanisms, according to one embodiment. That is, the first actuation mechanism 226A is depicted in FIG. 10C, but it is understood that the second actuation mechanism 226B has similar components and operates in a similar fashion, according to one embodiment. The mechanism 226A has a drive motor 240, a gear train 242 rotationally coupled to the drive motor 240, and a double thread lead screw 244 rotationally coupled to the gear train 242. Thus, actuation of the drive motor 240 causes rotation of the gear train 242, which causes rotation of the lead screw 244. Alternatively, the motor 240 can be rotationally coupled to the lead screw 244 via any gear or other rotational coupling combination. According to one implementation, the actuation mechanism 226A can also have two sensors 262, 264 that are used to measure the angular position of the lead screw 244. One sensor 262 operates in conjunction with a magnet that is rotatably coupled to the lead screw 244 such that the sensor 262 provides for absolute position measurement. The other sensor 264 in this implementation is a position sensor 264 as shown. It is understood that either of these sensors 262, 264 can be any known sensors for tracking the position of the lead screw 244. In accordance with certain implementations, the actuation mechanism 226A also allows for manually driving the lead screw 244. That is, this embodiment has a manual drive input 266 which, in this exemplary embodiment, is a nut 266 that can be turned using a simple tool such as a wrench or the user's fingers. This manual drive mechanism 266 can be used for unpowered movement during assembly or service of the actuation mechanism 226A.


As best shown in FIGS. 10C and 10D, the double thread lead screw 244 has two sets of threads: a first set of threads 246A and a second set of threads 246B. In one embodiment, the two sets of threads 246A, 246B have opposite threads. That is, one of the two sets of threads 246A, 246B has right-handed threads, while the other set has left-handed threads. As best shown in FIGS. 10C and 10E, the actuation mechanism 226A also has two carriages threadably coupled to the double thread lead screw 244. More specifically, the first carriage 248A is threadably coupled to the first set of threads 246A, while the second carriage 248B is threadably coupled to the second set of threads 246B. Each of the two carriages 248A, 248B is slidably disposed in the actuation mechanism 226A along the rods (also referred to as "linear bearings") 250A, 250B. That is, each of the carriages 248A, 248B has lumens 252A, 252B, 254A, 254B disposed therethrough as shown such that the rods 250A, 250B are disposed therethrough. In other words, the first carriage 248A is slidably disposed on the rods 250A, 250B via lumens 252A, 252B, while the second carriage 248B is slidably disposed on the rods 250A, 250B via lumens 254A, 254B as shown. Thus, given that the sets of threads 246A, 246B have opposite threads, rotation of the lead screw 244 in one direction causes the carriages 248A, 248B to move linearly toward each other (closer together), while rotation of the lead screw 244 in the opposite direction causes the carriages 248A, 248B to move linearly away from each other (farther apart). It is understood that, according to certain embodiments, if the threads of the two sets 246A, 246B have the same pitch, then the carriages 248A, 248B will move equal distances in opposite directions.
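
To illustrate the relationship just described, the following sketch converts lead screw rotation into carriage travel and then into an approximate tip deflection. The thread pitch, the effective bending radius of the flexible section, and the simple pulley model are hypothetical assumptions for illustration only, not specifications of the mechanism.

```python
import math

def carriage_travel(screw_turns, thread_pitch_mm):
    """Linear travel of each carriage for a given number of lead screw turns.
    Because the two thread sets are of opposite hand and (in this sketch)
    equal pitch, the carriages move equal distances in opposite directions,
    so one cable of the pair is paid out while the other is drawn in."""
    travel = screw_turns * thread_pitch_mm
    return +travel, -travel  # carriage 248A, carriage 248B

def tip_angle_deg(cable_travel_mm, effective_radius_mm):
    """Approximate steerable-tip deflection produced by a differential cable
    displacement, treating the flexible section as a simple pulley with the
    given (assumed) effective radius."""
    return math.degrees(cable_travel_mm / effective_radius_mm)

# Example: half a turn of an assumed 2 mm pitch screw moves each carriage 1 mm
# in opposite directions; for an assumed 5 mm effective radius, the tip
# deflects roughly 11.5 degrees.
travel_a, travel_b = carriage_travel(0.5, 2.0)
deflection = tip_angle_deg(travel_a, 5.0)
```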


In addition, each carriage 248A, 248B is coupled to a separate one of the two cables of the cable pair 228, as mentioned above with respect to FIGS. 10A and 10B. More specifically, the first carriage 248A is coupled to the first cable 228A of the pair 228, and the second carriage 248B is coupled to the second cable 228B. In one specific embodiment as best shown in FIG. 10C, each carriage has a coupling structure 258A, 258B that is used to attach the cable 228A, 228B in place. That is, the first carriage 248A has the first coupling structure 258A that can removably couple the cable 228A to the carriage 248A. Similarly, the second carriage 248B has the second coupling structure 258B that can removably couple the cable 228B to the carriage 248B. In this specific implementation, the coupling structures 258A, 258B are clamping structures 258A, 258B in which screws 260A, 260B can be used to loosen or tighten the clamping structures 258A, 258B to removably couple the cables 228A, 228B thereto. Further, each of the coupling structures 258A, 258B also has a half lumen 261A, 261B defined in the side of each carriage 248A, 248B such that the cable 228A, 228B can be disposed within the half lumen 261A, 261B and the clamping structure 258A, 258B can be positioned against the cable 228A, 228B and tightened with the screws 260A, 260B to fix the cables 228A, 228B in place. It is understood that the cables 228A, 228B are coupled at their distal ends to the steerable tip 156, as discussed above. More specifically, one of the two cables 228A, 228B is coupled to the steerable tip 156 such that actuation of that cable 228A, 228B causes the tip 156 to move to the left, while the other of the two cables 228A, 228B is coupled to the tip 156 such that actuation of that cable 228A, 228B causes the tip 156 to move to the right.



FIG. 10F depicts an expanded view of a section of FIG. 10A depicting the internal components of the camera body 152, according to one embodiment. More specifically, FIG. 10F depicts the relationship between the cables 228A, 228B and the connection assemblies 280A, 280B disposed distally of the actuation mechanism 226A. In this exemplary implementation, each of the cables 228A, 228B is a known Bowden cable having an inner cable that is slidably disposed within a cable housing along part of the length of the inner cable. More specifically, the first cable 228A has both an inner cable 282A and a housing 282B. The inner cable 282A extends from the actuation assembly 226A to the steerable tip 156, while the housing 282B extends from the connection assembly 280A to the steerable tip 156. Similarly, the second cable 228B has an inner cable 284A that extends from the actuation assembly 226A to the steerable tip 156 and a housing 284B extending from the connection assembly 280B to the steerable tip 156. Thus, a distal length of the inner cable 282A is slidably disposed within the cable housing 282B, while a proximal length of the inner cable 282A extends out of the housing 282B at the connection assembly 280A and extends toward the actuation assembly 226A as shown at arrow D. Similarly, a distal length of the inner cable 284A is slidably disposed within the cable housing 284B, while a proximal length of the inner cable 284A extends out of the housing 284B at the connection assembly 280B and extends toward the actuation assembly 226A as shown at arrow E. Thus, the inner cables 282A, 284A are actuable by the actuation assembly 226A as discussed above to articulate the steerable tip 156 via linear movement of the inner cables 282A, 284A. In one embodiment, each of the cable housings 282B, 284B is made of a known composite construction with an inner lining (to facilitate movement of the inner cable therein) and is longitudinally incompressible.


Each connection assembly 280A, 280B has both a tension spring assembly 286A, 286B and a tension adjustment mechanism 288A, 288B, both of which will be described in detail below.


According to one embodiment, each tension spring assembly 286A, 286B is operable to absorb force applied to the cable housings 282B, 284B and thereby reduce the strain applied thereto and potentially prevent resulting damage. That is, each tension spring assembly 286A, 286B has a tension spring retainer body 300A, 300B that is slidably disposed through a bushing 302A, 302B and an opening 304A, 304B defined in a body wall 306. Each bushing 302A, 302B is fixedly attached to the body wall 306 such that each spring retainer body 300A, 300B is slidable in relation to the body 152. Each retainer body 300A, 300B has a tension spring 308A, 308B disposed around the body 300A, 300B and positioned between a lip 307A, 307B on the retainer body 300A, 300B and the body wall 306 such that each tension spring 308A, 308B can move between an extended state (as shown in assembly 280A) and a compressed state (as shown in assembly 280B). In addition, each retainer body 300A, 300B has an extendable barrel 310A, 310B extending from a distal end of the retainer body 300A, 300B. Each cable housing 282B, 284B is coupled to the extendable barrel 310A, 310B as shown. It is understood that both the retainer bodies 300A, 300B and the extendable barrels 310A, 310B have lumens 312A, 312B defined therethrough as shown such that the inner cables 282A, 284A can extend therethrough as shown.


Thus, each tension spring assembly 286A, 286B is structured to provide strain relief. For example, if the steerable tip 156 has force applied thereto from an external source (such as a user's hand or collision of the tip 156 with an object), that external force is applied to one or both of the cable housings 282B, 284B. Given that each of the cable housings 282B, 284B is longitudinally incompressible, the force is transferred axially along the length of the housing 282B, 284B and into the retainer body 300A, 300B, which causes the tension spring 308A, 308B to be urged from its relaxed state toward either its compressed or its extended state. All of this occurs without any external force being applied to the inner cable 282A, 284A. Thus, each tension spring assembly 286A, 286B helps to prevent damage to the cables 228A, 228B by absorbing any application of external force thereto through the cable housings 282B, 284B and tension springs 308A, 308B. That is, compression of the tension springs 308A, 308B allows for adjustment of the length of the cable housing 282B, 284B as a result of the external force while protecting the inner cables 282A, 284A from that external force.


Alternatively, the configuration of the tension spring assemblies 286A, 286B need not be limited to the specific components thereof. It is understood that any known assembly for absorbing strain from external forces can be incorporated herein.


Turning now to the tension adjustment mechanisms 288A, 288B, each such assembly 288A, 288B is operable to allow for manually adjusting the tension of each inner cable 282A, 284A. That is, if a user determines that either cable 282A, 284A is too loose or too tight, the user can utilize the appropriate adjustment mechanism 288A, 288B to adjust the tension thereof. Each mechanism 288A, 288B includes the extendable barrel 310A, 310B extending from a distal end of the retainer body 300A, 300B, as discussed above. Each extendable barrel 310A, 310B has an adjuster nut 314A, 314B that is threadably coupled to the barrel 310A, 310B such that rotation of either nut 314A, 314B by a user causes axial extension or retraction of the respective barrel 310A, 310B. Thus, if either cable 282A, 284A is too loose, the user can rotate either nut 314A, 314B to lengthen the respective barrel 310A, 310B, thereby tightening the cable 282A, 284A. On the other hand, if either cable 282A, 284A is too tight, the user can rotate either nut 314A, 314B to shorten the appropriate barrel 310A, 310B, thereby loosening the cable 282A, 284A.
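
As a simple, non-limiting illustration of the adjustment relationship, the amount of slack removed or added is proportional to the number of nut turns and the adjuster thread pitch. The thread pitch value and function name in the sketch below are assumed placeholders, not specifications of the mechanism.

```python
def barrel_extension_mm(nut_turns, adjuster_thread_pitch_mm=0.5):
    """Change in effective cable-housing length produced by turning the
    adjuster nut. Positive turns lengthen the barrel, taking up slack in the
    inner cable; negative turns shorten it, loosening the cable."""
    return nut_turns * adjuster_thread_pitch_mm

# Example: two full turns of an assumed 0.5 mm pitch adjuster removes ~1 mm
# of slack from the inner cable.
slack_removed_mm = barrel_extension_mm(2.0)
```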


Alternatively, the configuration of the tension adjustment mechanisms 288A, 288B need not be limited to the specific components thereof. It is understood that any known assembly for adjusting the tension of the cables can be incorporated herein.


Further, it is understood that the various camera body embodiments disclosed or contemplated herein are not limited to the specific components and features discussed above. That is, the camera body/handle can incorporate any known mechanisms or components for lighting, transmission of energy/information, articulation, and adjustment/strain relief mechanisms.


As shown in FIGS. 11A and 11B, various implementations of the robotic device 340 can include a removable nest 346 that is removably coupleable to a proximal end of the device body 342 (as best shown in FIG. 11A) and is designed to receive an insertable camera assembly 344 (including, for example, any camera embodiment disclosed or contemplated herein) and couple or lock the device 340 and camera 344 together, as is shown in FIG. 13. That is, the nest 346 is a coupling component or port that can couple to both the device 340 and camera 344 as shown such that the nest 346 is disposed between the device body 342 and the camera 344 when the three components 342, 344, 346 are coupled together (as best shown in FIG. 11B and FIG. 13). In this embodiment, the nest 346 removably attaches to the proximal portion of the elongate body 342 as best shown in FIG. 11A, and can be released therefrom by the nest release buttons 348 as best shown in FIG. 11B (in which one of the two buttons 348 is visible, with the other button disposed on the opposing side of the nest 346). The release buttons 348 will be discussed in additional detail below. Further, as best shown in FIG. 11B, the nest 346 has a proximal opening 350, which is also known as the camera acceptance opening 350, that receives the camera (such as camera 344) such that the camera 344 mechanically couples or mates with the nest 346 in the opening 350. The lip 352 of the opening 350 in this embodiment includes a slot 354, which is a rotation registration feature 354 that mates with a matching feature (in this case, a protrusion) on the camera 344 to ensure that the camera 344 will only fully mate with the nest 346 in one specific orientation. Further, the nest 346 also has a seal package 356 disposed therein that helps to ensure the insufflation pressure in the target cavity of the patient does not leak out through the camera lumen (not shown) in the device body 342, regardless of whether the camera 344 is disposed in the lumen or not. In addition, the nest 346 has a camera lock/release button 358 as shown. Both the seal package 356 and the button 358 will be discussed in additional detail below.


One embodiment of the seal package 356 is depicted in FIGS. 12A and 12B. As best shown in FIG. 12A, the seal package 356 is disposed within the nest 346 such that the seal package 356 receives the camera (such as camera 344) as it is positioned in the nest 346 and thus disposed in and coupled to the device body 342 (as shown in FIG. 13). It is understood that the seal package 356 is any structure having at least one seal disposed therein for establishing a fluidic seal in the presence or absence of the camera. In various implementations, the structure 356 can have one seal, two seals, three seals, or any number of seals as needed to maintain the fluidic seal as described herein. As best shown in FIG. 12B, the seal package 356 in this exemplary embodiment has two seals: a first seal 358 and a second seal 360. In one embodiment as shown, the first seal 358 is disposed above the second seal 360. Alternatively, the second seal 360 can be disposed above the first seal 358.


As best shown in FIG. 12B, the first seal 358 is configured to receive the camera 344 and maintain a fluidic seal between the camera 344 and the seal 358. In the exemplary embodiment as shown, the seal 358 is a circular seal with tensioned seal walls 362 in which an opening 364 is defined. The opening 364 has a diameter that is less than the outer diameter of the camera 344 to be positioned therethrough. When the camera 344 is disposed through the opening 364 in the first seal 358 (as shown in FIG. 13, for example), the walls 362 are urged away from the center of the seal 358 as a result of the smaller diameter of the opening 364, thereby causing the tension in the walls 362 to increase. As such, the tension in the walls 362 causes the walls 362 to be urged toward the camera 344, thereby establishing a seal therebetween. Alternatively, the first seal 358 can be any seal that can receive a camera therethrough and establish a fluidic seal between the seal and the camera.


The second seal 360 is configured to maintain a fluidic seal when the camera is not disposed through the seal package 356. In this embodiment as shown, the seal 360 is a circular seal having hinged seal walls 366 that can move between an open position (when the camera is disposed therethrough) and a closed position (when the camera is not present) such that the closed walls 366 establish a fluidic seal. According to one implementation, the seal walls 366 are urged closed by the higher gas pressure inside the target cavity of the patient, thereby reducing or preventing leakage or loss of air pressure as a result. Alternatively, the second seal 360 can be any seal that can receive a camera therethrough and establish a fluidic seal when the camera is not present.


Continuing with FIG. 12B, the seal structure 356 also has a frame 368 having a top frame structure 370 and a bottom frame structure 372 coupled to the top frame structure 370. The two seals 358, 360 are disposed between the top and bottom frame structures 370, 372 as shown. In one embodiment, the bottom frame structure 372 also has a third seal 374 disposed therein. More specifically, in this specific embodiment, the third seal 374 is an O-ring seal 374. Alternatively, the third seal 374 can be any known seal.


In use, the seal package 356 allows for the camera 344 to be inserted or removed at any time, including during a procedure, without risking loss of insufflation. That is, the first seal 358 maintains a fluidic seal while the camera 344 is present (such as shown in FIG. 13), and the second seal 360 maintains a fluidic seal while the camera 344 is not present.


It is understood that the seal structure 356 embodiments disclosed or contemplated herein can be incorporated into any nest embodiment disclosed or contemplated herein. Further, it is also understood that any other known seal mechanisms or structures can be used to establish a fluidic seal within any nest embodiment for receiving any camera embodiment herein.


Returning to the coupling of the nest 346 to the proximal end of the device body 342, FIG. 14A depicts two hinged latches 380A, 380B disposed within the nest 346 such that the latches 380A, 380B can be used to couple the nest 346 to the device body 342, and FIG. 14B depicts an expanded view of one of those latches 380B. The latches 380A, 380B are coupling mechanisms that can be used to both couple the nest 346 to the proximal end of the device body 342 and to uncouple therefrom. While FIG. 14B depicts solely the hinged latch 380B for purposes of describing the latch 380B in further detail, it is understood that the other hinged latch 380A has substantially the same or similar components. The latch 380B has a proximal end 382 that is hingedly coupled to or cantilevered from an interior portion of the nest 346 such that the distal end 384 of the latch 380B is movable in relation to the nest 346. Further, each latch 380A, 380B is hingedly attached and tensioned such that, when the latches 380A, 380B are urged radially inward, a restoring force urges them back toward their natural state, that is, radially outward. In addition, the latch button 348 discussed above with respect to FIG. 11B is disposed at the distal end of the latch 380B, with the latch button 348 coupled to the latch body 386 via an arm 388 such that the button 348 is disposed at a distance from the latch body 386. The arm 388 has several structural features that assist with interaction with and coupling/uncoupling of the latch 380B in relation to the device body 342. That is, the arm 388 includes a protrusion or protruding body 390 with an angled surface 392 on the side of the protrusion 390 and a nest latching surface 394 on the top of the protrusion 390. Further, the sides of the arm 388 are guidance surfaces 396. The functions of these surfaces are explained in detail below.



FIGS. 15A and 15B show the coupling of the nest 346 to the proximal end of the device body 342. More specifically, FIG. 15A depicts the nest 346 positioned in close proximity to the proximal end of the body 342, while FIG. 15B depicts the nest 346 coupled to the proximal end of the body 342. The proximal end of the device body 342 has a male connector 400 disposed on the proximal end of the body 342 that is configured to couple with the nest 346 such that the nest 346 is disposed over the male connector 400 when the nest 346 is coupled thereto.



FIGS. 16A-17B depict the operation of the hinged latches 380A, 380B in coupling the nest 346 to the device body 342. As best shown in FIGS. 16A and 16B, the male connector 400 has V-shaped receiving slots 402 defined on both sides thereof, with only one of the slots 402 depicted in the figures. The slots 402 are disposed on opposing sides of the connector 400 and are aligned to receive the two hinged latches 380A, 380B. Further, as also shown in FIGS. 16A and 16B, the slot 402 (and the undepicted slot on the opposing side of the connector 400) is in communication with a protrusion receiving opening 404 that is also defined in the side of the connector 400. Again, the connector has two such protrusion receiving openings, with one such opening 404 depicted and the other opening positioned on the opposing side of the connector 400 but not depicted in the figures.


Thus, as the nest 346 is urged downward toward the male connector 400, the latches 380A, 380B are aligned with the V-shaped slots (including the slot 402), as best shown in FIG. 16A, such that the guidance surfaces 396 of the arms 388 are aligned as well. As the distal ends 384 of the latches 380A, 380B approach and come into contact with the connector 400 (as best shown in FIG. 17A), the arms 388 are aligned with the V-shaped slots 402 and the angled surfaces 392 contact the inner walls 408 of the connector 400. Thus, as the nest 346 is urged further downward, the V-shaped slots 402 help to guide the arms 388 of the latches 380A, 380B and the angled surfaces 392 contacting the inner walls 408 cause the distal ends of the latches 380A, 380B to be urged radially inward, as indicated by the arrows F in FIG. 17A. Once the nest 346 is urged into full connection with the connector 400, the protrusions 390 of the latches 380A, 380B are aligned with the protrusion receiving openings 404 such that the tensioned nature of the latches 380A, 380B results in the latches 380A, 380B being urged radially outwardly such that the protrusions 390 are urged into the openings 404. That is, the distal ends of the latches 380A, 380B and the buttons 348 thereon are urged radially outwardly as depicted by the arrows G in FIG. 17B. At this point, the nest latching surfaces 394 are disposed to be in contact with the upper walls 406 of the openings 404, thereby retaining the latches 380A, 380B in place.


When the nest 346 is coupled to the device body 342 as described herein, the latches 380A, 380B are releasably locked to the male connector 400 as described above. Thus, in order to remove the nest 346, a user must reverse the process described above by pressing both buttons 348 on the latches 380A, 380B, thereby urging the distal ends of the latches 380A, 380B radially inwardly such that the nest latching surfaces 394 are no longer in contact with the upper walls 406. As such, once the buttons 348 are depressed far enough, the latches 380A, 380B release and the nest 346 can be urged proximally off of the proximal end of the device body 342, thereby removing the nest from the device 340.


It is understood that the latch 380A, 380B embodiments disclosed or contemplated herein can be incorporated into any nest embodiment disclosed or contemplated herein, and can be used to couple to any device body disclosed or contemplated herein. Further, it is also understood that any other known attachment mechanisms can be used to removably couple any nest embodiment herein to any device body herein.


As shown in FIG. 18, once the nest 346 is coupled to the device body 342 as described above, the nest 346 can have a camera coupling mechanism (or “latch mechanism”) 420 that can be used to couple the camera (such as camera 344) to the nest 346 and thus the device 340, according to one embodiment. Further, the mechanism 420 can also be used to release or uncouple the camera 344 from the device body 342. In addition, the mechanism 420 can also have a presence detection mechanism 422 coupled thereto that can operate in conjunction with the latch mechanism 420 to detect the presence or absence of the camera 344. Both the camera coupling mechanism 420 and the presence detection mechanism 422 will be described in further detail below.


Continuing with FIG. 18, the camera latch mechanism 420 has a slidable latch body 424 with a camera receiving opening 426 defined therein as shown. The camera receiving opening 426 is disposed above the camera lumen 432 in the seal package 356 as shown. Further, the coupling mechanism 420 also has a latch button 428 attached to the body 424 on one side, and a tension spring 430 attached to the body 424 on the side opposite the button 428 as shown. The tension spring 430 is configured to urge the latch body 424 away from the spring 430 as shown by arrow H such that the opening 426 is not aligned with the camera lumen 432 in the seal package 356.


Insertion of a camera 150 into the nest 346 (and thus the device 340) and the resulting interaction with the camera latch mechanism 420 will now be described with respect to FIGS. 19A-19C, according to one embodiment. As shown in FIG. 19A, as the shaft 154 of the camera (such as camera 150) is inserted into the nest 346 but before the distal end of the handle 152 reaches the latch mechanism 420, the tension spring 430 continues to urge the mechanism 420 toward the button in the direction represented by arrow I. As shown in FIG. 19B, when the camera 150 is urged distally into the nest 346 such that the nose cone tip 192 is urged through the opening 426 in the latch body 424, the tip 192 contacts the body 424 and urges the body 424 back toward the tension spring 430 as represented by arrow J. This allows for passage of the tip 192 through the opening 426 in the latch body 424. As shown in FIG. 19C, once the tip 192 passes through the opening 426, the latch receiving slot 194 is disposed in the opening 426, which releases the force applied by the tip 192 and allows the force of the tension spring 430 to once again urge the latch body 424 away from the spring 430 as again represented by arrow I. This urges the latch body 424 into the slot 194 in the camera 150, thereby coupling the latch mechanism 420 and the camera 150 such that the camera 150 is locked into place in the nest 346 and thus the elongate body 342. When it is desirable to remove the camera 150, a user can depress the button 428, thereby urging the latch mechanism 420 toward the tension spring 430 and urging the latch body 424 out of the slot 194, which releases the camera 150 such that a user can remove it proximally from the nest 346 and the device body 342.


It is understood that the camera coupling mechanism 420 embodiments disclosed or contemplated herein can be incorporated into any nest embodiment disclosed or contemplated herein, and can be used to couple to any camera embodiment disclosed or contemplated herein. Further, it is also understood that any other known camera coupling mechanisms can be used to removably couple any camera embodiment herein to any device body herein.


As mentioned above, the nest 346 also has a presence detection mechanism 422 that will now be described in detail with respect to FIGS. 20-22B, according to one embodiment. The mechanism 422 includes a lever 440 that is pivotally coupled to the latch body 424 at a pivot point 442 as best shown in FIG. 21. The lever 440 has a contact pad 444 at one end of the lever 440 as shown, and a magnet 446 at the opposite end of the lever 440. In addition, the detection mechanism 422 also includes a tension spring 448 in contact with the lever 440 at a point along the lever identified by arrow K as best shown in FIGS. 20 and 21, such that the tension spring 448 urges the lever 440 to rotate clockwise around the pivot point 442 as shown in FIG. 21. Further, the detection mechanism 422 includes a sensor 450 disposed in a proximal end of the device body 342 such that the magnet 446 on the lever 440 is configured to interact with the sensor 450 as described in further detail below.


According to one embodiment, the presence detection mechanism 422 can detect three different configurations: (1) the presence of the fully installed camera (such as camera 150), (2) the proper coupling of the nest 346 to the device body 342, and (3) actuation of the camera release button 428 by a user. Each of these will be described in turn below.



FIG. 21 depicts the presence detection mechanism 422 when the camera 150 is not yet coupled properly to the nest 346 and device body 342. Because the camera body 152 is not positioned adjacent to and in contact with the contact pad 444, the force applied by the tension spring 448 urges the portion of the lever 440 above the pivot point 442 radially inward, which urges the magnet 446 away from the sensor 450. The gap between the magnet 446 and the sensor 450 causes the sensor 450 to indicate that the magnet 446 is not in contact with the sensor 450, thereby indicating that the camera 150 is not properly coupled to the device body 342.



FIG. 22 depicts the presence detection mechanism 422 when the camera 150 is properly coupled to the nest 346 and the device body 342. With the camera body 152 positioned in contact with the contact pad 444, the camera body 152 urges the contact pad 444, and thus the portion of the lever 440 above the pivot point 442, radially outward as represented by arrow L. Thus, the magnet 446 is urged into contact with the sensor 450 such that the sensor 450 indicates the presence of the magnet 446, thereby indicating that the camera 150 is properly coupled to the device body 342. In addition, this position of the lever 440 and contact of the magnet 446 with the sensor 450 also indicates that the nest 346 is properly coupled to the device body 342.



FIG. 23 depicts the presence detection mechanism 422 when the camera latch button 428 is depressed by a user as indicated by arrow M. Given that the lever 440 is pivotally coupled to the latch body 424 at the pivot point 442, the depression of the latch button 428 causes the pivot point 442 to move away from the button 428 (in the direction of arrow M). The movement of the pivot point 442 in combination with the force applied by the tension spring 448 causes the portion of the lever 440 below the pivot point 442 to move radially outward such that the magnet 446 is urged away from the sensor 450. The gap between the magnet 446 and the sensor 450 causes the sensor 450 to indicate that the magnet 446 is not in contact with the sensor 450, thereby indicating that the button 428 has been depressed. This indication of the depression of the button 428 is a precursor to removal of the camera. Thus, in certain embodiments, the overall system might use this information to straighten the camera tip (bring the tip co-axial with the camera shaft) so that it can be more easily removed, or take any other similar steps.
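
As a non-limiting illustration of how the three configurations described above might be interpreted in software, the following sketch infers a state from the magnetic presence sensor and a remembered installation state, and straightens the steerable tip when removal appears imminent. The state names, the use of a prior-state flag, and the tip-straightening callback are hypothetical assumptions for illustration only; the actual system logic may combine additional signals.

```python
def interpret_presence(sensor_detects_magnet, previously_installed):
    """Infer the camera/nest state from the nest's presence sensor.

    A detected magnet indicates the camera is fully seated and the nest is
    properly coupled; a transition from detected to not-detected while the
    camera was installed is treated as the release button being pressed."""
    if sensor_detects_magnet:
        return "camera_installed_and_nest_coupled"
    if previously_installed:
        return "release_button_pressed"   # precursor to camera removal
    return "camera_not_installed"

def on_state_change(state, straighten_tip):
    """Example response: bring the steerable tip co-axial with the camera
    shaft when removal is imminent so the camera can be withdrawn easily."""
    if state == "release_button_pressed":
        straighten_tip()

# Example usage with a hypothetical tip-straightening command.
state = interpret_presence(sensor_detects_magnet=False, previously_installed=True)
on_state_change(state, straighten_tip=lambda: None)
```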


Alternatively, the component 446 need not be a magnet. Instead, the component at the end of the lever 440 can be any sensor component or sensor detectable component that can interact with the sensor 450 to indicate whether that end of the lever 440 is in contact (or close proximity) with the sensor 450 or is not in proximity with the sensor 450. For example, both the component 446 and the sensor 450 can be sensors that can sense the presence of the other sensor. Alternatively, the component 446 and the sensor 450 can be any types of mechanisms that provide for sensing the presence or absence of that end of the lever 440.


It is understood that the presence detection mechanism embodiments disclosed or contemplated herein can be incorporated into any nest embodiment disclosed or contemplated herein. Further, it is also understood that any other known presence detection mechanisms can be used in any nest embodiment herein to detect the presence of any camera embodiment herein.


It is understood that the various nest embodiments disclosed or contemplated herein are not limited to the specific coupling and uncoupling mechanisms, sensors, etc. as described above with respect to the exemplary nest embodiment described herein. It is understood that other known mechanisms or components that can accomplish the same results can be incorporated herein.


In one implementation, the various device embodiments herein can have fluidic seals at all potential fluid entry points in the device, thereby reducing or eliminating the risk of fluid entering any of the internal areas or components of the device. For example, in one embodiment, each of the robotic arms in the device can have fluidic seals disposed at certain locations within the arms to reduce or prevent fluidic access to the internal portions of the arms. In addition, the arms can also have a flexible protective sleeve disposed around the arms as discussed above. The sleeve at one end is attached at the distal end of the elongate device body and at the other end is attached at the distal end of the forearm of each arm.


One exemplary forearm 460 with appropriate fluidic seals is depicted in FIGS. 23 and 24, according to one embodiment. As best shown in FIG. 23, the forearm 460 has a tool lumen 462 defined therein, with a tool drive 464 disposed at the proximal end of the lumen 462. Further, the forearm 460 has a protective sleeve 472 around the forearm body 470 and fluidic seals at both ends of the lumen 462: a proximal seal 468 at the proximal end and an arrangement of seals at the distal end. As best shown in FIG. 24, the arrangement of seals at the distal end of the forearm 460 includes a seal 476 (which can be a compression ring 476) that engages the protective sleeve 472 and retains it in place around the opening to the tool lumen 462. The seal 476 can be molded into the protective sleeve 472 or alternatively can be a separate component. In addition, the arrangement of distal seals also includes a threaded seal holder 478 that is threaded into the lumen opening in the forearm body 470. This holder 478 compresses the seal 476 and the protective sleeve 472, thereby forming a fluidic seal between the protective sleeve 472 and the threaded seal holder 478. In certain implementations, the threaded seal holder 478 is removable, thereby allowing the protective sleeve 472 to be removed and replaced as necessary. Further, the arrangement of distal seals includes a ring seal 466 that can be positioned to form a seal between the threaded seal holder 478 and the tool lumen 462 while still allowing relative rotation between the tool lumen 462 and the forearm body 470. Finally, a tool bayonet 480 is positioned over the seal assembly as shown. The bayonet 480 is configured to receive and couple to any interchangeable end effectors (not shown) that are inserted into the tool lumen 462. It is understood that the bayonet interface 480 can be used to allow only certain end effectors to be inserted into the forearm 460.


The fluidic seals 466, 468 establish a fluidic seal between the tool lumen 462 and the internal areas and components of the forearm body 470. More specifically, each of the fluidic seals 466, 468 prevents fluid ingress while still allowing the tool lumen body 474 to rotate relative to the forearm body 470. The distal seal 466 establishes a fluidic seal between the tool lumen 462 and the protective sleeve 472, as mentioned above, while the proximal seal 468 establishes a fluidic seal between the tool lumen 462 and the internal portions of the body 470, thereby preventing fluids that enter the lumen 462 from accessing the internal portions of the body 470. As such, the two seals 466, 468 in combination with the protective sleeve 472 and the rest of the distal seal assembly discussed above prevent fluid from entering the forearm body 470 even if an end effector (not shown) is not present in the tool lumen 462.


As shown in FIG. 25, a fluidic seal is also established at the shoulder joint 500 between the device body 502 and the upper arm 504. As discussed above, the protective sleeve 472 is disposed around the upper arm 504 and extends across the shoulder joint 500 to the elongate body 502. In one embodiment as shown, the sleeve 472 is mechanically retained against the elongate body 502 with a retention band 506 that is disposed around the sleeve 472 at a groove 508 formed in the body 502 such that the band 506 pulls the sleeve 472 into the groove 508. Further, according to certain implementations, the sleeve 472 can have two seals 510 molded or otherwise formed into the sleeve 472 itself as shown. These bands 510 extend around the outer circumference of the sleeve 472 and act similarly to two O-rings by establishing a redundant compression seal between the sleeve 472 (at the bands 510) and the robot assembly. The camera lumen 512 in the device body 502 extends past and adjacent to the sleeve 472 such that it allows the camera (not shown) to be positioned therethrough and pass by the protective sleeve 472 and extend in front of the body 502 unencumbered.


As shown in FIG. 26, a fluidic seal is also established at the proximal end of the elongate device body 502 via several seals to prevent fluid ingress. First, the body housing 520 is coupled to the groove ring body 522 with a seal 524 (such as, for example, an O-ring 524) to establish a fluidic seal between the housing 520 and the groove ring body 522. Further, a fluidic seal is established between the groove ring body 522 and the light ring 526 with another seal 528 (such as, for example, an O-ring seal 528). Finally, a fluidic seal is established between the light ring 526 and the male connector 530 with another seal 532 (such as, for example, an O-ring seal 532). In certain embodiments, there can also be seals 534 where the electrical cable 536 enters the elongate body 502 and also seals 538 where the elongate body is pierced by the camera lumen 512.


It is understood that the fluidic seals as disclosed or contemplated herein can be incorporated into any device embodiment disclosed or contemplated herein. Further, it is also understood that any other known seal mechanisms can be used and positioned in any known fashion in any device embodiment herein to establish a fluidic seal for any device embodiment herein.


Although the various inventions have been described with reference to preferred embodiments, persons skilled in the art will recognize that changes may be made in form and detail without departing from the spirit and scope thereof.

Claims
  • 1. A robotic surgical system, comprising: (a) a robotic surgical device comprising: (i) an elongate device body comprising a distal end and a proximal end; (ii) a removable connection port disposed at the proximal end of the device body, the connection port comprising: (A) a device body coupling mechanism disposed within the connection port; (B) a camera receiving opening defined in a proximal end of the connection port; (C) a seal package disposed in the removable connection port, the seal package comprising at least two seals; and (D) a camera coupling mechanism disposed within the removable connection port; and (iii) first and second robotic arms operably coupled to the distal end of the device body; and (b) a removable camera component removably disposable in the camera receiving opening and through the seal package, the removable camera component comprising a camera body, an elongate camera tube, a flexible section, and a distal imager; and (c) a presence detection mechanism comprising: (i) a rotatable lever operably coupled to the camera coupling mechanism at a pivot point, wherein the rotatable lever rotates around the pivot point; (ii) a first sensing component disposed on the rotatable lever; and (iii) a second sensing component disposed on the elongate body, wherein the second sensing component is configured to sense the presence or absence of the first sensing component.
  • 2. The robotic surgical system of claim 1, wherein the device body coupling mechanism comprises first and second hinged coupling mechanisms hingedly coupled to the connection port.
  • 3. The robotic surgical system of claim 2, wherein each of the first and second hinged coupling mechanisms comprises: (a) a coupling mechanism body; (b) a tensioned hinge at a proximal end of the coupling mechanism body, wherein the tensioned hinge is hingedly coupled to the connection port; and (c) a coupleable structure at a distal end of the coupling mechanism, wherein the coupleable structure comprises at least one coupling feature configured to be coupleable with a matching coupling feature on the proximal end of the device body and an actuable button.
  • 4. The robotic surgical system of claim 1, wherein the elongate device body comprises a male connector disposed at a proximal end of the elongate device body, wherein the male connector is coupleable with the connection port.
  • 5. The robotic surgical system of claim 1, wherein the camera coupling mechanism comprises: (a) a slidable body disposed within the connection port; (b) a camera receiving opening defined within the slidable body; (c) an actuable camera release button attached to a first end of the slidable body; and (d) a tensioned spring operably coupled to a second end of the slidable body.
  • 6. The robotic surgical system of claim 5, wherein the slidable body is slidable along a plane substantially transverse to a longitudinal axis of the elongate device body.
  • 7. The robotic surgical system of claim 1, wherein the first sensing component is a magnet.
  • 8. A removable connection port for a robotic surgical device, the port comprising: (a) a connection port body; (b) a distal opening defined at a distal end of the port body, wherein the distal opening is sized and shaped to receive a proximal end of an elongate device body; (c) a proximal opening defined at a proximal end of the port body, wherein the proximal opening is sized and shaped to receive a camera assembly; (d) a seal package disposed in the connection port body, the seal package comprising at least two seals configured to receive a shaft of a camera assembly; (e) a device body coupling mechanism disposed within the connection port body, the device body coupling mechanism comprising first and second hinged coupling mechanisms hingedly coupled to the connection port body; (f) a camera coupling mechanism disposed within the connection port body, the camera coupling mechanism comprising: (i) a slidable body disposed within the connection port body; and (ii) a camera receiving opening defined within the slidable body; and (g) a presence detection mechanism comprising: (i) a rotatable lever operably coupled to the camera coupling mechanism at a pivot point, wherein the rotatable lever rotates around the pivot point; and (ii) a first sensing component disposed on the rotatable lever, wherein the first sensing component is configured to interact with a second sensing component disposed on the elongate device body when the removable connection port is coupled to the elongate device body.
  • 9. The removable connection port of claim 8, wherein each of the first and second hinged coupling mechanisms comprises: (a) a coupling mechanism body; (b) a tensioned hinge at a proximal end of the coupling mechanism body, wherein the tensioned hinge is hingedly coupled to the connection port body; and (c) a coupleable structure at a distal end of the coupling mechanism, wherein the coupleable structure comprises at least one coupling feature configured to be coupleable with a matching coupling feature on the proximal end of the elongate device body and an actuable button.
  • 10. The removable connection port of claim 8, wherein the distal opening is sized and shaped to receive a male connector disposed at the proximal end of the elongate device body.
  • 11. The removable connection port of claim 8, wherein the camera coupling mechanism further comprises: (a) an actuable camera release button attached to a first end of the slidable body; and (b) a tensioned spring operably coupled to a second end of the slidable body.
  • 12. The removable connection port of claim 8, wherein the slidable body is slidable along a plane substantially transverse to a longitudinal axis of a lumen of the seal package.
  • 13. The removable connection port of claim 8, wherein the first sensing component is a magnet.
  • 14. A robotic surgical system, comprising: (a) a robotic surgical device comprising: (i) an elongate device body comprising a distal end and a proximal end; (ii) a removable connection port disposed at the proximal end of the device body, the connection port comprising: (A) a device body coupling mechanism disposed within the connection port, the device body coupling mechanism comprising first and second hinged coupling mechanisms hingedly coupled to the connection port; (B) a camera receiving opening defined in a proximal end of the connection port; (C) a seal package disposed in the removable connection port, the seal package comprising at least two seals; (D) a camera coupling mechanism disposed within the removable connection port, the camera coupling mechanism comprising: (1) a slidable body slidably disposed within the connection port; (2) a camera receiving opening defined within the slidable body; (3) an actuable camera release button attached to a first end of the slidable body; and (4) a tensioned spring operably coupled to a second end of the slidable body; and (E) a presence detection mechanism operably coupled to the camera coupling mechanism, the presence detection mechanism comprising: (1) a rotatable lever operably coupled to the camera coupling mechanism at a pivot point, wherein the rotatable lever rotates around the pivot point; (2) a first sensing component disposed on the rotatable lever; and (3) a second sensing component disposed on the elongate body, wherein the second sensing component is configured to sense the presence or absence of the first sensing component; and (iii) first and second robotic arms operably coupled to the distal end of the device body; and (b) a removable camera component removably disposable in the camera receiving opening and through the seal package, the removable camera component comprising a camera body, an elongate camera tube, a flexible section, and a distal imager.
  • 15. The robotic surgical system of claim 14, wherein each of the first and second hinged coupling mechanisms comprises: (a) a coupling mechanism body; (b) a tensioned hinge at a proximal end of the coupling mechanism body, wherein the tensioned hinge is hingedly coupled to the connection port; and (c) a coupleable structure at a distal end of the coupling mechanism, wherein the coupleable structure comprises at least one coupling feature configured to be coupleable with a matching coupling feature on the proximal end of the device body and an actuable button.
  • 16. The robotic surgical system of claim 14, wherein the slidable body is slidable along a plane substantially transverse to a longitudinal axis of a lumen defined by the at least two seals in the seal package.
CROSS-REFERENCE TO RELATED APPLICATION(S)

This application claims the benefit under 35 U.S.C. § 119(e) of U.S. Provisional Application 62/789,029, filed Jan. 7, 2019, and entitled "Robotically Assisted Surgical System," which is hereby incorporated herein by reference in its entirety.

US Referenced Citations (676)
Number Name Date Kind
2858947 Chapman, Jr. Nov 1958 A
3817403 Glachet et al. Jun 1974 A
3870264 Robinson Mar 1975 A
3922930 Fletcher et al. Dec 1975 A
3971266 Inakura et al. Jul 1976 A
3989952 Timberlake et al. Nov 1976 A
4246661 Pinson Jan 1981 A
4258716 Sutherland Mar 1981 A
4278077 Mizumoto Jul 1981 A
4353677 Susnjara et al. Oct 1982 A
4538594 Boebel et al. Sep 1985 A
4568311 Miyaki Feb 1986 A
4576545 Maeda Mar 1986 A
4623183 Aomori Nov 1986 A
4636138 Gorman Jan 1987 A
4645409 Gorman Feb 1987 A
4684313 Minematsu et al. Aug 1987 A
4736645 Zimmer Apr 1988 A
4762455 Coughlan et al. Aug 1988 A
4771652 Zimmer Sep 1988 A
4852391 Ruch et al. Aug 1989 A
4854808 Bisiach Aug 1989 A
4896015 Taboada et al. Jan 1990 A
4897014 Tietze Jan 1990 A
4922755 Oshiro et al. May 1990 A
4922782 Kawai May 1990 A
4984959 Kato Jan 1991 A
4990050 Tsuge et al. Feb 1991 A
5019968 Wang et al. May 1991 A
5036724 Rosheim Aug 1991 A
5108140 Bartholet Apr 1992 A
5172639 Wiesman et al. Dec 1992 A
5176649 Wakabayashi Jan 1993 A
5178032 Zona et al. Jan 1993 A
5187032 Sasaki et al. Feb 1993 A
5187796 Wang et al. Feb 1993 A
5195388 Zona et al. Mar 1993 A
5201325 McEwen et al. Apr 1993 A
5217003 Wilk Jun 1993 A
5263382 Brooks et al. Nov 1993 A
5271384 McEwen et al. Dec 1993 A
5284096 Pelrine et al. Feb 1994 A
5297443 Wentz Mar 1994 A
5297536 Wilk Mar 1994 A
5304899 Sasaki et al. Apr 1994 A
5305653 Ohtani et al. Apr 1994 A
5307447 Asano et al. Apr 1994 A
5353807 DeMarco Oct 1994 A
5363935 Schempf et al. Nov 1994 A
5372147 Lathrop, Jr. et al. Dec 1994 A
5382885 Salcudean et al. Jan 1995 A
5441494 Ortiz Jan 1995 A
5388528 Pelrine et al. Feb 1995 A
5397323 Taylor et al. Mar 1995 A
5436542 Petelin et al. Jul 1995 A
5456673 Ziegler Oct 1995 A
5458131 Wilk Oct 1995 A
5458583 McNeely et al. Oct 1995 A
5458598 Feinberg et al. Oct 1995 A
5471515 Fossum et al. Nov 1995 A
5515478 Wang May 1996 A
5524180 Wang et al. Jun 1996 A
5553198 Wang et al. Sep 1996 A
5562448 Mushabac Oct 1996 A
5588442 Scovil et al. Dec 1996 A
5620417 Jang et al. Apr 1997 A
5623582 Rosenberg Apr 1997 A
5624380 Takayama et al. Apr 1997 A
5624398 Smith et al. Apr 1997 A
5632761 Smith et al. May 1997 A
5645520 Nakamura et al. Jul 1997 A
5657429 Wang et al. Aug 1997 A
5657584 Hamlin Aug 1997 A
5667354 Nakazawa Sep 1997 A
5672168 de la Torre et al. Sep 1997 A
5674030 Sigel Oct 1997 A
5728599 Rostoker et al. Mar 1998 A
5736821 Suyama et al. Apr 1998 A
5754741 Wang et al. May 1998 A
5762458 Wang et al. Jun 1998 A
5769640 Jacobus et al. Jun 1998 A
5791231 Cohn et al. Aug 1998 A
5792135 Madhani et al. Aug 1998 A
5797538 Heaton et al. Aug 1998 A
5797900 Madhani et al. Aug 1998 A
5807377 Madhani et al. Sep 1998 A
5808665 Green Sep 1998 A
5815640 Wang et al. Sep 1998 A
5825982 Wright et al. Oct 1998 A
5833656 Smith et al. Nov 1998 A
5841950 Wang et al. Nov 1998 A
5845646 Lemelson Dec 1998 A
5855583 Wang et al. Jan 1999 A
5876325 Mizuno et al. Mar 1999 A
5878193 Wang et al. Mar 1999 A
5878783 Smart Mar 1999 A
5895377 Smith et al. Apr 1999 A
5895417 Pomeranz et al. Apr 1999 A
5906591 Dario et al. May 1999 A
5907664 Wang et al. May 1999 A
5910129 Koblish et al. Jun 1999 A
5911036 Wright et al. Jun 1999 A
5954692 Smith et al. Sep 1999 A
5971976 Wang et al. Oct 1999 A
5993467 Yoon Nov 1999 A
6001108 Wang et al. Dec 1999 A
6007550 Wang et al. Dec 1999 A
6030365 Laufer Feb 2000 A
6031371 Smart Feb 2000 A
6058323 Lemelson May 2000 A
6063095 Wang et al. May 2000 A
6066090 Yoon May 2000 A
6086529 Arndt Jul 2000 A
6102850 Wang et al. Aug 2000 A
6106521 Blewett et al. Aug 2000 A
6107795 Smart Aug 2000 A
6132368 Cooper Oct 2000 A
6132441 Grace Oct 2000 A
6139563 Cosgrove, III et al. Oct 2000 A
6156006 Brosens et al. Dec 2000 A
6159146 El Gazayerli Dec 2000 A
6162171 Ng et al. Dec 2000 A
D438617 Cooper et al. Mar 2001 S
6206903 Ramans Mar 2001 B1
D441076 Cooper et al. Apr 2001 S
6223100 Green Apr 2001 B1
D441862 Cooper et al. May 2001 S
6238415 Sepetka et al. May 2001 B1
6240312 Alfano et al. May 2001 B1
6241730 Alby Jun 2001 B1
6244809 Wang et al. Jun 2001 B1
6246200 Blumenkranz et al. Jun 2001 B1
D444555 Cooper et al. Jul 2001 S
6286514 Lemelson Sep 2001 B1
6292678 Hall et al. Sep 2001 B1
6293282 Lemelson Sep 2001 B1
6296635 Smith et al. Oct 2001 B1
6309397 Julian et al. Oct 2001 B1
6309403 Minor et al. Oct 2001 B1
6312435 Wallace et al. Nov 2001 B1
6321106 Lemelson Nov 2001 B1
6327492 Lemelson Dec 2001 B1
6331181 Tierney et al. Dec 2001 B1
6346072 Cooper Feb 2002 B1
6352503 Matsui et al. Mar 2002 B1
6364888 Niemeyer et al. Apr 2002 B1
6371952 Madhani et al. Apr 2002 B1
6394998 Wallace et al. May 2002 B1
6398726 Ramans et al. Jun 2002 B1
6400980 Lemelson Jun 2002 B1
6408224 Lemelson Jun 2002 B1
6424885 Niemeyer et al. Jul 2002 B1
6432112 Brock et al. Aug 2002 B2
6436107 Wang et al. Aug 2002 B1
6441577 Blumenkranz et al. Aug 2002 B2
6450104 Grant et al. Sep 2002 B1
6450992 Cassidy, Jr. Sep 2002 B1
6451027 Cooper et al. Sep 2002 B1
6454758 Thompson et al. Sep 2002 B1
6459926 Nowlin et al. Oct 2002 B1
6463361 Wang et al. Oct 2002 B1
6468203 Belson Oct 2002 B2
6468265 Evans et al. Oct 2002 B1
6470236 Ohtsuki Oct 2002 B2
6491691 Morley et al. Dec 2002 B1
6491701 Niemeyer et al. Dec 2002 B2
6493608 Niemeyer et al. Dec 2002 B1
6496099 Wang et al. Dec 2002 B2
6497651 Kan et al. Dec 2002 B1
6508413 Bauer et al. Jan 2003 B2
6512345 Borenstein Jan 2003 B2
6522906 Salisbury, Jr. et al. Feb 2003 B1
6544276 Azizi Apr 2003 B1
6548982 Papanikolopoulos et al. Apr 2003 B1
6554790 Moll Apr 2003 B1
6565554 Niemeyer May 2003 B1
6574355 Green Jun 2003 B2
6587750 Gerbi et al. Jul 2003 B2
6591239 McCall et al. Jul 2003 B1
6594552 Nowlin et al. Jul 2003 B1
6610007 Belson et al. Aug 2003 B2
6620173 Gerbi et al. Sep 2003 B2
6642836 Wang et al. Nov 2003 B1
6645196 Nixon et al. Nov 2003 B1
6646541 Wang et al. Nov 2003 B1
6648814 Kim et al. Nov 2003 B2
6659939 Moll et al. Dec 2003 B2
6661571 Shioda et al. Dec 2003 B1
6671581 Niemeyer et al. Dec 2003 B2
6676684 Morley et al. Jan 2004 B1
6684129 Salisbury, Jr. et al. Jan 2004 B2
6685648 Flaherty et al. Feb 2004 B2
6685698 Morley et al. Feb 2004 B2
6687571 Byrne et al. Feb 2004 B1
6692485 Brock et al. Feb 2004 B1
6699177 Wang et al. Mar 2004 B1
6699235 Wallace et al. Mar 2004 B2
6702734 Kim et al. Mar 2004 B2
6702805 Stuart Mar 2004 B1
6714839 Salisbury, Jr. et al. Mar 2004 B2
6714841 Wright et al. Mar 2004 B1
6719684 Kim et al. Apr 2004 B2
6720988 Gere et al. Apr 2004 B1
6726699 Wright et al. Apr 2004 B1
6728599 Wright et al. Apr 2004 B2
6730021 Vassiliades, Jr. et al. May 2004 B2
6731988 Green May 2004 B1
6746443 Morley et al. Jun 2004 B1
6764441 Chiel et al. Jul 2004 B2
6764445 Ramans et al. Jul 2004 B2
6766204 Niemeyer et al. Jul 2004 B2
6770081 Cooper et al. Aug 2004 B1
6774597 Borenstein Aug 2004 B1
6776165 Jin Aug 2004 B2
6780184 Tanrisever Aug 2004 B2
6783524 Anderson et al. Aug 2004 B2
6785593 Wang et al. Aug 2004 B2
6788018 Blumenkranz Sep 2004 B1
6792663 Krzyzanowski Sep 2004 B2
6793653 Sanchez et al. Sep 2004 B2
6799065 Niemeyer Sep 2004 B1
6799088 Wang et al. Sep 2004 B2
6801325 Farr et al. Oct 2004 B2
6804581 Wang et al. Oct 2004 B2
6810281 Brock et al. Oct 2004 B2
6817972 Snow Nov 2004 B2
6817974 Cooper et al. Nov 2004 B2
6817975 Farr et al. Nov 2004 B1
6820653 Schempf et al. Nov 2004 B1
6824508 Kim et al. Nov 2004 B2
6824510 Kim et al. Nov 2004 B2
6826977 Grover et al. Dec 2004 B2
6832988 Sprout Dec 2004 B2
6832996 Woloszko et al. Dec 2004 B2
6836703 Wang et al. Dec 2004 B2
6837846 Jaffe et al. Jan 2005 B2
6837883 Moll et al. Jan 2005 B2
6839612 Sanchez et al. Jan 2005 B2
6840938 Morley et al. Jan 2005 B1
6843793 Brock et al. Jan 2005 B2
6852107 Wang et al. Feb 2005 B2
6853879 Sunaoshi Feb 2005 B2
6858003 Evans et al. Feb 2005 B2
6860346 Burt et al. Mar 2005 B2
6860877 Sanchez et al. Mar 2005 B1
6866671 Tierney et al. Mar 2005 B2
6870343 Borenstein et al. Mar 2005 B2
6871117 Wang et al. Mar 2005 B2
6871563 Choset et al. Mar 2005 B2
6879880 Nowlin et al. Apr 2005 B2
6892112 Wang et al. May 2005 B2
6899705 Niemeyer May 2005 B2
6902560 Morley et al. Jun 2005 B1
6905460 Wang et al. Jun 2005 B2
6905491 Wang et al. Jun 2005 B1
6911916 Wang et al. Jun 2005 B1
6917176 Schempf et al. Jul 2005 B2
6933695 Blumenkranz Aug 2005 B2
6936001 Snow Aug 2005 B1
6936003 Iddan Aug 2005 B2
6936042 Wallace et al. Aug 2005 B2
6943663 Wang et al. Sep 2005 B2
6949096 Davison et al. Sep 2005 B2
6951535 Ghodoussi et al. Oct 2005 B2
6963792 Green Nov 2005 B1
6965812 Wang et al. Nov 2005 B2
6974411 Belson Dec 2005 B2
6974449 Niemeyer Dec 2005 B2
6979423 Moll Dec 2005 B2
6984203 Tartaglia et al. Jan 2006 B2
6984205 Gazdzinski Jan 2006 B2
6991627 Madhani et al. Jan 2006 B2
6993413 Sunaoshi Jan 2006 B2
6994703 Wang et al. Feb 2006 B2
6994708 Manzo Feb 2006 B2
6997908 Carrillo, Jr. et al. Feb 2006 B2
6999852 Green Feb 2006 B2
7025064 Wang et al. Apr 2006 B2
7027892 Wang et al. Apr 2006 B2
7033344 Imran Apr 2006 B2
7039453 Mullick May 2006 B2
7042184 Oleynikov et al. May 2006 B2
7048745 Tierney et al. May 2006 B2
7053752 Wang et al. May 2006 B2
7063682 Whayne et al. Jun 2006 B1
7066879 Fowler et al. Jun 2006 B2
7066926 Wallace et al. Jun 2006 B2
7074179 Wang et al. Jul 2006 B2
7077446 Kameda et al. Jul 2006 B2
7083571 Wang et al. Aug 2006 B2
7083615 Peterson et al. Aug 2006 B2
7087049 Nowlin et al. Aug 2006 B2
7090683 Brock et al. Aug 2006 B2
7097640 Wang et al. Aug 2006 B2
7105000 McBrayer Sep 2006 B2
7107090 Salisbury, Jr. et al. Sep 2006 B2
7109678 Kraus et al. Sep 2006 B2
7118582 Wang et al. Oct 2006 B1
7121781 Sanchez et al. Oct 2006 B2
7125403 Julian et al. Oct 2006 B2
7126303 Farritor et al. Oct 2006 B2
7147650 Lee Dec 2006 B2
7155315 Niemeyer et al. Dec 2006 B2
7155316 Sutherland et al. Dec 2006 B2
7163525 Franer Jan 2007 B2
7169141 Brock et al. Jan 2007 B2
7182025 Ghorbel et al. Feb 2007 B2
7182089 Ries Feb 2007 B2
7199545 Oleynikov et al. Apr 2007 B2
7206626 Quaid, III Apr 2007 B2
7206627 Abovitz et al. Apr 2007 B2
7210364 Ghorbel et al. May 2007 B2
7214230 Brock et al. May 2007 B2
7217240 Snow May 2007 B2
7239940 Wang et al. Jul 2007 B2
7250028 Julian et al. Jul 2007 B2
7259652 Wang et al. Aug 2007 B2
7273488 Nakamura et al. Sep 2007 B2
7311107 Harel et al. Dec 2007 B2
7339341 Oleynikov et al. Mar 2008 B2
7372229 Farritor et al. May 2008 B2
7403836 Aoyama Jul 2008 B2
7438702 Hart et al. Oct 2008 B2
7447537 Funda et al. Nov 2008 B1
7492116 Oleynikov et al. Feb 2009 B2
7566300 Devierre et al. Jul 2009 B2
7574250 Niemeyer Aug 2009 B2
7637905 Saadat et al. Dec 2009 B2
7645230 Mikkaichi et al. Jan 2010 B2
7655004 Long Feb 2010 B2
7670329 Flaherty et al. Mar 2010 B2
7678043 Gilad Mar 2010 B2
7731727 Sauer Jun 2010 B2
7734375 Buehler et al. Jun 2010 B2
7762825 Burbank et al. Jul 2010 B2
7772796 Farritor et al. Aug 2010 B2
7785251 Wilk Aug 2010 B2
7785294 Hueil et al. Aug 2010 B2
7785333 Miyamoto et al. Aug 2010 B2
7789825 Nobis et al. Sep 2010 B2
7789861 Franer Sep 2010 B2
7794494 Sahatjian et al. Sep 2010 B2
7865266 Moll et al. Jan 2011 B2
7960935 Farritor et al. Jun 2011 B2
7979157 Anvari Jul 2011 B2
8021358 Doyle et al. Sep 2011 B2
8179073 Farritor et al. May 2012 B2
8231610 Jo et al. Jul 2012 B2
8343171 Farritor et al. Jan 2013 B2
8353897 Doyle et al. Jan 2013 B2
8377045 Schena Feb 2013 B2
8430851 McGinley Apr 2013 B2
8604742 Farritor et al. Dec 2013 B2
8636686 Minnelli et al. Jan 2014 B2
8679096 Farritor et al. Mar 2014 B2
8827337 Murata et al. Sep 2014 B2
8828024 Farritor et al. Sep 2014 B2
8834488 Farritor et al. Sep 2014 B2
8864652 Diolaiti et al. Oct 2014 B2
8888687 Ostrovsky et al. Nov 2014 B2
8968332 Farritor et al. Mar 2015 B2
8974440 Farritor et al. Mar 2015 B2
8986196 Larkin et al. Mar 2015 B2
9010214 Markvicka et al. Apr 2015 B2
9060781 Farritor et al. Jun 2015 B2
9089256 Tognaccini et al. Jul 2015 B2
9089353 Farritor et al. Jul 2015 B2
9138129 Diolaiti Sep 2015 B2
9198728 Wang et al. Dec 2015 B2
9516996 Diolaiti et al. Dec 2016 B2
9579088 Farritor et al. Feb 2017 B2
9649020 Finlay May 2017 B2
9717563 Tognaccini et al. Aug 2017 B2
9743987 Farritor et al. Aug 2017 B2
9757187 Farritor et al. Sep 2017 B2
9770305 Farritor et al. Sep 2017 B2
9789608 Itkowitz et al. Oct 2017 B2
9814640 Khaligh Nov 2017 B1
9816641 Bock-Aronson et al. Nov 2017 B2
9849586 Rosheim Dec 2017 B2
9857786 Cristiano Jan 2018 B2
9888966 Farritor et al. Feb 2018 B2
9956043 Farritor et al. May 2018 B2
10008017 Itkowitz et al. Jun 2018 B2
10111711 Farritor et al. Oct 2018 B2
10137575 Itkowitz et al. Nov 2018 B2
10159533 Moll et al. Dec 2018 B2
10220522 Rockrohr Mar 2019 B2
10258425 Mustufa et al. Apr 2019 B2
10307199 Farritor et al. Jun 2019 B2
10342561 Farritor et al. Jul 2019 B2
10368952 Tognaccini et al. Aug 2019 B2
10398516 Jackson et al. Sep 2019 B2
10470828 Markvicka et al. Nov 2019 B2
10507066 Dimaio et al. Dec 2019 B2
10555775 Hoffman et al. Feb 2020 B2
10582973 Wilson et al. Mar 2020 B2
10695137 Farritor et al. Jun 2020 B2
10729503 Cameron Aug 2020 B2
10737394 Itkowitz et al. Aug 2020 B2
10751136 Farritor et al. Aug 2020 B2
10751883 Nahum Aug 2020 B2
10806538 Farritor et al. Oct 2020 B2
10966700 Farritor et al. Apr 2021 B2
11032125 Farritor et al. Jun 2021 B2
11298195 Ye et al. Apr 2022 B2
11382702 Tognaccini et al. Jul 2022 B2
11529201 Mondry et al. Dec 2022 B2
11595242 Farritor et al. Feb 2023 B2
20010018591 Brock et al. Aug 2001 A1
20010049497 Kalloo et al. Dec 2001 A1
20020003173 Bauer et al. Jan 2002 A1
20020013601 Nobles et al. Jan 2002 A1
20020026186 Woloszko et al. Feb 2002 A1
20020038077 de la Torre et al. Mar 2002 A1
20020065507 Zadno-Azizi May 2002 A1
20020091374 Cooper Jun 2002 A1
20020103417 Gazdzinski Aug 2002 A1
20020111535 Kim et al. Aug 2002 A1
20020120254 Julian et al. Aug 2002 A1
20020128552 Nowlin et al. Sep 2002 A1
20020140392 Borenstein et al. Oct 2002 A1
20020147487 Sundquist et al. Oct 2002 A1
20020151906 Demarais et al. Oct 2002 A1
20020156347 Kim et al. Oct 2002 A1
20020171385 Kim et al. Nov 2002 A1
20020173700 Kim et al. Nov 2002 A1
20020190682 Schempf et al. Dec 2002 A1
20030020810 Takizawa et al. Jan 2003 A1
20030045888 Brock et al. Mar 2003 A1
20030065250 Chiel et al. Apr 2003 A1
20030089267 Ghorbel et al. May 2003 A1
20030092964 Kim et al. May 2003 A1
20030097129 Davison et al. May 2003 A1
20030100817 Wang et al. May 2003 A1
20030109780 Coste-Maniere et al. Jun 2003 A1
20030114731 Cadeddu et al. Jun 2003 A1
20030135203 Wang et al. Jun 2003 A1
20030139742 Wampler et al. Jul 2003 A1
20030144656 Ocel et al. Jul 2003 A1
20030159535 Grover et al. Aug 2003 A1
20030167000 Mullick Sep 2003 A1
20030172871 Scherer Sep 2003 A1
20030179308 Zamorano et al. Sep 2003 A1
20030181788 Yokoi et al. Sep 2003 A1
20030225479 Waled Dec 2003 A1
20030229268 Uchiyama et al. Dec 2003 A1
20030229338 Irion et al. Dec 2003 A1
20030230372 Schmidt Dec 2003 A1
20040024311 Quaid Feb 2004 A1
20040034282 Quaid Feb 2004 A1
20040034283 Quaid Feb 2004 A1
20040034302 Abovitz et al. Feb 2004 A1
20040050394 Jin Mar 2004 A1
20040070822 Shioda et al. Apr 2004 A1
20040099175 Perrot et al. May 2004 A1
20040102772 Baxter et al. May 2004 A1
20040106916 Quaid et al. Jun 2004 A1
20040111113 Nakamura et al. Jun 2004 A1
20040117032 Roth Jun 2004 A1
20040138525 Saadat et al. Jul 2004 A1
20040138552 Harel et al. Jul 2004 A1
20040140786 Borenstein Jul 2004 A1
20040153057 Davison Aug 2004 A1
20040173116 Ghorbel et al. Sep 2004 A1
20040176664 Iddan Sep 2004 A1
20040215331 Chew et al. Oct 2004 A1
20040225229 Viola Nov 2004 A1
20040254680 Sunaoshi Dec 2004 A1
20040267326 Ocel Dec 2004 A1
20050014994 Fowler et al. Jan 2005 A1
20050021069 Feuer et al. Jan 2005 A1
20050029978 Oleynikov et al. Feb 2005 A1
20050043583 Killmann et al. Feb 2005 A1
20050049462 Kanazawa Mar 2005 A1
20050054901 Yoshino Mar 2005 A1
20050054902 Konno Mar 2005 A1
20050064378 Toly Mar 2005 A1
20050065400 Banik et al. Mar 2005 A1
20050070850 Albrecht Mar 2005 A1
20050083460 Hattori et al. Apr 2005 A1
20050095650 Julius et al. May 2005 A1
20050096502 Khalili May 2005 A1
20050143644 Gilad et al. Jun 2005 A1
20050154376 Riviere et al. Jul 2005 A1
20050165449 Cadeddu et al. Jul 2005 A1
20050177026 Hoeg et al. Aug 2005 A1
20050234294 Saadat et al. Oct 2005 A1
20050234435 Layer Oct 2005 A1
20050272977 Saadat et al. Dec 2005 A1
20050283137 Doyle et al. Dec 2005 A1
20050288555 Binmoeller Dec 2005 A1
20050288665 Woloszko Dec 2005 A1
20060020272 Gildenberg Jan 2006 A1
20060046226 Bergler et al. Mar 2006 A1
20060079889 Scott Apr 2006 A1
20060100501 Berkelman et al. May 2006 A1
20060119304 Farritor et al. Jun 2006 A1
20060149135 Paz Jul 2006 A1
20060152591 Lin Jul 2006 A1
20060155263 Lipow Jul 2006 A1
20060189845 Maahs et al. Aug 2006 A1
20060195015 Mullick et al. Aug 2006 A1
20060196301 Oleynikov et al. Sep 2006 A1
20060198619 Oleynikov et al. Sep 2006 A1
20060241570 Wilk Oct 2006 A1
20060241732 Denker Oct 2006 A1
20060253109 Chu Nov 2006 A1
20060258938 Hoffman et al. Nov 2006 A1
20060258954 Timberlake et al. Nov 2006 A1
20060261770 Kishi et al. Nov 2006 A1
20070032701 Fowler et al. Feb 2007 A1
20070043397 Ocel et al. Feb 2007 A1
20070055342 Wu et al. Mar 2007 A1
20070080658 Farritor et al. Apr 2007 A1
20070088277 McGinley Apr 2007 A1
20070088340 Brock et al. Apr 2007 A1
20070106113 Ravo May 2007 A1
20070106317 Shelton et al. May 2007 A1
20070123748 Meglan May 2007 A1
20070135803 Belson Jun 2007 A1
20070142725 Hardin et al. Jun 2007 A1
20070156019 Arkin et al. Jul 2007 A1
20070156211 Ferren et al. Jul 2007 A1
20070167955 De La Menardiere et al. Jul 2007 A1
20070225633 Ferren et al. Sep 2007 A1
20070225634 Ferren et al. Sep 2007 A1
20070241714 Oleynikov et al. Oct 2007 A1
20070244520 Ferren et al. Oct 2007 A1
20070250064 Darois et al. Oct 2007 A1
20070255273 Fernandez et al. Nov 2007 A1
20070287884 Schena Dec 2007 A1
20080004634 Farritor et al. Jan 2008 A1
20080015565 Davison Jan 2008 A1
20080015566 Livneh Jan 2008 A1
20080021440 Solomon Jan 2008 A1
20080033569 Ferren et al. Feb 2008 A1
20080045803 Williams et al. Feb 2008 A1
20080058835 Farritor et al. Mar 2008 A1
20080058989 Oleynikov et al. Mar 2008 A1
20080071289 Cooper et al. Mar 2008 A1
20080071290 Larkin et al. Mar 2008 A1
20080103440 Ferren et al. May 2008 A1
20080109014 de la Pena May 2008 A1
20080111513 Farritor et al. May 2008 A1
20080119870 Williams et al. May 2008 A1
20080132890 Woloszko et al. Jun 2008 A1
20080161804 Rioux et al. Jun 2008 A1
20080164079 Ferren et al. Jul 2008 A1
20080168639 Otake et al. Jul 2008 A1
20080183033 Bern et al. Jul 2008 A1
20080221591 Farritor et al. Sep 2008 A1
20080269557 Marescaux et al. Oct 2008 A1
20080269562 Marescaux et al. Oct 2008 A1
20090002414 Shibata et al. Jan 2009 A1
20090012532 Quaid et al. Jan 2009 A1
20090020724 Paffrath Jan 2009 A1
20090024142 Ruiz Morales Jan 2009 A1
20090048612 Farritor et al. Feb 2009 A1
20090054909 Farritor et al. Feb 2009 A1
20090069821 Farritor et al. Mar 2009 A1
20090076536 Rentschler et al. Mar 2009 A1
20090137952 Ramamurthy et al. May 2009 A1
20090143787 De La Pena Jun 2009 A9
20090163929 Yeung et al. Jun 2009 A1
20090171373 Farritor et al. Jul 2009 A1
20090192524 Itkowitz et al. Jul 2009 A1
20090234369 Bax et al. Sep 2009 A1
20090236400 Cole et al. Sep 2009 A1
20090240246 Devill et al. Sep 2009 A1
20090247821 Rogers Oct 2009 A1
20090248038 Blumenkranz et al. Oct 2009 A1
20090281377 Newell et al. Nov 2009 A1
20090299143 Conlon et al. Dec 2009 A1
20090305210 Guru et al. Dec 2009 A1
20090326322 Diolaiti Dec 2009 A1
20100010294 Conlon et al. Jan 2010 A1
20100016659 Weitzner et al. Jan 2010 A1
20100016853 Burbank Jan 2010 A1
20100026347 Iizuka Feb 2010 A1
20100042097 Newton et al. Feb 2010 A1
20100056863 Dejima et al. Mar 2010 A1
20100069710 Yamatani et al. Mar 2010 A1
20100069940 Miller et al. Mar 2010 A1
20100081875 Fowler et al. Apr 2010 A1
20100101346 Johnson et al. Apr 2010 A1
20100130986 Mailloux et al. May 2010 A1
20100139436 Kawashima et al. Jun 2010 A1
20100185212 Sholev Jul 2010 A1
20100198231 Manzo et al. Aug 2010 A1
20100204713 Ruiz Morales Aug 2010 A1
20100245549 Allen et al. Sep 2010 A1
20100250000 Blumenkranz et al. Sep 2010 A1
20100262162 Omori Oct 2010 A1
20100263470 Bannasch et al. Oct 2010 A1
20100274079 Kim et al. Oct 2010 A1
20100292691 Brogna Nov 2010 A1
20100301095 Shelton, IV et al. Dec 2010 A1
20100318059 Farritor et al. Dec 2010 A1
20100331856 Carlson et al. Dec 2010 A1
20110015569 Kirschenman et al. Jan 2011 A1
20110020779 Hannaford et al. Jan 2011 A1
20110071347 Rogers et al. Mar 2011 A1
20110071544 Steger et al. Mar 2011 A1
20110075693 Kuramochi et al. Mar 2011 A1
20110077478 Freeman et al. Mar 2011 A1
20110082365 Mcgrogan et al. Apr 2011 A1
20110098529 Ostrovsky et al. Apr 2011 A1
20110107866 Oka et al. May 2011 A1
20110152615 Schostek et al. Jun 2011 A1
20110224605 Farritor et al. Sep 2011 A1
20110230894 Simaan et al. Sep 2011 A1
20110237890 Farritor et al. Sep 2011 A1
20110238079 Hannaford et al. Sep 2011 A1
20110238080 Ranjit et al. Sep 2011 A1
20110264078 Lipow et al. Oct 2011 A1
20110270443 Kamiya et al. Nov 2011 A1
20110276046 Heimbecker et al. Nov 2011 A1
20120016175 Roberts et al. Jan 2012 A1
20120029727 Sholev Feb 2012 A1
20120035582 Nelson et al. Feb 2012 A1
20120059392 Diolaiti Mar 2012 A1
20120078053 Phee et al. Mar 2012 A1
20120109150 Quaid et al. May 2012 A1
20120116362 Kieturakis May 2012 A1
20120179168 Farritor et al. Jul 2012 A1
20120221147 Goldberg et al. Aug 2012 A1
20120253515 Coste-Maniere et al. Oct 2012 A1
20130001970 Suyama et al. Jan 2013 A1
20130041360 Farritor et al. Feb 2013 A1
20130055560 Nakasugi et al. Mar 2013 A1
20130125696 Long May 2013 A1
20130131695 Scarfogliero et al. May 2013 A1
20130178867 Farritor et al. Jul 2013 A1
20130282023 Burbank et al. Oct 2013 A1
20130304084 Beira et al. Nov 2013 A1
20130325030 Hourtash et al. Dec 2013 A1
20130325181 Moore Dec 2013 A1
20130345717 Markvicka et al. Dec 2013 A1
20130345718 Crawford et al. Dec 2013 A1
20140039515 Mondry et al. Feb 2014 A1
20140046340 Wilson et al. Feb 2014 A1
20140055489 Itkowitz et al. Feb 2014 A1
20140058205 Frederick et al. Feb 2014 A1
20140100587 Farritor et al. Apr 2014 A1
20140137687 Nogami et al. May 2014 A1
20140221749 Grant et al. Aug 2014 A1
20140232824 Dimaio et al. Aug 2014 A1
20140276944 Farritor et al. Sep 2014 A1
20140303434 Farritor et al. Oct 2014 A1
20140371762 Farritor et al. Dec 2014 A1
20150051446 Farritor et al. Feb 2015 A1
20150057537 Dillon Feb 2015 A1
20150157191 Phee et al. Jun 2015 A1
20150223896 Farritor et al. Aug 2015 A1
20150297299 Yeung et al. Oct 2015 A1
20160066999 Forgione et al. Mar 2016 A1
20160135898 Frederick et al. May 2016 A1
20160291571 Cristiano Oct 2016 A1
20160303745 Rockrohr Oct 2016 A1
20170014197 Mccrea et al. Jan 2017 A1
20170035526 Farritor Feb 2017 A1
20170078583 Haggerty et al. Mar 2017 A1
20170252096 Felder et al. Sep 2017 A1
20170354470 Farritor et al. Dec 2017 A1
20180132956 Cameron May 2018 A1
20180153578 Cooper et al. Jun 2018 A1
20180338777 Bonadio et al. Nov 2018 A1
20190059983 Germain Feb 2019 A1
20190090965 Farritor et al. Mar 2019 A1
20190209262 Mustufa et al. Jul 2019 A1
20190327394 Ramirez Luna et al. Oct 2019 A1
20200138534 Garcia Kilroy et al. May 2020 A1
20200214775 Farritor et al. Jul 2020 A1
20200330175 Cameron Oct 2020 A1
20200368915 Itkowitz et al. Nov 2020 A1
Foreign Referenced Citations (97)
Number Date Country
2918531 Jan 2015 CA
102499759 Jun 2012 CN
102821918 Dec 2012 CN
104523309 Apr 2015 CN
104582600 Apr 2015 CN
104622528 May 2015 CN
204337044 May 2015 CN
105025826 Nov 2015 CN
102010040405 Mar 2012 DE
0105656 Apr 1984 EP
0279591 Aug 1988 EP
1354670 Oct 2003 EP
2286756 Feb 2011 EP
2286756 Feb 2011 EP
2329787 Jun 2011 EP
2563261 Mar 2013 EP
2684528 Jan 2014 EP
2123225 Dec 2014 EP
2815705 Dec 2014 EP
2881046 Oct 2015 EP
2937047 Oct 2015 EP
S59059371 Apr 1984 JP
S61165061 Jul 1986 JP
S62068293 Mar 1987 JP
H04144533 May 1992 JP
05-115425 May 1993 JP
2006508049 Sep 1994 JP
H06507809 Sep 1994 JP
H06508049 Sep 1994 JP
07-016235 Jan 1995 JP
07-136173 May 1995 JP
7306155 Nov 1995 JP
08-224248 Sep 1996 JP
2001500510 Jan 2001 JP
2001505810 May 2001 JP
2002000524 Jan 2002 JP
2003220065 Aug 2003 JP
2004144533 May 2004 JP
2004-180781 Jul 2004 JP
2004283940 Oct 2004 JP
2004322310 Nov 2004 JP
2004329292 Nov 2004 JP
2006507809 Mar 2006 JP
2009106606 May 2009 JP
2009297809 Dec 2009 JP
2010533045 Oct 2010 JP
2010536436 Dec 2010 JP
2011504794 Feb 2011 JP
2011045500 Mar 2011 JP
2011115591 Jun 2011 JP
2012504017 Feb 2012 JP
2012176489 Sep 2012 JP
5418704 Feb 2014 JP
2015526171 Sep 2015 JP
2016213937 Dec 2016 JP
2017113837 Jun 2017 JP
199221291 May 1991 WO
2001089405 Nov 2001 WO
2002082979 Oct 2002 WO
2002100256 Dec 2002 WO
2005009211 Jul 2004 WO
2005044095 May 2005 WO
2006052927 Aug 2005 WO
2006005075 Jan 2006 WO
2006079108 Jan 2006 WO
2006079108 Jul 2006 WO
2007011654 Jan 2007 WO
2007111571 Oct 2007 WO
2007149559 Dec 2007 WO
2009014917 Jan 2009 WO
2009023851 Feb 2009 WO
2009144729 Dec 2009 WO
2009158164 Dec 2009 WO
2010039394 Apr 2010 WO
2010042611 Apr 2010 WO
2010046823 Apr 2010 WO
2010050771 May 2010 WO
2010083480 Jul 2010 WO
2011075693 Jun 2011 WO
2011118646 Sep 2011 WO
2011135503 Nov 2011 WO
2011163520 Dec 2011 WO
2013009887 Jan 2013 WO
2013052137 Apr 2013 WO
2013106569 Jul 2013 WO
2014011238 Jan 2014 WO
2014025399 Feb 2014 WO
2014144220 Sep 2014 WO
2014146090 Sep 2014 WO
2015009949 Jan 2015 WO
2015031777 Mar 2015 WO
2015088655 Jun 2015 WO
2016077478 May 2016 WO
2017024081 Feb 2017 WO
2017064303 Apr 2017 WO
2017201310 Nov 2017 WO
2018045036 Mar 2018 WO
Non-Patent Literature Citations (159)
Entry
Franzino, “The Laprotek Surgical System and the Next Generation of Robotics,” Surg Clin North Am, 2003 83(6): 1317-1320.
Franklin et al., “Prospective Comparison of Open vs. Laparoscopic Colon Surgery for Carcinoma: Five-Year Results,” Dis Colon Rectum, 1996; 39: S35-S46.
Flynn et al, “Tomorrow's surgery: micromotors and microrobots for minimally invasive procedures,” Minimally Invasive Surgery & Allied Technologies, 1998; 7(4): 343-352.
Fireman et al., "Diagnosing small bowel Crohn's disease with wireless capsule endoscopy," Gut 2003; 52: 390-392.
Fearing et al., “Wing Transmission for a Micromechanical Flying Insect,” Proceedings of the 2000 IEEE International Conference to Robotics & Automation, Apr. 2000; 1509-1516.
Faraz et al., "Engineering Approaches to Mechanical and Robotic Design for Minimally Invasive Surgery (MIS)," Kluwer Academic Publishers (Boston), 2000, 13pp.
Falcone et al., “Robotic Surgery,” Clin. Obstet. Gynecol. 2003, 46(1): 37-43.
Fraulob et al., “Miniature assistance module for robot-assisted heart surgery,” Biomed. Tech. 2002, 47 Suppl. 1, Pt. 1: 12-15.
Fukuda et al., “Mechanism and Swimming Experiment of Micro Mobile Robot in Water,” Proceedings of the 1994 IEEE International Conference on Robotics and Automation, 1994: 814-819.
Fukuda et al., “Micro Active Catheter System with Multi Degrees of Freedom,” Proceedings of the IEEE International Conference on Robotics and Automation, May 1994, pp. 2290-2295.
Fuller et al., "Laparoscopic Trocar Injuries: A Report from a U.S. Food and Drug Administration (FDA) Center for Devices and Radiological Health (CDRH) Systematic Technology Assessment of Medical Products (STAMP) Committee," U.S. Food and Drug Administration, available at http://www.fda.gov, Finalized: Nov. 7, 2003; Updated: Jun. 24, 2005, 11 pp.
Dumpert et al., "Improving in Vivo Robot Vision Quality," from the Proceedings of Medicine Meets Virtual Reality, Long Beach, CA, Jan. 26-29, 2005. 1 pg.
Dakin et al., “Comparison of laparoscopic skills performance between standard instruments and two surgical robotic systems,” Surg Endosc., 2003; 17: 574-579.
Cuschieri, “Technology for Minimal Access Surgery,” BMJ, 1999, 319: 1-6.
Grady, “Doctors Try New Surgery for Gallbladder Removal,” The New York Times, Apr. 20, 2007, 3 pp.
Choi et al., “Flexure-based Manipulator for Active Handheld Microsurgical Instrument,” Proceedings of the 27th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBS), Sep. 2005, 4pp.
Chanthasopeephan et al., (2003), "Measuring Forces in Liver Cutting: New Equipment and Experimental Results," Annals of Biomedical Engineering 31: 1372-1382.
Cavusoglu et al., “Robotics for Telesurgery: Second Generation Berkeley/UCSF Laparoscopic Telesurgical Workstation and Looking Towards the Future Applications,” Industrial Robot: An International Journal, 2003; 30(1): 22-29.
Guber et al., "Miniaturized Instrument Systems for Minimally Invasive Diagnosis and Therapy," Biomedizinische Technik, 2002, Band 47, Ergänzungsband 1: 198-201.
Abbott et al., “Design of an Endoluminal NOTES Robotic System,” from the Proceedings of the 2007 IEEE/RSJ Int'l Conf. on Intelligent Robot Systems, San Diego, CA, Oct. 29-Nov. 2, 2007, pp. 410-416.
Allendorf et al., “Postoperative Immune Function Varies Inversely with the Degree of Surgical Trauma in a Murine Model,” Surgical Endoscopy 1997; 11:427-430.
Ang, "Active Tremor Compensation in Handheld Instrument for Microsurgery," Doctoral Dissertation, tech report CMU-RI-TR-04-28, Robotics Institute, Carnegie Mellon University, May 2004, 167pp.
Atmel 80C5X2 Core, http://www.atmel.com, 2006, 186pp.
Bailey et al., “Complications of Laparoscopic Surgery,” Quality Medical Publishers, Inc., 1995, 25pp.
Ballantyne, “Robotic Surgery, Telerobotic Surgery, Telepresence, and Telementoring,” Surgical Endoscopy, 2002; 16: 1389-1402.
Bauer et al., "Case Report: Remote Percutaneous Renal Access Using a New Automated Telesurgical Robotic System," Telemedicine Journal and e-Health 2001; (4): 341-347.
Begos et al., “Laparoscopic Cholecystectomy: From Gimmick to Gold Standard,” J Clin Gastroenterol, 1994; 19(4): 325-330.
Berg et al., “Surgery with Cooperative Robots,” Medicine Meets Virtual Reality, Feb. 2007, 1 pg.
Breda et al., “Future developments and perspectives in laparoscopy,” Eur. Urology 2001; 40(1): 84-91.
Breedveld et al., “Design of Steerable Endoscopes to Improve the Visual Perception of Depth During Laparoscopic Surgery,” ASME, Jan. 2004; vol. 126, pp. 1-5.
Breedveld et al., “Locomotion through the Intestine by means of Rolling Stents,” Proceedings of the ASME Design Engineering Technical Conferences, 2004, pp. 1-7.
Calafiore et al., "Multiple Arterial Conduits Without Cardiopulmonary Bypass: Early Angiographic Results," Ann Thorac Surg, 1999; 67: 450-456.
Camarillo et al., "Robotic Technology in Surgery: Past, Present and Future," The American Journal of Surgery, 2004; 188: 2S-15S.
Cavusoglu et al., “Telesurgery and Surgical Simulation: Haptic Interfaces to Real and Virtual Surgical Environments,” In Mclaughlin, M.L., Hespanha, J.P., and Sukhatme, G., editors. Touch in virtual environments, IMSC Series in Multimedia 2001, 28pp.
Dumpert et al., “Stereoscopic In Vivo Surgical Robots,” IEEE Sensors Special Issue on In Vivo Sensors for Medicine, Jan. 2007, 10 pp.
Green, “Telepresence Surgery”, Jan. 1, 1995, Publisher: IEEE Engineering in Medicine and Biology.
Cleary et al., "State of the Art in Surgical Robotics: Clinical Applications and Technology Challenges", "Computer Aided Surgery", Jan. 1, 2002, pp. 312-328, vol. 6.
Stoianovici et al., “Robotic Tools for Minimally Invasive Urologic Surgery”, Jan. 1, 2002, pp. 1-17.
Lehman et al., Dexterous miniature in vivo robot for NOTES, 2009, IEEE, p. 244-249.
Mihelj et al., ARMin II—7 DoF rehabilitation robot: mechanics and kinematics, 2007, IEEE, p. 4120-4125.
Zhang et al., Cooperative robotic assistant for laparoscopic surgery: CoBRASurge, 2009, IEEE, p. 5540-5545.
Abbou et al., “Laparoscopic Radical Prostatectomy with a Remote Controlled Robot,” The Journal of Urology, Jun. 2001; 165: 1964-1966.
Albers et al., Design and development process of a humanoid robot upper body through experimentation, 2004, IEEE, p. 77-92 (Year: 2004).
Crystal Eyes, http://www.reald.com, 2007 (Stereo 3D visualization for CAVEs, theaters and immersive environments), 1 pg.
Definition of Individually. Dictionary.com, retrieved on Aug. 9, 2016; Retrieved from the Internet: <http://www.dictionary.com/browse/individually>, 1 page.
Glukhovsky et al., “The development and application of wireless capsule endoscopy,” Int. J. Med. Robot. Comput. Assist. Surgery, 2004; 1(1): 114-123.
Gong et al., “Wireless endoscopy,” Gastrointestinal Endoscopy 2000; 51 (6): 725-729.
Gopura et al., Mechanical designs of active upper-limb exoskeleton robots: State-of-the-art and design difficulties, 2009, IEEE, p. 178-187 (Year: 2009).
Gopura et al., A brief review on upper extremity robotic exoskeleton systems, 2011, IEEE, p. 346-351 (Year: 2011).
Guo et al., “Micro Active Guide Wire Catheter System—Characteristic Evaluation, Electrical Model* and Operability Evaluation of Micro Active Catheter,” Proceedings of the 1996 IEEE International Conference on Robotics and Automation, Apr. 1996; 2226-2231.
Guo et al., “Fish-like Underwater Microrobot with 3 DOF,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, May 2002; 738-743.
Hanly et al., “Robotic Abdominal Surgery,” The American Journal of Surgery, 2004; 188 (Suppl. to Oct. 1994); 19S-26S.
Hanly et al., “Value of the SAGES Learning Center in introducing new technology,” Surgical Endoscopy, 2004; 19(4): 477-483.
Heikkinen et al., “Comparison of laparoscopic and open Nissen fundoplication two years after operation: A prospective randomized trial,” Surgical Endoscopy, 2000; 14:1019-1023.
Hissink, “Olympus Medical develops capsule camera technology,” Dec. 2004, accessed Aug. 29, 2007, http://www.letsgodigital.org, 3 pp.
Horgan et al., “Technical Report: Robots in Laparoscopic Surgery,” Journal of Laparoendoscopic & Advanced Surgical Techniques, 2001; 11(6): 415-419.
Ishiyama et al., “Spiral-type Micro-machine for Medical Applications,” 2000 International Symposium on Micromechatronics and Human Science, 2000; 65-69.
Jagannath et al., “Peroral transgastric endoscopic ligation of fallopian tubes with long-term survival in a porcine model,” Gastrointestinal Endoscopy, 2005; 61 (3): 449-453.
Kalloo et al., “Flexible transgastric peritoneoscopy: a novel approach to diagnostic and therapeutic interventions in the peritoneal cavity,” Gastrointestinal Endoscopy, 2004; 60(1): 114-117.
Kang et al., “Robotic Assistants Aid Surgeons During Minimally Invasive Procedures,” IEEE Engineering in Medicine and Biology, Jan.-Feb. 2001: 94-104.
Kantsevoy et al., “Transgastric endoscopic splenectomy,” Surgical Endoscopy, 2006; 20: 522-525.
Kantsevoy et al., “Endoscopic gastrojejunostomy with survival in a porcine model,” Gastrointestinal Endoscopy, 2005; 62(2): 287-292.
Kazemier et al. (1998), "Vascular Injuries During Laparoscopy," J. Am. Coll. Surg. 186(5): 604-5.
Keller et al., Design of the pediatric arm rehabilitation robot ChARMin, 2014, IEEE, p. 530-535 (Year: 2014).
Kim, “Early Experience with Telemanipulative Robot-Assisted Laparoscopic Cholecystectomy Using da Vinci,” Surgical Laparoscopy, Endoscopy & Percutaneous Techniques, 2002; 12(1): 33-40.
Ko et al., “Per-Oral transgastric abdominal surgery,” Chinese Journal of Digestive Diseases, 2006; 7: 67-70.
Lafullarde et al., “Laparoscopic Nissen Fundoplication: Five-year Results and Beyond,” Arch/Surg, Feb. 2001; 136: 180-184.
Leggett et al. (2002), “Aortic injury during laparoscopic Fundoplication,” Surg. Endoscopy 16(2): 362.
Li et al. (2000), “Microvascular Anastomoses Performed in Rats Using a Microsurgical Telemanipulator,” Comp. Aid. Surg., 5: 326-332.
Liem et al., “Comparison of Conventional Anterior Surgery and Laparoscopic Surgery for Inguinal-hernia Repair,” New England Journal of Medicine, 1997; 336 (22):1541-1547.
Lou Cubrich, “A Four-DOF Laparo-Endoscopic Single Site Platform for Rapidly-Developing Next Generation Surgical Robotics”, Journal of Medical Robotics Research, vol. 1, No. 4, 2016, 165006-1-165006-15.
Macfarlane et al., “Force-Feedback Grasper Helps Restore the Sense of Touch in Minimally Invasive Surgery,” Journal of Gastrointestinal Surgery, 1999; 3: 278-285.
Mack et al., “Present Role of Thoracoscopy in the Diagnosis and Treatment of Diseases of the Chest,” Ann Thorac Surgery, 1992; 54: 403-409.
Mack, “Minimally Invasive and Robotic Surgery,” JAMA, Feb. 2001; 285(5): 568-572.
Mei et al., “Wireless Drive and Control of a Swimming Microrobot,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, May 2002: 1131-1136.
Menciassi et al., “Robotic Solutions and Mechanisms for a Semi-Autonomous Endoscope,” Proceedings of the 2002 IEEE/RSJ Intl. Conference on Intelligent Robots and Systems, Oct. 2002; 1379-1384.
Melvin et al., “Computer-Enhanced vs. Standard Laparoscopic Antireflux Surgery,” J Gastrointest Surg 2002; 6: 11-16.
Menciassi et al., "Locomotion of a Legged Capsule in the Gastrointestinal Tract: Theoretical Study and Preliminary Technological Results," IEEE Int. Conf. on Engineering in Medicine and Biology, San Francisco, CA, pp. 2767-2770, Sep. 2004.
Menciassi et al., “Shape memory alloy clamping devices of a capsule for monitoring tasks in the gastrointestinal tract,” J. Micromech. Microeng, 2005; 15: 2045-2055.
Meron, “The development of the swallowable video capsule (M2A),” Gastrointestinal Endoscopy 2000; 52 6: 817-819.
Micron, http://www.micron.com, 2006, ¼-inch VGA NTSC/PAL CMOS Digital Image Sensor, 98 pp.
Midday, Jeff, et al., "Material Handling System for Robotic Natural Orifice Surgery," Proceedings of the 2011 Design of Medical Devices Conference, Apr. 12-14, 2011, Minneapolis, MN, 4 pages.
Miller, Ph.D., et al., “In-Vivo Stereoscopic Imaging System with 5 Degrees-of-Freedom for Minimal Access Surgery,” Dept. of Computer Science and Dept. of Surgery, Columbia University, New York, NY, 7 pp., 2004.
Munro (2002), "Laparoscopic access: complications, technologies, and techniques," Curr. Opin. Obstet. Gynecol., 14 (4): 365-74.
Nio et al., “Efficiency of manual vs robotical (Zeus) assisted laparoscopic surgery in the performance of standardized tasks,” Surg Endosc, 2002; 16: 412-415.
Oleynikov et al., “In Vivo Camera Robots Provide Improved Vision for Laparoscopic Surgery,” Computer Assisted Radiology and Surgery (CARS), Chicago, IL, Jun. 23-26, 2004b.
Oleynikov et al., “Miniature Robots Can Assist in Laparoscopic Cholecystectomy,” Journal of Surgical Endoscopy, 19-4: 473-476, 2005.
Oleynikov et al., “In Vivo Robotic Laparoscopy,” Surgical Innovation, Jun. 2005, 12(2): 177-181.
O'Neill, “Surgeon takes new route to gallbladder,” The Oregonian, Jun. 2007; 2 pp.
Orlando et al. (2003), “Needle and Trocar Injuries in Diagnostic Laparoscopy under Local Anesthesia: What Is the True Incidence of These Complications?” Journal of Laparoendoscopic & Advanced Surgical Techniques, 13(3): 181-184.
Palm. William. “Rapid Prototyping Primer” May 1998 (revised Jul. 30, 2002) (http://www.me.psu.edu/lamancusa/rapidpro/primer/chapter2.htm), 12 pages.
Park et al., “Experimental studies of transgastric gallbladder surgery: cholecystectomy and cholecystogastric anastomosis (videos),” Gastrointestinal Endoscopy, 2005; 61 (4): 601-606.
Park et al., “Trocar-less Instrumentation for Laparoscopy: Magnetic Positioning of Intra-abdominal Camera and Retractor,” Ann Surg, Mar. 2007; 245(3): 379-384.
Patronik et al., “Crawling on the Heart: A Mobile Robotic Device for Minimally Invasive Cardiac Interventions,” MICCAI, 2004, pp. 9-16.
Patronik et al., “Development of a Tethered Epicardial Crawler for Minimally Invasive Cardiac Therapies,” IEEE, pp. 239-240, 2004.
Patronik et al., “Preliminary evaluation of a mobile robotic device for navigation and intervention on the beating heart,” Computer Aided Surgery, 10(4): 225-232, Jul. 2005.
Peirs et al., “A miniature manipulator for integration in a self-propelling endoscope,” Sensors and Actuators A, 2001, 92: 343-349.
Peters, “Minimally Invasive Colectomy: Are the Potential Benefits Realized?” Dis Colon Rectum 1993; 36: 751-756.
Phee et al., “Development of Microrobotic Devices for Locomotion in the Human Gastrointestinal Tract,” International Conference on Computational Intelligence, Robotics and Autonomous Systems (CI RAS 2001), Nov. 28-30, (2001), Singapore, 6 pages.
Phee et al., “Analysis and Development of Locomotion Devices for the Gastrointestinal Tract,” IEEE Transactions on Biomedical Engineering, vol. 49, No. 6, Jun. 2002: 613-616.
Platt et al., “In Vivo Robotic Cameras can Enhance Imaging Capability During Laparoscopic Surgery,” from the Proceedings of the Society of American Gastrointestinal Endoscopic Surgeons (SAGES) Scientific Conference, Ft. Lauderdale, FL, Apr. 13-16, 2005; 1 pg.
Qian Huan et al., “Multi-joint Single-wound Minimally Invasive Abdominal Surgery Robot Design,” Mechanical Design and Manufacturing, May 8, 2014, pp. 134-137.
Rentschler et al., “In vivo Mobile Surgical Robotic Task Assistance,” 1 pg.
Rentschler et al., “Theoretical and Experimental Analysis of In Vivo Wheeled Mobility,” ASME Design Engineering Technical Conferences: 28th Biennial Mechanisms and Robotics Conference, Salt Lake City, Utah, Sep. 28-Oct. 2, 2004; pp. 1-9.
Rentschler et al., "In Vivo Robots for Laparoscopic Surgery," Studies in Health Technology and Informatics—Medicine Meets Virtual Reality, IOS Press, Newport Beach, CA, 2004a, 98: 316-322.
Rentschler et al., "Toward In Vivo Mobility," Studies in Health Technology and Informatics—Medicine Meets Virtual Reality, IOS Press, Long Beach, CA, 2005a, 111: 397-403.
Rentschler et al., “Mobile In Vivo Robots Can Assist in Abdominal Exploration,” from the Proceedings of the Society of American Gastrointestinal Endoscopic Surgeons (SAGES) Scientific Conference, Ft. Lauderdale, FL, April 13-16, 2005b.
Rentschler et al., “Modeling, Analysis, and Experimental Study of In Vivo Wheeled Robotic Mobility,” IEEE Transactions on Robotics, 22 (2): 308-321, 2005c.
Rentschler et al., "Miniature in vivo robots for remote and harsh environments," IEEE Transactions on Information Technology in Biomedicine, Jan. 2006; 12(1): pp. 66-75.
Rentschler et al., “Mechanical Design of Robotic In Vivo Wheeled Mobility,” ASME Journal of Mechanical Design, 2006a; pp. 1-11, Accepted.
Rentschler et al., “Mobile In Vivo Camera Robots Provide Sole Visual Feedback for Abdominal Exploration and Cholecystectomy,” Journal of Surgical Endoscopy, 20-1: 135-138, 2006b.
Rentschler et al., “Natural Orifice Surgery with an Endoluminal Mobile Robot,” The Society of American Gastrointestinal Endoscopic Surgeons, Dallas, TX, April 2006d.
Rentschler et al., "Mobile In Vivo Biopsy and Camera Robot," Studies in Health Technology and Informatics—Medicine Meets Virtual Reality, vol. 119: 449-454, IOS Press, Long Beach, CA, 2006e.
Rentschler et al., “Mobile In Vivo Biopsy Robot,” IEEE International Conference on Robotics and Automation, Orlando, Florida, May 2006; 4155-4160.
Rentschler et al., “In vivo Robotics during the NEEMO 9 Mission,” Medicine Meets Virtual Reality, Feb. 2007; 1 pg.
Rentschler et al., “An In Vivo Mobile Robot for Surgical Vision and Task Assistance,” Journal of Medical Devices, Mar. 2007; vol. 1: 23-29.
Riviere et al., “Toward Active Tremor Canceling in Handheld Microsurgical Instruments,” IEEE Transactions on Robotics and Automation, Oct. 2003, 19(5): 793-800.
Rosen et al., “Force Controlled and Teleoperated Endoscopic, Grasper for Minimally Invasive Surgery-Experimental Performance Evaluation,” IEEE Transactions of Biomedical Engineering, Oct. 1999; 46(10): 1212-1221.
Rosen et al., “Task Decomposition of Laparoscopic Surgery for Objective Evaluation of Surgical Residents' Learning Curve Using Hidden Markov Model,” Computer Aided Surgery, vol. 7, pp. 49-61, 2002.
Rosen et al., “The Blue DRAGON—A System of Measuring the Kinematics and the Dynamics of Minimally Invasive Surgical Tools In-Vivo,” Proc. of the 2002 IEEE International Conference on Robotics and Automation, Washington, DC, pp. 1876-1881, May 2002.
Rosen et al., "Spherical Mechanism Analysis of a Surgical Robot for Minimally Invasive Surgery—Analytical and Experimental Approaches," Studies in Health Technology and Informatics—Medicine Meets Virtual Reality, pp. 442-448, Jan. 2005.
Ruurda et al., “Feasibility of Robot-Assisted Laparoscopic Surgery,” Surgical Laparoscopy, Endoscopy & Percutaneous Techniques, 2002; 12(1):41-45.
Ruurda et al, “Robot-Assisted surgical systems: a new era in laparoscopic surgery,” Ann R. Coll Surg Engl. 2002; 84: 223-226.
Sackier et al., “Robotically assisted laparoscopic surgery,” Surgical Endoscopy, 1994; 8:63-6.
Salky, “What is the Penetration of Endoscopic Techniques into Surgical Practice?” Digestive Surgery 2000; 17:422-426.
Satava, “Surgical Robotics: The Early Chronicles,” Surgical Laparoscopy, Endoscopy & Percutaneous Techniques, 2002; 12(1):6-16.
Schippers et al. (1996), “Requirements and Possibilities of Computer-Assisted Endoscopic Surgery,” In: Computer Integrated Surgery: Technology and Clinical Applications, pp. 561-565.
Schurr et al., “Robotics and Telemanipulation Technologies for Endoscopic Surgery,” Surgical Endoscopy, 2000; 14:375-381.
Schwartz, “In the Lab: Robots that Slink and Squirm,” The New York Times, Mar. 27, 2007, 4 pp.
Sharp LL-151-3D, http://www.sharp3d.com, 2006, 2 pp.
Slatkin et al., “The Development of a Robotic Endoscope,” Proceedings of the 1995 IEEE International Conference on Robotics and Automation, pp. 162-171, 1995.
Smart Pill "Fantastic Voyage: Smart Pill to Expand Testing," http://www.smartpilldiagnostics.com, Apr. 13, 2005, 1 pg.
Sodeyama et al., A shoulder structure of muscle-driven humanoid with shoulder blades, 2005, IEEE, p. 1-6 (Year: 2005).
Southern Surgeons Club (1991), "A prospective analysis of 1518 laparoscopic cholecystectomies," N. Engl. J. Med. 324 (16): 1073-1078.
Stefanini et al., “Modeling and Experiments on a Legged Microrobot Locomoting in a Tubular Compliant and Slippery Environment,” Int. Journal of Robotics Research, vol. 25, No. 5-6, pp. 551-560, May-Jun. 2006.
Stiff et al., “Long-term Pain: Less Common After Laparoscopic than Open Cholecystectomy,” British Journal of Surgery, 1994; 81: 1368-1370.
Strong et al., "Efficacy of Novel Robotic Camera vs. a Standard Laparoscopic Camera," Surgical Innovation vol. 12, No. 4, Dec. 2005, Westminster Publications, Inc., pp. 315-318.
Suzumori et al., “Development of Flexible Microactuator and its Applications to Robotics Mechanisms,” Proceedings of the IEEE International Conference on Robotics and Automation, 1991: 1622-1627.
Taylor et al., “A Telerobotic Assistant for Laparoscopic Surgery,” IEEE Eng Med Biol, 1995; 279-87.
Tendick et al. (1993), “Sensing and Manipulation Problems in Endoscopic Surgery: Experiment, Analysis, and Observation,” Presence 2(1): 66-81.
Tendick et al., “Applications of Micromechatronics in Minimally Invasive Surgery,” IEEE/ASME Transactions on Mechatronics, 1998; 3(1): 34-42.
Thomann et al., “The Design of a new type of Micro Robot for the Intestinal Inspection,” Proceedings of the 2002 IEEE Intl. Conference on Intelligent Robots and Systems, Oct. 2002: 1385-1390.
U.S. Appl. No. 60/180,960, filed Feb. 2000.
U.S. Appl. No. 60/956,032, filed Aug. 15, 2007.
U.S. Appl. No. 60/983,445, filed Oct. 29, 2007.
U.S. Appl. No. 60/990,062, filed Nov. 26, 2007.
U.S. Appl. No. 60/990,076, filed Nov. 26, 2007.
U.S. Appl. No. 60/990,086, filed Nov. 26, 2007.
U.S. Appl. No. 60/990,106, filed Nov. 26, 2007.
U.S. Appl. No. 60/990,470, filed Nov. 27, 2007.
U.S. Appl. No. 61/025,346, filed Feb. 1, 2008.
U.S. Appl. No. 61/030,588, filed Feb. 22, 2008.
U.S. Appl. No. 61/030,617, filed Feb. 22, 2008.
Worn et al., "Esprit Project No. 33915: Miniaturised Robot for Micro Manipulation (MINIMAN)," Nov. 1998, http://www.ipr.ira.ujka.de/-microbot/miniman.
Way et al., editors, “Fundamentals of Laparoscopic Surgery,” Churchill Livingstone Inc., 1995; 14 pp.
Wolfe et al. (1991), Endoscopic Cholecystectomy: An analysis of Complications, Arch. Surg. 1991; 126: 1192-1196.
Xu et al., “System Design of an Insertable Robotic Effector Platform for Single Access (SPA) Surgery”, The 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, Oct. 11-15, 2009, St. Louis MO USA pp. 5546-5552.
Yu, BSN, RN, "M2A™ Capsule Endoscopy A Breakthrough Diagnostic Tool for Small Intestine Imaging," vol. 25, No. 1, 2001, Gastroenterology Nursing, pp. 24-27.
Yu et al., “Microrobotic Cell Injection,” Proceedings of the 2001 IEEE International Conference on Robotics and Automation, May 2001: 620-625.
Related Publications (1)
Number Date Country
20200214775 A1 Jul 2020 US
Provisional Applications (1)
Number Date Country
62789029 Jan 2019 US