Cooperative surgical actions for robot-assisted surgical platforms

Information

  • Patent Grant
  • Patent Number
    11,058,498
  • Date Filed
    Thursday, March 29, 2018
  • Date Issued
    Tuesday, July 13, 2021
Abstract
Various robotic surgical systems are disclosed. A robotic surgical system comprises: a first robotic arm comprising a first force sensor, a second robotic arm comprising a second force sensor, and a control unit. The control unit comprises a processor and a memory communicatively coupled to the processor. The memory stores instructions executable by the processor to receive a first input from the first force sensor, to receive a second input from the second force sensor, and to effect cooperative movement of the first robotic arm and the second robotic arm based on the first input from the first force sensor and the second input from the second force sensor in a load control mode.
Description
BACKGROUND

The present disclosure relates to robotic surgical systems. Robotic surgical systems can include a central control unit, a surgeon's command console, and a robot having one or more robotic arms. Robotic surgical tools can be releasably mounted to the robotic arm(s). The number and type of robotic surgical tools can depend on the type of surgical procedure. Robotic surgical systems can be used in connection with one or more displays and/or one or more handheld surgical instruments during a surgical procedure.


SUMMARY

In one general aspect, a robotic surgical system is provided. The robotic surgical system comprises: a first robotic arm comprising a first force sensor; a second robotic arm comprising a second force sensor; and a control unit. The control unit comprises a processor and a memory communicatively coupled to the processor. The memory stores instructions executable by the processor to: receive a first input from the first force sensor; receive a second input from the second force sensor; and effect cooperative movement of the first robotic arm and the second robotic arm based on the first input from the first force sensor and the second input from the second force sensor in a load control mode.
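
The load control mode described in this aspect can be illustrated with a brief control-loop sketch. This is a minimal illustration only, assuming hypothetical Arm and sensor interfaces and an arbitrary proportional gain; none of these names or values come from the disclosure.

```python
from dataclasses import dataclass


@dataclass
class Arm:
    """Hypothetical stand-in for a robotic arm with an integral force sensor."""
    position: float = 0.0
    sensed_force: float = 0.0

    def read_force(self) -> float:
        # First/second input: the reading from the arm's force sensor.
        return self.sensed_force

    def move(self, delta: float) -> None:
        self.position += delta


def load_control_step(arm1: Arm, arm2: Arm, target_load: float, gain: float = 0.01) -> None:
    """One iteration of a load control mode: displace both arms together so the
    combined force reported by the two force sensors tracks target_load."""
    f1 = arm1.read_force()              # first input, from the first force sensor
    f2 = arm2.read_force()              # second input, from the second force sensor
    error = target_load - (f1 + f2)
    arm1.move(gain * error)             # cooperative movement based on
    arm2.move(gain * error)             # both sensor inputs
```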


In another general aspect, a robotic surgical system is provided. The robotic surgical system comprises: a first robotic arm comprising a first sensor; a second robotic arm comprising a second sensor; and a control unit. The control unit comprises a processor and a memory communicatively coupled to the processor. The memory stores instructions executable by the processor to: receive a first input from the first sensor; receive a second input from the second sensor; and effect cooperative movement of the first robotic arm and the second robotic arm based on the first input from the first sensor and the second input from the second sensor.


In yet another general aspect, a computer-readable medium is provided. The computer-readable medium is non-transitory and stores computer-readable instructions which, when executed, cause a machine to: receive a first input from a first force sensor; receive a second input from a second force sensor; and effect cooperative movement of a first robotic arm and a second robotic arm based on the first input from the first force sensor and the second input from the second force sensor in a load control mode.


In another general aspect, a robotic surgical system is provided. The robotic surgical system comprises: a first robotic arm comprising a first sensor; a second robotic arm comprising a second sensor; and a control circuit. The control circuit is configured to: receive a first input from the first sensor; receive a second input from the second sensor; and effect cooperative movement of the first robotic arm and the second robotic arm based on the first input from the first sensor and the second input from the second sensor.





BRIEF DESCRIPTION OF THE FIGURES

The features of various aspects are set forth with particularity in the appended claims. The various aspects, however, both as to organization and methods of operation, together with further objects and advantages thereof, may best be understood by reference to the following description, taken in conjunction with the accompanying drawings as follows.



FIG. 1 is a block diagram of a computer-implemented interactive surgical system, in accordance with at least one aspect of the present disclosure.



FIG. 2 is a surgical system being used to perform a surgical procedure in an operating room, in accordance with at least one aspect of the present disclosure.



FIG. 3 is a surgical hub paired with a visualization system, a robotic system, and an intelligent instrument, in accordance with at least one aspect of the present disclosure.



FIG. 4 is a partial perspective view of a surgical hub enclosure, and of a combo generator module slidably receivable in a drawer of the surgical hub enclosure, in accordance with at least one aspect of the present disclosure.



FIG. 5 is a perspective view of a combo generator module with bipolar, ultrasonic, and monopolar contacts and a smoke evacuation component, in accordance with at least one aspect of the present disclosure.



FIG. 6 illustrates individual power bus attachments for a plurality of lateral docking ports of a lateral modular housing configured to receive a plurality of modules, in accordance with at least one aspect of the present disclosure.



FIG. 7 illustrates a vertical modular housing configured to receive a plurality of modules, in accordance with at least one aspect of the present disclosure.



FIG. 8 illustrates a surgical data network comprising a modular communication hub configured to connect modular devices located in one or more operating theaters of a healthcare facility, or any room in a healthcare facility specially equipped for surgical operations, to the cloud, in accordance with at least one aspect of the present disclosure.



FIG. 9 illustrates a computer-implemented interactive surgical system, in accordance with at least one aspect of the present disclosure.



FIG. 10 illustrates a surgical hub comprising a plurality of modules coupled to the modular control tower, in accordance with at least one aspect of the present disclosure.



FIG. 11 illustrates one aspect of a Universal Serial Bus (USB) network hub device, in accordance with at least one aspect of the present disclosure.



FIG. 12 illustrates a logic diagram of a control system of a surgical instrument or tool, in accordance with at least one aspect of the present disclosure.



FIG. 13 illustrates a control circuit configured to control aspects of the surgical instrument or tool, in accordance with at least one aspect of the present disclosure.



FIG. 14 illustrates a combinational logic circuit configured to control aspects of the surgical instrument or tool, in accordance with at least one aspect of the present disclosure.



FIG. 15 illustrates a sequential logic circuit configured to control aspects of the surgical instrument or tool, in accordance with at least one aspect of the present disclosure.



FIG. 16 illustrates a surgical instrument or tool comprising a plurality of motors which can be activated to perform various functions, in accordance with at least one aspect of the present disclosure.



FIG. 17 is a schematic diagram of a robotic surgical instrument configured to operate a surgical tool described herein, in accordance with at least one aspect of the present disclosure.



FIG. 18 illustrates a block diagram of a surgical instrument programmed to control the distal translation of a displacement member, in accordance with at least one aspect of the present disclosure.



FIG. 19 is a schematic diagram of a surgical instrument configured to control various functions, in accordance with at least one aspect of the present disclosure.



FIG. 20 is a simplified block diagram of a generator configured to provide inductorless tuning, among other benefits, in accordance with at least one aspect of the present disclosure.



FIG. 21 illustrates an example of a generator, which is one form of the generator of FIG. 20, in accordance with at least one aspect of the present disclosure.



FIG. 22 is a schematic of a robotic surgical system, in accordance with one aspect of the present disclosure.



FIG. 23 is a schematic of a robotic surgical system, in accordance with at least one aspect of the present disclosure.



FIG. 24 is a block diagram of control components for the robotic surgical system of FIG. 23, in accordance with at least one aspect of the present disclosure.



FIG. 25A is an elevation view of an ultrasonic surgical tool positioned out of contact with tissue, in accordance with at least one aspect of the present disclosure.



FIG. 25B is an elevation view of the ultrasonic surgical tool of FIG. 25A positioned in abutting contact with tissue, in accordance with at least one aspect of the present disclosure.



FIG. 26A is an elevation view of a monopolar cautery pencil positioned out of contact with tissue, in accordance with at least one aspect of the present disclosure.



FIG. 26B is an elevation view of the monopolar cautery pencil of FIG. 26A positioned in abutting contact with tissue, in accordance with at least one aspect of the present disclosure.



FIG. 27 is a graphical display of continuity and current over time for the ultrasonic surgical tool of FIGS. 25A and 25B, in accordance with at least one aspect of the present disclosure.



FIG. 28 illustrates an end effector comprising radio frequency (RF) data sensors located on a jaw member, in accordance with at least one aspect of the present disclosure.



FIG. 29 illustrates the sensors shown in FIG. 28 mounted to or formed integrally with a flexible circuit, in accordance with at least one aspect of the present disclosure.



FIG. 30 is a flow chart depicting an automatic activation mode of a surgical instrument, in accordance with at least one aspect of the present disclosure.



FIG. 31 is a perspective view of an end effector of a bipolar radio frequency (RF) surgical tool having a smoke evacuation pump for use with a robotic surgical system, depicting the surgical tool clamping and treating tissue, in accordance with at least one aspect of the present disclosure.



FIG. 32 is a block diagram of a surgical system comprising a robotic surgical system, a handheld surgical instrument, and a surgical hub, in accordance with at least one aspect of the present disclosure.



FIG. 33 is a perspective view of a handle portion of a handheld surgical instrument including a display and further depicting a detail view of the display depicting information from the instrument itself, in accordance with at least one aspect of the present disclosure.



FIG. 34 is a perspective view of the handle portion of the handheld surgical instrument of FIG. 33 depicting the instrument paired with a surgical hub and further including a detail view of the display depicting information from the surgical hub, in accordance with at least one aspect of the present disclosure.



FIG. 35 is a schematic of a colon resection procedure, in accordance with at least one aspect of the present disclosure.



FIG. 36 is a graphical display of force over time for the colon resection procedure displayed on the instrument display in FIG. 35, in accordance with at least one aspect of the present disclosure.



FIG. 37 is a schematic of a robotic surgical system during a surgical procedure including a plurality of hubs and interactive secondary displays, in accordance with at least one aspect of the present disclosure.



FIG. 38 is a detail view of the interactive secondary displays of FIG. 37, in accordance with at least one aspect of the present disclosure.



FIG. 39 is a block diagram of a robotic surgical system comprising more than one robotic arm, in accordance with at least one aspect of the present disclosure.



FIG. 40 is a schematic of a surgical procedure utilizing the robotic surgical system of FIG. 39, in accordance with at least one aspect of the present disclosure.



FIG. 41 shows graphical representations of forces and positional displacements experienced by the robotic arms of FIG. 39, in accordance with at least one aspect of the present disclosure.



FIG. 42 is a flow chart depicting an algorithm for controlling the position of the robotic arms of a robotic surgical system, in accordance with at least one aspect of the present disclosure.



FIG. 43 is a flow chart depicting an algorithm for controlling the forces exerted by robotic arms of a robotic surgical system, in accordance with at least one aspect of the present disclosure.



FIG. 44 is a flow chart depicting an algorithm for monitoring the position and forces exerted by robotic arms of a robotic surgical system, in accordance with at least one aspect of the present disclosure.



FIG. 45 is a block diagram of a surgical system comprising a robotic surgical system, a powered handheld surgical instrument, and a surgical hub, in accordance with at least one aspect of the present disclosure.



FIG. 46 is a perspective view of a robotic tool and a handheld surgical instrument during a surgical procedure, in accordance with at least one aspect of the present disclosure.



FIG. 47 is a schematic depicting communication links between surgical hubs and a primary server, in accordance with at least one aspect of the present disclosure.



FIG. 48 is a flow chart depicting a queue for external output of data received from the various surgical hubs of FIG. 47, in accordance with at least one aspect of the present disclosure.



FIG. 49 is a timeline depicting situational awareness of a surgical hub, in accordance with one aspect of the present disclosure.





DETAILED DESCRIPTION

Applicant of the present application owns the following U.S. Provisional Patent Applications, filed on Mar. 28, 2018, each of which is herein incorporated by reference in its entirety:

    • U.S. Provisional Patent Application Ser. No. 62/649,302, titled INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATION CAPABILITIES;
    • U.S. Provisional Patent Application Ser. No. 62/649,294, titled DATA STRIPPING METHOD TO INTERROGATE PATIENT RECORDS AND CREATE ANONYMIZED RECORD;
    • U.S. Provisional Patent Application Ser. No. 62/649,300, titled SURGICAL HUB SITUATIONAL AWARENESS;
    • U.S. Provisional Patent Application Ser. No. 62/649,309, titled SURGICAL HUB SPATIAL AWARENESS TO DETERMINE DEVICES IN OPERATING THEATER;
    • U.S. Provisional Patent Application Ser. No. 62/649,310, titled COMPUTER IMPLEMENTED INTERACTIVE SURGICAL SYSTEMS;
    • U.S. Provisional Patent Application Ser. No. 62/649,291, titled USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT;
    • U.S. Provisional Patent Application Ser. No. 62/649,296, titled ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL DEVICES;
    • U.S. Provisional Patent Application Ser. No. 62/649,333, titled CLOUD-BASED MEDICAL ANALYTICS FOR CUSTOMIZATION AND RECOMMENDATIONS TO A USER;
    • U.S. Provisional Patent Application Ser. No. 62/649,327, titled CLOUD-BASED MEDICAL ANALYTICS FOR SECURITY AND AUTHENTICATION TRENDS AND REACTIVE MEASURES;
    • U.S. Provisional Patent Application Ser. No. 62/649,315, titled DATA HANDLING AND PRIORITIZATION IN A CLOUD ANALYTICS NETWORK;
    • U.S. Provisional Patent Application Ser. No. 62/649,313, titled CLOUD INTERFACE FOR COUPLED SURGICAL DEVICES;
    • U.S. Provisional Patent Application Ser. No. 62/649,320, titled DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS;
    • U.S. Provisional Patent Application Ser. No. 62/649,307, titled AUTOMATIC TOOL ADJUSTMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; and
    • U.S. Provisional Patent Application Ser. No. 62/649,323, titled SENSING ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS.


Applicant of the present application owns the following U.S. patent applications, filed on Mar. 29, 2018, each of which is herein incorporated by reference in its entirety:

    • U.S. patent application Ser. No. 15/940,641, titled INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATION CAPABILITIES; now U.S. Patent Application Publication No. 2019/0207911;
    • U.S. patent application Ser. No. 15/940,648, titled INTERACTIVE SURGICAL SYSTEMS WITH CONDITION HANDLING OF DEVICES AND DATA CAPABILITIES; now U.S. Patent Application Publication No. 2019/0206004;
    • U.S. patent application Ser. No. 15/940,656, titled SURGICAL HUB COORDINATION OF CONTROL AND COMMUNICATION OF OPERATING ROOM DEVICES; now U.S. Patent Application Publication No. 2019/0201141;
    • U.S. patent application Ser. No. 15/940,666, titled SPATIAL AWARENESS OF SURGICAL HUBS IN OPERATING ROOMS; now U.S. Patent Application Publication No. 2019/0206551;
    • U.S. patent application Ser. No. 15/940,670, titled COOPERATIVE UTILIZATION OF DATA DERIVED FROM SECONDARY SOURCES BY INTELLIGENT SURGICAL HUBS; now U.S. Patent Application Publication No. 2019/0201116;
    • U.S. patent application Ser. No. 15/940,677, titled SURGICAL HUB CONTROL ARRANGEMENTS; now U.S. Patent Application Publication No. 2019/0201143;
    • U.S. patent application Ser. No. 15/940,632, titled DATA STRIPPING METHOD TO INTERROGATE PATIENT RECORDS AND CREATE ANONYMIZED RECORD; now U.S. Patent Application Publication No. 2019/0205566;
    • U.S. patent application Ser. No. 15/940,640, titled COMMUNICATION HUB AND STORAGE DEVICE FOR STORING PARAMETERS AND STATUS OF A SURGICAL DEVICE TO BE SHARED WITH CLOUD BASED ANALYTICS SYSTEMS; now U.S. Patent Application Publication No. 2019/0200863;
    • U.S. patent application Ser. No. 15/940,645, titled SELF DESCRIBING DATA PACKETS GENERATED AT AN ISSUING INSTRUMENT; now U.S. Patent Application Publication No. 2019/0207773;
    • U.S. patent application Ser. No. 15/940,649, titled DATA PAIRING TO INTERCONNECT A DEVICE MEASURED PARAMETER WITH AN OUTCOME; now U.S. Patent Application Publication No. 2019/0205567;
    • U.S. patent application Ser. No. 15/940,654, titled SURGICAL HUB SITUATIONAL AWARENESS; now U.S. Patent Application Publication No. 2019/0201140;
    • U.S. patent application Ser. No. 15/940,663, titled SURGICAL SYSTEM DISTRIBUTED PROCESSING; now U.S. Patent Application Publication No. 2019/0201033;
    • U.S. patent application Ser. No. 15/940,668, titled AGGREGATION AND REPORTING OF SURGICAL HUB DATA; now U.S. Patent Application Publication No. 2019/0201115;
    • U.S. patent application Ser. No. 15/940,671, titled SURGICAL HUB SPATIAL AWARENESS TO DETERMINE DEVICES IN OPERATING THEATER; now U.S. Patent Application Publication No. 2019/0201104;
    • U.S. patent application Ser. No. 15/940,686, titled DISPLAY OF ALIGNMENT OF STAPLE CARTRIDGE TO PRIOR LINEAR STAPLE LINE; now U.S. Patent Application Publication No. 2019/0201105;
    • U.S. patent application Ser. No. 15/940,700, titled STERILE FIELD INTERACTIVE CONTROL DISPLAYS; now U.S. Patent Application Publication No. 2019/0205001;
    • U.S. patent application Ser. No. 15/940,629, titled COMPUTER IMPLEMENTED INTERACTIVE SURGICAL SYSTEMS; now U.S. Patent Application Publication No. 2019/0201112;
    • U.S. patent application Ser. No. 15/940,704, titled USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT; now U.S. Patent Application Publication No. 2019/0206050;
    • U.S. patent application Ser. No. 15/940,722, titled CHARACTERIZATION OF TISSUE IRREGULARITIES THROUGH THE USE OF MONO-CHROMATIC LIGHT REFRACTIVITY; now U.S. Patent Application Publication No. 2019/0200905; and
    • U.S. patent application Ser. No. 15/940,742, titled DUAL CMOS ARRAY IMAGING; now U.S. Patent Application Publication No. 2019/0200906.


Applicant of the present application owns the following U.S. patent applications, filed on Mar. 29, 2018, each of which is herein incorporated by reference in its entirety:

    • U.S. patent application Ser. No. 15/940,636, titled ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL DEVICES; now U.S. Patent Application Publication No. 2019/0206003;
    • U.S. patent application Ser. No. 15/940,653, titled ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL HUBS; now U.S. Patent Application Publication No. 2019/0201114;
    • U.S. patent application Ser. No. 15/940,660, titled CLOUD-BASED MEDICAL ANALYTICS FOR CUSTOMIZATION AND RECOMMENDATIONS TO A USER; now U.S. Patent Application Publication No. 2019/0206555;
    • U.S. patent application Ser. No. 15/940,679, titled CLOUD-BASED MEDICAL ANALYTICS FOR LINKING OF LOCAL USAGE TRENDS WITH THE RESOURCE ACQUISITION BEHAVIORS OF LARGER DATA SET; now U.S. Patent Application Publication No. 2019/0201144;
    • U.S. patent application Ser. No. 15/940,694, titled CLOUD-BASED MEDICAL ANALYTICS FOR MEDICAL FACILITY SEGMENTED INDIVIDUALIZATION OF INSTRUMENT FUNCTION; now U.S. Patent Application Publication No. 2019/0201119;
    • U.S. patent application Ser. No. 15/940,634, titled CLOUD-BASED MEDICAL ANALYTICS FOR SECURITY AND AUTHENTICATION TRENDS AND REACTIVE MEASURES; now U.S. Patent Application Publication No. 2019/0201138;
    • U.S. patent application Ser. No. 15/940,706, titled DATA HANDLING AND PRIORITIZATION IN A CLOUD ANALYTICS NETWORK; now U.S. Patent Application Publication No. 2019/0206561; and
    • U.S. patent application Ser. No. 15/940,675, titled CLOUD INTERFACE FOR COUPLED SURGICAL DEVICES; now U.S. Pat. No. 10,849,697.


Applicant of the present application owns the following U.S. patent applications, filed on Mar. 29, 2018, each of which is herein incorporated by reference in its entirety:

    • U.S. patent application Ser. No. 15/940,627, titled DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; now U.S. Patent Application Publication No. 2019/0201111;
    • U.S. patent application Ser. No. 15/940,637, titled COMMUNICATION ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; now U.S. Patent Application Publication No. 2019/0201139;
    • U.S. patent application Ser. No. 15/940,642, titled CONTROLS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; now U.S. Patent Application Publication No. 2019/0201113;
    • U.S. patent application Ser. No. 15/940,676, titled AUTOMATIC TOOL ADJUSTMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; now U.S. Patent Application Publication No. 2019/0201142;
    • U.S. patent application Ser. No. 15/940,680, titled CONTROLLERS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; now U.S. Patent Application Publication No. 2019/0201135;
    • U.S. patent application Ser. No. 15/940,690, titled DISPLAY ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; now U.S. Patent Application Publication No. 2019/0201118; and
    • U.S. patent application Ser. No. 15/940,711, titled SENSING ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; now U.S. Patent Application Publication No. 2019/0201120.


Before explaining various aspects of surgical devices and generators in detail, it should be noted that the illustrative examples are not limited in application or use to the details of construction and arrangement of parts illustrated in the accompanying drawings and description. The illustrative examples may be implemented or incorporated in other aspects, variations and modifications, and may be practiced or carried out in various ways. Further, unless otherwise indicated, the terms and expressions employed herein have been chosen for the purpose of describing the illustrative examples for the convenience of the reader and are not for the purpose of limitation thereof. Also, it will be appreciated that one or more of the following-described aspects, expressions of aspects, and/or examples, can be combined with any one or more of the other following-described aspects, expressions of aspects and/or examples.


Referring to FIG. 1, a computer-implemented interactive surgical system 100 includes one or more surgical systems 102 and a cloud-based system (e.g., the cloud 104 that may include a remote server 113 coupled to a storage device 105). Each surgical system 102 includes at least one surgical hub 106 in communication with the cloud 104 that may include a remote server 113. In one example, as illustrated in FIG. 1, the surgical system 102 includes a visualization system 108, a robotic system 110, and a handheld intelligent surgical instrument 112, which are configured to communicate with one another and/or the hub 106. In some aspects, a surgical system 102 may include an M number of hubs 106, an N number of visualization systems 108, an O number of robotic systems 110, and a P number of handheld intelligent surgical instruments 112, where M, N, O, and P are integers greater than or equal to one.



FIG. 2 depicts an example of a surgical system 102 being used to perform a surgical procedure on a patient who is lying down on an operating table 114 in a surgical operating room 116. A robotic system 110 is used in the surgical procedure as a part of the surgical system 102. The robotic system 110 includes a surgeon's console 118, a patient side cart 120 (surgical robot), and a surgical robotic hub 122. The patient side cart 120 can manipulate at least one removably coupled surgical tool 117 through a minimally invasive incision in the body of the patient while the surgeon views the surgical site through the surgeon's console 118. An image of the surgical site can be obtained by a medical imaging device 124, which can be manipulated by the patient side cart 120 to orient the imaging device 124. The robotic hub 122 can be used to process the images of the surgical site for subsequent display to the surgeon through the surgeon's console 118.


Other types of robotic systems can be readily adapted for use with the surgical system 102. Various examples of robotic systems and surgical tools that are suitable for use with the present disclosure are described in U.S. Provisional Patent Application Ser. No. 62/611,339, titled ROBOT ASSISTED SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety.


Various examples of cloud-based analytics that are performed by the cloud 104, and are suitable for use with the present disclosure, are described in U.S. Provisional Patent Application Ser. No. 62/611,340, titled CLOUD-BASED MEDICAL ANALYTICS, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety.


In various aspects, the imaging device 124 includes at least one image sensor and one or more optical components. Suitable image sensors include, but are not limited to, Charge-Coupled Device (CCD) sensors and Complementary Metal-Oxide Semiconductor (CMOS) sensors.


The optical components of the imaging device 124 may include one or more illumination sources and/or one or more lenses. The one or more illumination sources may be directed to illuminate portions of the surgical field. The one or more image sensors may receive light reflected or refracted from the surgical field, including light reflected or refracted from tissue and/or surgical instruments.


The one or more illumination sources may be configured to radiate electromagnetic energy in the visible spectrum as well as the invisible spectrum. The visible spectrum, sometimes referred to as the optical spectrum or luminous spectrum, is that portion of the electromagnetic spectrum that is visible to (i.e., can be detected by) the human eye and may be referred to as visible light or simply light. A typical human eye will respond to wavelengths in air that are from about 380 nm to about 750 nm.


The invisible spectrum (i.e., the non-luminous spectrum) is that portion of the electromagnetic spectrum that lies below and above the visible spectrum (i.e., wavelengths below about 380 nm and above about 750 nm). The invisible spectrum is not detectable by the human eye. Wavelengths greater than about 750 nm are longer than the red visible spectrum, and they become invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths less than about 380 nm are shorter than the violet spectrum, and they become invisible ultraviolet, x-ray, and gamma ray electromagnetic radiation.
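
As a worked example of these boundaries, the following sketch classifies a wavelength using the approximate 380 nm and 750 nm limits given above (the exact cutoffs vary by observer):

```python
def classify_wavelength(nm: float) -> str:
    """Rough spectral classification using the approximate limits above."""
    if nm < 380:
        return "invisible: ultraviolet, x-ray, or gamma ray side"
    if nm > 750:
        return "invisible: infrared, microwave, or radio side"
    return "visible light"


assert classify_wavelength(550) == "visible light"              # green
assert classify_wavelength(900).startswith("invisible: infrared")
assert classify_wavelength(200).startswith("invisible: ultraviolet")
```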


In various aspects, the imaging device 124 is configured for use in a minimally invasive procedure. Examples of imaging devices suitable for use with the present disclosure include, but are not limited to, an arthroscope, angioscope, bronchoscope, choledochoscope, colonoscope, cystoscope, duodenoscope, enteroscope, esophagogastro-duodenoscope (gastroscope), endoscope, laryngoscope, nasopharyngoscope, nephroscope, sigmoidoscope, thoracoscope, and ureteroscope.


In one aspect, the imaging device employs multi-spectrum monitoring to discriminate topography and underlying structures. A multi-spectral image is one that captures image data within specific wavelength ranges across the electromagnetic spectrum. The wavelengths may be separated by filters or by the use of instruments that are sensitive to particular wavelengths, including light from frequencies beyond the visible light range, e.g., IR and ultraviolet. Spectral imaging can allow extraction of additional information the human eye fails to capture with its receptors for red, green, and blue. The use of multi-spectral imaging is described in greater detail under the heading “Advanced Imaging Acquisition Module” in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety. Multi-spectrum monitoring can be a useful tool in relocating a surgical field after a surgical task is completed to perform one or more of the previously described tests on the treated tissue.
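
Conceptually, a multi-spectral image is a stack of per-band frames. The sketch below shows one way to assemble such a cube, assuming NumPy and hypothetical band ranges (real filter bands depend on the hardware):

```python
import numpy as np

# Hypothetical band definitions in nm; actual filters are hardware-specific.
BANDS = {"blue": (450, 495), "green": (495, 570), "red": (620, 750), "nir": (750, 900)}


def stack_bands(frames: dict[str, np.ndarray]) -> np.ndarray:
    """Stack per-band 2-D frames into an (H, W, B) multi-spectral cube."""
    return np.stack([frames[name] for name in BANDS], axis=-1)


frames = {name: np.zeros((480, 640)) for name in BANDS}
cube = stack_bands(frames)              # shape (480, 640, 4), one slice per band
```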


It is axiomatic that strict sterilization of the operating room and surgical equipment is required during any surgery. The strict hygiene and sterilization conditions required in a “surgical theater,” i.e., an operating or treatment room, necessitate the highest possible sterility of all medical devices and equipment. Part of that sterilization process is the need to sterilize anything that comes in contact with the patient or penetrates the sterile field, including the imaging device 124 and its attachments and components. It will be appreciated that the sterile field may be considered a specified area, such as within a tray or on a sterile towel, that is considered free of microorganisms, or the sterile field may be considered an area, immediately around a patient, who has been prepared for a surgical procedure. The sterile field may include the scrubbed team members, who are properly attired, and all furniture and fixtures in the area.


In various aspects, the visualization system 108 includes one or more imaging sensors, one or more image processing units, one or more storage arrays, and one or more displays that are strategically arranged with respect to the sterile field, as illustrated in FIG. 2. In one aspect, the visualization system 108 includes an interface for HL7, PACS, and EMR. Various components of the visualization system 108 are described under the heading “Advanced Imaging Acquisition Module” in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety.


As illustrated in FIG. 2, a primary display 119 is positioned in the sterile field to be visible to an operator at the operating table 114. In addition, a visualization tower 111 is positioned outside the sterile field. The visualization tower 111 includes a first non-sterile display 107 and a second non-sterile display 109, which face away from each other. The visualization system 108, guided by the hub 106, is configured to utilize the displays 107, 109, and 119 to coordinate information flow to operators inside and outside the sterile field. For example, the hub 106 may cause the visualization system 108 to display a snap-shot of a surgical site, as recorded by an imaging device 124, on a non-sterile display 107 or 109, while maintaining a live feed of the surgical site on the primary display 119. The snap-shot on the non-sterile display 107 or 109 can permit a non-sterile operator to perform a diagnostic step relevant to the surgical procedure, for example.


In one aspect, the hub 106 is also configured to route a diagnostic input or feedback entered by a non-sterile operator at the visualization tower 111 to the primary display 119 within the sterile field, where it can be viewed by a sterile operator at the operating table. In one example, the input can be in the form of a modification to the snap-shot displayed on the non-sterile display 107 or 109, which can be routed to the primary display 119 by the hub 106.


Referring to FIG. 2, a surgical instrument 112 is being used in the surgical procedure as part of the surgical system 102. The hub 106 is also configured to coordinate information flow to a display of the surgical instrument 112, as described, for example, in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety. A diagnostic input or feedback entered by a non-sterile operator at the visualization tower 111 can be routed by the hub 106 to the surgical instrument display 115 within the sterile field, where it can be viewed by the operator of the surgical instrument 112. Example surgical instruments that are suitable for use with the surgical system 102 are described under the heading “Surgical Instrument Hardware” and in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety, for example.


Referring now to FIG. 3, a hub 106 is depicted in communication with a visualization system 108, a robotic system 110, and a handheld intelligent surgical instrument 112. The hub 106 includes a hub display 135, an imaging module 138, a generator module 140, a communication module 130, a processor module 132, and a storage array 134. In certain aspects, as illustrated in FIG. 3, the hub 106 further includes a smoke evacuation module 126 and/or a suction/irrigation module 128.


During a surgical procedure, energy application to tissue, for sealing and/or cutting, is generally associated with smoke evacuation, suction of excess fluid, and/or irrigation of the tissue. Fluid, power, and/or data lines from different sources are often entangled during the surgical procedure. Valuable time can be lost addressing this issue during a surgical procedure. Detangling the lines may necessitate disconnecting the lines from their respective modules, which may require resetting the modules. The hub modular enclosure 136 offers a unified environment for managing the power, data, and fluid lines, which reduces the frequency of entanglement between such lines.


Aspects of the present disclosure present a surgical hub for use in a surgical procedure that involves energy application to tissue at a surgical site. The surgical hub includes a hub enclosure and a combo generator module slidably receivable in a docking station of the hub enclosure. The docking station includes data and power contacts. The combo generator module includes two or more of an ultrasonic energy generator component, a bipolar RF energy generator component, and a monopolar RF energy generator component that are housed in a single unit. In one aspect, the combo generator module also includes a smoke evacuation component configured to evacuate smoke, fluid, and/or particulates generated by the application of therapeutic energy to the tissue, at least one energy delivery cable for connecting the combo generator module to a surgical instrument, and a fluid line extending from the remote surgical site to the smoke evacuation component.


In one aspect, the fluid line is a first fluid line and a second fluid line extends from the remote surgical site to a suction and irrigation module slidably received in the hub enclosure. In one aspect, the hub enclosure comprises a fluid interface.


Certain surgical procedures may require the application of more than one energy type to the tissue. One energy type may be more beneficial for cutting the tissue, while another different energy type may be more beneficial for sealing the tissue. For example, a bipolar generator can be used to seal the tissue while an ultrasonic generator can be used to cut the sealed tissue. Aspects of the present disclosure present a solution where a hub modular enclosure 136 is configured to accommodate different generators, and facilitate an interactive communication therebetween. One of the advantages of the hub modular enclosure 136 is enabling the quick removal and/or replacement of various modules.


Aspects of the present disclosure present a modular surgical enclosure for use in a surgical procedure that involves energy application to tissue. The modular surgical enclosure includes a first energy-generator module, configured to generate a first energy for application to the tissue, and a first docking station comprising a first docking port that includes first data and power contacts, wherein the first energy-generator module is slidably movable into an electrical engagement with the first power and data contacts and wherein the first energy-generator module is slidably movable out of the electrical engagement with the first power and data contacts.


Further to the above, the modular surgical enclosure also includes a second energy-generator module configured to generate a second energy, different than the first energy, for application to the tissue, and a second docking station comprising a second docking port that includes second data and power contacts, wherein the second energy-generator module is slidably movable into an electrical engagement with the second power and data contacts, and wherein the second energy-generator module is slidably movable out of the electrical engagement with the second power and data contacts.


In addition, the modular surgical enclosure also includes a communication bus between the first docking port and the second docking port, configured to facilitate communication between the first energy-generator module and the second energy-generator module.


Referring to FIGS. 3-7, aspects of the present disclosure are presented for a hub modular enclosure 136 that allows the modular integration of a generator module 140, a smoke evacuation module 126, and a suction/irrigation module 128. The hub modular enclosure 136 further facilitates interactive communication between the modules 140, 126, 128. As illustrated in FIG. 5, the generator module 140 can be a generator module with integrated monopolar, bipolar, and ultrasonic components supported in a single housing unit 139 slidably insertable into the hub modular enclosure 136. As illustrated in FIG. 5, the generator module 140 can be configured to connect to a monopolar device 146, a bipolar device 147, and an ultrasonic device 148. Alternatively, the generator module 140 may comprise a series of monopolar, bipolar, and/or ultrasonic generator modules that interact through the hub modular enclosure 136. The hub modular enclosure 136 can be configured to facilitate the insertion of multiple generators and interactive communication between the generators docked into the hub modular enclosure 136 so that the generators would act as a single generator.
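
The idea that several docked generators "act as a single generator" resembles a facade over the docked modules. A minimal sketch under that assumption follows; the class names, the activate() method, and the power values are hypothetical, not taken from the disclosure:

```python
class GeneratorModule:
    """Hypothetical stand-in for a docked energy-generator module."""

    def __init__(self, energy_type: str) -> None:
        self.energy_type = energy_type

    def activate(self, power_watts: float) -> None:
        print(f"{self.energy_type} module delivering {power_watts} W")


class HubGeneratorFacade:
    """Presents the modules docked in the enclosure as one generator."""

    def __init__(self) -> None:
        self._modules: dict[str, GeneratorModule] = {}

    def dock(self, module: GeneratorModule) -> None:
        self._modules[module.energy_type] = module

    def activate(self, energy_type: str, power_watts: float) -> None:
        self._modules[energy_type].activate(power_watts)


hub = HubGeneratorFacade()
hub.dock(GeneratorModule("bipolar"))
hub.dock(GeneratorModule("ultrasonic"))
hub.activate("bipolar", 30.0)       # e.g., seal tissue
hub.activate("ultrasonic", 60.0)    # e.g., cut the sealed tissue
```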


In one aspect, the hub modular enclosure 136 comprises a modular power and communication backplane 149 with external and wireless communication headers to enable the removable attachment of the modules 140, 126, 128 and interactive communication therebetween.


In one aspect, the hub modular enclosure 136 includes docking stations 151, herein also referred to as drawers, which are configured to slidably receive the modules 140, 126, 128. FIG. 4 illustrates a partial perspective view of a surgical hub enclosure 136, and a combo generator module 145 slidably receivable in a docking station 151 of the surgical hub enclosure 136. A docking port 152 with power and data contacts on a rear side of the combo generator module 145 is configured to engage a corresponding docking port 150 with power and data contacts of a corresponding docking station 151 of the hub modular enclosure 136 as the combo generator module 145 is slid into position within the corresponding docking station 151 of the hub modular enclosure 136. In one aspect, the combo generator module 145 includes a bipolar, ultrasonic, and monopolar module and a smoke evacuation module integrated together into a single housing unit 139, as illustrated in FIG. 5.


In various aspects, the smoke evacuation module 126 includes a fluid line 154 that conveys captured/collected smoke and/or fluid away from a surgical site and to, for example, the smoke evacuation module 126. Vacuum suction originating from the smoke evacuation module 126 can draw the smoke into an opening of a utility conduit at the surgical site. The utility conduit, coupled to the fluid line, can be in the form of a flexible tube terminating at the smoke evacuation module 126. The utility conduit and the fluid line define a fluid path extending toward the smoke evacuation module 126 that is received in the hub enclosure 136.


In various aspects, the suction/irrigation module 128 is coupled to a surgical tool comprising an aspiration fluid line and a suction fluid line. In one example, the aspiration and suction fluid lines are in the form of flexible tubes extending from the surgical site toward the suction/irrigation module 128. One or more drive systems can be configured to cause irrigation and aspiration of fluids to and from the surgical site.


In one aspect, the surgical tool includes a shaft having an end effector at a distal end thereof, at least one energy delivery implement associated with the end effector, an aspiration tube, and an irrigation tube. The aspiration tube can have an inlet port at a distal end thereof, and the aspiration tube extends through the shaft. Similarly, an irrigation tube can extend through the shaft and can have an inlet port in proximity to the energy delivery implement. The energy delivery implement is configured to deliver ultrasonic and/or RF energy to the surgical site and is coupled to the generator module 140 by a cable extending initially through the shaft.


The irrigation tube can be in fluid communication with a fluid source, and the aspiration tube can be in fluid communication with a vacuum source. The fluid source and/or the vacuum source can be housed in the suction/irrigation module 128. In one example, the fluid source and/or the vacuum source can be housed in the hub enclosure 136 separately from the suction/irrigation module 128. In such an example, a fluid interface can be configured to connect the suction/irrigation module 128 to the fluid source and/or the vacuum source.


In one aspect, the modules 140, 126, 128 and/or their corresponding docking stations on the hub modular enclosure 136 may include alignment features that are configured to align the docking ports of the modules into engagement with their counterparts in the docking stations of the hub modular enclosure 136. For example, as illustrated in FIG. 4, the combo generator module 145 includes side brackets 155 that are configured to slidably engage with corresponding brackets 156 of the corresponding docking station 151 of the hub modular enclosure 136. The brackets cooperate to guide the docking port contacts of the combo generator module 145 into an electrical engagement with the docking port contacts of the hub modular enclosure 136.


In some aspects, the drawers 151 of the hub modular enclosure 136 are the same, or substantially the same, size, and the modules are adjusted in size to be received in the drawers 151. For example, the side brackets 155 and/or 156 can be larger or smaller depending on the size of the module. In other aspects, the drawers 151 are different in size and are each designed to accommodate a particular module.


Furthermore, the contacts of a particular module can be keyed for engagement with the contacts of a particular drawer to avoid inserting a module into a drawer with mismatching contacts.
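
Keying can be thought of as a compatibility check before a module is seated. A minimal sketch follows, with a hypothetical keying table; the drawer and module names are illustrative only:

```python
# Hypothetical keying table: which module types each drawer's contacts accept.
DRAWER_KEYS = {
    "drawer-1": {"combo-generator"},
    "drawer-2": {"smoke-evacuation", "suction-irrigation"},
}


def can_dock(drawer: str, module_type: str) -> bool:
    """True only if the drawer's keyed contacts match the module's contacts."""
    return module_type in DRAWER_KEYS.get(drawer, set())


assert can_dock("drawer-1", "combo-generator")
assert not can_dock("drawer-1", "smoke-evacuation")   # mismatching contacts
```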


As illustrated in FIG. 4, the docking port 150 of one drawer 151 can be coupled to the docking port 150 of another drawer 151 through a communications link 157 to facilitate an interactive communication between the modules housed in the hub modular enclosure 136. The docking ports 150 of the hub modular enclosure 136 may alternatively, or additionally, facilitate a wireless interactive communication between the modules housed in the hub modular enclosure 136. Any suitable wireless communication can be employed, such as for example Air Titan-Bluetooth.



FIG. 6 illustrates individual power bus attachments for a plurality of lateral docking ports of a lateral modular housing 160 configured to receive a plurality of modules of a surgical hub 206. The lateral modular housing 160 is configured to laterally receive and interconnect the modules 161. The modules 161 are slidably inserted into docking stations 162 of lateral modular housing 160, which includes a backplane for interconnecting the modules 161. As illustrated in FIG. 6, the modules 161 are arranged laterally in the lateral modular housing 160. Alternatively, the modules 161 may be arranged vertically in a lateral modular housing.



FIG. 7 illustrates a vertical modular housing 164 configured to receive a plurality of modules 165 of the surgical hub 106. The modules 165 are slidably inserted into docking stations, or drawers, 167 of vertical modular housing 164, which includes a backplane for interconnecting the modules 165. Although the drawers 167 of the vertical modular housing 164 are arranged vertically, in certain instances, a vertical modular housing 164 may include drawers that are arranged laterally. Furthermore, the modules 165 may interact with one another through the docking ports of the vertical modular housing 164. In the example of FIG. 7, a display 177 is provided for displaying data relevant to the operation of the modules 165. In addition, the vertical modular housing 164 includes a master module 178 housing a plurality of sub-modules that are slidably received in the master module 178.


In various aspects, the imaging module 138 comprises an integrated video processor and a modular light source and is adapted for use with various imaging devices. In one aspect, the imaging device comprises a modular housing that can be assembled with a light source module and a camera module. The housing can be a disposable housing. In at least one example, the disposable housing is removably coupled to a reusable controller, a light source module, and a camera module. The light source module and/or the camera module can be selectively chosen depending on the type of surgical procedure. In one aspect, the camera module comprises a CCD sensor. In another aspect, the camera module comprises a CMOS sensor. In another aspect, the camera module is configured for scanned beam imaging. Likewise, the light source module can be configured to deliver white light or a different light, depending on the surgical procedure.
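
The module selection described above might be modeled as follows. This is a sketch, and the procedure-to-module mapping is an assumption, not taken from the disclosure:

```python
from dataclasses import dataclass


@dataclass
class ImagingDevice:
    housing: str        # e.g., a disposable housing
    light_source: str   # chosen per procedure, e.g., "white" or "narrow-band"
    camera: str         # e.g., "CCD", "CMOS", or "scanned-beam"


def assemble_for_procedure(procedure: str) -> ImagingDevice:
    """Pick light source and camera modules based on the procedure type."""
    if procedure == "fluorescence-guided":        # hypothetical mapping
        return ImagingDevice("disposable", "narrow-band", "CMOS")
    return ImagingDevice("disposable", "white", "CCD")
```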


During a surgical procedure, removing a surgical device from the surgical field and replacing it with another surgical device that includes a different camera or a different light source can be inefficient. Temporarily losing sight of the surgical field may lead to undesirable consequences. The modular imaging device of the present disclosure is configured to permit the replacement of a light source module or a camera module midstream during a surgical procedure, without having to remove the imaging device from the surgical field.


In one aspect, the imaging device comprises a tubular housing that includes a plurality of channels. A first channel is configured to slidably receive the camera module, which can be configured for a snap-fit engagement with the first channel. A second channel is configured to slidably receive the light source module, which can be configured for a snap-fit engagement with the second channel. In another example, the camera module and/or the light source module can be rotated into a final position within their respective channels. A threaded engagement can be employed in lieu of the snap-fit engagement.


In various examples, multiple imaging devices are placed at different positions in the surgical field to provide multiple views. The imaging module 138 can be configured to switch between the imaging devices to provide an optimal view. In various aspects, the imaging module 138 can be configured to integrate the images from the different imaging devices.


Various image processors and imaging devices suitable for use with the present disclosure are described in U.S. Pat. No. 7,995,045, titled COMBINED SBI AND CONVENTIONAL IMAGE PROCESSOR, which issued on Aug. 9, 2011, and which is herein incorporated by reference in its entirety. In addition, U.S. Pat. No. 7,982,776, titled SBI MOTION ARTIFACT REMOVAL APPARATUS AND METHOD, which issued on Jul. 19, 2011, and which is herein incorporated by reference in its entirety, describes various systems for removing motion artifacts from image data. Such systems can be integrated with the imaging module 138. Furthermore, U.S. Patent Application Publication No. 2011/0306840, titled CONTROLLABLE MAGNETIC SOURCE TO FIXTURE INTRACORPOREAL APPARATUS, which published on Dec. 15, 2011, and U.S. Patent Application Publication No. 2014/0243597, titled SYSTEM FOR PERFORMING A MINIMALLY INVASIVE SURGICAL PROCEDURE, which published on Aug. 28, 2014, are each herein incorporated by reference in their entirety.



FIG. 8 illustrates a surgical data network 201 comprising a modular communication hub 203 configured to connect modular devices located in one or more operating theaters of a healthcare facility, or any room in a healthcare facility specially equipped for surgical operations, to a cloud-based system (e.g., the cloud 204 that may include a remote server 213 coupled to a storage device 205). In one aspect, the modular communication hub 203 comprises a network hub 207 and/or a network switch 209 in communication with a network router. The modular communication hub 203 also can be coupled to a local computer system 210 to provide local computer processing and data manipulation. The surgical data network 201 may be configured as passive, intelligent, or switching. A passive surgical data network serves as a conduit for the data, enabling it to go from one device (or segment) to another and to the cloud computing resources. An intelligent surgical data network includes additional features to enable the traffic passing through the surgical data network to be monitored and to configure each port in the network hub 207 or network switch 209. An intelligent surgical data network may be referred to as a manageable hub or switch. A switching hub reads the destination address of each packet and then forwards the packet to the correct port.


Modular devices 1a-1n located in the operating theater may be coupled to the modular communication hub 203. The network hub 207 and/or the network switch 209 may be coupled to a network router 211 to connect the devices 1a-1n to the cloud 204 or the local computer system 210. Data associated with the devices 1a-1n may be transferred to cloud-based computers via the router for remote data processing and manipulation. Data associated with the devices 1a-1n may also be transferred to the local computer system 210 for local data processing and manipulation. Modular devices 2a-2m located in the same operating theater also may be coupled to a network switch 209. The network switch 209 may be coupled to the network hub 207 and/or the network router 211 to connect the devices 2a-2m to the cloud 204. Data associated with the devices 2a-2m may be transferred to the cloud 204 via the network router 211 for data processing and manipulation. Data associated with the devices 2a-2m may also be transferred to the local computer system 210 for local data processing and manipulation.


It will be appreciated that the surgical data network 201 may be expanded by interconnecting multiple network hubs 207 and/or multiple network switches 209 with multiple network routers 211. The modular communication hub 203 may be contained in a modular control tower configured to receive multiple devices 1a-1n/2a-2m. The local computer system 210 also may be contained in a modular control tower. The modular communication hub 203 is connected to a display 212 to display images obtained by some of the devices 1a-1n/2a-2m, for example during surgical procedures. In various aspects, the devices 1a-1n/2a-2m may include, for example, various modules such as an imaging module 138 coupled to an endoscope, a generator module 140 coupled to an energy-based surgical device, a smoke evacuation module 126, a suction/irrigation module 128, a communication module 130, a processor module 132, a storage array 134, a surgical device coupled to a display, and/or a non-contact sensor module, among other modular devices that may be connected to the modular communication hub 203 of the surgical data network 201.


In one aspect, the surgical data network 201 may comprise a combination of network hub(s), network switch(es), and network router(s) connecting the devices 1a-1n/2a-2m to the cloud. Any one of or all of the devices 1a-1n/2a-2m coupled to the network hub or network switch may collect data in real time and transfer the data to cloud computers for data processing and manipulation. It will be appreciated that cloud computing relies on sharing computing resources rather than having local servers or personal devices to handle software applications. The word “cloud” may be used as a metaphor for “the Internet,” although the term is not limited as such. Accordingly, the term “cloud computing” may be used herein to refer to “a type of Internet-based computing,” where different services—such as servers, storage, and applications—are delivered to the modular communication hub 203 and/or computer system 210 located in the surgical theater (e.g., a fixed, mobile, temporary, or field operating room or space) and to devices connected to the modular communication hub 203 and/or computer system 210 through the Internet. The cloud infrastructure may be maintained by a cloud service provider. In this context, the cloud service provider may be the entity that coordinates the usage and control of the devices 1a-1n/2a-2m located in one or more operating theaters. The cloud computing services can perform a large number of calculations based on the data gathered by smart surgical instruments, robots, and other computerized devices located in the operating theater. The hub hardware enables multiple devices or connections to be connected to a computer that communicates with the cloud computing resources and storage.


By applying cloud computer data processing techniques to the data collected by the devices 1a-1n/2a-2m, the surgical data network can provide improved surgical outcomes, reduced costs, and improved patient satisfaction. At least some of the devices 1a-1n/2a-2m may be employed to view tissue states to assess leaks or perfusion of sealed tissue after a tissue sealing and cutting procedure. At least some of the devices 1a-1n/2a-2m may be employed to identify pathology, such as the effects of diseases, using cloud-based computing to examine data including images of samples of body tissue for diagnostic purposes. This includes localization and margin confirmation of tissue and phenotypes. At least some of the devices 1a-1n/2a-2m may be employed to identify anatomical structures of the body using a variety of sensors integrated with imaging devices and techniques such as overlaying images captured by multiple imaging devices. The data gathered by the devices 1a-1n/2a-2m, including image data, may be transferred to the cloud 204 or the local computer system 210, or both, for data processing and manipulation, including image processing and manipulation. The data may be analyzed to improve surgical procedure outcomes by determining whether further treatment, such as the application of endoscopic intervention, emerging technologies, targeted radiation, targeted intervention, and precise robotics to tissue-specific sites and conditions, may be pursued. Such data analysis may further employ outcome analytics processing and, using standardized approaches, may provide beneficial feedback to either confirm surgical treatments and the behavior of the surgeon or suggest modifications to surgical treatments and the behavior of the surgeon.


In one implementation, the operating theater devices 1a-1n may be connected to the modular communication hub 203 over a wired channel or a wireless channel, depending on the configuration of the devices 1a-1n for connection to a network hub. The network hub 207 may be implemented, in one aspect, as a local network broadcast device that works on the physical layer of the Open Systems Interconnection (OSI) model. The network hub provides connectivity to the devices 1a-1n located in the same operating theater network. The network hub 207 collects data in the form of packets and sends them to the router in half duplex mode. The network hub 207 does not store any media access control/internet protocol (MAC/IP) addresses to transfer the device data. Only one of the devices 1a-1n can send data at a time through the network hub 207. The network hub 207 has no routing tables or intelligence regarding where to send information and broadcasts all network data across each connection and to a remote server 213 (FIG. 9) over the cloud 204. The network hub 207 can detect basic network errors such as collisions, but having all information broadcast to multiple ports can be a security risk and cause bottlenecks.


In another implementation, the operating theater devices 2a-2m may be connected to a network switch 209 over a wired channel or a wireless channel. The network switch 209 works in the data link layer of the OSI model. The network switch 209 is a multiport device for connecting the devices 2a-2m located in the same operating theater to the network. The network switch 209 sends data in the form of frames to the network router 211 and works in full duplex mode. Multiple devices 2a-2m can send data at the same time through the network switch 209. The network switch 209 stores and uses the MAC addresses of the devices 2a-2m to transfer data.


The network hub 207 and/or the network switch 209 are coupled to the network router 211 for connection to the cloud 204. The network router 211 works in the network layer of the OSI model. The network router 211 creates a route for transmitting data packets received from the network hub 207 and/or network switch 209 to cloud-based computer resources for further processing and manipulation of the data collected by any one of or all the devices 1a-1n/2a-2m. The network router 211 may be employed to connect two or more different networks located in different locations, such as, for example, different operating theaters of the same healthcare facility or different networks located in different operating theaters of different healthcare facilities. The network router 211 sends data in the form of packets to the cloud 204 and works in full duplex mode. Multiple devices can send data at the same time. The network router 211 uses IP addresses to transfer data.


In one example, the network hub 207 may be implemented as a USB hub, which allows multiple USB devices to be connected to a host computer. The USB hub may expand a single USB port into several tiers so that there are more ports available to connect devices to the host system computer. The network hub 207 may include wired or wireless capabilities to receive information over a wired channel or a wireless channel. In one aspect, wireless USB, a short-range, high-bandwidth wireless radio communication protocol, may be employed for communication between the devices 1a-1n and devices 2a-2m located in the operating theater.


In other examples, the operating theater devices 1a-1n/2a-2m may communicate with the modular communication hub 203 via the Bluetooth wireless technology standard for exchanging data over short distances (using short-wavelength UHF radio waves in the ISM band from 2.4 to 2.485 GHz) from fixed and mobile devices and building personal area networks (PANs). In other aspects, the operating theater devices 1a-1n/2a-2m may communicate with the modular communication hub 203 via a number of wireless or wired communication standards or protocols, including but not limited to Wi-Fi (IEEE 802.11 family), WiMAX (IEEE 802.16 family), IEEE 802.20, long-term evolution (LTE), Ev-DO, HSPA+, HSDPA, HSUPA, EDGE, GSM, GPRS, CDMA, TDMA, DECT, and Ethernet derivatives thereof, as well as any other wireless and wired protocols that are designated as 3G, 4G, 5G, and beyond. The computing module may include a plurality of communication modules. For instance, a first communication module may be dedicated to shorter-range wireless communications such as Wi-Fi and Bluetooth, and a second communication module may be dedicated to longer-range wireless communications such as GPS, EDGE, GPRS, CDMA, WiMAX, LTE, Ev-DO, and others.


The modular communication hub 203 may serve as a central connection for one or all of the operating theater devices 1a-1n/2a-2m and handles a data type known as frames. Frames carry the data generated by the devices 1a-1n/2a-2m. When a frame is received by the modular communication hub 203, it is amplified and transmitted to the network router 211, which transfers the data to the cloud computing resources by using a number of wireless or wired communication standards or protocols, as described herein.


The modular communication hub 203 can be used as a standalone device or be connected to compatible network hubs and network switches to form a larger network. The modular communication hub 203 is generally easy to install, configure, and maintain, making it a good option for networking the operating theater devices 1a-1n/2a-2m.



FIG. 9 illustrates a computer-implemented interactive surgical system 200. The computer-implemented interactive surgical system 200 is similar in many respects to the computer-implemented interactive surgical system 100. For example, the computer-implemented interactive surgical system 200 includes one or more surgical systems 202, which are similar in many respects to the surgical systems 102. Each surgical system 202 includes at least one surgical hub 206 in communication with a cloud 204 that may include a remote server 213. In one aspect, the computer-implemented interactive surgical system 200 comprises a modular control tower 236 connected to multiple operating theater devices such as, for example, intelligent surgical instruments, robots, and other computerized devices located in the operating theater. As shown in FIG. 10, the modular control tower 236 comprises a modular communication hub 203 coupled to a computer system 210. As illustrated in the example of FIG. 9, the modular control tower 236 is coupled to an imaging module 238 that is coupled to an endoscope 239, a generator module 240 that is coupled to an energy device 241, a smoke evacuator module 226, a suction/irrigation module 228, a communication module 230, a processor module 232, a storage array 234, a smart device/instrument 235 optionally coupled to a display 237, and a non-contact sensor module 242. The operating theater devices are coupled to cloud computing resources and data storage via the modular control tower 236. A robot hub 222 also may be connected to the modular control tower 236 and to the cloud computing resources. The devices/instruments 235, visualization systems 208, among others, may be coupled to the modular control tower 236 via wired or wireless communication standards or protocols, as described herein. The modular control tower 236 may be coupled to a hub display 215 (e.g., monitor, screen) to display and overlay images received from the imaging module, device/instrument display, and/or other visualization systems 208. The hub display also may display data received from devices connected to the modular control tower in conjunction with images and overlaid images.



FIG. 10 illustrates a surgical hub 206 comprising a plurality of modules coupled to the modular control tower 236. The modular control tower 236 comprises a modular communication hub 203, e.g., a network connectivity device, and a computer system 210 to provide local processing, visualization, and imaging, for example. As shown in FIG. 10, the modular communication hub 203 may be connected in a tiered configuration to expand the number of modules (e.g., devices) that may be connected to the modular communication hub 203 and transfer data associated with the modules to the computer system 210, cloud computing resources, or both. As shown in FIG. 10, each of the network hubs/switches in the modular communication hub 203 includes three downstream ports and one upstream port. The upstream network hub/switch is connected to a processor to provide a communication connection to the cloud computing resources and a local display 217. Communication to the cloud 204 may be made either through a wired or a wireless communication channel.


The surgical hub 206 employs a non-contact sensor module 242 to measure the dimensions of the operating theater and generate a map of the surgical theater using either ultrasonic or laser-type non-contact measurement devices. An ultrasound-based non-contact sensor module scans the operating theater by transmitting a burst of ultrasound and receiving the echo when it bounces off the perimeter walls of an operating theater as described under the heading “Surgical Hub Spatial Awareness Within an Operating Room” in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety, in which the sensor module is configured to determine the size of the operating theater and to adjust Bluetooth-pairing distance limits. A laser-based non-contact sensor module scans the operating theater by transmitting laser light pulses, receiving laser light pulses that bounce off the perimeter walls of the operating theater, and comparing the phase of the transmitted pulse to the received pulse to determine the size of the operating theater and to adjust Bluetooth pairing distance limits, for example.
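
For illustration, the laser-based phase comparison described above reduces to a single formula: the round-trip path doubles the distance, so the wall distance follows from the measured phase shift and the modulation frequency. The C sketch below is a generic phase-shift ranging calculation under those assumptions, not the specific logic of the non-contact sensor module 242; the function name and units are illustrative.

```c
/* Generic phase-shift ranging: the pulse travels out and back (2*d), so
   the phase shift is dphi = 2*pi*f_mod*(2*d/c), giving d as below. */
double range_m(double phase_shift_rad, double f_mod_hz)
{
    const double c  = 299792458.0;             /* speed of light, m/s */
    const double pi = 3.14159265358979323846;
    return (c * phase_shift_rad) / (4.0 * pi * f_mod_hz);
}
```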


The computer system 210 comprises a processor 244 and a network interface 245. The processor 244 is coupled to a communication module 247, storage 248, memory 249, non-volatile memory 250, and input/output interface 251 via a system bus. The system bus can be any of several types of bus structure(s) including the memory bus or memory controller, a peripheral bus or external bus, and/or a local bus using any of a variety of available bus architectures including, but not limited to, 8-bit bus, Industry Standard Architecture (ISA), Micro-Channel Architecture (MSA), Extended ISA (EISA), Intelligent Drive Electronics (IDE), VESA Local Bus (VLB), Peripheral Component Interconnect (PCI), USB, Advanced Graphics Port (AGP), Personal Computer Memory Card International Association bus (PCMCIA), Small Computer Systems Interface (SCSI), or any other proprietary bus.


The processor 244 may be any single-core or multicore processor such as those known under the trade name ARM Cortex by Texas Instruments. In one aspect, the processor may be an LM4F230H5QR ARM Cortex-M4F Processor Core, available from Texas Instruments, for example, comprising an on-chip memory of 256 KB single-cycle flash memory, or other non-volatile memory, up to 40 MHz, a prefetch buffer to improve performance above 40 MHz, a 32 KB single-cycle serial random access memory (SRAM), an internal read-only memory (ROM) loaded with StellarisWare® software, a 2 KB electrically erasable programmable read-only memory (EEPROM), one or more pulse width modulation (PWM) modules, one or more quadrature encoder inputs (QEIs), and/or one or more 12-bit analog-to-digital converters (ADCs) with 12 analog input channels, details of which are available in the product datasheet.


In one aspect, the processor 244 may comprise a safety controller comprising two controller families, such as TMS570 and RM4x, known under the trade name Hercules ARM Cortex R4, also by Texas Instruments. The safety controller may be configured specifically for IEC 61508 and ISO 26262 safety-critical applications, among others, to provide advanced integrated safety features while delivering scalable performance, connectivity, and memory options.


The system memory includes volatile memory and non-volatile memory. The basic input/output system (BIOS), containing the basic routines to transfer information between elements within the computer system, such as during start-up, is stored in non-volatile memory. For example, the non-volatile memory can include ROM, programmable ROM (PROM), electrically programmable ROM (EPROM), EEPROM, or flash memory. Volatile memory includes random-access memory (RAM), which acts as external cache memory. Moreover, RAM is available in many forms such as SRAM, dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and direct Rambus RAM (DRRAM).


The computer system 210 also includes removable/non-removable, volatile/non-volatile computer storage media, such as for example disk storage. The disk storage includes, but is not limited to, devices like a magnetic disk drive, floppy disk drive, tape drive, Jaz drive, Zip drive, LS-60 drive, flash memory card, or memory stick. In addition, the disk storage can include storage media separately or in combination with other storage media including, but not limited to, an optical disc drive such as a compact disc ROM device (CD-ROM), compact disc recordable drive (CD-R Drive), compact disc rewritable drive (CD-RW Drive), or a digital versatile disc ROM drive (DVD-ROM). To facilitate the connection of the disk storage devices to the system bus, a removable or non-removable interface may be employed.


It is to be appreciated that the computer system 210 includes software that acts as an intermediary between users and the basic computer resources described in a suitable operating environment. Such software includes an operating system. The operating system, which can be stored on the disk storage, acts to control and allocate resources of the computer system. System applications take advantage of the management of resources by the operating system through program modules and program data stored either in the system memory or on the disk storage. It is to be appreciated that various components described herein can be implemented with various operating systems or combinations of operating systems.


A user enters commands or information into the computer system 210 through input device(s) coupled to the I/O interface 251. The input devices include, but are not limited to, a pointing device such as a mouse, trackball, stylus, touch pad, keyboard, microphone, joystick, game pad, satellite dish, scanner, TV tuner card, digital camera, digital video camera, web camera, and the like. These and other input devices connect to the processor through the system bus via interface port(s). The interface port(s) include, for example, a serial port, a parallel port, a game port, and a USB. The output device(s) use some of the same types of ports as input device(s). Thus, for example, a USB port may be used to provide input to the computer system and to output information from the computer system to an output device. An output adapter is provided to illustrate that there are some output devices like monitors, displays, speakers, and printers, among other output devices that require special adapters. The output adapters include, by way of illustration and not limitation, video and sound cards that provide a means of connection between the output device and the system bus. It should be noted that other devices and/or systems of devices, such as remote computer(s), provide both input and output capabilities.


The computer system 210 can operate in a networked environment using logical connections to one or more remote computers, such as cloud computer(s), or local computers. The remote cloud computer(s) can be a personal computer, server, router, network PC, workstation, microprocessor-based appliance, peer device, or other common network node, and the like, and typically includes many or all of the elements described relative to the computer system. For purposes of brevity, only a memory storage device is illustrated with the remote computer(s). The remote computer(s) is logically connected to the computer system through a network interface and then physically connected via a communication connection. The network interface encompasses communication networks such as local area networks (LANs) and wide area networks (WANs). LAN technologies include Fiber Distributed Data Interface (FDDI), Copper Distributed Data Interface (CDDI), Ethernet/IEEE 802.3, Token Ring/IEEE 802.5 and the like. WAN technologies include, but are not limited to, point-to-point links, circuit-switching networks like Integrated Services Digital Networks (ISDN) and variations thereon, packet-switching networks, and Digital Subscriber Lines (DSL).


In various aspects, the computer system 210 of FIG. 10, the imaging module 238 and/or visualization system 208, and/or the processor module 232 of FIGS. 9-10, may comprise an image processor, image processing engine, media processor, or any specialized digital signal processor (DSP) used for the processing of digital images. The image processor may employ parallel computing with single instruction, multiple data (SIMD) or multiple instruction, multiple data (MIMD) technologies to increase speed and efficiency. The digital image processing engine can perform a range of tasks. The image processor may be a system on a chip with multicore processor architecture.


The communication connection(s) refers to the hardware/software employed to connect the network interface to the bus. While the communication connection is shown for illustrative clarity inside the computer system, it can also be external to the computer system 210. The hardware/software necessary for connection to the network interface includes, for illustrative purposes only, internal and external technologies such as modems, including regular telephone-grade modems, cable modems, and DSL modems, ISDN adapters, and Ethernet cards.



FIG. 11 illustrates a functional block diagram of a USB network hub 300 device, according to one aspect of the present disclosure. In the illustrated aspect, the USB network hub device 300 employs a TUSB2036 integrated circuit hub by Texas Instruments. The USB network hub 300 is a CMOS device that provides an upstream USB transceiver port 302 and up to three downstream USB transceiver ports 304, 306, 308 in compliance with the USB 2.0 specification. The upstream USB transceiver port 302 is a differential root data port comprising a differential data minus (DM0) input paired with a differential data plus (DP0) input. The three downstream USB transceiver ports 304, 306, 308 are differential data ports where each port includes differential data plus (DP1-DP3) outputs paired with differential data minus (DM1-DM3) outputs.


The USB network hub 300 device is implemented with a digital state machine instead of a microcontroller, and no firmware programming is required. Fully compliant USB transceivers are integrated into the circuit for the upstream USB transceiver port 302 and all downstream USB transceiver ports 304, 306, 308. The downstream USB transceiver ports 304, 306, 308 support both full-speed and low-speed devices by automatically setting the slew rate according to the speed of the device attached to the ports. The USB network hub 300 device may be configured either in bus-powered or self-powered mode and includes a hub power logic 312 to manage power.


The USB network hub 300 device includes a serial interface engine 310 (SIE). The SIE 310 is the front end of the USB network hub 300 hardware and handles most of the protocol described in chapter 8 of the USB specification. The SIE 310 typically comprehends signaling up to the transaction level. The functions that it handles could include: packet recognition, transaction sequencing, SOP, EOP, RESET, and RESUME signal detection/generation, clock/data separation, non-return-to-zero invert (NRZI) data encoding/decoding and bit-stuffing, CRC generation and checking (token and data), packet ID (PID) generation and checking/decoding, and/or serial-parallel/parallel-serial conversion. The SIE 310 receives a clock input 314 and is coupled to a suspend/resume logic and frame timer 316 circuit and a hub repeater circuit 318 to control communication between the upstream USB transceiver port 302 and the downstream USB transceiver ports 304, 306, 308 through port logic circuits 320, 322, 324. The SIE 310 is coupled to a command decoder 326 via interface logic to control commands from a serial EEPROM via a serial EEPROM interface 330.
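
As one concrete illustration of the CRC generation and checking handled by the SIE for token packets, the following C sketch computes the 5-bit token CRC (generator polynomial x^5 + x^2 + 1) with a serial LFSR, processing bits LSB first as they appear on the wire. This is a common textbook formulation, not the TUSB2036's internal logic, and the bit ordering of the transmitted CRC field should be checked against the USB 2.0 specification.

```c
#include <stdint.h>
#include <stdio.h>

/* USB token CRC5 sketch: serial LFSR for G(x) = x^5 + x^2 + 1, register
   seeded with all ones, input processed LSB first, result inverted, per
   the USB 2.0 token CRC definition. */
static uint8_t usb_crc5(uint16_t data, int nbits)
{
    uint8_t crc = 0x1F;                             /* all-ones seed */
    for (int i = 0; i < nbits; i++) {
        uint8_t bit = (uint8_t)((data >> i) & 1u);  /* LSB-first input */
        uint8_t msb = (uint8_t)((crc >> 4) & 1u);
        crc = (uint8_t)((crc << 1) & 0x1F);
        if (bit ^ msb)
            crc ^= 0x05;                            /* x^2 + 1 taps */
    }
    return (uint8_t)(~crc & 0x1F);                  /* invert before sending */
}

int main(void)
{
    /* 11-bit token field: 7-bit address 0x15, 4-bit endpoint 0xE. */
    uint16_t token = (uint16_t)(0x15 | (0xE << 7));
    printf("CRC5 = 0x%02X\n", usb_crc5(token, 11));
    return 0;
}
```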


In various aspects, the USB network hub 300 can connect 127 functions configured in up to six logical layers (tiers) to a single computer. Further, the USB network hub 300 can connect to all peripherals using a standardized four-wire cable that provides both communication and power distribution. The power configurations are bus-powered and self-powered modes. The USB network hub 300 may be configured to support four modes of power management: a bus-powered hub, with either individual-port power management or ganged-port power management, and a self-powered hub, with either individual-port power management or ganged-port power management. In one aspect, using a USB cable, the upstream USB transceiver port 302 of the USB network hub 300 is plugged into a USB host controller, and the downstream USB transceiver ports 304, 306, 308 are exposed for connecting USB-compatible devices, and so forth.


Surgical Instrument Hardware


FIG. 12 illustrates a logic diagram of a control system 470 of a surgical instrument or tool in accordance with one or more aspects of the present disclosure. The system 470 comprises a control circuit. The control circuit includes a microcontroller 461 comprising a processor 462 and a memory 468. One or more of sensors 472, 474, 476, for example, provide real-time feedback to the processor 462. A motor 482, driven by a motor driver 492, operably couples a longitudinally movable displacement member to drive the I-beam knife element. A tracking system 480 is configured to determine the position of the longitudinally movable displacement member. The position information is provided to the processor 462, which can be programmed or configured to determine the position of the longitudinally movable drive member as well as the position of a firing member, firing bar, and I-beam knife element. Additional motors may be provided at the tool driver interface to control I-beam firing, closure tube travel, shaft rotation, and articulation. A display 473 displays a variety of operating conditions of the instruments and may include touch screen functionality for data input. Information displayed on the display 473 may be overlaid with images acquired via endoscopic imaging modules.


In one aspect, the microcontroller 461 may be any single-core or multicore processor such as those known under the trade name ARM Cortex by Texas Instruments. In one aspect, the main microcontroller 461 may be an LM4F230H5QR ARM Cortex-M4F Processor Core, available from Texas Instruments, for example, comprising an on-chip memory of 256 KB single-cycle flash memory, or other non-volatile memory, up to 40 MHz, a prefetch buffer to improve performance above 40 MHz, a 32 KB single-cycle SRAM, an internal ROM loaded with StellarisWare® software, a 2 KB EEPROM, one or more PWM modules, one or more QEIs, and/or one or more 12-bit ADCs with 12 analog input channels, details of which are available in the product datasheet.


In one aspect, the microcontroller 461 may comprise a safety controller comprising two controller families, such as TMS570 and RM4x, known under the trade name Hercules ARM Cortex R4, also by Texas Instruments. The safety controller may be configured specifically for IEC 61508 and ISO 26262 safety-critical applications, among others, to provide advanced integrated safety features while delivering scalable performance, connectivity, and memory options.


The microcontroller 461 may be programmed to perform various functions such as precise control over the speed and position of the knife and articulation systems. In one aspect, the microcontroller 461 includes a processor 462 and a memory 468. The electric motor 482 may be a brushed direct current (DC) motor with a gearbox and mechanical links to an articulation or knife system. In one aspect, a motor driver 492 may be an A3941 available from Allegro Microsystems, Inc. Other motor drivers may be readily substituted for use in the tracking system 480 comprising an absolute positioning system. A detailed description of an absolute positioning system is described in U.S. Patent Application Publication No. 2017/0296213, titled SYSTEMS AND METHODS FOR CONTROLLING A SURGICAL STAPLING AND CUTTING INSTRUMENT, which published on Oct. 19, 2017, which is herein incorporated by reference in its entirety.


The microcontroller 461 may be programmed to provide precise control over the speed and position of displacement members and articulation systems. The microcontroller 461 may be configured to compute a response in the software of the microcontroller 461. The computed response is compared to a measured response of the actual system to obtain an “observed” response, which is used for actual feedback decisions. The observed response is a favorable, tuned value that balances the smooth, continuous nature of the simulated response with the measured response, which can detect outside influences on the system.
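
The balance between the simulated and measured responses can be sketched as a simple weighted correction. In the sketch below, the blending gain k is an assumed tuning parameter, not a value from this disclosure.

```c
/* Blend the computed (simulated) response with the measured response to
   form the "observed" response used for feedback decisions. */
double observed_response(double simulated, double measured, double k)
{
    /* k in [0, 1]: small k favors the smooth simulated response; large k
       favors the measured response, which reflects outside influences. */
    return simulated + k * (measured - simulated);
}
```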


In one aspect, the motor 482 may be controlled by the motor driver 492 and can be employed by the firing system of the surgical instrument or tool. In various forms, the motor 482 may be a brushed DC driving motor having a maximum rotational speed of approximately 25,000 RPM. In other arrangements, the motor 482 may include a brushless motor, a cordless motor, a synchronous motor, a stepper motor, or any other suitable electric motor. The motor driver 492 may comprise an H-bridge driver comprising field-effect transistors (FETs), for example. The motor 482 can be powered by a power assembly releasably mounted to the handle assembly or tool housing for supplying control power to the surgical instrument or tool. The power assembly may comprise a battery which may include a number of battery cells connected in series that can be used as the power source to power the surgical instrument or tool. In certain circumstances, the battery cells of the power assembly may be replaceable and/or rechargeable. In at least one example, the battery cells can be lithium-ion batteries which can be couplable to and separable from the power assembly.


The motor driver 492 may be an A3941 available from Allegro Microsystems, Inc. The A3941 driver 492 is a full-bridge controller for use with external N-channel power metal-oxide semiconductor field-effect transistors (MOSFETs) specifically designed for inductive loads, such as brush DC motors. The driver 492 comprises a unique charge pump regulator that provides full (>10 V) gate drive for battery voltages down to 7 V and allows the A3941 to operate with a reduced gate drive, down to 5.5 V. A bootstrap capacitor may be employed to provide the above-battery supply voltage required for N-channel MOSFETs. An internal charge pump for the high-side drive allows DC (100% duty cycle) operation. The full bridge can be driven in fast or slow decay modes using diode or synchronous rectification. In the slow decay mode, current recirculation can be through the high-side or the low-side FETs. The power FETs are protected from shoot-through by resistor-adjustable dead time. Integrated diagnostics provide indications of undervoltage, overtemperature, and power bridge faults and can be configured to protect the power MOSFETs under most short-circuit conditions. Other motor drivers may be readily substituted for use in the tracking system 480 comprising an absolute positioning system.


The tracking system 480 comprises a controlled motor drive circuit arrangement comprising a position sensor 472 according to one aspect of this disclosure. The position sensor 472 for an absolute positioning system provides a unique position signal corresponding to the location of a displacement member. In one aspect, the displacement member represents a longitudinally movable drive member comprising a rack of drive teeth for meshing engagement with a corresponding drive gear of a gear reducer assembly. In other aspects, the displacement member represents the firing member, which could be adapted and configured to include a rack of drive teeth. In yet another aspect, the displacement member represents a firing bar or the I-beam, each of which can be adapted and configured to include a rack of drive teeth. Accordingly, as used herein, the term displacement member is used generically to refer to any movable member of the surgical instrument or tool such as the drive member, the firing member, the firing bar, the I-beam, or any element that can be displaced. In one aspect, the longitudinally movable drive member is coupled to the firing member, the firing bar, and the I-beam. Accordingly, the absolute positioning system can, in effect, track the linear displacement of the I-beam by tracking the linear displacement of the longitudinally movable drive member. In various other aspects, the displacement member may be coupled to any position sensor 472 suitable for measuring linear displacement. Thus, the longitudinally movable drive member, the firing member, the firing bar, or the I-beam, or combinations thereof, may be coupled to any suitable linear displacement sensor. Linear displacement sensors may include contact or non-contact displacement sensors. Linear displacement sensors may comprise linear variable differential transformers (LVDT), differential variable reluctance transducers (DVRT), a slide potentiometer, a magnetic sensing system comprising a movable magnet and a series of linearly arranged Hall effect sensors, a magnetic sensing system comprising a fixed magnet and a series of movable, linearly arranged Hall effect sensors, an optical sensing system comprising a movable light source and a series of linearly arranged photo diodes or photo detectors, an optical sensing system comprising a fixed light source and a series of movable, linearly arranged photo diodes or photo detectors, or any combination thereof.


The electric motor 482 can include a rotatable shaft that operably interfaces with a gear assembly that is mounted in meshing engagement with a set, or rack, of drive teeth on the displacement member. A sensor element may be operably coupled to a gear assembly such that a single revolution of the position sensor 472 element corresponds to some linear longitudinal translation of the displacement member. An arrangement of gearing and sensors can be connected to the linear actuator, via a rack and pinion arrangement, or a rotary actuator, via a spur gear or other connection. A power source supplies power to the absolute positioning system and an output indicator may display the output of the absolute positioning system. The displacement member represents the longitudinally movable drive member comprising a rack of drive teeth formed thereon for meshing engagement with a corresponding drive gear of the gear reducer assembly. The displacement member represents the longitudinally movable firing member, firing bar, I-beam, or combinations thereof.


A single revolution of the sensor element associated with the position sensor 472 is equivalent to a longitudinal linear displacement d1 of the displacement member, where d1 is the longitudinal linear distance that the displacement member moves from point "a" to point "b" after a single revolution of the sensor element coupled to the displacement member. The sensor arrangement may be connected via a gear reduction that results in the position sensor 472 completing one or more revolutions for the full stroke of the displacement member. The position sensor 472 may complete multiple revolutions for the full stroke of the displacement member.


A series of n switches, where n is an integer greater than one, may be employed alone or in combination with a gear reduction to provide a unique position signal for more than one revolution of the position sensor 472. The states of the switches are fed back to the microcontroller 461, which applies logic to determine a unique position signal corresponding to the longitudinal linear displacement d1+d2+ . . . +dn of the displacement member. The output of the position sensor 472 is provided to the microcontroller 461. The position sensor 472 of the sensor arrangement may comprise a magnetic sensor, an analog rotary sensor like a potentiometer, or an array of analog Hall-effect elements, which output a unique combination of position signals or values.
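
A minimal sketch of recovering an absolute linear position from such an arrangement follows; the travel per revolution and the 12-bit angle resolution are assumed, illustrative values, and the whole-revolution count stands in for the switch-state logic described above.

```c
#include <stdint.h>

#define MM_PER_REV     2.0     /* d1: travel per sensor revolution (assumed) */
#define COUNTS_PER_REV 4096.0  /* 12-bit angle resolution (assumed) */

/* Absolute linear position from whole revolutions (e.g., derived from
   the switch states fed back to the microcontroller 461) plus the
   fractional angle reported by the rotary position sensor. */
double linear_position_mm(uint32_t whole_revs, uint16_t angle_counts)
{
    return MM_PER_REV * ((double)whole_revs
                         + (double)angle_counts / COUNTS_PER_REV);
}
```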


The position sensor 472 may comprise any number of magnetic sensing elements, such as, for example, magnetic sensors classified according to whether they measure the total magnetic field or the vector components of the magnetic field. The techniques used to produce both types of magnetic sensors encompass many aspects of physics and electronics. The technologies used for magnetic field sensing include search coil, fluxgate, optically pumped, nuclear precession, SQUID, Hall-effect, anisotropic magnetoresistance, giant magnetoresistance, magnetic tunnel junctions, giant magnetoimpedance, magnetostrictive/piezoelectric composites, magnetodiode, magnetotransistor, fiber-optic, magneto-optic, and microelectromechanical systems-based magnetic sensors, among others.


In one aspect, the position sensor 472 for the tracking system 480 comprising an absolute positioning system comprises a magnetic rotary absolute positioning system. The position sensor 472 may be implemented as an AS5055EQFT single-chip magnetic rotary position sensor available from Austria Microsystems, AG. The position sensor 472 is interfaced with the microcontroller 461 to provide an absolute positioning system. The position sensor 472 is a low-voltage and low-power component and includes four Hall-effect elements in an area of the position sensor 472 that is located above a magnet. A high-resolution ADC and a smart power management controller are also provided on the chip. A coordinate rotation digital computer (CORDIC) processor, also known as the digit-by-digit method or Volder's algorithm, is provided to implement a simple and efficient algorithm to calculate hyperbolic and trigonometric functions that require only addition, subtraction, bitshift, and table lookup operations. The angle position, alarm bits, and magnetic field information are transmitted over a standard serial communication interface, such as a serial peripheral interface (SPI), to the microcontroller 461. The position sensor 472 provides 12 or 14 bits of resolution. The position sensor 472 may be an AS5055 chip provided in a small QFN 16-pin 4×4×0.85 mm package.
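
A minimal sketch of reading such a sensor over SPI and scaling the 12-bit angle to degrees follows. The command word, the bit field holding the angle, and the spi_transfer16() helper are hypothetical placeholders; the actual AS5055 framing and register map are defined in its datasheet.

```c
#include <stdint.h>

/* Hypothetical HAL call: exchanges one 16-bit SPI frame with the sensor. */
extern uint16_t spi_transfer16(uint16_t tx);

#define CMD_READ_ANGLE 0xFFFFu   /* placeholder command word */

/* Read the rotary sensor and convert the raw 12-bit angle to degrees.
   The bit field below is a placeholder; see the device datasheet for
   the real frame layout. */
double read_angle_degrees(void)
{
    uint16_t frame = spi_transfer16(CMD_READ_ANGLE);
    uint16_t raw = (uint16_t)((frame >> 2) & 0x0FFF);  /* 12-bit angle */
    return raw * 360.0 / 4096.0;   /* 4096 counts per revolution */
}
```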


The tracking system 480 comprising an absolute positioning system may comprise and/or be programmed to implement a feedback controller, such as a PID, state feedback, or adaptive controller. A power source converts the signal from the feedback controller into a physical input to the system: in this case, the voltage. Other examples include a PWM of the voltage, current, and force. Other sensor(s) may be provided to measure physical parameters of the physical system in addition to the position measured by the position sensor 472. In some aspects, the other sensor(s) can include sensor arrangements such as those described in U.S. Pat. No. 9,345,481, titled STAPLE CARTRIDGE TISSUE THICKNESS SENSOR SYSTEM, which issued on May 24, 2016, which is herein incorporated by reference in its entirety; U.S. Patent Application Publication No. 2014/0263552, titled STAPLE CARTRIDGE TISSUE THICKNESS SENSOR SYSTEM, which published on Sep. 18, 2014, which is herein incorporated by reference in its entirety; and U.S. patent application Ser. No. 15/628,175, titled TECHNIQUES FOR ADAPTIVE CONTROL OF MOTOR VELOCITY OF A SURGICAL STAPLING AND CUTTING INSTRUMENT, filed Jun. 20, 2017, which is herein incorporated by reference in its entirety. In a digital signal processing system, an absolute positioning system is coupled to a digital data acquisition system where the output of the absolute positioning system will have a finite resolution and sampling frequency. The absolute positioning system may comprise a compare-and-combine circuit to combine a computed response with a measured response using algorithms, such as a weighted average and a theoretical control loop, that drive the computed response towards the measured response. The computed response of the physical system takes into account properties like mass, inertia, viscous friction, inductance, and resistance to predict what the states and outputs of the physical system will be from knowledge of the input.
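
A minimal discrete-time PID sketch of the kind of feedback controller named above is shown below. The gains, the struct layout, and the assumption that the output becomes a motor voltage or PWM duty cycle are illustrative, not values or choices from this disclosure.

```c
/* Minimal discrete PID controller sketch. */
typedef struct {
    double kp, ki, kd;     /* controller gains (illustrative) */
    double integral;       /* accumulated error */
    double prev_error;     /* error at the previous sample */
} pid_ctrl_t;

double pid_update(pid_ctrl_t *c, double setpoint, double measured, double dt)
{
    double error = setpoint - measured;
    c->integral += error * dt;
    double derivative = (error - c->prev_error) / dt;
    c->prev_error = error;
    /* The power source converts this output into a physical input,
       e.g., a motor voltage or PWM duty cycle. */
    return c->kp * error + c->ki * c->integral + c->kd * derivative;
}
```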


The absolute positioning system provides an absolute position of the displacement member upon power-up of the instrument, without retracting or advancing the displacement member to a reset (zero or home) position as may be required with conventional rotary encoders that merely count the number of steps forwards or backwards that the motor 482 has taken to infer the position of a device actuator, drive bar, knife, or the like.


A sensor 474, such as, for example, a strain gauge or a micro-strain gauge, is configured to measure one or more parameters of the end effector, such as, for example, the amplitude of the strain exerted on the anvil during a clamping operation, which can be indicative of the closure forces applied to the anvil. The measured strain is converted to a digital signal and provided to the processor 462. Alternatively, or in addition to the sensor 474, a sensor 476, such as, for example, a load sensor, can measure the closure force applied by the closure drive system to the anvil. The sensor 476, such as, for example, a load sensor, can measure the firing force applied to an I-beam in a firing stroke of the surgical instrument or tool. The I-beam is configured to engage a wedge sled, which is configured to upwardly cam staple drivers to force out staples into deforming contact with an anvil. The I-beam also includes a sharpened cutting edge that can be used to sever tissue as the I-beam is advanced distally by the firing bar. Alternatively, a current sensor 478 can be employed to measure the current drawn by the motor 482. The force required to advance the firing member can correspond to the current drawn by the motor 482, for example. The measured force is converted to a digital signal and provided to the processor 462.
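
The correspondence between motor current and firing force can be sketched as a linear estimate; the torque constant and the transmission's force-per-torque factor below are assumed values for illustration only.

```c
/* Estimate firing force from motor current: force tracks the current
   drawn by the motor, scaled by assumed motor and transmission constants. */
double firing_force_n(double motor_current_a)
{
    const double kt_nm_per_a = 0.02;   /* motor torque constant (assumed) */
    const double n_per_nm    = 500.0;  /* force per unit torque through the
                                          transmission (assumed) */
    return motor_current_a * kt_nm_per_a * n_per_nm;
}
```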


In one form, the strain gauge sensor 474 can be used to measure the force applied to the tissue by the end effector. A strain gauge can be coupled to the end effector to measure the force on the tissue being treated by the end effector. A system for measuring forces applied to the tissue grasped by the end effector comprises a strain gauge sensor 474, such as, for example, a micro-strain gauge, that is configured to measure one or more parameters of the end effector, for example. In one aspect, the strain gauge sensor 474 can measure the amplitude or magnitude of the strain exerted on a jaw member of an end effector during a clamping operation, which can be indicative of the tissue compression. The measured strain is converted to a digital signal and provided to a processor 462 of the microcontroller 461. A load sensor 476 can measure the force used to operate the knife element, for example, to cut the tissue captured between the anvil and the staple cartridge. A magnetic field sensor can be employed to measure the thickness of the captured tissue. The measurement of the magnetic field sensor also may be converted to a digital signal and provided to the processor 462.


The measurements of the tissue compression, the tissue thickness, and/or the force required to close the end effector on the tissue, as respectively measured by the sensors 474, 476, can be used by the microcontroller 461 to characterize the selected position of the firing member and/or the corresponding value of the speed of the firing member. In one instance, a memory 468 may store a technique, an equation, and/or a lookup table which can be employed by the microcontroller 461 in the assessment.
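
The lookup-table assessment can be sketched as follows; the breakpoints and speeds are illustrative placeholders for the kind of table that might be stored in the memory 468, not values from this disclosure.

```c
#include <stddef.h>

typedef struct {
    double max_compression;   /* upper bound of this band (arbitrary units) */
    double speed_mm_s;        /* firing-member speed for this band */
} lut_row_t;

/* Illustrative table: heavier compression maps to a slower firing speed. */
static const lut_row_t firing_lut[] = {
    { 10.0, 15.0 },
    { 30.0, 10.0 },
    { 60.0,  5.0 },
};

double lookup_firing_speed(double compression)
{
    size_t n = sizeof firing_lut / sizeof firing_lut[0];
    for (size_t i = 0; i < n; i++)
        if (compression <= firing_lut[i].max_compression)
            return firing_lut[i].speed_mm_s;
    return firing_lut[n - 1].speed_mm_s;   /* clamp to the slowest entry */
}
```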


The control system 470 of the surgical instrument or tool also may comprise wired or wireless communication circuits to communicate with the modular communication hub as shown in FIGS. 8-11.



FIG. 13 illustrates a control circuit 500 configured to control aspects of the surgical instrument or tool according to one aspect of this disclosure. The control circuit 500 can be configured to implement various processes described herein. The control circuit 500 may comprise a microcontroller comprising one or more processors 502 (e.g., microprocessor, microcontroller) coupled to at least one memory circuit 504. The memory circuit 504 stores machine-executable instructions that, when executed by the processor 502, cause the processor 502 to implement various processes described herein. The processor 502 may be any one of a number of single-core or multicore processors known in the art. The memory circuit 504 may comprise volatile and non-volatile storage media. The processor 502 may include an instruction processing unit 506 and an arithmetic unit 508. The instruction processing unit may be configured to receive instructions from the memory circuit 504.



FIG. 14 illustrates a combinational logic circuit 510 configured to control aspects of the surgical instrument or tool according to one aspect of this disclosure. The combinational logic circuit 510 can be configured to implement various processes described herein. The combinational logic circuit 510 may comprise a finite state machine comprising a combinational logic 512 configured to receive data associated with the surgical instrument or tool at an input 514, process the data by the combinational logic 512, and provide an output 516.



FIG. 15 illustrates a sequential logic circuit 520 configured to control aspects of the surgical instrument or tool according to one aspect of this disclosure. The sequential logic circuit 520 or the combinational logic 522 can be configured to implement various processes described herein. The sequential logic circuit 520 may comprise a finite state machine. The sequential logic circuit 520 may comprise a combinational logic 522, at least one memory circuit 524, and a clock 529, for example. The at least one memory circuit 524 can store a current state of the finite state machine. In certain instances, the sequential logic circuit 520 may be synchronous or asynchronous. The combinational logic 522 is configured to receive data associated with the surgical instrument or tool from an input 526, process the data by the combinational logic 522, and provide an output 528. In other aspects, the circuit may comprise a combination of a processor (e.g., processor 502, FIG. 13) and a finite state machine to implement various processes herein. In other aspects, the finite state machine may comprise a combination of a combinational logic circuit (e.g., combinational logic circuit 510, FIG. 14) and the sequential logic circuit 520.
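
The division of labor in FIG. 15, with combinational logic computing the next state and a memory circuit latching it on each clock, can be sketched in C as follows; the states and the trigger input are illustrative.

```c
/* Clocked finite state machine: combinational next-state logic plus a
   memory element latched on each clock tick. States are illustrative. */
typedef enum { IDLE, CLAMPING, FIRING } state_t;

typedef struct { state_t state; } fsm_t;   /* the "memory circuit" */

/* Combinational logic: next state from current state and input. */
static state_t next_state(state_t s, int trigger)
{
    switch (s) {
    case IDLE:     return trigger ? CLAMPING : IDLE;
    case CLAMPING: return trigger ? FIRING : CLAMPING;
    case FIRING:   return IDLE;
    }
    return IDLE;
}

/* One clock tick: latch the computed next state into the memory. */
void fsm_clock(fsm_t *fsm, int trigger)
{
    fsm->state = next_state(fsm->state, trigger);
}
```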



FIG. 16 illustrates a surgical instrument or tool comprising a plurality of motors which can be activated to perform various functions. In certain instances, a first motor can be activated to perform a first function, a second motor can be activated to perform a second function, a third motor can be activated to perform a third function, a fourth motor can be activated to perform a fourth function, and so on. In certain instances, the plurality of motors of robotic surgical instrument 600 can be individually activated to cause firing, closure, and/or articulation motions in the end effector. The firing, closure, and/or articulation motions can be transmitted to the end effector through a shaft assembly, for example.


In certain instances, the surgical instrument system or tool may include a firing motor 602. The firing motor 602 may be operably coupled to a firing motor drive assembly 604, which can be configured to transmit firing motions, generated by the motor 602, to the end effector, in particular to displace the I-beam element. In certain instances, the firing motions generated by the motor 602 may cause the staples to be deployed from the staple cartridge into tissue captured by the end effector and/or the cutting edge of the I-beam element to be advanced to cut the captured tissue, for example. The I-beam element may be retracted by reversing the direction of the motor 602.


In certain instances, the surgical instrument or tool may include a closure motor 603. The closure motor 603 may be operably coupled to a closure motor drive assembly 605, which can be configured to transmit closure motions, generated by the motor 603, to the end effector, in particular to displace a closure tube to close the anvil and compress tissue between the anvil and the staple cartridge. The closure motions may cause the end effector to transition from an open configuration to an approximated configuration to capture tissue, for example. The end effector may be transitioned to an open position by reversing the direction of the motor 603.


In certain instances, the surgical instrument or tool may include one or more articulation motors 606a, 606b, for example. The motors 606a, 606b may be operably coupled to respective articulation motor drive assemblies 608a, 608b, which can be configured to transmit articulation motions generated by the motors 606a, 606b to the end effector. In certain instances, the articulation motions may cause the end effector to articulate relative to the shaft, for example.


As described above, the surgical instrument or tool may include a plurality of motors which may be configured to perform various independent functions. In certain instances, the plurality of motors of the surgical instrument or tool can be individually or separately activated to perform one or more functions while the other motors remain inactive. For example, the articulation motors 606a, 606b can be activated to cause the end effector to be articulated while the firing motor 602 remains inactive. Alternatively, the firing motor 602 can be activated to fire the plurality of staples, and/or to advance the cutting edge, while the articulation motors 606a, 606b remain inactive. Furthermore, the closure motor 603 may be activated simultaneously with the firing motor 602 to cause the closure tube and the I-beam element to advance distally as described in more detail hereinbelow.


In certain instances, the surgical instrument or tool may include a common control module 610 which can be employed with a plurality of motors of the surgical instrument or tool. In certain instances, the common control module 610 may accommodate one of the plurality of motors at a time. For example, the common control module 610 can be couplable to and separable from the plurality of motors of the robotic surgical instrument individually. In certain instances, a plurality of the motors of the surgical instrument or tool may share one or more common control modules such as the common control module 610. In certain instances, a plurality of motors of the surgical instrument or tool can be individually and selectively engaged with the common control module 610. In certain instances, the common control module 610 can be selectively switched from interfacing with one of a plurality of motors of the surgical instrument or tool to interfacing with another one of the plurality of motors of the surgical instrument or tool.


In at least one example, the common control module 610 can be selectively switched between operable engagement with the articulation motors 606a, 606b and operable engagement with either the firing motor 602 or the closure motor 603. In at least one example, as illustrated in FIG. 16, a switch 614 can be moved or transitioned between a plurality of positions and/or states. In a first position 616, the switch 614 may electrically couple the common control module 610 to the firing motor 602; in a second position 617, the switch 614 may electrically couple the common control module 610 to the closure motor 603; in a third position 618a, the switch 614 may electrically couple the common control module 610 to the first articulation motor 606a; and in a fourth position 618b, the switch 614 may electrically couple the common control module 610 to the second articulation motor 606b, for example. In certain instances, separate common control modules 610 can be electrically coupled to the firing motor 602, the closure motor 603, and the articulation motors 606a, 606b at the same time. In certain instances, the switch 614 may be a mechanical switch, an electromechanical switch, a solid-state switch, or any suitable switching mechanism.
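
The switch-position-to-motor mapping can be sketched as a simple dispatch; the enumerations and the selection function below are illustrative stand-ins for the electrical coupling performed by the switch 614.

```c
/* Illustrative mapping from sensed switch position to the motor the
   common control module 610 should drive (positions 616-618b). */
typedef enum {
    POS_FIRING,    /* first position 616 */
    POS_CLOSURE,   /* second position 617 */
    POS_ARTIC_A,   /* third position 618a */
    POS_ARTIC_B    /* fourth position 618b */
} switch_pos_t;

typedef enum {
    MOTOR_FIRING, MOTOR_CLOSURE, MOTOR_ARTIC_A, MOTOR_ARTIC_B
} motor_id_t;

motor_id_t select_motor(switch_pos_t pos)
{
    switch (pos) {
    case POS_FIRING:  return MOTOR_FIRING;
    case POS_CLOSURE: return MOTOR_CLOSURE;
    case POS_ARTIC_A: return MOTOR_ARTIC_A;
    default:          return MOTOR_ARTIC_B;
    }
}
```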


Each of the motors 602, 603, 606a, 606b may comprise a torque sensor to measure the output torque on the shaft of the motor. The force on an end effector may be sensed in any conventional manner, such as by force sensors on the outer sides of the jaws or by a torque sensor for the motor actuating the jaws.


In various instances, as illustrated in FIG. 16, the common control module 610 may comprise a motor driver 626 which may comprise one or more H-Bridge FETs. The motor driver 626 may modulate the power transmitted from a power source 628 to a motor coupled to the common control module 610 based on input from a microcontroller 620 (the “controller”), for example. In certain instances, the microcontroller 620 can be employed to determine the current drawn by the motor, for example, while the motor is coupled to the common control module 610, as described above.


In certain instances, the microcontroller 620 may include a microprocessor 622 (the “processor”) and one or more non-transitory computer-readable mediums or memory units 624 (the “memory”). In certain instances, the memory 624 may store various program instructions, which when executed may cause the processor 622 to perform a plurality of functions and/or calculations described herein. In certain instances, one or more of the memory units 624 may be coupled to the processor 622, for example.


In certain instances, the power source 628 can be employed to supply power to the microcontroller 620, for example. In certain instances, the power source 628 may comprise a battery (or “battery pack” or “power pack”), such as a lithium-ion battery, for example. In certain instances, the battery pack may be configured to be releasably mounted to a handle for supplying power to the surgical instrument 600. A number of battery cells connected in series may be used as the power source 628. In certain instances, the power source 628 may be replaceable and/or rechargeable, for example.


In various instances, the processor 622 may control the motor driver 626 to control the position, direction of rotation, and/or velocity of a motor that is coupled to the common control module 610. In certain instances, the processor 622 can signal the motor driver 626 to stop and/or disable a motor that is coupled to the common control module 610. It should be understood that the term “processor” as used herein includes any suitable microprocessor, microcontroller, or other basic computing device that incorporates the functions of a computer's central processing unit (CPU) on an integrated circuit or, at most, a few integrated circuits. The processor is a multipurpose, programmable device that accepts digital data as input, processes it according to instructions stored in its memory, and provides results as output. It is an example of sequential digital logic, as it has internal memory. Processors operate on numbers and symbols represented in the binary numeral system.


In one instance, the processor 622 may be any single-core or multicore processor such as those known under the trade name ARM Cortex by Texas Instruments. In certain instances, the microcontroller 620 may be an LM4F230H5QR, available from Texas Instruments, for example. In at least one example, the Texas Instruments LM4F230H5QR is an ARM Cortex-M4F Processor Core comprising an on-chip memory of 256 KB single-cycle flash memory, or other non-volatile memory, up to 40 MHz, a prefetch buffer to improve performance above 40 MHz, a 32 KB single-cycle SRAM, an internal ROM loaded with StellarisWare® software, a 2 KB EEPROM, one or more PWM modules, one or more QEIs, and one or more 12-bit ADCs with 12 analog input channels, among other features that are readily available in the product datasheet. Other microcontrollers may be readily substituted for use with the common control module 610. Accordingly, the present disclosure should not be limited in this context.


In certain instances, the memory 624 may include program instructions for controlling each of the motors of the surgical instrument 600 that are couplable to the common control module 610. For example, the memory 624 may include program instructions for controlling the firing motor 602, the closure motor 603, and the articulation motors 606a, 606b. Such program instructions may cause the processor 622 to control the firing, closure, and articulation functions in accordance with inputs from algorithms or control programs of the surgical instrument or tool.


In certain instances, one or more mechanisms and/or sensors such as, for example, sensors 630 can be employed to alert the processor 622 to the program instructions that should be used in a particular setting. For example, the sensors 630 may alert the processor 622 to use the program instructions associated with firing, closing, and articulating the end effector. In certain instances, the sensors 630 may comprise position sensors which can be employed to sense the position of the switch 614, for example. Accordingly, the processor 622 may use the program instructions associated with firing the I-beam of the end effector upon detecting, through the sensors 630 for example, that the switch 614 is in the first position 616; the processor 622 may use the program instructions associated with closing the anvil upon detecting, through the sensors 630 for example, that the switch 614 is in the second position 617; and the processor 622 may use the program instructions associated with articulating the end effector upon detecting, through the sensors 630 for example, that the switch 614 is in the third or fourth position 618a, 618b.



FIG. 17 is a schematic diagram of a robotic surgical instrument 700 configured to operate a surgical tool described herein according to one aspect of this disclosure. The robotic surgical instrument 700 may be programmed or configured to control distal/proximal translation of a displacement member, distal/proximal displacement of a closure tube, shaft rotation, and articulation, either with single or multiple articulation drive links. In one aspect, the surgical instrument 700 may be programmed or configured to individually control a firing member, a closure member, a shaft member, and/or one or more articulation members. The surgical instrument 700 comprises a control circuit 710 configured to control motor-driven firing members, closure members, shaft members, and/or one or more articulation members.


In one aspect, the robotic surgical instrument 700 comprises a control circuit 710 configured to control an anvil 716 and an I-beam 714 (including a sharp cutting edge) portion of an end effector 702, a removable staple cartridge 718, a shaft 740, and one or more articulation members 742a, 742b via a plurality of motors 704a-704e. A position sensor 734 may be configured to provide position feedback of the I-beam 714 to the control circuit 710. Other sensors 738 may be configured to provide feedback to the control circuit 710. A timer/counter 731 provides timing and counting information to the control circuit 710. An energy source 712 may be provided to operate the motors 704a-704e, and a current sensor 736 provides motor current feedback to the control circuit 710. The motors 704a-704e can be operated individually by the control circuit 710 in an open-loop or closed-loop feedback control.


In one aspect, the control circuit 710 may comprise one or more microcontrollers, microprocessors, or other suitable processors for executing instructions that cause the processor or processors to perform one or more tasks. In one aspect, a timer/counter 731 provides an output signal, such as the elapsed time or a digital count, to the control circuit 710 to correlate the position of the I-beam 714 as determined by the position sensor 734 with the output of the timer/counter 731 such that the control circuit 710 can determine the position of the I-beam 714 at a specific time (t) relative to a starting position or the time (t) when the I-beam 714 is at a specific position relative to a starting position. The timer/counter 731 may be configured to measure elapsed time, count external events, or time external events.
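
Pairing position-sensor samples with timer/counter timestamps yields velocity directly. A minimal sketch, with assumed units of millimeters and microseconds, follows.

```c
#include <stdint.h>

/* Velocity from paired position and timer/counter samples. Units are
   assumed: position in millimeters, timestamps in microseconds. The
   unsigned subtraction handles timer wrap-around. */
double velocity_mm_per_s(double pos_mm, double prev_pos_mm,
                         uint32_t t_us, uint32_t prev_t_us)
{
    double dt_s = (double)(uint32_t)(t_us - prev_t_us) * 1e-6;
    return dt_s > 0.0 ? (pos_mm - prev_pos_mm) / dt_s : 0.0;
}
```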


In one aspect, the control circuit 710 may be programmed to control functions of the end effector 702 based on one or more tissue conditions. The control circuit 710 may be programmed to sense tissue conditions, such as thickness, either directly or indirectly, as described herein. The control circuit 710 may be programmed to select a firing control program or closure control program based on tissue conditions. A firing control program may describe the distal motion of the displacement member. Different firing control programs may be selected to better treat different tissue conditions. For example, when thicker tissue is present, the control circuit 710 may be programmed to translate the displacement member at a lower velocity and/or with lower power. When thinner tissue is present, the control circuit 710 may be programmed to translate the displacement member at a higher velocity and/or with higher power. A closure control program may control the closure force applied to the tissue by the anvil 716. Other control programs control the rotation of the shaft 740 and the articulation members 742a, 742b.
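
As a purely illustrative sketch (in Python, with hypothetical thresholds and velocity values that are not taken from this disclosure), a tissue-condition-based selection of this kind might look like the following:

    # Hypothetical sketch: map a sensed tissue thickness to a firing
    # velocity set point. The threshold and velocity values are assumed
    # for illustration only.
    def select_firing_velocity(tissue_thickness_mm: float) -> float:
        """Return a displacement-member velocity set point in mm/s."""
        THICK_TISSUE_MM = 2.0          # assumed thickness threshold
        if tissue_thickness_mm >= THICK_TISSUE_MM:
            return 5.0                 # thicker tissue: lower velocity
        return 12.0                    # thinner tissue: higher velocity

    print(select_firing_velocity(2.5))   # -> 5.0 (slower program selected)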


In one aspect, the control circuit 710 may generate motor set point signals. The motor set point signals may be provided to various motor controllers 708a-708e. The motor controllers 708a-708e may comprise one or more circuits configured to provide motor drive signals to the motors 704a-704e to drive the motors 704a-704e as described herein. In some examples, the motors 704a-704e may be brushed DC electric motors. For example, the velocity of the motors 704a-704e may be proportional to the respective motor drive signals. In some examples, the motors 704a-704e may be brushless DC electric motors, and the respective motor drive signals may comprise a PWM signal provided to one or more stator windings of the motors 704a-704e. Also, in some examples, the motor controllers 708a-708e may be omitted and the control circuit 710 may generate the motor drive signals directly.
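
As a minimal sketch of the proportional relationship described above (assuming a hypothetical maximum velocity and ignoring commutation details), a set point might be converted to a PWM duty cycle as follows:

    # Hypothetical sketch: convert a velocity set point into a PWM duty
    # cycle for a DC motor drive. The maximum velocity is an assumed value.
    def setpoint_to_pwm_duty(velocity_setpoint: float,
                             max_velocity: float = 100.0) -> float:
        """Return a PWM duty cycle in [0.0, 1.0] proportional to the set point."""
        duty = abs(velocity_setpoint) / max_velocity
        return min(max(duty, 0.0), 1.0)   # clamp to the valid duty range

    print(setpoint_to_pwm_duty(55.0))     # -> 0.55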


In one aspect, the control circuit 710 may initially operate each of the motors 704a-704e in an open-loop configuration for a first open-loop portion of a stroke of the displacement member. Based on the response of the robotic surgical instrument 700 during the open-loop portion of the stroke, the control circuit 710 may select a firing control program in a closed-loop configuration. The response of the instrument may include a translation distance of the displacement member during the open-loop portion, a time elapsed during the open-loop portion, the energy provided to one of the motors 704a-704e during the open-loop portion, a sum of pulse widths of a motor drive signal, etc. After the open-loop portion, the control circuit 710 may implement the selected firing control program for a second portion of the displacement member stroke. For example, during a closed-loop portion of the stroke, the control circuit 710 may modulate one of the motors 704a-704e based on translation data describing a position of the displacement member in a closed-loop manner to translate the displacement member at a constant velocity.
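
A toy sketch of this two-phase stroke, with a simulated motor and position sensor standing in for the real hardware and all thresholds assumed, might be:

    # Hypothetical sketch: drive an open-loop portion of the stroke, measure
    # the response, and select a closed-loop velocity program from it. The
    # plant model, interfaces, and thresholds are assumed for illustration.
    class SimMotor:
        def __init__(self):
            self.duty = 0.0

        def drive(self, duty):
            self.duty = duty

    class SimPositionSensor:
        def __init__(self, motor):
            self.motor, self.pos = motor, 0.0

        def read(self):
            self.pos += self.motor.duty * 0.5   # toy plant response
            return self.pos

    def select_program(motor, sensor):
        start = sensor.read()
        motor.drive(0.5)                     # fixed open-loop command
        travelled = sensor.read() - start    # response during open loop
        # Small travel under a fixed command suggests a heavy load (thick
        # tissue), so select the slower closed-loop velocity program.
        return 5.0 if travelled < 0.2 else 12.0

    m = SimMotor()
    print(select_program(m, SimPositionSensor(m)))   # -> 12.0 for this toy plant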


In one aspect, the motors 704a-704e may receive power from an energy source 712. The energy source 712 may be a DC power supply driven by a main alternating current power source, a battery, a super capacitor, or any other suitable energy source. The motors 704a-704e may be mechanically coupled to individual movable mechanical elements such as the I-beam 714, anvil 716, shaft 740, articulation member 742a, and articulation member 742b via respective transmissions 706a-706e. The transmissions 706a-706e may include one or more gears or other linkage components to couple the motors 704a-704e to movable mechanical elements. A position sensor 734 may sense a position of the I-beam 714. The position sensor 734 may be or include any type of sensor that is capable of generating position data that indicate a position of the I-beam 714. In some examples, the position sensor 734 may include an encoder configured to provide a series of pulses to the control circuit 710 as the I-beam 714 translates distally and proximally. The control circuit 710 may track the pulses to determine the position of the I-beam 714. Other suitable position sensors may be used, including, for example, a proximity sensor. Other types of position sensors may provide other signals indicating motion of the I-beam 714. Also, in some examples, the position sensor 734 may be omitted. Where any of the motors 704a-704e is a stepper motor, the control circuit 710 may track the position of the I-beam 714 by aggregating the number and direction of steps that the motor has been instructed to execute. The position sensor 734 may be located in the end effector 702 or at any other portion of the instrument. The output of each of the motors 704a-704e includes a respective torque sensor 744a-744e to sense force and an encoder to sense rotation of the drive shaft.
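
For the encoder-based tracking described here, a minimal sketch (with an assumed pulses-per-millimeter scale factor) could be:

    # Hypothetical sketch: track displacement-member position by counting
    # encoder pulses; where a stepper motor is used and the position sensor
    # is omitted, commanded steps can be aggregated the same way. The
    # resolution is an assumed value.
    class PositionTracker:
        PULSES_PER_MM = 200.0          # assumed encoder resolution

        def __init__(self):
            self.pulse_count = 0

        def on_pulse(self, direction: int) -> None:
            """direction is +1 for distal travel, -1 for proximal."""
            self.pulse_count += direction

        def position_mm(self) -> float:
            return self.pulse_count / self.PULSES_PER_MM

    tracker = PositionTracker()
    for _ in range(400):
        tracker.on_pulse(+1)           # 400 distal pulses
    print(tracker.position_mm())       # -> 2.0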


In one aspect, the control circuit 710 is configured to drive a firing member such as the I-beam 714 portion of the end effector 702. The control circuit 710 provides a motor set point to a motor control 708a, which provides a drive signal to the motor 704a. The output shaft of the motor 704a is coupled to a torque sensor 744a. The torque sensor 744a is coupled to a transmission 706a which is coupled to the I-beam 714. The transmission 706a comprises movable mechanical elements such as rotating elements and a firing member to control the movement of the I-beam 714 distally and proximally along a longitudinal axis of the end effector 702. In one aspect, the motor 704a may be coupled to the knife gear assembly, which includes a knife gear reduction set that includes a first knife drive gear and a second knife drive gear. A torque sensor 744a provides a firing force feedback signal to the control circuit 710. The firing force signal represents the force required to fire or displace the I-beam 714. A position sensor 734 may be configured to provide the position of the I-beam 714 along the firing stroke or the position of the firing member as a feedback signal to the control circuit 710. The end effector 702 may include additional sensors 738 configured to provide feedback signals to the control circuit 710. When ready to use, the control circuit 710 may provide a firing signal to the motor control 708a. In response to the firing signal, the motor 704a may drive the firing member distally along the longitudinal axis of the end effector 702 from a proximal stroke start position to a stroke end position distal to the stroke start position. As the firing member translates distally, an I-beam 714, with a cutting element positioned at a distal end, advances distally to cut tissue located between the staple cartridge 718 and the anvil 716.


In one aspect, the control circuit 710 is configured to drive a closure member such as the anvil 716 portion of the end effector 702. The control circuit 710 provides a motor set point to a motor control 708b, which provides a drive signal to the motor 704b. The output shaft of the motor 704b is coupled to a torque sensor 744b. The torque sensor 744b is coupled to a transmission 706b which is coupled to the anvil 716. The transmission 706b comprises movable mechanical elements such as rotating elements and a closure member to control the movement of the anvil 716 between the open and closed positions. In one aspect, the motor 704b is coupled to a closure gear assembly, which includes a closure reduction gear set that is supported in meshing engagement with the closure spur gear. The torque sensor 744b provides a closure force feedback signal to the control circuit 710. The closure force feedback signal represents the closure force applied to the anvil 716. The position sensor 734 may be configured to provide the position of the closure member as a feedback signal to the control circuit 710. Additional sensors 738 in the end effector 702 may provide the closure force feedback signal to the control circuit 710. The pivotable anvil 716 is positioned opposite the staple cartridge 718. When ready to use, the control circuit 710 may provide a closure signal to the motor control 708b. In response to the closure signal, the motor 704b advances a closure member to grasp tissue between the anvil 716 and the staple cartridge 718.


In one aspect, the control circuit 710 is configured to rotate a shaft member such as the shaft 740 to rotate the end effector 702. The control circuit 710 provides a motor set point to a motor control 708c, which provides a drive signal to the motor 704c. The output shaft of the motor 704c is coupled to a torque sensor 744c. The torque sensor 744c is coupled to a transmission 706c which is coupled to the shaft 740. The transmission 706c comprises movable mechanical elements such as rotating elements to control the rotation of the shaft 740 clockwise or counterclockwise up to and over 360°. In one aspect, the motor 704c is coupled to the rotational transmission assembly, which includes a tube gear segment that is formed on (or attached to) the proximal end of the proximal closure tube for operable engagement by a rotational gear assembly that is operably supported on the tool mounting plate. The torque sensor 744c provides a rotation force feedback signal to the control circuit 710. The rotation force feedback signal represents the rotation force applied to the shaft 740. The position sensor 734 may be configured to provide the position of the closure member as a feedback signal to the control circuit 710. Additional sensors 738 such as a shaft encoder may provide the rotational position of the shaft 740 to the control circuit 710.


In one aspect, the control circuit 710 is configured to articulate the end effector 702. The control circuit 710 provides a motor set point to a motor control 708d, which provides a drive signal to the motor 704d. The output shaft of the motor 704d is coupled to a torque sensor 744d. The torque sensor 744d is coupled to a transmission 706d which is coupled to an articulation member 742a. The transmission 706d comprises movable mechanical elements such as articulation elements to control the articulation of the end effector 702 through a range of ±65°. In one aspect, the motor 704d is coupled to an articulation nut, which is rotatably journaled on the proximal end portion of the distal spine portion and is rotatably driven thereon by an articulation gear assembly. The torque sensor 744d provides an articulation force feedback signal to the control circuit 710. The articulation force feedback signal represents the articulation force applied to the end effector 702. Sensors 738, such as an articulation encoder, may provide the articulation position of the end effector 702 to the control circuit 710.


In another aspect, the articulation function of the robotic surgical instrument 700 may comprise two articulation members, or links, 742a, 742b. These articulation members 742a, 742b are driven by separate disks on the robot interface (the rack), which are driven by the two motors 704d, 704e. When the separate firing motor 704a is provided, each of the articulation links 742a, 742b can be antagonistically driven with respect to the other link in order to provide a resistive holding motion and a load to the head when it is not moving and to provide an articulation motion as the head is articulated. The articulation members 742a, 742b attach to the head at a fixed radius as the head is rotated. Accordingly, the mechanical advantage of the push-and-pull link changes as the head is rotated. This change in the mechanical advantage may be more pronounced with other articulation link drive systems.
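
A rough sketch of this antagonistic behavior (with hypothetical gain and preload values, and no claim to the actual drive kinematics) is shown below:

    # Hypothetical sketch: antagonistic commands for two articulation links.
    # One link is driven toward the commanded angle while the other holds an
    # opposing preload so the head resists external loads when stationary.
    # The gain and preload values are assumed.
    def articulation_commands(angle_error_deg: float,
                              preload: float = 0.1,
                              gain: float = 0.02):
        drive = gain * angle_error_deg
        # Whichever link pulls toward the target gets the drive effort; the
        # other keeps only the preload, so the pair stays in tension.
        link_a = max(drive, 0.0) + preload
        link_b = max(-drive, 0.0) + preload
        return link_a, link_b

    print(articulation_commands(30.0))   # -> (0.7, 0.1)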


In one aspect, the one or more motors 704a-704e may comprise a brushed DC motor with a gearbox and mechanical links to a firing member, closure member, or articulation member. Another example includes electric motors 704a-704e that operate the movable mechanical elements such as the displacement member, articulation links, closure tube, and shaft. An outside influence is an unmeasured, unpredictable influence of things like tissue, surrounding bodies, and friction on the physical system. Such outside influence can be referred to as drag, which acts in opposition to one of the electric motors 704a-704e. The outside influence, such as drag, may cause the operation of the physical system to deviate from a desired operation of the physical system.


In one aspect, the position sensor 734 may be implemented as an absolute positioning system. In one aspect, the position sensor 734 may comprise a magnetic rotary absolute positioning system implemented as an AS5055EQFT single-chip magnetic rotary position sensor available from Austria Microsystems, AG. The position sensor 734 may interface with the control circuit 710 to provide an absolute positioning system. The position sensor 734 may include multiple Hall-effect elements located above a magnet and coupled to a CORDIC processor (coordinate rotation digital computer), which implements the digit-by-digit method, also known as Volder's algorithm, a simple and efficient algorithm for calculating hyperbolic and trigonometric functions using only addition, subtraction, bitshift, and table lookup operations.
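
For readers unfamiliar with CORDIC, the following sketch shows the vectoring-mode iteration in its textbook form, recovering an angle from a sine/cosine pair using only add, subtract, power-of-two scaling, and a small arctangent table; it is a generic illustration, not the AS5055's actual implementation:

    import math

    # Textbook CORDIC (Volder's algorithm), vectoring mode: returns
    # atan(y/x) for x > 0. Only additions, subtractions, power-of-two
    # scalings (bitshifts in fixed point), and a lookup table are needed.
    ATAN_TABLE = [math.atan(2.0 ** -i) for i in range(20)]

    def cordic_atan(x: float, y: float) -> float:
        angle = 0.0
        for i, atan_i in enumerate(ATAN_TABLE):
            d = -1.0 if y > 0 else 1.0          # rotate toward y == 0
            x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
            angle -= d * atan_i
        return angle

    print(round(cordic_atan(1.0, 1.0), 5))      # -> 0.7854 (pi/4)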


In one aspect, the control circuit 710 may be in communication with one or more sensors 738. The sensors 738 may be positioned on the end effector 702 and adapted to operate with the robotic surgical instrument 700 to measure the various derived parameters such as the gap distance versus time, tissue compression versus time, and anvil strain versus time. The sensors 738 may comprise a magnetic sensor, a magnetic field sensor, a strain gauge, a load cell, a pressure sensor, a force sensor, a torque sensor, an inductive sensor such as an eddy current sensor, a resistive sensor, a capacitive sensor, an optical sensor, and/or any other suitable sensor for measuring one or more parameters of the end effector 702. The sensors 738 may include one or more sensors. The sensors 738 may be located on the staple cartridge 718 deck to determine tissue location using segmented electrodes. The torque sensors 744a-744e may be configured to sense force such as firing force, closure force, and/or articulation force, among others. Accordingly, the control circuit 710 can sense (1) the closure load experienced by the distal closure tube and its position, (2) the firing member at the rack and its position, (3) what portion of the staple cartridge 718 has tissue on it, and (4) the load and position on both articulation rods.


In one aspect, the one or more sensors 738 may comprise a strain gauge, such as a micro-strain gauge, configured to measure the magnitude of the strain in the anvil 716 during a clamped condition. The strain gauge provides an electrical signal whose amplitude varies with the magnitude of the strain. The sensors 738 may comprise a pressure sensor configured to detect a pressure generated by the presence of compressed tissue between the anvil 716 and the staple cartridge 718. The sensors 738 may be configured to detect impedance of a tissue section located between the anvil 716 and the staple cartridge 718 that is indicative of the thickness and/or fullness of tissue located therebetween.


In one aspect, the sensors 738 may be implemented as one or more limit switches, electromechanical devices, solid-state switches, Hall-effect devices, magneto-resistive (MR) devices, giant magneto-resistive (GMR) devices, magnetometers, among others. In other implementations, the sensors 738 may be implemented as solid-state switches that operate under the influence of light, such as optical sensors, IR sensors, ultraviolet sensors, among others. Still, the switches may be solid-state devices such as transistors (e.g., FET, junction FET, MOSFET, bipolar, and the like). In other implementations, the sensors 738 may include electrical conductorless switches, ultrasonic switches, accelerometers, and inertial sensors, among others.


In one aspect, the sensors 738 may be configured to measure forces exerted on the anvil 716 by the closure drive system. For example, one or more sensors 738 can be at an interaction point between the closure tube and the anvil 716 to detect the closure forces applied by the closure tube to the anvil 716. The forces exerted on the anvil 716 can be representative of the tissue compression experienced by the tissue section captured between the anvil 716 and the staple cartridge 718. The one or more sensors 738 can be positioned at various interaction points along the closure drive system to detect the closure forces applied to the anvil 716 by the closure drive system. The one or more sensors 738 may be sampled in real time during a clamping operation by the processor of the control circuit 710. The control circuit 710 receives real-time sample measurements to provide and analyze time-based information and assess, in real time, closure forces applied to the anvil 716.
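
A brief sketch of such a real-time sampling loop (sample rate, duration, and force limit all assumed, with the sensor faked as a callable) might be:

    # Hypothetical sketch: sample a closure-force sensor in real time during
    # clamping, keep a time-based record, and assess the force against a
    # limit. Rate, duration, and limit are assumed values.
    def monitor_clamping(read_force, duration_s=1.0, rate_hz=100,
                         force_limit_n=60.0):
        samples = []
        for k in range(int(duration_s * rate_hz)):
            t = k / rate_hz
            f = read_force()
            samples.append((t, f))       # time-based force information
            if f > force_limit_n:        # real-time assessment
                return samples, "limit exceeded"
        return samples, "ok"

    # Toy usage with a slowly ramping force source:
    counter = iter(range(10_000))
    samples, status = monitor_clamping(lambda: next(counter) * 0.1)
    print(status, len(samples))          # -> ok 100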


In one aspect, a current sensor 736 can be employed to measure the current drawn by each of the motors 704a-704e. The force required to advance any of the movable mechanical elements such as the I-beam 714 corresponds to the current drawn by one of the motors 704a-704e. The force is converted to a digital signal and provided to the control circuit 710. The control circuit 710 can be configured to simulate the response of the actual system of the instrument in the software of the controller. A displacement member can be actuated to move an I-beam 714 in the end effector 702 at or near a target velocity. The robotic surgical instrument 700 can include a feedback controller, which can be any suitable feedback controller, including, but not limited to, a PID controller, a state feedback controller, a linear-quadratic regulator (LQR), and/or an adaptive controller, for example. The robotic surgical instrument 700 can include a power source to convert the signal from the feedback controller into a physical input such as case voltage, PWM voltage, frequency modulated voltage, current, torque, and/or force, for example. Additional details are disclosed in U.S. patent application Ser. No. 15/636,829, titled CLOSED LOOP VELOCITY CONTROL TECHNIQUES FOR ROBOTIC SURGICAL INSTRUMENT, filed Jun. 29, 2017, which is herein incorporated by reference in its entirety.
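
Since the passage names a PID controller as one option, here is a generic PID sketch together with an assumed current-to-force conversion; the gains and the scale factor are illustrative, not values from the referenced application:

    # Hypothetical sketch: a generic PID velocity controller plus a motor
    # current reading used as a proxy for firing force. All gains and the
    # current-to-force factor are assumed.
    class PID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, setpoint, measured):
            error = setpoint - measured
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return (self.kp * error + self.ki * self.integral
                    + self.kd * derivative)

    CURRENT_TO_FORCE_N_PER_A = 8.0       # assumed drive-train constant

    pid = PID(kp=0.6, ki=0.2, kd=0.01, dt=0.001)
    command = pid.update(setpoint=10.0, measured=9.2)    # mm/s velocities
    force_n = CURRENT_TO_FORCE_N_PER_A * 1.5             # 1.5 A measured
    print(round(command, 2), force_n)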



FIG. 18 illustrates a block diagram of a surgical instrument 750 programmed to control the distal translation of a displacement member according to one aspect of this disclosure. In one aspect, the surgical instrument 750 is programmed to control the distal translation of a displacement member such as the I-beam 764. The surgical instrument 750 comprises an end effector 752 that may comprise an anvil 766, an I-beam 764 (including a sharp cutting edge), and a removable staple cartridge 768.


The position, movement, displacement, and/or translation of a linear displacement member, such as the I-beam 764, can be measured by an absolute positioning system, sensor arrangement, and position sensor 784. Because the I-beam 764 is coupled to a longitudinally movable drive member, the position of the I-beam 764 can be determined by measuring the position of the longitudinally movable drive member employing the position sensor 784. Accordingly, in the following description, the position, displacement, and/or translation of the I-beam 764 can be determined by the position sensor 784 as described herein. A control circuit 760 may be programmed to control the translation of the displacement member, such as the I-beam 764. The control circuit 760, in some examples, may comprise one or more microcontrollers, microprocessors, or other suitable processors for executing instructions that cause the processor or processors to control the displacement member, e.g., the I-beam 764, in the manner described. In one aspect, a timer/counter 781 provides an output signal, such as the elapsed time or a digital count, to the control circuit 760 to correlate the position of the I-beam 764 as determined by the position sensor 784 with the output of the timer/counter 781 such that the control circuit 760 can determine the position of the I-beam 764 at a specific time (t) relative to a starting position. The timer/counter 781 may be configured to measure elapsed time, count external events, or time external events.


The control circuit 760 may generate a motor set point signal 772. The motor set point signal 772 may be provided to a motor controller 758. The motor controller 758 may comprise one or more circuits configured to provide a motor drive signal 774 to the motor 754 to drive the motor 754 as described herein. In some examples, the motor 754 may be a brushed DC electric motor. For example, the velocity of the motor 754 may be proportional to the motor drive signal 774. In some examples, the motor 754 may be a brushless DC electric motor and the motor drive signal 774 may comprise a PWM signal provided to one or more stator windings of the motor 754. Also, in some examples, the motor controller 758 may be omitted, and the control circuit 760 may generate the motor drive signal 774 directly.


The motor 754 may receive power from an energy source 762. The energy source 762 may be or include a battery, a super capacitor, or any other suitable energy source. The motor 754 may be mechanically coupled to the I-beam 764 via a transmission 756. The transmission 756 may include one or more gears or other linkage components to couple the motor 754 to the I-beam 764. A position sensor 784 may sense a position of the I-beam 764. The position sensor 784 may be or include any type of sensor that is capable of generating position data that indicate a position of the I-beam 764. In some examples, the position sensor 784 may include an encoder configured to provide a series of pulses to the control circuit 760 as the I-beam 764 translates distally and proximally. The control circuit 760 may track the pulses to determine the position of the I-beam 764. Other suitable position sensors may be used, including, for example, a proximity sensor. Other types of position sensors may provide other signals indicating motion of the I-beam 764. Also, in some examples, the position sensor 784 may be omitted. Where the motor 754 is a stepper motor, the control circuit 760 may track the position of the I-beam 764 by aggregating the number and direction of steps that the motor 754 has been instructed to execute. The position sensor 784 may be located in the end effector 752 or at any other portion of the instrument.


The control circuit 760 may be in communication with one or more sensors 788. The sensors 788 may be positioned on the end effector 752 and adapted to operate with the surgical instrument 750 to measure the various derived parameters such as gap distance versus time, tissue compression versus time, and anvil strain versus time. The sensors 788 may comprise a magnetic sensor, a magnetic field sensor, a strain gauge, a pressure sensor, a force sensor, an inductive sensor such as an eddy current sensor, a resistive sensor, a capacitive sensor, an optical sensor, and/or any other suitable sensor for measuring one or more parameters of the end effector 752. The sensors 788 may include one or more sensors.


The one or more sensors 788 may comprise a strain gauge, such as a micro-strain gauge, configured to measure the magnitude of the strain in the anvil 766 during a clamped condition. The strain gauge provides an electrical signal whose amplitude varies with the magnitude of the strain. The sensors 788 may comprise a pressure sensor configured to detect a pressure generated by the presence of compressed tissue between the anvil 766 and the staple cartridge 768. The sensors 788 may be configured to detect impedance of a tissue section located between the anvil 766 and the staple cartridge 768 that is indicative of the thickness and/or fullness of tissue located therebetween.


The sensors 788 may be configured to measure forces exerted on the anvil 766 by a closure drive system. For example, one or more sensors 788 can be at an interaction point between a closure tube and the anvil 766 to detect the closure forces applied by a closure tube to the anvil 766. The forces exerted on the anvil 766 can be representative of the tissue compression experienced by the tissue section captured between the anvil 766 and the staple cartridge 768. The one or more sensors 788 can be positioned at various interaction points along the closure drive system to detect the closure forces applied to the anvil 766 by the closure drive system. The one or more sensors 788 may be sampled in real time during a clamping operation by a processor of the control circuit 760. The control circuit 760 receives real-time sample measurements to provide and analyze time-based information and assess, in real time, closure forces applied to the anvil 766.


A current sensor 786 can be employed to measure the current drawn by the motor 754. The force required to advance the I-beam 764 corresponds to the current drawn by the motor 754. The force is converted to a digital signal and provided to the control circuit 760.


The control circuit 760 can be configured to simulate the response of the actual system of the instrument in the software of the controller. A displacement member can be actuated to move an I-beam 764 in the end effector 752 at or near a target velocity. The surgical instrument 750 can include a feedback controller, which can be any suitable feedback controller, including, but not limited to, a PID controller, a state feedback controller, an LQR, and/or an adaptive controller, for example. The surgical instrument 750 can include a power source to convert the signal from the feedback controller into a physical input such as case voltage, PWM voltage, frequency modulated voltage, current, torque, and/or force, for example.


The actual drive system of the surgical instrument 750 is configured to drive the displacement member, cutting member, or I-beam 764, by a brushed DC motor with a gearbox and mechanical links to an articulation and/or knife system. Another example is the electric motor 754 that operates the displacement member and the articulation driver, for example, of an interchangeable shaft assembly. An outside influence is an unmeasured, unpredictable influence of things like tissue, surrounding bodies, and friction on the physical system. Such outside influence can be referred to as drag, which acts in opposition to the electric motor 754. The outside influence, such as drag, may cause the operation of the physical system to deviate from a desired operation of the physical system.


Various example aspects are directed to a surgical instrument 750 comprising an end effector 752 with motor-driven surgical stapling and cutting implements. For example, a motor 754 may drive a displacement member distally and proximally along a longitudinal axis of the end effector 752. The end effector 752 may comprise a pivotable anvil 766 and, when configured for use, a staple cartridge 768 positioned opposite the anvil 766. A clinician may grasp tissue between the anvil 766 and the staple cartridge 768, as described herein. When ready to use the instrument 750, the clinician may provide a firing signal, for example by depressing a trigger of the instrument 750. In response to the firing signal, the motor 754 may drive the displacement member distally along the longitudinal axis of the end effector 752 from a proximal stroke begin position to a stroke end position distal of the stroke begin position. As the displacement member translates distally, an I-beam 764, with a cutting element positioned at a distal end, may cut the tissue between the staple cartridge 768 and the anvil 766.


In various examples, the surgical instrument 750 may comprise a control circuit 760 programmed to control the distal translation of the displacement member, such as the I-beam 764, for example, based on one or more tissue conditions. The control circuit 760 may be programmed to sense tissue conditions, such as thickness, either directly or indirectly, as described herein. The control circuit 760 may be programmed to select a firing control program based on tissue conditions. A firing control program may describe the distal motion of the displacement member. Different firing control programs may be selected to better treat different tissue conditions. For example, when thicker tissue is present, the control circuit 760 may be programmed to translate the displacement member at a lower velocity and/or with lower power. When thinner tissue is present, the control circuit 760 may be programmed to translate the displacement member at a higher velocity and/or with higher power.


In some examples, the control circuit 760 may initially operate the motor 754 in an open loop configuration for a first open loop portion of a stroke of the displacement member. Based on a response of the instrument 750 during the open loop portion of the stroke, the control circuit 760 may select a firing control program. The response of the instrument may include a translation distance of the displacement member during the open loop portion, a time elapsed during the open loop portion, energy provided to the motor 754 during the open loop portion, a sum of pulse widths of a motor drive signal, etc. After the open loop portion, the control circuit 760 may implement the selected firing control program for a second portion of the displacement member stroke. For example, during the closed loop portion of the stroke, the control circuit 760 may modulate the motor 754 based on translation data describing a position of the displacement member in a closed loop manner to translate the displacement member at a constant velocity. Additional details are disclosed in U.S. patent application Ser. No. 15/720,852, titled SYSTEM AND METHODS FOR CONTROLLING A DISPLAY OF A SURGICAL INSTRUMENT, filed Sep. 29, 2017, which is herein incorporated by reference in its entirety.



FIG. 19 is a schematic diagram of a surgical instrument 790 configured to control various functions according to one aspect of this disclosure. In one aspect, the surgical instrument 790 is programmed to control distal translation of a displacement member such as the I-beam 764. The surgical instrument 790 comprises an end effector 792 that may comprise an anvil 766, an I-beam 764, and a removable staple cartridge 768 which may be interchanged with an RF cartridge 796 (shown in dashed line).


In one aspect, sensors 788 may be implemented as one or more limit switches, electromechanical devices, solid-state switches, Hall-effect devices, MR devices, GMR devices, magnetometers, among others. In other implementations, the sensors 788 may be solid-state switches that operate under the influence of light, such as optical sensors, IR sensors, ultraviolet sensors, among others. Still, the switches may be solid-state devices such as transistors (e.g., FET, junction FET, MOSFET, bipolar, and the like). In other implementations, the sensors 788 may include electrical conductorless switches, ultrasonic switches, accelerometers, and inertial sensors, among others.


In one aspect, the position sensor 784 may be implemented as an absolute positioning system comprising a magnetic rotary absolute positioning system implemented as an AS5055EQFT single-chip magnetic rotary position sensor available from Austria Microsystems, AG. The position sensor 784 may interface with the control circuit 760 to provide an absolute positioning system. The position sensor 784 may include multiple Hall-effect elements located above a magnet and coupled to a CORDIC processor (coordinate rotation digital computer), which implements the digit-by-digit method, also known as Volder's algorithm, a simple and efficient algorithm for calculating hyperbolic and trigonometric functions using only addition, subtraction, bitshift, and table lookup operations.


In one aspect, the I-beam 764 may be implemented as a knife member comprising a knife body that operably supports a tissue cutting blade thereon and may further include anvil engagement tabs or features and channel engagement features or a foot. In one aspect, the staple cartridge 768 may be implemented as a standard (mechanical) surgical fastener cartridge. In one aspect, the RF cartridge 796 may be implemented as a cartridge configured to deliver RF energy to tissue. These and other sensor arrangements are described in commonly-owned U.S. patent application Ser. No. 15/628,175, titled TECHNIQUES FOR ADAPTIVE CONTROL OF MOTOR VELOCITY OF A SURGICAL STAPLING AND CUTTING INSTRUMENT, filed Jun. 20, 2017, which is herein incorporated by reference in its entirety.


The position, movement, displacement, and/or translation of a linear displacement member, such as the I-beam 764, can be measured by an absolute positioning system, sensor arrangement, and position sensor represented as position sensor 784. Because the I-beam 764 is coupled to the longitudinally movable drive member, the position of the I-beam 764 can be determined by measuring the position of the longitudinally movable drive member employing the position sensor 784. Accordingly, in the following description, the position, displacement, and/or translation of the I-beam 764 can be determined by the position sensor 784 as described herein. A control circuit 760 may be programmed to control the translation of the displacement member, such as the I-beam 764, as described herein. The control circuit 760, in some examples, may comprise one or more microcontrollers, microprocessors, or other suitable processors for executing instructions that cause the processor or processors to control the displacement member, e.g., the I-beam 764, in the manner described. In one aspect, a timer/counter 781 provides an output signal, such as the elapsed time or a digital count, to the control circuit 760 to correlate the position of the I-beam 764 as determined by the position sensor 784 with the output of the timer/counter 781 such that the control circuit 760 can determine the position of the I-beam 764 at a specific time (t) relative to a starting position. The timer/counter 781 may be configured to measure elapsed time, count external events, or time external events.


The control circuit 760 may generate a motor set point signal 772. The motor set point signal 772 may be provided to a motor controller 758. The motor controller 758 may comprise one or more circuits configured to provide a motor drive signal 774 to the motor 754 to drive the motor 754 as described herein. In some examples, the motor 754 may be a brushed DC electric motor. For example, the velocity of the motor 754 may be proportional to the motor drive signal 774. In some examples, the motor 754 may be a brushless DC electric motor and the motor drive signal 774 may comprise a PWM signal provided to one or more stator windings of the motor 754. Also, in some examples, the motor controller 758 may be omitted, and the control circuit 760 may generate the motor drive signal 774 directly.


The motor 754 may receive power from an energy source 762. The energy source 762 may be or include a battery, a super capacitor, or any other suitable energy source. The motor 754 may be mechanically coupled to the I-beam 764 via a transmission 756. The transmission 756 may include one or more gears or other linkage components to couple the motor 754 to the I-beam 764. A position sensor 784 may sense a position of the I-beam 764. The position sensor 784 may be or include any type of sensor that is capable of generating position data that indicate a position of the I-beam 764. In some examples, the position sensor 784 may include an encoder configured to provide a series of pulses to the control circuit 760 as the I-beam 764 translates distally and proximally. The control circuit 760 may track the pulses to determine the position of the I-beam 764. Other suitable position sensors may be used, including, for example, a proximity sensor. Other types of position sensors may provide other signals indicating motion of the I-beam 764. Also, in some examples, the position sensor 784 may be omitted. Where the motor 754 is a stepper motor, the control circuit 760 may track the position of the I-beam 764 by aggregating the number and direction of steps that the motor has been instructed to execute. The position sensor 784 may be located in the end effector 792 or at any other portion of the instrument.


The control circuit 760 may be in communication with one or more sensors 788. The sensors 788 may be positioned on the end effector 792 and adapted to operate with the surgical instrument 790 to measure the various derived parameters such as gap distance versus time, tissue compression versus time, and anvil strain versus time. The sensors 788 may comprise a magnetic sensor, a magnetic field sensor, a strain gauge, a pressure sensor, a force sensor, an inductive sensor such as an eddy current sensor, a resistive sensor, a capacitive sensor, an optical sensor, and/or any other suitable sensor for measuring one or more parameters of the end effector 792. The sensors 788 may include one or more sensors.


The one or more sensors 788 may comprise a strain gauge, such as a micro-strain gauge, configured to measure the magnitude of the strain in the anvil 766 during a clamped condition. The strain gauge provides an electrical signal whose amplitude varies with the magnitude of the strain. The sensors 788 may comprise a pressure sensor configured to detect a pressure generated by the presence of compressed tissue between the anvil 766 and the staple cartridge 768. The sensors 788 may be configured to detect impedance of a tissue section located between the anvil 766 and the staple cartridge 768 that is indicative of the thickness and/or fullness of tissue located therebetween.


The sensors 788 may be configured to measure forces exerted on the anvil 766 by the closure drive system. For example, one or more sensors 788 can be at an interaction point between a closure tube and the anvil 766 to detect the closure forces applied by a closure tube to the anvil 766. The forces exerted on the anvil 766 can be representative of the tissue compression experienced by the tissue section captured between the anvil 766 and the staple cartridge 768. The one or more sensors 788 can be positioned at various interaction points along the closure drive system to detect the closure forces applied to the anvil 766 by the closure drive system. The one or more sensors 788 may be sampled in real time during a clamping operation by a processor portion of the control circuit 760. The control circuit 760 receives real-time sample measurements to provide and analyze time-based information and assess, in real time, closure forces applied to the anvil 766.


A current sensor 786 can be employed to measure the current drawn by the motor 754. The force required to advance the I-beam 764 corresponds to the current drawn by the motor 754. The force is converted to a digital signal and provided to the control circuit 760.


An RF energy source 794 is coupled to the end effector 792 and is applied to the RF cartridge 796 when the RF cartridge 796 is loaded in the end effector 792 in place of the staple cartridge 768. The control circuit 760 controls the delivery of the RF energy to the RF cartridge 796.


Additional details are disclosed in U.S. patent application Ser. No. 15/636,096, titled SURGICAL SYSTEM COUPLABLE WITH STAPLE CARTRIDGE AND RADIO FREQUENCY CARTRIDGE, AND METHOD OF USING SAME, filed Jun. 28, 2017, which is herein incorporated by reference in its entirety.


Generator Hardware


FIG. 20 is a simplified block diagram of a generator 800 configured to provide inductorless tuning, among other benefits. Additional details of the generator 800 are described in U.S. Pat. No. 9,060,775, titled SURGICAL GENERATOR FOR ULTRASONIC AND ELECTROSURGICAL DEVICES, which issued on Jun. 23, 2015, which is herein incorporated by reference in its entirety. The generator 800 may comprise a patient isolated stage 802 in communication with a non-isolated stage 804 via a power transformer 806. A secondary winding 808 of the power transformer 806 is contained in the isolated stage 802 and may comprise a tapped configuration (e.g., a center-tapped or a non-center-tapped configuration) to define drive signal outputs 810a, 810b, 810c for delivering drive signals to different surgical instruments, such as, for example, an ultrasonic surgical instrument, an RF electrosurgical instrument, and a multifunction surgical instrument which includes ultrasonic and RF energy modes that can be delivered alone or simultaneously. In particular, drive signal outputs 810a, 810c may output an ultrasonic drive signal (e.g., a 420V root-mean-square (RMS) drive signal) to an ultrasonic surgical instrument, and drive signal outputs 810b, 810c may output an RF electrosurgical drive signal (e.g., a 100V RMS drive signal) to an RF electrosurgical instrument, with the drive signal output 810b corresponding to the center tap of the power transformer 806.


In certain forms, the ultrasonic and electrosurgical drive signals may be provided simultaneously to distinct surgical instruments and/or to a single surgical instrument, such as the multifunction surgical instrument, having the capability to deliver both ultrasonic and electrosurgical energy to tissue. It will be appreciated that the electrosurgical signal, provided either to a dedicated electrosurgical instrument and/or to a combined multifunction ultrasonic/electrosurgical instrument, may be either a therapeutic or sub-therapeutic level signal, where the sub-therapeutic signal can be used, for example, to monitor tissue or instrument conditions and provide feedback to the generator. For example, the ultrasonic and RF signals can be delivered separately or simultaneously from a generator with a single output port in order to provide the desired output signal to the surgical instrument, as will be discussed in more detail below. Accordingly, the generator can combine the ultrasonic and electrosurgical RF energies and deliver the combined energies to the multifunction ultrasonic/electrosurgical instrument. Bipolar electrodes can be placed on one or both jaws of the end effector. One jaw may be driven by ultrasonic energy in addition to electrosurgical RF energy, working simultaneously. The ultrasonic energy may be employed to dissect tissue, while the electrosurgical RF energy may be employed for vessel sealing.


The non-isolated stage 804 may comprise a power amplifier 812 having an output connected to a primary winding 814 of the power transformer 806. In certain forms, the power amplifier 812 may comprise a push-pull amplifier. For example, the non-isolated stage 804 may further comprise a logic device 816 for supplying a digital output to a digital-to-analog converter (DAC) circuit 818, which in turn supplies a corresponding analog signal to an input of the power amplifier 812. In certain forms, the logic device 816 may comprise a programmable gate array (PGA), an FPGA, a programmable logic device (PLD), among other logic circuits, for example. The logic device 816, by virtue of controlling the input of the power amplifier 812 via the DAC circuit 818, may therefore control any of a number of parameters (e.g., frequency, waveform shape, waveform amplitude) of drive signals appearing at the drive signal outputs 810a, 810b, 810c. In certain forms and as discussed below, the logic device 816, in conjunction with a processor (e.g., a DSP discussed below), may implement a number of DSP-based and/or other control algorithms to control parameters of the drive signals output by the generator 800.


Power may be supplied to a power rail of the power amplifier 812 by a switch-mode regulator 820, e.g., a power converter. In certain forms, the switch-mode regulator 820 may comprise an adjustable buck regulator, for example. The non-isolated stage 804 may further comprise a first processor 822, which in one form may comprise a DSP processor such as an Analog Devices ADSP-21469 SHARC DSP, available from Analog Devices, Norwood, Mass., for example, although in various forms any suitable processor may be employed. In certain forms, the DSP processor 822 may control the operation of the switch-mode regulator 820 responsive to voltage feedback data received from the power amplifier 812 by the DSP processor 822 via an ADC circuit 824. In one form, for example, the DSP processor 822 may receive as input, via the ADC circuit 824, the waveform envelope of a signal (e.g., an RF signal) being amplified by the power amplifier 812. The DSP processor 822 may then control the switch-mode regulator 820 (e.g., via a PWM output) such that the rail voltage supplied to the power amplifier 812 tracks the waveform envelope of the amplified signal. By dynamically modulating the rail voltage of the power amplifier 812 based on the waveform envelope, the efficiency of the power amplifier 812 may be significantly improved relative to a fixed rail voltage amplifier scheme.
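
A compact sketch of this envelope-tracking idea (headroom and rail limits assumed) is:

    # Hypothetical sketch: set the switch-mode regulator's rail voltage to
    # track the measured waveform envelope plus a small headroom, so the
    # amplifier avoids clipping without burning power at a fixed high rail.
    # Headroom and limits are assumed values.
    def rail_voltage_setpoint(envelope_v: float,
                              headroom_v: float = 5.0,
                              rail_min_v: float = 10.0,
                              rail_max_v: float = 150.0) -> float:
        target = envelope_v + headroom_v    # follow the envelope
        return min(max(target, rail_min_v), rail_max_v)

    print(rail_voltage_setpoint(48.0))      # -> 53.0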


In certain forms, the logic device 816, in conjunction with the DSP processor 822, may implement a digital synthesis circuit such as a direct digital synthesis (DDS) control scheme to control the waveform shape, frequency, and/or amplitude of drive signals output by the generator 800. In one form, for example, the logic device 816 may implement a DDS control algorithm by recalling waveform samples stored in a dynamically updated lookup table (LUT), such as a RAM LUT, which may be embedded in an FPGA. This control algorithm is particularly useful for ultrasonic applications in which an ultrasonic transducer may be driven by a clean sinusoidal current at its resonant frequency. Because other frequencies may excite parasitic resonances, minimizing or reducing the total distortion of the motional branch current may correspondingly minimize or reduce undesirable resonance effects. Because the waveform shape of a drive signal output by the generator 800 is impacted by various sources of distortion present in the output drive circuit (e.g., the power transformer 806, the power amplifier 812), voltage and current feedback data based on the drive signal may be input into an algorithm, such as an error control algorithm implemented by the DSP processor 822, which compensates for distortion by suitably pre-distorting or modifying the waveform samples stored in the LUT on a dynamic, ongoing basis (e.g., in real time). In one form, the amount or degree of pre-distortion applied to the LUT samples may be based on the error between a computed motional branch current and a desired current waveform shape, with the error being determined on a sample-by-sample basis. In this way, the pre-distorted LUT samples, when processed through the drive circuit, may result in a motional branch drive signal having the desired waveform shape (e.g., sinusoidal) for optimally driving the ultrasonic transducer. In such forms, the LUT waveform samples will therefore not represent the desired waveform shape of the drive signal, but rather the waveform shape that is required to ultimately produce the desired waveform shape of the motional branch drive signal when distortion effects are taken into account.
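
To make the LUT-based scheme concrete, the following sketch pairs a phase-accumulator DDS step with a feedback-driven nudge of one LUT sample; the table size, accumulator width, and update gain are assumptions, and the real error computation described above is considerably more involved:

    import math

    # Hypothetical sketch: direct digital synthesis from a dynamically
    # updated lookup table. A phase accumulator indexes the LUT; a simple
    # error-driven update pre-distorts individual samples. All sizes and
    # gains are assumed for illustration.
    LUT_SIZE = 256
    lut = [math.sin(2 * math.pi * n / LUT_SIZE) for n in range(LUT_SIZE)]

    def dds_step(phase_acc, tuning_word):
        """Advance the 24-bit phase accumulator; return it and the sample."""
        phase_acc = (phase_acc + tuning_word) % (LUT_SIZE << 16)
        return phase_acc, lut[phase_acc >> 16]

    def predistort(index, desired, measured, gain=0.05):
        """Nudge one LUT sample toward cancelling the measured error."""
        lut[index] += gain * (desired - measured)

    phase = 0
    for _ in range(8):
        phase, sample = dds_step(phase, tuning_word=3 << 16)
    predistort(index=24, desired=0.556, measured=0.540)   # feedback update
    print(round(sample, 4))                               # -> 0.5556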


The non-isolated stage 804 may further comprise a first ADC circuit 826 and a second ADC circuit 828 coupled to the output of the power transformer 806 via respective isolation transformers 830, 832 for respectively sampling the voltage and current of drive signals output by the generator 800. In certain forms, the ADC circuits 826, 828 may be configured to sample at high speeds (e.g., 80 mega samples per second (MSPS)) to enable oversampling of the drive signals. In one form, for example, the sampling speed of the ADC circuits 826, 828 may enable approximately 200× (depending on frequency) oversampling of the drive signals. In certain forms, the sampling operations of the ADC circuits 826, 828 may be performed by a single ADC circuit receiving input voltage and current signals via a two-way multiplexer. The use of high-speed sampling in forms of the generator 800 may enable, among other things, calculation of the complex current flowing through the motional branch (which may be used in certain forms to implement DDS-based waveform shape control described above), accurate digital filtering of the sampled signals, and calculation of real power consumption with a high degree of precision. Voltage and current feedback data output by the ADC circuits 826, 828 may be received and processed (e.g., via a first-in-first-out (FIFO) buffer and multiplexer) by the logic device 816 and stored in data memory for subsequent retrieval by, for example, the DSP processor 822. As noted above, voltage and current feedback data may be used as input to an algorithm for pre-distorting or modifying LUT waveform samples on a dynamic and ongoing basis. In certain forms, this may require each stored voltage and current feedback data pair to be indexed based on, or otherwise associated with, a corresponding LUT sample that was output by the logic device 816 when the voltage and current feedback data pair was acquired. Synchronization of the LUT samples and the voltage and current feedback data in this manner contributes to the correct timing and stability of the pre-distortion algorithm.


In certain forms, the voltage and current feedback data may be used to control the frequency and/or amplitude (e.g., current amplitude) of the drive signals. In one form, for example, voltage and current feedback data may be used to determine impedance phase. The frequency of the drive signal may then be controlled to minimize or reduce the difference between the determined impedance phase and an impedance phase setpoint (e.g., 0°), thereby minimizing or reducing the effects of harmonic distortion and correspondingly enhancing impedance phase measurement accuracy. The determination of impedance phase and a frequency control signal may be implemented in the DSP processor 822, for example, with the frequency control signal being supplied as input to a DDS control algorithm implemented by the logic device 816.
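
A minimal sketch of this phase-locking loop (sign convention, gain, and starting values all assumed) is:

    import cmath

    # Hypothetical sketch: estimate impedance phase from complex voltage and
    # current feedback and step the drive frequency toward the phase
    # setpoint. The gain, sign convention, and numbers are assumed.
    def update_frequency(freq_hz, v, i,
                         phase_setpoint_rad=0.0,
                         gain_hz_per_rad=50.0):
        impedance = v / i                        # complex division
        phase_error = cmath.phase(impedance) - phase_setpoint_rad
        return freq_hz - gain_hz_per_rad * phase_error

    f = update_frequency(55_500.0, v=complex(400, 50), i=complex(2, 0))
    print(round(f, 1))                           # -> 55493.8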


In another form, for example, the current feedback data may be monitored in order to maintain the current amplitude of the drive signal at a current amplitude setpoint. The current amplitude setpoint may be specified directly or determined indirectly based on specified voltage amplitude and power setpoints. In certain forms, control of the current amplitude may be implemented by a control algorithm, such as, for example, a proportional-integral-derivative (PID) control algorithm, in the DSP processor 822. Variables controlled by the control algorithm to suitably control the current amplitude of the drive signal may include, for example, the scaling of the LUT waveform samples stored in the logic device 816 and/or the full-scale output voltage of the DAC circuit 818 (which supplies the input to the power amplifier 812) via a DAC circuit 834.


The non-isolated stage 804 may further comprise a second processor 836 for providing, among other things, user interface (UI) functionality. In one form, the UI processor 836 may comprise an Atmel AT91SAM9263 processor having an ARM 926EJ-S core, available from Atmel Corporation, San Jose, Calif., for example. Examples of UI functionality supported by the UI processor 836 may include audible and visual user feedback, communication with peripheral devices (e.g., via a USB interface), communication with a foot switch, communication with an input device (e.g., a touch screen display), and communication with an output device (e.g., a speaker). The UI processor 836 may communicate with the DSP processor 822 and the logic device 816 (e.g., via SPI buses). Although the UI processor 836 may primarily support UI functionality, it may also coordinate with the DSP processor 822 to implement hazard mitigation in certain forms. For example, the UI processor 836 may be programmed to monitor various aspects of user input and/or other inputs (e.g., touch screen inputs, foot switch inputs, temperature sensor inputs) and may disable the drive output of the generator 800 when an erroneous condition is detected.


In certain forms, both the DSP processor 822 and the UI processor 836, for example, may determine and monitor the operating state of the generator 800. For the DSP processor 822, the operating state of the generator 800 may dictate, for example, which control and/or diagnostic processes are implemented by the DSP processor 822. For the UI processor 836, the operating state of the generator 800 may dictate, for example, which elements of a UI (e.g., display screens, sounds) are presented to a user. The respective DSP and UI processors 822, 836 may independently maintain the current operating state of the generator 800 and recognize and evaluate possible transitions out of the current operating state. The DSP processor 822 may function as the master in this relationship and determine when transitions between operating states are to occur. The UI processor 836 may be aware of valid transitions between operating states and may confirm if a particular transition is appropriate. For example, when the DSP processor 822 instructs the UI processor 836 to transition to a specific state, the UI processor 836 may verify that the requested transition is valid. In the event that a requested transition between states is determined to be invalid by the UI processor 836, the UI processor 836 may cause the generator 800 to enter a failure mode.
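
The following sketch illustrates that division of labor in miniature: one side requests a transition, the other confirms it against a table of valid transitions and falls into a failure mode otherwise (the state names and table are invented for illustration):

    # Hypothetical sketch: a UI-side monitor validates transitions requested
    # by the master controller against a table of allowed moves, entering a
    # failure mode on an invalid request. States are assumed.
    VALID_TRANSITIONS = {
        ("idle", "ready"), ("ready", "delivering"),
        ("delivering", "ready"), ("ready", "idle"),
    }

    class UiStateMonitor:
        def __init__(self, state="idle"):
            self.state = state

        def request_transition(self, new_state):
            if (self.state, new_state) in VALID_TRANSITIONS:
                self.state = new_state
            else:
                self.state = "failure"      # invalid request detected
            return self.state

    ui = UiStateMonitor()
    print(ui.request_transition("ready"))       # -> ready
    print(ui.request_transition("delivering"))  # -> delivering
    print(ui.request_transition("idle"))        # -> failure (not allowed)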


The non-isolated stage 804 may further comprise a controller 838 for monitoring input devices (e.g., a capacitive touch sensor used for turning the generator 800 on and off, a capacitive touch screen). In certain forms, the controller 838 may comprise at least one processor and/or other controller device in communication with the UI processor 836. In one form, for example, the controller 838 may comprise a processor (e.g., an ATmega168 8-bit controller available from Atmel) configured to monitor user input provided via one or more capacitive touch sensors. In one form, the controller 838 may comprise a touch screen controller (e.g., a QT5480 touch screen controller available from Atmel) to control and manage the acquisition of touch data from a capacitive touch screen.


In certain forms, when the generator 800 is in a “power off” state, the controller 838 may continue to receive operating power (e.g., via a line from a power supply of the generator 800, such as the power supply 854 discussed below). In this way, the controller 838 may continue to monitor an input device (e.g., a capacitive touch sensor located on a front panel of the generator 800) for turning the generator 800 on and off. When the generator 800 is in the power off state, the controller 838 may wake the power supply (e.g., enable operation of one or more DC/DC voltage converters 856 of the power supply 854) if activation of the “on/off” input device by a user is detected. The controller 838 may therefore initiate a sequence for transitioning the generator 800 to a “power on” state. Conversely, the controller 838 may initiate a sequence for transitioning the generator 800 to the power off state if activation of the “on/off” input device is detected when the generator 800 is in the power on state. In certain forms, for example, the controller 838 may report activation of the “on/off” input device to the UI processor 836, which in turn implements the necessary process sequence for transitioning the generator 800 to the power off state. In such forms, the controller 838 may have no independent ability for causing the removal of power from the generator 800 after its power on state has been established.


In certain forms, the controller 838 may cause the generator 800 to provide audible or other sensory feedback for alerting the user that a power on or power off sequence has been initiated. Such an alert may be provided at the beginning of a power on or power off sequence and prior to the commencement of other processes associated with the sequence.


In certain forms, the isolated stage 802 may comprise an instrument interface circuit 840 to, for example, provide a communication interface between a control circuit of a surgical instrument (e.g., a control circuit comprising handpiece switches) and components of the non-isolated stage 804, such as, for example, the logic device 816, the DSP processor 822, and/or the UI processor 836. The instrument interface circuit 840 may exchange information with components of the non-isolated stage 804 via a communication link that maintains a suitable degree of electrical isolation between the isolated and non-isolated stages 802, 804, such as, for example, an IR-based communication link. Power may be supplied to the instrument interface circuit 840 using, for example, a low-dropout voltage regulator powered by an isolation transformer driven from the non-isolated stage 804.


In one form, the instrument interface circuit 840 may comprise a logic circuit 842 (e.g., logic circuit, programmable logic circuit, PGA, FPGA, PLD) in communication with a signal conditioning circuit 844. The signal conditioning circuit 844 may be configured to receive a periodic signal from the logic circuit 842 (e.g., a 2 kHz square wave) to generate a bipolar interrogation signal having an identical frequency. The interrogation signal may be generated, for example, using a bipolar current source fed by a differential amplifier. The interrogation signal may be communicated to a surgical instrument control circuit (e.g., by using a conductive pair in a cable that connects the generator 800 to the surgical instrument) and monitored to determine a state or configuration of the control circuit. The control circuit may comprise a number of switches, resistors, and/or diodes to modify one or more characteristics (e.g., amplitude, rectification) of the interrogation signal such that a state or configuration of the control circuit is uniquely discernable based on the one or more characteristics. In one form, for example, the signal conditioning circuit 844 may comprise an ADC circuit for generating samples of a voltage signal appearing across inputs of the control circuit resulting from passage of the interrogation signal therethrough. The logic circuit 842 (or a component of the non-isolated stage 804) may then determine the state or configuration of the control circuit based on the ADC circuit samples.
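

To make the phrase "uniquely discernable" concrete, the sketch below maps amplitude/rectification signatures of the sampled interrogation signal to switch states. The thresholds, state names, and their interpretations are invented for illustration; real values would depend on the actual control circuit.

    # Illustrative decoding of a handpiece control-circuit state from
    # ADC samples of the returned interrogation signal. Thresholds and
    # the state mapping are hypothetical.
    def classify_control_circuit(samples):
        peak_pos = max(samples)
        peak_neg = min(samples)
        if peak_pos < 0.1 and peak_neg > -0.1:
            return "NO_INSTRUMENT"        # signal fully attenuated (illustrative)
        if peak_neg > -0.1:
            return "SWITCH_1_CLOSED"      # a diode clips the negative half
        if peak_pos < 0.1:
            return "SWITCH_2_CLOSED"      # a diode clips the positive half
        return "NO_SWITCH_CLOSED"         # full bipolar signal present

    print(classify_control_circuit([0.9, -0.9, 0.8, -0.85]))  # NO_SWITCH_CLOSED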


In one form, the instrument interface circuit 840 may comprise a first data circuit interface 846 to enable information exchange between the logic circuit 842 (or other element of the instrument interface circuit 840) and a first data circuit disposed in or otherwise associated with a surgical instrument. In certain forms, for example, a first data circuit may be disposed in a cable integrally attached to a surgical instrument handpiece or in an adaptor for interfacing a specific surgical instrument type or model with the generator 800. The first data circuit may be implemented in any suitable manner and may communicate with the generator according to any suitable protocol, including, for example, as described herein with respect to the first data circuit. In certain forms, the first data circuit may comprise a non-volatile storage device, such as an EEPROM device. In certain forms, the first data circuit interface 846 may be implemented separately from the logic circuit 842 and comprise suitable circuitry (e.g., discrete logic devices, a processor) to enable communication between the logic circuit 842 and the first data circuit. In other forms, the first data circuit interface 846 may be integral with the logic circuit 842.


In certain forms, the first data circuit may store information pertaining to the particular surgical instrument with which it is associated. Such information may include, for example, a model number, a serial number, a number of operations in which the surgical instrument has been used, and/or any other type of information. This information may be read by the instrument interface circuit 840 (e.g., by the logic circuit 842) and transferred to a component of the non-isolated stage 804 (e.g., to the logic device 816, the DSP processor 822, and/or the UI processor 836) for presentation to a user via an output device and/or for controlling a function or operation of the generator 800. Additionally, any type of information may be communicated to the first data circuit for storage therein via the first data circuit interface 846 (e.g., using the logic circuit 842). Such information may comprise, for example, an updated number of operations in which the surgical instrument has been used and/or dates and/or times of its usage.


As discussed previously, a surgical instrument may be detachable from a handpiece (e.g., the multifunction surgical instrument may be detachable from the handpiece) to promote instrument interchangeability and/or disposability. In such cases, conventional generators may be limited in their ability to recognize particular instrument configurations being used and to optimize control and diagnostic processes accordingly. The addition of readable data circuits to surgical instruments to address this issue is problematic from a compatibility standpoint, however. For example, designing a surgical instrument to remain backwardly compatible with generators that lack the requisite data reading functionality may be impractical due to, for example, differing signal schemes, design complexity, and cost. Forms of instruments discussed herein address these concerns by using data circuits that may be implemented in existing surgical instruments economically and with minimal design changes to preserve compatibility of the surgical instruments with current generator platforms.


Additionally, forms of the generator 800 may enable communication with instrument-based data circuits. For example, the generator 800 may be configured to communicate with a second data circuit contained in an instrument (e.g., the multifunction surgical instrument). In some forms, the second data circuit may be implemented in a manner similar to that of the first data circuit described herein. The instrument interface circuit 840 may comprise a second data circuit interface 848 to enable this communication. In one form, the second data circuit interface 848 may comprise a tri-state digital interface, although other interfaces may also be used. In certain forms, the second data circuit may generally be any circuit for transmitting and/or receiving data. In one form, for example, the second data circuit may store information pertaining to the particular surgical instrument with which it is associated. Such information may include, for example, a model number, a serial number, a number of operations in which the surgical instrument has been used, and/or any other type of information.


In some forms, the second data circuit may store information about the electrical and/or ultrasonic properties of an associated ultrasonic transducer, end effector, or ultrasonic drive system. For example, the second data circuit may indicate a burn-in frequency slope, as described herein. Additionally or alternatively, any type of information may be communicated to the second data circuit for storage therein via the second data circuit interface 848 (e.g., using the logic circuit 842). Such information may comprise, for example, an updated number of operations in which the instrument has been used and/or dates and/or times of its usage. In certain forms, the second data circuit may transmit data acquired by one or more sensors (e.g., an instrument-based temperature sensor). In certain forms, the second data circuit may receive data from the generator 800 and provide an indication to a user (e.g., a light emitting diode indication or other visible indication) based on the received data.


In certain forms, the second data circuit and the second data circuit interface 848 may be configured such that communication between the logic circuit 842 and the second data circuit can be effected without the need to provide additional conductors for this purpose (e.g., dedicated conductors of a cable connecting a handpiece to the generator 800). In one form, for example, information may be communicated to and from the second data circuit using a one-wire bus communication scheme implemented on existing cabling, such as one of the conductors used to transmit interrogation signals from the signal conditioning circuit 844 to a control circuit in a handpiece. In this way, design changes or modifications to the surgical instrument that might otherwise be necessary are minimized or reduced. Moreover, because different types of communications implemented over a common physical channel can be frequency-band separated, the presence of a second data circuit may be “invisible” to generators that do not have the requisite data reading functionality, thus enabling backward compatibility of the surgical instrument.


In certain forms, the isolated stage 802 may comprise at least one blocking capacitor 850-1 connected to the drive signal output 810b to prevent passage of DC current to a patient. A single blocking capacitor may be required to comply with medical regulations or standards, for example. While failure in single-capacitor designs is relatively uncommon, such failure may nonetheless have negative consequences. In one form, a second blocking capacitor 850-2 may be provided in series with the blocking capacitor 850-1, with current leakage from a point between the blocking capacitors 850-1, 850-2 being monitored by, for example, an ADC circuit 852 for sampling a voltage induced by leakage current. The samples may be received by the logic circuit 842, for example. Based on changes in the leakage current (as indicated by the voltage samples), the generator 800 may determine when at least one of the blocking capacitors 850-1, 850-2 has failed, thus providing a benefit over single-capacitor designs having a single point of failure.
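

A minimal sketch of the leakage-monitoring idea follows; the ADC read interface and the trip threshold are invented assumptions, not values from the generator described above.

    # Illustrative blocking-capacitor health check. The threshold and
    # the read_leakage_voltage() interface are hypothetical.
    LEAKAGE_LIMIT_V = 0.05  # invented trip level

    def blocking_capacitor_failed(read_leakage_voltage, n_samples=64):
        """Average several ADC samples of the voltage induced by
        leakage current at the midpoint between the two capacitors,
        and flag a failure if the average exceeds the limit."""
        avg = sum(read_leakage_voltage() for _ in range(n_samples)) / n_samples
        return abs(avg) > LEAKAGE_LIMIT_V

    # Example with a stubbed ADC that reads a constant 0.2 V leakage.
    print(blocking_capacitor_failed(lambda: 0.2))  # True -> flag failure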


In certain forms, the non-isolated stage 804 may comprise a power supply 854 for delivering DC power at a suitable voltage and current. The power supply may comprise, for example, a 400 W power supply for delivering a 48 VDC system voltage. The power supply 854 may further comprise one or more DC/DC voltage converters 856 for receiving the output of the power supply to generate DC outputs at the voltages and currents required by the various components of the generator 800. As discussed above in connection with the controller 838, one or more of the DC/DC voltage converters 856 may receive an input from the controller 838 when activation of the “on/off” input device by a user is detected by the controller 838 to enable operation of, or wake, the DC/DC voltage converters 856.



FIG. 21 illustrates an example of a generator 900, which is one form of the generator 800 (FIG. 20). The generator 900 is configured to deliver multiple energy modalities to a surgical instrument. The generator 900 provides RF and ultrasonic signals for delivering energy to a surgical instrument, and these signals may be provided alone, in combination, and/or simultaneously. As noted above, at least one generator output can deliver multiple energy modalities (e.g., ultrasonic, bipolar or monopolar RF, irreversible and/or reversible electroporation, and/or microwave energy, among others) through a single port, and these signals can be delivered separately or simultaneously to the end effector to treat tissue.


The generator 900 comprises a processor 902 coupled to a waveform generator 904. The processor 902 and waveform generator 904 are configured to generate a variety of signal waveforms based on information stored in a memory coupled to the processor 902, not shown for clarity of disclosure. The digital information associated with a waveform is provided to the waveform generator 904, which includes one or more DAC circuits to convert the digital input into an analog output. The analog output is fed to an amplifier 906 for signal conditioning and amplification. The conditioned and amplified output of the amplifier 906 is coupled to a power transformer 908. The signals are coupled across the power transformer 908 to the secondary side, which is in the patient isolation side. A first signal of a first energy modality is provided to the surgical instrument between the terminals labeled ENERGY1 and RETURN. A second signal of a second energy modality is coupled across a capacitor 910 and is provided to the surgical instrument between the terminals labeled ENERGY2 and RETURN. It will be appreciated that more than two energy modalities may be output and thus the subscript “n” may be used to designate that up to n ENERGYn terminals may be provided, where n is a positive integer greater than 1. It also will be appreciated that up to “n” return paths RETURNn may be provided without departing from the scope of the present disclosure.
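

The digital portion of this signal path can be pictured as a sample table streamed through a DAC at a fixed rate. The sketch below uses an invented sample rate and drive frequency; the coarse table length is a simplification for illustration only.

    import math

    # Illustrative waveform synthesis: the processor stores one cycle
    # of samples, and the waveform generator's DAC plays it back
    # cyclically. Sample rate and drive frequency are hypothetical.
    SAMPLE_RATE_HZ = 1_000_000
    FREQ_HZ = 55_500          # invented ultrasonic drive frequency
    N = SAMPLE_RATE_HZ // FREQ_HZ  # samples per cycle (approximate)

    wave_table = [math.sin(2 * math.pi * k / N) for k in range(N)]

    def next_sample(tick):
        """Sample the DAC would output on the given clock tick."""
        return wave_table[tick % N]

    print(next_sample(0), next_sample(4))  # two points on the cycle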


A first voltage sensing circuit 912 is coupled across the terminals labeled ENERGY1 and the RETURN path to measure the output voltage therebetween. A second voltage sensing circuit 924 is coupled across the terminals labeled ENERGY2 and the RETURN path to measure the output voltage therebetween. A current sensing circuit 914 is disposed in series with the RETURN leg of the secondary side of the power transformer 908 as shown to measure the output current for either energy modality. If different return paths are provided for each energy modality, then a separate current sensing circuit should be provided in each return leg. The outputs of the first and second voltage sensing circuits 912, 924 are provided to respective isolation transformers 916, 922 and the output of the current sensing circuit 914 is provided to another isolation transformer 918. The outputs of the isolation transformers 916, 918, 922 on the primary side of the power transformer 908 (the non-patient-isolated side) are provided to one or more ADC circuits 926. The digitized output of the ADC circuit 926 is provided to the processor 902 for further processing and computation. The output voltage and output current feedback information can be employed to adjust the output voltage and current provided to the surgical instrument and to compute output impedance, among other parameters. Input/output communications between the processor 902 and patient-isolated circuits are provided through an interface circuit 920. Sensors also may be in electrical communication with the processor 902 by way of the interface circuit 920.


In one aspect, the impedance may be determined by the processor 902 by dividing the output of either the first voltage sensing circuit 912 coupled across the terminals labeled ENERGY1/RETURN or the second voltage sensing circuit 924 coupled across the terminals labeled ENERGY2/RETURN by the output of the current sensing circuit 914 disposed in series with the RETURN leg of the secondary side of the power transformer 908. The outputs of the first and second voltage sensing circuits 912, 924 are provided to separate isolation transformers 916, 922 and the output of the current sensing circuit 914 is provided to another isolation transformer 918. The digitized voltage and current sensing measurements from the ADC circuit 926 are provided to the processor 902 for computing impedance. As an example, the first energy modality ENERGY1 may be ultrasonic energy and the second energy modality ENERGY2 may be RF energy. Nevertheless, in addition to ultrasonic and bipolar or monopolar RF energy modalities, other energy modalities include irreversible and/or reversible electroporation and/or microwave energy, among others. Also, although the example illustrated in FIG. 21 shows that a single return path RETURN may be provided for two or more energy modalities, in other aspects, multiple return paths RETURNn may be provided for each energy modality ENERGYn. Thus, as described herein, the ultrasonic transducer impedance may be measured by dividing the output of the first voltage sensing circuit 912 by the output of the current sensing circuit 914, and the tissue impedance may be measured by dividing the output of the second voltage sensing circuit 924 by the output of the current sensing circuit 914.
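

The impedance arithmetic described above reduces to dividing synchronized voltage and current measurements, for example as RMS values. A minimal sketch follows; the sample values are invented for illustration.

    import math

    # Illustrative impedance computation from digitized sensing outputs.
    def rms(samples):
        return math.sqrt(sum(s * s for s in samples) / len(samples))

    def impedance(voltage_samples, current_samples):
        """Z = V_rms / I_rms for one energy modality."""
        return rms(voltage_samples) / rms(current_samples)

    # ENERGY1/RETURN (e.g., the ultrasonic transducer) and
    # ENERGY2/RETURN (e.g., the RF tissue load) share one current sensor.
    v1 = [40.0, -40.0, 39.5, -39.5]   # hypothetical voltage samples
    i1 = [0.5, -0.5, 0.49, -0.49]     # hypothetical current samples
    print(impedance(v1, i1))          # approximately 80 ohms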


As shown in FIG. 21, the generator 900 comprising at least one output port can include a power transformer 908 with a single output and with multiple taps to provide power in the form of one or more energy modalities, such as ultrasonic, bipolar or monopolar RF, irreversible and/or reversible electroporation, and/or microwave energy, among others, for example, to the end effector depending on the type of treatment of tissue being performed. For example, the generator 900 can deliver energy with higher voltage and lower current to drive an ultrasonic transducer, with lower voltage and higher current to drive RF electrodes for sealing tissue, or with a coagulation waveform for spot coagulation using either monopolar or bipolar RF electrosurgical electrodes. The output waveform from the generator 900 can be steered, switched, or filtered to provide the frequency to the end effector of the surgical instrument. The connection of an ultrasonic transducer to the generator 900 output would preferably be located between the output labeled ENERGY1 and RETURN as shown in FIG. 21. In one example, a connection of RF bipolar electrodes to the generator 900 output would preferably be located between the output labeled ENERGY2 and RETURN. In the case of monopolar output, the preferred connections would be an active electrode (e.g., a pencil or other probe) to the ENERGY2 output and a suitable return pad connected to the RETURN output.


Additional details are disclosed in U.S. Patent Application Publication No. 2017/0086914, titled TECHNIQUES FOR OPERATING GENERATOR FOR DIGITALLY GENERATING ELECTRICAL SIGNAL WAVEFORMS AND SURGICAL INSTRUMENTS, which published on Mar. 30, 2017, which is herein incorporated by reference in its entirety.


As used throughout this description, the term “wireless” and its derivatives may be used to describe circuits, devices, systems, methods, techniques, communications channels, etc., that may communicate data through the use of modulated electromagnetic radiation through a non-solid medium. The term does not imply that the associated devices do not contain any wires, although in some aspects they might not. The communication module may implement any of a number of wireless or wired communication standards or protocols, including but not limited to Wi-Fi (IEEE 802.11 family), WiMAX (IEEE 802.16 family), IEEE 802.20, long term evolution (LTE), Ev-DO, HSPA+, HSDPA+, HSUPA+, EDGE, GSM, GPRS, CDMA, TDMA, DECT, Bluetooth, Ethernet, derivatives thereof, as well as any other wireless and wired protocols that are designated as 3G, 4G, 5G, and beyond. The computing module may include a plurality of communication modules. For instance, a first communication module may be dedicated to shorter-range wireless communications such as Wi-Fi and Bluetooth, and a second communication module may be dedicated to longer-range wireless communications such as GPS, EDGE, GPRS, CDMA, WiMAX, LTE, Ev-DO, and others.


As used herein, a processor or processing unit is an electronic circuit which performs operations on some external data source, usually memory or some other data stream. The term is used herein to refer to the central processor (central processing unit) in a system or to computer systems (especially systems on a chip (SoCs)) that combine a number of specialized “processors.”


As used herein, a system on a chip or system on chip (SoC or SOC) is an integrated circuit (also known as an “IC” or “chip”) that integrates all the components of a computer or other electronic system. It may contain digital, analog, mixed-signal, and often radio-frequency functions, all on a single substrate. A SoC integrates a microcontroller (or microprocessor) with advanced peripherals such as a graphics processing unit (GPU), a Wi-Fi module, or a coprocessor. A SoC may or may not contain built-in memory.


As used herein, a microcontroller or controller is a system that integrates a microprocessor with peripheral circuits and memory. A microcontroller (or MCU for microcontroller unit) may be implemented as a small computer on a single integrated circuit. It may be similar to a SoC; an SoC may include a microcontroller as one of its components. A microcontroller may contain one or more processor cores (CPUs) along with memory and programmable input/output peripherals. Program memory in the form of ferroelectric RAM, NOR flash, or OTP ROM is also often included on the chip, as well as a small amount of RAM. Microcontrollers may be employed for embedded applications, in contrast to the microprocessors used in personal computers or other general-purpose applications consisting of various discrete chips.


As used herein, the term controller or microcontroller may be a stand-alone IC or chip device that interfaces with a peripheral device. This may be a link between two parts of a computer or a controller on an external device that manages the operation of (and connection with) that device.


Any of the processors or microcontrollers described herein may be implemented by any single-core or multicore processor, such as those based on the ARM Cortex core and available from Texas Instruments. In one aspect, the processor may be an LM4F230H5QR ARM Cortex-M4F processor core, available from Texas Instruments, for example, comprising an on-chip memory of 256 KB single-cycle flash memory, or other non-volatile memory, up to 40 MHz, a prefetch buffer to improve performance above 40 MHz, a 32 KB single-cycle serial random access memory (SRAM), an internal read-only memory (ROM) loaded with StellarisWare® software, 2 KB of electrically erasable programmable read-only memory (EEPROM), one or more pulse width modulation (PWM) modules, one or more quadrature encoder inputs (QEI), and one or more 12-bit analog-to-digital converters (ADCs) with 12 analog input channels, details of which are available in the product datasheet.


In one aspect, the processor may comprise a safety controller based on controller families such as the TMS570 and RM4x, known under the trade name Hercules ARM Cortex-R4, also from Texas Instruments. The safety controller may be configured specifically for IEC 61508 and ISO 26262 safety-critical applications, among others, to provide advanced integrated safety features while delivering scalable performance, connectivity, and memory options.


Modular devices include the modules (as described in connection with FIGS. 3 and 9, for example) that are receivable within a surgical hub and the surgical devices or instruments that can be connected to the various modules in order to connect or pair with the corresponding surgical hub. The modular devices include, for example, intelligent surgical instruments, medical imaging devices, suction/irrigation devices, smoke evacuators, energy generators, ventilators, insufflators, and displays. The modular devices described herein can be controlled by control algorithms. The control algorithms can be executed on the modular device itself, on the surgical hub to which the particular modular device is paired, or on both the modular device and the surgical hub (e.g., via a distributed computing architecture). In some exemplifications, the modular devices' control algorithms control the devices based on data sensed by the modular device itself (i.e., by sensors in, on, or connected to the modular device). This data can be related to the patient being operated on (e.g., tissue properties or insufflation pressure) or the modular device itself (e.g., the rate at which a knife is being advanced, motor current, or energy levels). For example, a control algorithm for a surgical stapling and cutting instrument can control the rate at which the instrument's motor drives its knife through tissue according to resistance encountered by the knife as it advances.
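

As a purely illustrative instance of such a control algorithm, the sketch below slows a stapling instrument's knife motor as the sensed load rises, with the load inferred from motor current. All constants and the linear interpolation scheme are invented for illustration and are not taken from any actual instrument.

    # Illustrative knife-advance control: commanded speed decreases
    # as the resistance encountered by the knife (inferred here from
    # motor current) increases. All constants are hypothetical.
    MAX_SPEED_MM_S = 12.0
    MIN_SPEED_MM_S = 1.0
    NOMINAL_CURRENT_A = 0.8
    STALL_CURRENT_A = 3.0

    def knife_speed(motor_current_a):
        if motor_current_a <= NOMINAL_CURRENT_A:
            return MAX_SPEED_MM_S
        if motor_current_a >= STALL_CURRENT_A:
            return MIN_SPEED_MM_S
        # Linear interpolation between the nominal and stall loads.
        frac = ((motor_current_a - NOMINAL_CURRENT_A)
                / (STALL_CURRENT_A - NOMINAL_CURRENT_A))
        return MAX_SPEED_MM_S - frac * (MAX_SPEED_MM_S - MIN_SPEED_MM_S)

    print(knife_speed(1.9))  # 6.5 mm/s at the midpoint load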


Situational Awareness

Situational awareness is the ability of some aspects of a surgical system to determine or infer information related to a surgical procedure from data received from databases and/or instruments. The information can include the type of procedure being undertaken, the type of tissue being operated on, or the body cavity that is the subject of the procedure. With the contextual information related to the surgical procedure, the surgical system can, for example, improve the manner in which it controls the modular devices (e.g. a robotic arm and/or robotic surgical tool) that are connected to it and provide contextualized information or suggestions to the surgeon during the course of the surgical procedure.


Referring now to FIG. 49, a timeline 5200 illustrating situational awareness of a hub, such as the surgical hub 106 or 206, for example, is depicted. The timeline 5200 depicts an illustrative surgical procedure and the contextual information that the surgical hub 106, 206 can derive from the data received from the data sources at each step in the surgical procedure. The timeline 5200 depicts the typical steps that would be taken by the nurses, surgeons, and other medical personnel during the course of a lung segmentectomy procedure, beginning with setting up the operating theater and ending with transferring the patient to a post-operative recovery room.


The situationally aware surgical hub 106, 206 receives data from the data sources throughout the course of the surgical procedure, including data generated each time medical personnel utilize a modular device that is paired with the surgical hub 106, 206. The surgical hub 106, 206 can receive this data from the paired modular devices and other data sources and continually derive inferences (i.e., contextual information) about the ongoing procedure as new data is received, such as which step of the procedure is being performed at any given time. The situational awareness system of the surgical hub 106, 206 is able to, for example, record data pertaining to the procedure for generating reports, verify the steps being taken by the medical personnel, provide data or prompts (e.g., via a display screen) that may be pertinent for the particular procedural step, adjust modular devices based on the context (e.g., activate monitors, adjust the field of view (FOV) of the medical imaging device, or change the energy level of an ultrasonic surgical instrument or RF electrosurgical instrument), and take any other such action described above.
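

One way to picture this continual inference is as a rule table that maps each incoming event to a procedure step, with the hub falling back to its prior estimate when an event is uninformative. The event names and rules below are invented for illustration; a real situational awareness system would combine many more data sources.

    # Schematic situational-awareness update: each incoming event
    # narrows the hub's estimate of the current procedure step.
    # Event names and the rule table are illustrative only.
    STEP_CUES = {
        "ventilator_single_lung": "lung_collapse",
        "imaging_video_started": "scope_insertion",
        "energy_device_fired": "dissection",
        "stapler_fired": "ligation_or_transection",
    }

    def infer_step(event, prior_step):
        """Return the inferred step, keeping the prior estimate
        when the event carries no step information."""
        return STEP_CUES.get(event, prior_step)

    step = "setup"
    for e in ["imaging_video_started", "energy_device_fired"]:
        step = infer_step(e, step)
    print(step)  # dissection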


As the first step S202 in this illustrative procedure, the hospital staff members retrieve the patient's Electronic Medical Record (EMR) from the hospital's EMR database. Based on select patient data in the EMR, the surgical hub 106, 206 determines that the procedure to be performed is a thoracic procedure.


Second step S204, the staff members scan the incoming medical supplies for the procedure. The surgical hub 106, 206 cross-references the scanned supplies with a list of supplies that are utilized in various types of procedures and confirms that the mix of supplies corresponds to a thoracic procedure. Further, the surgical hub 106, 206 is also able to determine that the procedure is not a wedge procedure (because the incoming supplies either lack certain supplies that are necessary for a thoracic wedge procedure or do not otherwise correspond to a thoracic wedge procedure).


Third step S206, the medical personnel scan the patient band via a scanner that is communicably connected to the surgical hub 106, 206. The surgical hub 106, 206 can then confirm the patient's identity based on the scanned data.


Fourth step S208, the medical staff turns on the auxiliary equipment. The auxiliary equipment being utilized can vary according to the type of surgical procedure and the techniques to be used by the surgeon, but in this illustrative case it includes a smoke evacuator, an insufflator, and a medical imaging device. When activated, any auxiliary equipment that is a modular device can automatically pair with the surgical hub 106, 206 that is located within a particular vicinity of the modular devices as part of its initialization process. The surgical hub 106, 206 can then derive contextual information about the surgical procedure by detecting the types of modular devices that pair with it during this pre-operative or initialization phase. In this particular example, the surgical hub 106, 206 determines that the surgical procedure is a VATS procedure based on this particular combination of paired modular devices. Based on the combination of the data from the patient's EMR, the list of medical supplies to be used in the procedure, and the type of modular devices that connect to the hub, the surgical hub 106, 206 can generally infer the specific procedure that the surgical team will be performing. Once the surgical hub 106, 206 knows what specific procedure is being performed, the surgical hub 106, 206 can then retrieve the steps of that procedure from a memory or from the cloud and then cross-reference the data it subsequently receives from the connected data sources (e.g., modular devices and patient monitoring devices) to infer what step of the surgical procedure the surgical team is performing.
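

The combination logic described here amounts to intersecting the sets of procedures consistent with each data source. A hedged sketch follows, with invented procedure names and candidate sets; any residual ambiguity (e.g., lobectomy versus segmentectomy) would be resolved later from imaging data, as described below.

    # Illustrative procedure inference: intersect the candidate sets
    # implied by the EMR, the scanned supplies, and the paired devices.
    # All names and set contents are invented for illustration.
    emr_candidates = {"VATS_lobectomy", "VATS_segmentectomy", "thoracic_wedge"}
    supply_candidates = {"VATS_lobectomy", "VATS_segmentectomy"}  # no wedge kit
    device_candidates = {"VATS_lobectomy", "VATS_segmentectomy"}  # scope, insufflator

    candidates = emr_candidates & supply_candidates & device_candidates
    print(candidates)  # narrowed to the VATS family of procedures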


Fifth step S210, the staff members attach the EKG electrodes and other patient monitoring devices to the patient. The EKG electrodes and other patient monitoring devices are able to pair with the surgical hub 106, 206. As the surgical hub 106, 206 begins receiving data from the patient monitoring devices, the surgical hub 106, 206 thus confirms that the patient is in the operating theater.


Sixth step S212, the medical personnel induce anesthesia in the patient. The surgical hub 106, 206 can infer that the patient is under anesthesia based on data from the modular devices and/or patient monitoring devices, including EKG data, blood pressure data, ventilator data, or combinations thereof, for example. Upon completion of the sixth step S212, the pre-operative portion of the lung segmentectomy procedure is completed and the operative portion begins.


Seventh step S214, the patient's lung that is being operated on is collapsed (while ventilation is switched to the contralateral lung). The surgical hub 106, 206 can infer from the ventilator data that the patient's lung has been collapsed, for example. The surgical hub 106, 206 can infer that the operative portion of the procedure has commenced as it can compare the detection of the patient's lung collapsing to the expected steps of the procedure (which can be accessed or retrieved previously) and thereby determine that collapsing the lung is the first operative step in this particular procedure.


Eighth step S216, the medical imaging device (e.g., a scope) is inserted and video from the medical imaging device is initiated. The surgical hub 106, 206 receives the medical imaging device data (i.e., video or image data) through its connection to the medical imaging device. Upon receipt of the medical imaging device data, the surgical hub 106, 206 can determine that the laparoscopic portion of the surgical procedure has commenced. Further, the surgical hub 106, 206 can determine that the particular procedure being performed is a segmentectomy, as opposed to a lobectomy (note that a wedge procedure has already been discounted by the surgical hub 106, 206 based on data received at the second step S204 of the procedure). The data from the medical imaging device 124 (FIG. 2) can be utilized to determine contextual information regarding the type of procedure being performed in a number of different ways, including by determining the angle at which the medical imaging device is oriented with respect to the visualization of the patient's anatomy, monitoring the number of medical imaging devices being utilized (i.e., that are activated and paired with the surgical hub 106, 206), and monitoring the types of visualization devices utilized. For example, one technique for performing a VATS lobectomy places the camera in the lower anterior corner of the patient's chest cavity above the diaphragm, whereas one technique for performing a VATS segmentectomy places the camera in an anterior intercostal position relative to the segmental fissure. Using pattern recognition or machine learning techniques, for example, the situational awareness system can be trained to recognize the positioning of the medical imaging device according to the visualization of the patient's anatomy. As another example, one technique for performing a VATS lobectomy utilizes a single medical imaging device, whereas another technique for performing a VATS segmentectomy utilizes multiple cameras. As yet another example, one technique for performing a VATS segmentectomy utilizes an infrared light source (which can be communicably coupled to the surgical hub as part of the visualization system) to visualize the segmental fissure, which is not utilized in a VATS lobectomy. By tracking any or all of this data from the medical imaging device, the surgical hub 106, 206 can thereby determine the specific type of surgical procedure being performed and/or the technique being used for a particular type of surgical procedure.


Ninth step S218, the surgical team begins the dissection step of the procedure. The surgical hub 106, 206 can infer that the surgeon is in the process of dissecting to mobilize the patient's lung because it receives data from the RF or ultrasonic generator indicating that an energy instrument is being fired. The surgical hub 106, 206 can cross-reference the received data with the retrieved steps of the surgical procedure to determine that an energy instrument being fired at this point in the process (i.e., after the completion of the previously discussed steps of the procedure) corresponds to the dissection step. In certain instances, the energy instrument can be an energy tool mounted to a robotic arm of a robotic surgical system.


Tenth step S220, the surgical team proceeds to the ligation step of the procedure. The surgical hub 106, 206 can infer that the surgeon is ligating arteries and veins because it receives data from the surgical stapling and cutting instrument indicating that the instrument is being fired. Similarly to the prior step, the surgical hub 106, 206 can derive this inference by cross-referencing the receipt of data from the surgical stapling and cutting instrument with the retrieved steps in the process. In certain instances, the surgical instrument can be a surgical tool mounted to a robotic arm of a robotic surgical system.


Eleventh step S222, the segmentectomy portion of the procedure is performed. The surgical hub 106, 206 can infer that the surgeon is transecting the parenchyma based on data from the surgical stapling and cutting instrument, including data from its cartridge. The cartridge data can correspond to the size or type of staple being fired by the instrument, for example. As different types of staples are utilized for different types of tissues, the cartridge data can thus indicate the type of tissue being stapled and/or transected. In this case, the type of staple being fired is utilized for parenchyma (or other similar tissue types), which allows the surgical hub 106, 206 to infer that the segmentectomy portion of the procedure is being performed.


Twelfth step S224, the node dissection step is then performed. The surgical hub 106, 206 can infer that the surgical team is dissecting the node and performing a leak test based on data received from the generator indicating that an RF or ultrasonic instrument is being fired. For this particular procedure, an RF or ultrasonic instrument being utilized after parenchyma was transected corresponds to the node dissection step, which allows the surgical hub 106, 206 to make this inference. It should be noted that surgeons regularly switch back and forth between surgical stapling/cutting instruments and surgical energy (i.e., RF or ultrasonic) instruments depending upon the particular step in the procedure because different instruments are better adapted for particular tasks. Therefore, the particular sequence in which the stapling/cutting instruments and surgical energy instruments are used can indicate what step of the procedure the surgeon is performing. Moreover, in certain instances, robotic tools can be utilized for one or more steps in a surgical procedure and/or handheld surgical instruments can be utilized for one or more steps in the surgical procedure. The surgeon(s) can alternate between robotic tools and handheld surgical instruments and/or can use the devices concurrently, for example. Upon completion of the twelfth step S224, the incisions are closed up and the post-operative portion of the procedure begins.


Thirteenth step S226, the patient's anesthesia is reversed. The surgical hub 106, 206 can infer that the patient is emerging from the anesthesia based on the ventilator data (i.e., the patient's breathing rate begins increasing), for example.


Lastly, the fourteenth step S228 is that the medical personnel remove the various patient monitoring devices from the patient. The surgical hub 106, 206 can thus infer that the patient is being transferred to a recovery room when the hub loses EKG, BP, and other data from the patient monitoring devices. As can be seen from the description of this illustrative procedure, the surgical hub 106, 206 can determine or infer when each step of a given surgical procedure is taking place according to data received from the various data sources that are communicably coupled to the surgical hub 106, 206.


Situational awareness is further described in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety. In certain instances, operation of a robotic surgical system, including the various robotic surgical systems disclosed herein, for example, can be controlled by the hub 106, 206 based on its situational awareness and/or feedback from the components thereof and/or based on information from the cloud 104.


Robotic Systems

Robotic surgical systems can be used in minimally invasive medical procedures. During such medical procedures, a patient can be placed on a platform adjacent to a robotic surgical system, and a surgeon can be positioned at a command console that is remote from the platform and/or from the robot. For example, the surgeon can be positioned outside the sterile field that surrounds the surgical site. The surgeon provides input to a user interface via an input device at the command console to manipulate a surgical tool coupled to an arm of the robotic system. The input device can be a mechanical input device such as a control handle or joystick, for example, or a contactless input device such as an optical gesture sensor, for example.


The robotic surgical system can include a robot tower supporting one or more robotic arms. At least one surgical tool (e.g. an end effector and/or endoscope) can be mounted to the robotic arm. The surgical tool(s) can be configured to articulate relative to the respective robotic arm via an articulating wrist assembly and/or to translate relative to the robotic arm via a linear slide mechanism, for example. During the surgical procedure, the surgical tool can be inserted into a small incision in a patient via a cannula or trocar, for example, or into a natural orifice of the patient to position the distal end of the surgical tool at the surgical site within the body of the patient. Additionally or alternatively, the robotic surgical system can be employed in an open surgical procedure in certain instances.


A schematic of a robotic surgical system 15000 is depicted in FIG. 22. The robotic surgical system 15000 includes a central control unit 15002, a surgeon's console 15012, a robot 15022 including one or more robotic arms 15024, and a primary display 15040 operably coupled to the control unit 15002. The surgeon's console 15012 includes a display 15014 and at least one manual input device 15016 (e.g., switches, buttons, touch screens, joysticks, gimbals, etc.) that allow the surgeon to telemanipulate the robotic arms 15024 of the robot 15022. The reader will appreciate that additional and alternative input devices can be employed.


The central control unit 15002 includes a processor 15004 operably coupled to a memory 15006. The processor 15004 includes a plurality of inputs and outputs for interfacing with the components of the robotic surgical system 15000. The processor 15004 can be configured to receive input signals and/or generate output signals to control one or more of the various components (e.g., one or more motors, sensors, and/or displays) of the robotic surgical system 15000. The output signals can include, and/or can be based upon, algorithmic instructions which may be pre-programmed and/or input by the surgeon or another clinician. The processor 15004 can be configured to accept a plurality of inputs from a user, such as the surgeon at the console 15012, and/or may interface with a remote system. The memory 15006 can be directly and/or indirectly coupled to the processor 15004 to store instructions and/or databases.


The robot 15022 includes one or more robotic arms 15024. Each robotic arm 15024 includes one or more motors 15026 and each motor 15026 is coupled to one or more motor drivers 15028. For example, the motors 15026, which can be assigned to different drivers and/or mechanisms, can be housed in a carriage assembly or housing. In certain instances, a transmission intermediate a motor 15026 and one or more drivers 15028 can permit coupling and decoupling of the motor 15026 to one or more drivers 15028. The drivers 15028 can be configured to implement one or more surgical functions. For example, one or more drivers 15028 can be tasked with moving a robotic arm 15024 by rotating the robotic arm 15024 and/or a linkage and/or joint thereof. Additionally, one or more drivers 15028 can be coupled to a surgical tool 15030 and can implement articulating, rotating, clamping, sealing, stapling, energizing, firing, cutting, and/or opening, for example. In certain instances, the surgical tools 15030 can be interchangeable and/or replaceable. Examples of robotic surgical systems and surgical tools are further described herein.


The reader will readily appreciate that the computer-implemented interactive surgical system 100 (FIG. 1) and the computer-implemented interactive surgical system 200 (FIG. 9) can incorporate the robotic surgical system 15000. Additionally or alternatively, the robotic surgical system 15000 can include various features and/or components of the computer-implemented interactive surgical systems 100 and 200.


In one exemplification, the robotic surgical system 15000 can encompass the robotic system 110 (FIG. 2), which includes the surgeon's console 118, the surgical robot 120, and the robotic hub 122. Additionally or alternatively, the robotic surgical system 15000 can communicate with another hub, such as the surgical hub 106, for example. In one instance, the robotic surgical system 15000 can be incorporated into a surgical system, such as the computer-implemented interactive surgical system 100 (FIG. 1) or the computer-implemented interactive surgical system 200 (FIG. 9), for example. In such instances, the robotic surgical system 15000 may interact with the cloud 104 or the cloud 204, respectively, and the surgical hub 106 or the surgical hub 206, respectively. In certain instances, a robotic hub or a surgical hub can include the central control unit 15002 and/or the central control unit 15002 can communicate with a cloud. In other instances, a surgical hub can embody a discrete unit that is separate from the central control unit 15002 and which can communicate with the central control unit 15002.


Another robotic surgical system is depicted in FIGS. 23 and 24. With reference to FIG. 23, the robotic surgical system 13000 includes robotic arms 13002, 13003, a control device 13004, and a console 13005 coupled to the control device 13004. As illustrated in FIG. 23, the surgical system 13000 is configured for use on a patient 13013 lying on a patient table 13012 for performance of a minimally invasive surgical operation. The console 13005 includes a display device 13006 and input devices 13007, 13008. The display device 13006 is set up to display three-dimensional images, and the manual input devices 13007, 13008 are configured to allow a clinician to telemanipulate the robotic arms 13002, 13003. Controls for a surgeon's console, such as the console 13005, are further described in International Patent Publication No. WO2017/075121, filed Oct. 27, 2016, titled HAPTIC FEEDBACK FOR A ROBOTIC SURGICAL SYSTEM INTERFACE, which is herein incorporated by reference in its entirety.


Each of the robotic arms 13002, 13003 is made up of a plurality of members connected through joints and includes a surgical assembly 13010 connected to a distal end of a corresponding robotic arm 13002, 13003. Support of multiple arms is further described in U.S. Patent Application Publication No. 2017/0071693, filed Nov. 11, 2016, titled SURGICAL ROBOTIC ARM SUPPORT SYSTEMS AND METHODS OF USE, which is herein incorporated by reference in its entirety. Various robotic arm configurations are further described in International Patent Publication No. WO2017/044406, filed Sep. 6, 2016, titled ROBOTIC SURGICAL CONTROL SCHEME FOR MANIPULATING ROBOTIC END EFFECTORS, which is herein incorporated by reference in its entirety. In an exemplification, the surgical assembly 13010 includes a surgical instrument 13020 supporting an end effector 13023. Although two robotic arms 13002, 13003 are depicted, the surgical system 13000 may include a single robotic arm or more than two robotic arms 13002, 13003. Any additional robotic arms are likewise connected to the control device 13004 and are telemanipulatable via the console 13005. Accordingly, one or more additional surgical assemblies 13010 and/or surgical instruments 13020 may also be attached to the additional robotic arm(s).


The robotic arms 13002, 13003 may be driven by electric drives that are connected to the control device 13004. According to an exemplification, the control device 13004 is configured to activate drives, for example, via a computer program, such that the robotic arms 13002, 13003 and the surgical assemblies 13010 and/or surgical instruments 13020 corresponding to the robotic arms 13002, 13003, execute a desired movement received through the manual input devices 13007, 13008. The control device 13004 may also be configured to regulate movement of the robotic arms 13002, 13003 and/or of the drives.


The control device 13004 may control a plurality of motors (for example, Motor I . . . n), with each motor configured to drive a pushing or a pulling of one or more cables, such as cables coupled to the end effector 13023 of the surgical instrument 13020. In use, pushing and/or pulling these cables affects the operation and/or movement of the end effector 13023. The control device 13004 coordinates the activation of the various motors to coordinate a pushing or a pulling motion of one or more cables in order to coordinate an operation and/or movement of one or more end effectors 13023. For example, articulation of an end effector by a robotic assembly such as the surgical assembly 13010 is further described in U.S. Patent Application Publication No. 2016/0303743, filed Jun. 6, 2016, titled WRIST AND JAW ASSEMBLIES FOR ROBOTIC SURGICAL SYSTEMS and in International Patent Publication No. WO2016/144937, filed Mar. 8, 2016, titled MEASURING HEALTH OF A CONNECTOR MEMBER OF A ROBOTIC SURGICAL SYSTEM, each of which is herein incorporated by reference in its entirety. In an exemplification, each motor is configured to actuate a drive rod or a lever arm to affect operation and/or movement of end effectors 13023 in addition to, or instead of, one or more cables.
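

The push/pull coordination can be pictured as antagonistic cable pairs: for a given joint motion, one motor takes up cable while its partner pays out the same length. A minimal sketch follows; the pulley radius and command format are invented for illustration.

    # Illustrative antagonistic cable coordination for one wrist axis.
    # PULLEY_RADIUS_MM and the command dictionary are hypothetical.
    PULLEY_RADIUS_MM = 5.0

    def cable_commands(joint_angle_rad):
        """Cable length change (mm) for an antagonistic pair:
        one cable shortens while the other lengthens equally."""
        delta = PULLEY_RADIUS_MM * joint_angle_rad
        return {"motor_pull": -delta, "motor_release": +delta}

    print(cable_commands(0.2))  # {'motor_pull': -1.0, 'motor_release': 1.0}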


Driver configurations for surgical instruments, such as drive arrangements for a surgical end effector, are further described in International Patent Publication No. WO2016/183054, filed May 10, 2016, titled COUPLING INSTRUMENT DRIVE UNIT AND ROBOTIC SURGICAL INSTRUMENT, International Patent Publication No. WO2016/205266, filed Jun. 15, 2016, titled ROBOTIC SURGICAL SYSTEM TORQUE TRANSDUCTION SENSING, International Patent Publication No. WO2016/205452, filed Jun. 16, 2016, titled CONTROLLING ROBOTIC SURGICAL INSTRUMENTS WITH BIDIRECTIONAL COUPLING, and International Patent Publication No. WO2017/053507, filed Sep. 22, 2016, titled ELASTIC SURGICAL INTERFACE FOR ROBOTIC SURGICAL SYSTEMS, each of which is herein incorporated by reference in its entirety. The modular attachment of surgical instruments to a driver is further described in International Patent Publication No. WO2016/209769, filed Jun. 20, 2016, titled ROBOTIC SURGICAL ASSEMBLIES, which is herein incorporated by reference in its entirety. Housing configurations for a surgical instrument driver and interface are further described in International Patent Publication No. WO2016/144998, filed Mar. 9, 2016, titled ROBOTIC SURGICAL SYSTEMS, INSTRUMENT DRIVE UNITS, AND DRIVE ASSEMBLIES, which is herein incorporated by reference in its entirety. Various endocutter instrument configurations for use with the robotic arms 13002, 13003 are further described in International Patent Publication No. WO2017/053358, filed Sep. 21, 2016, titled SURGICAL ROBOTIC ASSEMBLIES AND INSTRUMENT ADAPTERS THEREOF and International Patent Publication No. WO2017/053363, filed Sep. 21, 2016, titled ROBOTIC SURGICAL ASSEMBLIES AND INSTRUMENT DRIVE CONNECTORS THEREOF, each of which is herein incorporated by reference in its entirety. Bipolar instrument configurations for use with the robotic arms 13002, 13003 are further described in International Patent Publication No. WO2017/053698, filed Sep. 23, 2016, titled ROBOTIC SURGICAL ASSEMBLIES AND ELECTROMECHANICAL INSTRUMENTS THEREOF, which is herein incorporated by reference in its entirety. Reposable shaft arrangements for use with the robotic arms 13002, 13003 are further described in International Patent Publication No. WO2017/116793, filed Dec. 19, 2016, titled ROBOTIC SURGICAL SYSTEMS AND INSTRUMENT DRIVE ASSEMBLIES, which is herein incorporated by reference in its entirety.


The control device 13004 includes any suitable logic control circuit adapted to perform calculations and/or operate according to a set of instructions. The control device 13004 can be configured to communicate with a remote system “RS,” either via a wireless (e.g., Bluetooth, LTE, etc.) and/or wired connection. The remote system “RS” can include data, instructions and/or information related to the various components, algorithms, and/or operations of system 13000. The remote system “RS” can include any suitable electronic service, database, platform, cloud “C” (see FIG. 23), or the like. The control device 13004 may include a central processing unit operably connected to memory. The memory may include transitory type memory (e.g., RAM) and/or non-transitory type memory (e.g., flash media, disk media, etc.). In some exemplifications, the memory is part of, and/or operably coupled to, the remote system “RS.”


The control device 13004 can include a plurality of inputs and outputs for interfacing with the components of the system 13000, such as through a driver circuit. The control device 13004 can be configured to receive input signals and/or generate output signals to control one or more of the various components (e.g., one or more motors) of the system 13000. The output signals can include, and/or can be based upon, algorithmic instructions which may be pre-programmed and/or input by a user. The control device 13004 can be configured to accept a plurality of user inputs from a user interface (e.g., switches, buttons, a touch screen, etc., of the operating console 13005), which may be coupled to the remote system “RS.”


A memory 13014 can be directly and/or indirectly coupled to the control device 13004 to store instructions and/or databases including pre-operative data from living being(s) and/or anatomical atlas(es). The memory 13014 can be part of, and/or operatively coupled to, the remote system “RS.”


In accordance with an exemplification, the distal end of each robotic arm 13002, 13003 is configured to releasably secure the end effector 13023 (or other surgical tool) therein and may be configured to receive any number of surgical tools or instruments, such as a trocar or retractor, for example.


A simplified functional block diagram of a system architecture 13400 of the robotic surgical system 13000 is depicted in FIG. 24. The system architecture 13400 includes a core module 13420, a surgeon master module 13430, a robotic arm module 13440, and an instrument module 13450. The core module 13420 serves as a central controller for the robotic surgical system 13000 and coordinates operations of all of the other modules 13430, 13440, 13450. For example, the core module 13420 maps control devices to the arms 13002, 13003, determines current status, performs all kinematics and frame transformations, and relays resulting movement commands. In this regard, the core module 13420 receives and analyzes data from each of the other modules 13430, 13440, 13450 in order to provide instructions or commands to the other modules 13430, 13440, 13450 for execution within the robotic surgical system 13000. Although depicted as separate modules, one or more of the modules 13420, 13430, 13440, and 13450 are a single component in other exemplifications.


The core module 13420 includes models 13422, observers 13424, a collision manager 13426, controllers 13428, and a skeleton 13429. The models 13422 include units that provide abstracted representations (base classes) for controlled components, such as the motors (for example, Motor I . . . n) and/or the arms 13002, 13003. The observers 13424 create state estimates based on input and output signals received from the other modules 13430, 13440, 13450. The collision manager 13426 prevents collisions between components that have been registered within the system 13000. The skeleton 13429 tracks the system 13000 from a kinematic and dynamic point of view. In an exemplification, the kinematics item may be implemented as either forward or inverse kinematics, and the dynamics item may be implemented as algorithms used to model the dynamics of the system's components.
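

The skeleton's forward-kinematics item can be illustrated with the simplest possible case, a two-link planar arm; a real surgical arm chains full three-dimensional frame transformations per joint, but the structure is the same. The link lengths below are invented.

    import math

    # Illustrative forward kinematics for a two-link planar arm.
    # Real robotic arms compose 3-D frame transformations per joint.
    L1, L2 = 0.40, 0.35  # hypothetical link lengths in meters

    def forward_kinematics(theta1, theta2):
        """Tip position (x, y) from the two joint angles (radians)."""
        x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
        y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
        return x, y

    print(forward_kinematics(math.pi / 4, -math.pi / 6))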


The surgeon master module 13430 communicates with surgeon control devices at the console 13005 and relays inputs received from the console 13005 to the core module 13420. In accordance with an exemplification, the surgeon master module 13430 communicates button status and control device positions to the core module 13420 and includes a node controller 13432 that includes a state/mode manager 13434, a fail-over controller 13436, and an N-degree-of-freedom (“DOF”) actuator 13438.


The robotic arm module 13440 coordinates operation of a robotic arm subsystem, an arm cart subsystem, a set up arm, and an instrument subsystem in order to control movement of a corresponding arm 13002, 13003. Although a single robotic arm module 13440 is included, it will be appreciated that the robotic arm module 13440 corresponds to and controls a single arm. As such, additional robotic arm modules 13440 are included in configurations in which the system 13000 includes multiple arms 13002, 13003. The robotic arm module 13440 includes a node controller 13442, a state/mode manager 13444, a fail-over controller 13446, and an N-degree-of-freedom (“DOF”) actuator 13448.


The instrument module 13450 controls movement of an instrument and/or tool component attached to the arm 13002, 13003. The instrument module 13450 is configured to correspond to and control a single instrument. Thus, in configurations in which multiple instruments are included, additional instrument modules 13450 are likewise included. In an exemplification, the instrument module 13450 obtains and communicates data related to the position of the end effector or jaw assembly (which may include the pitch and yaw angle of the jaws), the width of or the angle between the jaws, and the position of an access port. The instrument module 13450 has a node controller 13452, a state/mode manager 13454, a fail-over controller 13456, and an N-degree-of-freedom (“DOF”) actuator 13458.


The position data collected by the instrument module 13450 is used by the core module 13420 to determine when the instrument is within the surgical site, within a cannula, adjacent to an access port, or above an access port in free space. The core module 13420 can determine whether to provide instructions to open or close the jaws of the instrument based on the positioning thereof. For example, when the position of the instrument indicates that the instrument is within a cannula, instructions are provided to maintain a jaw assembly in a closed position. When the position of the instrument indicates that the instrument is outside of an access port, instructions are provided to open the jaw assembly.
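

This position-based jaw policy reduces to a small decision rule. A sketch follows, with invented zone labels standing in for the regions described above; a real system would derive the zone from the tracked instrument position.

    # Illustrative jaw-state policy from instrument position zones.
    # Zone names are invented labels for the regions described above.
    def jaw_command(zone):
        if zone == "WITHIN_CANNULA":
            return "HOLD_CLOSED"     # jaws must stay closed inside a cannula
        if zone in ("SURGICAL_SITE", "FREE_SPACE_PAST_PORT"):
            return "ALLOW_OPEN"
        return "HOLD_CLOSED"         # default to the safe state

    print(jaw_command("WITHIN_CANNULA"))  # HOLD_CLOSED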


Additional features and operations of a robotic surgical system, such as the surgical robot system depicted in FIGS. 23 and 24, are further described in the following references, each of which is herein incorporated by reference in its entirety:

    • U.S. Patent Application Publication No. 2016/0303743, filed Jun. 6, 2016, titled WRIST AND JAW ASSEMBLIES FOR ROBOTIC SURGICAL SYSTEMS;
    • U.S. Patent Application Publication No. 2017/0071693, filed Nov. 11, 2016, titled SURGICAL ROBOTIC ARM SUPPORT SYSTEMS AND METHODS OF USE;
    • International Patent Publication No. WO2016/144937, filed Mar. 8, 2016, titled MEASURING HEALTH OF A CONNECTOR MEMBER OF A ROBOTIC SURGICAL SYSTEM;
    • International Patent Publication No. WO2016/144998, filed Mar. 9, 2016, titled ROBOTIC SURGICAL SYSTEMS, INSTRUMENT DRIVE UNITS, AND DRIVE ASSEMBLIES;
    • International Patent Publication No. WO2016/183054, filed May 10, 2016, titled COUPLING INSTRUMENT DRIVE UNIT AND ROBOTIC SURGICAL INSTRUMENT;
    • International Patent Publication No. WO2016/205266, filed Jun. 15, 2016, titled ROBOTIC SURGICAL SYSTEM TORQUE TRANSDUCTION SENSING;
    • International Patent Publication No. WO2016/205452, filed Jun. 16, 2016, titled CONTROLLING ROBOTIC SURGICAL INSTRUMENTS WITH BIDIRECTIONAL COUPLING;
    • International Patent Publication No. WO2016/209769, filed Jun. 20, 2016, titled ROBOTIC SURGICAL ASSEMBLIES;
    • International Patent Publication No. WO2017/044406, filed Sep. 6, 2016, titled ROBOTIC SURGICAL CONTROL SCHEME FOR MANIPULATING ROBOTIC END EFFECTORS;
    • International Patent Publication No. WO2017/053358, filed Sep. 21, 2016, titled SURGICAL ROBOTIC ASSEMBLIES AND INSTRUMENT ADAPTERS THEREOF;
    • International Patent Publication No. WO2017/053363, filed Sep. 21, 2016, titled ROBOTIC SURGICAL ASSEMBLIES AND INSTRUMENT DRIVE CONNECTORS THEREOF;
    • International Patent Publication No. WO2017/053507, filed Sep. 22, 2016, titled ELASTIC SURGICAL INTERFACE FOR ROBOTIC SURGICAL SYSTEMS;
    • International Patent Publication No. WO2017/053698, filed Sep. 23, 2016, titled ROBOTIC SURGICAL ASSEMBLIES AND ELECTROMECHANICAL INSTRUMENTS THEREOF;
    • International Patent Publication No. WO2017/075121, filed Oct. 27, 2016, titled HAPTIC FEEDBACK CONTROLS FOR A ROBOTIC SURGICAL SYSTEM INTERFACE;
    • International Patent Publication No. WO2017/116793, filed Dec. 19, 2016, titled ROBOTIC SURGICAL SYSTEMS AND INSTRUMENT DRIVE ASSEMBLIES.


The robotic surgical systems and features disclosed herein can be employed with the robotic surgical system of FIGS. 23 and 24. The reader will further appreciate that various systems and/or features disclosed herein can also be employed with alternative surgical systems including the computer-implemented interactive surgical system 100, the computer-implemented interactive surgical system 200, the robotic surgical system 110, the robotic hub 122, the robotic hub 222, and/or the robotic surgical system 15000, for example.


In various instances, a robotic surgical system can include a robotic control tower, which can house the control unit of the system. For example, the control unit 13004 of the robotic surgical system 13000 (FIG. 23) can be housed within a robotic control tower. The robotic control tower can include a robotic hub such as the robotic hub 122 (FIG. 2) or the robotic hub 222 (FIG. 9), for example. Such a robotic hub can include a modular interface for coupling with one or more generators, such as an ultrasonic generator and/or a radio frequency generator, and/or one or more modules, such as an imaging module, a suction module, an irrigation module, a smoke evacuation module, and/or a communication module.


A robotic hub can include a situational awareness module, which can be configured to synthesize data from multiple sources to determine an appropriate response to a surgical event. For example, a situational awareness module can determine the type of surgical procedure, step in the surgical procedure, type of tissue, and/or tissue characteristics, as further described herein. Moreover, such a module can recommend a particular course of action or possible choices to the robotic system based on the synthesized data. In various instances, a sensor system encompassing a plurality of sensors distributed throughout the robotic system can provide data, images, and/or other information to the situational awareness module. Such a situational awareness module can be incorporated into a control unit, such as the control unit 13004, for example. In various instances, the situational awareness module can obtain data and/or information from a non-robotic surgical hub and/or a cloud, such as the surgical hub 106 (FIG. 1), the surgical hub 206 (FIG. 10), the cloud 104 (FIG. 1), and/or the cloud 204 (FIG. 9), for example. Situational awareness of a surgical system is further disclosed herein and in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, and U.S. Provisional Patent Application Ser. No. 62/611,340, titled CLOUD-BASED MEDICAL ANALYTICS, filed Dec. 28, 2017, the disclosure of each of which is herein incorporated by reference in its entirety.
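

As a rough illustration of the synthesis described above, a situational awareness module can be pictured as a function that folds multiple data sources into a recommendation. The sketch below is purely hypothetical; the field names, rule, and recommendation strings are invented for illustration and do not describe the disclosed module:

    def recommend_action(procedure_type: str, procedure_step: str,
                         tissue_type: str, sensor_readings: dict) -> str:
        """Synthesize data from multiple sources into a recommended
        response to a surgical event (a hypothetical rule only)."""
        # Example rule: during a vessel-sealing step on fragile tissue,
        # suggest a reduced power level if measured impedance is low.
        if procedure_step == "vessel_sealing" and tissue_type == "fragile":
            if sensor_readings.get("impedance_ohms", float("inf")) < 50:
                return "recommend: reduce generator power level"
        return "recommend: proceed with current settings"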


In certain instances, the activation of a surgical tool at certain times during a surgical procedure and/or for certain durations may cause tissue trauma and/or may prolong a surgical procedure. For example, a robotic surgical system can utilize an electrosurgical tool having an energy delivery surface that should only be energized when a threshold condition is met. In one example, the energy delivery surface should only be activated when the energy delivery surface is in contact with the appropriate, or targeted, tissue. As another example, a robotic surgical system can utilize a suction element that should only be activated when a threshold condition is met, such as when an appropriate volume of fluid is present. Due to visibility restrictions, evolving situations, and the multitude of moving parts during a robotic surgical procedure, it can be difficult for a clinician to determine and/or monitor certain conditions at the surgical site. For example, it can be difficult to determine if an energy delivery surface of an electrosurgical tool is in contact with tissue. It can also be difficult to determine if a particular suctioning pressure is sufficient for the volume of fluid in the proximity of the suctioning port.


Moreover, a plurality of surgical devices can be used in certain robotic surgical procedures. For example, a robotic surgical system can use one or more surgical tools during the surgical procedure, and one or more handheld instruments can also be used. One or more of the surgical devices can include a sensor, and multiple sensors can be positioned around the surgical site and/or the operating room. A sensor system including the one or more sensors can be configured to detect one or more conditions at the surgical site. For example, data from the sensor system can indicate whether a surgical tool mounted to the surgical robot is being used and/or whether a feature of the surgical tool should be activated. More specifically, a sensor system can detect if an electrosurgical device is positioned in abutting contact with tissue, for example. As another example, a sensor system can detect if a suctioning element of a surgical tool is applying a sufficient suctioning force to fluid at the surgical site.


When in an automatic activation mode, the robotic surgical system can automatically activate one or more features of one or more surgical tools based on data, images, and/or other information received from the sensor system. For example, an energy delivery surface of an electrosurgical tool can be activated upon detecting that the electrosurgical tool is in use (e.g. positioned in abutting contact with tissue). As another example, a suctioning element on a surgical tool can be activated when the suction port is moved into contact with a fluid. In certain instances, the surgical tool can be adjusted based on the sensed conditions.


A robotic surgical system incorporating an automatic activation mode can automatically provide a scenario-specific result based on detected condition(s) at the surgical site. The scenario-specific result can be outcome-based, for example, and can streamline the decision-making process of the clinician. In certain instances, such an automatic activation mode can improve the efficiency and/or effectiveness of the clinician. For example, the robotic surgical system can aggregate data to compile a more complete view of the surgical site and/or the surgical procedure in order to determine the best possible course of action. Additionally or alternatively, in instances in which the clinician makes fewer decisions, the clinician can be better focused on other tasks and/or can process other information more effectively.


In one instance, a robotic surgical system can automatically adjust a surgical tool based on the proximity of the tool to a visually-detectable need and/or the situational awareness of the system. Referring to FIGS. 25A and 25B, an ultrasonic surgical tool for a robotic system 13050 is depicted in two different positions. In a first position, as depicted in FIG. 25A, the blade 13052 of an ultrasonic surgical tool 13050 is positioned out of contact with tissue 13060. In such a position, a sensor on the ultrasonic surgical tool 13050 can detect a high resistance. When the resistance detected is above a threshold value, the ultrasonic blade 13052 can be de-energized. Referring now to FIG. 25B, the ultrasonic blade 13052 is depicted in a second position in which the distal end of the blade 13052 is positioned in abutting contact with tissue 13060. In such instances, a sensor on the ultrasonic surgical tool 13050 can detect a low resistance. When the detected resistance is below a threshold value, the ultrasonic blade 13052 can be activated such that therapeutic energy is delivered to the tissue 13060. Alternative sensor configurations are also envisioned and various sensors are further described herein.
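

The resistance-threshold behavior described for FIGS. 25A and 25B can be summarized in a few lines of control logic. This is a minimal sketch under assumed names (read_resistance and set_energized are hypothetical callables), not the disclosed implementation; the 400-ohm value is illustrative, drawn from the threshold range discussed below:

    CONTACT_RESISTANCE_THRESHOLD = 400.0  # ohms; illustrative value only

    def update_energy_state(read_resistance, set_energized) -> None:
        """Energize the ultrasonic blade only when low resistance
        indicates tissue contact (cf. FIGS. 25A and 25B)."""
        resistance = read_resistance()
        if resistance > CONTACT_RESISTANCE_THRESHOLD:
            set_energized(False)   # out of contact: de-energize
        else:
            set_energized(True)    # abutting tissue: deliver energy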


Referring to FIGS. 26A and 26B, another surgical tool, a monopolar cautery pencil 13055, is depicted in two different positions. In a first position, as depicted in FIG. 26A, the monopolar cautery pencil 13055 is positioned out of contact with tissue. In such a position, a sensor on the monopolar cautery pencil 13055 can detect a high resistance. When the resistance detected is above a threshold value, the monopolar cautery pencil 13055 can be de-energized. Referring now to FIG. 26B, the monopolar cautery pencil 13055 is depicted in a second position in which the distal end of the monopolar cautery pencil 13055 is positioned in abutting contact with tissue. In such instances, a sensor on the monopolar cautery pencil 13055 can detect a low resistance. When the detected resistance is below a threshold value, the monopolar cautery pencil 13055 can be activated such that therapeutic energy is delivered to the tissue. Alternative sensor configurations are also envisioned and various sensors are further described herein.



FIG. 27 shows a graphical display 13070 of continuity C and current I over time t for the ultrasonic surgical tool 13050 of FIGS. 25A and 25B. Similarly, the monopolar cautery pencil 13055 can generate a graphical display similar in many respects to the graphical display 13070, in certain instances. In the graphical display 13070, continuity C is represented by a dotted line, and current I is represented by a solid line. When the resistance is high and above a threshold value, the continuity C can also be high. The threshold value can be between 40 and 400 ohms, for example. At time A′, the continuity C can decrease below the threshold value, which can indicate a degree of tissue contact. As a result, the robotic surgical system can automatically activate advanced energy treatment of the tissue. The ultrasonic transducer current depicted in FIG. 27 increases from time A′ to B′ when the continuity parameters indicate the degree of tissue contact. In various instances, the current I can be capped at a maximum value indicated at B′, which can correspond to an open jaw transducer limit, such as in instances in which the jaw is not clamped, as shown in FIGS. 25A and 25B. In various instances, the situational awareness module of the robotic surgical system may indicate that the jaw is unclamped. Referring again to the graphical display 13070 in FIG. 27, energy is applied until time C′, at which time a loss of tissue contact is indicated by the increase in continuity C above the threshold value. As a result, the ultrasonic transducer current I can decrease to zero as the ultrasonic blade is de-energized.
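

The current profile of FIG. 27 amounts to ramping the transducer current while tissue contact persists and clamping it at an open-jaw limit. The sketch below illustrates that behavior with invented names and values (OPEN_JAW_CURRENT_LIMIT and RAMP_RATE are assumptions); it is not the disclosed control law:

    OPEN_JAW_CURRENT_LIMIT = 0.35  # amps; hypothetical open-jaw transducer limit
    RAMP_RATE = 0.05               # amps per control tick; hypothetical

    def step_transducer_current(current: float, tissue_contact: bool) -> float:
        """One control tick of FIG. 27-style behavior: ramp current while
        continuity indicates tissue contact (time A' to B'), cap it at the
        open-jaw limit (B'), and drop to zero when contact is lost (C')."""
        if not tissue_contact:
            return 0.0  # loss of tissue contact: de-energize
        return min(current + RAMP_RATE, OPEN_JAW_CURRENT_LIMIT)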


In various instances, a sensor system can be configured to detect at least one condition at the surgical site. For example, a sensor of the sensor system can detect tissue contact by measuring continuity along the energy delivery surface of the ultrasonic blade. Additionally or alternatively, the sensor system can include one or more additional sensors positioned around the surgical site. For example, one or more surgical tools and/or instruments being used in the surgical procedure can be configured to detect a condition at the surgical site. The sensor system can be in signal communication with a processor of the robotic surgical system. For example, the robotic surgical system can include a central control tower including a control unit housing a processor and memory, as further described herein. The processor can issue commands to the surgical tool based on inputs from the sensor system. In various instances, situational awareness can also dictate and/or influence the commands issued by the processor.


Turning now to FIG. 28, an end effector 196400 includes a jaw member 196402, an ultrasonic blade 196404, and RF data sensors 196406, 196408a, 196408b located on the jaw member 196402. The jaw member 196402 is shown clamping tissue 196410 located between the jaw member 196402 and the ultrasonic blade 196404. A first sensor 196406 is located in a center portion of the jaw member 196402. Second and third sensors 196408a, 196408b, respectively, are located on lateral portions of the jaw member 196402. The sensors 196406, 196408a, 196408b are mounted on, or formed integrally with, a flexible circuit 196412 (shown more particularly in FIG. 29) that is configured to be fixedly mounted to the jaw member 196402.


The end effector 196400 is an example end effector for various surgical devices described herein. The sensors 196406, 196408a, 196408b are electrically connected to a control circuit via interface circuits. The sensors 196406, 196408a, 196408b are battery powered and the signals generated by the sensors 196406, 196408a, 196408b are provided to analog and/or digital processing circuits of the control circuit.


In one aspect, the first sensor 196406 is a force sensor to measure a normal force F3 applied to the tissue 196410 by the jaw member 196402. The second and third sensors 196408a, 196408b include one or more elements to apply RF energy to the tissue 196410, measure tissue impedance, down force F1, transverse forces F2, and temperature, among other parameters. Electrodes 196409a, 196409b are electrically coupled to an energy source such as an electrical circuit and apply RF energy to the tissue 196410. In one aspect, the first sensor 196406 and the second and third sensors 196408a, 196408b are strain gauges to measure force or force per unit area. It will be appreciated that the measurements of the down force F1, the lateral forces F2, and the normal force F3 may be readily converted to pressure by determining the surface area upon which the force sensors 196406, 196408a, 196408b are acting. Additionally, as described with particularity herein, the flexible circuit 196412 may include temperature sensors embedded in one or more layers of the flexible circuit 196412. The one or more temperature sensors may be arranged symmetrically or asymmetrically and provide tissue 196410 temperature feedback to control circuits of an ultrasonic drive circuit and an RF drive circuit.
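

The force-to-pressure conversion noted above is a direct application of pressure = force / area. A minimal worked sketch follows; the sensor area used in the example is an illustrative assumption, not a value from the disclosure:

    def force_to_pressure(force_n: float, sensor_area_mm2: float) -> float:
        """Convert a measured force (newtons) to pressure (kilopascals)
        using the surface area over which the force sensor acts."""
        area_m2 = sensor_area_mm2 * 1e-6
        return (force_n / area_m2) / 1000.0  # kPa

    # Example: a 2.0 N normal force F3 acting over a 25 mm^2 sensor face:
    # 2.0 / 25e-6 = 80,000 Pa = 80 kPa.
    print(force_to_pressure(2.0, 25.0))  # -> 80.0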


One or more sensors such as a magnetic field sensor, a strain gauge, a pressure sensor, a force sensor, an inductive sensor such as, for example, an eddy current sensor, a resistive sensor, a capacitive sensor, an optical sensor, and/or any other suitable sensor, may be adapted and configured to measure tissue compression and/or impedance.



FIG. 29 illustrates one aspect of the flexible circuit 196412 shown in FIG. 28, to which the sensors 196406, 196408a, 196408b may be mounted or with which they may be formed integrally. The flexible circuit 196412 is configured to fixedly attach to the jaw member 196402. As shown particularly in FIG. 29, asymmetric temperature sensors 196414a, 196414b are mounted to the flexible circuit 196412 to enable measuring the temperature of the tissue 196410 (FIG. 28).


The reader will appreciate that alternative surgical tools can be utilized in the automatic activation mode described above with respect to FIGS. 25A-29.



FIG. 30 is a flow chart 13150 depicting an automatic activation mode 13151 of a surgical tool. In various instances, the robotic surgical system and the processor thereof are configured to implement the processes indicated in FIG. 30. Initially, a sensor system detects a condition at step 13152. The detected condition is communicated to a processor, which compares the detected condition to a threshold parameter at step 13154. The threshold parameter can be a maximum value, a minimum value, or a range of values. If the sensed condition is an out-of-bounds condition, the processor adjusts the surgical function at step 13156 and repeats the detection and comparison process of steps 13152 and 13154. If the sensed condition is not an out-of-bounds condition, no adjustment is necessary (13158) and the detection and comparison process of steps 13152 and 13154 is likewise repeated.
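

The flow of FIG. 30, including the manual override exit (input 13160, step 13162) described in the following paragraph, can be expressed as a simple monitoring loop. The function and parameter names below are hypothetical; this is a sketch of the flow chart, not the actual control software:

    def automatic_activation_loop(detect_condition, is_out_of_bounds,
                                  adjust_function, override_active):
        """Repeatedly detect a condition (step 13152), compare it to a
        threshold parameter (step 13154), and adjust the surgical
        function (step 13156) when out of bounds; exit the automatic
        activation mode (step 13162) when the manual override input
        (13160) is activated."""
        while not override_active():
            condition = detect_condition()
            if is_out_of_bounds(condition):
                adjust_function(condition)  # step 13156
            # else: no adjustment necessary (13158); the loop repeats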


In various instances, the robotic surgical system can permit a manual override mode 13153. For example, upon activation of the manual override input 13160, such as by a clinician, the surgical system can exit the automatic activation mode 13151 at step 13162 depicted in FIG. 30. In such instances, even when a sensed condition is an out-of-bounds condition, the surgical function would not be automatically adjusted by the processor. However, in such instances, the processor can issue a warning or recommendation to the clinician recommending a particular course of action based on the sensed condition(s).


In various instances, an automatic activation mode can be utilized with a robotic surgical system including a suctioning feature. In one instance, a robotic surgical system can communicate with a suction and/or irrigation tool. For example, a suction and/or irrigation device (see module 128 in FIG. 3) can communicate with a robotic surgical system via the surgical hub 106 (FIG. 1) and/or the surgical hub 206 (FIG. 9) and a suction and/or irrigation tool can be mounted to a robotic arm. The suction/irrigation device can include a distal suction port and a sensor. In another instance, a robotic surgical tool, such as an electrosurgical tool, can include a suctioning feature and a suction port on the end effector of the tool.


Referring to FIG. 31, when a suction port on an end effector 13210 is moved into contact with a fluid, a processor of the robotic surgical system can automatically activate the suction feature. For example, a fluid detection sensor 13230 on the tool 13200 can detect fluid 13220 in the proximity of the tool 13200 and/or contacting the tool 13200. The fluid detection sensor 13230 can be a continuity sensor, for example. The fluid detection sensor 13230 can be in signal communication with the processor such that the processor is configured to receive input and/or feedback from the fluid detection sensor 13230. In certain instances, the suctioning feature can be automatically activated when the suction port is moved into proximity with a fluid 13220. For example, when the suction port moves within a predefined spatial range of a fluid 13220, the suction feature can be activated by the processor. The fluid 13220 can be saline, for example, which can be provided to the surgical site to enhance conductivity and/or irrigate the tissue.


In various instances, the tool can be a smoke evacuation tool and/or can include a smoke evacuation system, for example. A detail view of an end effector 13210 of a bipolar radio-frequency surgical tool 13200 is shown in FIG. 31. The end effector 13210 is shown in a clamped configuration. Moreover, smoke and steam 13220 from an RF weld accumulate around the end effector 13210. In various instances, to improve visibility and efficiency of the tool 13200, the smoke and steam 13220 at the surgical site can be evacuated along a smoke evacuation channel 13240 extending proximally from the end effector. The evacuation channel 13240 can extend through the shaft 13205 of the surgical tool 13200 to the interface of the surgical tool 13200 and the robot. The evacuation channel 13240 can be coupled to a pump for drawing the smoke and/or steam 13220 along the smoke evacuation channel 13240 within the shaft 13205 of the surgical tool 13200. In various instances, the surgical tool 13200 can include insufflation, cooling, and/or irrigation capabilities, as well.


In one instance, the intensity of the suction pressure can be automatically adjusted based on a measured parameter from one or more surgical devices. In such instances, the suction pressure can vary depending on the sensed parameters. Suction tubing can include a sensor for detecting the volume of fluid being extracted from the surgical site. When increased volumes of fluid are being extracted, the power to the suction feature can be increased such that the suctioning pressure is increased. Similarly, when decreased volumes of fluid are being extracted, the power to the suction feature can be decreased such that the suctioning pressure is decreased.


In various instances, the sensing system for a suction tool can include a pressure sensor. The pressure sensor can detect when an occlusion is obstructing, or partially obstructing, the fluid flow. The pressure sensor can also detect when the suction port is moved into abutting contact with tissue. In such instances, the processor can reduce and/or pause the suctioning force to release the tissue and/or clear the obstruction. In various instances, the processor can compare the detected pressure to a threshold maximum pressure. Exceeding the maximum threshold pressure may lead to unintentional tissue trauma from the suctioning tool. Thus, to avoid such trauma, the processor can reduce and/or pause the suctioning force to protect the integrity of tissue in the vicinity thereof.
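

The suction behaviors described in the preceding paragraphs (automatic activation on fluid detection, volume-proportional suctioning power, and a maximum-pressure guard) can be combined into one hypothetical control step. All names and constants below are illustrative assumptions, not disclosed values:

    MAX_SUCTION_PRESSURE_KPA = 40.0   # hypothetical threshold maximum pressure
    BASE_POWER = 0.2                  # hypothetical minimum suction power (0..1)
    POWER_PER_ML_S = 0.05             # hypothetical gain per mL/s of fluid flow

    def suction_control_step(fluid_detected: bool, flow_ml_s: float,
                             line_pressure_kpa: float) -> float:
        """Return a suction power command in the range [0, 1]."""
        if not fluid_detected:
            return 0.0  # no fluid in proximity of the suction port
        if line_pressure_kpa > MAX_SUCTION_PRESSURE_KPA:
            # Occlusion or abutting tissue contact: pause suction to
            # release the tissue and protect it from trauma.
            return 0.0
        # Scale power with the volume of fluid being extracted.
        return min(1.0, BASE_POWER + POWER_PER_ML_S * flow_ml_s)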


A user can manually override the automatic adjustments implemented in the automatic activation mode(s) described herein. The manual override can be a one-time adjustment to the surgical tool. In other instances, the manual override can be a setting that turns off the automatic activation mode for a specific surgical action, a specific duration, and/or a global override for the entire procedure.


In one aspect, the robotic surgical system includes a processor and a memory communicatively coupled to the processor, as described herein. The processor is communicatively coupled to a sensor system, and the memory stores instructions executable by the processor to determine a use of a robotic tool based on input from the sensor system and to automatically energize an energy delivery surface of the robotic tool when the use is determined, as described herein.


In various aspects, the present disclosure provides a control circuit to automatically energize an energy delivery surface, as described herein. In various aspects, the present disclosure provides a non-transitory computer readable medium storing computer readable instructions which, when executed, cause a machine to automatically energize an energy delivery surface of a robotic tool, as described herein.


In one aspect, the robotic surgical system includes a processor and a memory communicatively coupled to the processor, as described herein. The processor is communicatively coupled to a fluid detection sensor, and the memory stores instructions executable by the processor to receive input from the fluid detection sensor and to automatically activate a suctioning mode when fluid is detected, as described herein.


In various aspects, the present disclosure provides a control circuit to automatically activate a suctioning mode, as described herein. In various aspects, the present disclosure provides a non-transitory computer readable medium storing computer readable instructions which, when executed, cause a machine to automatically activate a suctioning mode, as described herein.


Multiple surgical devices, including a robotic surgical system and various handheld instruments, can be used by a clinician during a particular surgical procedure. When manipulating one or more robotic tools of the robotic surgical system, a clinician is often positioned at a surgeon's command console or module, which is also referred to as a remote control console. In various instances, the remote control console is positioned outside of a sterile field and, thus, can be remote to the sterile field and, in some instances, remote to the patient and even to the operating room. If the clinician desires to use a handheld instrument, the clinician may be required to step away from the remote control console. At this point, the clinician may be unable to control the robotic tools. For example, the clinician may be unable to adjust the position or utilize the functionality of the robotic tools. Upon stepping away from the remote control console, the clinician may also lose sight of one or more displays on the robotic surgical system. The separation between the control points for the handheld instruments and the robotic surgical system may inhibit the effectiveness with which the clinician can utilize the surgical devices, both robotic tools and surgical instruments, together.


In various instances, an interactive secondary display is configured to be in signal communication with the robotic surgical system. The interactive secondary display includes a control module in various instances. Moreover, the interactive secondary display is configured to be wireless and movable around an operating room. In various instances, the interactive secondary display is positioned within a sterile field. In one instance, the interactive secondary display allows the clinician to manipulate and control the one or more robotic tools of the robotic surgical system without having to be physically present at the remote control console. In one instance, the ability for the clinician to operate the robotic surgical system away from the remote control console allows multiple devices to be used in a synchronized manner. As a safety measure, in certain instances, the remote control console includes an override function configured to prohibit control of the robotic tools by the interactive secondary display.



FIG. 32 depicts a surgical system 13100 for use during a surgical procedure that utilizes a surgical instrument 13140 and a robotic surgical system 13110. The surgical instrument 13140 is a powered handheld instrument. The surgical instrument 13140 can be a radio frequency (RF) instrument, an ultrasonic instrument, a surgical stapler, and/or a combination thereof, for example. The surgical instrument 13140 includes a display 13142 and a processor 13144. In certain instances, the handheld surgical instrument 13140 can be a smart or intelligent surgical instrument having a plurality of sensors and a wireless communication module.


The robotic surgical system 13110 includes a robot 13112 including at least one robotic tool 13117 configured to perform a particular surgical function. The robotic surgical system 13110 is similar in many respects to the robotic surgical system 13000 discussed herein. The robotic tool 13117 is movable in a space defined by a control envelope of the robotic surgical system 13110. In various instances, the robotic tool 13117 is controlled by various clinician inputs at a remote control console 13116. In such instances, a clinician applying an input at the remote control console 13116 is away from the patient's body and outside of a sterile field 13138. Clinician input to the remote control console 13116 is communicated to a robotic control unit 13114 that includes a robot display 13113 and a processor 13115. The processor 13115 directs the robotic tool(s) 13117 to perform the desired function(s).


In various instances, the surgical system 13100 includes a surgical hub 13120, which is similar in many respects to the hub 106, the hub 206, the robotic hub 122, or the robotic hub 222, for example. The surgical hub 13120 is configured to enhance cooperative and/or coordinated usage of the robotic surgical system 13110 and the surgical instrument(s) 13140. The surgical hub 13120 is in signal communication with the control unit 13114 of the robotic surgical system 13110 and the processor 13144 of the surgical instrument(s) 13140. In various instances, a signal is transmitted through a wireless connection, although any suitable connection can be used to facilitate the communication. The control unit 13114 of the robotic surgical system 13110 is configured to send information to the surgical hub 13120 regarding the robotic tool(s) 13117. Such information includes, for example, a position of the robotic tool(s) 13117 within the surgical site, an operating status of the robotic tool(s) 13117, a detected force by the robotic tool(s), and/or the type of robotic tool(s) 13117 attached to the robotic surgical system 13110, although any relevant information and/or operating parameters can be communicated. Examples of surgical hubs are further described herein and in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety.
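

The kind of robot-to-hub status message described above can be pictured as a small structured record. The field names below are invented for illustration and do not reflect an actual message format in the disclosure:

    from dataclasses import dataclass

    @dataclass
    class RoboticToolStatus:
        """Hypothetical status record sent from the robot control unit
        to the surgical hub for one mounted robotic tool."""
        tool_type: str            # e.g. "ultrasonic dissector"
        position_mm: tuple        # (x, y, z) within the surgical site
        operating_status: str     # e.g. "idle", "active", "fault"
        detected_force_n: float   # force detected by the tool

    status = RoboticToolStatus(
        tool_type="ultrasonic dissector",
        position_mm=(12.0, -3.5, 48.2),
        operating_status="active",
        detected_force_n=1.8,
    )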


In other instances, the robotic surgical system 13110 can encompass the surgical hub 13120 and/or the control unit 13114 can be incorporated into the surgical hub 13120. For example, the robotic surgical system 13110 can include a robotic hub including a modular control tower that includes a computer system and a modular communication hub. One or more modules can be installed in the modular control tower of the robotic hub. Examples of robotic hubs are further described herein and in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety.


The processor 13144 of the surgical instrument(s) 13140 is configured to send information to the surgical hub 13120 regarding the surgical instrument 13140. Such information includes, for example, a position of the surgical instrument(s) 13140 within the surgical site, an operating status of the surgical instrument(s) 13140, a detected force by the surgical instrument(s) 13140, and/or identification information regarding the surgical instrument(s) 13140, although any relevant information and/or operating parameters can be sent to the surgical hub.


In various instances, a hub display 13125 is in signal communication with the surgical hub 13120 and may be incorporated into the modular control tower, for example. The hub display 13125 is configured to display information received from the robotic surgical system 13110 and the surgical instrument(s) 13140. The hub display 13125 can be similar in many respects to the visualization system 108 (FIG. 1), for example. In one aspect, the hub display 13125 can include an array of displays such as video monitors and/or heads-up displays around the operating room, for example.


In various instances, the surgical hub 13120 is configured to recognize when the surgical instrument 13140 is activated by a clinician via wireless communication signal(s). Upon activation, the surgical instrument 13140 is configured to send identification information to the surgical hub 13120. Such identification information may include, for example, a model number of the surgical instrument, an operating status of the surgical instrument, and/or a location of the surgical instrument, although other suitable device parameters can be communicated. In various instances, the surgical hub 13120 is configured to utilize the communicated information to assess the compatibility of the surgical instrument 13140 with the capabilities of the surgical hub 13120. Examples of capabilities of the surgical hub with compatible surgical instruments are further discussed herein.
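

The pairing-and-compatibility assessment described above can be sketched as a lookup of an instrument's identification against the hub's known capabilities. The model numbers and capability table below are hypothetical placeholders:

    HUB_CAPABILITIES = {
        # Hypothetical mapping of supported instrument model numbers
        # to the hub features available to them.
        "CS-100": {"display_mirroring", "force_feedback_overlay"},
        "US-200": {"display_mirroring", "smoke_evacuation_coordination"},
    }

    def assess_compatibility(model_number: str) -> set:
        """Return the hub capabilities available to an instrument that
        announced itself upon activation; empty if unrecognized."""
        return HUB_CAPABILITIES.get(model_number, set())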


In various instances, the control unit 13114 of the robotic surgical system 13110 is configured to communicate a video feed to the surgical hub 13120. The surgical hub 13120, in turn, is configured to communicate the information, or a portion thereof, to the surgical instrument 13140, which can replicate a portion of the robot display 13113, or other information from the robotic surgical system 13110, on a display 13142 of the surgical instrument 13140. In other instances, the robotic surgical system 13110 (e.g. the control unit 13114 or the surgical tool 13117) can communicate directly with the surgical instrument 13140, such as when the robotic surgical system 13110 includes a robotic hub and/or the surgical tool 13117 includes a wireless communication module, for example. The reproduction of a portion of the robot display 13113 on the surgical instrument 13140 allows the clinician to cooperatively use both surgical devices by providing, for example, alignment data to achieve integrated positioning of the surgical instrument 13140 relative to the robotic tool(s) 13117. In various instances, the clinician is able to remove any unwanted information displayed on the display 13142 of the surgical instrument 13140.


Referring still to FIG. 32, in various instances, the surgical system 13100 further includes an interactive secondary display 13130 within the sterile field 13138. The interactive secondary display 13130 is also a local control module within the sterile field 13138. The remote control console 13116, or the primary control, can be positioned outside the sterile field 13138. For example, the interactive secondary display 13130 can be a handheld mobile electronic device, such as an iPad® tablet, which can be placed on a patient or the patient's table during a surgical procedure. For example, the interactive secondary display 13130 can be placed on the abdomen or leg of the patient during the surgical procedure. In other instances, the interactive secondary display 13130 can be incorporated into the surgical instrument 13140 within the sterile field 13138. In various instances, the interactive secondary display 13130 is configured to be in signal communication with the robotic surgical system 13110 and/or the surgical instrument 13140. In such instances, the interactive secondary display 13130 is configured to display information received from the robotic tool(s) 13117 (for example, robotic tool 1, robotic tool 2, . . . robotic tool n) and the surgical instruments 13140 (for example, surgical instrument 1, surgical instrument 2, . . . surgical instrument n). The interactive secondary display 13130 depicts tool information 13133 and instrument information 13135 thereon. In various instances, the user is able to interact with the interactive secondary display 13130 to customize the size and/or location of the information displayed.


Referring still to FIG. 32, in various instances, the surgical hub 13120 is configured to transmit robot status information of the robotic surgical system 13110 to the surgical instrument 13140, and the surgical instrument 13140 is configured to display the robot status information on the display 13142 of the surgical instrument 13140.


In various instances, the display 13142 of the surgical instrument 13140 is configured to communicate commands through the surgical hub 13120 to the control unit 13114 of the robotic surgical system 13110. After viewing and interpreting the robot status information displayed on the display 13142 of the surgical instrument 13140 as described herein, a clinician may want to utilize one or more functions of the robotic surgical system 13110. Using the buttons and/or a touch-sensitive display 13142 on the surgical instrument 13140, the clinician is able to input a desired utilization of and/or adjustment to the robotic surgical system 13110. The clinician input is communicated from the surgical instrument 13140 to the surgical hub 13120. The surgical hub 13120 is then configured to communicate the clinician input to the control unit 13114 of the robotic surgical system 13110 for implementation of the desired function. In other instances, the handheld surgical instrument 13140 can communicate directly with the control unit 13114 of the robotic surgical system 13110, such as when the robotic surgical system 13110 includes a robotic hub, for example.


In various instances, the surgical hub 13120 is in signal communication with both the robotic surgical system 13110 and the surgical instrument 13140, allowing the surgical system 13100 to adjust multiple surgical devices in a synchronized, coordinated, and/or cooperative manner. The information communicated between the surgical hub 13120 and the various surgical devices includes, for example, surgical instrument identification information and/or the operating status of the various surgical devices. In various instances, the surgical hub 13120 is configured to detect when the surgical instrument 13140 is activated. In one instance, the surgical instrument 13140 is an ultrasonic dissector. Upon activation of the ultrasonic dissector, the surgical hub 13120 is configured to communicate the received activation information to the control unit 13114 of the robotic surgical system 13110.


In various instances, the surgical hub 13120 automatically communicates the information to the control unit 13114 of the robotic surgical system 13110. The reader will appreciate that the information can be communicated at any suitable time, rate, interval, and/or schedule. Based on the information received from the surgical hub 13120, the control unit 13114 of the robotic surgical system 13110 is configured to decide whether to activate at least one robotic tool 13117 and/or activate a particular operating mode, such as a smoke evacuation mode, for example. For example, upon activation of a surgical tool that is known to generate, or possibly generate, smoke and/or contaminants at the surgical site, such as an ultrasonic dissector, the robotic surgical system 13110 can automatically activate the smoke evacuation mode or can cue the surgeon to activate the smoke evacuation mode. In various instances, the surgical hub 13120 is configured to continuously communicate additional information to the control unit 13114 of the robotic surgical system 13110, such as various sensed tissue conditions, in order to adjust, continue, and/or suspend further movement of the robotic tool 13117 and/or the entered operating mode.


In various instances, the surgical hub 13120 may calculate parameters, such as smoke generation intensity, for example, based on the additional information communicated from the surgical instrument 13140. Upon communicating the calculated parameter to the control unit 13114 of the robotic surgical system 13110, the control unit 13114 is configured to move at least one robotic tool and/or adjust the operating mode to account for the calculated parameter. For example, when the robotic surgical system 13110 enters the smoke evacuation mode, the control unit 13114 is configured to adjust a smoke evacuation motor speed to be proportionate to the calculated smoke generation intensity.
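

The proportional behavior described above reduces to scaling the evacuation motor speed with the calculated smoke generation intensity. A one-function sketch follows, with an invented maximum speed and an assumed normalized intensity scale:

    MAX_EVAC_MOTOR_RPM = 6000.0   # hypothetical maximum evacuation motor speed

    def smoke_evac_motor_speed(smoke_intensity: float) -> float:
        """Map a calculated smoke generation intensity in [0, 1] to a
        motor speed proportionate to that intensity."""
        intensity = max(0.0, min(1.0, smoke_intensity))
        return intensity * MAX_EVAC_MOTOR_RPM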


In certain instances, an ultrasonic tool mounted to the robot 13112 can include a smoke evacuation feature that can be activated by the control unit 13114 to operate in a smoke evacuation mode. In other instances, a separate smoke evacuation device can be utilized. For example, a smoke evacuation tool can be mounted to another robotic arm and utilized during the surgical procedure. In still other instances, a smoke evacuation instrument that is separate from the robotic surgical system 13110 can be utilized. The surgical hub 13120 can coordinate communication between the robotically-controlled ultrasonic tool and the smoke evacuation instrument, for example.


In FIGS. 33-36, various surgical devices and components thereof are described with reference to a colon resection procedure. The reader will appreciate that the surgical devices, systems, and procedures described with respect to those figures are an exemplary application of the system of FIG. 32. Referring now to FIG. 33, a handle portion 13302 of a handheld surgical instrument 13300 is depicted. In certain aspects, the handheld surgical instrument 13300 corresponds to the surgical instrument 13140 of the surgical system 13100 in FIG. 32. In one instance, the handheld surgical instrument 13300 is a powered circular stapler and includes a display 13310 on the handle portion 13302 thereof.


Before pairing the handheld surgical instrument 13300 to a robotic surgical system (e.g. the robotic surgical system 13110 in FIG. 32) via the surgical hub 13320 (FIG. 34), as described herein, the display 13310 on the handle 13302 of the handheld surgical instrument 13300 can include information regarding the status of the instrument 13300, such as the clamping load 13212, the anvil status 13214, and/or the instrument or cartridge status 13216. In various instances, the display 13310 of the handheld surgical instrument 13300 includes an alert 13318 to the user that communicates the status of the firing system. In various instances, the display 13310 is configured to display the information in a manner that communicates the most important information to the user. For example, in various instances, the display 13310 is configured to display warning information in a larger size, in a flashing manner, and/or in a different color. When the handheld surgical instrument 13300 is not paired with a surgical hub, the display 13310 can depict information gathered only from the handheld surgical instrument 13300 itself.


Referring now to FIG. 34, after pairing the handheld surgical instrument 13300 with the surgical hub 13320, as described herein with respect to FIG. 32, for example, the information detected and displayed by the handheld surgical instrument 13300 can be communicated to the surgical hub 13320 and displayed on a hub display (e.g. the hub display 13125 of FIG. 32). Additionally or alternatively, the information can be displayed on the display of the robotic surgical system and/or on the display 13310 on the handle portion 13302 of the handheld surgical instrument 13300. In various instances, a clinician can decide what information is displayed at one or multiple locations. As mentioned above, in various instances, the clinician is able to remove any unwanted information displayed on the display 13310 of the handheld surgical instrument 13300, the display of the robotic surgical system, and/or the hub display.


Referring still to FIG. 34, after pairing the handheld surgical instrument 13300 with the robotic surgical system, the display 13310 on the handle portion 13302 of the handheld surgical instrument 13300 can differ from the display 13310 presented before pairing with the robotic surgical system. For example, procedural information from the surgical hub 13320 and/or the robotic surgical system can be displayed on the powered circular stapler. For example, as seen in FIG. 34, robot status information, including alignment information 13312 from the surgical hub 13320 and one or more retraction tensions 13316, 13317 exerted by a robotic tool on particular tissue(s), is displayed on the display 13310 of the handheld surgical instrument 13300 for the convenience of the clinician. In various instances, the display 13310 of the handheld surgical instrument 13300 includes an alert 13318 to the user that communicates a parameter monitored by the surgical hub 13320 during a surgical procedure. In various instances, the display 13310 is configured to display the information in a manner that communicates the most important information to the user, for example, by displaying warning information in a larger size, in a flashing manner, and/or in a different color.


Referring still to FIG. 34, the display 13310 of the handheld surgical instrument 13300 is configured to display information regarding one or more retraction tensions 13316, 13317 exerted by one or more devices during a surgical procedure involving one or more robotic tools. For example, the handheld surgical instrument 13300, the powered circular stapler, is involved in the colon resection procedure of FIG. 35. In this procedure, one device (e.g. a robotic tool) is configured to grasp colonic tissue and another device (e.g. the handheld circular stapler) is configured to grasp rectal tissue. As the devices move apart from one another, the force of retracting the colonic tissue FRC and the force of retracting the rectal tissue FRR are monitored. In the illustrated example, an alert notification 13318 is issued to the user because the force of retracting the colonic tissue has exceeded a predetermined threshold. Predetermined thresholds for both retracting forces FRC, FRR are indicated by horizontal dotted lines on the display 13310. The user is notified when one or both thresholds are surpassed and/or reached in an effort to minimize damage and/or trauma to the surrounding tissue.


In FIG. 36, graphical displays 13330, 13340 of the retracting forces FRC, FRR are illustrated. In the circumstances illustrated, the user is notified when a predetermined threshold is exceeded, as depicted by the shaded region 13332 of the graphical display 13330, which indicates that the retracting force on the colonic tissue FRC has exceeded a predetermined threshold of 0.5 lbs.
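

The dual-threshold monitoring shown in FIGS. 34-36 can be sketched as a simple check over both retraction forces. The function and alert mechanism below are illustrative assumptions; only the 0.5 lb thresholds come from the figures:

    RETRACTION_LIMITS_LB = {"F_RC": 0.5, "F_RR": 0.5}  # colonic / rectal limits

    def check_retraction_forces(forces_lb: dict) -> list:
        """Return alert messages for any retracting force that has reached
        or exceeded its predetermined threshold (cf. alert 13318)."""
        alerts = []
        for name, limit in RETRACTION_LIMITS_LB.items():
            if forces_lb.get(name, 0.0) >= limit:
                alerts.append(f"{name} at {forces_lb[name]:.2f} lb "
                              f"meets or exceeds threshold of {limit:.1f} lb")
        return alerts

    # Example: the colonic retraction force has crossed its threshold.
    print(check_retraction_forces({"F_RC": 0.62, "F_RR": 0.31}))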


In certain instances, it can be difficult to align the end effector of a circular stapler with targeted tissue during a colorectal procedure because of visibility limitations. For example, referring again to FIG. 35, during a colon resection, the surgical instrument 13300, a circular stapler, can be positioned adjacent to a transected rectum 13356. Moreover, the anvil 13301 of the surgical instrument 13300 can be engaged with a transected colon 13355. A robotic tool 133175 is configured to engage the anvil 13301 and apply the retracting force FRC. It can be difficult to confirm the position of the surgical instrument 13300 relative to the targeted tissue, for example, relative to the staple line through the transected colon 13355. In certain instances, information from the surgical hub 13320 and the robotic surgical system can facilitate the alignment. For example, as shown in FIG. 34, the center of the surgical instrument 13300 can be shown relative to the center of the targeted tissue 13318 on the display screen 13310 of the surgical instrument 13300. In certain instances, and as shown in FIG. 35, sensors and a wireless transmitter on the surgical instrument 13300 can be configured to convey positioning information to the surgical hub 13320, for example.


A colorectal procedure, visibility limitations thereof, and an alignment tool for a surgical hub are further described herein and in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety.


As mentioned above, the display 13310 on the handheld instrument 13300 can also be configured to alert the clinician in certain scenarios. For example, the display 13310 in FIG. 34 includes an alert 13318 because one or more of the forces exceed the predefined force thresholds. Referring again to FIGS. 35 and 36, during the colon resection, the robotic arm can exert a first force FRC on the anvil, and the handheld instrument 13300 can exert a second force FRR on the rectum 13356. The tension on the rectum 13356 by the circular stapler can be capped at a first limit (for example, 0.5 lb in FIG. 36), and the tension on the colon 13355 from the robotic arm can be capped at a second limit (for example, 0.5 lb in FIG. 36). An intervention may be suggested to the clinician when the tension on the rectum 13356 or the colon 13355 exceeds a threshold value.


The tension on the colon FRC in FIGS. 35 and 36 can be ascertained from resistance to the robotic arm and, thus, can be determined by a control unit (e.g. the control unit 13114 of the robotic surgical system 13110). Such information can be communicated to the handheld surgical instrument 13300 and displayed on the display 13310 thereof in the sterile field such that the information is readily available to the appropriate clinician in real-time, near real-time, or at any suitable interval, rate, and/or schedule, for example.


In various instances, a surgical system, such as the surgical system 13360 of FIGS. 37 and 38, includes interactive secondary displays 13362, 13364 within the sterile field. The interactive secondary displays 13362, 13364 are also mobile control modules in certain instances and can be similar to the interactive secondary display 13130 in FIG. 32, for example. A surgeon's command console, or remote control module, 13370 is the primary control module and can be positioned outside the sterile field. In one instance, the interactive secondary display 13362 can be a mobile device, a watch, and/or a small tablet, which can be worn on the wrist and/or forearm of the user, and the interactive secondary display 13364 can be a handheld mobile electronic device, such as an iPad® tablet, which can be placed on a patient 13361 or the patient's table during a surgical procedure. For example, the interactive secondary displays 13362, 13364 can be placed on the abdomen or leg of the patient 13361 during the surgical procedure. In other instances, the interactive secondary displays 13362, 13364 can be incorporated into a handheld surgical instrument 13366 within the sterile field.


In one instance, the surgical system 13360 is shown during a surgical procedure. For example, the surgical procedure can be the colon resection procedure described herein with respect to FIGS. 33-36. In such instances, the surgical system 13360 includes a robot 13372 and a robotic tool 13374 extending into the surgical site. The robotic tool can be an ultrasonic device comprising an ultrasonic blade and a clamp arm, for example. The surgical system 13360 also includes the remote command console 13370 that encompasses a robotic hub 13380. The control unit for the robot 13372 is housed in the robotic hub 13380. A surgeon 13371 is initially positioned at the remote command console 13370. An assistant 13367 holds the handheld surgical instrument 13366, a circular stapler that extends into the surgical site. The assistant 13367 also holds a secondary display 13364 that communicates with the robotic hub 13380. The secondary display 13364 is a mobile digital electronic device, which can be secured to the assistant's forearm, for example. The handheld surgical instrument 13366 includes a wireless communication module. A second surgical hub 13382 is also stationed in the operating room. The surgical hub 13382 includes a generator module and can include additional modules as further described herein and in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety.


Referring primarily to FIG. 37, the hubs 13380, 13382 include wireless communication modules such that a wireless communication link is established between the two hubs 13380, 13382. Additionally, the robotic hub 13380 is in signal communication with the interactive secondary displays 13362, 13364 within the sterile field. The hub 13382 is in signal communication with the handheld surgical instrument 13366. If the surgeon 13371 moves toward the patient 13361 and into the sterile field (as indicated by the reference character 13371′), the surgeon 13371 can use one of the interactive secondary displays 13362, 13364 to operate the robot 13372 away from the remote command console 13370. The plurality of secondary displays 13362, 13364 within the sterile field allows the surgeon 13371 to move away from the remote command console 13370 without losing sight of important information for the surgical procedure and controls for the robotic tools utilized therein.


The interactive secondary displays 13362, 13364 permit the clinician to step away from the remote command console 13370 and into the sterile field while maintaining control of the robot 13372. For example, the interactive secondary displays 13362, 13364 allow the clinician to maintain cooperative and/or coordinated control over the powered handheld surgical instrument(s) 13366 and the robotic surgical system at the same time. In various instances, information is communicated between the robotic surgical system, one or more powered handheld surgical instruments 13366, surgical hubs 13380, 13382, and the interactive secondary displays 13362, 13364. Such information may include, for example, the images on the display of the robotic surgical system and/or the powered handheld surgical instruments, a parameter of the robotic surgical system and/or the powered handheld surgical instruments, and/or a control command for the robotic surgical system and/or the powered handheld surgical instruments.


In various instances, the control unit of the robotic surgical system (e.g. the control unit 13114 of the robotic surgical system 13110) is configured to communicate at least one display element from the surgeon's command console (e.g. the console 13116) to an interactive secondary display (e.g. the display 13130). In other words, a portion of the display at the surgeon's console is replicated on the display of the interactive secondary display, integrating the robot display with the interactive secondary display. The replication of the robot display onto the display of the interactive secondary display allows the clinician to step away from the remote command console without losing the visual image that is displayed there. For example, at least one of the interactive secondary displays 13362, 13364 can display information from the robot, such as information from the robot display and/or the surgeon's command console 13370.


In various instances, the interactive secondary displays 13362, 13364 are configured to control and/or adjust at least one operating parameter of the robotic surgical system. Such control can occur automatically and/or in response to a clinician input. By interacting with a touch-sensitive screen and/or buttons on the interactive secondary display(s) 13362, 13364, the clinician is able to input a command to control movement and/or functionality of the one or more robotic tools. For example, when utilizing a handheld surgical instrument 13366, the clinician may want to move the robotic tool 13374 to a different position. To control the robotic tool 13374, the clinician applies an input to the interactive secondary display(s) 13362, 13364, and the respective interactive secondary display(s) 13362, 13364 communicates the clinician input to the control unit of the robotic surgical system in the robotic hub 13380.


In various instances, a clinician positioned at the remote command console 13370 of the robotic surgical system can manually override any robot command initiated by a clinician input on the one or more interactive secondary displays 13362, 13364. For example, when a clinician input is received from the one or more interactive secondary displays 13362, 13364, a clinician positioned at the remote command console 13370 can either allow the command to be issued and the desired function performed, or override the command by interacting with the remote command console 13370 and prohibiting the command from being issued.


In certain instances, a clinician within the sterile field can be required to request permission to control the robot 13372 and/or the robotic tool 13374 mounted thereto. The surgeon 13371 at the remote command console 13370 can grant or deny the clinician's request. For example, the surgeon can receive a pop-up or other notification indicating that permission is being requested by another clinician operating a handheld surgical instrument and/or interacting with an interactive secondary display 13362, 13364.


In various instances, the processor of a robotic surgical system, such as the robotic surgical systems 13000 (FIG. 23), 13400 (FIG. 24), 13150 (FIG. 30), 13100 (FIG. 32), and/or the surgical hub 13380, 13382, for example, is programmed with pre-approved functions of the robotic surgical system. For example, if a clinician input from the interactive secondary display 13362, 13364 corresponds to a pre-approved function, the robotic surgical system allows the interactive secondary display 13362, 13364 to control the robotic surgical system and/or does not prohibit the interactive secondary display 13362, 13364 from controlling the robotic surgical system. If a clinician input from the interactive secondary display 13362, 13364 does not correspond to a pre-approved function, the interactive secondary display 13362, 13364 is unable to command the robotic surgical system to perform the desired function. In one instance, a situational awareness module in the robotic hub 13380 and/or the surgical hub 13382 is configured to dictate and/or influence when the interactive secondary display can issue control motions to the robotic surgical system.


In various instances, an interactive secondary display 13362, 13364 has control over a portion of the robotic surgical system upon making contact with that portion of the robotic surgical system. For example, when the interactive secondary display 13362, 13364 is brought into contact with the robotic tool 13374, control of the contacted robotic tool 13374 is granted to the interactive secondary display 13362, 13364. A clinician can then utilize a touch-sensitive screen and/or buttons on the interactive secondary display 13362, 13364 to input a command to control movement and/or functionality of the contacted robotic tool 13374. This control scheme allows a clinician to reposition a robotic arm, reload a robotic tool, and/or otherwise reconfigure the robotic surgical system. In a similar manner as discussed above, the clinician 13371 positioned at the remote command console 13370 of the robotic surgical system can manually override any robot command initiated by the interactive secondary display 13362, 13364.


In one aspect, the robotic surgical system includes a processor and a memory communicatively coupled to the processor, as described herein. The memory stores instructions executable by the processor to receive a first user input from a console and to receive a second user input from a mobile wireless control module for controlling a function of a robotic surgical tool, as described herein.


In various aspects, the present disclosure provides a control circuit to receive a first user input from a console and to receive a second user input from a mobile wireless control module for controlling a function of a robotic surgical tool, as described herein. In various aspects, the present disclosure provides a non-transitory computer readable medium storing computer readable instructions which, when executed, cause a machine to receive a first user input from a console and to receive a second user input from a mobile wireless control module for controlling a function of a robotic surgical tool, as described herein.


A robotic surgical system may include multiple robotic arms that are configured to assist the clinician during a surgical procedure. Each robotic arm may be operable independently of the others. A lack of communication may exist between the robotic arms as they are independently operated, which may increase the risk of tissue trauma. For example, tissue trauma can result when one robotic arm applies a force that is stronger than, and directed differently from, the force applied by a second robotic arm. Tissue trauma and/or tearing may occur, for example, when a first robotic arm applies a strong retracting force to the tissue while a second robotic arm rigidly holds the tissue in place.


In various instances, one or more sensors are attached to each robotic arm of a robotic surgical system. The one or more sensors are configured to sense a force applied to the surrounding tissue during the operation of the robotic arm. Such forces can include, for example, a holding force, a retracting force, and/or a dragging force. The sensor from each robotic arm is configured to communicate the magnitude and direction of the detected force to a control unit of the robotic surgical system. The control unit is configured to analyze the communicated forces and set limits for maximum loads to avoid causing trauma to the tissue in a surgical site. For example, the control unit may minimize the holding force applied by a first robotic arm if the retracting or dragging force applied by a second robotic arm increases.
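By way of illustration only, the cooperative load-limiting behavior described above can be sketched in a few lines of Python. The fragment below is a minimal, hypothetical model along a single axis; the class, function, and parameter names (e.g. `max_net_load`) are illustrative assumptions and do not appear in the disclosure.

```python
from dataclasses import dataclass

@dataclass
class ArmForceReport:
    """Force reported by an arm's sensor: a signed magnitude along a shared axis (N)."""
    arm_id: str
    force: float  # opposite signs indicate opposing directions

def balance_holding_force(holding: ArmForceReport,
                          retracting: ArmForceReport,
                          max_net_load: float) -> float:
    """Return an adjusted holding-force command.

    If the combined opposing load on the tissue would exceed `max_net_load`,
    the holding force is reduced so that the net load stays at the limit,
    mirroring the reduce-the-hold-as-the-retraction-grows behavior above.
    """
    net_load = abs(holding.force) + abs(retracting.force)
    if net_load <= max_net_load:
        return holding.force  # within limits; no adjustment needed
    # Reduce only the holding force; the retracting arm keeps its command.
    reduced = max(0.0, max_net_load - abs(retracting.force))
    return reduced if holding.force >= 0 else -reduced

# Example: the retraction grows from 4 N to 9 N while the hold command is 5 N.
hold = ArmForceReport("arm1", 5.0)
for drag in (4.0, 7.0, 9.0):
    cmd = balance_holding_force(hold, ArmForceReport("arm2", -drag), max_net_load=10.0)
    print(f"retraction={drag} N -> holding command={cmd} N")
```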



FIG. 39 depicts a robotic surgical system 13800 including a control unit 13820 and a robot 13810. The robotic surgical system 13800 is similar in many respects to the robotic surgical system 13000 including the robot 13002 (FIG. 23), for example. The control unit 13820 includes a processor 13822 and a display 13824. The robot 13810 includes two robotic arms 13830, 13840 configured to carry out various surgical functions. Each of the robotic arms 13830, 13840 is independently operable and is free to move in a space defining a control envelope of the robotic surgical system 13800. The robotic arms 13830, 13840 are configured to receive a tool, such as a stapler, a radio frequency (RF) tool, an ultrasonic blade, graspers, and/or a cutting instrument, for example. Other suitable surgical tools can be used. In various instances, the robotic arms 13830, 13840 each include a different tool configured to perform different functions. In other instances, both robotic arms 13830, 13840 include the same tool, although any suitable arrangement can be used.


The first robotic arm 13830 includes a first driver 13834 and a first motor 13836. When activated by the processor 13822, the first motor 13836 drives the first driver 13834, actuating the corresponding component of the first robotic arm 13830. The second robotic arm 13840 includes a second driver 13844 and a second motor 13846. When activated by the processor 13822, the second motor 13846 drives the second driver 13844, actuating the corresponding component of the second robotic arm 13840.


Each of the robotic arms 13830, 13840 includes a sensor 13832, 13842 in signal communication with the processor 13822 of the control unit 13820. The sensors 13832, 13842 can be positioned on the drivers 13834, 13844, respectively, and/or on the motors 13836, 13846, respectively. In various instances, the sensors 13832, 13842 are configured to detect the location of each individual robotic arm 13830, 13840 within the control envelope of the robotic surgical system 13800. The sensors 13832, 13842 are configured to communicate the detected locations to the processor 13822 of the robotic surgical system 13800. In various instances, the positions of the robotic arms 13830, 13840 are displayed on the display 13824 of the control unit 13820. As described in more detail below, in various instances, the processor 13822 is configured to run an algorithm to implement position limits specific to each robotic arm 13830, 13840 in an effort to avoid tissue trauma and damage to the robotic surgical system 13800, for example. Such position limits may increase the clinician's ability to cooperatively operate numerous robotic arms 13830, 13840 of the robotic surgical system 13800 at the same time.


In various instances, the sensors 13832, 13842 are configured to detect the force exerted by each robotic arm 13830, 13840. The sensors 13832, 13842 can be torque sensors. As stated above, each robotic arm 13830, 13840 of the robotic surgical system 13800 is independently operable. During a particular surgical procedure, a clinician may want to perform different surgical functions with each robotic arm 13830, 13840. Upon detecting the exerted forces of each robotic arm 13830, 13840, each sensor 13832, 13842 is configured to communicate the detected forces to the processor 13822. The processor 13822 is then configured to analyze the communicated information and set maximum and/or minimum force limits for each robotic arm 13830, 13840 to reduce the risk of causing tissue trauma, for example. In addition, the processor 13822 is configured to continuously monitor the exerted forces by each robotic arm 13830, 13840 and, based on the direction and magnitude of the exerted forces, proportionally control each robotic arm 13830, 13840 with respect to one another. For example, the opposing force between two robotic arms 13830, 13840 can be measured and maintained below a maximum force limit. To maintain the opposing force below a maximum force limit, at least one of the forces can be reduced, which can result in displacement of the robotic arm 13830, 13840.


By way of example, FIG. 40 depicts a surgical site and a portion of the surgical system 13800, which includes three robotic arms, including a robotic arm 13850 (a third robotic arm) in addition to the robotic arms 13830 and 13840, which are also schematically depicted in FIG. 39. The first robotic arm 13830 is configured to hold a portion of stomach connective tissue. In order to hold the portion of stomach connective tissue, the first robotic arm 13830 exerts an upward force FH1. The second robotic arm 13840 applies a dragging and/or cutting force FD2 to the tissue. Simultaneously, the third robotic arm 13850 retracts a portion of liver tissue away from the current surgical cut location, further exposing the next surgical cut location. In order to move the portion of liver tissue out of the way of the advancing second robotic arm 13840, the third robotic arm 13850 applies a retracting force FR3 away from the second robotic arm 13840. In various exemplifications, as the second robotic arm 13840 advances further into the surgical site, the control unit of the robotic surgical system directs the third robotic arm 13850 to increase the exerted retracting force FR3 to continue exposing the next surgical cut location. While FIG. 40 depicts a particular surgical procedure and specific robotic arms, any suitable surgical procedure can be performed, and any suitable combination of robotic arms can utilize the control algorithms disclosed herein.



FIG. 41 depicts graphical representations 13852, 13854 of the forces exerted by the robotic arms 13830, 13840, and 13850 of FIG. 40 and the relative locations of the robotic arms 13830, 13840, and 13850, respectively, from the particular surgical procedure detailed above. The graphical display 13852 in FIG. 41 represents the exerted forces of each robotic arm 13830, 13840, and 13850 over a period of time, while the graphical display 13854 represents the relative positions of each robotic arm 13830, 13840, and 13850 over the same period of time. As discussed above, the first robotic arm 13830 is configured to exert a holding force FH1 on a portion of stomach connective tissue. The holding force FH1 is represented by a solid line on the graphs 13852, 13854. The second robotic arm 13840 is configured to exert a dragging and/or cutting force FD2 on the stomach connective tissue. The dragging force FD2 is represented by a dash-dot line on the graphs 13852, 13854. The third robotic arm 13850 is configured to exert a retracting force FR3 on a portion of liver tissue. The retracting force FR3 is represented by a dotted line on the graphs 13852, 13854.


In various instances, the control unit of the robotic surgical system imposes at least one force threshold, such as a maximum force threshold, as depicted in the graphical display 13852. Thus, the third robotic arm 13850 is prevented from exerting a retraction force FR3 greater than the maximum retraction force threshold. Such maximum force limits are imposed in order to avoid tissue trauma and/or avoid damage to the various robotic arms 13830, 13840, and 13850, for example.


Additionally or alternatively, the control unit 13820 of the robotic surgical system 13800 can impose at least one force threshold, such as a minimum force threshold, as depicted in the graphical display 13852. In the depicted instance, the first robotic arm 13830 is prevented from exerting a holding force FH1 less than the minimum holding force threshold. Such minimum force limits are imposed in order to maintain appropriate tissue tension and/or visibility of the surgical site, for example.


In various instances, the control unit 13820 of the robotic surgical system 13800 imposes maximum force differentials detected between various robotic arms during a load control mode. In order to set maximum force differentials, the control unit 13820 of the robotic surgical system is configured to continuously monitor the difference in magnitude and direction of opposing forces exerted by the robotic arms. As stated above, the first robotic arm 13830 is configured to hold a portion of the stomach connective tissue by exerting a holding force FH1. The second robotic arm 13840 is configured to apply a dragging force FD2, which opposes the holding force FH1 exerted by the first robotic arm 13830. In various instances, maximum force differentials prevent inadvertently overloading and/or damaging an object caught between the robotic arms 13830, 13840, and 13850. Such objects include, for example, surrounding tissue and/or surgical components like clasps, gastric bands, and/or sphincter reinforcing devices. Fmax opposing represents the maximum force differential set by the control unit 13820 in this particular exemplification.


As can be seen in the graphical display 13852, the holding force FH1 and the dragging force FD2 both increase in magnitude at the beginning of the surgical procedure. Such an increase in magnitudes can indicate a pulling of the tissue. The holding force FH1 and the dragging force FD2 increase in opposite directions to a point where the difference between the opposing forces is equal to Fmax opposing. In the graphical display 13852, the slanted lines highlight the point in time when Fmax opposing is reached. Upon reaching Fmax opposing, the processor 13822 instructs the first robotic arm 13830 to reduce the holding force FH1 and continues to allow the second robotic arm 13840 to exert the dragging force FD2 at the same value, and may allow a clinician to increase the dragging force. In various instances, the value of Fmax opposing is set by the processor 13822 based on various variables, such as the type of surgery and/or relevant patient demographics. In various instances, Fmax opposing is a default value stored in a memory of the processor 13822.
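The Fmax opposing logic lends itself to a short sketch. In the hypothetical fragment below, forces are signed scalars along a common axis, so opposing forces carry opposite signs and the differential is their absolute difference; when the limit is reached, the holding force is reduced while the dragging force is left unchanged, mirroring the response described for FIG. 41. The function name and numeric values are illustrative assumptions.

```python
def enforce_opposing_limit(f_hold: float, f_drag: float,
                           f_max_opposing: float) -> tuple:
    """Cap the opposing-force differential between two arms (a sketch).

    `f_hold` and `f_drag` are signed forces along a common axis; opposing
    forces have opposite signs, so the differential is abs(f_hold - f_drag).
    On reaching the cap, only the holding force is reduced.
    """
    differential = abs(f_hold - f_drag)
    if differential <= f_max_opposing:
        return f_hold, f_drag  # within the allowed differential
    # Shrink the holding force just enough to bring the differential to the cap.
    excess = differential - f_max_opposing
    sign = 1.0 if f_hold >= 0 else -1.0
    return sign * max(0.0, abs(f_hold) - excess), f_drag

print(enforce_opposing_limit(6.0, -5.0, f_max_opposing=10.0))  # (5.0, -5.0)
print(enforce_opposing_limit(3.0, -5.0, f_max_opposing=10.0))  # (3.0, -5.0), unchanged
```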


The relative positions of the robotic arms 13830, 13840, and 13850 within the surgical site are depicted in the graphical display 13854 of FIG. 41. As the first robotic arm 13830 exerts a holding force FH1 on the stomach connective tissue and the third robotic arm 13850 exerts a retracting force FR3 on the liver tissue, the surgical site becomes clear and allows the second robotic arm 13840 to exert a dragging and/or cutting force FD2 on the desired tissue. The second robotic arm 13840 and the third robotic arm 13850 move farther away from the first robotic arm 13830 as the procedure progresses. When the force differential Fmax opposing is reached between the holding force FH1 and the dragging force FD2, the first robotic arm 13830 is moved closer toward the second robotic arm 13840, lessening the holding force FH1 exerted by the first robotic arm 13830. In one aspect, the processor 13822 can transition the first robotic arm 13830 from the load control mode into a position control mode such that the position of the first robotic arm 13830 is held constant. As depicted in the graphical representations of FIG. 41, when the first robotic arm 13830 is held in a constant position, the force control for the second robotic arm 13840 can continue to displace the second robotic arm 13840.


In various instances, the control unit 13820 of the robotic surgical system directs the first robotic arm 13830 to hold a specific position until a pre-determined force threshold between the first robotic arm 13830 and a second robotic arm 13840 is reached. When the pre-determined force threshold is reached, the first robotic arm 13830 is configured to automatically move along with the second robotic arm 13840 in order to maintain the pre-determined force threshold. The first robotic arm 13830 stops moving (or may move at a different rate) when the detected force of the second robotic arm 13840 no longer maintains the pre-determined force threshold.
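One possible rendering of this hold-then-follow behavior is sketched below, assuming a simple proportional relationship between the excess force and the commanded displacement; the disclosure does not specify a control law, so the `gain` term is purely an assumption.

```python
def follow_step(pos1: float, pos2: float, force_between: float,
                threshold: float, gain: float = 0.1) -> float:
    """One control-loop step of the hold-then-follow behavior (a sketch).

    The first arm holds `pos1` until the measured inter-arm force reaches
    `threshold`; it then steps toward the second arm so that the force can
    settle back at the threshold. The proportional `gain` is assumed.
    """
    if force_between < threshold:
        return pos1  # below the threshold: hold the commanded position
    direction = 1.0 if pos2 > pos1 else -1.0
    return pos1 + direction * gain * (force_between - threshold)

print(follow_step(0.0, 1.0, force_between=4.0, threshold=5.0))  # 0.0 (holds)
print(follow_step(0.0, 1.0, force_between=8.0, threshold=5.0))  # 0.3 (follows)
```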


In various instances, the control unit 13820 of the robotic surgical system is configured to alternate between the position control mode and the load control mode in response to conditions detected by the robotic arms 13830, 13840, and 13850. For example, when the first robotic arm 13830 and the second robotic arm 13840 of the robotic surgical system 13800 are freely moving throughout a surgical site, the control unit 13820 may impose a maximum force that each arm 13830, 13840 can exert. In various instances, the first and second arms 13830, 13840 each include a sensor configured to detect resistance. In other instances, the sensors can be positioned on a surgical tool, such as an intelligent surgical stapler or jawed tool. A resistance can be encountered upon contact with tissue and/or other surgical instruments. When such resistance is detected, the control unit 13820 may activate the load control mode and lower the force exerted by one or more of the robotic arms 13830, 13840 to, for example, reduce damage to the tissue. In various instances, the control unit 13820 may activate the position control mode and move one or more of the robotic arms 13830, 13840 to a position where such resistance is no longer detected.


In one aspect, the processor 13822 of the control unit 13820 is configured to switch from the load control mode to the position control mode upon movement of a surgical tool mounted to one of the robotic arms 13830, 13840 outside a defined surgical space. For example, if one of the robotic arms 13830, 13840 moves out of a defined boundary around the surgical site, or into abutting contact with an organ or other tissue, or too close to another surgical device, the processor 13822 can switch to a position control mode and prevent further movement of the robotic arm 13830, 13840 and/or move the robotic arm 13830, 13840 back within the defined surgical space.
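The mode-selection rule described in the preceding two paragraphs can be condensed into a single decision function. In this sketch, a scalar resistance reading and a boolean boundary check stand in for the multi-dimensional conditions of the disclosure; the enum and function names are illustrative.

```python
from enum import Enum, auto

class ControlMode(Enum):
    LOAD = auto()      # arm commands are force setpoints
    POSITION = auto()  # arm commands are position setpoints

def select_mode(current: ControlMode,
                inside_boundary: bool,
                resistance: float,
                resistance_limit: float) -> ControlMode:
    """Pick the control mode for the next cycle (a sketch).

    Leaving the defined surgical space forces position control so the arm
    can be held or driven back inside the boundary; encountering resistance
    while moving freely switches to load control so exerted forces can be
    capped or lowered.
    """
    if not inside_boundary:
        return ControlMode.POSITION
    if resistance > resistance_limit:
        return ControlMode.LOAD
    return current

# Example: an arm in load control drifts outside the defined surgical space.
print(select_mode(ControlMode.LOAD, inside_boundary=False,
                  resistance=0.0, resistance_limit=2.0))  # ControlMode.POSITION
```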


Turning now to the flow chart shown in FIG. 42, an algorithm 13500 is initiated at step 13501 when the clinician and/or the robotic surgical system activates one or more of the robotic arms at step 13505. The algorithm 13500 can be employed by the robotic surgical system 13800 in FIG. 39, for example. Each robotic arm is in signal communication with the processor 13822 of the robotic surgical system. Following activation, each robotic arm is configured to send information to the processor. In various instances, the information may include, for example, identification of the tool attachment and/or the initial position of the activated robotic arm. In various instances, such information is communicated automatically upon attachment of the tool to the robotic arm, upon activation of the robotic arm by the robotic surgical system, and/or after interrogation of the robotic arm by the processor, although the information may be sent at any suitable time. Furthermore, the information may be sent automatically and/or in response to an interrogation signal.


Based on the information gathered from each of the activated robotic arms at step 13510, the processor is configured to set a position limit for each specific robotic arm within a work envelope of the robotic surgical system at step 13515. The position limit can set three-dimensional boundaries for where each robotic arm can travel. The setting of position limits allows for efficient and cooperative usage of each activated robotic arm while, for example, preventing trauma to surrounding tissue and/or collisions between activated robotic arms. In various instances, the processor includes a memory including a set of stored data to assist in defining each position limit. The stored data can be specific to the particular surgical procedure, the robotic tool attachment, and/or relevant patient demographics, for example. In various instances, the clinician can assist in the definition of the position limit for each activated robotic arm. The processor is configured to determine if the robotic arms are still activated at step 13520. If the processor determines that the robotic arms are no longer activated, the processor is configured to end position monitoring at step 13522. If the processor determines that the robotic arms are still activated, the processor is configured to monitor the position of each activated robotic arm at step 13525.


The processor is then configured to evaluate whether the detected position is within the predefined position limit(s) at step 13530. In instances where information is unable to be gathered from the robotic arm and clinician input is absent, a default position limit is assigned at step 13533. Such a default position limit assigns a conservative three-dimensional boundary to minimize, for example, tissue trauma and/or collisions between robotic arms. If the detected position is within the position limit, the processor is configured to allow the robotic arm(s) to remain in position and/or freely move within the surgical site at step 13535, and the monitoring process continues as long as the robotic arm is still activated. If the detected position is outside of the position limit, the processor is configured to move the robotic arm back into the position limit at step 13532, and the monitoring process continues as long as the robotic arm is still activated.


The processor is configured to continuously monitor the position of each robotic arm at step 13525. In various instances, the processor is configured to repeatedly send interrogation signals in pre-determined time intervals. As discussed above, if the detected position exceeds the position limit set for the specific robotic arm, in certain instances, the processor is configured to automatically move the robotic arm back within the three-dimensional boundary at step 13532. In certain instances, the processor is configured to re-adjust the position limits of the other robotic arms in response to one robotic arm exceeding its original position limit. In certain instances, prior to moving the robotic arm back within its position limit and/or adjusting the position limits of the other robotic arms, the processor is configured to alert the clinician. If the detected position is within the position limit set for the robotic arm, the processor permits the robotic arm to remain in the same position and/or freely travel until the detected position exceeds the position limit at step 13535. If the processor is unable to detect the position of the robotic arm, the processor is configured to alert the clinician and/or assign the robotic arm with the default position limit at step 13533. The processor is configured to monitor the position of each robotic arm until the surgery is completed and/or the robotic arm is deactivated.
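A compact Python rendering of one pass of the algorithm 13500 follows. The callables and the one-dimensional limits are stand-ins for robot-specific services and the three-dimensional boundaries of the disclosure; the step numbers from FIG. 42 are cited in the comments.

```python
def monitor_positions(arms, get_position, move_within, alert,
                      default_limit=(-1.0, 1.0)):
    """One position-monitoring pass per FIG. 42 (a sketch)."""
    for arm_id in list(arms):
        limit = arms[arm_id] or default_limit  # step 13533: default fallback
        arms[arm_id] = limit
        pos = get_position(arm_id)
        if pos is None:
            alert(arm_id, "position unreadable; default limit applied")
        elif not (limit[0] <= pos <= limit[1]):
            move_within(arm_id, limit)         # step 13532: move back inside
        # else: the arm may remain in place or move freely (step 13535)

# Minimal stand-ins to show the flow; values are illustrative.
positions = {"arm1": 0.5, "arm2": 2.0, "arm3": None}
monitor_positions(
    arms={"arm1": (-1.0, 1.0), "arm2": (-1.0, 1.0), "arm3": None},
    get_position=positions.get,
    move_within=lambda a, lim: print(f"{a}: moving back inside {lim}"),
    alert=lambda a, msg: print(f"{a}: {msg}"),
)
```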


Similar to the algorithm of FIG. 42, the flow chart of FIG. 43 depicts an algorithm 13600 that is initiated at step 13601 when a clinician and/or a robotic surgical system activates one or more of the robotic arms at step 13605. The algorithm 13600 can be employed by the robotic surgical system 13800 in FIG. 39, for example. Each robotic arm is in signal communication with the processor. Following activation, each robotic arm is configured to send information to the processor at step 13610. In various instances, the information may include, for example, identification of the tool attachment, exerted forces detected by one or more force sensors on the robotic arm, and/or the initial position of the activated robotic arm. In various instances, such information is communicated automatically upon attachment of the tool to the robotic arm, upon activation of the robotic arm by the robotic surgical system, and/or after interrogation of the robotic arm by the processor, although the information may be sent at any suitable time. Furthermore, the information may be sent automatically and/or in response to an interrogation signal.


Based on the information gathered from each of the activated robotic arms, the processor is configured to set a force limit for each specific robotic arm at step 13615. The force limit sets maximum and minimum force thresholds for forces exerted by each robotic arm. Additionally or alternatively, a force limit can be the maximum force differential between two or more arms. The setting of force limits allows for efficient and cooperative usage of all of the activated robotic arms while, for example, preventing trauma to surrounding tissue and/or damage to the robotic arms. In various instances, the processor includes a memory including a set of stored data to assist in defining each force limit. The stored data can be specific to the particular surgical procedure, the robotic tool attachment, and/or relevant patient demographics, for example. In various instances, the clinician can assist in the definition of the force limit for each activated robotic arm. In instances where information is unable to be gathered from the robotic arm and clinician input is absent, a default force limit is assigned. Such a default force limit assigns conservative maximum and minimum force thresholds to minimize, for example, tissue trauma and/or damage to the robotic arms.


The processor is configured to determine if the robotic arm is active at step 13620. If the processor determines that the robotic arm has been deactivated, the processor is configured to end force monitoring at step 13622. If it is determined that the robotic arm is still activated at step 13620, the processor is configured to continuously monitor the force exerted by each robotic arm at step 13625. In various instances, the processor is configured to repeatedly send interrogation signals in pre-determined time intervals. If the detected force exceeds the maximum force threshold set for the specific robotic arm, in certain instances, the processor is configured to automatically decrease the force exerted by the robotic arm and/or decrease an opposing force exerted by another robotic arm at step 13632. In certain instances, the processor is configured to re-adjust the force limits assigned to the other robotic arms in response to one robotic arm exceeding its original force limits. In certain instances, prior to adjusting the force exerted by the robotic arm, adjusting the opposing force exerted by another robotic arm, and/or adjusting the force limits of the other robotic arms, the processor is configured to alert the clinician. If the detected force is within the force limit set for the robotic arm, the robotic arm is permitted to maintain the exertion of the force and/or the clinician can increase or decrease the exerted force until the force is out of the set force limit at step 13635. If the processor is unable to detect the exerted force of the robotic arm, the processor is configured to alert the clinician and/or assign the robotic arm with a default force limit at step 13633. The processor is configured to monitor the exerted force of each robotic arm until the surgery is completed and/or the robotic arm is deactivated at step 13620.
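A matching sketch for one pass of the algorithm 13600 is shown below, with the same caveats: the callables are stand-ins, and scalar force magnitudes replace the directional forces of the disclosure.

```python
def monitor_forces(arms, get_force, reduce_force, alert,
                   default_limit=(0.0, 10.0)):
    """One force-monitoring pass per FIG. 43 (a sketch); limits are assumed
    (min, max) force magnitudes in newtons."""
    for arm_id in list(arms):
        limit = arms[arm_id] or default_limit  # step 13633: default fallback
        arms[arm_id] = limit
        force = get_force(arm_id)
        if force is None:
            alert(arm_id, "force unreadable; default limit applied")
        elif force > limit[1]:
            reduce_force(arm_id, limit[1])     # step 13632: bring the force down
        # forces within the band are maintained, or adjusted by the clinician,
        # until they leave the set limit (step 13635)

forces = {"arm1": 4.0, "arm2": 12.0}
monitor_forces(
    arms={"arm1": (0.0, 10.0), "arm2": (0.0, 10.0)},
    get_force=forces.get,
    reduce_force=lambda a, cap: print(f"{a}: reducing force to {cap} N"),
    alert=lambda a, msg: print(f"{a}: {msg}"),
)
```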


Similar to the algorithms of FIGS. 42 and 43, the flow chart of FIG. 44 depicts an algorithm 13700 that is initiated at step 13701 when a clinician and/or a robotic surgical system activates one or more of the robotic arms at step 13705. The algorithm 13700 can be employed by the robotic surgical system 13800 in FIG. 39, for example. Each robotic arm is in signal communication with the processor. Following activation, each robotic arm is configured to send information to the processor at step 13710. In various instances, the information may include, for example, identification of the tool attachment, forces detected by one or more force sensors on the robotic arm, and/or the initial position of the activated robotic arm. In various instances, such information is communicated automatically upon attachment of the tool to the robotic arm, upon activation of the robotic arm by the robotic surgical system, and/or after interrogation of the robotic arm by the processor, although the information may be sent at any suitable time. In various instances, the information is sent automatically and/or in response to an interrogation signal.


Based on the information gathered from all of the activated robotic arms, the processor is configured to set both a position limit within a work envelope of the robotic surgical system and a force limit for each specific robotic arm at step 13715. The position limit sets three-dimensional boundaries for where each robotic arm can travel. The setting of position limits allows for efficient and cooperative usage of all of the activated robotic arms while, for example, preventing trauma to surrounding tissue and/or collisions between activated robotic arms. The force limit sets maximum and/or minimum force thresholds for forces exerted by each robotic arm. Additionally or alternatively, a force limit can be the maximum force differential between two or more arms. The setting of force limits allows for efficient and cooperative usage of the activated robotic arms while, for example, preventing trauma to surrounding tissue and/or damage to the robotic arms.


In various instances, the processor includes a memory including a set of stored data to assist in defining each position limit and force limit. The stored data can be specific to the particular surgical procedure, the robotic tool attachment, and/or relevant patient demographics, for example. In various instances, the clinician can assist in the definition of the position limit and force limit for each activated robotic arm. In instances where information is unable to be gathered from the robotic arm and clinician input is absent, a default position limit and/or default force limit is assigned to the robotic arm. Such a default position limit assigns a conservative three-dimensional boundary to minimize, for example, tissue trauma and/or collisions between robotic arms, while the default force limit assigns conservative maximum and/or minimum force thresholds to minimize, for example, tissue trauma and/or damage to the robotic arms. In various instances, the processor is configured to adjust the position limit of one robotic arm based on the force limit of another robotic arm, and to adjust the force limit of one robotic arm based on the position limit of another robotic arm.


The processor is configured to determine whether the robotic arm is active at step 13720. Once the processor has determined that the robotic arm is activated at step 13720, the processor is configured to continuously monitor the position of each robotic arm at step 13737 and the force exerted by each robotic arm at step 13725. If the robotic arm is no longer activated, the processor is configured to end position monitoring at step 13727 and end force monitoring at step 13722. In various instances, the processor is configured to repeatedly send interrogation signals in pre-determined time intervals. If the detected position exceeds the position limit set for the specific robotic arm, in certain instances, the processor is configured to automatically move the robotic arm back within the three-dimensional boundary at step 13742. In certain instances, prior to moving the robotic arm back within its position limit, the processor is configured to alert the clinician. If the detected position is within the position limit set for the robotic arm, the robotic arm is permitted to remain in the same position and/or freely travel until the detected position exceeds the position limit at step 13745. If the processor is unable to detect the position of the robotic arm, the processor is configured to alert the clinician and/or rewrite the original position limit of the robotic arm with the default position limit at step 13743. The processor is configured to monitor the position of each robotic arm until the surgery is completed and/or the robotic arm is deactivated.


In certain instances, the robotic surgical system includes a manual override configured to control the position of each robotic arm. If the detected force exceeds the maximum force threshold set for the specific robotic arm, in certain instances, the processor is configured to automatically decrease the force exerted by the robotic arm and/or decrease an opposing force exerted by another robotic arm at step 13732. In certain instances, prior to decreasing the force exerted by the robotic arm and/or decreasing the opposing force exerted by another robotic arm, the processor is configured to alert the clinician. If the detected force is within the force limit set for the robotic arm, the robotic arm is permitted to maintain the exertion of the force and/or increase or decrease the exerted force until the force is out of the set force limit at step 13735. If the processor is unable to detect the exerted force of the robotic arm, the processor is configured to alert the clinician and/or rewrite the original force limit of the robotic arm with the default force limit at step 13733. The processor is configured to monitor the exerted force of each robotic arm until the surgery is completed and/or the robotic arm is deactivated.
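Combining the two monitors, as the algorithm 13700 does, raises the question of which correction wins when both limits are violated at once. The sketch below makes the force subsystem dominant by default, which is only one of the arrangements the disclosure contemplates (either subsystem may override the other, as discussed next); the state layout is an assumption.

```python
def monitor_combined(arm_states, force_overrides_position=True):
    """One combined position/force pass per FIG. 44 (a sketch). Each state is
    a dict with 'pos', 'pos_limit', 'force', and 'force_limit' entries."""
    actions = {}
    for arm_id, s in arm_states.items():
        pos_ok = s["pos_limit"][0] <= s["pos"] <= s["pos_limit"][1]
        force_ok = s["force"] <= s["force_limit"]
        if pos_ok and force_ok:
            actions[arm_id] = "free"                  # steps 13745 and 13735
        elif not force_ok and (force_overrides_position or pos_ok):
            actions[arm_id] = "reduce force"          # step 13732
        else:
            actions[arm_id] = "move within boundary"  # step 13742
    return actions

print(monitor_combined({
    "arm1": {"pos": 0.2, "pos_limit": (-1, 1), "force": 12.0, "force_limit": 10.0},
    "arm2": {"pos": 1.5, "pos_limit": (-1, 1), "force": 3.0, "force_limit": 10.0},
}))  # {'arm1': 'reduce force', 'arm2': 'move within boundary'}
```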


In various instances, the position monitoring system and the force monitoring system are interconnected. In certain instances, the force monitoring system can override the resultant decision 13742, 13743, 13745 of the position detection step 13740. In certain instances, the position monitoring system can override the resultant decision 13732, 13733, 13735 of the force detection step 13730. In other instances, the position monitoring system and the force monitoring system are independent of one another.


A clinician can manually override the automatic adjustments implemented in the automatic load and/or position control mode(s) described herein. The manual override can be a one-time adjustment to the surgical robot. In other instances, the manual override can be a setting that turns off the automatic load and/or position mode for a specific surgical action, a specific duration, and/or a global override for the entire procedure.


In one aspect, the robotic surgical system includes a processor and a memory communicatively coupled to the processor, as described herein. The processor is communicatively coupled to a first force sensor and a second force sensor, and the memory stores instructions executable by the processor to effect cooperative movement of a first robotic arm and a second robotic arm based on a first input from the first force sensor and a second input from the second force sensor in a load control mode, as described herein.


In various aspects, the present disclosure provides a control circuit to effect cooperative movement of a first robotic arm and a second robotic arm, as described herein. In various aspects, the present disclosure provides a non-transitory computer readable medium storing computer readable instructions which, when executed, cause a machine to effect cooperative movement of a first robotic arm and a second robotic arm, as described herein.


During a particular surgical procedure, clinicians may rely on one or more powered handheld surgical instruments in addition to a robotic surgical system. In various instances, the instruments are controlled and monitored through different platforms, which may inhibit communication between the instruments and the robotic surgical system. For example, the instruments can be produced by different manufacturers and even by competitors. Such instruments may have different communication packages and/or communication and/or linking protocols. The lack of communication between a powered instrument and the robotic surgical system may hinder cooperative and/or coordinated usage and may complicate the surgical procedure for the clinician. For example, each surgical instrument may include an individual display to communicate various information and operating parameters. In such a scenario, a clinician may have to look at numerous instrument-specific displays to monitor the operating status of and analyze data gathered by each device.


In various instances, a robotic surgical system is configured to detect the presence of other powered surgical instruments that are controlled by platforms other than the robotic surgical system. The robotic surgical system can incorporate a hub, i.e., a robotic hub like the robotic hubs 122 (FIG. 2) and 222 (FIG. 9), which can detect other powered surgical instruments, for example. In other instances, a stand-alone surgical hub like the hub 106 (FIGS. 1-3) or the hub 206 (FIG. 9) in communication with the robotic surgical system can facilitate detection of the non-robotic surgical instruments and cooperative and/or coordinated usage of the detected surgical instruments with the robotic surgical system. The hub, which can be a robotic hub or a surgical hub, is configured to display the position and orientation of the powered surgical instruments with respect to the work envelope of the robotic surgical system. In certain instances, the work envelope can be an operating room, for example. A surgical hub having spatial awareness capabilities is further described herein and in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety. In one aspect, the hub can first ascertain the boundaries of the work envelope and then detect the presence of other powered surgical instruments within the work envelope.



FIG. 45 depicts a surgical system 13860 including a robotic surgical system 13865, a surgical instrument 13890, and a surgical hub 13870. The surgical instrument 13890 is a powered handheld instrument, and can be a motorized surgical stapler, such as the motorized linear stapler depicted in FIG. 46, for example. The robotic surgical system 13865 can be similar in many respects to the robotic surgical system 13000 (FIG. 23), for example. As described herein, the surgical hub 13870 can be incorporated into the robotic surgical system 13865, for example. The surgical hub 13870 is configured to be in signal communication with the robotic surgical system 13865 and the surgical instrument 13890. In other instances, the surgical system 13860 can include additional handheld surgical instruments. The robotic surgical system 13865 includes a robot 13861, which can be similar to the robot 13002, for example. The robotic surgical system 13865 also includes a control unit 13862 and a surgeon's command console, or remote control module, 13864. The surgeon's command console 13864 is configured to receive a clinician input. The control unit 13862 includes a robot display 13868 and a processor 13866. The surgical instrument 13890 includes a display 13894 and a processor 13892.


In various instances, the surgical hub 13870 includes a surgical hub display 13880, which can be similar to the displays of the visualization system 108 (FIG. 1). The surgical hub display 13880 can include, for example, a heads-up display. The surgical hub 13870 is configured to detect the presence of the surgical instrument 13890 within a certain distance of the surgical hub 13870. For example, the surgical hub 13870 is configured to detect the presence of all activated surgical instruments 13890 within one operating room, although any suitable distance can be monitored. In various instances, the surgical hub 13870 is configured to display the presence of all activated surgical instruments 13890 on the surgical hub display 13880.


A particular handheld surgical instrument communicates via a first communication process through a first language. A particular robotic surgical system communicates via a second communication process through a second language. In various instances, the first communication process is the same as the second communication process. When the first communication process is the same as the second communication process, the surgical instrument 13890 is configured to directly communicate information to the surgical hub 13870 and/or to the robotic surgical system 13865. Such information includes, for example, a model number and/or type of the surgical instrument, a position of the surgical instrument, an operating status of the surgical instrument, and/or any other relevant parameter of the surgical instrument.


In various instances, the first communication process is different from the second communication process. For example, a surgical system (e.g. a robot) developed by a first manufacturer may utilize a first proprietary language or communication scheme, and a surgical system (e.g. a handheld surgical tool) developed by a second manufacturer may utilize a second, different proprietary language or communication scheme. Despite the language barrier, the surgical hub 13870 and/or the robotic surgical system 13865 is configured to sense surgical instruments 13890 that operate on different communication processes. When the surgical hub 13870 does not recognize the communication process utilized by a particular powered handheld surgical instrument, the surgical hub 13870 is configured to detect various signals, such as Wi-Fi and Bluetooth transmissions emitted by activated powered handheld surgical instruments. Based on the detected signal transmissions, the surgical hub 13870 is configured to alert the clinician to all powered handheld surgical instruments that do not use the same communication process as the robotic surgical system 13865. All data received from newly-detected powered handheld surgical instruments can be stored within the surgical hub 13870 so that the newly-detected powered handheld surgical instruments are recognized by the surgical hub 13870 in the future.
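The detect-alert-remember behavior could be organized as a small registry, sketched below. The protocol and signature strings are purely illustrative stand-ins; a real hub would key on transport-specific identifiers.

```python
class InstrumentRegistry:
    """Sketch of the hub behavior above: transmissions on a foreign protocol
    are flagged to the clinician, and their signatures are stored so the
    instrument is recognized on later detections."""

    def __init__(self, known_protocols):
        self.known_protocols = set(known_protocols)
        self.seen_signatures = set()

    def on_detection(self, protocol: str, signature: str) -> str:
        if protocol in self.known_protocols:
            return "communicate directly"
        if signature in self.seen_signatures:
            return "recognized from a previous detection"
        self.seen_signatures.add(signature)  # remember for next time
        return "alert clinician: instrument on a foreign protocol"

hub = InstrumentRegistry(known_protocols={"robot-native"})
print(hub.on_detection("bluetooth", "stapler-01"))  # alerts on first sighting
print(hub.on_detection("bluetooth", "stapler-01"))  # recognized afterwards
```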


In various instances, the surgical hub 13870 is configured to detect the presence of powered handheld surgical instruments by sensing a magnetic presence of a battery, power usage, and/or electro-magnetic field emitted from activated powered handheld surgical instruments, regardless of whether the activated powered handheld surgical instruments made any attempt to communicate with another surgical instrument, such as the robotic surgical system.


The robot 13861 and the surgical instrument 13890 are depicted in an example surgical procedure in FIG. 46. In this exemplification, the surgical instrument 13890 is an articulating linear stapler. As depicted in FIG. 46, the surgical instrument 13890 includes a motor 13895 in the handle 13892 thereof. In other instances, the surgical instrument 13890 can include a plurality of motors positioned throughout the surgical instrument. The motor 13895 is configured to emit an electromagnetic field 13896, which can be detected by the robotic surgical system 13865 or the surgical hub 13870. For example, the main robot tower or the modular control tower of the surgical hub 13870 can include a receiver for detecting the electromagnetic fields within the operating room.


In one aspect, a processor of the robotic surgical system (e.g. a processor of the control unit 13862) is configured to calculate a boundary around the surgical instrument 13890. For example, based on the electromagnetic field 13896 and corresponding type of surgical instrument, the processor can determine the dimensions of the surgical instrument 13890 and possible range of positions thereof. For example, when the surgical instrument 13890 includes one or more articulation joints 13891, the range of positions can encompass the articulated positions of the surgical instrument 13890.


In one instance, the robotic surgical system can calculate a first, wider boundary B2 around the surgical instrument. When a robotic surgical tool approaches the wider boundary B2, the robotic surgical system 13865 can issue a notification or warning to the surgeon that the robotic surgical tool attached to the robot 13861 is approaching another surgical instrument 13890. In certain instances, if the surgeon continues to advance the robotic surgical tool toward the surgical instrument 13890 and to a second, narrower boundary B1, the robotic surgical system 13865 can stop advancing the robotic surgical tool. For example, if the robotic surgical tool crosses the narrower boundary B1, advancement of the robotic surgical tool can be stopped. In such instances, if the surgeon still desires to continue advancing the robotic surgical tool within the narrower boundary B1, the surgeon can override the hard stop feature of the robotic surgical system 13865.
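The nested-boundary response reduces to a simple advisory function, sketched below. The distances and boundary radii are hypothetical scalars; the boundaries of the disclosure are three-dimensional regions around the instrument.

```python
def proximity_action(distance: float, b1: float, b2: float,
                     override: bool = False) -> str:
    """Advisory/stop logic for the nested boundaries of FIG. 46 (a sketch).

    `distance` is the tool-to-instrument separation; `b2` is the wider
    warning boundary and `b1` the narrower hard-stop boundary (b1 < b2).
    The surgeon's override disables the hard stop inside B1.
    """
    if distance <= b1:
        return "advance" if override else "stop advancement"
    if distance <= b2:
        return "warn: approaching another instrument"
    return "advance"

print(proximity_action(25.0, b1=10.0, b2=30.0))                # warn
print(proximity_action(8.0, b1=10.0, b2=30.0))                 # stop
print(proximity_action(8.0, b1=10.0, b2=30.0, override=True))  # advance
```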


Referring again to FIG. 45, the surgical system 13860 includes multiple display monitors. Each handheld surgical instrument 13890 and the robotic surgical system 13865 is configured to communicate a video and/or image feed representative of the display on each device to the surgical hub 13870 and/or the hub display 13880. Such video and/or image feeds can include operating parameters of, and/or conditions detected by, each handheld surgical instrument 13890 and/or the robotic surgical system 13865. The hub 13870 is configured to control the displayed video and/or image feeds on each of the one or more display monitors throughout the system 13860. In various instances, each of the display monitors displays an individual video and/or image feed from a particular surgical device or system. In various instances, the individual video and/or image feed can be overlaid with additional information and/or video and/or image feeds from other devices or systems. Such information can include operating parameters and/or detected conditions. The surgical hub 13870 is configured to request which display monitor displays which video and/or image feed. In other words, the communication link between the surgical hub 13870 and the hub display 13880 allows the surgical hub 13870 to dictate which video and/or image feed is assigned to which display monitor, while direct control of the one or more display monitors remains with the video hub. In various instances, the hub display 13880 is configured to separate one or more of the display monitors from the surgical hub 13870 and allow a different surgical hub or surgical device to display relevant information on the separated display monitors.


In various instances, the surgical hub is configured to share stored data with other data systems within an institution's data barrier, allowing for cooperative utilization of data. Such established data systems may include, for example, an electronic medical records (EMR) database. The surgical hub is configured to utilize the communication between the surgical hub and the EMR database to link overall surgical trends for the hospital with local data sets recorded during use of the surgical hub.


In various instances, the surgical hub is located in a particular operating room at a hospital and/or surgery center. As shown in FIG. 47, the hospital and/or surgery center includes operating rooms OR1, OR2, OR3, and OR4. Three of the operating rooms, OR2, OR3, and OR4, shown in FIG. 47 include a surgical hub 13910, 13920, 13930, respectively; however, any suitable number of surgical hubs can be used. The surgical hubs 13910, 13920, 13930 are configured to be in signal communication with one another, represented by signal arrows A. Each surgical hub 13910, 13920, 13930 is also configured to be in signal communication with a primary server 13940, represented by signal arrows B in FIG. 47.


In various exemplifications, as data is communicated between the surgical hub(s) 13910, 13920, 13930 and the various surgical instruments during a surgical procedure, the surgical hub(s) 13910, 13920, 13930 are configured to temporarily store the communicated data. At the end of the surgical procedure and/or at the end of a pre-determined time period, each surgical hub 13910, 13920, 13930 is configured to communicate the stored information to the primary server 13940. Once the stored information is communicated to the primary server 13940, the information can be deleted from the memory of the individual surgical hub 13910, 13920, 13930. The stored information is communicated to the primary server 13940 to alleviate the competition amongst the surgical hubs 13910, 13920, 13930 for bandwidth to transmit the stored data to cloud analytics "C", for example. Instead, the primary server 13940 is configured to compile and store the communicated data. The primary server 13940 is configured to be the single clearinghouse for communication of information back to the individual surgical hubs 13910, 13920, 13930 and/or for external downloading. In addition, as all of the data is stored in one location in the primary server 13940, the data is better protected from data destructive events, such as power surges and/or data intrusion, for example. In various instances, the primary server 13940 includes additional server-level equipment that allows for better data integrity. Examples of cloud systems are further described herein and in U.S. Provisional Patent Application Ser. No. 62/611,340, titled CLOUD-BASED MEDICAL ANALYTICS, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety.


Referring to FIGS. 47 and 48, as data begins to be communicated from each surgical hub 13910, 13920, 13930 to the primary server 13940, a queue 13990 is created to prioritize the order in which data is communicated. In various instances, the queue 13990 prioritizes data as first in, first out, although any suitable prioritization protocol can be used. In various instances, the queue 13990 is configured to re-prioritize the order in which received data is communicated when priority events and/or abnormal data are detected. As illustrated in FIG. 48, a first surgical hub communicates a first set of data at a time t=1 at block 13960. As the first set of data is the only data in the queue for external output at block 13992, the first set of data is the first to be communicated. Thus, the queue 13990 prioritizes the first set of data for external output at block 13965. A second surgical hub communicates a second set of data at a time t=2 at block 13970. At the time t=2, the first set of data has not been externally communicated at block 13994. However, because no priority events and/or abnormal data are present in the second set of data, the second set of data is the second in line to be externally communicated at block 13975. A third surgical hub communicates a third set of data, flagged as urgent, at a time t=3 at block 13980. At the time t=3, the first set of data and the second set of data have not been externally communicated; however, a priority event has been detected in the third set of data at block 13985. The queue is configured to re-prioritize the sets of data to allow the prioritized third set of data to be placed in the first position for external output at block 13996, ahead of the first set of data and the second set of data collected at time t=1 and t=2, respectively.
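A minimal sketch of such a re-prioritizing queue follows, using Python's heapq with a sequence counter so that ordering stays first in, first out within each priority class. The two-level priority scheme is an assumption; the disclosure describes flagging priority events without fixing the number of levels.

```python
import heapq
from itertools import count

class HubUploadQueue:
    """Sketch of the queue 13990: first in, first out, except that data sets
    flagged as priority events jump ahead of normal traffic."""

    NORMAL, URGENT = 1, 0  # lower number is served first

    def __init__(self):
        self._heap = []
        self._seq = count()  # preserves FIFO order within a priority class

    def push(self, data_set: str, urgent: bool = False):
        priority = self.URGENT if urgent else self.NORMAL
        heapq.heappush(self._heap, (priority, next(self._seq), data_set))

    def pop(self) -> str:
        return heapq.heappop(self._heap)[2]

# Re-creating the FIG. 48 scenario: two normal sets, then an urgent one.
q = HubUploadQueue()
q.push("hub1 data (t=1)")
q.push("hub2 data (t=2)")
q.push("hub3 data (t=3)", urgent=True)
print([q.pop() for _ in range(3)])
# ['hub3 data (t=3)', 'hub1 data (t=1)', 'hub2 data (t=2)']
```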


In one aspect, the surgical hub includes a processor and a memory communicatively coupled to the processor, as described herein. The memory stores instructions executable by the processor to detect the presence of a powered surgical instrument and represent the powered surgical instrument on a hub display, as described herein.


In various aspects, the present disclosure provides a control circuit to detect the presence of a powered surgical instrument and represent the powered surgical instrument on a hub display, as described herein. In various aspects, the present disclosure provides a non-transitory computer readable medium storing computer readable instructions which, when executed, cause a machine to detect the presence of a powered surgical instrument and represent the powered surgical instrument on a hub display, as described herein.


The entire disclosures of:

    • U.S. Pat. No. 9,072,535, filed May 27, 2011, titled SURGICAL STAPLING INSTRUMENTS WITH ROTATABLE STAPLE DEPLOYMENT ARRANGEMENTS, which issued Jul. 7, 2015;
    • U.S. Pat. No. 9,072,536, filed Jun. 28, 2012, titled DIFFERENTIAL LOCKING ARRANGEMENTS FOR ROTARY POWERED SURGICAL INSTRUMENTS, which issued Jul. 7, 2015;
    • U.S. Pat. No. 9,204,879, filed Jun. 28, 2012, titled FLEXIBLE DRIVE MEMBER, which issued on Dec. 8, 2015;
    • U.S. Pat. No. 9,561,038, filed Jun. 28, 2012, titled INTERCHANGEABLE CLIP APPLIER, which issued on Feb. 7, 2017;
    • U.S. Pat. No. 9,757,128, filed Sep. 5, 2014, titled MULTIPLE SENSORS WITH ONE SENSOR AFFECTING A SECOND SENSOR'S OUTPUT OR INTERPRETATION, which issued on Sep. 12, 2017;
    • U.S. patent application Ser. No. 14/640,935, titled OVERLAID MULTI SENSOR RADIO FREQUENCY (RF) ELECTRODE SYSTEM TO MEASURE TISSUE COMPRESSION, filed Mar. 6, 2015, now U.S. Patent Application Publication No. 2016/0256071;
    • U.S. patent application Ser. No. 15/382,238, titled MODULAR BATTERY POWERED HANDHELD SURGICAL INSTRUMENT WITH SELECTIVE APPLICATION OF ENERGY BASED ON TISSUE CHARACTERIZATION, filed Dec. 16, 2016, now U.S. Patent Application Publication No. 2017/0202591; and
    • U.S. patent application Ser. No. 15/237,753, titled CONTROL OF ADVANCEMENT RATE AND APPLICATION FORCE BASED ON MEASURED FORCES, filed Aug. 16, 2016, now U.S. Patent Application Publication No. 2018/0049822
are herein incorporated by reference in their respective entireties.


Various aspects of the subject matter described herein are set out in the following numbered examples.


Example 1

A robotic surgical system comprises a first robotic arm comprising a first force sensor, a second robotic arm comprising a second force sensor, and a control unit comprising a processor and a memory communicatively coupled to the processor. The memory stores instructions executable by the processor to receive a first input from the first force sensor, receive a second input from the second force sensor, and effect cooperative movement of the first robotic arm and the second robotic arm based on the first input from the first force sensor and the second input from the second force sensor in a load control mode.


Example 2

The robotic surgical system of Example 1, wherein the first robotic arm comprises a first position sensor, wherein the second robotic arm comprises a second position sensor, and wherein the processor is configured to be in signal communication with the first position sensor and the second position sensor.


Example 3

The robotic surgical system of Example 2, wherein the memory stores instructions executable by the processor to receive a first position input from the first position sensor, and receive a second position input from the second position sensor.


Example 4

The robotic surgical system of Example 3, wherein the memory stores instructions executable by the processor to effect cooperative movement of the first robotic arm and the second robotic arm based on the first position input from the first position sensor and the second position input from the second position sensor in a position control mode.


Example 5

The robotic surgical system of any one of Examples 1-4, wherein the memory stores instructions executable by the processor to switch from the load control mode to a position control mode upon movement of a surgical tool mounted to one of the robotic arms outside a defined boundary.


Example 6

The robotic surgical system of any one of Examples 1-5, wherein the processor is communicatively coupled to a situational awareness module configured to recommend a surgical function based on the first input received from the first force sensor and the second input received from the second force sensor.


Example 7

The robotic surgical system of any one of Examples 1-6, wherein the memory stores instructions executable by the processor to determine if the first robotic arm and the second robotic arm are inactive and stop communicating with the first force sensor and the second force sensor when the first robotic arm and the second robotic arm are inactive.


Example 8

A robotic surgical system comprises a first robotic arm comprising a first sensor, a second robotic arm comprising a second sensor, and a control unit comprising a processor and a memory communicatively coupled to the processor. The memory stores instructions executable by the processor to receive a first input from the first sensor, receive a second input from the second sensor, and effect cooperative movement of the first robotic arm and the second robotic arm based on the first input from the first sensor and the second input from the second sensor.


Example 9

The robotic surgical system of Example 8, wherein the first sensor and the second sensor are force sensors.


Example 10

The robotic surgical system of any one of Examples 8 and 9, wherein the memory stores instructions executable by the processor to enter into a load control mode upon receiving the first input from the first sensor and the second input from the second sensor.


Example 11

The robotic surgical system of Example 8, wherein the first sensor and the second sensor are position sensors.


Example 12

The robotic surgical system of Example 11, wherein the memory is configured to store instructions executable by the processor to enter into a position control mode upon receiving the first input from the first sensor and the second input from the second sensor.


Example 13

The robotic surgical system of any one of Examples 8-12, wherein the processor is communicatively coupled to a situational awareness module configured to recommend a surgical function based on the first input received from the first sensor and the second input received from the second sensor.


Example 14

A non-transitory computer readable medium storing computer readable instructions which, when executed, cause a machine to receive a first input from a first force sensor, receive a second input from a second force sensor, and effect cooperative movement of a first robotic arm and a second robotic arm based on the first input from the first force sensor and the second input from the second force sensor in a load control mode.


Example 15

The non-transitory computer readable medium of Example 14, wherein the first robotic arm comprises a first position sensor, and wherein the second robotic arm comprises a second position sensor.


Example 16

The non-transitory computer readable medium of Example 15, wherein the first position sensor is configured to communicate a first position input to the machine, and wherein the second position sensor is configured to communicate a second position input to the machine.


Example 17

The non-transitory computer readable medium of Example 16, wherein the computer readable instructions, when executed, cause a machine to effect cooperative movement of the first robotic arm and the second robotic arm based on the first position input from the first position sensor and the second position input from the second position sensor in a position control mode.


Example 18

The non-transitory computer readable medium of any one of Examples 14-17, wherein the machine is operably configured to switch from the load control mode to a position control mode upon movement of a surgical tool mounted to one of the robotic arms outside a defined boundary.


Example 19

The non-transitory computer readable medium of any one of Examples 14-18, further comprising a situational awareness module configured to recommend a surgical function based on the first input received from the first force sensor and the second input received from the second force sensor.


Example 20

The non-transitory computer readable medium of any one of Examples 14-19, wherein the computer readable instructions, when executed, cause a machine to: determine if the first robotic arm and the second robotic arm are activated; and stop communicating with the first force sensor and the second force sensor when the first robotic arm and the second robotic arm are inactive.


Example 21

A robotic surgical system comprises a first robotic arm comprising a first sensor; a second robotic arm comprising a second sensor; and a control circuit. The control circuit is configured to receive a first input from the first sensor, receive a second input from the second sensor, and effect cooperative movement of the first robotic arm and the second robotic arm based on the first input from the first sensor and the second input from the second sensor.


Example 22

The robotic surgical system of Example 21, wherein the first sensor and the second sensor are force sensors.


Example 23

The robotic surgical system of any one of Examples 21 and 22, wherein the control circuit is configured to enter into a load control mode upon receiving the first input from the first sensor and the second input from the second sensor.


Example 24

The robotic surgical system of Example 21, wherein the first sensor and the second sensor are position sensors.


Example 25

The robotic surgical system of Example 24, wherein the control circuit is configured to enter into a position control mode upon receiving the first input from the first sensor and the second input from the second sensor.


Example 26

The robotic surgical system of any one of Examples 21-25, wherein the control circuit is communicatively coupled to a situational awareness module configured to recommend a surgical function based on the first input received from the first sensor and the second input received from the second sensor.


While several forms have been illustrated and described, it is not the intention of the applicant to restrict or limit the scope of the appended claims to such detail. Numerous modifications, variations, changes, substitutions, combinations, and equivalents to those forms may be implemented and will occur to those skilled in the art without departing from the scope of the present disclosure. Moreover, the structure of each element associated with the described forms can be alternatively described as a means for providing the function performed by the element. Also, where materials are disclosed for certain components, other materials may be used. It is therefore to be understood that the foregoing description and the appended claims are intended to cover all such modifications, combinations, and variations as falling within the scope of the disclosed forms. The appended claims are intended to cover all such modifications, variations, changes, substitutions, and equivalents.


The foregoing detailed description has set forth various forms of the devices and/or processes via the use of block diagrams, flowcharts, and/or examples. Insofar as such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, and/or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. Those skilled in the art will recognize that some aspects of the forms disclosed herein, in whole or in part, can be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one skilled in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein are capable of being distributed as one or more program products in a variety of forms, and that an illustrative form of the subject matter described herein applies regardless of the particular type of signal bearing medium used to actually carry out the distribution.


Instructions used to program logic to perform various disclosed aspects can be stored within a memory in the system, such as dynamic random access memory (DRAM), cache, flash memory, or other storage. Furthermore, the instructions can be distributed via a network or by way of other computer readable media. Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memories (ROMs), random access memory (RAM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic or optical cards, flash memory, or tangible, machine-readable storage used in the transmission of information over the Internet via electrical, optical, acoustical, or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Accordingly, the non-transitory computer-readable medium includes any type of tangible machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).


As used in any aspect herein, the term “control circuit” may refer to, for example, hardwired circuitry, programmable circuitry (e.g., a computer processor comprising one or more individual instruction processing cores, processing unit, processor, microcontroller, microcontroller unit, controller, digital signal processor (DSP), programmable logic device (PLD), programmable logic array (PLA), or field programmable gate array (FPGA)), state machine circuitry, firmware that stores instructions executed by programmable circuitry, and any combination thereof. The control circuit may, collectively or individually, be embodied as circuitry that forms part of a larger system, for example, an integrated circuit (IC), an application-specific integrated circuit (ASIC), a system on-chip (SoC), desktop computers, laptop computers, tablet computers, servers, smart phones, etc. Accordingly, as used herein “control circuit” includes, but is not limited to, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program which at least partially carries out processes and/or devices described herein), electrical circuitry forming a memory device (e.g., forms of random access memory), and/or electrical circuitry forming a communications device (e.g., a modem, communications switch, or optical-electrical equipment). Those having skill in the art will recognize that the subject matter described herein may be implemented in an analog or digital fashion or some combination thereof.


As used in any aspect herein, the term “logic” may refer to an app, software, firmware and/or circuitry configured to perform any of the aforementioned operations. Software may be embodied as a software package, code, instructions, instruction sets and/or data recorded on non-transitory computer readable storage medium. Firmware may be embodied as code, instructions or instruction sets and/or data that are hard-coded (e.g., nonvolatile) in memory devices.


As used in any aspect herein, the terms “component,” “system,” “module” and the like can refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution.


As used in any aspect herein, an “algorithm” refers to a self-consistent sequence of steps leading to a desired result, where a “step” refers to a manipulation of physical quantities and/or logic states which may, though need not necessarily, take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It is common usage to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. These and similar terms may be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities and/or states.


A network may include a packet switched network. The communication devices may be capable of communicating with each other using a selected packet switched network communications protocol. One example communications protocol may include an Ethernet communications protocol which may be capable of permitting communication using a Transmission Control Protocol/Internet Protocol (TCP/IP). The Ethernet protocol may comply or be compatible with the Ethernet standard published by the Institute of Electrical and Electronics Engineers (IEEE) titled “IEEE 802.3 Standard”, published in December 2008, and/or later versions of this standard. Alternatively or additionally, the communication devices may be capable of communicating with each other using an X.25 communications protocol. The X.25 communications protocol may comply or be compatible with a standard promulgated by the International Telecommunication Union-Telecommunication Standardization Sector (ITU-T). Alternatively or additionally, the communication devices may be capable of communicating with each other using a frame relay communications protocol. The frame relay communications protocol may comply or be compatible with a standard promulgated by the Consultative Committee for International Telegraph and Telephone (CCITT) and/or the American National Standards Institute (ANSI). Alternatively or additionally, the transceivers may be capable of communicating with each other using an Asynchronous Transfer Mode (ATM) communications protocol. The ATM communications protocol may comply or be compatible with an ATM standard published by the ATM Forum titled “ATM-MPLS Network Interworking 2.0” published August 2001, and/or later versions of this standard. Of course, different and/or after-developed connection-oriented network communication protocols are equally contemplated herein.
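As a toy illustration of the TCP/IP communication mentioned above, the following self-contained sketch performs a single round trip over the loopback interface using Python's standard socket library; the port selection and message are arbitrary assumptions:

    # Minimal TCP/IP round trip on the loopback interface; illustrative only.
    import socket

    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))   # let the operating system pick a free port
    server.listen(1)
    port = server.getsockname()[1]

    client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    client.connect(("127.0.0.1", port))
    conn, _ = server.accept()
    client.sendall(b"sensor reading")
    print(conn.recv(64))            # b'sensor reading'
    for s in (client, conn, server):
        s.close()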


Unless specifically stated otherwise as apparent from the foregoing disclosure, it is appreciated that, throughout the foregoing disclosure, discussions using terms such as “processing,” “computing,” “calculating,” “determining,” “displaying,” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.


One or more components may be referred to herein as “configured to,” “configurable to,” “operable/operative to,” “adapted/adaptable,” “able to,” “conformable/conformed to,” etc. Those skilled in the art will recognize that “configured to” can generally encompass active-state components and/or inactive-state components and/or standby-state components, unless context requires otherwise.


The terms “proximal” and “distal” are used herein with reference to a clinician manipulating the handle portion of the surgical instrument. The term “proximal” refers to the portion closest to the clinician and the term “distal” refers to the portion located away from the clinician. It will be further appreciated that, for convenience and clarity, spatial terms such as “vertical”, “horizontal”, “up”, and “down” may be used herein with respect to the drawings. However, surgical instruments are used in many orientations and positions, and these terms are not intended to be limiting and/or absolute.


Those skilled in the art will recognize that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as “open” terms (e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to claims containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an” (e.g., “a” and/or “an” should typically be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations.


In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of “two recitations,” without other modifiers, typically means at least two recitations, or two or more recitations). Furthermore, in those instances where a convention analogous to “at least one of A, B, and C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, and C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). In those instances where a convention analogous to “at least one of A, B, or C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, or C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that typically a disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms unless context dictates otherwise. For example, the phrase “A or B” will be typically understood to include the possibilities of “A” or “B” or “A and B.”


With respect to the appended claims, those skilled in the art will appreciate that recited operations therein may generally be performed in any order. Also, although various operational flow diagrams are presented in a sequence(s), it should be understood that the various operations may be performed in other orders than those which are illustrated, or may be performed concurrently. Examples of such alternate orderings may include overlapping, interleaved, interrupted, reordered, incremental, preparatory, supplemental, simultaneous, reverse, or other variant orderings, unless context dictates otherwise. Furthermore, terms like “responsive to,” “related to,” or other past-tense adjectives are generally not intended to exclude such variants, unless context dictates otherwise.


It is worthy to note that any reference to “one aspect,” “an aspect,” “an exemplification,” “one exemplification,” and the like means that a particular feature, structure, or characteristic described in connection with the aspect is included in at least one aspect. Thus, appearances of the phrases “in one aspect,” “in an aspect,” “in an exemplification,” and “in one exemplification” in various places throughout the specification are not necessarily all referring to the same aspect. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner in one or more aspects.


Any patent application, patent, non-patent publication, or other disclosure material referred to in this specification and/or listed in any Application Data Sheet is incorporated by reference herein, to the extent that the incorporated material is not inconsistent herewith. As such, and to the extent necessary, the disclosure as explicitly set forth herein supersedes any conflicting material incorporated herein by reference. Any material, or portion thereof, that is said to be incorporated by reference herein, but which conflicts with existing definitions, statements, or other disclosure material set forth herein, will only be incorporated to the extent that no conflict arises between that incorporated material and the existing disclosure material.


In summary, numerous benefits have been described which result from employing the concepts described herein. The foregoing description of the one or more forms has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise form disclosed. Modifications or variations are possible in light of the above teachings. The one or more forms were chosen and described in order to illustrate principles and practical application, to thereby enable one of ordinary skill in the art to utilize the various forms with various modifications as are suited to the particular use contemplated. It is intended that the claims submitted herewith define the overall scope.

Claims
  • 1. A robotic surgical system, comprising: a first robotic arm comprising a first force sensor; a second robotic arm comprising a second force sensor; and a control unit comprising a processor and a memory communicatively coupled to said processor, wherein said memory stores instructions executable by said processor to: receive a first input from said first force sensor; receive a second input from said second force sensor; effect cooperative movement of said first robotic arm and said second robotic arm based on the first input from said first force sensor and the second input from said second force sensor in a load control mode; and switch from the load control mode to a position control mode upon movement of a surgical tool mounted to one of said robotic arms outside a defined boundary.
  • 2. A robotic surgical system, comprising: a first robotic arm comprising a first force sensor; a second robotic arm comprising a second force sensor; and a control unit comprising a processor and a memory communicatively coupled to said processor, wherein said memory stores instructions executable by said processor to: receive a first input from said first force sensor; receive a second input from said second force sensor; and effect cooperative movement of said first robotic arm and said second robotic arm based on the first input from said first force sensor and the second input from said second force sensor in a load control mode; wherein said processor is communicatively coupled to a situational awareness module configured to determine, from cartridge data, the type of tissue being stapled and to recommend a surgical function based on the type of tissue being stapled, the first input received from said first force sensor, and the second input received from said second force sensor.
  • 3. The robotic surgical system of claim 2, wherein said memory stores instructions executable by said processor to: determine if said first robotic arm and said second robotic arm are inactive; and stop communicating with said first force sensor and said second force sensor when said first robotic arm and said second robotic arm are inactive.
  • 4. A robotic surgical system, comprising: a first robotic arm comprising a first sensor; a second robotic arm comprising a second sensor; and a control unit comprising a processor and a memory communicatively coupled to said processor, wherein said memory stores instructions executable by said processor to: receive a first input from said first sensor; receive a second input from said second sensor; and effect cooperative movement of said first robotic arm and said second robotic arm based on the first input from said first sensor and the second input from said second sensor; wherein said processor is communicatively coupled to a situational awareness module configured to determine, from cartridge data, the type of tissue being stapled and to recommend a surgical function based on the type of tissue, the first input received from said first sensor, and the second input received from said second sensor.
  • 5. The robotic surgical system of claim 4, wherein said first sensor and said second sensor are force sensors.
  • 6. The robotic surgical system of claim 5, wherein said memory stores instructions executable by said processor to enter into a load control mode upon receiving the first input from said first sensor and the second input from said second sensor.
  • 7. The robotic surgical system of claim 4, wherein said first sensor and said second sensor are position sensors.
  • 8. The robotic surgical system of claim 7, wherein said memory is configured to store instructions executable by said processor to enter into a position control mode upon receiving the first input from said first sensor and the second input from said second sensor.
  • 9. A non-transitory computer readable medium storing computer readable instructions which, when executed, cause a machine to: receive a first input from a first force sensor; receive a second input from a second force sensor; and effect cooperative movement of a first robotic arm and a second robotic arm based on the first input from the first force sensor and the second input from the second force sensor in a load control mode; wherein the machine is operably configured to switch from the load control mode to a position control mode upon movement of a surgical tool mounted to one of the robotic arms outside a defined boundary.
  • 10. A non-transitory computer readable medium storing computer readable instructions which, when executed, cause a machine to: receive a first input from a first force sensor; receive a second input from a second force sensor; and effect cooperative movement of a first robotic arm and a second robotic arm based on the first input from the first force sensor and the second input from the second force sensor in a load control mode; further comprising a situational awareness module configured to determine, from cartridge data, the type of tissue being stapled and to recommend a surgical function based on the type of tissue, the first input received from the first force sensor, and the second input received from the second force sensor.
  • 11. A non-transitory computer readable medium storing computer readable instructions which, when executed, cause a machine to: receive a first input from a first force sensor; receive a second input from a second force sensor; effect cooperative movement of a first robotic arm and a second robotic arm based on the first input from the first force sensor and the second input from the second force sensor in a load control mode; determine if the first robotic arm and the second robotic arm are activated; and stop communicating with the first force sensor and the second force sensor when the first robotic arm and the second robotic arm are inactive.
  • 12. A robotic surgical system, comprising: a first robotic arm comprising a first sensor; a second robotic arm comprising a second sensor; and a control circuit configured to: receive a first input from said first sensor; receive a second input from said second sensor; effect cooperative movement of said first robotic arm and said second robotic arm based on the first input from said first sensor and the second input from said second sensor; and receive a suggested surgical function from a situational awareness module configured to synthesize data from multiple sources to determine the suggested surgical function, wherein the suggested surgical function is at least based on the first input, the second input, and the type of tissue being stapled determined from cartridge data by the situational awareness module.
  • 13. A robotic surgical system, comprising: a first robotic arm adapted to manipulate a surgical tool; a second robotic arm; and a control circuit configured to: in a load control mode, effect cooperative movement of said first robotic arm and said second robotic arm based on forces detected by said first robotic arm and said second robotic arm; in a position control mode, effect cooperative movement of said first robotic arm and said second robotic arm based on positions of said first robotic arm and said second robotic arm; and switch from the load control mode to the position control mode upon movement of the surgical tool outside a defined boundary.
  • 14. A robotic surgical system, comprising: a first robotic arm comprising a first sensor; a second robotic arm comprising a second sensor; and a control circuit configured to: receive a first input from said first sensor; receive a second input from said second sensor; effect cooperative movement of said first robotic arm and said second robotic arm based on the first input from said first sensor and the second input from said second sensor; determine if said first robotic arm and said second robotic arm are inactive; and stop communicating with said first sensor and said second sensor when said first robotic arm and said second robotic arm are inactive.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of priority under 35 U.S.C. 119(e) to U.S. Provisional Patent Application Ser. No. 62/649,307, titled AUTOMATIC TOOL ADJUSTMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS, filed Mar. 28, 2018, the disclosure of which is herein incorporated by reference in its entirety. This application also claims the benefit of priority under 35 U.S.C. 119(e) to U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, to U.S. Provisional Patent Application Ser. No. 62/611,340, titled CLOUD-BASED MEDICAL ANALYTICS, filed Dec. 28, 2017, and to U.S. Provisional Patent Application Ser. No. 62/611,339, titled ROBOT ASSISTED SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of each of which is herein incorporated by reference in its entirety.

10313137 Aarnio et al. Jun 2019 B2
10314577 Laurent et al. Jun 2019 B2
10314582 Shelton, IV et al. Jun 2019 B2
10321907 Shelton, IV et al. Jun 2019 B2
10321964 Grover et al. Jun 2019 B2
10327764 Harris et al. Jun 2019 B2
10335147 Rector et al. Jul 2019 B2
10335149 Baxter, III et al. Jul 2019 B2
10335180 Johnson et al. Jul 2019 B2
10335227 Heard Jul 2019 B2
10342543 Shelton, IV et al. Jul 2019 B2
10342602 Strobl et al. Jul 2019 B2
10342623 Huelman et al. Jul 2019 B2
10343102 Reasoner et al. Jul 2019 B2
10357246 Shelton, IV et al. Jul 2019 B2
10357247 Shelton, IV et al. Jul 2019 B2
10362179 Harris Jul 2019 B2
10363037 Aronhalt et al. Jul 2019 B2
10368861 Baxter, III et al. Aug 2019 B2
10368865 Harris et al. Aug 2019 B2
10368867 Harris et al. Aug 2019 B2
10368876 Bhatnagar et al. Aug 2019 B2
10368894 Madan et al. Aug 2019 B2
10368903 Morales et al. Aug 2019 B2
10376263 Morgan et al. Aug 2019 B2
10376305 Yates et al. Aug 2019 B2
10376337 Kilroy et al. Aug 2019 B2
10376338 Taylor et al. Aug 2019 B2
10378893 Mankovskii Aug 2019 B2
10383518 Abu-Tarif et al. Aug 2019 B2
10383699 Kilroy et al. Aug 2019 B2
10390718 Chen et al. Aug 2019 B2
10390794 Kuroiwa et al. Aug 2019 B2
10390825 Shelton, IV et al. Aug 2019 B2
10390831 Holsten et al. Aug 2019 B2
10390895 Henderson et al. Aug 2019 B2
10398517 Eckert et al. Sep 2019 B2
10398521 Itkowitz et al. Sep 2019 B2
10404521 McChord et al. Sep 2019 B2
10404801 Martch Sep 2019 B2
10405857 Shelton, IV et al. Sep 2019 B2
10405863 Wise et al. Sep 2019 B2
10413291 Worthington et al. Sep 2019 B2
10413293 Shelton, IV et al. Sep 2019 B2
10413297 Harris et al. Sep 2019 B2
10417446 Takeyama Sep 2019 B2
10420552 Shelton, IV et al. Sep 2019 B2
10420558 Nalagatla et al. Sep 2019 B2
10420559 Marczyk et al. Sep 2019 B2
10420620 Rockrohr Sep 2019 B2
10420865 Reasoner et al. Sep 2019 B2
10422727 Pliskin Sep 2019 B2
10426466 Contini et al. Oct 2019 B2
10426467 Miller et al. Oct 2019 B2
10426468 Contini et al. Oct 2019 B2
10426471 Shelton, IV et al. Oct 2019 B2
10433837 Worthington et al. Oct 2019 B2
10433844 Shelton, IV et al. Oct 2019 B2
10433849 Shelton, IV et al. Oct 2019 B2
10441279 Shelton, IV et al. Oct 2019 B2
10441345 Aldridge et al. Oct 2019 B2
10448948 Shelton, IV et al. Oct 2019 B2
10448950 Shelton, IV et al. Oct 2019 B2
10456137 Vendely et al. Oct 2019 B2
10456140 Shelton, IV et al. Oct 2019 B2
10456193 Yates et al. Oct 2019 B2
10463365 Williams Nov 2019 B2
10463367 Kostrzewski et al. Nov 2019 B2
10463371 Kostrzewski Nov 2019 B2
10463436 Jackson et al. Nov 2019 B2
10470762 Leimbach et al. Nov 2019 B2
10470764 Baxter, III et al. Nov 2019 B2
10470768 Harris et al. Nov 2019 B2
10470791 Houser Nov 2019 B2
10471254 Sano et al. Nov 2019 B2
10478181 Shelton, IV et al. Nov 2019 B2
10478189 Bear et al. Nov 2019 B2
10478190 Miller et al. Nov 2019 B2
10478544 Friederichs et al. Nov 2019 B2
10485450 Gupta et al. Nov 2019 B2
10485542 Shelton, IV et al. Nov 2019 B2
10485543 Shelton, IV et al. Nov 2019 B2
10492785 Overmyer et al. Dec 2019 B2
10496788 Amarasingham et al. Dec 2019 B2
10498269 Zemlok et al. Dec 2019 B2
10499891 Chaplin et al. Dec 2019 B2
10499914 Huang et al. Dec 2019 B2
10499915 Aranyi Dec 2019 B2
10499994 Luks et al. Dec 2019 B2
10507068 Kopp et al. Dec 2019 B2
10512461 Gupta et al. Dec 2019 B2
10512499 McHenry et al. Dec 2019 B2
10512514 Nowlin et al. Dec 2019 B2
10517588 Gupta et al. Dec 2019 B2
10517595 Hunter et al. Dec 2019 B2
10517596 Hunter et al. Dec 2019 B2
10517686 Vokrot et al. Dec 2019 B2
10524789 Swayze et al. Jan 2020 B2
10531874 Morgan et al. Jan 2020 B2
10531929 Widenhouse et al. Jan 2020 B2
10532330 Diallo et al. Jan 2020 B2
10536617 Liang et al. Jan 2020 B2
10537324 Shelton, IV et al. Jan 2020 B2
10537325 Bakos et al. Jan 2020 B2
10537351 Shelton, IV et al. Jan 2020 B2
10542978 Chowaniec et al. Jan 2020 B2
10542979 Shelton, IV et al. Jan 2020 B2
10542982 Beckman et al. Jan 2020 B2
10542991 Shelton, IV et al. Jan 2020 B2
10548504 Shelton, IV et al. Feb 2020 B2
10548612 Martinez et al. Feb 2020 B2
10548673 Harris et al. Feb 2020 B2
10552574 Sweeney Feb 2020 B2
10555675 Satish et al. Feb 2020 B2
10555748 Yates et al. Feb 2020 B2
10555750 Conlon et al. Feb 2020 B2
10555769 Worrell et al. Feb 2020 B2
10561422 Schellin et al. Feb 2020 B2
10561471 Nichogi Feb 2020 B2
10568625 Harris et al. Feb 2020 B2
10568626 Shelton, IV et al. Feb 2020 B2
10568632 Miller et al. Feb 2020 B2
10575868 Hall et al. Mar 2020 B2
10582928 Hunter et al. Mar 2020 B2
10582931 Mujawar Mar 2020 B2
10586074 Rose et al. Mar 2020 B2
10588625 Weaner et al. Mar 2020 B2
10588629 Malinouskas et al. Mar 2020 B2
10588630 Shelton, IV et al. Mar 2020 B2
10588631 Shelton, IV et al. Mar 2020 B2
10588632 Shelton, IV et al. Mar 2020 B2
10588711 DiCarlo et al. Mar 2020 B2
10595882 Parfett et al. Mar 2020 B2
10595887 Shelton, IV et al. Mar 2020 B2
10595930 Scheib et al. Mar 2020 B2
10595952 Forrest et al. Mar 2020 B2
10602848 Magana Mar 2020 B2
10603036 Hunter et al. Mar 2020 B2
10603128 Zergiebel et al. Mar 2020 B2
10610223 Wellman et al. Apr 2020 B2
10610224 Shelton, IV et al. Apr 2020 B2
10610286 Wiener et al. Apr 2020 B2
10610313 Bailey et al. Apr 2020 B2
10617412 Shelton, IV et al. Apr 2020 B2
10617414 Shelton, IV et al. Apr 2020 B2
10617482 Houser et al. Apr 2020 B2
10617484 Kilroy et al. Apr 2020 B2
10624635 Harris et al. Apr 2020 B2
10624691 Wiener et al. Apr 2020 B2
10631423 Collins et al. Apr 2020 B2
10631912 McFarlin et al. Apr 2020 B2
10631916 Horner et al. Apr 2020 B2
10631917 Ineson Apr 2020 B2
10631939 Dachs, II et al. Apr 2020 B2
10639027 Shelton, IV et al. May 2020 B2
10639034 Harris et al. May 2020 B2
10639035 Shelton, IV et al. May 2020 B2
10639036 Yates May 2020 B2
10639037 Shelton, IV et al. May 2020 B2
10639039 Vendely et al. May 2020 B2
10639111 Kopp May 2020 B2
10653413 Worthington et al. May 2020 B2
10653476 Ross May 2020 B2
10653489 Kopp May 2020 B2
10656720 Holz May 2020 B1
10660705 Piron et al. May 2020 B2
10667809 Bakos et al. Jun 2020 B2
10667810 Shelton, IV et al. Jun 2020 B2
10667811 Harris et al. Jun 2020 B2
10667877 Kapadia Jun 2020 B2
10674897 Levy Jun 2020 B2
10675021 Harris et al. Jun 2020 B2
10675023 Cappola Jun 2020 B2
10675024 Shelton, IV et al. Jun 2020 B2
10675025 Swayze et al. Jun 2020 B2
10675026 Harris et al. Jun 2020 B2
10675104 Kapadia Jun 2020 B2
10677764 Ross et al. Jun 2020 B2
10679758 Fox et al. Jun 2020 B2
10682136 Harris et al. Jun 2020 B2
10682138 Shelton, IV et al. Jun 2020 B2
10686805 Reybok, Jr. et al. Jun 2020 B2
10687806 Shelton, IV et al. Jun 2020 B2
10687809 Shelton, IV et al. Jun 2020 B2
10687810 Shelton, IV et al. Jun 2020 B2
10687884 Wiener et al. Jun 2020 B2
10687905 Kostrzewski Jun 2020 B2
10695055 Shelton, IV et al. Jun 2020 B2
10695081 Shelton, IV et al. Jun 2020 B2
10695134 Barral et al. Jun 2020 B2
10702270 Shelton, IV et al. Jul 2020 B2
10702271 Aranyi et al. Jul 2020 B2
10709446 Harris et al. Jul 2020 B2
10716615 Shelton, IV et al. Jul 2020 B2
10716639 Kapadia et al. Jul 2020 B2
10717194 Griffiths et al. Jul 2020 B2
10722222 Aranyi Jul 2020 B2
10722233 Wellman Jul 2020 B2
10729458 Stoddard et al. Aug 2020 B2
10733267 Pedersen Aug 2020 B2
10736219 Seow et al. Aug 2020 B2
10736616 Scheib et al. Aug 2020 B2
10736628 Yates et al. Aug 2020 B2
10736629 Shelton, IV et al. Aug 2020 B2
10736636 Baxter, III et al. Aug 2020 B2
10736705 Scheib et al. Aug 2020 B2
10743872 Leimbach et al. Aug 2020 B2
10748115 Laster et al. Aug 2020 B2
10751052 Stokes et al. Aug 2020 B2
10751136 Farritor et al. Aug 2020 B2
10751768 Hersey et al. Aug 2020 B2
10755813 Shelton, IV et al. Aug 2020 B2
10758229 Shelton, IV et al. Sep 2020 B2
10758230 Shelton, IV et al. Sep 2020 B2
10758294 Jones Sep 2020 B2
10758310 Shelton, IV et al. Sep 2020 B2
10842897 Schwartz et al. Nov 2020 B2
10864050 Tabandeh et al. Dec 2020 B2
10881446 Strobl Jan 2021 B2
20020049551 Friedman et al. Apr 2002 A1
20030093503 Yamaki et al. May 2003 A1
20030114851 Truckai et al. Jun 2003 A1
20030210812 Khamene et al. Nov 2003 A1
20030223877 Anstine et al. Dec 2003 A1
20040078236 Stoodley et al. Apr 2004 A1
20040199180 Knodel et al. Oct 2004 A1
20040199659 Ishikawa et al. Oct 2004 A1
20040206365 Knowlton Oct 2004 A1
20040243148 Wasielewski Dec 2004 A1
20040243435 Williams Dec 2004 A1
20050020909 Moctezuma de la Barrera et al. Jan 2005 A1
20050063575 Ma et al. Mar 2005 A1
20050065438 Miller Mar 2005 A1
20050131390 Heinrich et al. Jun 2005 A1
20050149001 Uchikubo et al. Jul 2005 A1
20050149356 Cyr et al. Jul 2005 A1
20050192633 Montpetit Sep 2005 A1
20050222631 Dalal et al. Oct 2005 A1
20050236474 Onuma et al. Oct 2005 A1
20050277913 McCary Dec 2005 A1
20060020272 Gildenberg Jan 2006 A1
20060059018 Shiobara et al. Mar 2006 A1
20060116908 Dew et al. Jun 2006 A1
20060142657 Quaid Jun 2006 A1
20060241399 Fabian Oct 2006 A1
20070010838 Shelton et al. Jan 2007 A1
20070016235 Tanaka et al. Jan 2007 A1
20070027459 Horvath et al. Feb 2007 A1
20070049947 Menn et al. Mar 2007 A1
20070078678 DiSilvestro et al. Apr 2007 A1
20070167702 Hasser et al. Jul 2007 A1
20070168461 Moore Jul 2007 A1
20070173803 Wham et al. Jul 2007 A1
20070175955 Shelton et al. Aug 2007 A1
20070225556 Ortiz et al. Sep 2007 A1
20070244478 Bahney Oct 2007 A1
20070249990 Cosmescu Oct 2007 A1
20070270660 Caylor et al. Nov 2007 A1
20070293218 Meylan et al. Dec 2007 A1
20080013460 Allen et al. Jan 2008 A1
20080015664 Podhajsky Jan 2008 A1
20080015912 Rosenthal et al. Jan 2008 A1
20080033404 Romoda et al. Feb 2008 A1
20080040151 Moore Feb 2008 A1
20080059658 Williams Mar 2008 A1
20080077158 Haider et al. Mar 2008 A1
20080083414 Messerges Apr 2008 A1
20080177362 Phillips et al. Jul 2008 A1
20080255413 Zemlok et al. Oct 2008 A1
20080262654 Omori et al. Oct 2008 A1
20080281678 Keuls et al. Nov 2008 A1
20080296346 Shelton, IV et al. Dec 2008 A1
20090036750 Weinstein et al. Feb 2009 A1
20090036794 Stubhaug et al. Feb 2009 A1
20090043253 Podaima Feb 2009 A1
20090046146 Hoyt Feb 2009 A1
20090048589 Takashino et al. Feb 2009 A1
20090076409 Wu et al. Mar 2009 A1
20090090763 Zemlok et al. Apr 2009 A1
20090099866 Newman Apr 2009 A1
20090182577 Squilla et al. Jul 2009 A1
20090206131 Weisenburgh, II et al. Aug 2009 A1
20090217932 Voegele Sep 2009 A1
20090259149 Tahara et al. Oct 2009 A1
20090259221 Tahara et al. Oct 2009 A1
20090307681 Armado et al. Dec 2009 A1
20090326321 Jacobsen et al. Dec 2009 A1
20090326336 Lemke et al. Dec 2009 A1
20100065604 Weng Mar 2010 A1
20100070417 Flynn et al. Mar 2010 A1
20100132334 Duclos et al. Jun 2010 A1
20100191100 Anderson et al. Jul 2010 A1
20100198248 Vakharia Aug 2010 A1
20100217991 Choi Aug 2010 A1
20100235689 Tian et al. Sep 2010 A1
20100250571 Pierce et al. Sep 2010 A1
20100292535 Paskar Nov 2010 A1
20110022032 Zemlok et al. Jan 2011 A1
20110077512 Boswell Mar 2011 A1
20110087238 Wang et al. Apr 2011 A1
20110105895 Kornblau et al. May 2011 A1
20110118708 Burbank et al. May 2011 A1
20110119075 Dhoble May 2011 A1
20110125149 El-Galley et al. May 2011 A1
20110237883 Chun Sep 2011 A1
20110306840 Allen et al. Dec 2011 A1
20120022519 Huang et al. Jan 2012 A1
20120059684 Hampapur et al. Mar 2012 A1
20120116381 Houser et al. May 2012 A1
20120130217 Kauphusman et al. May 2012 A1
20120172696 Kallback et al. Jul 2012 A1
20120191091 Allen Jul 2012 A1
20120203785 Awada Aug 2012 A1
20120211542 Racenet Aug 2012 A1
20120245958 Lawrence et al. Sep 2012 A1
20120265555 Cappuzzo et al. Oct 2012 A1
20120292367 Morgan et al. Nov 2012 A1
20120319859 Taub et al. Dec 2012 A1
20130024213 Poon Jan 2013 A1
20130046182 Hegg et al. Feb 2013 A1
20130046279 Niklewski et al. Feb 2013 A1
20130066647 Andrie et al. Mar 2013 A1
20130090526 Suzuki et al. Apr 2013 A1
20130093829 Rosenblatt et al. Apr 2013 A1
20130105552 Weir et al. May 2013 A1
20130116218 Kaplan et al. May 2013 A1
20130165776 Blomqvist Jun 2013 A1
20130178853 Hyink et al. Jul 2013 A1
20130206813 Nalagatla Aug 2013 A1
20130214025 Zemlok et al. Aug 2013 A1
20130253480 Kimball et al. Sep 2013 A1
20130256373 Schmid et al. Oct 2013 A1
20130277410 Fernandez et al. Oct 2013 A1
20130317837 Ballantyne et al. Nov 2013 A1
20130321425 Greene et al. Dec 2013 A1
20130325809 Kim et al. Dec 2013 A1
20130331875 Ross et al. Dec 2013 A1
20140001231 Shelton, IV et al. Jan 2014 A1
20140001234 Shelton, IV et al. Jan 2014 A1
20140005640 Shelton, IV et al. Jan 2014 A1
20140006132 Barker Jan 2014 A1
20140006943 Robbins et al. Jan 2014 A1
20140029411 Nayak et al. Jan 2014 A1
20140035762 Shelton, IV et al. Feb 2014 A1
20140066700 Wilson et al. Mar 2014 A1
20140081255 Johnson et al. Mar 2014 A1
20140081659 Nawana et al. Mar 2014 A1
20140087999 Kaplan et al. Mar 2014 A1
20140092089 Kasuya et al. Apr 2014 A1
20140107697 Patani et al. Apr 2014 A1
20140108035 Akbay et al. Apr 2014 A1
20140108983 William et al. Apr 2014 A1
20140148729 Schmitz et al. May 2014 A1
20140187856 Holoien et al. Jul 2014 A1
20140204190 Rosenblatt, III et al. Jul 2014 A1
20140243799 Parihar Aug 2014 A1
20140246475 Hall et al. Sep 2014 A1
20140249557 Koch et al. Sep 2014 A1
20140252064 Mozdzierz et al. Sep 2014 A1
20140263541 Leimbach et al. Sep 2014 A1
20140263552 Hall et al. Sep 2014 A1
20140303660 Boyden et al. Oct 2014 A1
20150006201 Pait et al. Jan 2015 A1
20150025549 Kilroy et al. Jan 2015 A1
20150032150 Ishida et al. Jan 2015 A1
20150051617 Takemura et al. Feb 2015 A1
20150053737 Leimbach et al. Feb 2015 A1
20150066000 An et al. Mar 2015 A1
20150070187 Wiesner et al. Mar 2015 A1
20150108198 Estrella Apr 2015 A1
20150122870 Zemlok et al. May 2015 A1
20150133945 Dushyant et al. May 2015 A1
20150196295 Shelton, IV et al. Jul 2015 A1
20150199109 Lee Jul 2015 A1
20150238355 Vezzu et al. Aug 2015 A1
20150272557 Overmyer et al. Oct 2015 A1
20150272571 Leimbach et al. Oct 2015 A1
20150272580 Leimbach et al. Oct 2015 A1
20150272582 Leimbach et al. Oct 2015 A1
20150297200 Fitzsimmons et al. Oct 2015 A1
20150297222 Huitema et al. Oct 2015 A1
20150297228 Huitema et al. Oct 2015 A1
20150297233 Huitema et al. Oct 2015 A1
20150297311 Tesar Oct 2015 A1
20150302157 Collar et al. Oct 2015 A1
20150310174 Coudert et al. Oct 2015 A1
20150313538 Bechtel et al. Nov 2015 A1
20150317899 Dumbauld et al. Nov 2015 A1
20150332003 Stamm et al. Nov 2015 A1
20150332196 Stiller et al. Nov 2015 A1
20160000437 Giordano et al. Jan 2016 A1
20160015471 Piron et al. Jan 2016 A1
20160034648 Mohlenbrock et al. Feb 2016 A1
20160038253 Piron et al. Feb 2016 A1
20160066913 Swayze et al. Mar 2016 A1
20160078190 Greene et al. Mar 2016 A1
20160106516 Mesallum Apr 2016 A1
20160106934 Hiraga et al. Apr 2016 A1
20160180045 Syed Jun 2016 A1
20160192960 Bueno et al. Jul 2016 A1
20160206202 Frangioni Jul 2016 A1
20160235303 Fleming et al. Aug 2016 A1
20160249910 Shelton, IV et al. Sep 2016 A1
20160296246 Schaller Oct 2016 A1
20160302210 Thornton et al. Oct 2016 A1
20160310055 Zand et al. Oct 2016 A1
20160310203 Gaspredes et al. Oct 2016 A1
20160321400 Durrant et al. Nov 2016 A1
20160323283 Kang et al. Nov 2016 A1
20160324537 Green et al. Nov 2016 A1
20160342916 Arceneaux et al. Nov 2016 A1
20160345857 Jensrud et al. Dec 2016 A1
20160350490 Martinez et al. Dec 2016 A1
20160374665 DiNardo et al. Dec 2016 A1
20160374723 Frankhouser et al. Dec 2016 A1
20160374762 Case et al. Dec 2016 A1
20160374775 Prpa et al. Dec 2016 A1
20170000516 Stulen et al. Jan 2017 A1
20170000553 Wiener et al. Jan 2017 A1
20170000554 Yates et al. Jan 2017 A1
20170020462 Brown, III et al. Jan 2017 A1
20170027603 Pandey Feb 2017 A1
20170042717 Agrawal Feb 2017 A1
20170068792 Reiner Mar 2017 A1
20170079730 Azizian Mar 2017 A1
20170086829 Vendely et al. Mar 2017 A1
20170086930 Thompson et al. Mar 2017 A1
20170105754 Boudreaux et al. Apr 2017 A1
20170132374 Lee et al. May 2017 A1
20170132785 Wshah et al. May 2017 A1
20170143284 Sehnert et al. May 2017 A1
20170143442 Tesar et al. May 2017 A1
20170151026 Panescu et al. Jun 2017 A1
20170156076 Eom et al. Jun 2017 A1
20170164997 Johnson et al. Jun 2017 A1
20170165012 Chaplin et al. Jun 2017 A1
20170172565 Heneveld Jun 2017 A1
20170172614 Scheib et al. Jun 2017 A1
20170177807 Fabian Jun 2017 A1
20170181745 Penna et al. Jun 2017 A1
20170196637 Shelton, IV et al. Jul 2017 A1
20170202591 Shelton, IV et al. Jul 2017 A1
20170202605 Shelton, IV et al. Jul 2017 A1
20170202607 Shelton, IV et al. Jul 2017 A1
20170224332 Hunter et al. Aug 2017 A1
20170224334 Worthington et al. Aug 2017 A1
20170224428 Kopp Aug 2017 A1
20170231627 Shelton, IV et al. Aug 2017 A1
20170231628 Shelton, IV et al. Aug 2017 A1
20170249432 Grantcharov Aug 2017 A1
20170252095 Johnson Sep 2017 A1
20170255751 Sanmugalingham Sep 2017 A1
20170262604 Francois Sep 2017 A1
20170265864 Hessler et al. Sep 2017 A1
20170273715 Piron Sep 2017 A1
20170281171 Shelton, IV et al. Oct 2017 A1
20170281173 Shelton, IV et al. Oct 2017 A1
20170281186 Shelton, IV et al. Oct 2017 A1
20170281187 Shelton, IV et al. Oct 2017 A1
20170281189 Nalagatla et al. Oct 2017 A1
20170290585 Shelton, IV et al. Oct 2017 A1
20170296169 Yates et al. Oct 2017 A1
20170296173 Shelton, IV et al. Oct 2017 A1
20170296177 Harris et al. Oct 2017 A1
20170296179 Shelton, IV Oct 2017 A1
20170296185 Swensgard et al. Oct 2017 A1
20170296213 Swensgard et al. Oct 2017 A1
20170303984 Malackowski Oct 2017 A1
20170304020 Ng et al. Oct 2017 A1
20170325876 Nakadate et al. Nov 2017 A1
20170360439 Chen et al. Dec 2017 A1
20170360499 Greep et al. Dec 2017 A1
20170367695 Shelton, IV et al. Dec 2017 A1
20170367697 Shelton, IV et al. Dec 2017 A1
20170367754 Narisawa Dec 2017 A1
20170370710 Chen et al. Dec 2017 A1
20180008260 Baxter, III et al. Jan 2018 A1
20180008359 Randle Jan 2018 A1
20180011983 Zuhars et al. Jan 2018 A1
20180014848 Messerly et al. Jan 2018 A1
20180049817 Swayze et al. Feb 2018 A1
20180050196 Pawsey et al. Feb 2018 A1
20180055529 Messerly et al. Mar 2018 A1
20180065248 Barral et al. Mar 2018 A1
20180092706 Anderson Apr 2018 A1
20180098816 Govari et al. Apr 2018 A1
20180110523 Shelton, IV Apr 2018 A1
20180116662 Shelton, IV et al. May 2018 A1
20180116735 Tierney et al. May 2018 A1
20180122506 Grantcharov et al. May 2018 A1
20180125590 Giordano et al. May 2018 A1
20180132895 Silver May 2018 A1
20180153574 Faller et al. Jun 2018 A1
20180153628 Grover et al. Jun 2018 A1
20180153632 Tokarchuk et al. Jun 2018 A1
20180154297 Maletich et al. Jun 2018 A1
20180161716 Li et al. Jun 2018 A1
20180168575 Simms et al. Jun 2018 A1
20180168577 Aronhalt et al. Jun 2018 A1
20180168578 Aronhalt et al. Jun 2018 A1
20180168579 Aronhalt et al. Jun 2018 A1
20180168584 Harris et al. Jun 2018 A1
20180168586 Shelton, IV et al. Jun 2018 A1
20180168589 Swayze et al. Jun 2018 A1
20180168590 Overmyer et al. Jun 2018 A1
20180168592 Overmyer et al. Jun 2018 A1
20180168593 Overmyer et al. Jun 2018 A1
20180168597 Fanelli et al. Jun 2018 A1
20180168598 Shelton, IV et al. Jun 2018 A1
20180168601 Bakos et al. Jun 2018 A1
20180168603 Morgan et al. Jun 2018 A1
20180168605 Baber et al. Jun 2018 A1
20180168607 Shelton, IV et al. Jun 2018 A1
20180168608 Shelton, IV et al. Jun 2018 A1
20180168609 Fanelli et al. Jun 2018 A1
20180168610 Shelton, IV et al. Jun 2018 A1
20180168614 Shelton, IV et al. Jun 2018 A1
20180168615 Shelton, IV et al. Jun 2018 A1
20180168616 Shelton, IV et al. Jun 2018 A1
20180168617 Shelton, IV et al. Jun 2018 A1
20180168618 Scott et al. Jun 2018 A1
20180168619 Scott et al. Jun 2018 A1
20180168623 Simms et al. Jun 2018 A1
20180168625 Posada et al. Jun 2018 A1
20180168627 Weaner et al. Jun 2018 A1
20180168628 Hunter et al. Jun 2018 A1
20180168629 Shelton, IV et al. Jun 2018 A1
20180168632 Harris et al. Jun 2018 A1
20180168633 Shelton, IV et al. Jun 2018 A1
20180168639 Shelton, IV et al. Jun 2018 A1
20180168647 Shelton, IV et al. Jun 2018 A1
20180168648 Shelton, IV et al. Jun 2018 A1
20180168649 Shelton, IV et al. Jun 2018 A1
20180168650 Shelton, IV et al. Jun 2018 A1
20180168651 Shelton, IV et al. Jun 2018 A1
20180177557 Kapadia et al. Jun 2018 A1
20180199995 Odermatt et al. Jul 2018 A1
20180214025 Homyk et al. Aug 2018 A1
20180221598 Silver Aug 2018 A1
20180228557 Darisse Aug 2018 A1
20180242967 Meade Aug 2018 A1
20180250080 Kopp Sep 2018 A1
20180250084 Kopp et al. Sep 2018 A1
20180263710 Sakaguchi et al. Sep 2018 A1
20180263717 Kopp Sep 2018 A1
20180268320 Shekhar Sep 2018 A1
20180271520 Shelton, IV et al. Sep 2018 A1
20180271603 Nir et al. Sep 2018 A1
20180296286 Peine et al. Oct 2018 A1
20180304471 Tokuchi Oct 2018 A1
20180310935 Wixey Nov 2018 A1
20180310986 Batchelor et al. Nov 2018 A1
20180310997 Peine et al. Nov 2018 A1
20180317826 Muhsin et al. Nov 2018 A1
20180317915 McDonald, II Nov 2018 A1
20180338806 Grubbs Nov 2018 A1
20180358112 Sharifi Sedeh et al. Dec 2018 A1
20180360449 Shelton, IV et al. Dec 2018 A1
20180360452 Shelton, IV et al. Dec 2018 A1
20180360454 Shelton, IV et al. Dec 2018 A1
20180360456 Shelton, IV et al. Dec 2018 A1
20180368930 Esterberg et al. Dec 2018 A1
20180369511 Zergiebel et al. Dec 2018 A1
20190000446 Shelton, IV et al. Jan 2019 A1
20190000447 Shelton, IV Jan 2019 A1
20190000448 Shelton, IV Jan 2019 A1
20190000465 Shelton, IV et al. Jan 2019 A1
20190000478 Messerly et al. Jan 2019 A1
20190000530 Yates et al. Jan 2019 A1
20190000565 Shelton, IV Jan 2019 A1
20190000568 Connolly Jan 2019 A1
20190000569 Crawford et al. Jan 2019 A1
20190000577 Shelton, IV Jan 2019 A1
20190001079 Zergiebel et al. Jan 2019 A1
20190005641 Yamamoto Jan 2019 A1
20190006047 Gorek et al. Jan 2019 A1
20190008600 Pedros et al. Jan 2019 A1
20190029712 Stoddard et al. Jan 2019 A1
20190038335 Mohr et al. Feb 2019 A1
20190038364 Enoki Feb 2019 A1
20190053801 Wixey et al. Feb 2019 A1
20190053866 Seow et al. Feb 2019 A1
20190069949 Vrba et al. Mar 2019 A1
20190069964 Hagn Mar 2019 A1
20190070550 Lalomia et al. Mar 2019 A1
20190070731 Bowling et al. Mar 2019 A1
20190087544 Peterson Mar 2019 A1
20190090969 Jarc et al. Mar 2019 A1
20190099227 Rockrohr Apr 2019 A1
20190104919 Shelton, IV et al. Apr 2019 A1
20190125320 Shelton, IV et al. May 2019 A1
20190125321 Shelton, IV et al. May 2019 A1
20190125324 Scheib et al. May 2019 A1
20190125335 Shelton, IV et al. May 2019 A1
20190125336 Deck et al. May 2019 A1
20190125337 Shelton, IV et al. May 2019 A1
20190125338 Shelton, IV et al. May 2019 A1
20190125339 Shelton, IV et al. May 2019 A1
20190125344 DiNardo et al. May 2019 A1
20190125347 Stokes et al. May 2019 A1
20190125348 Shelton, IV et al. May 2019 A1
20190125352 Shelton, IV et al. May 2019 A1
20190125353 Shelton, IV et al. May 2019 A1
20190125354 Deck et al. May 2019 A1
20190125355 Shelton, IV et al. May 2019 A1
20190125356 Shelton, IV et al. May 2019 A1
20190125357 Shelton, IV et al. May 2019 A1
20190125358 Shelton, IV et al. May 2019 A1
20190125359 Shelton, IV et al. May 2019 A1
20190125360 Shelton, IV et al. May 2019 A1
20190125361 Shelton, IV et al. May 2019 A1
20190125377 Shelton, IV May 2019 A1
20190125378 Shelton, IV et al. May 2019 A1
20190125379 Shelton, IV et al. May 2019 A1
20190125380 Hunter et al. May 2019 A1
20190125381 Scheib et al. May 2019 A1
20190125383 Scheib et al. May 2019 A1
20190125384 Scheib et al. May 2019 A1
20190125385 Scheib et al. May 2019 A1
20190125386 Shelton, IV et al. May 2019 A1
20190125387 Parihar et al. May 2019 A1
20190125388 Shelton, IV et al. May 2019 A1
20190125389 Shelton, IV et al. May 2019 A1
20190125390 Shelton, IV et al. May 2019 A1
20190125430 Shelton, IV et al. May 2019 A1
20190125431 Shelton, IV et al. May 2019 A1
20190125432 Shelton, IV et al. May 2019 A1
20190125454 Stokes et al. May 2019 A1
20190125455 Shelton, IV et al. May 2019 A1
20190125456 Shelton, IV et al. May 2019 A1
20190125457 Parihar et al. May 2019 A1
20190125458 Shelton, IV et al. May 2019 A1
20190125459 Shelton, IV et al. May 2019 A1
20190125476 Shelton, IV et al. May 2019 A1
20190133703 Seow et al. May 2019 A1
20190142449 Shelton, IV et al. May 2019 A1
20190142535 Seow et al. May 2019 A1
20190145942 Dutriez et al. May 2019 A1
20190150975 Kawasaki et al. May 2019 A1
20190159777 Ehrenfels et al. May 2019 A1
20190159778 Shelton, IV et al. May 2019 A1
20190162179 O'Shea et al. May 2019 A1
20190164285 Nye et al. May 2019 A1
20190192157 Scott et al. Jun 2019 A1
20190192236 Shelton, IV et al. Jun 2019 A1
20190200844 Shelton, IV et al. Jul 2019 A1
20190200863 Shelton, IV et al. Jul 2019 A1
20190200905 Shelton, IV et al. Jul 2019 A1
20190200906 Shelton, IV et al. Jul 2019 A1
20190200977 Shelton, IV et al. Jul 2019 A1
20190200980 Shelton, IV et al. Jul 2019 A1
20190200981 Harris et al. Jul 2019 A1
20190200984 Shelton, IV et al. Jul 2019 A1
20190200985 Shelton, IV et al. Jul 2019 A1
20190200986 Shelton, IV et al. Jul 2019 A1
20190200987 Shelton, IV et al. Jul 2019 A1
20190200988 Shelton, IV Jul 2019 A1
20190200996 Shelton, IV et al. Jul 2019 A1
20190200997 Shelton, IV et al. Jul 2019 A1
20190200998 Shelton, IV et al. Jul 2019 A1
20190201020 Shelton, IV et al. Jul 2019 A1
20190201021 Shelton, IV et al. Jul 2019 A1
20190201023 Shelton, IV et al. Jul 2019 A1
20190201024 Shelton, IV et al. Jul 2019 A1
20190201025 Shelton, IV et al. Jul 2019 A1
20190201026 Shelton, IV et al. Jul 2019 A1
20190201027 Shelton, IV et al. Jul 2019 A1
20190201028 Shelton, IV et al. Jul 2019 A1
20190201029 Shelton, IV et al. Jul 2019 A1
20190201030 Shelton, IV et al. Jul 2019 A1
20190201033 Yates et al. Jul 2019 A1
20190201034 Shelton, IV et al. Jul 2019 A1
20190201036 Nott et al. Jul 2019 A1
20190201037 Houser et al. Jul 2019 A1
20190201038 Yates et al. Jul 2019 A1
20190201039 Widenhouse et al. Jul 2019 A1
20190201040 Messerly et al. Jul 2019 A1
20190201041 Kimball et al. Jul 2019 A1
20190201042 Nott et al. Jul 2019 A1
20190201043 Shelton, IV et al. Jul 2019 A1
20190201044 Shelton, IV et al. Jul 2019 A1
20190201045 Yates et al. Jul 2019 A1
20190201046 Shelton, IV et al. Jul 2019 A1
20190201047 Yates et al. Jul 2019 A1
20190201073 Nott et al. Jul 2019 A1
20190201074 Yates et al. Jul 2019 A1
20190201075 Shelton, IV et al. Jul 2019 A1
20190201077 Yates et al. Jul 2019 A1
20190201079 Shelton, IV et al. Jul 2019 A1
20190201080 Messerly et al. Jul 2019 A1
20190201081 Shelton, IV et al. Jul 2019 A1
20190201082 Shelton, IV et al. Jul 2019 A1
20190201083 Shelton, IV et al. Jul 2019 A1
20190201084 Shelton, IV et al. Jul 2019 A1
20190201085 Shelton, IV et al. Jul 2019 A1
20190201086 Shelton, IV et al. Jul 2019 A1
20190201087 Shelton, IV et al. Jul 2019 A1
20190201088 Shelton, IV et al. Jul 2019 A1
20190201090 Shelton, IV et al. Jul 2019 A1
20190201091 Yates et al. Jul 2019 A1
20190201092 Yates et al. Jul 2019 A1
20190201102 Shelton, IV et al. Jul 2019 A1
20190201104 Shelton, IV et al. Jul 2019 A1
20190201105 Shelton, IV et al. Jul 2019 A1
20190201111 Shelton, IV et al. Jul 2019 A1
20190201112 Wiener et al. Jul 2019 A1
20190201113 Shelton, IV et al. Jul 2019 A1
20190201114 Shelton, IV et al. Jul 2019 A1
20190201115 Shelton, IV et al. Jul 2019 A1
20190201116 Shelton, IV et al. Jul 2019 A1
20190201117 Yates et al. Jul 2019 A1
20190201118 Shelton, IV et al. Jul 2019 A1
20190201119 Harris et al. Jul 2019 A1
20190201120 Shelton, IV et al. Jul 2019 A1
20190201123 Shelton, IV et al. Jul 2019 A1
20190201124 Shelton, IV et al. Jul 2019 A1
20190201125 Shelton, IV et al. Jul 2019 A1
20190201126 Shelton, IV et al. Jul 2019 A1
20190201127 Shelton, IV et al. Jul 2019 A1
20190201128 Yates et al. Jul 2019 A1
20190201129 Shelton, IV et al. Jul 2019 A1
20190201130 Shelton, IV et al. Jul 2019 A1
20190201135 Shelton, IV et al. Jul 2019 A1
20190201136 Shelton, IV et al. Jul 2019 A1
20190201137 Shelton, IV et al. Jul 2019 A1
20190201138 Yates et al. Jul 2019 A1
20190201139 Shelton, IV et al. Jul 2019 A1
20190201140 Yates et al. Jul 2019 A1
20190201141 Shelton, IV et al. Jul 2019 A1
20190201142 Shelton, IV et al. Jul 2019 A1
20190201143 Shelton, IV et al. Jul 2019 A1
20190201144 Shelton, IV et al. Jul 2019 A1
20190201146 Shelton, IV et al. Jul 2019 A1
20190201158 Shelton, IV et al. Jul 2019 A1
20190201159 Shelton, IV et al. Jul 2019 A1
20190201594 Shelton, IV et al. Jul 2019 A1
20190201597 Shelton, IV et al. Jul 2019 A1
20190204201 Shelton, IV et al. Jul 2019 A1
20190205001 Messerly et al. Jul 2019 A1
20190205441 Shelton, IV et al. Jul 2019 A1
20190205566 Shelton, IV et al. Jul 2019 A1
20190205567 Shelton, IV et al. Jul 2019 A1
20190206003 Harris et al. Jul 2019 A1
20190206004 Shelton, IV et al. Jul 2019 A1
20190206050 Yates et al. Jul 2019 A1
20190206216 Shelton, IV et al. Jul 2019 A1
20190206542 Shelton, IV et al. Jul 2019 A1
20190206551 Yates et al. Jul 2019 A1
20190206555 Morgan et al. Jul 2019 A1
20190206556 Shelton, IV et al. Jul 2019 A1
20190206561 Shelton, IV et al. Jul 2019 A1
20190206562 Shelton, IV et al. Jul 2019 A1
20190206563 Shelton, IV et al. Jul 2019 A1
20190206564 Shelton, IV et al. Jul 2019 A1
20190206565 Shelton, IV Jul 2019 A1
20190206569 Shelton, IV et al. Jul 2019 A1
20190206576 Shelton, IV et al. Jul 2019 A1
20190207773 Shelton, IV et al. Jul 2019 A1
20190207857 Shelton, IV et al. Jul 2019 A1
20190207911 Wiener et al. Jul 2019 A1
20190208641 Yates et al. Jul 2019 A1
20190254759 Azizian Aug 2019 A1
20190261984 Nelson et al. Aug 2019 A1
20190269476 Bowling et al. Sep 2019 A1
20190274662 Rockman et al. Sep 2019 A1
20190274705 Sawhney et al. Sep 2019 A1
20190274706 Nott et al. Sep 2019 A1
20190274707 Sawhney et al. Sep 2019 A1
20190274708 Boudreaux Sep 2019 A1
20190274709 Scoggins Sep 2019 A1
20190274710 Black Sep 2019 A1
20190274711 Scoggins et al. Sep 2019 A1
20190274712 Faller et al. Sep 2019 A1
20190274713 Scoggins et al. Sep 2019 A1
20190274714 Cuti et al. Sep 2019 A1
20190274716 Nott et al. Sep 2019 A1
20190274717 Nott et al. Sep 2019 A1
20190274718 Denzinger et al. Sep 2019 A1
20190274719 Stulen Sep 2019 A1
20190274720 Gee et al. Sep 2019 A1
20190274749 Brady et al. Sep 2019 A1
20190274750 Jayme et al. Sep 2019 A1
20190274752 Denzinger et al. Sep 2019 A1
20190290389 Kopp Sep 2019 A1
20190298340 Shelton, IV et al. Oct 2019 A1
20190298341 Shelton, IV et al. Oct 2019 A1
20190298342 Shelton, IV et al. Oct 2019 A1
20190298343 Shelton, IV et al. Oct 2019 A1
20190298346 Shelton, IV et al. Oct 2019 A1
20190298347 Shelton, IV et al. Oct 2019 A1
20190298350 Shelton, IV et al. Oct 2019 A1
20190298351 Shelton, IV et al. Oct 2019 A1
20190298352 Shelton, IV et al. Oct 2019 A1
20190298353 Shelton, IV et al. Oct 2019 A1
20190298354 Shelton, IV et al. Oct 2019 A1
20190298355 Shelton, IV et al. Oct 2019 A1
20190298356 Shelton, IV et al. Oct 2019 A1
20190298357 Shelton, IV et al. Oct 2019 A1
20190298464 Abbott Oct 2019 A1
20190298481 Rosenberg et al. Oct 2019 A1
20190307520 Peine et al. Oct 2019 A1
20190314015 Shelton, IV et al. Oct 2019 A1
20190314016 Huitema et al. Oct 2019 A1
20190321117 Itkowitz et al. Oct 2019 A1
20190333626 Mansi et al. Oct 2019 A1
20190343594 Garcia Kilroy et al. Nov 2019 A1
20190374140 Tucker et al. Dec 2019 A1
20200054317 Pisarnwongs et al. Feb 2020 A1
20200054320 Harris et al. Feb 2020 A1
20200054321 Harris et al. Feb 2020 A1
20200054322 Harris et al. Feb 2020 A1
20200054323 Harris et al. Feb 2020 A1
20200054325 Harris et al. Feb 2020 A1
20200054326 Harris et al. Feb 2020 A1
20200054327 Harris et al. Feb 2020 A1
20200054328 Harris et al. Feb 2020 A1
20200054329 Shelton, IV et al. Feb 2020 A1
20200054330 Harris et al. Feb 2020 A1
20200054331 Harris et al. Feb 2020 A1
20200100830 Henderson et al. Apr 2020 A1
20200162896 Su et al. May 2020 A1
20200178971 Harris et al. Jun 2020 A1
20200261075 Boudreaux et al. Aug 2020 A1
20200261076 Boudreaux et al. Aug 2020 A1
20200261077 Shelton, IV et al. Aug 2020 A1
20200261078 Bakos et al. Aug 2020 A1
20200261080 Bakos et al. Aug 2020 A1
20200261081 Boudreaux et al. Aug 2020 A1
20200261082 Boudreaux et al. Aug 2020 A1
20200261083 Bakos et al. Aug 2020 A1
20200261084 Bakos et al. Aug 2020 A1
20200261085 Boudreaux et al. Aug 2020 A1
20200261086 Zeiner et al. Aug 2020 A1
20200261087 Timm et al. Aug 2020 A1
20200261088 Harris et al. Aug 2020 A1
20200261089 Shelton, IV et al. Aug 2020 A1
20200281665 Kopp Sep 2020 A1
20210000555 Shelton, IV et al. Jan 2021 A1
Foreign Referenced Citations (49)
Number Date Country
2015201140 Mar 2015 AU
2795323 May 2014 CA
101617950 Jan 2010 CN
104490448 Mar 2017 CN
206097107 Apr 2017 CN
108652695 Oct 2018 CN
2037167 Jul 1980 DE
3824913 Feb 1990 DE
4002843 Apr 1991 DE
102005051367 Apr 2007 DE
102016207666 Nov 2017 DE
0000756 Oct 1981 EP
2732772 May 2014 EP
3047806 Jul 2016 EP
3056923 Aug 2016 EP
3095399 Nov 2016 EP
3120781 Jan 2017 EP
3135225 Mar 2017 EP
3141181 Mar 2017 EP
2509523 Jul 2014 GB
S5373315 Jun 1978 JP
2017513561 Jun 2017 JP
20140104587 Aug 2014 KR
101587721 Jan 2016 KR
WO-9734533 Sep 1997 WO
WO-0024322 May 2000 WO
WO-0108578 Feb 2001 WO
WO-0112089 Feb 2001 WO
WO-0120892 Mar 2001 WO
WO-03079909 Oct 2003 WO
WO-2007137304 Nov 2007 WO
WO-2008056618 May 2008 WO
WO-2008069816 Jun 2008 WO
WO-2008147555 Dec 2008 WO
WO-2011112931 Sep 2011 WO
WO-2013143573 Oct 2013 WO
WO-2014134196 Sep 2014 WO
WO-2015129395 Sep 2015 WO
WO-2016100719 Jun 2016 WO
WO-2016206015 Dec 2016 WO
WO-2017011382 Jan 2017 WO
WO-2017011646 Jan 2017 WO
WO-2017058695 Apr 2017 WO
WO-2017151996 Sep 2017 WO
WO-2017189317 Nov 2017 WO
WO-2017205308 Nov 2017 WO
WO-2017210499 Dec 2017 WO
WO-2017210501 Dec 2017 WO
WO-2018152141 Aug 2018 WO
Non-Patent Literature Citations (51)
US 10,504,709 B2, 12/2019, Karancsi et al. (withdrawn)
Flores et al., "Large-scale Offloading in the Internet of Things," 2017 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops), IEEE, pp. 479-484, Mar. 13, 2017.
Kalantarian et al., "Computation Offloading for Real-Time Health-Monitoring Devices," 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), IEEE, pp. 4971-4974, Aug. 16, 2016.
Yuyi Mao et al., “A Survey on Mobile Edge Computing: The Communication Perspective,” IEEE Communications Surveys & Tutorials, pp. 2322-2358, Jun. 13, 2017.
Benkmann et al., "Concept of iterative optimization of minimally invasive surgery," 2017 22nd International Conference on Methods and Models in Automation and Robotics (MMAR), IEEE, pp. 443-446, Aug. 28, 2017.
Trautman, Peter, "Breaking the Human-Robot Deadlock: Surpassing Shared Control Performance Limits with Sparse Human-Robot Interaction," Robotics: Science and Systems XIII, pp. 1-10, Jul. 12, 2017.
Khazaei et al., "Health Informatics for Neonatal Intensive Care Units: An Analytical Modeling Perspective," IEEE Journal of Translational Engineering in Health and Medicine, vol. 3, pp. 1-9, Oct. 21, 2015.
Yang et al., "A dynamic strategy for packet scheduling and bandwidth allocation based on channel quality in IEEE 802.16e OFDMA system," Journal of Network and Computer Applications, vol. 39, pp. 52-60, May 2, 2013.
Takahashi et al., “Automatic smoke evacuation in laparoscopic surgery: a simplified method for objective evaluation,” Surgical Endoscopy, vol. 27, No. 8, pp. 2980-2987, Feb. 23, 2013.
Miksch et al., “Utilizing temporal data abstraction for data validation and therapy planning for artificially ventilated newborn infants,” Artificial Intelligence in Medicine, vol. 8, No. 6, pp. 543-576 (1996).
Horn et al., "Effective data validation of high-frequency data: Time-point-, time-interval-, and trend-based methods," Computers in Biology and Medicine, New York, NY, vol. 27, No. 5, pp. 389-409 (1997).
Stacey et al., "Temporal abstraction in intelligent clinical data analysis: A survey," Artificial Intelligence in Medicine, vol. 39, No. 1, pp. 1-24 (2006).
Zoccali, Bruno, “A Method for Approximating Component Temperatures at Altitude Conditions Based on CFD Analysis at Sea Level Conditions,” (white paper), www.tdmginc.com, Dec. 6, 2018 (9 pages).
Slocinski et al., “Distance measure for impedance spectra for quantified evaluations,” Lecture Notes on Impedance Spectroscopy, vol. 3, Taylor and Francis Group (Jul. 2012)—Book Not Attached.
Engel et al., "A safe robot system for craniofacial surgery," Proceedings of the 2001 IEEE International Conference on Robotics and Automation (ICRA), vol. 2, pp. 2020-2024 (2001).
Bonaci et al., “To Make a Robot Secure: An Experimental Analysis of Cyber Security Threats Against Teleoperated Surgical Robots,” May 13, 2015. Retrieved from the Internet: URL:https://arxiv.org/pdf/1504.04339v2.pdf [retrieved on Aug. 24, 2019].
Homa Alemzadeh et al., “Targeted Attacks on Teleoperated Surgical Robots: Dynamic Model-Based Detection and Mitigation,” 2016 46th Annual IEEE/IFIP International Conference on Dependable Systems and Networks (DSN), IEEE, Jun. 28, 2016, pp. 395-406.
Phumzile Malindi, “5. QoS in Telemedicine,” “Telemedicine,” Jun. 20, 2011, IntechOpen, pp. 119-138.
Staub et al., “Contour-based Surgical Instrument Tracking Supported by Kinematic Prediction,” Proceedings of the 2010 3rd IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics, Sep. 1, 2010, pp. 746-752.
Allan et al., “3-D Pose Estimation of Articulated Instruments in Robotic Minimally Invasive Surgery,” IEEE Transactions on Medical Imaging, vol. 37, No. 5, May 1, 2018, pp. 1204-1213.
Kassahun et al., "Surgical Robotics Beyond Enhanced Dexterity Instrumentation: A Survey of the Machine Learning Techniques and their Role in Intelligent and Autonomous Surgical Actions," International Journal of Computer Assisted Radiology and Surgery, vol. 11, No. 4, Oct. 8, 2015, pp. 553-568.
Weede et al., "An Intelligent and Autonomous Endoscopic Guidance System for Minimally Invasive Surgery," 2011 IEEE International Conference on Robotics and Automation (ICRA), May 2011, pp. 5762-5768.
Altenberg et al., “Genes of Glycolysis are Ubiquitously Overexpressed in 24 Cancer Classes,” Genomics, vol. 84, pp. 1014-1020 (2004).
Harold I. Brandon and V. Leroy Young, "Characterization and Removal of Electrosurgical Smoke," Surgical Services Management, vol. 3, No. 3, March 1997. Retrieved from the internet: <https://www.surgimedics.com/Research%20Articles/Electrosurgical%20Plume/Characterization%20And%20Removal%20Of%20Electrosurgical%20Smoke.pdf> (Year: 1997).
Marshall Brain, How Microcontrollers Work, 2006, retrieved from the internet <https://web.archive.org/web/20060221235221/http://electronics.howstuffworks.com/microcontroller.htm/printable> (Year: 2006).
CRC Press, “The Measurement, Instrumentation and Sensors Handbook,” 1999, Section VII, Chapter 41, Peter O'Shea, “Phase Measurement,” pp. 1303-1321, ISBN 0/8493-2145-X.
Jiang, "'Sound of Silence': a secure indoor wireless ultrasonic communication system," Article, 2014, pp. 46-50, Snapshots of Doctoral Research at University College Cork, School of Engineering—Electrical & Electronic Engineering, UCC, Cork, Ireland.
Li, et al., “Short-range ultrasonic communications in air using quadrature modulation,” Journal, Oct. 30, 2009, pp. 2060-2072, IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, vol. 56, No. 10, IEEE.
Salamon, "AI Detects Polyps Better Than Colonoscopists," Online Article, Jun. 3, 2018, Medscape Medical News, Digestive Disease Week (DDW) 2018: Presentation 133.
Misawa, et al., "Artificial Intelligence-Assisted Polyp Detection for Colonoscopy: Initial Experience," Article, Jun. 2018, pp. 2027-2029, vol. 154, Issue 8, American Gastroenterological Association.
Dottorato, "Analysis and Design of the Rectangular Microstrip Patch Antennas for TM0n0 operating mode," Article, Oct. 8, 2010, pp. 1-9, Microwave Journal.
Miller, et al., “Impact of Powered and Tissue-Specific Endoscopic Stapling Technology on Clinical and Economic Outcomes of Video-Assisted Thoracic Surgery Lobectomy Procedures: A Retrospective, Observational Study,” Article, Apr. 2018, pp. 707-723, vol. 35 (Issue 5), Advances in Therapy.
Hsiao-Wei Tang, “ARCM”, Video, Sep. 2012, YouTube, 5 screenshots, Retrieved from internet: <https://www.youtube.com/watch?v=UldQaxb3fRw&feature=youtu.be>.
Giannios, et al., “Visible to near-infrared refractive properties of freshly-excised human-liver tissues: marking hepatic malignancies,” Article, Jun. 14, 2016, pp. 1-10, Scientific Reports 6, Article No. 27910, Nature.
Vander Heiden, et al., “Understanding the Warburg effect: the metabolic requirements of cell proliferation,” Article, May 22, 2009, pp. 1-12, vol. 324, Issue 5930, Science.
Hirayama et al., “Quantitative Metabolome Profiling of Colon and Stomach Cancer Microenvironment by Capillary Electrophoresis Time-of-Flight Mass Spectrometry,” Article, Jun. 2009, pp. 4918-4925, vol. 69, Issue 11, Cancer Research.
Cengiz, et al., “A Tale of Two Compartments: Interstitial Versus Blood Glucose Monitoring,” Article, Jun. 2009, pp. S11-S16, vol. 11, Supplement 1, Diabetes Technology & Therapeutics.
Shen, et al., “An iridium nanoparticles dispersed carbon based thick film electrochemical biosensor and its application for a single use, disposable glucose biosensor,” Article, Feb. 3, 2007, pp. 106-113, vol. 125, Issue 1, Sensors and Actuators B: Chemical, Science Direct.
“ATM-MPLS Network Interworking Version 2.0, af-aic-0178.001” ATM Standard, The ATM Forum Technical Committee, published Aug. 2003.
IEEE Std 802.3-2012 (Revision of IEEE Std 802.3-2008), published Dec. 28, 2012.
IEEE Std No. 177, “Standard Definitions and Methods of Measurement for Piezoelectric Vibrators,” published May 1966, The Institute of Electrical and Electronics Engineers, Inc., New York, N.Y.
Shi et al., An intuitive control console for robotic surgery system, 2014, IEEE, p. 404-407 (Year: 2014).
Choi et al., A haptic augmented reality surgeon console for a laparoscopic surgery robot system, 2013, IEEE, p. 355-357 (Year: 2013).
Xie et al., Development of stereo vision and master-slave controller for a compact surgical robot system, 2015, IEEE, p. 403-407 (Year: 2015).
Sun et al., Innovative effector design for simulation training in robotic surgery, 2010, IEEE, p. 1735-1759 (Year: 2010).
Anonymous, “Internet of Things Powers Connected Surgical Device Infrastructure Case Study”, Dec. 31, 2016 (Dec. 31, 2016), Retrieved from the Internet: URL:https://www.cognizant.com/services-resources/150110_1oT_connected_surgical_devices.pdf.
Draijer, Matthijs et al., "Review of laser speckle contrast techniques for visualizing tissue perfusion," Lasers in Medical Science, Springer-Verlag, LO, vol. 24, No. 4, Dec. 3, 2008, pp. 639-651.
Roy D. Cullum, "Handbook of Engineering Design," ISBN: 9780408005586, Jan. 1, 1988, XP055578597, 10-20, Chapter 6, p. 138, right-hand column, paragraph 3.
“Surgical instrumentation: the true cost of instrument trays and a potential strategy for optimization”; Mhlaba et al.; Sep. 23, 2015 (Year: 2015).
Nabil Simaan et al., "Intelligent Surgical Robots with Situational Awareness: From Good to Great Surgeons," DOI: 10.1115/1.2015-Sep-6, Sep. 2015, pp. 3-6, Retrieved from the Internet: URL:http://memagazineselect.asmedigitalcollection.asme.org/data/journals/meena/936888/me-2015-sep6.pdf XP055530863.
Anonymous: “Titanium Key Chain Tool 1.1, Ultralight Multipurpose Key Chain Tool, Forward Cutting Can Opener—Vargo Titanium,” vargooutdoors.com, Jul. 5, 2014 (Jul. 5, 2014), retrieved from the internet: https://vargooutdoors.com/titanium-key-chain-tool-1-1.html.
Related Publications (1)
Number Date Country
20190201145 A1 Jul 2019 US
Provisional Applications (4)
Number Date Country
62649307 Mar 2018 US
62611341 Dec 2017 US
62611340 Dec 2017 US
62611339 Dec 2017 US