Control of a surgical system through a surgical barrier

Information

  • Patent Grant
  • Patent Number
    11,896,443
  • Date Filed
    Tuesday, November 6, 2018
  • Date Issued
    Tuesday, February 13, 2024
Abstract
A surgical system assembly is disclosed. The surgical system assembly includes a first surgical system and a second surgical system coupled to the first surgical system. The second surgical system includes a control circuit. The control circuit is configured to operate in a first mode or a second mode and control one or more functions of the first surgical system when the second surgical system is in the second mode.
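
For illustration only, the two-mode arrangement described in this abstract can be sketched in code. Everything below is a hypothetical sketch: the class and method names are invented, and the patent does not disclose any particular implementation.

    from enum import Enum

    class Mode(Enum):
        FIRST = 1   # second system operates independently
        SECOND = 2  # second system may also control the first system

    class FirstSystem:
        """Stand-in for the first surgical system (hypothetical)."""
        def invoke(self, function: str) -> bool:
            print(f"first system executing: {function}")
            return True

    class ControlCircuit:
        """Stand-in for the control circuit of the second surgical system."""
        def __init__(self, first_system: FirstSystem):
            self.first_system = first_system  # the coupled first system
            self.mode = Mode.FIRST

        def set_mode(self, mode: Mode) -> None:
            self.mode = mode

        def command_first_system(self, function: str) -> bool:
            # Functions of the first system are reachable only in the second mode.
            if self.mode is not Mode.SECOND:
                return False
            return self.first_system.invoke(function)

In this sketch, switching the control circuit into the second mode is what gates access to the first system's functions, mirroring the abstract's description.
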
Description
BACKGROUND

The present disclosure relates to various surgical systems. Surgical procedures are typically performed in surgical operating theaters or rooms in a healthcare facility such as, for example, a hospital. A sterile field is typically created around the patient. The sterile field may include the scrubbed team members, who are properly attired, and all furniture and fixtures in the area. Various surgical devices and systems are utilized in performance of a surgical procedure.





FIGURES

The various aspects described herein, both as to organization and methods of operation, together with further objects and advantages thereof, may best be understood by reference to the following description, taken in conjunction with the accompanying drawings as follows.



FIG. 1 is a block diagram of a computer-implemented interactive surgical system, in accordance with at least one aspect of the present disclosure.



FIG. 2 illustrates a surgical system being used to perform a surgical procedure in an operating room, in accordance with at least one aspect of the present disclosure.



FIG. 3 illustrates a surgical hub paired with a visualization system, a robotic system, and an intelligent instrument, in accordance with at least one aspect of the present disclosure.



FIG. 4 is a partial perspective view of a surgical hub enclosure, and of a combo generator module slidably receivable in a drawer of the surgical hub enclosure, in accordance with at least one aspect of the present disclosure.



FIG. 5 is a perspective view of a combo generator module with bipolar, ultrasonic, and monopolar contacts and a smoke evacuation component, in accordance with at least one aspect of the present disclosure.



FIG. 6 illustrates individual power bus attachments for a plurality of lateral docking ports of a lateral modular housing configured to receive a plurality of modules, in accordance with at least one aspect of the present disclosure.



FIG. 7 illustrates a vertical modular housing configured to receive a plurality of modules, in accordance with at least one aspect of the present disclosure.



FIG. 8 illustrates a surgical data network comprising a modular communication hub configured to connect modular devices located in one or more operating theaters of a healthcare facility, or any room in a healthcare facility specially equipped for surgical operations, to the cloud, in accordance with at least one aspect of the present disclosure.



FIG. 9 illustrates a computer-implemented interactive surgical system, in accordance with at least one aspect of the present disclosure.



FIG. 10 illustrates a surgical hub comprising a plurality of modules coupled to the modular control tower, in accordance with at least one aspect of the present disclosure.



FIG. 11 illustrates one aspect of a Universal Serial Bus (USB) network hub device, in accordance with at least one aspect of the present disclosure.



FIG. 12 is a block diagram of a cloud computing system comprising a plurality of smart surgical instruments coupled to surgical hubs that may connect to the cloud component of the cloud computing system, in accordance with at least one aspect of the present disclosure.



FIG. 13 is a functional module architecture of a cloud computing system, in accordance with at least one aspect of the present disclosure.



FIG. 14 illustrates a diagram of a situationally aware surgical system, in accordance with at least one aspect of the present disclosure.



FIG. 15 is a timeline depicting situational awareness of a surgical hub, in accordance with at least one aspect of the present disclosure.



FIG. 16 is a view of a display screen for a surgical procedure, depicting a surgical site and a distal portion of a surgical device at the surgical site, in accordance with at least one aspect of the present disclosure.



FIG. 17 is a view of the surgical device of FIG. 16 extending through a surgical barrier into the surgical site, in accordance with at least one aspect of the present disclosure.



FIG. 18 is a perspective view of a handle portion of the surgical device of FIGS. 16 and 17, the handle portion having an input switch for switching the surgical device between operational modes, in accordance with at least one aspect of the present disclosure.



FIG. 19 is a diagram depicting wearable devices communicating with surgical instruments to facilitate pairing and handing off of the surgical instruments, in accordance with at least one aspect of the present disclosure.



FIG. 20 is a diagram of a wearable wrist device, in accordance with at least one aspect of the present disclosure.



FIG. 21 is a diagram of a wearable ring device, in accordance with at least one aspect of the present disclosure.



FIG. 22A is a first view of a display screen, in which the display screen is configured to receive operator inputs to control a first surgical device—a combination energy device—in accordance with at least one aspect of the present disclosure.



FIG. 22B is a second view of the display screen of FIG. 22A, in which the display screen is configured to receive operator inputs to control a second surgical device—a stapler—in accordance with at least one aspect of the present disclosure.





DESCRIPTION

Applicant of the present application owns the following U.S. Patent Applications, filed on Nov. 6, 2018, the disclosure of each of which is herein incorporated by reference in its entirety:

    • U.S. patent application Ser. No. 16/182,224, titled SURGICAL NETWORK, INSTRUMENT, AND CLOUD RESPONSES BASED ON VALIDATION OF RECEIVED DATASET AND AUTHENTICATION OF ITS SOURCE AND INTEGRITY, now U.S. Pat. No. 11,308,075;
    • U.S. patent application Ser. No. 16/182,230, titled SURGICAL SYSTEM FOR PRESENTING INFORMATION INTERPRETED FROM EXTERNAL DATA, now U.S. Patent Application Publication No. 2019/0200980;
    • U.S. patent application Ser. No. 16/182,233, titled MODIFICATION OF SURGICAL SYSTEMS CONTROL PROGRAMS BASED ON MACHINE LEARNING, now U.S. Patent Application Publication No. 2019/0201123;
    • U.S. patent application Ser. No. 16/182,239, titled ADJUSTMENT OF DEVICE CONTROL PROGRAMS BASED ON STRATIFIED CONTEXTUAL DATA IN ADDITION TO THE DATA, now U.S. Pat. No. 11,423,007;
    • U.S. patent application Ser. No. 16/182,243, titled SURGICAL HUB AND MODULAR DEVICE RESPONSE ADJUSTMENT BASED ON SITUATIONAL AWARENESS, now U.S. Pat. No. 11,273,001;
    • U.S. patent application Ser. No. 16/182,248, titled DETECTION AND ESCALATION OF SECURITY RESPONSES OF SURGICAL INSTRUMENTS TO INCREASING SEVERITY THREATS, now U.S. Pat. No. 10,943,454;
    • U.S. patent application Ser. No. 16/182,251, titled INTERACTIVE SURGICAL SYSTEM, now U.S. Pat. No. 11,278,281;
    • U.S. patent application Ser. No. 16/182,260, titled AUTOMATED DATA SCALING, ALIGNMENT, AND ORGANIZING BASED ON PREDEFINED PARAMETERS WITHIN SURGICAL NETWORKS, now U.S. Pat. No. 11,056,244;
    • U.S. patent application Ser. No. 16/182,267, titled SENSING THE PATIENT POSITION AND CONTACT UTILIZING THE MONO-POLAR RETURN PAD ELECTRODE TO PROVIDE SITUATIONAL AWARENESS TO A SURGICAL NETWORK, now U.S. Patent Application Publication No. 2019/0201128;
    • U.S. patent application Ser. No. 16/182,249, titled POWERED SURGICAL TOOL WITH PREDEFINED ADJUSTABLE CONTROL ALGORITHM FOR CONTROLLING END EFFECTOR PARAMETER, now U.S. Pat. No. 11,234,756;
    • U.S. patent application Ser. No. 16/182,246, titled ADJUSTMENTS BASED ON AIRBORNE PARTICLE PROPERTIES, now U.S. Patent Application Publication No. 2019/0204201;
    • U.S. patent application Ser. No. 16/182,256, titled ADJUSTMENT OF A SURGICAL DEVICE FUNCTION BASED ON SITUATIONAL AWARENESS, now U.S. Patent Application Publication No. 2019/0201127;
    • U.S. patent application Ser. No. 16/182,242, titled REAL-TIME ANALYSIS OF COMPREHENSIVE COST OF ALL INSTRUMENTATION USED IN SURGERY UTILIZING DATA FLUIDITY TO TRACK INSTRUMENTS THROUGH STOCKING AND IN-HOUSE PROCESSES, now U.S. Pat. No. 11,257,589;
    • U.S. patent application Ser. No. 16/182,255, titled USAGE AND TECHNIQUE ANALYSIS OF SURGEON/STAFF PERFORMANCE AGAINST A BASELINE TO OPTIMIZE DEVICE UTILIZATION AND PERFORMANCE FOR BOTH CURRENT AND FUTURE PROCEDURES, now U.S. Pat. No. 11,633,237;
    • U.S. patent application Ser. No. 16/182,269, titled IMAGE CAPTURING OF THE AREAS OUTSIDE THE ABDOMEN TO IMPROVE PLACEMENT AND CONTROL OF A SURGICAL DEVICE IN USE, now U.S. Pat. No. 11,304,763;
    • U.S. patent application Ser. No. 16/182,278, titled COMMUNICATION OF DATA WHERE A SURGICAL NETWORK IS USING CONTEXT OF THE DATA AND REQUIREMENTS OF A RECEIVING SYSTEM/USER TO INFLUENCE INCLUSION OR LINKAGE OF DATA AND METADATA TO ESTABLISH CONTINUITY, now U.S. Patent Application Publication No. 2019/0201130;
    • U.S. patent application Ser. No. 16/182,290, titled SURGICAL NETWORK RECOMMENDATIONS FROM REAL TIME ANALYSIS OF PROCEDURE VARIABLES AGAINST A BASELINE HIGHLIGHTING DIFFERENCES FROM THE OPTIMAL SOLUTION, now U.S. Patent Application Publication No. 2019/0201102;
    • U.S. patent application Ser. No. 16/182,227, titled SURGICAL NETWORK DETERMINATION OF PRIORITIZATION OF COMMUNICATION, INTERACTION, OR PROCESSING BASED ON SYSTEM OR DEVICE NEEDS, now U.S. Pat. No. 10,892,995;
    • U.S. patent application Ser. No. 16/182,231, titled WIRELESS PAIRING OF A SURGICAL DEVICE WITH ANOTHER DEVICE WITHIN A STERILE SURGICAL FIELD BASED ON THE USAGE AND SITUATIONAL AWARENESS OF DEVICES, now U.S. Pat. No. 10,758,310;
    • U.S. patent application Ser. No. 16/182,229, titled ADJUSTMENT OF STAPLE HEIGHT OF AT LEAST ONE ROW OF STAPLES BASED ON THE SENSED TISSUE THICKNESS OR FORCE IN CLOSING, now U.S. Pat. No. 11,096,693;
    • U.S. patent application Ser. No. 16/182,234, titled STAPLING DEVICE WITH BOTH COMPULSORY AND DISCRETIONARY LOCKOUTS BASED ON SENSED PARAMETERS, now U.S. Patent Application Publication No. 2019/0200997;
    • U.S. patent application Ser. No. 16/182,240, titled POWERED STAPLING DEVICE CONFIGURED TO ADJUST FORCE, ADVANCEMENT SPEED, AND OVERALL STROKE OF CUTTING MEMBER BASED ON SENSED PARAMETER OF FIRING OR CLAMPING, now U.S. Patent Application Publication No. 2019/0201034;
    • U.S. patent application Ser. No. 16/182,235, titled VARIATION OF RADIO FREQUENCY AND ULTRASONIC POWER LEVEL IN COOPERATION WITH VARYING CLAMP ARM PRESSURE TO ACHIEVE PREDEFINED HEAT FLUX OR POWER APPLIED TO TISSUE, now U.S. Pat. No. 11,446,052; and
    • U.S. patent application Ser. No. 16/182,238, titled ULTRASONIC ENERGY DEVICE WHICH VARIES PRESSURE APPLIED BY CLAMP ARM TO PROVIDE THRESHOLD CONTROL PRESSURE AT A CUT PROGRESSION LOCATION, now U.S. Pat. No. 11,419,667.


Applicant of the present application owns the following U.S. Patent Applications, filed on Sep. 10, 2018, the disclosure of each of which is herein incorporated by reference in its entirety:

    • U.S. Provisional Patent Application No. 62/729,183, titled A CONTROL FOR A SURGICAL NETWORK OR SURGICAL NETWORK CONNECTED DEVICE THAT ADJUSTS ITS FUNCTION BASED ON A SENSED SITUATION OR USAGE;
    • U.S. Provisional Patent Application No. 62/729,177, titled AUTOMATED DATA SCALING, ALIGNMENT, AND ORGANIZING BASED ON PREDEFINED PARAMETERS WITHIN A SURGICAL NETWORK BEFORE TRANSMISSION;
    • U.S. Provisional Patent Application No. 62/729,176, titled INDIRECT COMMAND AND CONTROL OF A FIRST OPERATING ROOM SYSTEM THROUGH THE USE OF A SECOND OPERATING ROOM SYSTEM WITHIN A STERILE FIELD WHERE THE SECOND OPERATING ROOM SYSTEM HAS PRIMARY AND SECONDARY OPERATING MODES;
    • U.S. Provisional Patent Application No. 62/729,185, titled POWERED STAPLING DEVICE THAT IS CAPABLE OF ADJUSTING FORCE, ADVANCEMENT SPEED, AND OVERALL STROKE OF CUTTING MEMBER OF THE DEVICE BASED ON SENSED PARAMETER OF FIRING OR CLAMPING;
    • U.S. Provisional Patent Application No. 62/729,184, titled POWERED SURGICAL TOOL WITH A PREDEFINED ADJUSTABLE CONTROL ALGORITHM FOR CONTROLLING AT LEAST ONE END EFFECTOR PARAMETER AND A MEANS FOR LIMITING THE ADJUSTMENT;
    • U.S. Provisional Patent Application No. 62/729,182, titled SENSING THE PATIENT POSITION AND CONTACT UTILIZING THE MONO POLAR RETURN PAD ELECTRODE TO PROVIDE SITUATIONAL AWARENESS TO THE HUB;
    • U.S. Provisional Patent Application No. 62/729,191, titled SURGICAL NETWORK RECOMMENDATIONS FROM REAL TIME ANALYSIS OF PROCEDURE VARIABLES AGAINST A BASELINE HIGHLIGHTING DIFFERENCES FROM THE OPTIMAL SOLUTION;
    • U.S. Provisional Patent Application No. 62/729,195, titled ULTRASONIC ENERGY DEVICE WHICH VARIES PRESSURE APPLIED BY CLAMP ARM TO PROVIDE THRESHOLD CONTROL PRESSURE AT A CUT PROGRESSION LOCATION; and
    • U.S. Provisional Patent Application No. 62/729,186, titled WIRELESS PAIRING OF A SURGICAL DEVICE WITH ANOTHER DEVICE WITHIN A STERILE SURGICAL FIELD BASED ON THE USAGE AND SITUATIONAL AWARENESS OF DEVICES.


Applicant of the present application owns the following U.S. Patent Applications, filed on Aug. 28, 2018, the disclosure of each of which is herein incorporated by reference in its entirety:

    • U.S. patent application Ser. No. 16/115,214, titled ESTIMATING STATE OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEM THEREFOR;
    • U.S. patent application Ser. No. 16/115,205, titled TEMPERATURE CONTROL OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEM THEREFOR;
    • U.S. patent application Ser. No. 16/115,233, titled RADIO FREQUENCY ENERGY DEVICE FOR DELIVERING COMBINED ELECTRICAL SIGNALS;
    • U.S. patent application Ser. No. 16/115,208, titled CONTROLLING AN ULTRASONIC SURGICAL INSTRUMENT ACCORDING TO TISSUE LOCATION;
    • U.S. patent application Ser. No. 16/115,220, titled CONTROLLING ACTIVATION OF AN ULTRASONIC SURGICAL INSTRUMENT ACCORDING TO THE PRESENCE OF TISSUE;
    • U.S. patent application Ser. No. 16/115,232, titled DETERMINING TISSUE COMPOSITION VIA AN ULTRASONIC SYSTEM;
    • U.S. patent application Ser. No. 16/115,239, titled DETERMINING THE STATE OF AN ULTRASONIC ELECTROMECHANICAL SYSTEM ACCORDING TO FREQUENCY SHIFT;
    • U.S. patent application Ser. No. 16/115,247, titled DETERMINING THE STATE OF AN ULTRASONIC END EFFECTOR;
    • U.S. patent application Ser. No. 16/115,211, titled SITUATIONAL AWARENESS OF ELECTROSURGICAL SYSTEMS;
    • U.S. patent application Ser. No. 16/115,226, titled MECHANISMS FOR CONTROLLING DIFFERENT ELECTROMECHANICAL SYSTEMS OF AN ELECTROSURGICAL INSTRUMENT;
    • U.S. patent application Ser. No. 16/115,240, titled DETECTION OF END EFFECTOR IMMERSION IN LIQUID;
    • U.S. patent application Ser. No. 16/115,249, titled INTERRUPTION OF ENERGY DUE TO INADVERTENT CAPACITIVE COUPLING;
    • U.S. patent application Ser. No. 16/115,256, titled INCREASING RADIO FREQUENCY TO CREATE PAD-LESS MONOPOLAR LOOP;
    • U.S. patent application Ser. No. 16/115,223, titled BIPOLAR COMBINATION DEVICE THAT AUTOMATICALLY ADJUSTS PRESSURE BASED ON ENERGY MODALITY; and
    • U.S. patent application Ser. No. 16/115,238, titled ACTIVATION OF ENERGY DEVICES.


Applicant of the present application owns the following U.S. Patent Applications, filed on Aug. 23, 2018, the disclosure of each of which is herein incorporated by reference in its entirety:

    • U.S. Provisional Patent Application No. 62/721,995, titled CONTROLLING AN ULTRASONIC SURGICAL INSTRUMENT ACCORDING TO TISSUE LOCATION;
    • U.S. Provisional Patent Application No. 62/721,998, titled SITUATIONAL AWARENESS OF ELECTROSURGICAL SYSTEMS;
    • U.S. Provisional Patent Application No. 62/721,999, titled INTERRUPTION OF ENERGY DUE TO INADVERTENT CAPACITIVE COUPLING;
    • U.S. Provisional Patent Application No. 62/721,994, titled BIPOLAR COMBINATION DEVICE THAT AUTOMATICALLY ADJUSTS PRESSURE BASED ON ENERGY MODALITY; and
    • U.S. Provisional Patent Application No. 62/721,996, titled RADIO FREQUENCY ENERGY DEVICE FOR DELIVERING COMBINED ELECTRICAL SIGNALS.


Applicant of the present application owns the following U.S. Patent Applications, filed on Jun. 30, 2018, the disclosure of each of which is herein incorporated by reference in its entirety:

    • U.S. Provisional Patent Application No. 62/692,747, titled SMART ACTIVATION OF AN ENERGY DEVICE BY ANOTHER DEVICE;
    • U.S. Provisional Patent Application No. 62/692,748, titled SMART ENERGY ARCHITECTURE; and
    • U.S. Provisional Patent Application No. 62/692,768, titled SMART ENERGY DEVICES.


Applicant of the present application owns the following U.S. Patent Applications, filed on Jun. 29, 2018, the disclosure of each of which is herein incorporated by reference in its entirety:

    • U.S. patent application Ser. No. 16/024,090, titled CAPACITIVE COUPLED RETURN PATH PAD WITH SEPARABLE ARRAY ELEMENTS;
    • U.S. patent application Ser. No. 16/024,057, titled CONTROLLING A SURGICAL INSTRUMENT ACCORDING TO SENSED CLOSURE PARAMETERS;
    • U.S. patent application Ser. No. 16/024,067, titled SYSTEMS FOR ADJUSTING END EFFECTOR PARAMETERS BASED ON PERIOPERATIVE INFORMATION;
    • U.S. patent application Ser. No. 16/024,075, titled SAFETY SYSTEMS FOR SMART POWERED SURGICAL STAPLING;
    • U.S. patent application Ser. No. 16/024,083, titled SAFETY SYSTEMS FOR SMART POWERED SURGICAL STAPLING;
    • U.S. patent application Ser. No. 16/024,094, titled SURGICAL SYSTEMS FOR DETECTING END EFFECTOR TISSUE DISTRIBUTION IRREGULARITIES;
    • U.S. patent application Ser. No. 16/024,138, titled SYSTEMS FOR DETECTING PROXIMITY OF SURGICAL END EFFECTOR TO CANCEROUS TISSUE;
    • U.S. patent application Ser. No. 16/024,150, titled SURGICAL INSTRUMENT CARTRIDGE SENSOR ASSEMBLIES;
    • U.S. patent application Ser. No. 16/024,160, titled VARIABLE OUTPUT CARTRIDGE SENSOR ASSEMBLY;
    • U.S. patent application Ser. No. 16/024,124, titled SURGICAL INSTRUMENT HAVING A FLEXIBLE ELECTRODE;
    • U.S. patent application Ser. No. 16/024,132, titled SURGICAL INSTRUMENT HAVING A FLEXIBLE CIRCUIT;
    • U.S. patent application Ser. No. 16/024,141, titled SURGICAL INSTRUMENT WITH A TISSUE MARKING ASSEMBLY;
    • U.S. patent application Ser. No. 16/024,162, titled SURGICAL SYSTEMS WITH PRIORITIZED DATA TRANSMISSION CAPABILITIES;
    • U.S. patent application Ser. No. 16/024,066, titled SURGICAL EVACUATION SENSING AND MOTOR CONTROL;
    • U.S. patent application Ser. No. 16/024,096, titled SURGICAL EVACUATION SENSOR ARRANGEMENTS;
    • U.S. patent application Ser. No. 16/024,116, titled SURGICAL EVACUATION FLOW PATHS;
    • U.S. patent application Ser. No. 16/024,149, titled SURGICAL EVACUATION SENSING AND GENERATOR CONTROL;
    • U.S. patent application Ser. No. 16/024,180, titled SURGICAL EVACUATION SENSING AND DISPLAY;
    • U.S. patent application Ser. No. 16/024,245, titled COMMUNICATION OF SMOKE EVACUATION SYSTEM PARAMETERS TO HUB OR CLOUD IN SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM;
    • U.S. patent application Ser. No. 16/024,258, titled SMOKE EVACUATION SYSTEM INCLUDING A SEGMENTED CONTROL CIRCUIT FOR INTERACTIVE SURGICAL PLATFORM;
    • U.S. patent application Ser. No. 16/024,265, titled SURGICAL EVACUATION SYSTEM WITH A COMMUNICATION CIRCUIT FOR COMMUNICATION BETWEEN A FILTER AND A SMOKE EVACUATION DEVICE; and
    • U.S. patent application Ser. No. 16/024,273, titled DUAL IN-SERIES LARGE AND SMALL DROPLET FILTERS.


Applicant of the present application owns the following U.S. Provisional Patent Applications, filed on Jun. 28, 2018, the disclosure of each of which is herein incorporated by reference in its entirety:

    • U.S. Provisional Patent Application Ser. No. 62/691,228, titled A METHOD OF USING REINFORCED FLEX CIRCUITS WITH MULTIPLE SENSORS WITH ELECTROSURGICAL DEVICES;
    • U.S. Provisional Patent Application Ser. No. 62/691,227, titled CONTROLLING A SURGICAL INSTRUMENT ACCORDING TO SENSED CLOSURE PARAMETERS;
    • U.S. Provisional Patent Application Ser. No. 62/691,230, titled SURGICAL INSTRUMENT HAVING A FLEXIBLE ELECTRODE;
    • U.S. Provisional Patent Application Ser. No. 62/691,219, titled SURGICAL EVACUATION SENSING AND MOTOR CONTROL;
    • U.S. Provisional Patent Application Ser. No. 62/691,257, titled COMMUNICATION OF SMOKE EVACUATION SYSTEM PARAMETERS TO HUB OR CLOUD IN SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM;
    • U.S. Provisional Patent Application Ser. No. 62/691,262, titled SURGICAL EVACUATION SYSTEM WITH A COMMUNICATION CIRCUIT FOR COMMUNICATION BETWEEN A FILTER AND A SMOKE EVACUATION DEVICE; and
    • U.S. Provisional Patent Application Ser. No. 62/691,251, titled DUAL IN-SERIES LARGE AND SMALL DROPLET FILTERS.


Applicant of the present application owns the following U.S. Provisional Patent Application, filed on Apr. 19, 2018, the disclosure of which is herein incorporated by reference in its entirety:

    • U.S. Provisional Patent Application Ser. No. 62/659,900, titled METHOD OF HUB COMMUNICATION.


Applicant of the present application owns the following U.S. Provisional Patent Applications, filed on Mar. 30, 2018, the disclosure of each of which is herein incorporated by reference in its entirety:

    • U.S. Provisional Patent Application Ser. No. 62/650,898, titled CAPACITIVE COUPLED RETURN PATH PAD WITH SEPARABLE ARRAY ELEMENTS;
    • U.S. Provisional Patent Application Ser. No. 62/650,887, titled SURGICAL SYSTEMS WITH OPTIMIZED SENSING CAPABILITIES;
    • U.S. Provisional Patent Application Ser. No. 62/650,882, titled SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM; and
    • U.S. Provisional Patent Application Ser. No. 62/650,877, titled SURGICAL SMOKE EVACUATION SENSING AND CONTROLS.


Applicant of the present application owns the following U.S. Patent Applications, filed on Mar. 29, 2018, the disclosure of each of which is herein incorporated by reference in its entirety:

    • U.S. patent application Ser. No. 15/940,641, titled INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATION CAPABILITIES;
    • U.S. patent application Ser. No. 15/940,648, titled INTERACTIVE SURGICAL SYSTEMS WITH CONDITION HANDLING OF DEVICES AND DATA CAPABILITIES;
    • U.S. patent application Ser. No. 15/940,656, titled SURGICAL HUB COORDINATION OF CONTROL AND COMMUNICATION OF OPERATING ROOM DEVICES;
    • U.S. patent application Ser. No. 15/940,666, titled SPATIAL AWARENESS OF SURGICAL HUBS IN OPERATING ROOMS;
    • U.S. patent application Ser. No. 15/940,670, titled COOPERATIVE UTILIZATION OF DATA DERIVED FROM SECONDARY SOURCES BY INTELLIGENT SURGICAL HUBS;
    • U.S. patent application Ser. No. 15/940,677, titled SURGICAL HUB CONTROL ARRANGEMENTS;
    • U.S. patent application Ser. No. 15/940,632, titled DATA STRIPPING METHOD TO INTERROGATE PATIENT RECORDS AND CREATE ANONYMIZED RECORD;
    • U.S. patent application Ser. No. 15/940,640, titled COMMUNICATION HUB AND STORAGE DEVICE FOR STORING PARAMETERS AND STATUS OF A SURGICAL DEVICE TO BE SHARED WITH CLOUD BASED ANALYTICS SYSTEMS;
    • U.S. patent application Ser. No. 15/940,645, titled SELF DESCRIBING DATA PACKETS GENERATED AT AN ISSUING INSTRUMENT;
    • U.S. patent application Ser. No. 15/940,649, titled DATA PAIRING TO INTERCONNECT A DEVICE MEASURED PARAMETER WITH AN OUTCOME;
    • U.S. patent application Ser. No. 15/940,654, titled SURGICAL HUB SITUATIONAL AWARENESS;
    • U.S. patent application Ser. No. 15/940,663, titled SURGICAL SYSTEM DISTRIBUTED PROCESSING;
    • U.S. patent application Ser. No. 15/940,668, titled AGGREGATION AND REPORTING OF SURGICAL HUB DATA;
    • U.S. patent application Ser. No. 15/940,671, titled SURGICAL HUB SPATIAL AWARENESS TO DETERMINE DEVICES IN OPERATING THEATER;
    • U.S. patent application Ser. No. 15/940,686, titled DISPLAY OF ALIGNMENT OF STAPLE CARTRIDGE TO PRIOR LINEAR STAPLE LINE;
    • U.S. patent application Ser. No. 15/940,700, titled STERILE FIELD INTERACTIVE CONTROL DISPLAYS;
    • U.S. patent application Ser. No. 15/940,629, titled COMPUTER IMPLEMENTED INTERACTIVE SURGICAL SYSTEMS;
    • U.S. patent application Ser. No. 15/940,704, titled USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT;
    • U.S. patent application Ser. No. 15/940,722, titled CHARACTERIZATION OF TISSUE IRREGULARITIES THROUGH THE USE OF MONO-CHROMATIC LIGHT REFRACTIVITY;
    • U.S. patent application Ser. No. 15/940,742, titled DUAL CMOS ARRAY IMAGING;
    • U.S. patent application Ser. No. 15/940,636, titled ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL DEVICES;
    • U.S. patent application Ser. No. 15/940,653, titled ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL HUBS;
    • U.S. patent application Ser. No. 15/940,660, titled CLOUD-BASED MEDICAL ANALYTICS FOR CUSTOMIZATION AND RECOMMENDATIONS TO A USER;
    • U.S. patent application Ser. No. 15/940,679, titled CLOUD-BASED MEDICAL ANALYTICS FOR LINKING OF LOCAL USAGE TRENDS WITH THE RESOURCE ACQUISITION BEHAVIORS OF LARGER DATA SET;
    • U.S. patent application Ser. No. 15/940,694, titled CLOUD-BASED MEDICAL ANALYTICS FOR MEDICAL FACILITY SEGMENTED INDIVIDUALIZATION OF INSTRUMENT FUNCTION;
    • U.S. patent application Ser. No. 15/940,634, titled CLOUD-BASED MEDICAL ANALYTICS FOR SECURITY AND AUTHENTICATION TRENDS AND REACTIVE MEASURES;
    • U.S. patent application Ser. No. 15/940,706, titled DATA HANDLING AND PRIORITIZATION IN A CLOUD ANALYTICS NETWORK;
    • U.S. patent application Ser. No. 15/940,675, titled CLOUD INTERFACE FOR COUPLED SURGICAL DEVICES;
    • U.S. patent application Ser. No. 15/940,627, titled DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS;
    • U.S. patent application Ser. No. 15/940,637, titled COMMUNICATION ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS;
    • U.S. patent application Ser. No. 15/940,642, titled CONTROLS FOR ROBOT-ASSISTED SURGICAL PLATFORMS;
    • U.S. patent application Ser. No. 15/940,676, titled AUTOMATIC TOOL ADJUSTMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS;
    • U.S. patent application Ser. No. 15/940,680, titled CONTROLLERS FOR ROBOT-ASSISTED SURGICAL PLATFORMS;
    • U.S. patent application Ser. No. 15/940,683, titled COOPERATIVE SURGICAL ACTIONS FOR ROBOT-ASSISTED SURGICAL PLATFORMS;
    • U.S. patent application Ser. No. 15/940,690, titled DISPLAY ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; and
    • U.S. patent application Ser. No. 15/940,711, titled SENSING ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS.


Applicant of the present application owns the following U.S. Provisional Patent Applications, filed on Mar. 28, 2018, the disclosure of each of which is herein incorporated by reference in its entirety:

    • U.S. Provisional Patent Application Ser. No. 62/649,302, titled INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATION CAPABILITIES;
    • U.S. Provisional Patent Application Ser. No. 62/649,294, titled DATA STRIPPING METHOD TO INTERROGATE PATIENT RECORDS AND CREATE ANONYMIZED RECORD;
    • U.S. Provisional Patent Application Ser. No. 62/649,300, titled SURGICAL HUB SITUATIONAL AWARENESS;
    • U.S. Provisional Patent Application Ser. No. 62/649,309, titled SURGICAL HUB SPATIAL AWARENESS TO DETERMINE DEVICES IN OPERATING THEATER;
    • U.S. Provisional Patent Application Ser. No. 62/649,310, titled COMPUTER IMPLEMENTED INTERACTIVE SURGICAL SYSTEMS;
    • U.S. Provisional Patent Application Ser. No. 62/649,291, titled USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT;
    • U.S. Provisional Patent Application Ser. No. 62/649,296, titled ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL DEVICES;
    • U.S. Provisional Patent Application Ser. No. 62/649,333, titled CLOUD-BASED MEDICAL ANALYTICS FOR CUSTOMIZATION AND RECOMMENDATIONS TO A USER;
    • U.S. Provisional Patent Application Ser. No. 62/649,327, titled CLOUD-BASED MEDICAL ANALYTICS FOR SECURITY AND AUTHENTICATION TRENDS AND REACTIVE MEASURES;
    • U.S. Provisional Patent Application Ser. No. 62/649,315, titled DATA HANDLING AND PRIORITIZATION IN A CLOUD ANALYTICS NETWORK;
    • U.S. Provisional Patent Application Ser. No. 62/649,313, titled CLOUD INTERFACE FOR COUPLED SURGICAL DEVICES;
    • U.S. Provisional Patent Application Ser. No. 62/649,320, titled DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS;
    • U.S. Provisional Patent Application Ser. No. 62/649,307, titled AUTOMATIC TOOL ADJUSTMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; and
    • U.S. Provisional Patent Application Ser. No. 62/649,323, titled SENSING ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS.


Applicant of the present application owns the following U.S. Provisional Patent Applications, filed on Mar. 8, 2018, the disclosure of each of which is herein incorporated by reference in its entirety:

    • U.S. Provisional Patent Application Ser. No. 62/640,417, titled TEMPERATURE CONTROL IN ULTRASONIC DEVICE AND CONTROL SYSTEM THEREFOR; and
    • U.S. Provisional Patent Application Ser. No. 62/640,415, titled ESTIMATING STATE OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEM THEREFOR.


Applicant of the present application owns the following U.S. Provisional Patent Applications, filed on Dec. 28, 2017, the disclosure of each of which is herein incorporated by reference in its entirety:

    • U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM;
    • U.S. Provisional Patent Application Ser. No. 62/611,340, titled CLOUD-BASED MEDICAL ANALYTICS; and
    • U.S. Provisional Patent Application Ser. No. 62/611,339, titled ROBOT ASSISTED SURGICAL PLATFORM.


Before explaining various aspects of surgical devices and generators in detail, it should be noted that the illustrative examples are not limited in application or use to the details of construction and arrangement of parts illustrated in the accompanying drawings and description. The illustrative examples may be implemented or incorporated in other aspects, variations and modifications, and may be practiced or carried out in various ways. Further, unless otherwise indicated, the terms and expressions employed herein have been chosen for the purpose of describing the illustrative examples for the convenience of the reader and are not for the purpose of limitation thereof. Also, it will be appreciated that one or more of the following-described aspects, expressions of aspects, and/or examples, can be combined with any one or more of the other following-described aspects, expressions of aspects and/or examples.


Surgical Hubs

Referring to FIG. 1, a computer-implemented interactive surgical system 100 includes one or more surgical systems 102 and a cloud-based system (e.g., the cloud 104 that may include a remote server 113 coupled to a storage device 105). Each surgical system 102 includes at least one surgical hub 106 in communication with the cloud 104 that may include a remote server 113. In one example, as illustrated in FIG. 1, the surgical system 102 includes a visualization system 108, a robotic system 110, and a handheld intelligent surgical instrument 112, which are configured to communicate with one another and/or the hub 106. In some aspects, a surgical system 102 may include an M number of hubs 106, an N number of visualization systems 108, an O number of robotic systems 110, and a P number of handheld intelligent surgical instruments 112, where M, N, O, and P are integers greater than or equal to one.
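
As a purely illustrative sketch (the names and structure below are assumptions, not taken from the disclosure), the M/N/O/P composition described above can be modeled as follows:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class SurgicalSystem102:
        """Hypothetical model: M hubs, N visualization systems,
        O robotic systems, and P instruments."""
        hubs: List[str] = field(default_factory=lambda: ["hub-106"])
        visualization_systems: List[str] = field(default_factory=lambda: ["vis-108"])
        robotic_systems: List[str] = field(default_factory=lambda: ["robot-110"])
        instruments: List[str] = field(default_factory=lambda: ["instrument-112"])

        def is_valid(self) -> bool:
            # M, N, O, and P must each be an integer greater than or equal to one.
            return all(len(group) >= 1 for group in (
                self.hubs, self.visualization_systems,
                self.robotic_systems, self.instruments))
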


In various aspects, the intelligent instruments 112 as described herein with reference to FIGS. 1-7 may be implemented as a surgical device 214002 (FIGS. 16-18), a display screen (FIG. 16), wearable devices 214100, 214102, 214200, 214202 (FIGS. 19-21), and a display 214400 (FIGS. 22A and 22B). The intelligent instruments 112 (e.g., devices 1a-1n) such as the surgical device 214002 (FIGS. 16-18), the display screen (FIG. 16), the wearable devices 214100, 214102, 214200, 214202 (FIGS. 19-21), and the display 214400 (FIGS. 22A and 22B) are configured to operate in a surgical data network 201 as described with reference to FIG. 8.



FIG. 2 depicts an example of a surgical system 102 being used to perform a surgical procedure on a patient who is lying down on an operating table 114 in a surgical operating room 116. A robotic system 110 is used in the surgical procedure as a part of the surgical system 102. The robotic system 110 includes a surgeon's console 118, a patient side cart 120 (surgical robot), and a surgical robotic hub 122. The patient side cart 120 can manipulate at least one removably coupled surgical tool 117 through a minimally invasive incision in the body of the patient while the surgeon views the surgical site through the surgeon's console 118. An image of the surgical site can be obtained by a medical imaging device 124, which can be manipulated by the patient side cart 120 to orient the imaging device 124. The robotic hub 122 can be used to process the images of the surgical site for subsequent display to the surgeon through the surgeon's console 118.


Other types of robotic systems can be readily adapted for use with the surgical system 102. Various examples of robotic systems and surgical tools that are suitable for use with the present disclosure are described in U.S. Provisional Patent Application Ser. No. 62/611,339, titled ROBOT ASSISTED SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety.


Various examples of cloud-based analytics that are performed by the cloud 104, and are suitable for use with the present disclosure, are described in U.S. Provisional Patent Application Ser. No. 62/611,340, titled CLOUD-BASED MEDICAL ANALYTICS, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety.


In various aspects, the imaging device 124 includes at least one image sensor and one or more optical components. Suitable image sensors include, but are not limited to, Charge-Coupled Device (CCD) sensors and Complementary Metal-Oxide Semiconductor (CMOS) sensors.


The optical components of the imaging device 124 may include one or more illumination sources and/or one or more lenses. The one or more illumination sources may be directed to illuminate portions of the surgical field. The one or more image sensors may receive light reflected or refracted from the surgical field, including light reflected or refracted from tissue and/or surgical instruments.


The one or more illumination sources may be configured to radiate electromagnetic energy in the visible spectrum as well as the invisible spectrum. The visible spectrum, sometimes referred to as the optical spectrum or luminous spectrum, is that portion of the electromagnetic spectrum that is visible to (i.e., can be detected by) the human eye and may be referred to as visible light or simply light. A typical human eye will respond to wavelengths in air that are from about 380 nm to about 750 nm.


The invisible spectrum (i.e., the non-luminous spectrum) is that portion of the electromagnetic spectrum that lies below and above the visible spectrum (i.e., wavelengths below about 380 nm and above about 750 nm). The invisible spectrum is not detectable by the human eye. Wavelengths greater than about 750 nm are longer than the red visible spectrum, and they become invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths less than about 380 nm are shorter than the violet spectrum, and they become invisible ultraviolet, x-ray, and gamma ray electromagnetic radiation.


In various aspects, the imaging device 124 is configured for use in a minimally invasive procedure. Examples of imaging devices suitable for use with the present disclosure include, but are not limited to, an arthroscope, angioscope, bronchoscope, choledochoscope, colonoscope, cystoscope, duodenoscope, enteroscope, esophagogastro-duodenoscope (gastroscope), endoscope, laryngoscope, nasopharyngo-nephroscope, sigmoidoscope, thoracoscope, and ureteroscope.


In one aspect, the imaging device employs multi-spectrum monitoring to discriminate topography and underlying structures. A multi-spectral image is one that captures image data within specific wavelength ranges across the electromagnetic spectrum. The wavelengths may be separated by filters or by the use of instruments that are sensitive to particular wavelengths, including light from frequencies beyond the visible light range, e.g., IR and ultraviolet. Spectral imaging can allow extraction of additional information the human eye fails to capture with its receptors for red, green, and blue. The use of multi-spectral imaging is described in greater detail under the heading “Advanced Imaging Acquisition Module” in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety. Multi-spectrum monitoring can be a useful tool in relocating a surgical field after a surgical task is completed to perform one or more of the previously described tests on the treated tissue.


It is axiomatic that strict sterilization of the operating room and surgical equipment is required during any surgery. The strict hygiene and sterilization conditions required in a “surgical theater,” i.e., an operating or treatment room, necessitate the highest possible sterility of all medical devices and equipment. Part of that sterilization process is the need to sterilize anything that comes in contact with the patient or penetrates the sterile field, including the imaging device 124 and its attachments and components. It will be appreciated that the sterile field may be considered a specified area, such as within a tray or on a sterile towel, that is considered free of microorganisms, or the sterile field may be considered an area, immediately around a patient, who has been prepared for a surgical procedure. The sterile field may include the scrubbed team members, who are properly attired, and all furniture and fixtures in the area.


In various aspects, the visualization system 108 includes one or more imaging sensors, one or more image-processing units, one or more storage arrays, and one or more displays that are strategically arranged with respect to the sterile field, as illustrated in FIG. 2. In one aspect, the visualization system 108 includes an interface for HL7, PACS, and EMR. Various components of the visualization system 108 are described under the heading “Advanced Imaging Acquisition Module” in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety.


As illustrated in FIG. 2, a primary display 119 is positioned in the sterile field to be visible to an operator at the operating table 114. In addition, a visualization tower 111 is positioned outside the sterile field. The visualization tower 111 includes a first non-sterile display 107 and a second non-sterile display 109, which face away from each other. The visualization system 108, guided by the hub 106, is configured to utilize the displays 107, 109, and 119 to coordinate information flow to operators inside and outside the sterile field. For example, the hub 106 may cause the visualization system 108 to display a snapshot of a surgical site, as recorded by an imaging device 124, on a non-sterile display 107 or 109, while maintaining a live feed of the surgical site on the primary display 119. The snapshot on the non-sterile display 107 or 109 can permit a non-sterile operator to perform a diagnostic step relevant to the surgical procedure, for example.


In one aspect, the hub 106 is also configured to route a diagnostic input or feedback entered by a non-sterile operator at the visualization tower 111 to the primary display 119 within the sterile field, where it can be viewed by a sterile operator at the operating table. In one example, the input can be in the form of a modification to the snapshot displayed on the non-sterile display 107 or 109, which can be routed to the primary display 119 by the hub 106.
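
The display-routing behavior described in the two paragraphs above can be summarized with a short, purely illustrative sketch (all names are hypothetical; any object with a show() method would serve as a display here):

    class HubDisplayRouter:
        """Hypothetical stand-in for the display coordination of hub 106."""

        def __init__(self, primary, non_sterile_displays):
            self.primary = primary  # display 119, inside the sterile field
            self.non_sterile = list(non_sterile_displays)  # displays 107, 109

        def show_snapshot(self, snapshot, live_feed):
            # Keep the live feed on the sterile primary display while the
            # snapshot goes to the non-sterile displays for diagnostic review.
            self.primary.show(live_feed)
            for display in self.non_sterile:
                display.show(snapshot)

        def route_feedback(self, modified_snapshot):
            # A modification entered at the visualization tower is routed
            # back to the primary display inside the sterile field.
            self.primary.show(modified_snapshot)
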


Referring to FIG. 2, a surgical instrument 112 is being used in the surgical procedure as part of the surgical system 102. The hub 106 is also configured to coordinate information flow to a display of the surgical instrument 112. Such coordination of information flow is further described in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety. A diagnostic input or feedback entered by a non-sterile operator at the visualization tower 111 can be routed by the hub 106 to the surgical instrument display 115 within the sterile field, where it can be viewed by the operator of the surgical instrument 112. Example surgical instruments that are suitable for use with the surgical system 102 are described under the heading “Surgical Instrument Hardware” in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety, for example.


Referring now to FIG. 3, a hub 106 is depicted in communication with a visualization system 108, a robotic system 110, and a handheld intelligent surgical instrument 112. The hub 106 includes a hub display 135, an imaging module 138, a generator module 140 (which can include a monopolar generator 142, a bipolar generator 144, and/or an ultrasonic generator 143), a communication module 130, a processor module 132, and a storage array 134. In certain aspects, as illustrated in FIG. 3, the hub 106 further includes a smoke evacuation module 126, a suction/irrigation module 128, and/or an OR mapping module 133.


During a surgical procedure, energy application to tissue, for sealing and/or cutting, is generally associated with smoke evacuation, suction of excess fluid, and/or irrigation of the tissue. Fluid, power, and/or data lines from different sources are often entangled during the surgical procedure. Valuable time can be lost addressing this issue during a surgical procedure. Detangling the lines may necessitate disconnecting the lines from their respective modules, which may require resetting the modules. The hub modular enclosure 136 offers a unified environment for managing the power, data, and fluid lines, which reduces the frequency of entanglement between such lines.


Aspects of the present disclosure present a surgical hub for use in a surgical procedure that involves energy application to tissue at a surgical site. The surgical hub includes a hub enclosure and a combo generator module slidably receivable in a docking station of the hub enclosure. The docking station includes data and power contacts. The combo generator module includes two or more of an ultrasonic energy generator component, a bipolar RF energy generator component, and a monopolar RF energy generator component that are housed in a single unit. In one aspect, the combo generator module also includes at least one energy delivery cable for connecting the combo generator module to a surgical instrument, at least one smoke evacuation component configured to evacuate smoke, fluid, and/or particulates generated by the application of therapeutic energy to the tissue, and a fluid line extending from the remote surgical site to the smoke evacuation component.


In one aspect, the fluid line is a first fluid line, and a second fluid line extends from the remote surgical site to a suction and irrigation module slidably received in the hub enclosure. In one aspect, the hub enclosure comprises a fluid interface.


Certain surgical procedures may require the application of more than one energy type to the tissue. One energy type may be more beneficial for cutting the tissue, while another different energy type may be more beneficial for sealing the tissue. For example, a bipolar generator can be used to seal the tissue while an ultrasonic generator can be used to cut the sealed tissue. Aspects of the present disclosure present a solution where a hub modular enclosure 136 is configured to accommodate different generators, and facilitate an interactive communication therebetween. One of the advantages of the hub modular enclosure 136 is enabling the quick removal and/or replacement of various modules.


Aspects of the present disclosure present a modular surgical enclosure for use in a surgical procedure that involves energy application to tissue. The modular surgical enclosure includes a first energy-generator module, configured to generate a first energy for application to the tissue, and a first docking station comprising a first docking port that includes first data and power contacts, wherein the first energy-generator module is slidably movable into an electrical engagement with the first power and data contacts, and wherein the first energy-generator module is slidably movable out of the electrical engagement with the first power and data contacts.


Further to the above, the modular surgical enclosure also includes a second energy-generator module configured to generate a second energy, different than the first energy, for application to the tissue, and a second docking station comprising a second docking port that includes second data and power contacts, wherein the second energy-generator module is slidably movable into an electrical engagement with the second power and data contacts, and wherein the second energy-generator module is slidably movable out of the electrical engagement with the second power and data contacts.


In addition, the modular surgical enclosure also includes a communication bus between the first docking port and the second docking port, configured to facilitate communication between the first energy-generator module and the second energy-generator module.


Referring to FIGS. 3-7, aspects of the present disclosure are presented for a hub modular enclosure 136 that allows the modular integration of a generator module 140, a smoke evacuation module 126, and a suction/irrigation module 128. The hub modular enclosure 136 further facilitates interactive communication between the modules 140, 126, 128. As illustrated in FIG. 5, the generator module 140 can be a generator module with integrated monopolar, bipolar, and ultrasonic components supported in a single housing unit 139 slidably insertable into the hub modular enclosure 136. As illustrated in FIG. 5, the generator module 140 can be configured to connect to a monopolar device 146, a bipolar device 147, and an ultrasonic device 148. Alternatively, the generator module 140 may comprise a series of monopolar, bipolar, and/or ultrasonic generator modules that interact through the hub modular enclosure 136. The hub modular enclosure 136 can be configured to facilitate the insertion of multiple generators and interactive communication between the generators docked into the hub modular enclosure 136 so that the generators would act as a single generator.
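
One way to picture the "single generator" behavior described above is a thin dispatch layer over the docked modules. The following is a hedged sketch only, with invented names; the disclosure does not specify how the enclosure's inter-module communication is implemented:

    class GeneratorBackplane:
        """Hypothetical aggregator for generator modules docked in enclosure 136."""

        def __init__(self):
            self.docked = {}  # energy modality -> docked generator module

        def dock(self, modality: str, module) -> None:
            # modality might be "monopolar", "bipolar", or "ultrasonic"
            self.docked[modality] = module

        def undock(self, modality: str) -> None:
            self.docked.pop(modality, None)

        def activate(self, modality: str, power_level: float):
            # Callers interact with one logical generator; the backplane
            # dispatches to whichever module is currently docked.
            # deliver() is an assumed interface on the module object.
            module = self.docked.get(modality)
            if module is None:
                raise LookupError(f"no {modality} generator is docked")
            return module.deliver(power_level)
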


In one aspect, the hub modular enclosure 136 comprises a modular power and communication backplane 149 with external and wireless communication headers to enable the removable attachment of the modules 140, 126, 128 and interactive communication therebetween.


In one aspect, the hub modular enclosure 136 includes docking stations 151, herein also referred to as drawers, which are configured to slidably receive the modules 140, 126, 128. FIG. 4 illustrates a partial perspective view of a surgical hub enclosure 136, and a combo generator module 145 slidably receivable in a docking station 151 of the surgical hub enclosure 136. A docking port 152 with power and data contacts on a rear side of the combo generator module 145 is configured to engage a corresponding docking port 150 with power and data contacts of a corresponding docking station 151 of the hub modular enclosure 136 as the combo generator module 145 is slid into position within the corresponding docking station 151 of the hub modular enclosure 136. In one aspect, the combo generator module 145 includes a bipolar, ultrasonic, and monopolar module and a smoke evacuation module integrated together into a single housing unit 139, as illustrated in FIG. 5.


In various aspects, the smoke evacuation module 126 includes a fluid line 154 that conveys captured/collected smoke and/or fluid away from a surgical site and to, for example, the smoke evacuation module 126. Vacuum suction originating from the smoke evacuation module 126 can draw the smoke into an opening of a utility conduit at the surgical site. The utility conduit, coupled to the fluid line, can be in the form of a flexible tube terminating at the smoke evacuation module 126. The utility conduit and the fluid line define a fluid path extending toward the smoke evacuation module 126 that is received in the hub enclosure 136.


In various aspects, the suction/irrigation module 128 is coupled to a surgical tool comprising an aspiration fluid line and a suction fluid line. In one example, the aspiration and suction fluid lines are in the form of flexible tubes extending from the surgical site toward the suction/irrigation module 128. One or more drive systems can be configured to cause irrigation and aspiration of fluids to and from the surgical site.


In one aspect, the surgical tool includes a shaft having an end effector at a distal end thereof, at least one energy delivery implement associated with the end effector, an aspiration tube, and an irrigation tube. The aspiration tube can have an inlet port at a distal end thereof, and the aspiration tube extends through the shaft. Similarly, an irrigation tube can extend through the shaft and can have an inlet port in proximity to the energy delivery implement. The energy delivery implement is configured to deliver ultrasonic and/or RF energy to the surgical site and is coupled to the generator module 140 by a cable extending initially through the shaft.


The irrigation tube can be in fluid communication with a fluid source, and the aspiration tube can be in fluid communication with a vacuum source. The fluid source and/or the vacuum source can be housed in the suction/irrigation module 128. In one example, the fluid source and/or the vacuum source can be housed in the hub enclosure 136 separately from the suction/irrigation module 128. In such example, a fluid interface can be configured to connect the suction/irrigation module 128 to the fluid source and/or the vacuum source.


In one aspect, the modules 140, 126, 128 and/or their corresponding docking stations on the hub modular enclosure 136 may include alignment features that are configured to align the docking ports of the modules into engagement with their counterparts in the docking stations of the hub modular enclosure 136. For example, as illustrated in FIG. 4, the combo generator module 145 includes side brackets 155 that are configured to slidably engage with corresponding brackets 156 of the corresponding docking station 151 of the hub modular enclosure 136. The brackets cooperate to guide the docking port contacts of the combo generator module 145 into an electrical engagement with the docking port contacts of the hub modular enclosure 136.


In some aspects, the drawers 151 of the hub modular enclosure 136 are the same size, or substantially the same size, and the modules are adjusted in size to be received in the drawers 151. For example, the side brackets 155 and/or 156 can be larger or smaller depending on the size of the module. In other aspects, the drawers 151 are different in size and are each designed to accommodate a particular module.


Furthermore, the contacts of a particular module can be keyed for engagement with the contacts of a particular drawer to avoid inserting a module into a drawer with mismatching contacts.


As illustrated in FIG. 4, the docking port 150 of one drawer 151 can be coupled to the docking port 150 of another drawer 151 through a communications link 157 to facilitate an interactive communication between the modules housed in the hub modular enclosure 136. The docking ports 150 of the hub modular enclosure 136 may alternatively, or additionally, facilitate a wireless interactive communication between the modules housed in the hub modular enclosure 136. Any suitable wireless communication can be employed, such as, for example, Air Titan-Bluetooth.



FIG. 6 illustrates individual power bus attachments for a plurality of lateral docking ports of a lateral modular housing 160 configured to receive a plurality of modules of a surgical hub 206. The lateral modular housing 160 is configured to laterally receive and interconnect the modules 161. The modules 161 are slidably inserted into docking stations 162 of the lateral modular housing 160, which includes a backplane for interconnecting the modules 161. As illustrated in FIG. 6, the modules 161 are arranged laterally in the lateral modular housing 160. Alternatively, the modules 161 may be arranged vertically in a lateral modular housing.



FIG. 7 illustrates a vertical modular housing 164 configured to receive a plurality of modules 165 of the surgical hub 106. The modules 165 are slidably inserted into docking stations, or drawers, 167 of the vertical modular housing 164, which includes a backplane for interconnecting the modules 165. Although the drawers 167 of the vertical modular housing 164 are arranged vertically, in certain instances, a vertical modular housing 164 may include drawers that are arranged laterally. Furthermore, the modules 165 may interact with one another through the docking ports of the vertical modular housing 164. In the example of FIG. 7, a display 177 is provided for displaying data relevant to the operation of the modules 165. In addition, the vertical modular housing 164 includes a master module 178 housing a plurality of sub-modules that are slidably received in the master module 178.


In various aspects, the imaging module 138 comprises an integrated video processor and a modular light source and is adapted for use with various imaging devices. In one aspect, the imaging device comprises a modular housing that can be assembled with a light source module and a camera module. The housing can be a disposable housing. In at least one example, the disposable housing is removably coupled to a reusable controller, a light source module, and a camera module. The light source module and/or the camera module can be selectively chosen depending on the type of surgical procedure. In one aspect, the camera module comprises a CCD sensor. In another aspect, the camera module comprises a CMOS sensor. In another aspect, the camera module is configured for scanned beam imaging. Likewise, the light source module can be configured to deliver a white light or a different light, depending on the surgical procedure.


During a surgical procedure, removing a surgical device from the surgical field and replacing it with another surgical device that includes a different camera or a different light source can be inefficient. Temporarily losing sight of the surgical field may lead to undesirable consequences. The modular imaging device of the present disclosure is configured to permit the replacement of a light source module or a camera module midstream during a surgical procedure, without having to remove the imaging device from the surgical field.


In one aspect, the imaging device comprises a tubular housing that includes a plurality of channels. A first channel is configured to slidably receive the camera module, which can be configured for a snap-fit engagement with the first channel. A second channel is configured to slidably receive the light source module, which can be configured for a snap-fit engagement with the second channel. In another example, the camera module and/or the light source module can be rotated into a final position within their respective channels. A threaded engagement can be employed in lieu of the snap-fit engagement.


In various examples, multiple imaging devices are placed at different positions in the surgical field to provide multiple views. The imaging module 138 can be configured to switch between the imaging devices to provide an optimal view. In various aspects, the imaging module 138 can be configured to integrate the images from the different imaging devices.


Various image processors and imaging devices suitable for use with the present disclosure are described in U.S. Pat. No. 7,995,045, titled COMBINED SBI AND CONVENTIONAL IMAGE PROCESSOR, which issued on Aug. 9, 2011, and which is herein incorporated by reference in its entirety. In addition, U.S. Pat. No. 7,982,776, titled SBI MOTION ARTIFACT REMOVAL APPARATUS AND METHOD, which issued on Jul. 19, 2011, and which is herein incorporated by reference in its entirety, describes various systems for removing motion artifacts from image data. Such systems can be integrated with the imaging module 138. Furthermore, U.S. Patent Application Publication No. 2011/0306840, titled CONTROLLABLE MAGNETIC SOURCE TO FIXTURE INTRACORPOREAL APPARATUS, which published on Dec. 15, 2011, and U.S. Patent Application Publication No. 2014/0243597, titled SYSTEM FOR PERFORMING A MINIMALLY INVASIVE SURGICAL PROCEDURE, which published on Aug. 28, 2014, are each herein incorporated by reference in its entirety.



FIG. 8 illustrates a surgical data network 201 comprising a modular communication hub 203 configured to connect modular devices located in one or more operating theaters of a healthcare facility, or any room in a healthcare facility specially equipped for surgical operations, to a cloud-based system (e.g., the cloud 204 that may include a remote server 213 coupled to a storage device 205). In one aspect, the modular communication hub 203 comprises a network hub 207 and/or a network switch 209 in communication with a network router. The modular communication hub 203 also can be coupled to a local computer system 210 to provide local computer processing and data manipulation. The surgical data network 201 may be configured as passive, intelligent, or switching. A passive surgical data network serves as a conduit for the data, enabling it to go from one device (or segment) to another and to the cloud computing resources. An intelligent surgical data network includes additional features to enable the traffic passing through the surgical data network to be monitored and to configure each port in the network hub 207 or network switch 209. An intelligent surgical data network may be referred to as a manageable hub or switch. A switching hub reads the destination address of each packet and then forwards the packet to the correct port.


Modular devices 1a-1n located in the operating theater may be coupled to the modular communication hub 203. The network hub 207 and/or the network switch 209 may be coupled to a network router 211 to connect the devices 1a-1n to the cloud 204 or the local computer system 210. Data associated with the devices 1a-1n may be transferred to cloud-based computers via the router for remote data processing and manipulation. Data associated with the devices 1a-1n may also be transferred to the local computer system 210 for local data processing and manipulation. Modular devices 2a-2m located in the same operating theater also may be coupled to a network switch 209. The network switch 209 may be coupled to the network hub 207 and/or the network router 211 to connect the devices 2a-2m to the cloud 204. Data associated with the devices 2a-2m may be transferred to the cloud 204 via the network router 211 for data processing and manipulation. Data associated with the devices 2a-2m may also be transferred to the local computer system 210 for local data processing and manipulation.


It will be appreciated that the surgical data network 201 may be expanded by interconnecting multiple network hubs 207 and/or multiple network switches 209 with multiple network routers 211. The modular communication hub 203 may be contained in a modular control tower configured to receive multiple devices 1a-1n/2a-2m. The local computer system 210 also may be contained in a modular control tower. The modular communication hub 203 is connected to a display 212 to display images obtained by some of the devices 1a-1n/2a-2m, for example during surgical procedures. In various aspects, the devices 1a-1n/2a-2m may include, for example, various modules such as an imaging module 138 coupled to an endoscope, a generator module 140 coupled to an energy-based surgical device, a smoke evacuation module 126, a suction/irrigation module 128, a communication module 130, a processor module 132, a storage array 134, a surgical device coupled to a display, and/or a non-contact sensor module, among other modular devices that may be connected to the modular communication hub 203 of the surgical data network 201.


In one aspect, the surgical data network 201 may comprise a combination of network hub(s), network switch(es), and network router(s) connecting the devices 1a-1n/2a-2m to the cloud. Any one of or all of the devices 1a-1n/2a-2m coupled to the network hub or network switch may collect data in real time and transfer the data to cloud computers for data processing and manipulation. It will be appreciated that cloud computing relies on sharing computing resources rather than having local servers or personal devices to handle software applications. The word “cloud” may be used as a metaphor for “the Internet,” although the term is not limited as such. Accordingly, the term “cloud computing” may be used herein to refer to “a type of Internet-based computing,” where different services—such as servers, storage, and applications—are delivered to the modular communication hub 203 and/or computer system 210 located in the surgical theater (e.g., a fixed, mobile, temporary, or field operating room or space) and to devices connected to the modular communication hub 203 and/or computer system 210 through the Internet. The cloud infrastructure may be maintained by a cloud service provider. In this context, the cloud service provider may be the entity that coordinates the usage and control of the devices 1a-1n/2a-2m located in one or more operating theaters. The cloud computing services can perform a large number of calculations based on the data gathered by smart surgical instruments, robots, and other computerized devices located in the operating theater. The hub hardware enables multiple devices or connections to be connected to a computer that communicates with the cloud computing resources and storage.


By applying cloud computing data processing techniques to the data collected by the devices 1a-1n/2a-2m, the surgical data network can provide improved surgical outcomes, reduced costs, and improved patient satisfaction. At least some of the devices 1a-1n/2a-2m may be employed to view tissue states to assess leaks or perfusion of sealed tissue after a tissue sealing and cutting procedure. At least some of the devices 1a-1n/2a-2m may be employed to identify pathology, such as the effects of diseases, using the cloud-based computing to examine data including images of samples of body tissue for diagnostic purposes. This includes localization and margin confirmation of tissue and phenotypes. At least some of the devices 1a-1n/2a-2m may be employed to identify anatomical structures of the body using a variety of sensors integrated with imaging devices and techniques such as overlaying images captured by multiple imaging devices. The data gathered by the devices 1a-1n/2a-2m, including image data, may be transferred to the cloud 204 or the local computer system 210 or both for data processing and manipulation, including image processing and manipulation. The data may be analyzed to improve surgical procedure outcomes by determining whether further treatment, such as the application of endoscopic intervention, emerging technologies, targeted radiation, targeted intervention, and precise robotics to tissue-specific sites and conditions, may be pursued. Such data analysis may further employ outcome analytics processing and, using standardized approaches, may provide beneficial feedback that either confirms or suggests modifications to the surgical treatments and the behavior of the surgeon.


In one implementation, the operating theater devices 1a-1n may be connected to the modular communication hub 203 over a wired or a wireless channel, depending on how the devices 1a-1n are configured to connect to a network hub. The network hub 207 may be implemented, in one aspect, as a local network broadcast device that works on the physical layer of the Open System Interconnection (OSI) model. The network hub provides connectivity to the devices 1a-1n located in the same operating theater network. The network hub 207 collects data in the form of packets and sends them to the router in half duplex mode. The network hub 207 does not store any media access control/Internet Protocol (MAC/IP) addresses to transfer the device data. Only one of the devices 1a-1n can send data at a time through the network hub 207. The network hub 207 has no routing tables or intelligence regarding where to send information and broadcasts all network data across each connection and to a remote server 213 (FIG. 9) over the cloud 204. The network hub 207 can detect basic network errors such as collisions, but broadcasting all information to multiple ports can be a security risk and cause bottlenecks.


In another implementation, the operating theater devices 2a-2m may be connected to a network switch 209 over a wired channel or a wireless channel. The network switch 209 works in the data link layer of the OSI model. The network switch 209 is a multicast device for connecting the devices 2a-2m located in the same operating theater to the network. The network switch 209 sends data in the form of frames to the network router 211 and works in full duplex mode. Multiple devices 2a-2m can send data at the same time through the network switch 209. The network switch 209 stores and uses MAC addresses of the devices 2a-2m to transfer data.


The network hub 207 and/or the network switch 209 are coupled to the network router 211 for connection to the cloud 204. The network router 211 works in the network layer of the OSI model. The network router 211 creates a route for transmitting data packets received from the network hub 207 and/or the network switch 209 to cloud-based computer resources for further processing and manipulation of the data collected by any one of or all the devices 1a-1n/2a-2m. The network router 211 may be employed to connect two or more different networks located in different locations, such as, for example, different operating theaters of the same healthcare facility or different networks located in different operating theaters of different healthcare facilities. The network router 211 sends data in the form of packets to the cloud 204 and works in full duplex mode. Multiple devices can send data at the same time. The network router 211 uses IP addresses to transfer data.
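
The three forwarding behaviors just described (physical-layer broadcast by the network hub 207, MAC-based switching by the network switch 209, and IP-based routing by the network router 211) can be summarized in a brief, non-normative sketch. The class and field names below are hypothetical, and the router uses simple string-prefix matching as a stand-in for true longest-prefix routing; this is an illustration of the general techniques, not the disclosed implementation.

```python
class Hub:
    """Physical-layer repeater: sends every frame out all ports
    except the one it arrived on; keeps no address table."""
    def __init__(self, n_ports):
        self.ports = list(range(n_ports))

    def forward(self, frame, in_port):
        return [p for p in self.ports if p != in_port]


class Switch:
    """Data-link-layer device: learns source MAC addresses and forwards
    each frame only to the port where the destination was last seen."""
    def __init__(self, n_ports):
        self.ports = list(range(n_ports))
        self.mac_table = {}

    def forward(self, frame, in_port):
        self.mac_table[frame["src_mac"]] = in_port   # learn
        out = self.mac_table.get(frame["dst_mac"])
        if out is None:                              # unknown: flood
            return [p for p in self.ports if p != in_port]
        return [out]


class Router:
    """Network-layer device: picks a next hop from an IP prefix table."""
    def __init__(self, routes):
        self.routes = routes  # e.g., {"10.0.1.": "or-1-uplink"}

    def forward(self, packet):
        for prefix, next_hop in self.routes.items():
            if packet["dst_ip"].startswith(prefix):
                return next_hop
        return "default-gateway"
```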


In one example, the network hub 207 may be implemented as a USB hub, which allows multiple USB devices to be connected to a host computer. The USB hub may expand a single USB port into several tiers so that there are more ports available to connect devices to the host system computer. The network hub 207 may include wired or wireless capabilities to receive information over a wired channel or a wireless channel. In one aspect, a wireless USB protocol, which is a short-range, high-bandwidth radio communication protocol, may be employed for communication between the devices 1a-1n and devices 2a-2m located in the operating theater.


In other examples, the operating theater devices 1a-1n/2a-2m may communicate to the modular communication hub 203 via the Bluetooth wireless technology standard for exchanging data over short distances (using short-wavelength UHF radio waves in the ISM band from 2.4 to 2.485 GHz) from fixed and mobile devices and for building personal area networks (PANs). In other aspects, the operating theater devices 1a-1n/2a-2m may communicate to the modular communication hub 203 via a number of wireless or wired communication standards or protocols, including but not limited to Wi-Fi (IEEE 802.11 family), WiMAX (IEEE 802.16 family), IEEE 802.20, long-term evolution (LTE), Ev-DO, HSPA+, HSDPA+, HSUPA+, EDGE, GSM, GPRS, CDMA, TDMA, DECT, and Ethernet derivatives thereof, as well as any other wireless and wired protocols that are designated as 3G, 4G, 5G, and beyond. The computing module may include a plurality of communication modules. For instance, a first communication module may be dedicated to shorter-range wireless communications such as Wi-Fi and Bluetooth, and a second communication module may be dedicated to longer-range wireless communications such as GPS, EDGE, GPRS, CDMA, WiMAX, LTE, Ev-DO, and others.


The modular communication hub 203 may serve as a central connection for one or all of the operating theater devices 1a-1n/2a-2m and handle a data type known as frames. Frames carry the data generated by the devices 1a-1n/2a-2m. When a frame is received by the modular communication hub 203, it is amplified and transmitted to the network router 211, which transfers the data to the cloud computing resources by using a number of wireless or wired communication standards or protocols, as described herein.


The modular communication hub 203 can be used as a standalone device or be connected to compatible network hubs and network switches to form a larger network. The modular communication hub 203 is generally easy to install, configure, and maintain, making it a good option for networking the operating theater devices 1a-1n/2a-2m.



FIG. 9 illustrates a computer-implemented interactive surgical system 200. The computer-implemented interactive surgical system 200 is similar in many respects to the computer-implemented interactive surgical system 100. For example, the computer-implemented interactive surgical system 200 includes one or more surgical systems 202, which are similar in many respects to the surgical systems 102. Each surgical system 202 includes at least one surgical hub 206 in communication with a cloud 204 that may include a remote server 213. In one aspect, the computer-implemented interactive surgical system 200 comprises a modular control tower 236 connected to multiple operating theater devices such as, for example, intelligent surgical instruments, robots, and other computerized devices located in the operating theater. As shown in FIG. 10, the modular control tower 236 comprises a modular communication hub 203 coupled to a computer system 210. As illustrated in the example of FIG. 9, the modular control tower 236 is coupled to an imaging module 238 that is coupled to an endoscope 239, a generator module 240 that is coupled to an energy device 241, a smoke evacuator module 226, a suction/irrigation module 228, a communication module 230, a processor module 232, a storage array 234, a smart device/instrument 235 optionally coupled to a display 237, and a non-contact sensor module 242. The operating theater devices are coupled to cloud computing resources and data storage via the modular control tower 236. A robot hub 222 also may be connected to the modular control tower 236 and to the cloud computing resources. The devices/instruments 235, visualization systems 208, among others, may be coupled to the modular control tower 236 via wired or wireless communication standards or protocols, as described herein. The modular control tower 236 may be coupled to a hub display 215 (e.g., monitor, screen) to display and overlay images received from the imaging module, device/instrument display, and/or other visualization systems 208. The hub display also may display data received from devices connected to the modular control tower in conjunction with images and overlaid images.



FIG. 10 illustrates a surgical hub 206 comprising a plurality of modules coupled to the modular control tower 236. The modular control tower 236 comprises a modular communication hub 203, e.g., a network connectivity device, and a computer system 210 to provide local processing, visualization, and imaging, for example. As shown in FIG. 10, the modular communication hub 203 may be connected in a tiered configuration to expand the number of modules (e.g., devices) that may be connected to the modular communication hub 203 and transfer data associated with the modules to the computer system 210, cloud computing resources, or both. As shown in FIG. 10, each of the network hubs/switches in the modular communication hub 203 includes three downstream ports and one upstream port. The upstream network hub/switch is connected to a processor to provide a communication connection to the cloud computing resources and a local display 217. Communication to the cloud 204 may be made either through a wired or a wireless communication channel.
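
As a rough illustration of how the tiered configuration expands capacity, assume each network hub/switch contributes three downstream ports, every downstream port of the upper tiers feeds another hub/switch, and only the deepest tier faces modules. The function below is a hypothetical sketch of that arithmetic, not a constraint of the disclosed system.

```python
def exposed_module_ports(tiers: int, downstream: int = 3) -> int:
    """Downstream ports available to modules when each upper-tier port
    cascades into another hub/switch and only the deepest tier faces
    modules: one tier exposes 3 ports, two tiers 9, three tiers 27."""
    return downstream ** tiers
```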


The surgical hub 206 employs a non-contact sensor module 242 to measure the dimensions of the operating theater and generate a map of the surgical theater using either ultrasonic or laser-type non-contact measurement devices. An ultrasound-based non-contact sensor module scans the operating theater by transmitting a burst of ultrasound and receiving the echo when it bounces off the perimeter walls of the operating theater, as described under the heading "Surgical Hub Spatial Awareness Within an Operating Room" in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, which is herein incorporated by reference in its entirety. In that application, the sensor module is configured to determine the size of the operating theater and to adjust Bluetooth pairing distance limits. A laser-based non-contact sensor module scans the operating theater by transmitting laser light pulses, receiving laser light pulses that bounce off the perimeter walls of the operating theater, and comparing the phase of the transmitted pulse to the received pulse to determine the size of the operating theater and to adjust Bluetooth pairing distance limits, for example.
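
The underlying time-of-flight arithmetic is straightforward: an ultrasonic burst travels to a wall and back, so the one-way distance is half the round-trip time multiplied by the speed of sound. The sketch below is a minimal illustration under that assumption; the constants, function names, and pairing rule are hypothetical, not the disclosed implementation.

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # dry air at roughly 20 degrees C

def wall_distance_m(echo_round_trip_s: float) -> float:
    """One-way distance to a perimeter wall from an ultrasonic echo:
    the burst travels out and back, so halve the round-trip time."""
    return SPEED_OF_SOUND_M_PER_S * echo_round_trip_s / 2.0

def bluetooth_pairing_limit_m(*wall_distances_m: float) -> float:
    """Illustrative rule: cap the pairing distance at the farthest
    measured wall so devices outside the theater are not paired."""
    return max(wall_distances_m)

# a 23.3 ms echo implies a wall about 4 m away:
# wall_distance_m(0.0233) -> ~4.0
```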


The computer system 210 comprises a processor 244 and a network interface 245. The processor 244 is coupled to a communication module 247, storage 248, memory 249, non-volatile memory 250, and input/output interface 251 via a system bus. The system bus can be any of several types of bus structure(s), including the memory bus or memory controller, a peripheral bus or external bus, and/or a local bus using any variety of available bus architectures, including, but not limited to, an 8-bit bus, Industrial Standard Architecture (ISA), Micro-Channel Architecture (MCA), Extended ISA (EISA), Intelligent Drive Electronics (IDE), VESA Local Bus (VLB), Peripheral Component Interconnect (PCI), USB, Advanced Graphics Port (AGP), Personal Computer Memory Card International Association bus (PCMCIA), Small Computer Systems Interface (SCSI), or any other proprietary bus.


The processor 244 may be any single-core or multicore processor such as those known under the trade name ARM Cortex by Texas Instruments. In one aspect, the processor may be an LM4F230H5QR ARM Cortex-M4F processor core, available from Texas Instruments, for example, comprising on-chip memory of 256 KB single-cycle flash memory, or other non-volatile memory, up to 40 MHz, a prefetch buffer to improve performance above 40 MHz, a 32 KB single-cycle static random-access memory (SRAM), an internal read-only memory (ROM) loaded with StellarisWare® software, a 2 KB electrically erasable programmable read-only memory (EEPROM), one or more pulse width modulation (PWM) modules, one or more quadrature encoder inputs (QEIs), and one or more 12-bit analog-to-digital converters (ADCs) with 12 analog input channels, details of which are available in the product datasheet.


In one aspect, the processor 244 may comprise a safety controller comprising two controller-based families such as TMS570 and RM4x, known under the trade name Hercules ARM Cortex R4, also by Texas Instruments. The safety controller may be configured specifically for IEC 61508 and ISO 26262 safety critical applications, among others, to provide advanced integrated safety features while delivering scalable performance, connectivity, and memory options.


The system memory includes volatile memory and non-volatile memory. The basic input/output system (BIOS), containing the basic routines to transfer information between elements within the computer system, such as during start-up, is stored in non-volatile memory. For example, the non-volatile memory can include ROM, programmable ROM (PROM), electrically programmable ROM (EPROM), EEPROM, or flash memory. Volatile memory includes random-access memory (RAM), which acts as external cache memory. Moreover, RAM is available in many forms such as SRAM, dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and direct Rambus RAM (DRRAM).


The computer system 210 also includes removable/non-removable, volatile/non-volatile computer storage media, such as for example disk storage. The disk storage includes, but is not limited to, devices like a magnetic disk drive, floppy disk drive, tape drive, Jaz drive, Zip drive, LS-60 drive, flash memory card, or memory stick. In addition, the disk storage can include storage media separately or in combination with other storage media including, but not limited to, an optical disc drive such as a compact disc ROM device (CD-ROM), compact disc recordable drive (CD-R Drive), compact disc rewritable drive (CD-RW Drive), or a digital versatile disc ROM drive (DVD-ROM). To facilitate the connection of the disk storage devices to the system bus, a removable or non-removable interface may be employed.


It is to be appreciated that the computer system 210 includes software that acts as an intermediary between users and the basic computer resources described in a suitable operating environment. Such software includes an operating system. The operating system, which can be stored on the disk storage, acts to control and allocate resources of the computer system. System applications take advantage of the management of resources by the operating system through program modules and program data stored either in the system memory or on the disk storage. It is to be appreciated that various components described herein can be implemented with various operating systems or combinations of operating systems.


A user enters commands or information into the computer system 210 through input device(s) coupled to the I/O interface 251. The input devices include, but are not limited to, a pointing device such as a mouse, trackball, stylus, touch pad, keyboard, microphone, joystick, game pad, satellite dish, scanner, TV tuner card, digital camera, digital video camera, web camera, and the like. These and other input devices connect to the processor through the system bus via interface port(s). The interface port(s) include, for example, a serial port, a parallel port, a game port, and a USB. The output device(s) use some of the same types of ports as input device(s). Thus, for example, a USB port may be used to provide input to the computer system and to output information from the computer system to an output device. An output adapter is provided to illustrate that there are some output devices like monitors, displays, speakers, and printers, among other output devices that require special adapters. The output adapters include, by way of illustration and not limitation, video and sound cards that provide a means of connection between the output device and the system bus. It should be noted that other devices and/or systems of devices, such as remote computer(s), provide both input and output capabilities.


The computer system 210 can operate in a networked environment using logical connections to one or more remote computers, such as cloud computer(s), or local computers. The remote cloud computer(s) can be a personal computer, server, router, network PC, workstation, microprocessor-based appliance, peer device, or other common network node, and the like, and typically includes many or all of the elements described relative to the computer system. For purposes of brevity, only a memory storage device is illustrated with the remote computer(s). The remote computer(s) is logically connected to the computer system through a network interface and then physically connected via a communication connection. The network interface encompasses communication networks such as local area networks (LANs) and wide area networks (WANs). LAN technologies include Fiber Distributed Data Interface (FDDI), Copper Distributed Data Interface (CDDI), Ethernet/IEEE 802.3, Token Ring/IEEE 802.5 and the like. WAN technologies include, but are not limited to, point-to-point links, circuit-switching networks like Integrated Services Digital Networks (ISDN) and variations thereon, packet-switching networks, and Digital Subscriber Lines (DSL).


In various aspects, the computer system 210 of FIG. 10, the imaging module 238 and/or visualization system 208, and/or the processor module 232 of FIGS. 9-10, may comprise an image processor, image-processing engine, media processor, or any specialized digital signal processor (DSP) used for the processing of digital images. The image processor may employ parallel computing with single instruction, multiple data (SIMD) or multiple instruction, multiple data (MIMD) technologies to increase speed and efficiency. The digital image-processing engine can perform a range of tasks. The image processor may be a system on a chip with multicore processor architecture.


The communication connection(s) refers to the hardware/software employed to connect the network interface to the bus. While the communication connection is shown for illustrative clarity inside the computer system, it can also be external to the computer system 210. The hardware/software necessary for connection to the network interface includes, for illustrative purposes only, internal and external technologies such as modems, including regular telephone-grade modems, cable modems, and DSL modems, ISDN adapters, and Ethernet cards.


In various aspects, the devices/instruments 235 described with reference to FIGS. 9-10 may be implemented as the surgical device 214002 (FIGS. 16-18), the display screen (FIG. 16), the wearable devices 214100, 214102, 214200, 214202 (FIGS. 19-21), and the display 214400 (FIGS. 22A and 22B). Accordingly, the surgical device 214002 (FIGS. 16-18), the display screen (FIG. 16), the wearable devices 214100, 214102, 214200, 214202 (FIGS. 19-21), and the display 214400 (FIGS. 22A and 22B) are configured to interface with the modular control tower 236 and the surgical hub 206. Once connected to the surgical hub 206, the surgical device 214002 (FIGS. 16-18), the display screen (FIG. 16), the wearable devices 214100, 214102, 214200, 214202 (FIGS. 19-21), and the display 214400 (FIGS. 22A and 22B) are configured to interface with the cloud 204, the server 213, other hub-connected instruments, the hub display 215, or the visualization system 208, or combinations thereof. Further, once connected to the surgical hub 206, the surgical device 214002 (FIGS. 16-18), the display screen (FIG. 16), the wearable devices 214100, 214102, 214200, 214202 (FIGS. 19-21), and the display 214400 (FIGS. 22A and 22B) may utilize the processing circuits available in the hub local computer system 210.



FIG. 11 illustrates a functional block diagram of one aspect of a USB network hub 300 device, in accordance with at least one aspect of the present disclosure. In the illustrated aspect, the USB network hub device 300 employs a TUSB2036 integrated circuit hub by Texas Instruments. The USB network hub 300 is a CMOS device that provides an upstream USB transceiver port 302 and up to three downstream USB transceiver ports 304, 306, 308 in compliance with the USB 2.0 specification. The upstream USB transceiver port 302 is a differential root data port comprising a differential data minus (DM0) input paired with a differential data plus (DP0) input. The three downstream USB transceiver ports 304, 306, 308 are differential data ports where each port includes differential data plus (DP1-DP3) outputs paired with differential data minus (DM1-DM3) outputs.


The USB network hub 300 device is implemented with a digital state machine instead of a microcontroller, and no firmware programming is required. Fully compliant USB transceivers are integrated into the circuit for the upstream USB transceiver port 302 and all downstream USB transceiver ports 304, 306, 308. The downstream USB transceiver ports 304, 306, 308 support both full-speed and low-speed devices by automatically setting the slew rate according to the speed of the device attached to the ports. The USB network hub 300 device may be configured either in bus-powered or self-powered mode and includes a hub power logic 312 to manage power.


The USB network hub 300 device includes a serial interface engine 310 (SIE). The SIE 310 is the front end of the USB network hub 300 hardware and handles most of the protocol described in chapter 8 of the USB specification. The SIE 310 typically comprehends signaling up to the transaction level. The functions that it handles could include: packet recognition, transaction sequencing, start-of-packet (SOP), end-of-packet (EOP), RESET, and RESUME signal detection/generation, clock/data separation, non-return-to-zero invert (NRZI) data encoding/decoding and bit-stuffing, CRC generation and checking (token and data), packet ID (PID) generation and checking/decoding, and/or serial-parallel/parallel-serial conversion. The SIE 310 receives a clock input 314 and is coupled to a suspend/resume logic and frame timer 316 circuit and a hub repeater circuit 318 to control communication between the upstream USB transceiver port 302 and the downstream USB transceiver ports 304, 306, 308 through port logic circuits 320, 322, 324. The SIE 310 is coupled to a command decoder 326 via interface logic 328 to control commands from a serial EEPROM via a serial EEPROM interface 330.
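
Two of the SIE functions listed above, bit-stuffing and NRZI encoding, follow directly from the USB specification: a zero is inserted after six consecutive ones so the receiver never loses bit synchronization, and NRZI represents a 0 bit as a line transition and a 1 bit as no transition. The sketch below illustrates those two rules in isolation; it is an explanatory model, not the TUSB2036 logic, which is implemented as a hardware state machine.

```python
def bit_stuff(bits):
    """USB bit stuffing: insert a 0 after every six consecutive 1s so
    the NRZI stream always contains transitions for clock recovery."""
    out, run = [], 0
    for b in bits:
        out.append(b)
        run = run + 1 if b else 0
        if run == 6:
            out.append(0)
            run = 0
    return out

def nrzi_encode(bits, level=1):
    """USB NRZI: a 0 toggles the line level, a 1 leaves it unchanged."""
    out = []
    for b in bits:
        if b == 0:
            level ^= 1
        out.append(level)
    return out

# bit_stuff([1] * 7)        -> [1, 1, 1, 1, 1, 1, 0, 1]
# nrzi_encode([0, 1, 0, 0]) -> [0, 0, 1, 0]
```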


In various aspects, the USB network hub 300 can connect 127 functions configured in up to six logical layers (tiers) to a single computer. Further, the USB network hub 300 can connect to all peripherals using a standardized four-wire cable that provides both communication and power distribution. The power configurations are bus-powered and self-powered modes. The USB network hub 300 may be configured to support four modes of power management: a bus-powered hub, with either individual-port power management or ganged-port power management, and the self-powered hub, with either individual-port power management or ganged-port power management. In one aspect, using a USB cable, the upstream USB transceiver port 302 of the USB network hub 300 is plugged into a USB host controller, and the downstream USB transceiver ports 304, 306, 308 are exposed for connecting USB-compatible devices, and so forth.


Additional details regarding the structure and function of the surgical hub and/or surgical hub networks can be found in U.S. Provisional Patent Application No. 62/659,900, titled METHOD OF HUB COMMUNICATION, filed Apr. 19, 2018, which is hereby incorporated by reference herein in its entirety.


Cloud System Hardware and Functional Modules


FIG. 12 is a block diagram of the computer-implemented interactive surgical system, in accordance with at least one aspect of the present disclosure. In one aspect, the computer-implemented interactive surgical system is configured to monitor and analyze data related to the operation of various surgical systems that include surgical hubs, surgical instruments, robotic devices and operating theaters or healthcare facilities. The computer-implemented interactive surgical system comprises a cloud-based analytics system. Although the cloud-based analytics system is described as a surgical system, it is not necessarily limited as such and could be a cloud-based medical system generally. As illustrated in FIG. 12, the cloud-based analytics system comprises a plurality of surgical instruments 7012 (may be the same or similar to instruments 112), a plurality of surgical hubs 7006 (may be the same or similar to hubs 106), and a surgical data network 7001 (may be the same or similar to network 201) to couple the surgical hubs 7006 to the cloud 7004 (may be the same or similar to cloud 204). Each of the plurality of surgical hubs 7006 is communicatively coupled to one or more surgical instruments 7012. The hubs 7006 are also communicatively coupled to the cloud 7004 of the computer-implemented interactive surgical system via the network 7001. The cloud 7004 is a remote centralized source of hardware and software for storing, manipulating, and communicating data generated based on the operation of various surgical systems. As shown in FIG. 12, access to the cloud 7004 is achieved via the network 7001, which may be the Internet or some other suitable computer network. Surgical hubs 7006 that are coupled to the cloud 7004 can be considered the client side of the cloud computing system (i.e., cloud-based analytics system). Surgical instruments 7012 are paired with the surgical hubs 7006 for control and implementation of various surgical procedures or operations as described herein.


In addition, surgical instruments 7012 may comprise transceivers for data transmission to and from their corresponding surgical hubs 7006 (which may also comprise transceivers). Combinations of surgical instruments 7012 and corresponding hubs 7006 may indicate particular locations, such as operating theaters in healthcare facilities (e.g., hospitals), for providing medical operations. For example, the memory of a surgical hub 7006 may store location data. As shown in FIG. 12, the cloud 7004 comprises central servers 7013 (which may be the same as or similar to remote server 113 in FIG. 1 and/or remote server 213 in FIG. 9), hub application servers 7002, data analytics modules 7034, and an input/output ("I/O") interface 7007. The central servers 7013 of the cloud 7004 collectively administer the cloud computing system, which includes monitoring requests by client surgical hubs 7006 and managing the processing capacity of the cloud 7004 for executing the requests. Each of the central servers 7013 comprises one or more processors 7008 coupled to suitable memory devices 7010, which can include volatile memory such as random-access memory (RAM) and non-volatile memory such as magnetic storage devices. The memory devices 7010 may comprise machine-executable instructions that, when executed, cause the processors 7008 to execute the data analytics modules 7034 for the cloud-based data analysis, operations, recommendations, and other operations described below. Moreover, the processors 7008 can execute the data analytics modules 7034 independently or in conjunction with hub applications independently executed by the hubs 7006. The central servers 7013 also comprise aggregated medical data databases 7011, which can reside in the memory 7010.


Based on connections to various surgical hubs 7006 via the network 7001, the cloud 7004 can aggregate the data generated by various surgical instruments 7012 and their corresponding hubs 7006. Such aggregated data may be stored within the aggregated medical data databases 7011 of the cloud 7004. In particular, the cloud 7004 may advantageously perform data analysis and operations on the aggregated data to yield insights and/or perform functions that individual hubs 7006 could not achieve on their own. To this end, as shown in FIG. 12, the cloud 7004 and the surgical hubs 7006 are communicatively coupled to transmit and receive information. The I/O interface 7007 is connected to the plurality of surgical hubs 7006 via the network 7001. In this way, the I/O interface 7007 can be configured to transfer information between the surgical hubs 7006 and the aggregated medical data databases 7011. Accordingly, the I/O interface 7007 may facilitate read/write operations of the cloud-based analytics system. Such read/write operations may be executed in response to requests from hubs 7006. These requests could be transmitted to the hubs 7006 through the hub applications. The I/O interface 7007 may include one or more high-speed data ports, which may include universal serial bus (USB) ports, IEEE 1394 ports, as well as Wi-Fi and Bluetooth I/O interfaces for connecting the cloud 7004 to hubs 7006. The hub application servers 7002 of the cloud 7004 are configured to host and supply shared capabilities to software applications (e.g., hub applications) executed by surgical hubs 7006. For example, the hub application servers 7002 may manage requests made by the hub applications through the hubs 7006, control access to the aggregated medical data databases 7011, and perform load balancing. The data analytics modules 7034 are described in further detail with reference to FIG. 13.


The particular cloud computing system configuration described in the present disclosure is specifically designed to address various issues arising in the context of medical operations and procedures performed using medical devices, such as the surgical instruments 7012, 112. In particular, the surgical instruments 7012 may be digital surgical devices configured to interact with the cloud 7004 for implementing techniques to improve the performance of surgical operations. Various surgical instruments 7012 and/or surgical hubs 7006 may comprise touch controlled user interfaces such that clinicians may control aspects of interaction between the surgical instruments 7012 and the cloud 7004. Other suitable user interfaces for control such as auditory controlled user interfaces can also be used.



FIG. 13 is a block diagram which illustrates the functional architecture of the computer-implemented interactive surgical system, in accordance with at least one aspect of the present disclosure. The cloud-based analytics system includes a plurality of data analytics modules 7034 that may be executed by the processors 7008 of the cloud 7004 for providing data analytic solutions to problems specifically arising in the medical field. As shown in FIG. 13, the functions of the cloud-based data analytics modules 7034 may be assisted via hub applications 7014 hosted by the hub application servers 7002 that may be accessed on surgical hubs 7006. The cloud processors 7008 and hub applications 7014 may operate in conjunction to execute the data analytics modules 7034. Application program interfaces (APIs) 7016 define the set of protocols and routines corresponding to the hub applications 7014. Additionally, the APIs 7016 manage the storing and retrieval of data into and from the aggregated medical data databases 7011 for the operations of the applications 7014. The caches 7018 also store data (e.g., temporarily) and are coupled to the APIs 7016 for more efficient retrieval of data used by the applications 7014. The data analytics modules 7034 in FIG. 13 include modules for resource optimization 7020, data collection and aggregation 7022, authorization and security 7024, control program updating 7026, patient outcome analysis 7028, recommendations 7030, and data sorting and prioritization 7032. Other suitable data analytics modules could also be implemented by the cloud 7004, according to some aspects. In one aspect, the data analytics modules are used for specific recommendations based on analyzing trends, outcomes, and other data.


For example, the data collection and aggregation module 7022 could be used to generate self-describing data (e.g., metadata), including identification of notable features or configurations (e.g., trends), management of redundant data sets, and storage of the data in paired data sets that can be grouped by surgery but not necessarily keyed to actual surgical dates and surgeons. In particular, paired data sets generated from operations of surgical instruments 7012 can comprise applying a binary classification, e.g., a bleeding or a non-bleeding event. More generally, the binary classification may be characterized as either a desirable event (e.g., a successful surgical procedure) or an undesirable event (e.g., a misfired or misused surgical instrument 7012). The aggregated self-describing data may correspond to individual data received from various groups or subgroups of surgical hubs 7006. Accordingly, the data collection and aggregation module 7022 can generate aggregated metadata or other organized data based on raw data received from the surgical hubs 7006. To this end, the processors 7008 can be operationally coupled to the hub applications 7014 and aggregated medical data databases 7011 for executing the data analytics modules 7034. The data collection and aggregation module 7022 may store the aggregated organized data into the aggregated medical data databases 7011.
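
A minimal sketch of the grouping just described follows, assuming events arrive as simple records carrying a surgery identifier and a bleeding flag; the field names and metadata fields are hypothetical stand-ins for the self-describing data the module would actually generate.

```python
from collections import defaultdict

def aggregate_paired_sets(events):
    """Group instrument events by surgery and attach a binary class
    (bleeding / non-bleeding) without keying on surgeon or actual date."""
    grouped = defaultdict(list)
    for event in events:  # e.g., {"surgery_id": "s-17", "bleeding": False}
        label = "bleeding" if event["bleeding"] else "non-bleeding"
        grouped[event["surgery_id"]].append({**event, "class": label})
    metadata = {  # self-describing summary of the aggregate
        "n_surgeries": len(grouped),
        "n_events": sum(len(v) for v in grouped.values()),
    }
    return {"metadata": metadata, "paired_sets": dict(grouped)}
```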


The resource optimization module 7020 can be configured to analyze this aggregated data to determine an optimal usage of resources for a particular healthcare facility or group of healthcare facilities. For example, the resource optimization module 7020 may determine an optimal order point of surgical stapling instruments 7012 for a group of healthcare facilities based on the corresponding predicted demand for such instruments 7012. The resource optimization module 7020 might also assess the resource usage or other operational configurations of various healthcare facilities to determine whether resource usage could be improved. Similarly, the recommendations module 7030 can be configured to analyze aggregated organized data from the data collection and aggregation module 7022 to provide recommendations. For example, the recommendations module 7030 could recommend to healthcare facilities (e.g., medical service providers such as hospitals) that a particular surgical instrument 7012 should be upgraded to an improved version based on a higher-than-expected error rate. Additionally, the recommendations module 7030 and/or resource optimization module 7020 could recommend better supply chain parameters, such as product reorder points, and provide suggestions of different surgical instruments 7012, uses thereof, or procedure steps to improve surgical outcomes. The healthcare facilities can receive such recommendations via corresponding surgical hubs 7006. More specific recommendations regarding parameters or configurations of various surgical instruments 7012 can also be provided. Hubs 7006 and/or surgical instruments 7012 each could also have display screens that display data or recommendations provided by the cloud 7004.
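
One conventional way to compute such an order point is the classic reorder-point formula: expected demand over the resupply lead time plus a safety stock scaled by demand variability. The sketch below applies that textbook formula; it is an assumption about how the module might be realized, not the disclosed algorithm, and all names and figures are illustrative.

```python
import math

def reorder_point(daily_demand, lead_time_days, demand_sd, z=1.65):
    """Reorder when stock falls to expected lead-time demand plus a
    z-scaled safety stock (z = 1.65 targets roughly a 95% service
    level under approximately normal daily demand)."""
    safety_stock = z * demand_sd * math.sqrt(lead_time_days)
    return math.ceil(daily_demand * lead_time_days + safety_stock)

# e.g., 4 staplers/day, 7-day lead time, demand sd of 1.5/day:
# reorder_point(4, 7, 1.5) -> 35
```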


The patient outcome analysis module 7028 can analyze surgical outcomes associated with currently used operational parameters of surgical instruments 7012. The patient outcome analysis module 7028 may also analyze and assess other potential operational parameters. In this connection, the recommendations module 7030 could recommend using these other potential operational parameters based on yielding better surgical outcomes, such as better sealing or less bleeding. For example, the recommendations module 7030 could transmit recommendations to a surgical hub 7006 regarding when to use a particular cartridge for a corresponding stapling surgical instrument 7012. Thus, the cloud-based analytics system, while controlling for common variables, may be configured to analyze the large collection of raw data and to provide centralized recommendations over multiple healthcare facilities (advantageously determined based on aggregated data). For example, the cloud-based analytics system could analyze, evaluate, and/or aggregate data based on type of medical practice, type of patient, number of patients, geographic similarity between medical providers, which medical providers/facilities use similar types of instruments, etc., in a way that no single healthcare facility alone would be able to analyze independently.


The control program updating module 7026 could be configured to implement various surgical instrument 7012 recommendations when corresponding control programs are updated. For example, the patient outcome analysis module 7028 could identify correlations linking specific control parameters with successful (or unsuccessful) results. Such correlations may be addressed when updated control programs are transmitted to surgical instruments 7012 via the control program updating module 7026. Updates to instruments 7012 that are transmitted via a corresponding hub 7006 may incorporate aggregated performance data that was gathered and analyzed by the data collection and aggregation module 7022 of the cloud 7004. Additionally, the patient outcome analysis module 7028 and recommendations module 7030 could identify improved methods of using instruments 7012 based on aggregated performance data.


The cloud-based analytics system may include security features implemented by the cloud 7004. These security features may be managed by the authorization and security module 7024. Each surgical hub 7006 can have associated unique credentials such as username, password, and other suitable security credentials. These credentials could be stored in the memory 7010 and be associated with a permitted cloud access level. For example, based on providing accurate credentials, a surgical hub 7006 may be granted access to communicate with the cloud to a predetermined extent (e.g., may only engage in transmitting or receiving certain defined types of information). To this end, the aggregated medical data databases 7011 of the cloud 7004 may comprise a database of authorized credentials for verifying the accuracy of provided credentials. Different credentials may be associated with varying levels of permission for interaction with the cloud 7004, such as a predetermined access level for receiving the data analytics generated by the cloud 7004.


Furthermore, for security purposes, the cloud could maintain a database of hubs 7006, instruments 7012, and other devices that may comprise a “black list” of prohibited devices. In particular, a surgical hub 7006 listed on the black list may not be permitted to interact with the cloud, while surgical instruments 7012 listed on the black list may not have functional access to a corresponding hub 7006 and/or may be prevented from fully functioning when paired to its corresponding hub 7006. Additionally or alternatively, the cloud 7004 may flag instruments 7012 based on incompatibility or other specified criteria. In this manner, counterfeit medical devices and improper reuse of such devices throughout the cloud-based analytics system can be identified and addressed.


The surgical instruments 7012 may use wireless transceivers to transmit wireless signals that may represent, for example, authorization credentials for access to corresponding hubs 7006 and the cloud 7004. Wired transceivers may also be used to transmit signals. Such authorization credentials can be stored in the respective memory devices of the surgical instruments 7012. The authorization and security module 7024 can determine whether the authorization credentials are accurate or counterfeit. The authorization and security module 7024 may also dynamically generate authorization credentials for enhanced security. The credentials could also be encrypted, such as by using hash-based encryption. Upon transmitting proper authorization credentials, the surgical instruments 7012 may transmit a signal to the corresponding hubs 7006, and ultimately the cloud 7004, to indicate that the instruments 7012 are ready to obtain and transmit medical data. In response, the cloud 7004 may transition into a state enabled for receiving medical data for storage into the aggregated medical data databases 7011. This data transmission readiness could be indicated by a light indicator on the instruments 7012, for example. The cloud 7004 can also transmit signals to surgical instruments 7012 for updating their associated control programs. The cloud 7004 can transmit signals that are directed to a particular class of surgical instruments 7012 (e.g., electrosurgical instruments) so that software updates to control programs are only transmitted to the appropriate surgical instruments 7012. Moreover, the cloud 7004 could be used to implement system-wide solutions to address local or global problems based on selective data transmission and authorization credentials. For example, if a group of surgical instruments 7012 is identified as having a common manufacturing defect, the cloud 7004 may change the authorization credentials corresponding to this group to implement an operational lockout of the group.
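
Strictly speaking, hashing is one-way rather than encryption, but the intent, verifying credentials without ever exposing or storing the secret itself, can be sketched with standard primitives. The example below uses salted PBKDF2 and a constant-time comparison; it is an illustrative pattern, not the disclosed credential scheme, and the function names are hypothetical.

```python
import hashlib
import hmac
import os

def enroll(credential: bytes):
    """Store only a salted hash of a hub or instrument credential."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", credential, salt, 100_000)
    return salt, digest

def verify(credential: bytes, salt: bytes, stored: bytes) -> bool:
    """Recompute the hash and compare in constant time so timing does
    not leak how many leading bytes of the digest matched."""
    candidate = hashlib.pbkdf2_hmac("sha256", credential, salt, 100_000)
    return hmac.compare_digest(candidate, stored)
```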


The cloud-based analytics system may allow for monitoring multiple healthcare facilities (e.g., medical facilities like hospitals) to determine improved practices and recommend changes (via the recommendations module 7030, for example) accordingly. Thus, the processors 7008 of the cloud 7004 can analyze data associated with an individual healthcare facility to identify the facility and aggregate the data with other data associated with other healthcare facilities in a group. Groups could be defined based on similar operating practices or geographical location, for example. In this way, the cloud 7004 may provide analysis and recommendations across an entire group of healthcare facilities. The cloud-based analytics system could also be used for enhanced situational awareness. For example, the processors 7008 may predictively model the effects of recommendations on the cost and effectiveness for a particular facility (relative to overall operations and/or various medical procedures). The cost and effectiveness associated with that particular facility can also be compared to a corresponding local region of other facilities or any other comparable facilities.


The data sorting and prioritization module 7032 may prioritize and sort data based on criticality (e.g., the severity of a medical event associated with the data, unexpectedness, suspiciousness). This sorting and prioritization may be used in conjunction with the functions of the other data analytics modules 7034 described above to improve the cloud-based analytics and operations described herein. For example, the data sorting and prioritization module 7032 can assign a priority to the data analysis performed by the data collection and aggregation module 7022 and the patient outcome analysis module 7028. Different prioritization levels can result in particular responses from the cloud 7004 (corresponding to a level of urgency), such as escalation for an expedited response, special processing, exclusion from the aggregated medical data databases 7011, or other suitable responses. Moreover, if necessary, the cloud 7004 can transmit a request (e.g., a push message) through the hub application servers for additional data from corresponding surgical instruments 7012. The push message can result in a notification displayed on the corresponding hubs 7006 for requesting supporting or additional data. This push message may be required in situations in which the cloud detects a significant irregularity or outlier and cannot determine the cause of the irregularity. The central servers 7013 may be programmed to trigger this push message in certain significant circumstances, such as when data is determined to be different from an expected value beyond a predetermined threshold or when it appears that security has been compromised, for example.
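
The prioritization itself can be as simple as a heap keyed on criticality, with the most urgent records popped first; the severity labels and numeric mapping below are hypothetical placeholders for whatever criteria the module actually applies.

```python
import heapq

CRITICALITY = {"severe": 0, "unexpected": 1, "suspicious": 2, "routine": 3}

def by_criticality(records):
    """Yield records most-critical first; the enumeration index breaks
    ties so records of equal severity keep their arrival order."""
    heap = [(CRITICALITY.get(r.get("severity"), 3), i, r)
            for i, r in enumerate(records)]
    heapq.heapify(heap)
    while heap:
        _, _, record = heapq.heappop(heap)
        yield record
```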


In various aspects, the surgical instrument(s) 7012 described above with reference to FIGS. 12 and 13 may be implemented as the surgical device 214002 (FIGS. 16-18), the display screen (FIG. 16), the wearable devices 214100, 214102, 214200, 214202 (FIGS. 19-21), and the display 214400 (FIGS. 22A and 22B). Accordingly, the surgical device 214002 (FIGS. 16-18), the display screen (FIG. 16), the wearable devices 214100, 214102, 214200, 214202 (FIGS. 19-21), and the display 214400 (FIGS. 22A and 22B) are configured to interface with the surgical hub 7006 and the network 7001, which is configured to interface with the cloud 7004. Accordingly, the processing power provided by the central servers 7013 and the data analytics modules 7034 is configured to process information (e.g., data and control) from the surgical device 214002 (FIGS. 16-18), the display screen (FIG. 16), the wearable devices 214100, 214102, 214200, 214202 (FIGS. 19-21), and the display 214400 (FIGS. 22A and 22B).


Additional details regarding the cloud analysis system can be found in U.S. Provisional Patent Application No. 62/659,900, titled METHOD OF HUB COMMUNICATION, filed Apr. 19, 2018, which is hereby incorporated by reference herein in its entirety.


Situational Awareness

Although an “intelligent” device including control algorithms that respond to sensed data can be an improvement over a “dumb” device that operates without accounting for sensed data, some sensed data can be incomplete or inconclusive when considered in isolation, i.e., without the context of the type of surgical procedure being performed or the type of tissue that is being operated on. Without knowing the procedural context (e.g., knowing the type of tissue being operated on or the type of procedure being performed), the control algorithm may control the modular device incorrectly or suboptimally given the particular context-free sensed data. For example, the optimal manner for a control algorithm to control a surgical instrument in response to a particular sensed parameter can vary according to the particular tissue type being operated on. This is due to the fact that different tissue types have different properties (e.g., resistance to tearing) and thus respond differently to actions taken by surgical instruments. Therefore, it may be desirable for a surgical instrument to take different actions even when the same measurement for a particular parameter is sensed. As one specific example, the optimal manner in which to control a surgical stapling and cutting instrument in response to the instrument sensing an unexpectedly high force to close its end effector will vary depending upon whether the tissue type is susceptible or resistant to tearing. For tissues that are susceptible to tearing, such as lung tissue, the instrument's control algorithm would optimally ramp down the motor in response to an unexpectedly high force to close to avoid tearing the tissue. For tissues that are resistant to tearing, such as stomach tissue, the instrument's control algorithm would optimally ramp up the motor in response to an unexpectedly high force to close to ensure that the end effector is clamped properly on the tissue. Without knowing whether lung or stomach tissue has been clamped, the control algorithm may make a suboptimal decision.
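
The decision logic in that example can be made concrete with a short sketch: the same sensed condition (an unexpectedly high force to close) maps to opposite motor responses depending on the inferred tissue type. The function and return labels below are illustrative assumptions for exposition, not a disclosed control algorithm.

```python
def motor_response(force_to_close, expected_force, tissue_type):
    """Context-dependent reaction to an unexpectedly high closure force:
    back off for friable tissue, press on for tough tissue."""
    if force_to_close <= expected_force:
        return "maintain"
    if tissue_type == "lung":      # susceptible to tearing
        return "ramp_down"
    if tissue_type == "stomach":   # resistant to tearing
        return "ramp_up"
    return "pause_and_alert"       # unknown context: fail safe
```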


One solution utilizes a surgical hub including a system that is configured to derive information about the surgical procedure being performed based on data received from various data sources and then control the paired modular devices accordingly. In other words, the surgical hub is configured to infer information about the surgical procedure from received data and then control the modular devices paired to the surgical hub based upon the inferred context of the surgical procedure. FIG. 14 illustrates a diagram of a situationally aware surgical system 5100, in accordance with at least one aspect of the present disclosure. In some exemplifications, the data sources 5126 include, for example, the modular devices 5102 (which can include sensors configured to detect parameters associated with the patient and/or the modular device itself), databases 5122 (e.g., an EMR database containing patient records), and patient monitoring devices 5124 (e.g., a blood pressure (BP) monitor and an electrocardiography (EKG) monitor).


A surgical hub 5104, which may be similar to the hub 106 in many respects, can be configured to derive the contextual information pertaining to the surgical procedure from the data based upon, for example, the particular combination(s) of received data or the particular order in which the data is received from the data sources 5126. The contextual information inferred from the received data can include, for example, the type of surgical procedure being performed, the particular step of the surgical procedure that the surgeon is performing, the type of tissue being operated on, or the body cavity that is the subject of the procedure. This ability by some aspects of the surgical hub 5104 to derive or infer information related to the surgical procedure from received data can be referred to as “situational awareness.” In one exemplification, the surgical hub 5104 can incorporate a situational awareness system, which is the hardware and/or programming associated with the surgical hub 5104 that derives contextual information pertaining to the surgical procedure from the received data.


The situational awareness system of the surgical hub 5104 can be configured to derive the contextual information from the data received from the data sources 5126 in a variety of different ways. In one exemplification, the situational awareness system includes a pattern recognition system, or machine learning system (e.g., an artificial neural network), that has been trained on training data to correlate various inputs (e.g., data from databases 5122, patient monitoring devices 5124, and/or modular devices 5102) to corresponding contextual information regarding a surgical procedure. In other words, a machine learning system can be trained to accurately derive contextual information regarding a surgical procedure from the provided inputs. In another exemplification, the situational awareness system can include a lookup table storing pre-characterized contextual information regarding a surgical procedure in association with one or more inputs (or ranges of inputs) corresponding to the contextual information. In response to a query with one or more inputs, the lookup table can return the corresponding contextual information for the situational awareness system for controlling the modular devices 5102. In one exemplification, the contextual information received by the situational awareness system of the surgical hub 5104 is associated with a particular control adjustment or set of control adjustments for one or more modular devices 5102. In another exemplification, the situational awareness system includes a further machine learning system, lookup table, or other such system, which generates or retrieves one or more control adjustments for one or more modular devices 5102 when provided the contextual information as input.
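
By way of a non-limiting illustration, the lookup-table exemplification described above might be sketched as follows in Python. The data-source names, observations, inferred contexts, and control adjustments are hypothetical placeholders, not the actual hub implementation.

```python
# Minimal sketch of a lookup-table situational awareness system.
# All keys and values below are illustrative assumptions.

CONTEXT_LOOKUP = {
    # (data source, observed input) -> pre-characterized contextual information
    ("ventilator", "one_lung_ventilation"): "lung collapsed; operative phase begun",
    ("generator", "ultrasonic_activation"): "dissection step in progress",
    ("stapler", "parenchyma_cartridge"): "segmentectomy transection step",
}

# A second table can associate the inferred context with control adjustments
# for one or more paired modular devices, as described above.
CONTROL_ADJUSTMENTS = {
    "dissection step in progress": {"generator": "dissection_energy_profile"},
}

def infer_context(source, observation):
    """Return the pre-characterized contextual information for an input, if any."""
    return CONTEXT_LOOKUP.get((source, observation))

def adjustments_for(context):
    """Return the control adjustment(s) associated with the contextual information."""
    return CONTROL_ADJUSTMENTS.get(context, {})
```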


A surgical hub 5104 incorporating a situational awareness system provides a number of benefits for the surgical system 5100. One benefit includes improving the interpretation of sensed and collected data, which would in turn improve the processing accuracy and/or the usage of the data during the course of a surgical procedure. To return to a previous example, a situationally aware surgical hub 5104 could determine what type of tissue was being operated on; therefore, when an unexpectedly high force to close the surgical instrument's end effector is detected, the situationally aware surgical hub 5104 could correctly ramp up or ramp down the motor of the surgical instrument for the type of tissue.


As another example, the type of tissue being operated on can affect the adjustments that are made to the compression rate and load thresholds of a surgical stapling and cutting instrument for a particular tissue gap measurement. A situationally aware surgical hub 5104 could infer whether a surgical procedure being performed is a thoracic or an abdominal procedure, allowing the surgical hub 5104 to determine whether the tissue clamped by an end effector of the surgical stapling and cutting instrument is lung (for a thoracic procedure) or stomach (for an abdominal procedure) tissue. The surgical hub 5104 could then adjust the compression rate and load thresholds of the surgical stapling and cutting instrument appropriately for the type of tissue.
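
A minimal sketch of this tissue-dependent adjustment, assuming the lung/stomach example above, follows; the procedure-to-tissue mapping and the response names are illustrative only.

```python
# Hypothetical sketch: choose a motor response to an unexpectedly high
# force to close based on the tissue type inferred from the procedure.

PROCEDURE_TO_TISSUE = {"thoracic": "lung", "abdominal": "stomach"}

def motor_response_on_high_closure_force(procedure_type):
    tissue = PROCEDURE_TO_TISSUE.get(procedure_type)
    if tissue == "lung":        # susceptible to tearing
        return "ramp_down_motor"
    if tissue == "stomach":     # resistant to tearing
        return "ramp_up_motor"
    return "hold_current_rate"  # context unknown: avoid an aggressive adjustment
```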


As yet another example, the type of body cavity being operated in during an insufflation procedure can affect the function of a smoke evacuator. A situationally aware surgical hub 5104 could determine whether the surgical site is under pressure (by determining that the surgical procedure is utilizing insufflation) and determine the procedure type. As a procedure type is generally performed in a specific body cavity, the surgical hub 5104 could then control the motor rate of the smoke evacuator appropriately for the body cavity being operated in. Thus, a situationally aware surgical hub 5104 could provide a consistent amount of smoke evacuation for both thoracic and abdominal procedures.


As yet another example, the type of procedure being performed can affect the optimal energy level for an ultrasonic surgical instrument or radio frequency (RF) electrosurgical instrument to operate at. Arthroscopic procedures, for example, require higher energy levels because the end effector of the ultrasonic surgical instrument or RF electrosurgical instrument is immersed in fluid. A situationally aware surgical hub 5104 could determine whether the surgical procedure is an arthroscopic procedure. The surgical hub 5104 could then adjust the RF power level or the ultrasonic amplitude of the generator (i.e., the “energy level”) to compensate for the fluid-filled environment. Relatedly, the type of tissue being operated on can affect the optimal energy level for an ultrasonic surgical instrument or RF electrosurgical instrument to operate at. A situationally aware surgical hub 5104 could determine what type of surgical procedure is being performed and then customize the energy level for the ultrasonic surgical instrument or RF electrosurgical instrument according to the expected tissue profile for the surgical procedure. Furthermore, a situationally aware surgical hub 5104 can be configured to adjust the energy level for the ultrasonic surgical instrument or RF electrosurgical instrument throughout the course of a surgical procedure, rather than just on a procedure-by-procedure basis. A situationally aware surgical hub 5104 could determine what step of the surgical procedure is being performed or will subsequently be performed and then update the control algorithms for the generator and/or ultrasonic surgical instrument or RF electrosurgical instrument to set the energy level at a value appropriate for the expected tissue type according to the surgical procedure step.
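
The step-by-step energy adjustment described above might be sketched as a schedule keyed by procedure and step; the step names and levels below are assumptions for illustration.

```python
# Illustrative energy schedule: (procedure, step) -> generator settings.

ENERGY_SCHEDULE = {
    ("vats_segmentectomy", "dissection"): {"modality": "ultrasonic", "level": "high"},
    ("vats_segmentectomy", "node_dissection"): {"modality": "rf", "level": "medium"},
    ("arthroscopy", "any"): {"modality": "rf", "level": "high"},  # fluid-filled field
}

def energy_for_step(procedure, step):
    """Look up the energy level for the current step, falling back to defaults."""
    return (ENERGY_SCHEDULE.get((procedure, step))
            or ENERGY_SCHEDULE.get((procedure, "any"))
            or {"modality": "rf", "level": "default"})
```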


As yet another example, data can be drawn from additional data sources 5126 to improve the conclusions that the surgical hub 5104 draws from one data source 5126. A situationally aware surgical hub 5104 could augment data that it receives from the modular devices 5102 with contextual information that it has built up regarding the surgical procedure from other data sources 5126. For example, a situationally aware surgical hub 5104 can be configured to determine whether hemostasis has occurred (i.e., whether bleeding at a surgical site has stopped) according to video or image data received from a medical imaging device. However, in some cases the video or image data can be inconclusive. Therefore, in one exemplification, the surgical hub 5104 can be further configured to compare a physiologic measurement (e.g., blood pressure sensed by a BP monitor communicably connected to the surgical hub 5104) with the visual or image data of hemostasis (e.g., from a medical imaging device 124 (FIG. 2) communicably coupled to the surgical hub 5104) to make a determination on the integrity of the staple line or tissue weld. In other words, the situational awareness system of the surgical hub 5104 can consider the physiological measurement data to provide additional context in analyzing the visualization data. The additional context can be useful when the visualization data may be inconclusive or incomplete on its own.
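
One hedged sketch of such data fusion, assuming an image-derived bleeding score and a blood pressure trend as inputs, is shown below; the thresholds are invented for illustration.

```python
# Combine inconclusive visualization data with a physiologic measurement
# to assess hemostasis, per the example above. Thresholds are illustrative.

def hemostasis_achieved(image_bleed_score, bp_trend_mmhg_per_min):
    """image_bleed_score in [0, 1]; bp_trend in mmHg per minute."""
    if image_bleed_score < 0.2:
        return True                      # imaging alone is conclusive
    if image_bleed_score > 0.8:
        return False                     # imaging alone is conclusive
    # Inconclusive imaging: use the physiologic context as a tiebreaker.
    return bp_trend_mmhg_per_min > -1.0  # stable pressure suggests hemostasis
```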


Another benefit includes proactively and automatically controlling the paired modular devices 5102 according to the particular step of the surgical procedure that is being performed to reduce the number of times that medical personnel are required to interact with or control the surgical system 5100 during the course of a surgical procedure. For example, a situationally aware surgical hub 5104 could proactively activate the generator to which an RF electrosurgical instrument is connected if it determines that a subsequent step of the procedure requires the use of the instrument. Proactively activating the energy source allows the instrument to be ready for use as soon as the preceding step of the procedure is completed.


As another example, a situationally aware surgical hub 5104 could determine whether the current or subsequent step of the surgical procedure requires a different view or degree of magnification on the display according to the feature(s) at the surgical site that the surgeon is expected to need to view. The surgical hub 5104 could then proactively change the displayed view (supplied by, e.g., a medical imaging device for the visualization system 108) accordingly so that the display automatically adjusts throughout the surgical procedure.


As yet another example, a situationally aware surgical hub 5104 could determine which step of the surgical procedure is being performed or will subsequently be performed and whether particular data or comparisons between data will be required for that step of the surgical procedure. The surgical hub 5104 can be configured to automatically call up data screens based upon the step of the surgical procedure being performed, without waiting for the surgeon to ask for the particular information.


Another benefit includes checking for errors during the setup of the surgical procedure or during the course of the surgical procedure. For example, a situationally aware surgical hub 5104 could determine whether the operating theater is set up properly or optimally for the surgical procedure to be performed. The surgical hub 5104 can be configured to determine the type of surgical procedure being performed, retrieve the corresponding checklists, product location, or setup needs (e.g., from a memory), and then compare the current operating theater layout to the standard layout for the type of surgical procedure that the surgical hub 5104 determines is being performed. In one exemplification, the surgical hub 5104 can be configured to compare the list of items for the procedure scanned by a suitable scanner, for example, and/or a list of devices paired with the surgical hub 5104 to a recommended or anticipated manifest of items and/or devices for the given surgical procedure. If there are any discrepancies between the lists, the surgical hub 5104 can be configured to provide an alert indicating that a particular modular device 5102, patient monitoring device 5124, and/or other surgical item is missing. In one exemplification, the surgical hub 5104 can be configured to determine the relative distance or position of the modular devices 5102 and patient monitoring devices 5124 via proximity sensors, for example. The surgical hub 5104 can compare the relative positions of the devices to a recommended or anticipated layout for the particular surgical procedure. If there are any discrepancies between the layouts, the surgical hub 5104 can be configured to provide an alert indicating that the current layout for the surgical procedure deviates from the recommended layout.
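
The manifest cross-check described above reduces to a set comparison; the item names below are placeholders.

```python
# Compare scanned items and paired devices against an expected manifest.

def check_manifest(scanned_items, paired_devices, expected):
    """Return an alert for each expected item that is neither scanned nor paired."""
    present = scanned_items | paired_devices
    return [f"ALERT: missing expected item '{item}'"
            for item in sorted(expected - present)]

alerts = check_manifest(
    scanned_items={"stapler", "ultrasonic_shears"},
    paired_devices={"smoke_evacuator", "insufflator"},
    expected={"stapler", "ultrasonic_shears", "smoke_evacuator",
              "insufflator", "imaging_device"},
)
# alerts -> ["ALERT: missing expected item 'imaging_device'"]
```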


As another example, a situationally aware surgical hub 5104 could determine whether the surgeon (or other medical personnel) was making an error or otherwise deviating from the expected course of action during the course of a surgical procedure. For example, the surgical hub 5104 can be configured to determine the type of surgical procedure being performed, retrieve the corresponding list of steps or order of equipment usage (e.g., from a memory), and then compare the steps being performed or the equipment being used during the course of the surgical procedure to the expected steps or equipment for the type of surgical procedure that the surgical hub 5104 determined is being performed. In one exemplification, the surgical hub 5104 can be configured to provide an alert indicating that an unexpected action is being performed or an unexpected device is being utilized at the particular step in the surgical procedure.


Overall, the situational awareness system for the surgical hub 5104 improves surgical procedure outcomes by adjusting the surgical instruments (and other modular devices 5102) for the particular context of each surgical procedure (such as adjusting to different tissue types) and validating actions during a surgical procedure. The situational awareness system also improves surgeons' efficiency in performing surgical procedures by automatically suggesting next steps, providing data, and adjusting displays and other modular devices 5102 in the surgical theater according to the specific context of the procedure.


In one aspect, as described hereinbelow with reference to FIGS. 24-40, the modular device 5102 is implemented as the surgical device 214002 (FIGS. 16-18), the display screen (FIG. 16), the wearable devices 214100, 214102, 214200, 214202 (FIGS. 19-21), and the display 214400 (FIGS. 22A and 22B). Accordingly, the modular device 5102 implemented as the surgical device 214002 (FIGS. 16-18), the display screen (FIG. 16), the wearable devices 214100, 214102, 214200, 214202 (FIGS. 19-21), or the display 214400 (FIGS. 22A and 22B) is configured to operate as a data source 5126 and to interact with the database 5122 and patient monitoring devices 5124. The modular device 5102 implemented as the surgical device 214002 (FIGS. 16-18), the display screen (FIG. 16), the wearable devices 214100, 214102, 214200, 214202 (FIGS. 19-21), and the display 214400 (FIGS. 22A and 22B) is further configured to interact with the surgical hub 5104 to provide information (e.g., data and control) to the surgical hub 5104 and receive information (e.g., data and control) from the surgical hub 5104.


Referring now to FIG. 15, a timeline 5200 depicting situational awareness of a hub, such as the surgical hub 106 or 206 (FIGS. 1-11), for example, is illustrated. The timeline 5200 depicts an illustrative surgical procedure and the contextual information that the surgical hub 106, 206 can derive from the data received from the data sources at each step in the surgical procedure. The timeline 5200 follows the typical steps that would be taken by the nurses, surgeons, and other medical personnel during the course of a lung segmentectomy procedure, beginning with setting up the operating theater and ending with transferring the patient to a post-operative recovery room.


The situationally aware surgical hub 106, 206 receives data from the data sources throughout the course of the surgical procedure, including data generated each time medical personnel utilize a modular device that is paired with the surgical hub 106, 206. The surgical hub 106, 206 can receive this data from the paired modular devices and other data sources and continually derive inferences (i.e., contextual information) about the ongoing procedure as new data is received, such as which step of the procedure is being performed at any given time. The situational awareness system of the surgical hub 106, 206 is able to, for example, record data pertaining to the procedure for generating reports, verify the steps being taken by the medical personnel, provide data or prompts (e.g., via a display screen) that may be pertinent for the particular procedural step, adjust modular devices based on the context (e.g., activate monitors, adjust the field of view (FOV) of the medical imaging device, or change the energy level of an ultrasonic surgical instrument or RF electrosurgical instrument), and take any other such action described above.


As the first step 5202 in this illustrative procedure, the hospital staff members retrieve the patient's EMR from the hospital's EMR database. Based on select patient data in the EMR, the surgical hub 106, 206 determines that the procedure to be performed is a thoracic procedure.


Second step 5204, the staff members scan the incoming medical supplies for the procedure. The surgical hub 106, 206 cross-references the scanned supplies with a list of supplies that are utilized in various types of procedures and confirms that the mix of supplies corresponds to a thoracic procedure. Further, the surgical hub 106, 206 is also able to determine that the procedure is not a wedge procedure (because the incoming supplies either lack certain supplies that are necessary for a thoracic wedge procedure or do not otherwise correspond to a thoracic wedge procedure).


Third step 5206, the medical personnel scan the patient band via a scanner that is communicably connected to the surgical hub 106, 206. The surgical hub 106, 206 can then confirm the patient's identity based on the scanned data.


Fourth step 5208, the medical staff turns on the auxiliary equipment. The auxiliary equipment being utilized can vary according to the type of surgical procedure and the techniques to be used by the surgeon, but in this illustrative case it includes a smoke evacuator, insufflator, and medical imaging device. When activated, any auxiliary equipment that is a modular device can automatically pair with the surgical hub 106, 206 that is located within a particular vicinity of the modular device as part of its initialization process. The surgical hub 106, 206 can then derive contextual information about the surgical procedure by detecting the types of modular devices that pair with it during this pre-operative or initialization phase. In this particular example, the surgical hub 106, 206 determines that the surgical procedure is a VATS procedure based on this particular combination of paired modular devices. Based on the combination of the data from the patient's EMR, the list of medical supplies to be used in the procedure, and the type of modular devices that connect to the hub, the surgical hub 106, 206 can generally infer the specific procedure that the surgical team will be performing. Once the surgical hub 106, 206 knows what specific procedure is being performed, the surgical hub 106, 206 can then retrieve the steps of that procedure from a memory or from the cloud and then cross-reference the data it subsequently receives from the connected data sources (e.g., modular devices and patient monitoring devices) to infer what step of the surgical procedure the surgical team is performing.
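
A simple evidence-scoring sketch of this inference, with invented candidate procedures and evidence sets, follows.

```python
# Infer the specific procedure from combined EMR, supply, and device evidence.
# Candidate procedures and their expected evidence are illustrative only.

CANDIDATES = {
    "vats_segmentectomy": {"thoracic", "segmentectomy_supplies",
                           "smoke_evacuator", "insufflator", "imaging_device"},
    "vats_lobectomy": {"thoracic", "lobectomy_supplies",
                       "smoke_evacuator", "insufflator", "imaging_device"},
}

def infer_procedure(emr_hints, supplies, paired_devices):
    """Score each candidate by how much of its expected evidence is observed."""
    evidence = emr_hints | supplies | paired_devices
    return max(CANDIDATES, key=lambda p: len(CANDIDATES[p] & evidence))
```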


Fifth step 5210, the staff members attach the EKG electrodes and other patient monitoring devices to the patient. The EKG electrodes and other patient monitoring devices are able to pair with the surgical hub 106, 206. As the surgical hub 106, 206 begins receiving data from the patient monitoring devices, the surgical hub 106, 206 thus confirms that the patient is in the operating theater.


Sixth step 5212, the medical personnel induce anesthesia in the patient. The surgical hub 106, 206 can infer that the patient is under anesthesia based on data from the modular devices and/or patient monitoring devices, including EKG data, blood pressure data, ventilator data, or combinations thereof, for example. Upon completion of the sixth step 5212, the pre-operative portion of the lung segmentectomy procedure is completed and the operative portion begins.


Seventh step 5214, the patient's lung that is being operated on is collapsed (while ventilation is switched to the contralateral lung). The surgical hub 106, 206 can infer from the ventilator data that the patient's lung has been collapsed, for example. The surgical hub 106, 206 can infer that the operative portion of the procedure has commenced as it can compare the detection of the patient's lung collapsing to the expected steps of the procedure (which can be accessed or retrieved previously) and thereby determine that collapsing the lung is the first operative step in this particular procedure.


Eighth step 5216, the medical imaging device (e.g., a scope) is inserted and video from the medical imaging device is initiated. The surgical hub 106, 206 receives the medical imaging device data (i.e., video or image data) through its connection to the medical imaging device. Upon receipt of the medical imaging device data, the surgical hub 106, 206 can determine that the laparoscopic portion of the surgical procedure has commenced. Further, the surgical hub 106, 206 can determine that the particular procedure being performed is a segmentectomy, as opposed to a lobectomy (note that a wedge procedure has already been discounted by the surgical hub 106, 206 based on data received at the second step 5204 of the procedure). The data from the medical imaging device 124 (FIG. 2) can be utilized to determine contextual information regarding the type of procedure being performed in a number of different ways, including by determining the angle at which the medical imaging device is oriented with respect to the visualization of the patient's anatomy, monitoring the number of medical imaging devices being utilized (i.e., that are activated and paired with the surgical hub 106, 206), and monitoring the types of visualization devices utilized. For example, one technique for performing a VATS lobectomy places the camera in the lower anterior corner of the patient's chest cavity above the diaphragm, whereas one technique for performing a VATS segmentectomy places the camera in an anterior intercostal position relative to the segmental fissure. Using pattern recognition or machine learning techniques, for example, the situational awareness system can be trained to recognize the positioning of the medical imaging device according to the visualization of the patient's anatomy. As another example, one technique for performing a VATS lobectomy utilizes a single medical imaging device, whereas another technique for performing a VATS segmentectomy utilizes multiple cameras. As yet another example, one technique for performing a VATS segmentectomy utilizes an infrared light source (which can be communicably coupled to the surgical hub as part of the visualization system) to visualize the segmental fissure, which is not utilized in a VATS lobectomy. By tracking any or all of this data from the medical imaging device, the surgical hub 106, 206 can thereby determine the specific type of surgical procedure being performed and/or the technique being used for a particular type of surgical procedure.


Ninth step 5218, the surgical team begins the dissection step of the procedure. The surgical hub 106, 206 can infer that the surgeon is in the process of dissecting to mobilize the patient's lung because it receives data from the RF or ultrasonic generator indicating that an energy instrument is being fired. The surgical hub 106, 206 can cross-reference the received data with the retrieved steps of the surgical procedure to determine that an energy instrument being fired at this point in the process (i.e., after the completion of the previously discussed steps of the procedure) corresponds to the dissection step. In certain instances, the energy instrument can be an energy tool mounted to a robotic arm of a robotic surgical system.


Tenth step 5220, the surgical team proceeds to the ligation step of the procedure. The surgical hub 106, 206 can infer that the surgeon is ligating arteries and veins because it receives data from the surgical stapling and cutting instrument indicating that the instrument is being fired. Similarly to the prior step, the surgical hub 106, 206 can derive this inference by cross-referencing the receipt of data from the surgical stapling and cutting instrument with the retrieved steps in the process. In certain instances, the surgical instrument can be a surgical tool mounted to a robotic arm of a robotic surgical system.


Eleventh step 5222, the segmentectomy portion of the procedure is performed. The surgical hub 106, 206 can infer that the surgeon is transecting the parenchyma based on data from the surgical stapling and cutting instrument, including data from its cartridge. The cartridge data can correspond to the size or type of staple being fired by the instrument, for example. As different types of staples are utilized for different types of tissues, the cartridge data can thus indicate the type of tissue being stapled and/or transected. In this case, the type of staple being fired is utilized for parenchyma (or other similar tissue types), which allows the surgical hub 106, 206 to infer that the segmentectomy portion of the procedure is being performed.


Twelfth step 5224, the node dissection step is then performed. The surgical hub 106, 206 can infer that the surgical team is dissecting the node and performing a leak test based on data received from the generator indicating that an RF or ultrasonic instrument is being fired. For this particular procedure, an RF or ultrasonic instrument being utilized after parenchyma was transected corresponds to the node dissection step, which allows the surgical hub 106, 206 to make this inference. It should be noted that surgeons regularly switch back and forth between surgical stapling/cutting instruments and surgical energy (i.e., RF or ultrasonic) instruments depending upon the particular step in the procedure because different instruments are better adapted for particular tasks. Therefore, the particular sequence in which the stapling/cutting instruments and surgical energy instruments are used can indicate what step of the procedure the surgeon is performing. Moreover, in certain instances, robotic tools can be utilized for one or more steps in a surgical procedure and/or handheld surgical instruments can be utilized for one or more steps in the surgical procedure. The surgeon(s) can alternate between robotic tools and handheld surgical instruments and/or can use the devices concurrently, for example. Upon completion of the twelfth step 5224, the incisions are closed up and the post-operative portion of the procedure begins.


Thirteenth step 5226, the patient's anesthesia is reversed. The surgical hub 106, 206 can infer that the patient is emerging from the anesthesia based on the ventilator data (i.e., the patient's breathing rate begins increasing), for example.


Lastly, the fourteenth step 5228 is that the medical personnel remove the various patient monitoring devices from the patient. The surgical hub 106, 206 can thus infer that the patient is being transferred to a recovery room when the hub loses EKG, BP, and other data from the patient monitoring devices. As can be seen from the description of this illustrative procedure, the surgical hub 106, 206 can determine or infer when each step of a given surgical procedure is taking place according to data received from the various data sources that are communicably coupled to the surgical hub 106, 206.


In various aspects, the surgical device 214002 (FIGS. 16-18), the display screen (FIG. 16), the wearable devices 214100, 214102, 214200, 214202 (FIGS. 19-21), and the display 214400 (FIGS. 22A and 22B) are configured to operate with situational awareness in a hub environment, such as the surgical hub 106 or 206 (FIGS. 1-11), for example, as depicted by the timeline 5200. Situational awareness is further described in U.S. Provisional Patent Application Ser. No. 62/659,900, titled METHOD OF HUB COMMUNICATION, filed Apr. 19, 2018, which is herein incorporated by reference in its entirety. In certain instances, operation of a robotic surgical system, including the various robotic surgical systems disclosed herein, for example, can be controlled by the hub 106, 206 based on its situational awareness and/or feedback from the components thereof and/or based on information from the cloud 104.


Control Through Surgical Barriers

In various instances, digital surgical devices can be controlled through one or more surgical barriers. Surgical barriers include physical sterile barriers, intangible sterile barriers, and the walls of a patient's body. In one aspect, a first operating room system located within a sterile field can be indirectly commanded and controlled through the use of a second operating room system, which can have a primary operating mode and a secondary operating mode, which is designed for interacting with the first operating room system to provide commands and control. For example, in the primary operating mode, the second operating room system can effect a primary surgical function. In the secondary operating mode, the primary surgical function can be disabled. Moreover, in the secondary operating mode, the second operating room system can command and control the first operating room system. For example, a surgical instrument can be configured to effect tissue in the first operating mode and to command and control an imaging system and/or a display thereof in the second operating mode.


In certain instances, a clinician may want to provide input to a remote surgical system and/or to a surgical hub communicatively coupled to the remote surgical system from within the sterile field. For example, a clinician holding a first operating room system (e.g. a surgical device) may want to provide commands or inputs to another operating room system (e.g. an imaging or visualization system and/or display thereof) positioned outside the sterile field. In certain instances, it can be desirable to utilize the first operating room system to communicate or interact with the other operating room system through a surgical barrier, such as the boundary of the sterile field and/or a patient's body.


To facilitate such an interaction, the first operating room system can include a plurality of operating modes including a primary mode and a secondary mode. The first operating room system can switch between operating modes to selectively interact with the second operating room system. In such instances, the clinician can interact with the second operating room system without handing off or setting down the first operating room system and/or without removing the first operating room system from the surgical site.


For example, a surgical system can include a first device comprising a first control circuit and a second device configured to effect a surgical function, wherein the second device comprises a second control circuit in signal communication with the first control circuit, and wherein the second control circuit is configured to selectively toggle the second device between a secondary operating mode, in which the second device is configured to indirectly control the first device, and a primary operating mode, in which the second device is configured to control the surgical function. Each control circuit can include a processor and a memory communicatively coupled with the processor, wherein the memory stores instructions executable by the processor to receive an input signal. In response to the input signal, the second control circuit can switch between the primary operating mode and the secondary operating mode. In the primary operating mode, the second control circuit can actuate a surgical function, for example. When in the secondary operating mode, the second control circuit can control a display screen, for example. In various instances, a non-transitory computer readable medium can store computer readable instructions which, when executed, cause a surgical device to receive an input signal and, in response, switch between the primary operating mode, in which the second control circuit can actuate a surgical function, and the secondary operating mode, in which the second control circuit can control a display screen, for example.
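
A minimal sketch of such a two-mode control circuit follows; the class and method names are hypothetical, as the disclosure does not prescribe a particular software interface.

```python
# Two-mode control circuit: the same actuator input is routed to the
# surgical function in the primary mode and to display control in the
# secondary mode. Names are illustrative assumptions.

from enum import Enum, auto

class Mode(Enum):
    PRIMARY = auto()    # device effects its surgical function
    SECONDARY = auto()  # device indirectly controls another system

class DeviceControlCircuit:
    def __init__(self):
        self.mode = Mode.PRIMARY

    def on_mode_switch_input(self):
        """Toggle modes in response to an input signal (e.g., a manual switch)."""
        self.mode = Mode.SECONDARY if self.mode == Mode.PRIMARY else Mode.PRIMARY

    def on_actuator_pressed(self):
        if self.mode == Mode.PRIMARY:
            self.actuate_surgical_function()     # e.g., deliver ultrasonic energy
        else:
            self.send_display_command("select")  # e.g., click the pointed-at icon

    def actuate_surgical_function(self):
        ...  # drive the end effector

    def send_display_command(self, command):
        ...  # forward the input to the display screen / surgical hub
```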


In one aspect, a surgical device can interface with a primary display to adjust and/or control the primary display. For example, the primary display can be controlled through a visual interface using a secondary control function of the surgical tool. Stated differently, the surgical device can be depicted on the primary display and can act as a cursor or indicator on the display to interact with a user interface of the display. Moreover, inputs to the surgical device can be inputs to the display and/or a surgical hub including the display. In such instances, the surgical device can be an input device for an interaction technique, user interface technique, or input technique, which uses a combination of hardware and software to allow a computer (or control circuit thereof) to perform a task. The output can be displayed on the primary display and/or communicated to a surgical hub, much as a mouse click and/or selection provides an input command that can be displayed or otherwise communicated via a computer monitor, for example.


Referring to FIGS. 16-18, a surgical system 214000 includes a surgical device 214002 and a display screen 214004. The display screen 214004 (FIG. 16) can be incorporated into an imaging system. For example, the imaging system can include an endoscope housing an endoscopic camera and the display screen 214004, which is configured to display the images obtained by the endoscopic camera. An endoscope is further described in U.S. patent application Ser. No. 11/277,290, titled DISPOSABLE ENDOSCOPE DEVICES, filed Mar. 23, 2006, now U.S. Patent Pub. No. 2007/0225556, which is incorporated by reference herein in its entirety. Reusable endoscopes are also contemplated.


In various instances, the display screen 214004 can be a video monitor, which is operably configured to display a live-feed of images from the surgical site. The display screen 214004 can depict a live, real-time video of the surgical site during a surgical procedure. Additionally or alternatively, the display screen 214004 can be configured to display an augmented reality view of the surgical site. For example, the display screen 214004 can depict hidden anatomical structures and/or hidden surgical devices. Such an augmented reality view can be toggled on and off, for example. The display screen 214004 also includes a graphical user interface. An operator can interact with the graphical user interface to provide input commands or controls to the display system and/or a surgical system communicatively coupled to the display system, as further described herein.


The surgical device 214002 is a handheld surgical instrument including a handle 214006, an elongate shaft 214008 extending distally from the handle 214006, and an end effector 214010 extending distally from the elongate shaft 214008. The end effector 214010 is configured to effect tissue. In one instance, the handheld surgical device can be an ultrasonic device. In such instances, the end effector 214010 can include an ultrasonic blade. A clamp arm can be positioned opposite the ultrasonic blade to facilitate clamping of tissue against the ultrasonic blade in various instances. Additionally or alternatively, the end effector 214010 can include tissue-contacting electrodes that are configured to deliver RF current to the tissue. In certain instances, the end effector 214010 can include a reciprocating knife, stapler, clip applier, and/or grasper, for example. In certain instances, the surgical device 214002 can include a housing that can be releasably coupled to a robotic arm. In such instances, the surgical device 214002 can be controlled by a clinician at a surgeon's console for the robotic surgical system.


The surgical device 214002 is configured to switch between a first mode and a second mode. The first mode can be an operational mode in which the surgical device 214002 is configured to perform a surgical function. For example, in the first mode, the surgical device 214002 can be configured to apply vibrational energy to tissue. The second mode can be a cursor mode or control mode, in which the surgical device 214002 can be configured to provide inputs to the display screen 214004. The inputs can be configured to adjust the information displayed on the display screen 214004 and/or provide inputs to a connected surgical system, such as a surgical hub like the surgical hubs 106 and 206 (FIGS. 1-11), the surgical hub 7006 (FIGS. 12 and 13), and the surgical hub 5104 (FIG. 14), for example.


Referring primarily to FIG. 17, the surgical device 214002 extends through a surgical barrier 214012 into the surgical site. The surgical barrier 214012 is an anatomical wall of a patient. The surgical device 214002 can also extend through a sterile field boundary that forms another surgical barrier. At the surgical site, an imaging system is configured to obtain views of the surgical site and the surgical device 214002. For example, a distal portion of the surgical device 214002, i.e. the grasping tips/jaws of the end effector 214010 and a portion of the shaft 214008, appear on the display screen 214004. In a first mode, the end effector 214010 can perform a surgical function, such as engaging, effecting, and/or treating tissue. The first mode can be considered a primary mode in that the end effector 214010 is performing its primary function as a surgical tool (e.g. a grasper can grasp tissue, an energy device can apply energy to tissue, a stapler can staple tissue, and so on). In the primary mode, the end effector 214010 is configured to directly effect tissue.


The end effector 214010 can toggle or switch between the primary mode and the second mode, which can be referred to as a secondary mode. The selected mode can be displayed on the display screen 214004. For example, the display screen 214004 in FIG. 16 shows a cursor mode icon 214052 to indicate that the end effector 214010 is acting as a cursor in a secondary mode. In the secondary mode, the end effector 214010 can be used to interact with the display screen 214004 and indirectly control the visualization system or a surgical system communicatively coupled thereto. The end effector 214010 shown on the display screen 214004 can function as a pointer or cursor for the graphical interface shown on the display screen 214004. In such instances, the clinician does not have to take his or her hands off of the surgical device 214002 or remove the surgical device 214002 from the patient's body to engage the display screen 214004 and provide input to the display screen 214004. In various instances, the camera of the imaging device (e.g. a laparoscopic camera) can be configured to track movement of the surgical devices at the surgical site. For example, the camera can scan or otherwise adjust its field of view to follow one or more surgical devices (or portions thereof) around the surgical site. In various instances, the clinician can select which surgical device(s) and/or portion(s) thereof are tracked by the camera. In such instances, the camera can track the end effector 214010, which can ensure the end effector 214010 is depicted on the display screen 214004 when the surgical device 214002 is in the secondary mode.


In the secondary mode, the surgical device 214002 can be used as an input device like a computer mouse or joystick, for example, to move a cursor around the interface on the display screen 214004 to manipulate the functions shown on the display screen 214004. When the displayed portion of the device is utilized as a cursor, the device tip (i.e., “cursor”) can press buttons on the display screen 214004, and drag and drop display items, circle and/or highlight a portion of the video (e.g., point out the tumor or unique anatomical features) to be referenced later. For example, the device tip can select a home screen icon 214040 to return to a home screen, a play icon 214042 to play a video of the surgical procedure obtained by an endoscope, a pause icon 214044 to pause the video, a rewind icon 214046 to re-watch or replay a portion of the video, a record icon 214048 to record a new portion of the surgical procedure, and/or a setting icon 214050 to adjust a setting and/or view on the display screen 214004. In one aspect, activating the secondary mode (or cursor control mode) can change the way the tool functions while in this mode. For example, button(s), trigger(s) and/or other actuator(s) can have different functions in different modes.


The operating mode of the surgical device 214002 can be selected by the clinician. For example, the clinician can selectively toggle or switch the surgical device 214002 between the primary mode and the secondary mode. In one instance, the secondary mode can be activated by a voice command. In another instance, a tactile action by a clinician can activate the secondary mode. Referring primarily to FIG. 18, the surgical device 214002 includes a manual switch 214018, which can enable the clinician to switch between operating modes. A first position of the manual switch 214018 can correspond to the primary mode, and a second position of the manual switch 214018 can correspond to the secondary mode. The clinician can move the manual switch 214018 to toggle the surgical device 214002 between operating modes.


In one example, the surgical device 214002 is an ultrasonic surgical instrument like the HARMONIC ACE® shears by Ethicon Endo-Surgery, LLC. Such an ultrasonic surgical instrument can include a plurality of input actuators, such as maximum and minimum power buttons 214020 and 214022. The maximum power button 214020 can generate ultrasonic energy at a first energy level, or within a first range, in the primary mode, and the minimum power button 214022 can generate ultrasonic energy at a second energy level, or in a second range, in the primary mode. The second energy level, or second range, can be less than the first energy level, or first range. In various instances, the buttons 214020, 214022 can define a range of positions corresponding to different levels and/or can detect the operator's force and adjust the energy level accordingly. For example, at least one of the buttons 214020, 214022 can define a rotary element to scroll between levels and/or selections, similar to a rotary wheel on a computer mouse, for example.


In the secondary mode, the ultrasonic sealing and cutting function can be disabled and the maximum and minimum power buttons 214020 and 214022 on the surgical device 214002 can act like buttons of a computer mouse. The clinician can point the tip of the surgical device 214002 (as displayed on the display screen 214004) at something displayed on the display screen 214004 and the buttons can then interact with the display screen 214004, as described above. The dual-button input features of the surgical device 214002 can be intuitive to a clinician familiar with a two-button computer mouse, for example. In various instances, the two buttons can be used to select icons, drag and drop icons, and/or adjust and interact with various features of the display screen 214004, as further described herein.
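
One way to express this remapping is a per-mode button table; the mappings below are assumptions consistent with the two-button mouse analogy above.

```python
# Per-mode remapping of the instrument's two power buttons.

BUTTON_MAP = {
    "primary": {"max_power_button": "fire_max_energy",
                "min_power_button": "fire_min_energy"},
    "secondary": {"max_power_button": "cursor_left_click",    # select / drag
                  "min_power_button": "cursor_right_click"},  # options / menus
}

def handle_button(mode, button):
    """Resolve a physical button press to an action for the current mode."""
    return BUTTON_MAP[mode][button]
```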


In various instances, the display screen 214004 is communicatively coupled to a surgical hub, such as the surgical hubs 106 or 206 (FIGS. 1-11), the surgical hub 7006 (FIGS. 12 and 13), or the surgical hub 5104 (FIG. 14), for example. In one instance, the primary display 119 for the visualization system 108 (see FIG. 2) can be the display screen 214004. In such instances, the visualization system 108 and the display screen 214004 thereof can be coupled to the imaging module 138 (see FIG. 3) of the surgical hub 106. Moreover, the situational awareness of the surgical hub 106 can be configured to implement intelligent adjustments and/or recommendations to a clinician during a surgical procedure. For example, a situational awareness module of the surgical hub 106 can suggest that a clinician confirm a complete tissue seal, staple line alignment, and/or tumor removal, and, in response to the recommendation, the clinician can switch the surgical device 214002 from the primary mode to the secondary mode to interact with the display screen 214004. The clinician may interact with the display screen 214004 by zooming in on the tissue seal, augmenting the view on the display screen 214004 to check the tissue seal, and/or selecting a measurement tool from the graphical interface to measure the length of the tissue seal and/or another anatomical structure/landmark distance, for example. The clinician can respond to the surgical hub's prompt/recommendation without removing the surgical device 214002 from the surgical site and without taking his or her hands off the surgical device 214002.


In various instances, the clinician can interact with the display screen 214004 to control and/or provide input to the surgical hub. In instances in which the display screen 214004 is positioned outside of the sterile field, like the primary display 119 in FIG. 2, for example, the clinician can exercise control of the display screen 214004 outside of the sterile field through various surgical barriers, including the sterile field boundary and the patient's body. In other words, the clinician can indirectly control and/or command a second operating room system outside the sterile field with a first operating room system from completely within the sterile field.


In various aspects, a wearable device, i.e. a “wearable”, can be configured to facilitate interaction with one or more communicatively coupled devices. For example, a clinician's wearable device within a sterile field can be used to interact with a surgical system outside of the sterile field. The wearable device can be an interactive device that is configured to interact with a remote system. In various instances, the wearable device can identify the wearer, i.e. the clinician wearing the device, and can identify surgical device(s) within a range of positions around the wearable device. Such a wearable device can determine if and how a clinician is holding a particular surgical device, for example. A wrist-worn wearable 214100 is shown in FIG. 20, and a finger-worn wearable 214102 is shown in FIG. 21. The wrist-worn wearable 214100 can be attached to the clinician's wrist “W” like a watch or a bracelet, for example. The finger-worn wearable 214102 can be attached to the clinician's thumb “T” or a finger “F” on the clinician's hand like a ring, for example.


The wearables 214100, 214102 each include a communication module 214104, which facilitates communication between the wearable 214100, 214102 and another surgical system. The communication modules 214104 can enable RFID tagging and/or near-field communication. The communication modules 214104 can emit a wireless signal 214106 indicative of a command or control from the clinician. For example, the wearables 214100 and 214102 can include a graphical user interface and/or a touchscreen. The clinician can engage the touchscreen or otherwise provide inputs to the graphical user interface to implement an adjustment to another surgical device. In one instance, the wearables 214100 and/or 214102 can be in signal communication with a visualization system and/or a display screen thereof. For example, the clinician can engage arrows 214110 on the user interface to pan the view on a primary display in a surgical theater, can adjust a zoom feature 214112 (FIG. 20) or 214113 (FIG. 21) to enlarge or reduce a view on the primary display, and/or can select one or more icons 214114, 214116 on the user interface to adjust a view and/or data displayed on the display screen. Alternative graphics and input features are contemplated.


In one aspect, wearable devices can assist the clinician in interacting with displays, such as a primary display like the primary display 119 (FIG. 2) located outside the sterile field. For example, a wearable device can allow input from a surgeon to select, advance, re-size, and gesture with respect to a graphical user interface on the wearable device to adjust or control the primary display. For example, inputting a zoom-in operation on the wearable device can enlarge a portion of the video feed on the primary display. In one instance, the wearable device can include a simplified and/or minimized version of the information on the primary display, similar to a mobile device page displaying a simplified and/or minimized version of the information available on a desktop site. The wearable device can include a graphical user interface and a touchscreen, such that inputs to the graphical user interface on the touchscreen can be communicated to the primary display. In one such implementation, a clinician can interact with a sealed capacitive interactive surface on the wearable device, which can be worn like a watch or a wristband, with capacitive-infused latex gloves, for example.


In various instances, the wearable device can be tracked with image and/or object recognition techniques. For example, the image and/or video data can be processed utilizing a variety of machine vision, image processing, object recognition, and optical tracking techniques to track characteristics, properties, actions, and movements of the wearable device. For example, the wearable device can be recognized from images captured by the one or more cameras in the surgical theater utilizing a variety of image and/or object recognition techniques, including appearance and feature-based techniques. For example, the captured images can be processed utilizing an edge detection algorithm (e.g., a Canny edge detector algorithm) to generate outlines of the various objects within each image. An algorithm can then compare the templates of target objects (e.g. the target wearable device(s)) to the images containing the outlined objects to determine whether any of the target objects are located within the images. As another example, an algorithm can extract features from the captured images. The extracted features can then be fed to a machine learning model (e.g., an artificial neural network or a support vector machine) trained via supervised or unsupervised learning techniques to correlate a feature vector to the targets. The features can include edges (extracted via a Canny edge detector algorithm, for example), curvature, corners (extracted via a Harris & Stephens corner detector algorithm, for example), and so on. Object recognition and tracking are further described in contemporaneously-filed U.S. patent application Ser. No. 16/182,255, titled USAGE AND TECHNIQUE ANALYSIS OF SURGEON/STAFF PERFORMANCE AGAINST A BASELINE TO OPTIMIZE DEVICE UTILIZATION AND PERFORMANCE FOR BOTH CURRENT AND FUTURE PROCEDURES, and in contemporaneously-filed U.S. patent application Ser. No. 16/182,269, titled IMAGE CAPTURING OF THE AREAS OUTSIDE THE ABDOMEN TO IMPROVE PLACEMENT AND CONTROL OF A SURGICAL DEVICE IN USE, which are incorporated by reference herein in their respective entireties.
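
A short sketch of the appearance-based pipeline, assuming the OpenCV library for the Canny edge detection and template matching steps, is shown below; the threshold values are illustrative.

```python
# Locate a target object (e.g., a wearable) in a camera frame by comparing
# its edge template against the edge map of the frame.

import cv2

def find_wearable(frame_gray, template_edges, match_threshold=0.7):
    frame_edges = cv2.Canny(frame_gray, 100, 200)  # outline objects in the frame
    result = cv2.matchTemplate(frame_edges, template_edges, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    # Return the (x, y) of the best match if it is confident enough.
    return max_loc if max_val >= match_threshold else None
```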


In certain instances, the wearable device can include an array of magnetic elements, which can be detected by a sensor within the surgical theater. Based on the position(s) of the wearable device detected by the sensor, the surgical system can determine movement of the wearable device. The movement can correspond to gestures by the clinician, for example. In such instances, the surgical system can determine one or more gestures by the clinician and, in various instances, such gestures can be communicated to another surgical system, such as an imaging system. For example, the gestures can correspond to input commands to a display screen of the imaging system. Magnetic sensing arrays are further described in contemporaneously-filed U.S. patent application Ser. No. 16/182,269, titled IMAGE CAPTURING OF THE AREAS OUTSIDE THE ABDOMEN TO IMPROVE PLACEMENT AND CONTROL OF A SURGICAL DEVICE IN USE, which is incorporated by reference herein in its entirety.


As another example, a wearable device can enable tracking of surgical devices by pairing with the surgical devices during a hand-off, such as when a handheld surgical instrument is picked up and/or handed to a clinician, for example. Referring primarily to FIG. 19, a clinician C is wearing two wearable devices: a first wearable device 214200 is on the clinician's right wrist RW, and a second wearable device 214202 is on the clinician's left wrist LW. The wearable devices 214200, 214202 can be configured to determine which surgical device is positioned in which hand of the clinician. For example, the first wearable device 214200 can pair with a first surgical device 214210 in the clinician's right hand, and the second wearable device 214202 can pair with a second surgical device 214212 when the second surgical device 214212 is handed to the clinician. In one implementation, the wearable device 214200 and/or 214202 can include a built-in RFID tag or near-field communication device that allows a handle within the hand of the clinician to recognize which clinician is holding the surgical device such that the surgical device can be automatically paired or re-paired to the clinician. For example, the clinician's desired settings and/or most-frequently used features/adjustments specific to that particular surgical device can appear on the screen of the wearable device and/or be highlighted on the screen. Although wrist-worn devices are shown in FIG. 19, the reader will appreciate that alternative devices, such as rings and/or gloves, for example, are contemplated. Moreover, in certain instances, the clinician may only wear a wearable device on one wrist, such as the wrist of his or her dominant side, for example. In various instances, the wearable device can determine the orientation of the surgical tool with respect to the wearable device and, thus, with respect to the clinician's hand. In such instances, the surgical system may adjust the displays and controls based on the detected orientation of the surgical device in the clinician's hand.
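
A hypothetical pairing sketch follows; the identifiers and the apply_clinician_settings() hub call are invented for illustration.

```python
# Pair a wearable with the instrument whose handle reads its RFID/NFC tag,
# then load that clinician's settings. All names are illustrative.

class Wearable:
    def __init__(self, clinician_id, hand):
        self.clinician_id = clinician_id
        self.hand = hand          # "left" or "right"
        self.paired_device = None

    def on_tag_read(self, device_id, surgical_hub):
        """Called when an instrument handle reads this wearable's tag."""
        self.paired_device = device_id
        # The hub can now surface this clinician's preferred settings for
        # the device, e.g., on the wearable's screen.
        surgical_hub.apply_clinician_settings(device_id, self.clinician_id, self.hand)
```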


For example, a first arrangement of controls and/or selections can be used for a right-handed clinician and a second, different arrangement of controls and/or selections can be used for a left-handed clinician. Additionally or alternatively, the arrangement of controls and/or availability of certain controls can depend on whether the surgical device is positioned in the clinician's dominant hand or non-dominant hand and/or based on the situational awareness from a surgical hub. In various instances, a wearable device can identify the clinician, for example, and thus can include clinician-customized settings, including identification of the clinician's dominant hand, for example.


In various instances, the control of a surgical device can be shared by different control devices and/or the control of the surgical device can switch between multiple control devices. For example, a surgical device can include an autonomous control mode, in which the control inputs are provided by the device itself. For example, a clinician can engage an actuator on the surgical device (e.g. a button, switch, toggle, trigger, etc.) to actuate a surgical function (e.g. energy activation, clamping, firing, etc.) of the surgical device. Handheld surgical tools and robotic surgical tools can operate in an autonomous control mode, for example. For robotic surgical tools, the input control(s) can be at the surgeon's control console. For a handheld surgical tool, the input control(s) can be on the handle, for example.


In certain instances, the surgical device can be controlled and/or subjugated by another surgical device, which can selectively issue control inputs to the surgical device. In such instances, the surgical device can be referred to as a "controlled surgical device" and the other surgical device can be referred to as a "controlling surgical device." In one aspect, a mobile device having wireless communication features, such as a smart phone or tablet, for example, can be a controlling surgical device, which is selectively configured to provide control inputs to a controlled surgical device. Such a mobile device can be positioned in the sterile field. Additionally or alternatively, a wearable device can be a controlling surgical device, which is configured to provide control inputs to a controlled surgical device, as further described herein. In still other instances, a controlling surgical device, which can be paired and/or communicatively coupled to a controlled surgical device via a surgical hub, for example, can be configured to provide input controls to the controlled surgical device. In other instances, a display screen can be a controlling surgical device, which is configured to provide input controls to a controlled surgical device, and/or a controlling surgical device can interact with a visualization system (e.g. as a cursor on a display screen) to provide input controls to the visualization system and/or another connected surgical device, i.e. the controlled surgical device(s). In the foregoing examples, the controlled surgical device can maintain at least some degree of autonomous control while the controlling surgical device(s) selectively exercise varying degrees of control over the controlled surgical device. In other instances, the input controls of the controlled surgical device can be disabled when control by a controlling surgical device is enabled and/or activated. In various instances, multiple surgical devices (including the controlled surgical device itself in certain instances) can simultaneously share control over the controlled surgical device. The reader will appreciate that various control interactions are contemplated in instances in which a surgical hub couples multiple surgical devices together into a cooperative surgical system.


In instances in which multiple surgical devices share control and/or alternate between control functionalities, a clinician may want to know which controlling surgical devices have control over a controlled surgical device at a particular time during the surgical procedure. For example, a surgical system can provide tactile, audible, and/or visual indications to the clinician regarding the control mode.


In one aspect, the surgical system can provide various tactile, audio, and/or visual cues to the clinicians via a user interface, for example. The surgical system can highlight, emphasize, or otherwise bring attention to display cues on the user interface. Additionally or alternatively, the surgical system can nest or overlay data and key information. For example, certain information can be provided with an augmented reality view on the user interface and/or over a live feed of images, video, or other real-time data obtained by the surgical system. In another aspect, the surgical system can provide reinforced indication of a control function and/or a limitation thereof. For example, when control by a surgical device is disabled or otherwise not viable, the surgical device, control, or display can vibrate to communicate that the control input is not accepted.
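

As a rough illustration, the sketch below maps a rejected control input to tactile, audible, and/or visual cues; the feedback channel names and messages are assumptions.

```python
# Illustrative sketch: when a control input is not viable, the system emits
# tactile, audible, and/or visual cues. Channel names are assumed.
def notify_rejected(channels):
    cues = {
        "tactile": "vibrate handle",
        "audible": "emit short beep",
        "visual": "flash 'control disabled' banner on user interface",
    }
    for channel in channels:
        print(cues.get(channel, "unknown channel"))

def submit_input(control_enabled: bool, channels=("tactile", "visual")):
    if not control_enabled:
        notify_rejected(channels)  # reinforced indication of the limitation
        return False
    print("input accepted")
    return True

submit_input(control_enabled=False)
```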


As an example, if situational awareness indicates that a gross motor step is being performed during a surgical procedure, a clinician within the sterile field can be enabled to manipulate the position of a controlled surgical instrument via a controlling surgical device from within the sterile field, such as with a mobile tablet computer or wearable device located within the sterile field. However, if situational awareness indicates that a fine motor step is being performed during the surgical procedure, the control functionality of the controlling surgical device can be disabled such that only a clinician operating the surgical device, such as the clinician at the surgeon's console, for example, can provide input controls to the controlled surgical device. In such instances, the controlling surgical device can provide a tactile, auditory, and/or visual notice to the clinician within the sterile field to indicate that such control features are disabled. For example, the controlling surgical device can simply enter a “sleep” mode such that inputs cannot be provided by the sterile field clinician. In certain instances, the controlling surgical device can provide a verbal notice and/or beep, for example, to communicate that the desired control functionality is not viable. Additionally or alternatively, the controlling surgical device can vibrate or otherwise provide haptic feedback when the clinician in the sterile field attempts to provide a non-viable control input.
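

The following sketch illustrates such situational-awareness gating under assumed step and source names: sterile-field controllers are accepted during gross motor steps and refused during fine motor steps.

```python
# Hedged sketch of situational-awareness gating. The step and source names
# are illustrative assumptions, not from the disclosure.
def control_allowed(source: str, procedure_step: str) -> bool:
    gross_motor = {"retraction", "repositioning"}
    fine_motor = {"dissection", "vessel_sealing"}
    if procedure_step in gross_motor:
        # A clinician in the sterile field may manipulate tool position.
        return source in {"surgeon_console", "sterile_field_tablet", "wearable"}
    if procedure_step in fine_motor:
        # Sterile-field controllers effectively enter a "sleep" mode.
        return source == "surgeon_console"
    return source == "surgeon_console"

print(control_allowed("sterile_field_tablet", "retraction"))  # True
print(control_allowed("sterile_field_tablet", "dissection"))  # False -> notify clinician
```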


As another example, autonomous control of a surgical device can override control of the surgical device by another surgical device in various instances. For example, if a clinician at the surgeon's console is actively controlling a robotic tool, then a secondary control, such as the control input(s) provided by a clinician within the sterile field via a controlling surgical device, can be ignored. However, when the clinician at the surgeon's console stops actively controlling the robotic tool, the control inputs from the controlling surgical device within the sterile field can control the robotic tool. When the robotic tool is being controlled by the controlling surgical device within the sterile field, a user interface at the robotic console can utilize tactile, auditory, and/or visual cues to communicate that control is being shared with a controlling surgical device. Similarly, when the robotic tool is being controlled by the clinician at the surgeon's console, a user interface on the controlling surgical tool can indicate that its control functionality is disabled.
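

One possible arbitration scheme consistent with this example is sketched below; the source names and the idle-handling step are assumptions for illustration.

```python
# Illustrative priority arbitration: active console control overrides a
# secondary sterile-field controller; when the console is idle, the secondary
# controller's inputs take effect. Names are assumptions for the sketch.
class RoboticToolArbiter:
    def __init__(self):
        self.console_active = False

    def route(self, source: str, command: str) -> bool:
        if source == "surgeon_console":
            self.console_active = True
            print(f"console command applied: {command}")
            return True
        if self.console_active:
            # A user interface cue could indicate the ignored shared control.
            print(f"secondary command from {source} ignored")
            return False
        print(f"secondary command from {source} applied: {command}")
        return True

arbiter = RoboticToolArbiter()
arbiter.route("surgeon_console", "articulate left")
arbiter.route("sterile_field_tablet", "rotate head")  # ignored
arbiter.console_active = False  # console goes idle (e.g., release event)
arbiter.route("sterile_field_tablet", "rotate head")  # applied
```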


In various instances, controls for a surgical device can be configured to interface with one or more display(s) to communicate the interaction between the various controls. For example, a controlling surgical device can be configured to provide secondary control functionality over a controlled surgical device. Moreover, one or more indications can communicate the paired control function between the controlled surgical device and the controlling surgical device(s). Such indications can be provided on one or more displays, such as the non-sterile displays 107 and 109, the primary display 119, the hub display 135, and/or a display in the surgeon's console 118 as depicted in FIGS. 2 and 3, for example. In one aspect, a touchscreen on a surgical device can be used by a clinician to interact with a primary surgical hub display. For example, a surgical device in the sterile field can include a capacitive touchscreen display. Surgical devices such as handheld surgical instruments, robotic tools, wearables, and display screens can include a touchscreen within the sterile field. In various instances, a clinician in the sterile field can slide, scroll, and/or otherwise configure a primary surgical hub display from the touchscreen within the sterile field. For example, by scanning, zooming, or otherwise adjusting the view on the touchscreen within the sterile field, the clinician can scan, zoom, or otherwise adjust the primary display for the surgical hub.
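

A minimal sketch of mirroring view adjustments from a sterile-field touchscreen to the primary hub display follows; the view-state message format is hypothetical.

```python
# Minimal sketch: pan/zoom adjustments made on a sterile-field touchscreen
# are re-applied to the primary hub display. The message format is assumed.
class Display:
    def __init__(self, name: str):
        self.name = name
        self.zoom, self.pan_x, self.pan_y = 1.0, 0.0, 0.0

    def apply(self, view: dict):
        self.zoom = view["zoom"]
        self.pan_x, self.pan_y = view["pan_x"], view["pan_y"]
        print(f"{self.name}: zoom={self.zoom} pan=({self.pan_x}, {self.pan_y})")

touchscreen = Display("sterile touchscreen")
primary = Display("primary hub display")

# A pinch-zoom gesture on the touchscreen produces a view-state message that
# the hub relays to the primary display.
gesture = {"zoom": 2.5, "pan_x": 120.0, "pan_y": -40.0}
touchscreen.apply(gesture)
primary.apply(gesture)  # primary display scans/zooms in step
```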


In various instances, a secondary device can contain controls that are only pairable with other devices to link command and control functions. For example, the secondary device can be a controller for controlling one or more other surgical devices. In various instances, the secondary device, or controlling surgical device, can be a screen located within the sterile field. For example, the secondary device can be a mobile device, such as a tablet or mobile phone, for example, that includes a screen. The screen can be a touchscreen, for example, which is configured to receive control input from a clinician in the sterile field. Additionally or alternatively, the display can include an embedded LED that highlights the control functionality in a way that pairs or connects the control functionality to a particular controlled surgical device.


Referring to FIGS. 22A and 22B, a display 214400 is shown. The display 214400 depicts a first plurality of information 214402 in FIG. 22A and a second plurality of information 214404 in FIG. 22B. A user can switch between the different views of FIGS. 22A and 22B by interacting with the display 214400. For example, the display 214400 includes a capacitive screen. In various instances, the clinician can interact with the display 214400 by touching an interface portion thereof. The display 214400 is configured to communicate different types of information to the clinician. For example, the display 214400 can communicate patient-specific information by selecting the PATIENT icon 214410, procedure-specific information by selecting the PROCEDURE icon 214412, and device-specific information by selecting the DEVICE icon 214414. The DEVICE icon 214414 has been selected in FIGS. 22A and 22B, as indicated with the highlighting around the DEVICE icon 214414. In other instances, the selected icon 214410, 214412, 214414 can be communicated in another manner, such as showing the selected icon 214410, 214412, 214414 in a different size, in a different position, in a different color, in a different style, and/or with another identifier such as an arrow or symbol relative thereto. Because the DEVICE icon 214414 is selected in FIGS. 22A and 22B, the device-specific information is portrayed. For example, the surgical devices being utilized during the surgical procedure are listed. In various instances, the display 214400 can selectively provide input commands to one or more of the surgical devices during the surgical procedure.


The listed surgical devices are a combination energy surgical device indicated with the first icon 214420, a surgical stapler indicated with the second icon 214422, and a suction/irrigation device indicated with the third icon 214424. The icons 214420, 214422, 214424 are textual words in FIGS. 22A and 22B; however, in other instances, the icons 214420, 214422, and 214424 can include graphics and/or can be symbols, such as a symbolic representation of the surgical device, for example. In certain instances, fewer than three or more than three surgical devices can be listed. In certain instances, only controllable surgical devices can be listed. In other instances, all of the surgical devices used during a surgical procedure can be listed; however, a clinician may only be able to select the controllable surgical devices from the list. Alternative surgical devices (e.g. an ultrasonic device, an electrosurgical device, a clip applier, a grasper, a knife, etc.) are contemplated.


The icon 214420 corresponding to the combination energy surgical device has been selected in FIG. 22A, and the icon 214422 corresponding to the stapler has been selected in FIG. 22B, as indicated with the highlighting around the respective icon 214420, 214422. In other instances, the selected icon 214420, 214422, 214424 can be communicated in another manner, such as showing the selected icon 214420, 214422, 214424 in a different size, in a different position, in a different color, in a different style, and/or with another identifier such as an arrow or symbol relative thereto. Because the icon 214420 corresponding to the combination energy surgical device has been selected in FIG. 22A, the available input commands correspond to commands for the combination energy surgical device. For example, a clinician can select a mode from the following list of modes 214430: ultrasonic, monopolar, bipolar, and blended. Alternative modes are also contemplated. The clinician can select the mode to control the energy modality applied by the combination energy surgical device. In other instances, additional adjustments can be controlled from the display 214400, such as the power level, duration, frequency, and so on. In various instances, the user input to the display 214400 can be communicated to the surgical hub by one of the various communication protocols described herein. Similarly, the surgical hub can relay the user input to the combination energy surgical device.


Referring now to FIG. 22B, because the icon 214422 corresponding to the surgical stapler has been selected, the available input commands correspond to commands for the surgical stapler. For example, a clinician can select a surgical function from a plurality of modes 214432, such as a clamping mode and a firing mode. In various instances, the modes 214432 can include additional sub-categories or adjustments, which can be selected from a menu. For example, clamping and/or firing can be performed manually or automatically, and/or the speed can be selected from a number of speeds, such as slow, medium, and fast. Alternative modes are also contemplated. The clinician can select the mode to control the surgical function of the surgical stapler. In various instances, the user input to the display 214400 can be communicated to the surgical hub by one of the various communication protocols described herein. Similarly, the surgical hub can relay the user input to the surgical stapler.
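

The relay of a mode selection from the display 214400 through the surgical hub to the selected device, including nested sub-adjustments, could be modeled as in the sketch below; the mode tables and method names are illustrative assumptions.

```python
# Hedged sketch of relaying a mode selection from the display through a
# surgical hub to the selected device, with nested sub-adjustments as in the
# stapler example above. All identifiers are illustrative.
MODES = {
    "combo_energy": ["ultrasonic", "monopolar", "bipolar", "blended"],
    "stapler": ["clamp", "fire"],
}
SUB_ADJUSTMENTS = {"stapler": {"actuation": ["manual", "automatic"],
                               "speed": ["slow", "medium", "fast"]}}

class SurgicalHub:
    def relay(self, device: str, mode: str, **adjustments):
        # Validate the selection before relaying it to the controlled device.
        if mode not in MODES.get(device, []):
            raise ValueError(f"{mode!r} is not a mode of {device!r}")
        for key, value in adjustments.items():
            allowed = SUB_ADJUSTMENTS.get(device, {}).get(key, [])
            if value not in allowed:
                raise ValueError(f"{key}={value!r} not available on {device!r}")
        print(f"hub -> {device}: mode={mode} {adjustments}")

hub = SurgicalHub()
hub.relay("combo_energy", "bipolar")
hub.relay("stapler", "fire", actuation="automatic", speed="slow")
```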


In various instances, pairing(s) between the display 214400 and the surgical device(s) can be communicated to the clinician. For example, the display 214400 can include at least one embedded LED that can be illuminated in a color that matches a color of a controlled surgical device and/or a color identifier on the controlled surgical device. In one aspect, the controlled surgical device can include a similar LED color identifier, for example. The embedded LED in the display 214400 can be illuminated in red, and a red LED on the paired surgical device can be illuminated to indicate pairing. In various instances, the display 214400 can be configured to control multiple surgical devices and, in such instances, multiple colors can be displayed on the display 214400 and the corresponding colors can be provided on respective identifiers throughout the surgical theater. For example, when the clinician has selected the first icon 214420 (FIG. 22A) corresponding to the combination energy surgical device, a portion of the display 214400 (e.g. the first icon 214420) can be illuminated in a particular color and an LED on the combination energy surgical device can be illuminated in the same color. Similarly, when the clinician has selected the second icon 214422 (FIG. 22B) corresponding to the surgical stapler, a portion of the display 214400 (e.g. the second icon 214422) can be illuminated in a particular color and an LED on the surgical stapler can be illuminated in the same color. Different colors can be assigned to different surgical devices. In various instances, the icons 214420, 214422, 214424 on the display can be illuminated in different colors and/or the icons can be highlighted with different colors. For example, the shapes around the selected icon 214420, 214422, 214424 can be a particular color that corresponds to an indicator on the paired surgical device.


The identifier on a controlled surgical device (e.g. the combination energy surgical device, surgical stapler, or suction/irrigation device) can be displayed in different ways or styles to communicate different control states. For example, when control by the display 214400 is disabled, the identifier can display a first pattern or style (e.g. the identifier can be a non-illuminated LED). When control by the secondary device is enabled, the identifier can display a second pattern or style (e.g. the identifier can be an illuminated LED), and/or when the secondary device is pairing with the controlled surgical device, the identifier can display a third pattern or style (e.g. the identifier can be a flashing LED). As described above, the color of the LED can correspond to the color on the display 214400.
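

The two indicator schemes above, a shared color identifying the pairing and an LED pattern communicating the control state, might be combined as in the following sketch; the colors, patterns, and class names are assumptions.

```python
# Illustrative sketch: a color identifies which device is paired with the
# display, and an LED pattern communicates the control state. Values assumed.
COLORS = ["red", "green", "blue", "amber"]
PATTERNS = {"disabled": "off", "enabled": "solid", "pairing": "flashing"}

class PairingIndicators:
    def __init__(self):
        self._assigned = {}  # device -> color

    def assign_color(self, device: str) -> str:
        # Both the display icon and the device LED show the same color.
        color = COLORS[len(self._assigned) % len(COLORS)]
        self._assigned[device] = color
        return color

    def show(self, device: str, state: str):
        color = self._assigned[device]
        print(f"display icon for {device}: {color}; "
              f"device LED: {color}, {PATTERNS[state]}")

leds = PairingIndicators()
leds.assign_color("combo_energy")
leds.assign_color("stapler")
leds.show("combo_energy", "pairing")  # flashing while pairing
leds.show("stapler", "enabled")       # solid once control is enabled
```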


In various instances, one control interface can be used for activation of a surgical function and another control interface can be used for sequencing through the instrument's modes. For example, a single trigger on the surgical device can effect different surgical functions, and the particular surgical function can be determined by the control interface on the display 214400. More specifically, with respect to FIG. 22A, a trigger or actuator on a combination energy surgical device can apply ultrasonic energy to tissue; however, if another mode was selected by the clinician, the same trigger or actuator would apply a different energy modality to the tissue. In various instances, energy modalities and/or power levels can be controlled by the display 214400. Referring now to FIG. 22B, a trigger or actuator on the surgical stapler can be configured to clamp when the clamp function is selected on the display 214400, and the trigger can be configured to fire when the fire function is selected on the display 214400. Additional adjustments (e.g. manual or automatic operation, clamping speed, firing speed, etc.) can be made on the display 214400.
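

A minimal sketch of this mode-dependent trigger dispatch follows, assuming illustrative mode and function names.

```python
# Minimal sketch: one physical trigger effects different surgical functions
# depending on the mode selected on the display. Names are assumed.
class Trigger:
    def __init__(self):
        self.selected_mode = "ultrasonic"  # set via the display interface

    def pull(self):
        # The same actuator dispatches to the function chosen remotely.
        actions = {
            "ultrasonic": lambda: print("apply ultrasonic energy"),
            "bipolar": lambda: print("apply bipolar energy"),
            "clamp": lambda: print("clamp jaws"),
            "fire": lambda: print("fire staples"),
        }
        actions[self.selected_mode]()

trigger = Trigger()
trigger.pull()                  # apply ultrasonic energy
trigger.selected_mode = "fire"  # clinician selects FIRE on the display
trigger.pull()                  # fire staples
```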


In other instances, multiple control devices can have activation capabilities for the same surgical device. For example, the display 214400 can include an activation control, which activates the surgical function on the surgical device. In certain instances, the power level for each activation can differ between the different control devices.


In one aspect, control of a secondary function can allow fine control from one controller and gross control from another controller. Secondary functions can include articulation of an end effector and distal head rotation, for example. In such instances, when gross control is desired, an integral or built-in device control, e.g. an autonomous control on the surgical device, can control the function. When fine control is desired, a secondary controller, such as the display 214400, can control the function. For example, when a clinician is utilizing another aspect of a surgical device, such as gripping a trigger for example, the fine control functionality can still be implemented with the secondary controller that is paired to the surgical device to create fine movements of the secondary function. In certain instances, certain functions like articulation and distal head rotation can be locked out when another function, like energy activation, clamping, or firing, is in operation. However, when the secondary controller for the secondary function is activated by another clinician or by another hand of the same clinician, the fine adjustment can be permitted even though the clinician is using another control to operate the other function. In this mode of operation, the secondary function can have a limited operational envelope to ensure that the forces it applies and/or the rate at which it is operated are limited to produce the desired fine control or fine precision. Additionally or alternatively, it may override certain threshold limits to some extent because it is being directly controlled.
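

Such a limited operational envelope could be realized by clamping commanded rates and forces to fine-control limits, as in the sketch below; the numeric limits are purely illustrative.

```python
# Hedged sketch of a limited operational envelope for fine control of a
# secondary function (e.g., articulation): commanded rates and forces are
# clamped to fine-control limits. Limit values are purely illustrative.
FINE_LIMITS = {"max_rate_deg_per_s": 5.0, "max_force_n": 2.0}
GROSS_LIMITS = {"max_rate_deg_per_s": 30.0, "max_force_n": 10.0}

def clamp_command(rate: float, force: float, fine_control: bool):
    limits = FINE_LIMITS if fine_control else GROSS_LIMITS
    applied_rate = min(abs(rate), limits["max_rate_deg_per_s"])
    applied_force = min(abs(force), limits["max_force_n"])
    return applied_rate, applied_force

# A secondary controller issuing an aggressive articulation command still
# produces a slow, low-force motion while fine control is active.
print(clamp_command(rate=25.0, force=8.0, fine_control=True))   # (5.0, 2.0)
print(clamp_command(rate=25.0, force=8.0, fine_control=False))  # (25.0, 8.0)
```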


EXAMPLES

Various aspects of the subject matter described herein are set out in the following numbered examples:


Example 1

A surgical system comprising a first device comprising a first control circuit and a second device configured to effect a surgical function. The second device comprises a second control circuit in signal communication with the first control circuit. The second control circuit is configured to selectively toggle the second device between a secondary operating mode, in which the second device is configured to control the first device, and a primary operating mode, in which the second device is configured to control the surgical function.


Example 2

The surgical system of Example 1, wherein the first device comprises a display, wherein the second device comprises an end effector positioned within a sterile field, and wherein the end effector is viewable on the display.


Example 3

The surgical system of Examples 1 or 2, wherein the secondary operating mode comprises a cursor mode, and wherein the primary operating mode comprises a tissue treatment mode.


Example 4

The surgical system of any one of Examples 1-3, wherein the second device comprises a handle comprising an input switch movable between a first position and a second position, and wherein the first position corresponds to the primary operating mode and the second position corresponds to the secondary operating mode.


Example 5

The surgical system of any one of Examples 1-4, wherein the second control circuit is configured to toggle between the primary operating mode and the secondary operating mode in response to an audible command by a clinician.


Example 6

The surgical system of Example 3, wherein the end effector is configured to drag and drop an icon across the display in the cursor mode.


Example 7

The surgical system of Examples 3 or 6, wherein the end effector is configured to select an anatomical feature on the display in the cursor mode.


Example 8

The surgical system of any one of Examples 1-7, wherein the second device comprises an ultrasonic instrument configured to apply ultrasonic vibrations to tissue, wherein the ultrasonic instrument comprises a first actuation button and a second actuation button, wherein, in the primary operating mode, the first actuation button is configured to actuate a first energy level and the second actuation button is configured to actuate a second energy level, and wherein, in the secondary operating mode, the first actuation button comprises a first cursor button and the second actuation button comprises a second cursor button.


Example 9

A surgical system comprising an imaging system comprising a camera and a display screen. The surgical system further comprises a surgical device configured to effect a surgical function. The surgical device comprises a control circuit comprising a processor and a memory communicatively coupled to the processor, the memory storing instructions executable by the processor to: receive an input signal; in response to the input signal, switch between a first operational mode and a second operational mode; in the first operational mode, actuate the surgical function; and, in the second operational mode, control the display screen.


Example 10

The surgical system of Example 9, wherein the surgical device is configured to control the display screen through a surgical barrier.


Example 11

The surgical system of Examples 9 or 10, wherein the display screen comprises a video monitor in an operating room, and wherein the surgical device comprises a laparoscopic device comprising an end effector positioned in a patient in the operating room.


Example 12

The surgical system of Example 11, wherein the surgical device comprises an end effector, and wherein the camera is configured to track the end effector in the patient.


Example 13

The surgical system of Examples 11 or 12, wherein, in the second operational mode, the end effector is configured to interact with one or more icons on the video monitor as a cursor.


Example 14

The surgical system of any one of Examples 11-13, wherein, in the second operational mode, the end effector is configured to interact as a cursor with a video feed on the video monitor.


Example 15

The surgical system of any one of Examples 9-14, wherein the surgical device comprises a handle comprising an input switch movable between a first position and a second position, and wherein the first position corresponds to the first operational mode and the second position corresponds to the second operational mode.


Example 16

The surgical system of any one of Examples 9-14, wherein the control circuit is configured to toggle between the first operational mode and the second operational mode in response to an audible command by a clinician.


Example 17

A non-transitory computer readable medium storing computer readable instructions which, when executed, cause a surgical system to: receive an input signal; in response to the input signal, switch between a first operational mode and a second operational mode; in the first operational mode, actuate a surgical function; and, in the second operational mode, interact with a display screen through a surgical barrier.


Example 18

The non-transitory computer readable medium of Example 17, wherein the surgical system is configured to interact with the display screen through the surgical barrier by clicking on an icon on the display screen.


Example 19

The non-transitory computer readable medium of Examples 17 or 18, wherein the surgical system is configured to interact with the display screen through the surgical barrier by dragging and dropping an icon on the display screen.


Example 20

The non-transitory computer readable medium of any one of Examples 17-19, wherein the surgical system is configured to interact with the display screen through the surgical barrier by selecting a portion of a video.


While several forms have been illustrated and described, it is not the intention of Applicant to restrict or limit the scope of the appended claims to such detail. Numerous modifications, variations, changes, substitutions, combinations, and equivalents to those forms may be implemented and will occur to those skilled in the art without departing from the scope of the present disclosure. Moreover, the structure of each element associated with the described forms can be alternatively described as a means for providing the function performed by the element. Also, where materials are disclosed for certain components, other materials may be used. It is therefore to be understood that the foregoing description and the appended claims are intended to cover all such modifications, combinations, and variations as falling within the scope of the disclosed forms. The appended claims are intended to cover all such modifications, variations, changes, substitutions, combinations, and equivalents.


The foregoing detailed description has set forth various forms of the devices and/or processes via the use of block diagrams, flowcharts, and/or examples. Insofar as such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, and/or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. Those skilled in the art will recognize that some aspects of the forms disclosed herein, in whole or in part, can be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one skilled in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein are capable of being distributed as one or more program products in a variety of forms, and that an illustrative form of the subject matter described herein applies regardless of the particular type of signal bearing medium used to actually carry out the distribution.


Instructions used to program logic to perform various disclosed aspects can be stored within a memory in the system, such as dynamic random access memory (DRAM), cache, flash memory, or other storage. Furthermore, the instructions can be distributed via a network or by way of other computer readable media. Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, floppy diskettes, optical disks, compact disc read-only memory (CD-ROMs), magneto-optical disks, read-only memory (ROMs), random access memory (RAM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic or optical cards, flash memory, or a tangible, machine-readable storage used in the transmission of information over the Internet via electrical, optical, acoustical or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Accordingly, the non-transitory computer-readable medium includes any type of tangible machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).


As used in any aspect herein, the term “control circuit” may refer to, for example, hardwired circuitry, programmable circuitry (e.g., a computer processor including one or more individual instruction processing cores, processing unit, processor, microcontroller, microcontroller unit, controller, digital signal processor (DSP), programmable logic device (PLD), programmable logic array (PLA), or field programmable gate array (FPGA)), state machine circuitry, firmware that stores instructions executed by programmable circuitry, and any combination thereof. The control circuit may, collectively or individually, be embodied as circuitry that forms part of a larger system, for example, an integrated circuit (IC), an application-specific integrated circuit (ASIC), a system on-chip (SoC), desktop computers, laptop computers, tablet computers, servers, smart phones, etc. Accordingly, as used herein “control circuit” includes, but is not limited to, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program which at least partially carries out processes and/or devices described herein), electrical circuitry forming a memory device (e.g., forms of random access memory), and/or electrical circuitry forming a communications device (e.g., a modem, communications switch, or optical-electrical equipment). Those having skill in the art will recognize that the subject matter described herein may be implemented in an analog or digital fashion or some combination thereof.


As used in any aspect herein, the term “logic” may refer to an app, software, firmware and/or circuitry configured to perform any of the aforementioned operations. Software may be embodied as a software package, code, instructions, instruction sets and/or data recorded on non-transitory computer readable storage medium. Firmware may be embodied as code, instructions or instruction sets and/or data that are hard-coded (e.g., nonvolatile) in memory devices.


As used in any aspect herein, the terms “component,” “system,” “module” and the like can refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution.


As used in any aspect herein, an “algorithm” refers to a self-consistent sequence of steps leading to a desired result, where a “step” refers to a manipulation of physical quantities and/or logic states which may, though need not necessarily, take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It is common usage to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. These and similar terms may be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities and/or states.


A network may include a packet switched network. The communication devices may be capable of communicating with each other using a selected packet switched network communications protocol. One example communications protocol may include an Ethernet communications protocol which may be capable of permitting communication using a Transmission Control Protocol/Internet Protocol (TCP/IP). The Ethernet protocol may comply or be compatible with the Ethernet standard published by the Institute of Electrical and Electronics Engineers (IEEE) titled “IEEE 802.3 Standard”, published in December 2008 and/or later versions of this standard. Alternatively or additionally, the communication devices may be capable of communicating with each other using an X.25 communications protocol. The X.25 communications protocol may comply or be compatible with a standard promulgated by the International Telecommunication Union-Telecommunication Standardization Sector (ITU-T). Alternatively or additionally, the communication devices may be capable of communicating with each other using a frame relay communications protocol. The frame relay communications protocol may comply or be compatible with a standard promulgated by Consultative Committee for International Telegraph and Telephone (CCITT) and/or the American National Standards Institute (ANSI). Alternatively or additionally, the transceivers may be capable of communicating with each other using an Asynchronous Transfer Mode (ATM) communications protocol. The ATM communications protocol may comply or be compatible with an ATM standard published by the ATM Forum titled “ATM-MPLS Network Interworking 2.0” published August 2001, and/or later versions of this standard. Of course, different and/or after-developed connection-oriented network communication protocols are equally contemplated herein.


Unless specifically stated otherwise as apparent from the foregoing disclosure, it is appreciated that, throughout the foregoing disclosure, discussions using terms such as “processing,” “computing,” “calculating,” “determining,” “displaying,” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.


One or more components may be referred to herein as “configured to,” “configurable to,” “operable/operative to,” “adapted/adaptable,” “able to,” “conformable/conformed to,” etc. Those skilled in the art will recognize that “configured to” can generally encompass active-state components and/or inactive-state components and/or standby-state components, unless context requires otherwise.


The terms “proximal” and “distal” are used herein with reference to a clinician manipulating the handle portion of the surgical instrument. The term “proximal” refers to the portion closest to the clinician and the term “distal” refers to the portion located away from the clinician. It will be further appreciated that, for convenience and clarity, spatial terms such as “vertical”, “horizontal”, “up”, and “down” may be used herein with respect to the drawings. However, surgical instruments are used in many orientations and positions, and these terms are not intended to be limiting and/or absolute.


Those skilled in the art will recognize that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as “open” terms (e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to claims containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an” (e.g., “a” and/or “an” should typically be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations.


In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of “two recitations,” without other modifiers, typically means at least two recitations, or two or more recitations). Furthermore, in those instances where a convention analogous to “at least one of A, B, and C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, and C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). In those instances where a convention analogous to “at least one of A, B, or C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, or C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that typically a disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms unless context dictates otherwise. For example, the phrase “A or B” will be typically understood to include the possibilities of “A” or “B” or “A and B.”


With respect to the appended claims, those skilled in the art will appreciate that recited operations therein may generally be performed in any order. Also, although various operational flow diagrams are presented in a sequence(s), it should be understood that the various operations may be performed in other orders than those which are illustrated, or may be performed concurrently. Examples of such alternate orderings may include overlapping, interleaved, interrupted, reordered, incremental, preparatory, supplemental, simultaneous, reverse, or other variant orderings, unless context dictates otherwise. Furthermore, terms like “responsive to,” “related to,” or other past-tense adjectives are generally not intended to exclude such variants, unless context dictates otherwise.


It is worthy to note that any reference to “one aspect,” “an aspect,” “an exemplification,” “one exemplification,” and the like means that a particular feature, structure, or characteristic described in connection with the aspect is included in at least one aspect. Thus, appearances of the phrases “in one aspect,” “in an aspect,” “in an exemplification,” and “in one exemplification” in various places throughout the specification are not necessarily all referring to the same aspect. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner in one or more aspects.


Any patent application, patent, non-patent publication, or other disclosure material referred to in this specification and/or listed in any Application Data Sheet is incorporated by reference herein, to the extent that the incorporated material is not inconsistent herewith. As such, and to the extent necessary, the disclosure as explicitly set forth herein supersedes any conflicting material incorporated herein by reference. Any material, or portion thereof, that is said to be incorporated by reference herein, but which conflicts with existing definitions, statements, or other disclosure material set forth herein will only be incorporated to the extent that no conflict arises between that incorporated material and the existing disclosure material.


In summary, numerous benefits have been described which result from employing the concepts described herein. The foregoing description of the one or more forms has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the scope to the precise forms disclosed. Modifications or variations are possible in light of the above teachings. The one or more forms were chosen and described in order to illustrate principles and practical application to thereby enable one of ordinary skill in the art to utilize the various forms, with various modifications, as are suited to the particular use contemplated. It is intended that the claims submitted herewith define the overall scope.

Claims
  • 1. A surgical system, comprising: an image sensor; a first device comprising a first control circuit and a display to display intraoperative images of a surgical site obtained by the image sensor; and a second device configured to effect a surgical function, wherein the second device comprises a second control circuit in signal communication with the first control circuit, wherein the image sensor is positionable to image the second device such that an intraoperative image of the second device is viewable on the display of the first device, wherein the second control circuit is configured to selectively toggle the second device between a secondary operating mode, in which the intraoperative image of the second device on the display is configured to interact with at least one icon on the display overlying intraoperative images of the surgical site on the display to control the first device, and a primary operating mode, in which the second device is configured to control the surgical function, and wherein an ability of the second device to effect the surgical function is disabled when the second device is in the secondary operating mode.
  • 2. The surgical system of claim 1, wherein the second device comprises an end effector positioned within a sterile field, and wherein an image of the end effector is viewable on the display.
  • 3. The surgical system of claim 2, wherein the secondary operating mode comprises a cursor mode, and wherein the primary operating mode comprises a tissue treatment mode.
  • 4. The surgical system of claim 3, wherein the second control circuit is configured to toggle between the primary operating mode and the secondary operating mode in response to an audible command by a clinician.
  • 5. The surgical system of claim 1, wherein the second device is configured to provide an input to the first device to adjust information displayed on the display when the second device is in the secondary operating mode.
  • 6. The surgical system of claim 5, wherein the information displayed on the display is configured to be provided over an image of the surgical site.
  • 7. The surgical system of claim 1, wherein the display of the first device is configured to display an augmented reality view of the surgical site.
  • 8. The surgical system of claim 7, wherein the display is configured to depict a hidden anatomical structure when the display is displaying the augmented reality view of the surgical site.
  • 9. A surgical system, comprising: a first device comprising a first control circuit and a display to display intraoperative images of a surgical site obtained by an image sensor; and a second device configured to effect a surgical function, wherein the second device comprises a second control circuit in signal communication with the first control circuit, wherein an intraoperative image of a portion of the second device is viewable on the display, wherein the second control circuit is configured to selectively toggle the second device between a secondary operating mode, in which the intraoperative image of the portion of the second device is configured to interact with at least one icon on the display overlying intraoperative images of the surgical site to control the first device, and a primary operating mode, in which the second device is configured to control the surgical function, wherein an ability of the second device to effect the surgical function is disabled when the second device is in the secondary operating mode, wherein the second device comprises a handle comprising an input switch movable between a first position and a second position, and wherein the first position corresponds to the primary operating mode and the second position corresponds to the secondary operating mode.
  • 10. A surgical system, comprising: a first device comprising a first control circuit and a display to display intraoperative images of a surgical site obtained by an image sensor; and a second device comprising an end effector configured to effect a surgical function within a sterile field, wherein the second device comprises a second control circuit in signal communication with the first control circuit, wherein an intraoperative image of the end effector is viewable on the display, wherein the second control circuit is configured to selectively toggle the second device between a secondary operating mode, in which the intraoperative image of the end effector is configured to interact with at least one icon on the display overlying intraoperative images of the surgical site to control the first device, and a primary operating mode, in which the second device is configured to control the surgical function, wherein an ability of the second device to effect the surgical function is disabled when the second device is in the secondary operating mode, and wherein movement of the intraoperative image of the end effector within the sterile field is configured to drag and drop an icon across the display in the secondary operating mode.
  • 11. A surgical system, comprising: a first device comprising a first control circuit and a display to display intraoperative images of a surgical site obtained by an image sensor; and a second device comprising an end effector configured to effect a surgical function within a sterile field, wherein the second device comprises a second control circuit in signal communication with the first control circuit, wherein an intraoperative image of the end effector is viewable on the display, wherein the second control circuit is configured to selectively toggle the second device between a secondary operating mode, in which the intraoperative image of the end effector is configured to interact with intraoperative images of the surgical site to control the first device, and a primary operating mode, in which the second device is configured to control the surgical function, wherein an ability of the second device to effect the surgical function is disabled when the second device is in the secondary operating mode, and wherein a movement of the intraoperative image of the end effector on the display is configured to select an anatomical feature on the display in the secondary operating mode from within the sterile field.
  • 12. A surgical system, comprising: a first device comprising a first control circuit and a display; and a second device configured to effect a surgical function, wherein the second device comprises a second control circuit in signal communication with the first control circuit, wherein the second control circuit is configured to selectively toggle the second device between a secondary operating mode, in which the second device is configured to control the first device, and a primary operating mode, in which the second device is configured to control the surgical function, wherein an ability of the second device to effect the surgical function is disabled when the second device is in the secondary operating mode, wherein the second device comprises an ultrasonic instrument configured to apply ultrasonic vibrations to tissue, wherein the ultrasonic instrument comprises a first actuation button and a second actuation button, wherein, in the primary operating mode, the first actuation button is configured to actuate a first energy level and the second actuation button is configured to actuate a second energy level, and wherein, in the secondary operating mode, the first actuation button comprises a first cursor button and the second actuation button comprises a second cursor button.
  • 13. A surgical system, comprising: an image sensor; a first device comprising a first control circuit and a display; and a second device configured to effect a surgical function, wherein the second device comprises an end effector and a second control circuit in signal communication with the first control circuit, wherein the image sensor is positionable to image the end effector at a surgical site such that an intraoperative image of the end effector and the surgical site are viewable on the display, wherein the second control circuit is configured to selectively toggle the second device between a secondary operating mode, in which the intraoperative image of the end effector is configured to interact with information on the display such that the second device is configured to control the first device, and a primary operating mode, in which the second device is configured to control the surgical function, wherein an ability of the second device to effect the surgical function is disabled when the second device is in the secondary operating mode, and wherein the second device comprises a handheld surgical instrument.
  • 14. The surgical system of claim 13, wherein the handheld surgical instrument comprises an ultrasonic instrument.
  • 15. The surgical system of claim 13, wherein the image sensor comprises a video camera, wherein the display is configured to present a play icon, and wherein the intraoperative image of the end effector is configured to interact with the play icon in the secondary operating mode to play video data obtained by the video camera.
  • 16. The surgical system of claim 13, wherein the image sensor comprises a video camera, wherein the display is configured to present a pause icon, and wherein the intraoperative image of the end effector is configured to interact with the pause icon in the secondary operating mode to pause a replaying of video data obtained by the video camera.
  • 17. The surgical system of claim 13, wherein the image sensor comprises a video camera, wherein the display is configured to present a rewind icon, and wherein the intraoperative image of the end effector is configured to interact with the rewind icon in the secondary operating mode to rewind video data obtained by the video camera.
  • 18. The surgical system of claim 13, wherein the image sensor comprises a video camera, wherein the display is configured to present a record icon, and wherein the intraoperative image of the end effector is configured to interact with the record icon in the secondary operating mode to record video data obtained by the video camera.
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/729,176, titled INDIRECT COMMAND AND CONTROL OF A FIRST OPERATING ROOM SYSTEM THROUGH THE USE OF A SECOND OPERATING ROOM SYSTEM WITHIN A STERILE FIELD WHERE THE SECOND OPERATING ROOM SYSTEM HAS PRIMARY AND SECONDARY OPERATING MODES, filed on Sep. 10, 2018, the disclosure of which is herein incorporated by reference in its entirety. The present application also claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/692,747, titled SMART ACTIVATION OF AN ENERGY DEVICE BY ANOTHER DEVICE, filed on Jun. 30, 2018, to U.S. Provisional Patent Application No. 62/692,748, titled SMART ENERGY ARCHITECTURE, filed on Jun. 30, 2018, and to U.S. Provisional Patent Application No. 62/692,768, titled SMART ENERGY DEVICES, filed on Jun. 30, 2018, the disclosure of each of which is herein incorporated by reference in its entirety. The present application also claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/659,900, titled METHOD OF HUB COMMUNICATION, filed on Apr. 19, 2018, the disclosure of which is herein incorporated by reference in its entirety. The present application also claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/650,898, filed on Mar. 30, 2018, titled CAPACITIVE COUPLED RETURN PATH PAD WITH SEPARABLE ARRAY ELEMENTS, to U.S. Provisional Patent Application Ser. No. 62/650,887, titled SURGICAL SYSTEMS WITH OPTIMIZED SENSING CAPABILITIES, filed Mar. 30, 2018, to U.S. Provisional Patent Application Ser. No. 62/650,882, titled SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM, filed Mar. 30, 2018, and to U.S. Provisional Patent Application Ser. No. 62/650,877, titled SURGICAL SMOKE EVACUATION SENSING AND CONTROLS, filed Mar. 30, 2018, the disclosure of each of which is herein incorporated by reference in its entirety. The present application also claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application Ser. No. 62/640,417, titled TEMPERATURE CONTROL IN ULTRASONIC DEVICE AND CONTROL SYSTEM THEREFOR, filed Mar. 8, 2018, and to U.S. Provisional Patent Application Ser. No. 62/640,415, titled ESTIMATING STATE OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEM THEREFOR, filed Mar. 8, 2018, the disclosure of each of which is herein incorporated by reference in its entirety. The present application also claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, to U.S. Provisional Patent Application Ser. No. 62/611,340, titled CLOUD-BASED MEDICAL ANALYTICS, filed Dec. 28, 2017, and to U.S. Provisional Patent Application Ser. No. 62/611,339, titled ROBOT ASSISTED SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of each of which is herein incorporated by reference in its entirety.

7030146 Baynes et al. Apr 2006 B2
7032798 Whitman et al. Apr 2006 B2
7041941 Faries, Jr. et al. May 2006 B2
7044352 Shelton, IV et al. May 2006 B2
7044911 Drinan et al. May 2006 B2
7044949 Orszulak et al. May 2006 B2
7048775 Jornitz et al. May 2006 B2
7053752 Wang et al. May 2006 B2
7055730 Ehrenfels et al. Jun 2006 B2
7073765 Newkirk Jul 2006 B2
7077853 Kramer et al. Jul 2006 B2
7077856 Whitman Jul 2006 B2
7081096 Brister et al. Jul 2006 B2
7094231 Ellman et al. Aug 2006 B1
7097640 Wang et al. Aug 2006 B2
7103688 Strong Sep 2006 B2
7104949 Anderson et al. Sep 2006 B2
7118564 Ritchie et al. Oct 2006 B2
7121460 Parsons et al. Oct 2006 B1
7137980 Buysse et al. Nov 2006 B2
7140528 Shelton, IV Nov 2006 B2
7143923 Shelton, IV et al. Dec 2006 B2
7143925 Shelton, IV et al. Dec 2006 B2
7147139 Schwemberger et al. Dec 2006 B2
7155316 Sutherland et al. Dec 2006 B2
7164940 Hareyama et al. Jan 2007 B2
7169145 Isaacson et al. Jan 2007 B2
7177533 McFarlin et al. Feb 2007 B2
7182775 de Guillebon et al. Feb 2007 B2
7207472 Wukusick et al. Apr 2007 B2
7208005 Frecker et al. Apr 2007 B2
7217269 El-Galley et al. May 2007 B2
7230529 Ketcherside, Jr. et al. Jun 2007 B2
7232447 Gellman et al. Jun 2007 B2
7236817 Papas et al. Jun 2007 B2
7246734 Shelton, IV Jul 2007 B2
7252664 Nasab et al. Aug 2007 B2
7278563 Green Oct 2007 B1
7294106 Birkenbach et al. Nov 2007 B2
7294116 Ellman et al. Nov 2007 B1
7296724 Green et al. Nov 2007 B2
7317955 McGreevy Jan 2008 B2
7328828 Ortiz et al. Feb 2008 B2
7334717 Rethy et al. Feb 2008 B2
7343565 Ying et al. Mar 2008 B2
7344532 Goble et al. Mar 2008 B2
7353068 Tanaka et al. Apr 2008 B2
7362228 Nycz et al. Apr 2008 B2
7371227 Zeiner May 2008 B2
7380695 Doll et al. Jun 2008 B2
7383088 Spinelli et al. Jun 2008 B2
7391173 Schena Jun 2008 B2
7407074 Ortiz et al. Aug 2008 B2
7408439 Wang et al. Aug 2008 B2
7413541 Konishi Aug 2008 B2
7422136 Marczyk Sep 2008 B1
7422139 Shelton, IV et al. Sep 2008 B2
7422586 Morris et al. Sep 2008 B2
7423972 Shaham et al. Sep 2008 B2
D579876 Novotney et al. Nov 2008 S
7445620 Kefer Nov 2008 B2
7457804 Uber, III et al. Nov 2008 B2
D583328 Chiang Dec 2008 S
7464847 Viola et al. Dec 2008 B2
7464849 Shelton, IV et al. Dec 2008 B2
7496418 Kim et al. Feb 2009 B2
D589447 Sasada et al. Mar 2009 S
7515961 Germanson et al. Apr 2009 B2
7518502 Austin et al. Apr 2009 B2
7554343 Bromfield Jun 2009 B2
7563259 Takahashi Jul 2009 B2
7568604 Ehrenfels et al. Aug 2009 B2
7575144 Ortiz et al. Aug 2009 B2
7597731 Palmerton et al. Oct 2009 B2
7617137 Kreiner et al. Nov 2009 B2
7621192 Conti et al. Nov 2009 B2
7621898 Lalomia et al. Nov 2009 B2
7631793 Rethy et al. Dec 2009 B2
7637410 Marczyk Dec 2009 B2
7637907 Blaha Dec 2009 B2
7641092 Kruszynski et al. Jan 2010 B2
7644848 Swayze et al. Jan 2010 B2
7667592 Ohyama et al. Feb 2010 B2
7667839 Bates Feb 2010 B2
7670334 Hueil et al. Mar 2010 B2
7694865 Scirica Apr 2010 B2
7699772 Pauker et al. Apr 2010 B2
7699860 Huitema et al. Apr 2010 B2
7717312 Beetel May 2010 B2
7720306 Gardiner et al. May 2010 B2
7721934 Shelton, IV et al. May 2010 B2
7721936 Shelton, IV et al. May 2010 B2
7722603 McPherson May 2010 B2
7736357 Lee, Jr. et al. Jun 2010 B2
7742176 Braunecker et al. Jun 2010 B2
7743960 Whitman et al. Jun 2010 B2
7753245 Boudreaux et al. Jul 2010 B2
7757028 Druke et al. Jul 2010 B2
7766207 Mather et al. Aug 2010 B2
7766905 Paterson et al. Aug 2010 B2
7770773 Whitman et al. Aug 2010 B2
7771429 Ballard et al. Aug 2010 B2
7776037 Odom Aug 2010 B2
7782789 Stultz et al. Aug 2010 B2
7784663 Shelton, IV Aug 2010 B2
7803151 Whitman Sep 2010 B2
7810692 Hall et al. Oct 2010 B2
7818041 Kim et al. Oct 2010 B2
7819298 Hall et al. Oct 2010 B2
7832612 Baxter, III et al. Nov 2010 B2
7833219 Tashiro et al. Nov 2010 B2
7836085 Petakov et al. Nov 2010 B2
7837079 Holsten et al. Nov 2010 B2
7837680 Isaacson et al. Nov 2010 B2
7841980 Minosawa et al. Nov 2010 B2
7845537 Shelton, IV et al. Dec 2010 B2
7857185 Swayze et al. Dec 2010 B2
D631252 Leslie Jan 2011 S
7862560 Marion Jan 2011 B2
7862579 Ortiz et al. Jan 2011 B2
7865236 Cory et al. Jan 2011 B2
7884735 Newkirk Feb 2011 B2
7887530 Zemlok et al. Feb 2011 B2
7892337 Palmerton et al. Feb 2011 B2
7907166 Lamprecht et al. Mar 2011 B2
7913891 Doll et al. Mar 2011 B2
7918230 Whitman et al. Apr 2011 B2
7918377 Measamer et al. Apr 2011 B2
7920706 Asokan et al. Apr 2011 B2
7922063 Zemlok et al. Apr 2011 B2
7927014 Dehler Apr 2011 B2
7932826 Fritchie et al. Apr 2011 B2
7942300 Rethy et al. May 2011 B2
7945065 Menzl et al. May 2011 B2
7945342 Tsai et al. May 2011 B2
7950560 Zemlok et al. May 2011 B2
7951148 McClurken May 2011 B2
7954682 Giordano et al. Jun 2011 B2
7954687 Zemlok et al. Jun 2011 B2
7955322 Devengenzo et al. Jun 2011 B2
7956620 Gilbert Jun 2011 B2
7963433 Whitman et al. Jun 2011 B2
7966269 Bauer et al. Jun 2011 B2
7967180 Scirica Jun 2011 B2
7976553 Shelton, IV et al. Jul 2011 B2
7979157 Anvari Jul 2011 B2
7980443 Scheib et al. Jul 2011 B2
7982776 Dunki-Jacobs et al. Jul 2011 B2
7988028 Farascioni et al. Aug 2011 B2
7993140 Sakezles Aug 2011 B2
7993354 Brecher et al. Aug 2011 B1
7993954 Wieting Aug 2011 B2
7995045 Dunki-Jacobs Aug 2011 B2
8005947 Morris et al. Aug 2011 B2
8007494 Taylor et al. Aug 2011 B1
8007513 Nalagatla et al. Aug 2011 B2
8010180 Quaid et al. Aug 2011 B2
8012170 Whitman et al. Sep 2011 B2
8015976 Shah Sep 2011 B2
8016855 Whitman et al. Sep 2011 B2
8019094 Hsieh et al. Sep 2011 B2
8025199 Whitman et al. Sep 2011 B2
8027710 Dannan Sep 2011 B1
8035685 Jensen Oct 2011 B2
8038686 Huitema et al. Oct 2011 B2
8038693 Allen Oct 2011 B2
8043560 Okumoto et al. Oct 2011 B2
8054184 Cline et al. Nov 2011 B2
8054752 Druke et al. Nov 2011 B2
8062306 Nobis et al. Nov 2011 B2
8062330 Prommersberger et al. Nov 2011 B2
8066721 Kortenbach et al. Nov 2011 B2
8074861 Ehrenfels et al. Dec 2011 B2
8075571 Vitali et al. Dec 2011 B2
8095327 Tahara et al. Jan 2012 B2
8096459 Ortiz et al. Jan 2012 B2
8116848 Shahidi Feb 2012 B2
8118206 Zand et al. Feb 2012 B2
8120301 Goldberg et al. Feb 2012 B2
8123764 Meade et al. Feb 2012 B2
D655678 Kobayashi et al. Mar 2012 S
8128625 Odom Mar 2012 B2
8131565 Dicks et al. Mar 2012 B2
8136712 Zingman Mar 2012 B2
8146149 Steinkogler et al. Mar 2012 B2
D657368 Magee et al. Apr 2012 S
8147486 Honour et al. Apr 2012 B2
8155479 Hoffman et al. Apr 2012 B2
8157145 Shelton, IV et al. Apr 2012 B2
8157150 Viola et al. Apr 2012 B2
8157151 Ingmanson et al. Apr 2012 B2
8160098 Yan et al. Apr 2012 B1
8160690 Wilfley et al. Apr 2012 B2
8161977 Shelton, IV et al. Apr 2012 B2
8170396 Kuspa et al. May 2012 B2
8172836 Ward May 2012 B2
8181839 Beetel May 2012 B2
8185409 Putnam et al. May 2012 B2
8206345 Abboud et al. Jun 2012 B2
8208707 Mendonca et al. Jun 2012 B2
8210411 Yates et al. Jul 2012 B2
8211100 Podhajsky et al. Jul 2012 B2
8214007 Baker et al. Jul 2012 B2
8216849 Petty Jul 2012 B2
8220688 Laurent et al. Jul 2012 B2
8225643 Abboud et al. Jul 2012 B2
8225979 Farascioni et al. Jul 2012 B2
8229549 Whitman et al. Jul 2012 B2
8231042 Hessler et al. Jul 2012 B2
8239066 Jennings et al. Aug 2012 B2
8241322 Whitman et al. Aug 2012 B2
8255045 Gharib et al. Aug 2012 B2
D667838 Magee et al. Sep 2012 S
8257387 Cunningham Sep 2012 B2
8260016 Maeda et al. Sep 2012 B2
8262560 Whitman Sep 2012 B2
8292639 Achammer et al. Oct 2012 B2
8292888 Whitman Oct 2012 B2
8295902 Salahieh et al. Oct 2012 B2
8308040 Huang et al. Nov 2012 B2
8321581 Katis et al. Nov 2012 B2
8322590 Patel et al. Dec 2012 B2
8328065 Shah Dec 2012 B2
8335590 Costa et al. Dec 2012 B2
D675164 Kobayashi et al. Jan 2013 S
8343065 Bartol et al. Jan 2013 B2
8346392 Walser et al. Jan 2013 B2
8360299 Zemlok et al. Jan 2013 B2
8364222 Cook et al. Jan 2013 B2
D676392 Gassauer Feb 2013 S
8365975 Manoux et al. Feb 2013 B1
D678196 Miyauchi et al. Mar 2013 S
D678304 Yakoub et al. Mar 2013 S
8388652 Viola Mar 2013 B2
8393514 Shelton, IV et al. Mar 2013 B2
8397972 Kostrzewski Mar 2013 B2
8398541 DiMaio et al. Mar 2013 B2
8403944 Pain et al. Mar 2013 B2
8403945 Whitfield et al. Mar 2013 B2
8403946 Whitfield et al. Mar 2013 B2
8406859 Zuzak et al. Mar 2013 B2
8411034 Boillot et al. Apr 2013 B2
8413871 Racenet et al. Apr 2013 B2
8422035 Hinderling et al. Apr 2013 B2
8423182 Robinson et al. Apr 2013 B2
8428722 Verhoef et al. Apr 2013 B2
8429153 Birdwell et al. Apr 2013 B2
8439910 Greep et al. May 2013 B2
8444663 Houser et al. May 2013 B2
8452615 Abri May 2013 B2
8453906 Huang et al. Jun 2013 B2
8454506 Rothman et al. Jun 2013 B2
8461744 Wiener et al. Jun 2013 B2
8468030 Stroup et al. Jun 2013 B2
8469973 Meade et al. Jun 2013 B2
8472630 Konrad et al. Jun 2013 B2
8473066 Aghassian et al. Jun 2013 B2
D687146 Juzkiw et al. Jul 2013 S
8476227 Kaplan et al. Jul 2013 B2
8478418 Fahey Jul 2013 B2
8489235 Moll et al. Jul 2013 B2
8499992 Whitman et al. Aug 2013 B2
8500728 Newton et al. Aug 2013 B2
8500756 Papa et al. Aug 2013 B2
8503759 Greer et al. Aug 2013 B2
8505801 Ehrenfels et al. Aug 2013 B2
8506478 Mizuyoshi Aug 2013 B2
8512325 Mathonnet Aug 2013 B2
8512365 Wiener et al. Aug 2013 B2
8515520 Brunnett et al. Aug 2013 B2
8517239 Scheib et al. Aug 2013 B2
8521331 Itkowitz Aug 2013 B2
8523043 Ullrich et al. Sep 2013 B2
8533475 Frikart et al. Sep 2013 B2
8540709 Allen Sep 2013 B2
8546996 Messerly et al. Oct 2013 B2
8554697 Claus et al. Oct 2013 B2
8560047 Haider et al. Oct 2013 B2
8561870 Baxter, III et al. Oct 2013 B2
8562598 Falkenstein et al. Oct 2013 B2
8566115 Moore Oct 2013 B2
8567393 Hickle et al. Oct 2013 B2
8568411 Falkenstein et al. Oct 2013 B2
8571598 Valavi Oct 2013 B2
8573459 Smith et al. Nov 2013 B2
8573465 Shelton, IV Nov 2013 B2
8574229 Eder et al. Nov 2013 B2
8585631 Dacquay Nov 2013 B2
8585694 Amoah et al. Nov 2013 B2
8590762 Hess et al. Nov 2013 B2
8591536 Robertson Nov 2013 B2
8595607 Nekoomaram et al. Nov 2013 B2
8596513 Olson et al. Dec 2013 B2
8596515 Okoniewski Dec 2013 B2
8604709 Jalbout et al. Dec 2013 B2
8608044 Hueil et al. Dec 2013 B2
8608045 Smith et al. Dec 2013 B2
8616431 Timm et al. Dec 2013 B2
8617155 Johnson et al. Dec 2013 B2
8620055 Barratt et al. Dec 2013 B2
8620473 Diolaiti et al. Dec 2013 B2
8622275 Baxter, III et al. Jan 2014 B2
8623027 Price et al. Jan 2014 B2
8627483 Rachlin et al. Jan 2014 B2
8627993 Smith et al. Jan 2014 B2
8627995 Smith et al. Jan 2014 B2
8628518 Blumenkranz et al. Jan 2014 B2
8628545 Cabrera et al. Jan 2014 B2
8631987 Shelton, IV et al. Jan 2014 B2
8632525 Kerr et al. Jan 2014 B2
8636190 Zemlok et al. Jan 2014 B2
8636736 Yates et al. Jan 2014 B2
8641621 Razzaque et al. Feb 2014 B2
8652086 Gerg et al. Feb 2014 B2
8652121 Quick et al. Feb 2014 B2
8652128 Ward Feb 2014 B2
8657176 Shelton, IV et al. Feb 2014 B2
8657177 Scirica et al. Feb 2014 B2
8663220 Wiener et al. Mar 2014 B2
8663222 Anderson et al. Mar 2014 B2
8666544 Moll et al. Mar 2014 B2
8679114 Chapman et al. Mar 2014 B2
8682049 Zhao et al. Mar 2014 B2
8682489 Itkowitz et al. Mar 2014 B2
8685056 Evans et al. Apr 2014 B2
8688188 Heller et al. Apr 2014 B2
8690864 Hoarau Apr 2014 B2
8701962 Kostrzewski Apr 2014 B2
8708213 Shelton, IV et al. Apr 2014 B2
D704839 Juzkiw et al. May 2014 S
8719061 Birchall May 2014 B2
8720766 Hess et al. May 2014 B2
8733613 Huitema et al. May 2014 B2
8740840 Foley et al. Jun 2014 B2
8740866 Reasoner et al. Jun 2014 B2
8747238 Shelton, IV et al. Jun 2014 B2
8752749 Moore et al. Jun 2014 B2
8757465 Woodard, Jr. et al. Jun 2014 B2
8761717 Buchheit Jun 2014 B1
8763879 Shelton, IV et al. Jul 2014 B2
8768251 Claus et al. Jul 2014 B2
8771270 Burbank Jul 2014 B2
8775196 Simpson et al. Jul 2014 B2
8779648 Giordano et al. Jul 2014 B2
8790253 Sunagawa et al. Jul 2014 B2
8794497 Zingman Aug 2014 B2
8795001 Lam et al. Aug 2014 B1
8799008 Johnson et al. Aug 2014 B2
8799009 Mellin et al. Aug 2014 B2
8800838 Shelton, IV Aug 2014 B2
8801703 Gregg et al. Aug 2014 B2
8814996 Giurgiutiu et al. Aug 2014 B2
8818556 Sanchez et al. Aug 2014 B2
8819581 Nakamura et al. Aug 2014 B2
8820603 Shelton, IV et al. Sep 2014 B2
8820607 Marczyk Sep 2014 B2
8820608 Miyamoto Sep 2014 B2
8827134 Viola et al. Sep 2014 B2
8827136 Hessler Sep 2014 B2
8840003 Morgan et al. Sep 2014 B2
D716333 Chotin et al. Oct 2014 S
8851354 Swensgard et al. Oct 2014 B2
8852174 Burbank Oct 2014 B2
8864747 Merchant et al. Oct 2014 B2
8875973 Whitman Nov 2014 B2
8876857 Burbank Nov 2014 B2
8882662 Charles Nov 2014 B2
8885032 Igarashi et al. Nov 2014 B2
8886790 Harrang et al. Nov 2014 B2
8893946 Boudreaux et al. Nov 2014 B2
8893949 Shelton, IV et al. Nov 2014 B2
8899479 Cappuzzo et al. Dec 2014 B2
8905977 Shelton et al. Dec 2014 B2
8912746 Reid et al. Dec 2014 B2
8914098 Brennan et al. Dec 2014 B2
8917513 Hazzard Dec 2014 B1
8918207 Prisco Dec 2014 B2
8920186 Shishikura Dec 2014 B2
8920414 Stone et al. Dec 2014 B2
8920433 Barrier et al. Dec 2014 B2
8930203 Kiaie et al. Jan 2015 B2
8930214 Woolford Jan 2015 B2
8931679 Kostrzewski Jan 2015 B2
8934684 Mohamed Jan 2015 B2
8936614 Allen, IV Jan 2015 B2
8945095 Blumenkranz et al. Feb 2015 B2
8945163 Voegele et al. Feb 2015 B2
8955732 Zemlok et al. Feb 2015 B2
8956581 Rosenbaum et al. Feb 2015 B2
8960519 Whitman et al. Feb 2015 B2
8960520 McCuen Feb 2015 B2
8962062 Podhajsky et al. Feb 2015 B2
8967443 McCuen Mar 2015 B2
8967455 Zhou Mar 2015 B2
8968276 Zemlok et al. Mar 2015 B2
8968296 McPherson Mar 2015 B2
8968309 Roy et al. Mar 2015 B2
8968312 Marczyk et al. Mar 2015 B2
8968337 Whitfield et al. Mar 2015 B2
8968358 Reschke Mar 2015 B2
8974429 Gordon et al. Mar 2015 B2
8979890 Boudreaux Mar 2015 B2
8986288 Konishi Mar 2015 B2
8986302 Aldridge et al. Mar 2015 B2
8989903 Weir et al. Mar 2015 B2
8991678 Wellman et al. Mar 2015 B2
8992565 Brisson et al. Mar 2015 B2
8998797 Omori Apr 2015 B2
9002518 Manzo et al. Apr 2015 B2
9005230 Yates et al. Apr 2015 B2
9010608 Casasanta, Jr. et al. Apr 2015 B2
9010611 Ross et al. Apr 2015 B2
9011366 Dean et al. Apr 2015 B2
9011427 Price et al. Apr 2015 B2
9016539 Kostrzewski et al. Apr 2015 B2
9017326 DiNardo et al. Apr 2015 B2
9020240 Pettersson et al. Apr 2015 B2
D729267 Yoo et al. May 2015 S
9023032 Robinson May 2015 B2
9023071 Miller et al. May 2015 B2
9023079 Boulnois et al. May 2015 B2
9027431 Tang et al. May 2015 B2
9028494 Shelton, IV et al. May 2015 B2
9035568 Ganton et al. May 2015 B2
9038882 Racenet et al. May 2015 B2
9043027 Durant et al. May 2015 B2
9044227 Shelton, IV et al. Jun 2015 B2
9044244 Ludwin et al. Jun 2015 B2
9044261 Houser Jun 2015 B2
9050063 Roe et al. Jun 2015 B2
9050083 Yates et al. Jun 2015 B2
9050120 Swarup et al. Jun 2015 B2
9052809 Vesto Jun 2015 B2
9055035 Porsch et al. Jun 2015 B2
9055870 Meador et al. Jun 2015 B2
9060770 Shelton, IV et al. Jun 2015 B2
9060775 Wiener et al. Jun 2015 B2
9066650 Sekiguchi Jun 2015 B2
9072523 Houser et al. Jul 2015 B2
9072535 Shelton, IV et al. Jul 2015 B2
9072536 Shelton, IV et al. Jul 2015 B2
9078653 Leimbach et al. Jul 2015 B2
9078727 Miller Jul 2015 B2
9084606 Greep Jul 2015 B2
9089360 Messerly et al. Jul 2015 B2
9095362 Dachs, II et al. Aug 2015 B2
9095367 Olson et al. Aug 2015 B2
9099863 Smith et al. Aug 2015 B2
9101358 Kerr et al. Aug 2015 B2
9101359 Smith et al. Aug 2015 B2
9101374 Hoch et al. Aug 2015 B1
9106270 Puterbaugh et al. Aug 2015 B2
9107573 Birnkrant Aug 2015 B2
9107662 Kostrzewski Aug 2015 B2
9107684 Ma Aug 2015 B2
9107688 Kimball et al. Aug 2015 B2
9107689 Robertson et al. Aug 2015 B2
9107694 Hendriks et al. Aug 2015 B2
9111548 Nandy et al. Aug 2015 B2
9113880 Zemlok et al. Aug 2015 B2
9114494 Mah Aug 2015 B1
9116597 Gulasky Aug 2015 B1
9119617 Souls et al. Sep 2015 B2
9119655 Bowling et al. Sep 2015 B2
9119657 Shelton, IV et al. Sep 2015 B2
9123155 Cunningham et al. Sep 2015 B2
9125644 Lane et al. Sep 2015 B2
9129054 Nawana et al. Sep 2015 B2
9131957 Skarbnik et al. Sep 2015 B2
9137254 Bilbrey et al. Sep 2015 B2
9138129 Diolaiti Sep 2015 B2
9138225 Huang et al. Sep 2015 B2
9141758 Kress et al. Sep 2015 B2
9149322 Knowlton Oct 2015 B2
9155503 Cadwell Oct 2015 B2
9160853 Daddi et al. Oct 2015 B1
9161803 Yates et al. Oct 2015 B2
9168054 Turner et al. Oct 2015 B2
9168091 Janssen et al. Oct 2015 B2
9168104 Dein Oct 2015 B2
9179912 Yates et al. Nov 2015 B2
9183723 Sherman et al. Nov 2015 B2
9186143 Timm et al. Nov 2015 B2
9192375 Skinlo et al. Nov 2015 B2
9192447 Choi et al. Nov 2015 B2
9192707 Gerber et al. Nov 2015 B2
9198711 Joseph Dec 2015 B2
9198835 Swisher et al. Dec 2015 B2
9202078 Abuelsaad et al. Dec 2015 B2
9204830 Zand et al. Dec 2015 B2
9204879 Shelton, IV Dec 2015 B2
9204995 Scheller et al. Dec 2015 B2
9211120 Scheib et al. Dec 2015 B2
9216062 Duque et al. Dec 2015 B2
9218053 Komuro et al. Dec 2015 B2
9220502 Zemlok et al. Dec 2015 B2
9220505 Vasudevan et al. Dec 2015 B2
9226689 Jacobsen et al. Jan 2016 B2
9226751 Shelton, IV et al. Jan 2016 B2
9226766 Aldridge et al. Jan 2016 B2
9226767 Stulen et al. Jan 2016 B2
9226791 McCarthy et al. Jan 2016 B2
9232883 Ozawa et al. Jan 2016 B2
9237891 Shelton, IV Jan 2016 B2
9237921 Messerly et al. Jan 2016 B2
9241728 Price et al. Jan 2016 B2
9241730 Babaev Jan 2016 B2
9241731 Boudreaux et al. Jan 2016 B2
9247996 Merana et al. Feb 2016 B1
9250172 Harris et al. Feb 2016 B2
9255907 Heanue et al. Feb 2016 B2
9259282 Azizian et al. Feb 2016 B2
9265429 St. Pierre et al. Feb 2016 B2
9265585 Wingardner et al. Feb 2016 B2
9265959 Drew et al. Feb 2016 B2
9272406 Aronhalt et al. Mar 2016 B2
9277956 Zhang Mar 2016 B2
9277961 Panescu et al. Mar 2016 B2
9277969 Brannan et al. Mar 2016 B2
9280884 Schultz et al. Mar 2016 B1
9282962 Schmid et al. Mar 2016 B2
9282974 Shelton, IV Mar 2016 B2
9283045 Rhee et al. Mar 2016 B2
9283054 Morgan et al. Mar 2016 B2
9289211 Williams et al. Mar 2016 B2
9289212 Shelton, IV et al. Mar 2016 B2
9295514 Shelton, IV et al. Mar 2016 B2
9299138 Zellner et al. Mar 2016 B2
9301691 Hufnagel et al. Apr 2016 B2
9301753 Aldridge et al. Apr 2016 B2
9301755 Shelton, IV et al. Apr 2016 B2
9301759 Spivey et al. Apr 2016 B2
9301810 Amiri et al. Apr 2016 B2
9302213 Manahan et al. Apr 2016 B2
9307894 Grunberg et al. Apr 2016 B2
9307914 Fahey Apr 2016 B2
9307986 Hall et al. Apr 2016 B2
9314246 Shelton, IV et al. Apr 2016 B2
9314308 Parihar et al. Apr 2016 B2
9320563 Brustad et al. Apr 2016 B2
9325732 Stickle et al. Apr 2016 B1
9326767 Koch et al. May 2016 B2
9326770 Shelton, IV et al. May 2016 B2
9331422 Nazzaro et al. May 2016 B2
9332987 Leimbach et al. May 2016 B2
9333042 Diolaiti et al. May 2016 B2
9336385 Spencer et al. May 2016 B1
9341704 Picard et al. May 2016 B2
9345481 Hall et al. May 2016 B2
9345490 Ippisch May 2016 B2
9345544 Hourtash et al. May 2016 B2
9345546 Toth et al. May 2016 B2
9345900 Wu et al. May 2016 B2
9351726 Leimbach et al. May 2016 B2
9351727 Leimbach et al. May 2016 B2
9358003 Hall et al. Jun 2016 B2
9358685 Meier et al. Jun 2016 B2
9360449 Duric Jun 2016 B2
9364200 Whitman et al. Jun 2016 B2
9364230 Shelton, IV et al. Jun 2016 B2
9364231 Wenchell Jun 2016 B2
9364249 Kimball et al. Jun 2016 B2
9364294 Razzaque et al. Jun 2016 B2
9370400 Parihar Jun 2016 B2
9375282 Nau, Jr. et al. Jun 2016 B2
9375539 Stearns et al. Jun 2016 B2
9381003 Todor et al. Jul 2016 B2
9381058 Houser et al. Jul 2016 B2
9386984 Aronhalt et al. Jul 2016 B2
9386988 Baxter, III et al. Jul 2016 B2
9387295 Mastri et al. Jul 2016 B1
9393017 Flanagan et al. Jul 2016 B2
9393037 Olson et al. Jul 2016 B2
9398905 Martin Jul 2016 B2
9398911 Auld Jul 2016 B2
9402629 Ehrenfels et al. Aug 2016 B2
9404868 Yamanaka et al. Aug 2016 B2
9414776 Sillay et al. Aug 2016 B2
9414940 Stein et al. Aug 2016 B2
9419018 Sasagawa et al. Aug 2016 B2
9421014 Ingmanson et al. Aug 2016 B2
9433470 Choi Sep 2016 B2
9439622 Case et al. Sep 2016 B2
9439668 Timm et al. Sep 2016 B2
9439736 Olson Sep 2016 B2
9445764 Gross et al. Sep 2016 B2
9445813 Shelton, IV et al. Sep 2016 B2
9450701 Do et al. Sep 2016 B2
9451949 Gorek et al. Sep 2016 B2
9451958 Shelton, IV et al. Sep 2016 B2
9463022 Swayze et al. Oct 2016 B2
9463646 Payne et al. Oct 2016 B2
9468438 Baber et al. Oct 2016 B2
9474565 Shikhman et al. Oct 2016 B2
D772252 Myers et al. Nov 2016 S
9480492 Aranyi et al. Nov 2016 B2
9485475 Speier et al. Nov 2016 B2
9486271 Dunning Nov 2016 B2
9492146 Kostrzewski et al. Nov 2016 B2
9492237 Kang et al. Nov 2016 B2
9493807 Little et al. Nov 2016 B2
9498182 Case et al. Nov 2016 B2
9498215 Duque et al. Nov 2016 B2
9498219 Moore et al. Nov 2016 B2
9498231 Haider et al. Nov 2016 B2
9498291 Balaji et al. Nov 2016 B2
9509566 Chu et al. Nov 2016 B2
9516239 Blanquart et al. Dec 2016 B2
9519753 Gerdeman et al. Dec 2016 B1
9522003 Weir et al. Dec 2016 B2
9526407 Hoeg et al. Dec 2016 B2
9526499 Kostrzewski et al. Dec 2016 B2
9526580 Humayun et al. Dec 2016 B2
9526587 Zhao et al. Dec 2016 B2
9532827 Morgan et al. Jan 2017 B2
9532845 Dossett et al. Jan 2017 B1
9539007 Dhakad et al. Jan 2017 B2
9539020 Conlon et al. Jan 2017 B2
9542481 Halter et al. Jan 2017 B2
9545216 D'Angelo et al. Jan 2017 B2
9546662 Shener-Irmakoglu et al. Jan 2017 B2
9549781 He et al. Jan 2017 B2
9554692 Levy Jan 2017 B2
9554794 Baber et al. Jan 2017 B2
9554854 Yates et al. Jan 2017 B2
9561038 Shelton, IV et al. Feb 2017 B2
9561045 Hinman et al. Feb 2017 B2
9561082 Yen et al. Feb 2017 B2
9561982 Enicks et al. Feb 2017 B2
9566708 Kurnianto Feb 2017 B2
9572592 Price et al. Feb 2017 B2
9579099 Penna et al. Feb 2017 B2
9579503 Mckinney et al. Feb 2017 B2
9585657 Shelton, IV et al. Mar 2017 B2
9585658 Shelton, IV Mar 2017 B2
9592095 Panescu et al. Mar 2017 B2
9597081 Swayze et al. Mar 2017 B2
9600031 Kaneko et al. Mar 2017 B2
9600138 Thomas et al. Mar 2017 B2
9603024 Wang et al. Mar 2017 B2
9603277 Morgan et al. Mar 2017 B2
9603609 Kawashima et al. Mar 2017 B2
D783675 Yagisawa et al. Apr 2017 S
D784270 Bhattacharya Apr 2017 S
9610114 Baxter, III et al. Apr 2017 B2
9610412 Zemlok et al. Apr 2017 B2
9615877 Tyrrell et al. Apr 2017 B2
9622684 Wybo Apr 2017 B2
9622808 Beller et al. Apr 2017 B2
9628501 Datta Ray et al. Apr 2017 B2
9629560 Joseph Apr 2017 B2
9629623 Lytle, IV et al. Apr 2017 B2
9629628 Aranyi Apr 2017 B2
9629629 Leimbach et al. Apr 2017 B2
9630318 Ibarz Gabardos et al. Apr 2017 B2
9636096 Heaton, II et al. May 2017 B1
9636112 Penna et al. May 2017 B2
9636188 Gattani et al. May 2017 B2
9636239 Durand et al. May 2017 B2
9636825 Penn et al. May 2017 B2
9641596 Unagami et al. May 2017 B2
9641815 Richardson et al. May 2017 B2
9642620 Baxter, III et al. May 2017 B2
9643022 Mashiach et al. May 2017 B2
9649089 Smith et al. May 2017 B2
9649110 Parihar et al. May 2017 B2
9649111 Shelton, IV et al. May 2017 B2
9649126 Robertson et al. May 2017 B2
9649169 Cinquin et al. May 2017 B2
9652655 Satish et al. May 2017 B2
9655614 Swensgard et al. May 2017 B2
9655616 Aranyi May 2017 B2
9656092 Golden May 2017 B2
9662116 Smith et al. May 2017 B2
9662177 Weir et al. May 2017 B2
9668729 Williams et al. Jun 2017 B2
9668732 Patel et al. Jun 2017 B2
9668765 Grace et al. Jun 2017 B2
9671860 Ogawa et al. Jun 2017 B2
9675264 Acquista et al. Jun 2017 B2
9675354 Weir et al. Jun 2017 B2
9681870 Baxter, III et al. Jun 2017 B2
9686306 Chizeck et al. Jun 2017 B2
9687230 Leimbach et al. Jun 2017 B2
9690362 Leimbach et al. Jun 2017 B2
9700292 Nawana et al. Jul 2017 B2
9700309 Jaworek et al. Jul 2017 B2
9700312 Kostrzewski et al. Jul 2017 B2
9700320 Dinardo et al. Jul 2017 B2
9706993 Hessler et al. Jul 2017 B2
9710214 Lin et al. Jul 2017 B2
9710644 Reybok et al. Jul 2017 B2
9713424 Spaide Jul 2017 B2
9713503 Goldschmidt Jul 2017 B2
9717141 Tegg Jul 2017 B1
9717498 Aranyi et al. Aug 2017 B2
9717525 Ahluwalia et al. Aug 2017 B2
9717548 Couture Aug 2017 B2
9724094 Baber et al. Aug 2017 B2
9724100 Scheib et al. Aug 2017 B2
9724118 Schulte et al. Aug 2017 B2
9733663 Leimbach et al. Aug 2017 B2
9737301 Baber et al. Aug 2017 B2
9737310 Whitfield et al. Aug 2017 B2
9737335 Butler et al. Aug 2017 B2
9737355 Yates et al. Aug 2017 B2
9737371 Romo et al. Aug 2017 B2
9740826 Raghavan et al. Aug 2017 B2
9743016 Nestares et al. Aug 2017 B2
9743929 Leimbach et al. Aug 2017 B2
9743946 Faller et al. Aug 2017 B2
9743947 Price et al. Aug 2017 B2
9750499 Leimbach et al. Sep 2017 B2
9750500 Malkowski Sep 2017 B2
9750522 Scheib et al. Sep 2017 B2
9750523 Tsubuku Sep 2017 B2
9750560 Ballakur et al. Sep 2017 B2
9750563 Shikhman et al. Sep 2017 B2
9753135 Bosch Sep 2017 B2
9753568 McMillen Sep 2017 B2
9757126 Cappola Sep 2017 B2
9757128 Baber et al. Sep 2017 B2
9757142 Shimizu Sep 2017 B2
9757152 Ogilvie et al. Sep 2017 B2
9763741 Alvarez et al. Sep 2017 B2
9764164 Wiener et al. Sep 2017 B2
9770541 Carr et al. Sep 2017 B2
9775611 Kostrzewski Oct 2017 B2
9777913 Talbert et al. Oct 2017 B2
9782164 Mumaw et al. Oct 2017 B2
9782169 Kimsey et al. Oct 2017 B2
9782212 Wham et al. Oct 2017 B2
9782214 Houser et al. Oct 2017 B2
9788835 Morgan et al. Oct 2017 B2
9788836 Overmyer et al. Oct 2017 B2
9788851 Dannaher et al. Oct 2017 B2
9788902 Inoue et al. Oct 2017 B2
9788907 Alvi et al. Oct 2017 B1
9795436 Yates et al. Oct 2017 B2
9797486 Zergiebel et al. Oct 2017 B2
9801531 Morita et al. Oct 2017 B2
9801626 Parihar et al. Oct 2017 B2
9801627 Harris et al. Oct 2017 B2
9801679 Trees et al. Oct 2017 B2
9802033 Hibner et al. Oct 2017 B2
9804618 Leimbach et al. Oct 2017 B2
9805472 Chou et al. Oct 2017 B2
9808244 Leimbach et al. Nov 2017 B2
9808245 Richard et al. Nov 2017 B2
9808246 Shelton, IV et al. Nov 2017 B2
9808248 Hoffman Nov 2017 B2
9808249 Shelton, IV Nov 2017 B2
9808305 Hareyama et al. Nov 2017 B2
9814457 Martin et al. Nov 2017 B2
9814460 Kimsey et al. Nov 2017 B2
9814462 Woodard, Jr. et al. Nov 2017 B2
9814463 Williams et al. Nov 2017 B2
9820699 Bingley et al. Nov 2017 B2
9820738 Lytle, IV et al. Nov 2017 B2
9820741 Kostrzewski Nov 2017 B2
9820768 Gee et al. Nov 2017 B2
9826976 Parihar et al. Nov 2017 B2
9826977 Leimbach et al. Nov 2017 B2
9827054 Richmond et al. Nov 2017 B2
9827059 Robinson et al. Nov 2017 B2
9830424 Dixon et al. Nov 2017 B2
9833241 Huitema et al. Dec 2017 B2
9833254 Barral et al. Dec 2017 B1
9839419 Deck et al. Dec 2017 B2
9839424 Zergiebel et al. Dec 2017 B2
9839428 Baxter, III et al. Dec 2017 B2
9839467 Harper et al. Dec 2017 B2
9839470 Gilbert et al. Dec 2017 B2
9839487 Dachs, II Dec 2017 B2
9844321 Ekvall et al. Dec 2017 B1
9844368 Boudreaux et al. Dec 2017 B2
9844369 Huitema et al. Dec 2017 B2
9844374 Lytle, IV et al. Dec 2017 B2
9844375 Overmyer et al. Dec 2017 B2
9844376 Baxter, III et al. Dec 2017 B2
9844379 Shelton, IV et al. Dec 2017 B2
9848058 Johnson et al. Dec 2017 B2
9848877 Shelton, IV et al. Dec 2017 B2
9861354 Saliman et al. Jan 2018 B2
9861363 Chen et al. Jan 2018 B2
9861428 Trees et al. Jan 2018 B2
9864839 Baym et al. Jan 2018 B2
9867612 Parihar et al. Jan 2018 B2
9867651 Wham Jan 2018 B2
9867670 Brannan et al. Jan 2018 B2
9867914 Bonano et al. Jan 2018 B2
9872609 Levy Jan 2018 B2
9872683 Hopkins et al. Jan 2018 B2
9877718 Weir et al. Jan 2018 B2
9877721 Schellin et al. Jan 2018 B2
9883860 Leimbach Feb 2018 B2
9888864 Rondoni et al. Feb 2018 B2
9888914 Martin et al. Feb 2018 B2
9888919 Leimbach et al. Feb 2018 B2
9888921 Williams et al. Feb 2018 B2
9888975 Auld Feb 2018 B2
9895148 Shelton, IV et al. Feb 2018 B2
9900787 Ou Feb 2018 B2
9901342 Shelton, IV et al. Feb 2018 B2
9901406 State et al. Feb 2018 B2
9901411 Gombert et al. Feb 2018 B2
9905000 Chou et al. Feb 2018 B2
9907196 Susini et al. Feb 2018 B2
9907550 Sniffin et al. Mar 2018 B2
9913642 Leimbach et al. Mar 2018 B2
9913645 Zerkle et al. Mar 2018 B2
9918326 Gilson et al. Mar 2018 B2
9918730 Trees et al. Mar 2018 B2
9918778 Walberg et al. Mar 2018 B2
9918788 Paul et al. Mar 2018 B2
9922304 DeBusk et al. Mar 2018 B2
9924941 Burbank Mar 2018 B2
9924944 Shelton, IV et al. Mar 2018 B2
9924961 Shelton, IV et al. Mar 2018 B2
9931040 Homyk et al. Apr 2018 B2
9931118 Shelton, IV et al. Apr 2018 B2
9931124 Gokharu Apr 2018 B2
9936863 Tesar Apr 2018 B2
9936942 Chin et al. Apr 2018 B2
9936955 Miller et al. Apr 2018 B2
9936961 Chien et al. Apr 2018 B2
9937012 Hares et al. Apr 2018 B2
9937014 Bowling et al. Apr 2018 B2
9937626 Rockrohr Apr 2018 B2
9938972 Walley Apr 2018 B2
9943230 Kaku et al. Apr 2018 B2
9943309 Shelton, IV et al. Apr 2018 B2
9943312 Posada et al. Apr 2018 B2
9943377 Yates et al. Apr 2018 B2
9943379 Gregg, II et al. Apr 2018 B2
9943918 Grogan et al. Apr 2018 B2
9943964 Hares Apr 2018 B2
9949785 Price et al. Apr 2018 B2
9962157 Sapre May 2018 B2
9968355 Shelton, IV et al. May 2018 B2
9974595 Anderson et al. May 2018 B2
9976259 Tan et al. May 2018 B2
9980140 Spencer et al. May 2018 B1
9980769 Trees et al. May 2018 B2
9980778 Ohline et al. May 2018 B2
9987000 Shelton, IV et al. Jun 2018 B2
9987068 Anderson et al. Jun 2018 B2
9987072 McPherson Jun 2018 B2
9990856 Kuchenbecker et al. Jun 2018 B2
9993248 Shelton, IV et al. Jun 2018 B2
9993258 Shelton, IV et al. Jun 2018 B2
9993305 Andersson Jun 2018 B2
10004491 Martin et al. Jun 2018 B2
10004497 Overmyer et al. Jun 2018 B2
10004500 Shelton, IV et al. Jun 2018 B2
10004501 Shelton, IV et al. Jun 2018 B2
10004527 Gee et al. Jun 2018 B2
10004557 Gross Jun 2018 B2
D822206 Shelton, IV et al. Jul 2018 S
10010322 Shelton, IV et al. Jul 2018 B2
10010324 Huitema et al. Jul 2018 B2
10013049 Leimbach et al. Jul 2018 B2
10016199 Baber et al. Jul 2018 B2
10016538 Locke et al. Jul 2018 B2
10021318 Hugosson et al. Jul 2018 B2
10022090 Whitman Jul 2018 B2
10022120 Martin et al. Jul 2018 B2
10022391 Ruderman Chen et al. Jul 2018 B2
10022568 Messerly et al. Jul 2018 B2
10028402 Walker Jul 2018 B1
10028744 Shelton, IV et al. Jul 2018 B2
10028761 Leimbach et al. Jul 2018 B2
10028788 Kang Jul 2018 B2
10034704 Asher et al. Jul 2018 B2
10037641 Hyde et al. Jul 2018 B2
10037715 Toly et al. Jul 2018 B2
D826405 Shelton, IV et al. Aug 2018 S
10039546 Williams et al. Aug 2018 B2
10039564 Hibner et al. Aug 2018 B2
10039565 Vezzu Aug 2018 B2
10039589 Virshek et al. Aug 2018 B2
10041822 Zemlok Aug 2018 B2
10044791 Kamen et al. Aug 2018 B2
10045704 Fagin et al. Aug 2018 B2
10045776 Shelton, IV et al. Aug 2018 B2
10045779 Savage et al. Aug 2018 B2
10045781 Cropper et al. Aug 2018 B2
10045782 Murthy Aravalli Aug 2018 B2
10045813 Mueller Aug 2018 B2
10048379 Markendorf et al. Aug 2018 B2
10052044 Shelton, IV et al. Aug 2018 B2
10052102 Baxter, III et al. Aug 2018 B2
10052104 Shelton, IV et al. Aug 2018 B2
10054441 Schorr et al. Aug 2018 B2
10058393 Bonutti et al. Aug 2018 B2
10069633 Gulati et al. Sep 2018 B2
10076326 Yates et al. Sep 2018 B2
10080618 Marshall et al. Sep 2018 B2
10084833 McDonnell et al. Sep 2018 B2
D831209 Huitema et al. Oct 2018 S
10085748 Morgan et al. Oct 2018 B2
10085749 Cappola et al. Oct 2018 B2
10092355 Hannaford et al. Oct 2018 B1
10095942 Mentese et al. Oct 2018 B2
10097578 Baldonado et al. Oct 2018 B2
10098527 Weisenburgh, II et al. Oct 2018 B2
10098635 Burbank Oct 2018 B2
10098642 Baxter, III et al. Oct 2018 B2
10098705 Brisson et al. Oct 2018 B2
10102926 Leonardi Oct 2018 B1
10105140 Malinouskas et al. Oct 2018 B2
10105142 Baxter, III et al. Oct 2018 B2
10105470 Reasoner et al. Oct 2018 B2
10111658 Chowaniec et al. Oct 2018 B2
10111665 Aranyi et al. Oct 2018 B2
10111679 Baber et al. Oct 2018 B2
10111703 Cosman, Jr. et al. Oct 2018 B2
D834541 You et al. Nov 2018 S
10117649 Baxter et al. Nov 2018 B2
10117651 Whitman et al. Nov 2018 B2
10117702 Danziger et al. Nov 2018 B2
10118119 Sappok et al. Nov 2018 B2
10130359 Hess et al. Nov 2018 B2
10130360 Olson et al. Nov 2018 B2
10130361 Yates et al. Nov 2018 B2
10130367 Cappola et al. Nov 2018 B2
10130373 Castro et al. Nov 2018 B2
10130432 Auld et al. Nov 2018 B2
10133248 Fitzsimmons et al. Nov 2018 B2
10135242 Baber et al. Nov 2018 B2
10136246 Yamada Nov 2018 B2
10136887 Shelton, IV et al. Nov 2018 B2
10136891 Shelton, IV et al. Nov 2018 B2
10136949 Felder et al. Nov 2018 B2
10136954 Johnson et al. Nov 2018 B2
10137245 Melker et al. Nov 2018 B2
10143526 Walker et al. Dec 2018 B2
10143948 Bonifas et al. Dec 2018 B2
10147148 Wu et al. Dec 2018 B2
10149680 Parihar et al. Dec 2018 B2
10152789 Carnes et al. Dec 2018 B2
10154841 Weaner et al. Dec 2018 B2
10159044 Hrabak Dec 2018 B2
10159481 Whitman et al. Dec 2018 B2
10159483 Beckman et al. Dec 2018 B2
10164466 Calderoni Dec 2018 B2
10166025 Leimbach et al. Jan 2019 B2
10166061 Berry et al. Jan 2019 B2
10169862 Andre et al. Jan 2019 B2
10172618 Shelton, IV et al. Jan 2019 B2
10172687 Garbus et al. Jan 2019 B2
10175096 Dickerson Jan 2019 B2
10175127 Collins et al. Jan 2019 B2
10178992 Wise et al. Jan 2019 B2
10179413 Rockrohr Jan 2019 B2
10180463 Beckman et al. Jan 2019 B2
10182814 Okoniewski Jan 2019 B2
10182816 Shelton, IV et al. Jan 2019 B2
10182818 Hensel et al. Jan 2019 B2
10187742 Dor et al. Jan 2019 B2
10188385 Kerr et al. Jan 2019 B2
10189157 Schlegel et al. Jan 2019 B2
10190888 Hryb et al. Jan 2019 B2
10194891 Jeong et al. Feb 2019 B2
10194907 Marczyk et al. Feb 2019 B2
10194913 Nalagatla et al. Feb 2019 B2
10194972 Yates et al. Feb 2019 B2
10197803 Badiali et al. Feb 2019 B2
10198965 Hart Feb 2019 B2
10201311 Chou et al. Feb 2019 B2
10201349 Leimbach et al. Feb 2019 B2
10201364 Leimbach et al. Feb 2019 B2
10201365 Boudreaux et al. Feb 2019 B2
10205708 Fletcher et al. Feb 2019 B1
10206605 Shelton, IV et al. Feb 2019 B2
10206752 Hares et al. Feb 2019 B2
10213201 Shelton, IV et al. Feb 2019 B2
10213203 Swayze et al. Feb 2019 B2
10213266 Zemlok et al. Feb 2019 B2
10213268 Dachs, II Feb 2019 B2
10219491 Stiles, Jr. et al. Mar 2019 B2
10220522 Rockrohr Mar 2019 B2
10222750 Bang et al. Mar 2019 B2
10226249 Jaworek et al. Mar 2019 B2
10226250 Beckman et al. Mar 2019 B2
10226254 Cabrera et al. Mar 2019 B2
10226302 Lacal et al. Mar 2019 B2
10231634 Zand et al. Mar 2019 B2
10231733 Ehrenfels et al. Mar 2019 B2
10231775 Shelton, IV et al. Mar 2019 B2
10238413 Hibner et al. Mar 2019 B2
10245027 Shelton, IV et al. Apr 2019 B2
10245028 Shelton, IV et al. Apr 2019 B2
10245029 Hunter et al. Apr 2019 B2
10245030 Hunter et al. Apr 2019 B2
10245033 Overmyer et al. Apr 2019 B2
10245037 Conklin et al. Apr 2019 B2
10245038 Hopkins et al. Apr 2019 B2
10245040 Milliman Apr 2019 B2
10251661 Collings et al. Apr 2019 B2
10251725 Valentine et al. Apr 2019 B2
10255995 Ingmanson Apr 2019 B2
10258331 Shelton, IV et al. Apr 2019 B2
10258359 Kapadia Apr 2019 B2
10258362 Conlon Apr 2019 B2
10258363 Worrell et al. Apr 2019 B2
10258415 Harrah et al. Apr 2019 B2
10258418 Shelton, IV et al. Apr 2019 B2
10258425 Mustufa et al. Apr 2019 B2
10263171 Wiener et al. Apr 2019 B2
10265004 Yamaguchi et al. Apr 2019 B2
10265035 Fehre et al. Apr 2019 B2
10265066 Measamer et al. Apr 2019 B2
10265068 Harris et al. Apr 2019 B2
10265072 Shelton, IV et al. Apr 2019 B2
10265090 Ingmanson et al. Apr 2019 B2
10265130 Hess et al. Apr 2019 B2
10271840 Sapre Apr 2019 B2
10271844 Valentine et al. Apr 2019 B2
10271846 Shelton, IV et al. Apr 2019 B2
10271850 Williams Apr 2019 B2
10271851 Shelton, IV et al. Apr 2019 B2
D847989 Shelton, IV et al. May 2019 S
10278698 Racenet May 2019 B2
10278778 State et al. May 2019 B2
10282963 Fahey May 2019 B2
10283220 Azizian et al. May 2019 B2
10285694 Viola et al. May 2019 B2
10285698 Cappola et al. May 2019 B2
10285700 Scheib May 2019 B2
10285705 Shelton, IV et al. May 2019 B2
10292610 Srivastava May 2019 B2
10292704 Harris et al. May 2019 B2
10292707 Shelton, IV et al. May 2019 B2
10292758 Boudreaux et al. May 2019 B2
10292769 Yu May 2019 B1
10292771 Wood et al. May 2019 B2
10293129 Fox et al. May 2019 B2
10299792 Huitema et al. May 2019 B2
10299868 Tsuboi et al. May 2019 B2
10299870 Connolly et al. May 2019 B2
10305926 Mihan et al. May 2019 B2
D850617 Shelton, IV et al. Jun 2019 S
10307159 Harris et al. Jun 2019 B2
10307170 Parfett et al. Jun 2019 B2
10307199 Farritor et al. Jun 2019 B2
10311036 Hussam et al. Jun 2019 B1
10313137 Aarnio et al. Jun 2019 B2
10314577 Laurent et al. Jun 2019 B2
10314582 Shelton, IV et al. Jun 2019 B2
10321907 Shelton, IV et al. Jun 2019 B2
10321964 Grover et al. Jun 2019 B2
10327764 Harris et al. Jun 2019 B2
10327779 Richard et al. Jun 2019 B2
10335042 Schoenle et al. Jul 2019 B2
10335147 Rector et al. Jul 2019 B2
10335149 Baxter, III et al. Jul 2019 B2
10335180 Johnson et al. Jul 2019 B2
10335227 Heard Jul 2019 B2
10339496 Matson et al. Jul 2019 B2
10342543 Shelton, IV et al. Jul 2019 B2
10342602 Strobl et al. Jul 2019 B2
10342623 Huelman et al. Jul 2019 B2
10343102 Reasoner et al. Jul 2019 B2
10349824 Claude et al. Jul 2019 B2
10349939 Shelton, IV et al. Jul 2019 B2
10349941 Marczyk et al. Jul 2019 B2
10350016 Burbank et al. Jul 2019 B2
10357184 Crawford et al. Jul 2019 B2
10357246 Shelton, IV et al. Jul 2019 B2
10357247 Shelton, IV et al. Jul 2019 B2
10362179 Harris Jul 2019 B2
10363032 Scheib et al. Jul 2019 B2
10363037 Aronhalt et al. Jul 2019 B2
10368861 Baxter, III et al. Aug 2019 B2
10368865 Harris et al. Aug 2019 B2
10368867 Harris et al. Aug 2019 B2
10368876 Bhatnagar et al. Aug 2019 B2
10368894 Madan et al. Aug 2019 B2
10368903 Morales et al. Aug 2019 B2
10376263 Morgan et al. Aug 2019 B2
10376305 Yates et al. Aug 2019 B2
10376337 Kilroy et al. Aug 2019 B2
10376338 Taylor et al. Aug 2019 B2
10378893 Mankovskii Aug 2019 B2
10383518 Abu-Tarif et al. Aug 2019 B2
10383699 Kilroy et al. Aug 2019 B2
10384021 Koeth et al. Aug 2019 B2
10386990 Shikhman et al. Aug 2019 B2
10390718 Chen et al. Aug 2019 B2
10390794 Kuroiwa et al. Aug 2019 B2
10390825 Shelton, IV et al. Aug 2019 B2
10390831 Holsten et al. Aug 2019 B2
10390895 Henderson et al. Aug 2019 B2
10398348 Osadchy et al. Sep 2019 B2
10398434 Shelton, IV et al. Sep 2019 B2
10398517 Eckert et al. Sep 2019 B2
10398521 Itkowitz et al. Sep 2019 B2
10404521 McChord et al. Sep 2019 B2
10404801 Martch Sep 2019 B2
10405857 Shelton, IV et al. Sep 2019 B2
10405859 Harris et al. Sep 2019 B2
10405863 Wise et al. Sep 2019 B2
10413291 Worthington et al. Sep 2019 B2
10413293 Shelton, IV et al. Sep 2019 B2
10413297 Harris et al. Sep 2019 B2
10417446 Takeyama Sep 2019 B2
10420552 Shelton, IV et al. Sep 2019 B2
10420558 Nalagatla et al. Sep 2019 B2
10420559 Marczyk et al. Sep 2019 B2
10420620 Rockrohr Sep 2019 B2
10420865 Reasoner et al. Sep 2019 B2
10422727 Pliskin Sep 2019 B2
10426466 Contini et al. Oct 2019 B2
10426467 Miller et al. Oct 2019 B2
10426468 Contini et al. Oct 2019 B2
10426471 Shelton, IV et al. Oct 2019 B2
10426481 Aronhalt et al. Oct 2019 B2
10433837 Worthington et al. Oct 2019 B2
10433844 Shelton, IV et al. Oct 2019 B2
10433849 Shelton, IV et al. Oct 2019 B2
10433918 Shelton, IV et al. Oct 2019 B2
10441279 Shelton, IV et al. Oct 2019 B2
10441281 Shelton, IV et al. Oct 2019 B2
10441344 Notz et al. Oct 2019 B2
10441345 Aldridge et al. Oct 2019 B2
10448948 Shelton, IV et al. Oct 2019 B2
10448950 Shelton, IV et al. Oct 2019 B2
10456137 Vendely et al. Oct 2019 B2
10456140 Shelton, IV et al. Oct 2019 B2
10456193 Yates et al. Oct 2019 B2
10463365 Williams Nov 2019 B2
10463367 Kostrzewski et al. Nov 2019 B2
10463371 Kostrzewski Nov 2019 B2
10463436 Jackson et al. Nov 2019 B2
10470684 Toth et al. Nov 2019 B2
10470762 Leimbach et al. Nov 2019 B2
10470764 Baxter, III et al. Nov 2019 B2
10470768 Harris et al. Nov 2019 B2
10470791 Houser Nov 2019 B2
10471254 Sano et al. Nov 2019 B2
10478181 Shelton, IV et al. Nov 2019 B2
10478182 Taylor Nov 2019 B2
10478185 Nicholas Nov 2019 B2
10478189 Bear et al. Nov 2019 B2
10478190 Miller et al. Nov 2019 B2
10478544 Friederichs et al. Nov 2019 B2
10485450 Gupta et al. Nov 2019 B2
10485542 Shelton, IV et al. Nov 2019 B2
10485543 Shelton, IV et al. Nov 2019 B2
10492783 Shelton, IV et al. Dec 2019 B2
10492784 Beardsley et al. Dec 2019 B2
10492785 Overmyer et al. Dec 2019 B2
10496788 Amarasingham et al. Dec 2019 B2
10498269 Zemlok et al. Dec 2019 B2
10499847 Latimer et al. Dec 2019 B2
10499891 Chaplin et al. Dec 2019 B2
10499914 Huang et al. Dec 2019 B2
10499915 Aranyi Dec 2019 B2
10499994 Luks et al. Dec 2019 B2
10507068 Kopp et al. Dec 2019 B2
10507278 Gao et al. Dec 2019 B2
10510267 Jarc et al. Dec 2019 B2
10512413 Schepis et al. Dec 2019 B2
10512461 Gupta et al. Dec 2019 B2
10512499 McHenry et al. Dec 2019 B2
10512509 Bowling et al. Dec 2019 B2
10512514 Nowlin et al. Dec 2019 B2
10517588 Gupta et al. Dec 2019 B2
10517595 Hunter et al. Dec 2019 B2
10517596 Hunter et al. Dec 2019 B2
10517686 Vokrot et al. Dec 2019 B2
10524789 Swayze et al. Jan 2020 B2
10531579 Hsiao et al. Jan 2020 B2
10531874 Morgan et al. Jan 2020 B2
10531929 Widenhouse et al. Jan 2020 B2
10532330 Diallo et al. Jan 2020 B2
10536617 Liang et al. Jan 2020 B2
10537324 Shelton, IV et al. Jan 2020 B2
10537325 Bakos et al. Jan 2020 B2
10537351 Shelton, IV et al. Jan 2020 B2
10537396 Zingaretti et al. Jan 2020 B2
10542978 Chowaniec et al. Jan 2020 B2
10542979 Shelton, IV et al. Jan 2020 B2
10542982 Beckman et al. Jan 2020 B2
10542991 Shelton, IV et al. Jan 2020 B2
D876466 Kobayashi et al. Feb 2020 S
10548504 Shelton, IV et al. Feb 2020 B2
10548612 Martinez et al. Feb 2020 B2
10548673 Harris et al. Feb 2020 B2
10552574 Sweeney Feb 2020 B2
10555675 Satish et al. Feb 2020 B2
10555748 Yates et al. Feb 2020 B2
10555750 Conlon et al. Feb 2020 B2
10555769 Worrell et al. Feb 2020 B2
10561349 Wedekind et al. Feb 2020 B2
10561422 Schellin et al. Feb 2020 B2
10561470 Hourtash et al. Feb 2020 B2
10561471 Nichogi Feb 2020 B2
10561560 Boutoussov et al. Feb 2020 B2
10561753 Thompson et al. Feb 2020 B2
10565170 Walling et al. Feb 2020 B2
10568625 Harris et al. Feb 2020 B2
10568626 Shelton, IV et al. Feb 2020 B2
10568632 Miller et al. Feb 2020 B2
10568704 Savall et al. Feb 2020 B2
10575868 Hall et al. Mar 2020 B2
10582928 Hunter et al. Mar 2020 B2
10582931 Mujawar Mar 2020 B2
10582962 Friedrichs et al. Mar 2020 B2
10582964 Weinberg et al. Mar 2020 B2
10586074 Rose et al. Mar 2020 B2
10588623 Schmid et al. Mar 2020 B2
10588625 Weaner et al. Mar 2020 B2
10588629 Malinouskas et al. Mar 2020 B2
10588630 Shelton, IV et al. Mar 2020 B2
10588631 Shelton, IV et al. Mar 2020 B2
10588632 Shelton, IV et al. Mar 2020 B2
10588711 DiCarlo et al. Mar 2020 B2
10592067 Merdan et al. Mar 2020 B2
10595844 Nawana et al. Mar 2020 B2
10595882 Parfett et al. Mar 2020 B2
10595887 Shelton, IV et al. Mar 2020 B2
10595930 Scheib et al. Mar 2020 B2
10595952 Forrest et al. Mar 2020 B2
10602007 Takano Mar 2020 B2
10602848 Magana Mar 2020 B2
10603036 Hunter et al. Mar 2020 B2
10603128 Zergiebel et al. Mar 2020 B2
10610223 Wellman et al. Apr 2020 B2
10610224 Shelton, IV et al. Apr 2020 B2
10610286 Wiener et al. Apr 2020 B2
10610313 Bailey et al. Apr 2020 B2
10617412 Shelton, IV et al. Apr 2020 B2
10617413 Shelton, IV et al. Apr 2020 B2
10617414 Shelton, IV et al. Apr 2020 B2
10617482 Houser et al. Apr 2020 B2
10617484 Kilroy et al. Apr 2020 B2
10624635 Harris et al. Apr 2020 B2
10624667 Faller et al. Apr 2020 B2
10624691 Wiener et al. Apr 2020 B2
10631423 Collins et al. Apr 2020 B2
10631858 Burbank Apr 2020 B2
10631912 McFarlin et al. Apr 2020 B2
10631916 Horner et al. Apr 2020 B2
10631917 Ineson Apr 2020 B2
10631939 Dachs, II et al. Apr 2020 B2
10639027 Shelton, IV et al. May 2020 B2
10639034 Harris et al. May 2020 B2
10639035 Shelton, IV et al. May 2020 B2
10639036 Yates et al. May 2020 B2
10639037 Shelton, IV et al. May 2020 B2
10639039 Vendely et al. May 2020 B2
10639098 Cosman et al. May 2020 B2
10639111 Kopp May 2020 B2
10639185 Agrawal et al. May 2020 B2
10653413 Worthington et al. May 2020 B2
10653476 Ross May 2020 B2
10653489 Kopp May 2020 B2
10656720 Holz May 2020 B1
10660705 Piron et al. May 2020 B2
10667809 Bakos et al. Jun 2020 B2
10667810 Shelton, IV et al. Jun 2020 B2
10667811 Harris et al. Jun 2020 B2
10667877 Kapadia Jun 2020 B2
10674897 Levy Jun 2020 B2
10675021 Harris et al. Jun 2020 B2
10675023 Cappola Jun 2020 B2
10675024 Shelton, IV et al. Jun 2020 B2
10675025 Swayze et al. Jun 2020 B2
10675026 Harris et al. Jun 2020 B2
10675035 Zingman Jun 2020 B2
10675100 Frushour Jun 2020 B2
10675104 Kapadia Jun 2020 B2
10677764 Ross et al. Jun 2020 B2
10679758 Fox et al. Jun 2020 B2
10682136 Harris et al. Jun 2020 B2
10682138 Shelton, IV et al. Jun 2020 B2
10686805 Reybok, Jr. et al. Jun 2020 B2
10687806 Shelton, IV et al. Jun 2020 B2
10687809 Shelton, IV et al. Jun 2020 B2
10687810 Shelton, IV et al. Jun 2020 B2
10687884 Wiener et al. Jun 2020 B2
10687905 Kostrzewski Jun 2020 B2
10695055 Shelton, IV et al. Jun 2020 B2
10695081 Shelton, IV et al. Jun 2020 B2
10695134 Barral et al. Jun 2020 B2
10702270 Shelton, IV et al. Jul 2020 B2
10702271 Aranyi et al. Jul 2020 B2
10709446 Harris et al. Jul 2020 B2
10716473 Greiner Jul 2020 B2
10716489 Kalvoy et al. Jul 2020 B2
10716583 Look et al. Jul 2020 B2
10716615 Shelton, IV et al. Jul 2020 B2
10716639 Kapadia et al. Jul 2020 B2
10717194 Griffiths et al. Jul 2020 B2
10722222 Aranyi Jul 2020 B2
10722233 Wellman Jul 2020 B2
10722292 Arya et al. Jul 2020 B2
D893717 Messerly et al. Aug 2020 S
10729458 Stoddard et al. Aug 2020 B2
10729509 Shelton, IV et al. Aug 2020 B2
10733267 Pedersen Aug 2020 B2
10736219 Seow et al. Aug 2020 B2
10736498 Watanabe et al. Aug 2020 B2
10736616 Scheib et al. Aug 2020 B2
10736628 Yates et al. Aug 2020 B2
10736629 Shelton, IV et al. Aug 2020 B2
10736636 Baxter, III et al. Aug 2020 B2
10736705 Scheib et al. Aug 2020 B2
10743872 Leimbach et al. Aug 2020 B2
10748115 Laster et al. Aug 2020 B2
10751052 Stokes et al. Aug 2020 B2
10751136 Farritor et al. Aug 2020 B2
10751239 Volek et al. Aug 2020 B2
10751768 Hersey et al. Aug 2020 B2
10755813 Shelton, IV et al. Aug 2020 B2
D896379 Shelton, IV et al. Sep 2020 S
10758229 Shelton, IV et al. Sep 2020 B2
10758230 Shelton, IV et al. Sep 2020 B2
10758294 Jones Sep 2020 B2
10758310 Shelton, IV et al. Sep 2020 B2
10765376 Brown, III et al. Sep 2020 B2
10765424 Baxter, III et al. Sep 2020 B2
10765427 Shelton, IV et al. Sep 2020 B2
10765470 Yates et al. Sep 2020 B2
10772630 Wixey Sep 2020 B2
10772651 Shelton, IV et al. Sep 2020 B2
10772673 Allen, IV et al. Sep 2020 B2
10772688 Peine et al. Sep 2020 B2
10779818 Zemlok et al. Sep 2020 B2
10779821 Harris et al. Sep 2020 B2
10779823 Shelton, IV et al. Sep 2020 B2
10779897 Rockrohr Sep 2020 B2
10779900 Pedros et al. Sep 2020 B2
10783634 Nye et al. Sep 2020 B2
10786298 Johnson Sep 2020 B2
10786317 Zhou et al. Sep 2020 B2
10786327 Anderson et al. Sep 2020 B2
10792038 Becerra et al. Oct 2020 B2
10792118 Prpa et al. Oct 2020 B2
10792422 Douglas et al. Oct 2020 B2
10799304 Kapadia et al. Oct 2020 B2
10803977 Sanmugalingham Oct 2020 B2
10806445 Penna et al. Oct 2020 B2
10806453 Chen et al. Oct 2020 B2
10806454 Kopp Oct 2020 B2
10806499 Castaneda et al. Oct 2020 B2
10806506 Gaspredes et al. Oct 2020 B2
10806532 Grubbs et al. Oct 2020 B2
10811131 Schneider et al. Oct 2020 B2
10813638 Shelton, IV et al. Oct 2020 B2
10813703 Swayze et al. Oct 2020 B2
10818383 Sharifi Sedeh et al. Oct 2020 B2
10828028 Harris et al. Nov 2020 B2
10828030 Weir et al. Nov 2020 B2
10835206 Bell et al. Nov 2020 B2
10835245 Swayze et al. Nov 2020 B2
10835246 Shelton, IV et al. Nov 2020 B2
10835247 Shelton, IV et al. Nov 2020 B2
10842473 Scheib et al. Nov 2020 B2
10842490 DiNardo et al. Nov 2020 B2
10842492 Shelton, IV et al. Nov 2020 B2
10842522 Messerly et al. Nov 2020 B2
10842523 Shelton, IV et al. Nov 2020 B2
10842575 Panescu et al. Nov 2020 B2
10842897 Schwartz et al. Nov 2020 B2
D904612 Wynn et al. Dec 2020 S
10849697 Yates et al. Dec 2020 B2
10849700 Kopp et al. Dec 2020 B2
10856768 Osadchy et al. Dec 2020 B2
10856867 Shelton, IV et al. Dec 2020 B2
10856868 Shelton, IV et al. Dec 2020 B2
10856870 Harris et al. Dec 2020 B2
10863984 Shelton, IV et al. Dec 2020 B2
10864037 Mun et al. Dec 2020 B2
10864050 Tabandeh et al. Dec 2020 B2
10872684 McNutt et al. Dec 2020 B2
10874396 Moore et al. Dec 2020 B2
10881399 Shelton, IV et al. Jan 2021 B2
10881401 Baber et al. Jan 2021 B2
10881446 Strobl Jan 2021 B2
10881464 Odermatt et al. Jan 2021 B2
10888321 Shelton, IV et al. Jan 2021 B2
10888322 Morgan et al. Jan 2021 B2
10892899 Shelton, IV et al. Jan 2021 B2
10892995 Shelton, IV et al. Jan 2021 B2
10893863 Shelton, IV et al. Jan 2021 B2
10893864 Harris et al. Jan 2021 B2
10893884 Stoddard et al. Jan 2021 B2
10898105 Weprin et al. Jan 2021 B2
10898183 Shelton, IV et al. Jan 2021 B2
10898186 Bakos et al. Jan 2021 B2
10898189 Mcdonald, II Jan 2021 B2
10898256 Yates et al. Jan 2021 B2
10898280 Kopp Jan 2021 B2
10898622 Shelton, IV et al. Jan 2021 B2
10902944 Casey et al. Jan 2021 B1
10903685 Yates et al. Jan 2021 B2
10905415 DiNardo et al. Feb 2021 B2
10905418 Shelton, IV et al. Feb 2021 B2
10905420 Jasemian et al. Feb 2021 B2
10912567 Shelton, IV et al. Feb 2021 B2
10916415 Karancsi et al. Feb 2021 B2
D914878 Shelton, IV et al. Mar 2021 S
10932784 Mozdzierz et al. Mar 2021 B2
10950982 Regnier et al. Mar 2021 B2
10952732 Binmoeller et al. Mar 2021 B2
10962449 Unuma et al. Mar 2021 B2
10966590 Takahashi et al. Apr 2021 B2
10980595 Wham Apr 2021 B2
11000276 Shelton, IV et al. May 2021 B2
11020115 Scheib et al. Jun 2021 B2
11051817 Shelton, IV et al. Jul 2021 B2
11051902 Kruecker et al. Jul 2021 B2
11058501 Tokarchuk et al. Jul 2021 B2
11071595 Johnson et al. Jul 2021 B2
11103246 Marczyk et al. Aug 2021 B2
11141213 Yates et al. Oct 2021 B2
11183293 Lu et al. Nov 2021 B2
11185325 Shelton, IV et al. Nov 2021 B2
11197731 Hoffman et al. Dec 2021 B2
11273290 Kowshik Mar 2022 B2
11289188 Mabotuwana et al. Mar 2022 B2
D950728 Bakos et al. May 2022 S
D952144 Boudreaux May 2022 S
11322248 Grantcharov et al. May 2022 B2
11350932 Shelton, IV et al. Jun 2022 B2
11373755 Shelton, IV et al. Jun 2022 B2
11376098 Shelton, IV et al. Jul 2022 B2
11382715 Arai et al. Jul 2022 B2
D964564 Boudreaux Sep 2022 S
11464514 Yates et al. Oct 2022 B2
11504191 Mccloud et al. Nov 2022 B2
11571212 Yates et al. Feb 2023 B2
11701139 Nott et al. Jul 2023 B2
20010056237 Cane et al. Dec 2001 A1
20020049551 Friedman et al. Apr 2002 A1
20020052616 Wiener et al. May 2002 A1
20020072746 Lingenfelder et al. Jun 2002 A1
20020138642 Miyazawa et al. Sep 2002 A1
20020144147 Basson et al. Oct 2002 A1
20020169584 Fu et al. Nov 2002 A1
20020194023 Turley et al. Dec 2002 A1
20030009111 Cory et al. Jan 2003 A1
20030009154 Whitman Jan 2003 A1
20030018329 Hooven Jan 2003 A1
20030028183 Sanchez et al. Feb 2003 A1
20030046109 Uchikubo Mar 2003 A1
20030050654 Whitman et al. Mar 2003 A1
20030069573 Kadhiresan et al. Apr 2003 A1
20030093503 Yamaki et al. May 2003 A1
20030114851 Truckai et al. Jun 2003 A1
20030120284 Palacios et al. Jun 2003 A1
20030130711 Pearson et al. Jul 2003 A1
20030210812 Khamene et al. Nov 2003 A1
20030223877 Anstine et al. Dec 2003 A1
20040015053 Bieger et al. Jan 2004 A1
20040078236 Stoodley et al. Apr 2004 A1
20040082850 Bonner et al. Apr 2004 A1
20040092992 Adams et al. May 2004 A1
20040108825 Lee et al. Jun 2004 A1
20040124964 Wang Jul 2004 A1
20040199180 Knodel et al. Oct 2004 A1
20040199659 Ishikawa et al. Oct 2004 A1
20040206365 Knowlton Oct 2004 A1
20040215131 Sakurai Oct 2004 A1
20040229496 Robinson et al. Nov 2004 A1
20040243147 Lipow Dec 2004 A1
20040243148 Wasielewski Dec 2004 A1
20040243435 Williams Dec 2004 A1
20050020909 Moctezuma de la Barrera et al. Jan 2005 A1
20050020918 Wilk et al. Jan 2005 A1
20050021027 Shields et al. Jan 2005 A1
20050023324 Doll et al. Feb 2005 A1
20050033108 Sawyer Feb 2005 A1
20050063575 Ma et al. Mar 2005 A1
20050065438 Miller Mar 2005 A1
20050070800 Takahashi Mar 2005 A1
20050100867 Hilscher et al. May 2005 A1
20050131390 Heinrich et al. Jun 2005 A1
20050139629 Schwemberger et al. Jun 2005 A1
20050143759 Kelly Jun 2005 A1
20050148854 Ito et al. Jul 2005 A1
20050149001 Uchikubo et al. Jul 2005 A1
20050149356 Cyr et al. Jul 2005 A1
20050165390 Mauti et al. Jul 2005 A1
20050182655 Merzlak et al. Aug 2005 A1
20050192633 Montpetit Sep 2005 A1
20050203380 Sauer et al. Sep 2005 A1
20050203384 Sati et al. Sep 2005 A1
20050203504 Wham et al. Sep 2005 A1
20050213832 Schofield et al. Sep 2005 A1
20050222631 Dalal et al. Oct 2005 A1
20050228246 Lee et al. Oct 2005 A1
20050228425 Boukhny et al. Oct 2005 A1
20050236474 Onuma et al. Oct 2005 A1
20050251233 Kanzius Nov 2005 A1
20050277913 McCary Dec 2005 A1
20050288425 Lee et al. Dec 2005 A1
20060020272 Gildenberg Jan 2006 A1
20060025816 Shelton Feb 2006 A1
20060039105 Smith et al. Feb 2006 A1
20060059018 Shiobara et al. Mar 2006 A1
20060069388 Truckai et al. Mar 2006 A1
20060079872 Eggleston Apr 2006 A1
20060079874 Faller et al. Apr 2006 A1
20060116908 Dew et al. Jun 2006 A1
20060122558 Sherman et al. Jun 2006 A1
20060136622 Rouvelin et al. Jun 2006 A1
20060142739 DiSilvestro et al. Jun 2006 A1
20060184160 Ozaki et al. Aug 2006 A1
20060241399 Fabian Oct 2006 A1
20060282009 Oberg et al. Dec 2006 A1
20060287645 Tashiro et al. Dec 2006 A1
20070005002 Millman et al. Jan 2007 A1
20070010838 Shelton et al. Jan 2007 A1
20070016235 Tanaka et al. Jan 2007 A1
20070016979 Damaj et al. Jan 2007 A1
20070027459 Horvath et al. Feb 2007 A1
20070038080 Salisbury et al. Feb 2007 A1
20070049947 Menn et al. Mar 2007 A1
20070066970 Ineson Mar 2007 A1
20070078678 DiSilvestro et al. Apr 2007 A1
20070084896 Doll et al. Apr 2007 A1
20070085528 Govari et al. Apr 2007 A1
20070156019 Larkin et al. Jul 2007 A1
20070167702 Hasser et al. Jul 2007 A1
20070168461 Moore Jul 2007 A1
20070173803 Wham et al. Jul 2007 A1
20070175951 Shelton et al. Aug 2007 A1
20070175955 Shelton et al. Aug 2007 A1
20070179482 Anderson Aug 2007 A1
20070179508 Arndt Aug 2007 A1
20070191713 Eichmann et al. Aug 2007 A1
20070192139 Cookson et al. Aug 2007 A1
20070203744 Scholl Aug 2007 A1
20070225556 Ortiz et al. Sep 2007 A1
20070225690 Sekiguchi et al. Sep 2007 A1
20070239028 Houser et al. Oct 2007 A1
20070244478 Bahney Oct 2007 A1
20070249990 Cosmescu Oct 2007 A1
20070270660 Caylor et al. Nov 2007 A1
20070282195 Masini et al. Dec 2007 A1
20070282321 Shah et al. Dec 2007 A1
20070282333 Fortson et al. Dec 2007 A1
20070293218 Meylan et al. Dec 2007 A1
20080013460 Allen et al. Jan 2008 A1
20080015664 Podhajsky Jan 2008 A1
20080015912 Rosenthal et al. Jan 2008 A1
20080019393 Yamaki Jan 2008 A1
20080033404 Romoda et al. Feb 2008 A1
20080039742 Hashimshony et al. Feb 2008 A1
20080040151 Moore Feb 2008 A1
20080058593 Gu et al. Mar 2008 A1
20080059658 Williams Mar 2008 A1
20080077158 Haider et al. Mar 2008 A1
20080083414 Messerges Apr 2008 A1
20080091071 Kumar et al. Apr 2008 A1
20080114212 Messerges May 2008 A1
20080114350 Park et al. May 2008 A1
20080129465 Rao Jun 2008 A1
20080140090 Aranyi et al. Jun 2008 A1
20080164296 Shelton et al. Jul 2008 A1
20080167644 Shelton et al. Jul 2008 A1
20080177258 Govari et al. Jul 2008 A1
20080177362 Phillips et al. Jul 2008 A1
20080200940 Eichmann et al. Aug 2008 A1
20080223904 Marczyk Sep 2008 A1
20080234708 Houser et al. Sep 2008 A1
20080235052 Node-Langlois et al. Sep 2008 A1
20080245841 Smith et al. Oct 2008 A1
20080255413 Zemlok et al. Oct 2008 A1
20080262654 Omori et al. Oct 2008 A1
20080272172 Zemlok et al. Nov 2008 A1
20080281301 DeBoer et al. Nov 2008 A1
20080281678 Keuls et al. Nov 2008 A1
20080296346 Shelton, IV et al. Dec 2008 A1
20080306759 Ilkin et al. Dec 2008 A1
20080312953 Claus Dec 2008 A1
20090017910 Rofougaran et al. Jan 2009 A1
20090030437 Houser et al. Jan 2009 A1
20090036750 Weinstein et al. Feb 2009 A1
20090036794 Stubhaug et al. Feb 2009 A1
20090043253 Podaima Feb 2009 A1
20090046146 Hoyt Feb 2009 A1
20090048589 Takashino et al. Feb 2009 A1
20090048595 Mihori et al. Feb 2009 A1
20090048611 Funda et al. Feb 2009 A1
20090076409 Wu et al. Mar 2009 A1
20090090763 Zemlok et al. Apr 2009 A1
20090099866 Newman Apr 2009 A1
20090114699 Viola May 2009 A1
20090128084 Johnson et al. May 2009 A1
20090157072 Wham et al. Jun 2009 A1
20090182577 Squilla et al. Jul 2009 A1
20090188094 Cunningham et al. Jul 2009 A1
20090192591 Ryan et al. Jul 2009 A1
20090206131 Weisenburgh, II et al. Aug 2009 A1
20090217932 Voegele Sep 2009 A1
20090234352 Behnke et al. Sep 2009 A1
20090259149 Tahara et al. Oct 2009 A1
20090259221 Tahara et al. Oct 2009 A1
20090259489 Kimura et al. Oct 2009 A1
20090270678 Scott et al. Oct 2009 A1
20090281541 Ibrahim et al. Nov 2009 A1
20090299214 Wu et al. Dec 2009 A1
20090306581 Claus Dec 2009 A1
20090307681 Armado et al. Dec 2009 A1
20090326321 Jacobsen et al. Dec 2009 A1
20090326336 Lemke Dec 2009 A1
20100036374 Ward Feb 2010 A1
20100036405 Giordano et al. Feb 2010 A1
20100038403 D'Arcangelo Feb 2010 A1
20100057106 Sorrentino et al. Mar 2010 A1
20100065604 Weng Mar 2010 A1
20100069939 Konishi Mar 2010 A1
20100069942 Shelton, IV Mar 2010 A1
20100070417 Flynn et al. Mar 2010 A1
20100120266 Rimborg May 2010 A1
20100132334 Duclos et al. Jun 2010 A1
20100137845 Ramstein et al. Jun 2010 A1
20100137886 Zergiebel et al. Jun 2010 A1
20100161129 Costa Jun 2010 A1
20100168561 Anderson Jul 2010 A1
20100179831 Brown et al. Jul 2010 A1
20100191100 Anderson et al. Jul 2010 A1
20100194574 Monk et al. Aug 2010 A1
20100198200 Horvath Aug 2010 A1
20100198248 Vakharia Aug 2010 A1
20100204717 Knodel Aug 2010 A1
20100217991 Choi Aug 2010 A1
20100234996 Schreiber et al. Sep 2010 A1
20100235689 Tian et al. Sep 2010 A1
20100250571 Pierce et al. Sep 2010 A1
20100258327 Esenwein et al. Oct 2010 A1
20100280247 Mutti et al. Nov 2010 A1
20100292535 Paskar Nov 2010 A1
20100292684 Cybulski et al. Nov 2010 A1
20100301095 Shelton, IV et al. Dec 2010 A1
20110006876 Moberg et al. Jan 2011 A1
20110015649 Anvari et al. Jan 2011 A1
20110022032 Zemlok et al. Jan 2011 A1
20110036890 Ma Feb 2011 A1
20110043612 Keller et al. Feb 2011 A1
20110046618 Minar et al. Feb 2011 A1
20110071530 Carson Mar 2011 A1
20110077512 Boswell Mar 2011 A1
20110087238 Wang et al. Apr 2011 A1
20110087502 Yelton et al. Apr 2011 A1
20110105277 Shauli May 2011 A1
20110105895 Kornblau et al. May 2011 A1
20110112569 Friedman et al. May 2011 A1
20110118708 Burbank et al. May 2011 A1
20110119075 Dhoble May 2011 A1
20110125149 El-Galley et al. May 2011 A1
20110152712 Cao et al. Jun 2011 A1
20110163147 Laurent et al. Jul 2011 A1
20110166883 Palmer et al. Jul 2011 A1
20110196398 Robertson et al. Aug 2011 A1
20110209128 Nikara et al. Aug 2011 A1
20110218526 Mathur Sep 2011 A1
20110237883 Chun Sep 2011 A1
20110238079 Hannaford et al. Sep 2011 A1
20110251612 Faller et al. Oct 2011 A1
20110264000 Paul et al. Oct 2011 A1
20110264078 Lipow et al. Oct 2011 A1
20110264086 Ingle Oct 2011 A1
20110265311 Kondo et al. Nov 2011 A1
20110273465 Konishi et al. Nov 2011 A1
20110278343 Knodel et al. Nov 2011 A1
20110290024 Lefler Dec 2011 A1
20110295270 Giordano et al. Dec 2011 A1
20110306840 Allen et al. Dec 2011 A1
20110307284 Thompson et al. Dec 2011 A1
20120012638 Huang et al. Jan 2012 A1
20120021684 Schultz et al. Jan 2012 A1
20120022519 Huang et al. Jan 2012 A1
20120029354 Mark et al. Feb 2012 A1
20120046662 Gilbert Feb 2012 A1
20120059684 Hampapur et al. Mar 2012 A1
20120078247 Worrell et al. Mar 2012 A1
20120080336 Shelton, IV et al. Apr 2012 A1
20120083786 Artale et al. Apr 2012 A1
20120100517 Bowditch et al. Apr 2012 A1
20120101488 Aldridge et al. Apr 2012 A1
20120116265 Houser et al. May 2012 A1
20120116381 Houser et al. May 2012 A1
20120116394 Timm et al. May 2012 A1
20120130217 Kauphusman et al. May 2012 A1
20120145714 Farascioni et al. Jun 2012 A1
20120172696 Kallback et al. Jul 2012 A1
20120190981 Harris et al. Jul 2012 A1
20120191091 Allen Jul 2012 A1
20120191162 Villa Jul 2012 A1
20120197619 Namer Yelin et al. Aug 2012 A1
20120203067 Higgins et al. Aug 2012 A1
20120203785 Awada Aug 2012 A1
20120211542 Racenet Aug 2012 A1
20120226150 Balicki et al. Sep 2012 A1
20120232549 Willyard et al. Sep 2012 A1
20120245958 Lawrence et al. Sep 2012 A1
20120253329 Zemlok et al. Oct 2012 A1
20120253847 Dell'Anno et al. Oct 2012 A1
20120265555 Cappuzzo et al. Oct 2012 A1
20120292367 Morgan et al. Nov 2012 A1
20120319859 Taub et al. Dec 2012 A1
20130001121 Metzger Jan 2013 A1
20130006241 Takashino Jan 2013 A1
20130008677 Huifu Jan 2013 A1
20130024213 Poon Jan 2013 A1
20130046182 Hegg et al. Feb 2013 A1
20130046279 Niklewski et al. Feb 2013 A1
20130046295 Kerr et al. Feb 2013 A1
20130066647 Andrie et al. Mar 2013 A1
20130090526 Suzuki et al. Apr 2013 A1
20130090755 Kiryu et al. Apr 2013 A1
20130093829 Rosenblatt et al. Apr 2013 A1
20130096597 Anand et al. Apr 2013 A1
20130116218 Kaplan et al. May 2013 A1
20130131845 Guilleminot May 2013 A1
20130144284 Behnke, II et al. Jun 2013 A1
20130153635 Hodgkinson Jun 2013 A1
20130165776 Blomqvist Jun 2013 A1
20130168435 Huang et al. Jul 2013 A1
20130178853 Hyink et al. Jul 2013 A1
20130190755 Deborski et al. Jul 2013 A1
20130191647 Ferrara, Jr. et al. Jul 2013 A1
20130193188 Shelton, IV et al. Aug 2013 A1
20130196703 Masoud et al. Aug 2013 A1
20130197531 Boukhny et al. Aug 2013 A1
20130201356 Kennedy et al. Aug 2013 A1
20130206813 Nalagatla Aug 2013 A1
20130214025 Zemlok et al. Aug 2013 A1
20130253480 Kimball et al. Sep 2013 A1
20130256373 Schmid et al. Oct 2013 A1
20130267874 Marcotte et al. Oct 2013 A1
20130267975 Timm Oct 2013 A1
20130268283 Vann et al. Oct 2013 A1
20130277410 Fernandez et al. Oct 2013 A1
20130317837 Ballantyne et al. Nov 2013 A1
20130321425 Greene et al. Dec 2013 A1
20130325809 Kim et al. Dec 2013 A1
20130331873 Ross et al. Dec 2013 A1
20130331875 Ross et al. Dec 2013 A1
20140001231 Shelton, IV et al. Jan 2014 A1
20140001234 Shelton, IV et al. Jan 2014 A1
20140005640 Shelton, IV et al. Jan 2014 A1
20140006132 Barker Jan 2014 A1
20140006943 Robbins et al. Jan 2014 A1
20140009894 Yu Jan 2014 A1
20140013565 MacDonald et al. Jan 2014 A1
20140018788 Engelman et al. Jan 2014 A1
20140029411 Nayak et al. Jan 2014 A1
20140033926 Fassel et al. Feb 2014 A1
20140035762 Shelton, IV et al. Feb 2014 A1
20140039491 Bakos et al. Feb 2014 A1
20140058407 Tsekos et al. Feb 2014 A1
20140066700 Wilson et al. Mar 2014 A1
20140073893 Bencini Mar 2014 A1
20140074076 Gertner Mar 2014 A1
20140081255 Johnson et al. Mar 2014 A1
20140081659 Nawana et al. Mar 2014 A1
20140084949 Smith et al. Mar 2014 A1
20140087999 Kaplan et al. Mar 2014 A1
20140092089 Kasuya et al. Apr 2014 A1
20140107697 Patani et al. Apr 2014 A1
20140108035 Akbay et al. Apr 2014 A1
20140108983 William R. et al. Apr 2014 A1
20140117256 Mueller et al. May 2014 A1
20140121669 Claus May 2014 A1
20140142963 Hill et al. May 2014 A1
20140148729 Schmitz et al. May 2014 A1
20140148803 Taylor May 2014 A1
20140163359 Sholev et al. Jun 2014 A1
20140166724 Schellin et al. Jun 2014 A1
20140171778 Tsusaka et al. Jun 2014 A1
20140171787 Garbey et al. Jun 2014 A1
20140176576 Spencer Jun 2014 A1
20140187856 Holoien et al. Jul 2014 A1
20140188440 Donhowe et al. Jul 2014 A1
20140194864 Martin et al. Jul 2014 A1
20140195052 Tsusaka et al. Jul 2014 A1
20140204190 Rosenblatt, III et al. Jul 2014 A1
20140226572 Thota et al. Aug 2014 A1
20140243597 Weisenburgh, II Aug 2014 A1
20140243799 Parihar Aug 2014 A1
20140243809 Gelfand et al. Aug 2014 A1
20140243811 Reschke et al. Aug 2014 A1
20140246475 Hall et al. Sep 2014 A1
20140249557 Koch et al. Sep 2014 A1
20140252064 Mozdzierz et al. Sep 2014 A1
20140263541 Leimbach et al. Sep 2014 A1
20140263552 Hall et al. Sep 2014 A1
20140275760 Lee Sep 2014 A1
20140276748 Ku et al. Sep 2014 A1
20140276749 Johnson Sep 2014 A1
20140278219 Canavan et al. Sep 2014 A1
20140287393 Kumar et al. Sep 2014 A1
20140296694 Jaworski Oct 2014 A1
20140303660 Boyden et al. Oct 2014 A1
20140303990 Schoenefeld et al. Oct 2014 A1
20140336943 Pellini et al. Nov 2014 A1
20140337052 Pellini et al. Nov 2014 A1
20140364691 Krivopisk et al. Dec 2014 A1
20150006201 Pait et al. Jan 2015 A1
20150012010 Adler Jan 2015 A1
20150025549 Kilroy et al. Jan 2015 A1
20150032150 Ishida et al. Jan 2015 A1
20150051452 Ciaccio Feb 2015 A1
20150051598 Orszulak et al. Feb 2015 A1
20150051617 Takemura Feb 2015 A1
20150053737 Leimbach et al. Feb 2015 A1
20150053743 Yates et al. Feb 2015 A1
20150053746 Shelton, IV et al. Feb 2015 A1
20150053749 Shelton, IV et al. Feb 2015 A1
20150057675 Akeel et al. Feb 2015 A1
20150062316 Haraguchi et al. Mar 2015 A1
20150066000 An et al. Mar 2015 A1
20150070187 Wiesner et al. Mar 2015 A1
20150073400 Sverdlik et al. Mar 2015 A1
20150077528 Awdeh Mar 2015 A1
20150083774 Measamer et al. Mar 2015 A1
20150099458 Weisner et al. Apr 2015 A1
20150108198 Estrella Apr 2015 A1
20150133945 Dushyant et al. May 2015 A1
20150136833 Shelton, IV et al. May 2015 A1
20150140982 Postrel May 2015 A1
20150141980 Jadhav et al. May 2015 A1
20150145682 Harris May 2015 A1
20150148830 Stulen et al. May 2015 A1
20150157354 Bales, Jr. et al. Jun 2015 A1
20150168126 Nevet et al. Jun 2015 A1
20150173673 Toth et al. Jun 2015 A1
20150173756 Baxter, III et al. Jun 2015 A1
20150182220 Yates et al. Jul 2015 A1
20150196295 Shelton, IV et al. Jul 2015 A1
20150199109 Lee Jul 2015 A1
20150201918 Kumar et al. Jul 2015 A1
20150202014 Kim et al. Jul 2015 A1
20150208934 Sztrubel et al. Jul 2015 A1
20150223725 Engel et al. Aug 2015 A1
20150223868 Brandt et al. Aug 2015 A1
20150237502 Schmidt et al. Aug 2015 A1
20150238355 Vezzu et al. Aug 2015 A1
20150257783 Levine et al. Sep 2015 A1
20150272557 Overmyer et al. Oct 2015 A1
20150272571 Leimbach et al. Oct 2015 A1
20150272580 Leimbach et al. Oct 2015 A1
20150272582 Leimbach et al. Oct 2015 A1
20150272694 Charles Oct 2015 A1
20150282733 Fielden et al. Oct 2015 A1
20150282821 Look et al. Oct 2015 A1
20150289925 Voegele et al. Oct 2015 A1
20150296042 Aoyama Oct 2015 A1
20150297200 Fitzsimmons et al. Oct 2015 A1
20150297222 Huitema et al. Oct 2015 A1
20150297228 Huitema et al. Oct 2015 A1
20150297233 Huitema et al. Oct 2015 A1
20150297311 Tesar Oct 2015 A1
20150302157 Collar et al. Oct 2015 A1
20150305828 Park et al. Oct 2015 A1
20150310174 Coudert et al. Oct 2015 A1
20150313538 Bechtel et al. Nov 2015 A1
20150317899 Dumbauld et al. Nov 2015 A1
20150320423 Aranyi Nov 2015 A1
20150324114 Hurley et al. Nov 2015 A1
20150328474 Flyash et al. Nov 2015 A1
20150331995 Zhao et al. Nov 2015 A1
20150332003 Stamm et al. Nov 2015 A1
20150332196 Stiller et al. Nov 2015 A1
20150335344 Aljuri et al. Nov 2015 A1
20150374259 Garbey et al. Dec 2015 A1
20160000437 Giordano et al. Jan 2016 A1
20160001411 Alberti Jan 2016 A1
20160005169 Sela et al. Jan 2016 A1
20160015471 Piron et al. Jan 2016 A1
20160019346 Boston et al. Jan 2016 A1
20160022374 Haider et al. Jan 2016 A1
20160034648 Mohlenbrock et al. Feb 2016 A1
20160038224 Couture et al. Feb 2016 A1
20160038253 Piron et al. Feb 2016 A1
20160048780 Sethumadhavan et al. Feb 2016 A1
20160051315 Boudreaux Feb 2016 A1
20160058439 Shelton, IV et al. Mar 2016 A1
20160066913 Swayze et al. Mar 2016 A1
20160078190 Greene et al. Mar 2016 A1
20160100837 Huang et al. Apr 2016 A1
20160106516 Mesallum Apr 2016 A1
20160106934 Hiraga et al. Apr 2016 A1
20160121143 Mumaw et al. May 2016 A1
20160143659 Glutz et al. May 2016 A1
20160157717 Gaster Jun 2016 A1
20160158468 Tang et al. Jun 2016 A1
20160166336 Razzaque et al. Jun 2016 A1
20160174998 Lal et al. Jun 2016 A1
20160175025 Strobl Jun 2016 A1
20160180045 Syed Jun 2016 A1
20160182637 Adriaens et al. Jun 2016 A1
20160184054 Lowe Jun 2016 A1
20160192960 Bueno et al. Jul 2016 A1
20160203599 Gillies et al. Jul 2016 A1
20160206202 Frangioni Jul 2016 A1
20160206362 Mehta et al. Jul 2016 A1
20160224760 Petak et al. Aug 2016 A1
20160225551 Shedletsky Aug 2016 A1
20160228061 Kallback et al. Aug 2016 A1
20160228204 Quaid et al. Aug 2016 A1
20160235303 Fleming et al. Aug 2016 A1
20160242836 Eggers et al. Aug 2016 A1
20160249910 Shelton, IV et al. Sep 2016 A1
20160249920 Gupta et al. Sep 2016 A1
20160270732 Källbäck et al. Sep 2016 A1
20160270842 Strobl et al. Sep 2016 A1
20160270861 Guru et al. Sep 2016 A1
20160275259 Nolan et al. Sep 2016 A1
20160278841 Panescu et al. Sep 2016 A1
20160287312 Tegg et al. Oct 2016 A1
20160287316 Worrell et al. Oct 2016 A1
20160287337 Aram et al. Oct 2016 A1
20160287912 Warnking Oct 2016 A1
20160292456 Dubey et al. Oct 2016 A1
20160296246 Schaller Oct 2016 A1
20160302210 Thornton et al. Oct 2016 A1
20160310055 Zand et al. Oct 2016 A1
20160310204 McHenry et al. Oct 2016 A1
20160314716 Grubbs Oct 2016 A1
20160314717 Grubbs Oct 2016 A1
20160317172 Kumada et al. Nov 2016 A1
20160321400 Durrant et al. Nov 2016 A1
20160323283 Kang et al. Nov 2016 A1
20160324537 Green et al. Nov 2016 A1
20160331460 Cheatham, III et al. Nov 2016 A1
20160331473 Yamamura Nov 2016 A1
20160338685 Nawana et al. Nov 2016 A1
20160342753 Feazell Nov 2016 A1
20160342916 Arceneaux et al. Nov 2016 A1
20160345857 Jensrud et al. Dec 2016 A1
20160345976 Gonzalez et al. Dec 2016 A1
20160350490 Martinez et al. Dec 2016 A1
20160354160 Crowley et al. Dec 2016 A1
20160354162 Yen et al. Dec 2016 A1
20160361070 Ardel et al. Dec 2016 A1
20160367305 Hareland Dec 2016 A1
20160367401 Claus Dec 2016 A1
20160374710 Sinelnikov et al. Dec 2016 A1
20160374723 Frankhouser et al. Dec 2016 A1
20160374762 Case et al. Dec 2016 A1
20160379504 Bailey et al. Dec 2016 A1
20170000516 Stulen et al. Jan 2017 A1
20170000553 Wiener et al. Jan 2017 A1
20170005911 Kasargod et al. Jan 2017 A1
20170007247 Shelton, IV et al. Jan 2017 A1
20170027603 Pandey Feb 2017 A1
20170042604 McFarland et al. Feb 2017 A1
20170049522 Kapadia Feb 2017 A1
20170068792 Reiner Mar 2017 A1
20170079530 DiMaio et al. Mar 2017 A1
20170079730 Azizian et al. Mar 2017 A1
20170086829 Vendely et al. Mar 2017 A1
20170086906 Tsuruta Mar 2017 A1
20170086930 Thompson et al. Mar 2017 A1
20170105754 Boudreaux et al. Apr 2017 A1
20170105787 Witt et al. Apr 2017 A1
20170116873 Lendvay et al. Apr 2017 A1
20170119477 Amiot et al. May 2017 A1
20170127499 Unoson et al. May 2017 A1
20170132374 Lee et al. May 2017 A1
20170132385 Hunter et al. May 2017 A1
20170132785 Wshah et al. May 2017 A1
20170143284 Sehnert et al. May 2017 A1
20170143366 Groene et al. May 2017 A1
20170143442 Tesar et al. May 2017 A1
20170147759 Iyer et al. May 2017 A1
20170154156 Sevenster Jun 2017 A1
20170156076 Eom et al. Jun 2017 A1
20170164996 Honda et al. Jun 2017 A1
20170164997 Johnson et al. Jun 2017 A1
20170165008 Finley Jun 2017 A1
20170165012 Chaplin et al. Jun 2017 A1
20170172550 Mukherjee et al. Jun 2017 A1
20170172565 Heneveld Jun 2017 A1
20170172614 Scheib et al. Jun 2017 A1
20170172674 Hanuschik et al. Jun 2017 A1
20170172676 Itkowitz et al. Jun 2017 A1
20170173262 Veltz Jun 2017 A1
20170177807 Fabian Jun 2017 A1
20170178069 Paterra et al. Jun 2017 A1
20170185732 Niklewski et al. Jun 2017 A1
20170196583 Sugiyama Jul 2017 A1
20170196637 Shelton, IV et al. Jul 2017 A1
20170202591 Shelton, IV et al. Jul 2017 A1
20170202595 Shelton, IV Jul 2017 A1
20170202607 Shelton, IV et al. Jul 2017 A1
20170202608 Shelton, IV et al. Jul 2017 A1
20170209145 Swayze et al. Jul 2017 A1
20170215944 Keffeler Aug 2017 A1
20170224332 Hunter et al. Aug 2017 A1
20170224334 Worthington et al. Aug 2017 A1
20170224428 Kopp Aug 2017 A1
20170231553 Igarashi et al. Aug 2017 A1
20170231627 Shelton, IV et al. Aug 2017 A1
20170231628 Shelton, IV et al. Aug 2017 A1
20170245809 Ma et al. Aug 2017 A1
20170249431 Shelton, IV et al. Aug 2017 A1
20170249432 Grantcharov Aug 2017 A1
20170262604 Francois Sep 2017 A1
20170265864 Hessler et al. Sep 2017 A1
20170265943 Sela et al. Sep 2017 A1
20170273715 Piron et al. Sep 2017 A1
20170281171 Shelton, IV et al. Oct 2017 A1
20170281173 Shelton, IV et al. Oct 2017 A1
20170281186 Shelton, IV et al. Oct 2017 A1
20170281189 Nalagatla et al. Oct 2017 A1
20170289617 Song et al. Oct 2017 A1
20170290585 Shelton, IV et al. Oct 2017 A1
20170296169 Yates et al. Oct 2017 A1
20170296173 Shelton, IV et al. Oct 2017 A1
20170296185 Swensgard et al. Oct 2017 A1
20170296213 Swensgard et al. Oct 2017 A1
20170303984 Malackowski Oct 2017 A1
20170304007 Piron et al. Oct 2017 A1
20170304020 Ng et al. Oct 2017 A1
20170311777 Hirayama et al. Nov 2017 A1
20170312456 Phillips Nov 2017 A1
20170319268 Akagane Nov 2017 A1
20170325876 Nakadate et al. Nov 2017 A1
20170325878 Messerly et al. Nov 2017 A1
20170333147 Bernstein Nov 2017 A1
20170333152 Wade Nov 2017 A1
20170337043 Brincat et al. Nov 2017 A1
20170348047 Reiter et al. Dec 2017 A1
20170360358 Amiot et al. Dec 2017 A1
20170360499 Greep et al. Dec 2017 A1
20170367583 Black et al. Dec 2017 A1
20170367695 Shelton, IV et al. Dec 2017 A1
20170367754 Narisawa Dec 2017 A1
20170367771 Tako et al. Dec 2017 A1
20170367772 Gunn et al. Dec 2017 A1
20170370710 Chen et al. Dec 2017 A1
20180008359 Randle Jan 2018 A1
20180011983 Zuhars et al. Jan 2018 A1
20180014764 Bechtel et al. Jan 2018 A1
20180021058 Meglan Jan 2018 A1
20180042659 Rupp et al. Feb 2018 A1
20180050196 Pawsey et al. Feb 2018 A1
20180052971 Hanina et al. Feb 2018 A1
20180055529 Messerly et al. Mar 2018 A1
20180056496 Rubens et al. Mar 2018 A1
20180065248 Barral et al. Mar 2018 A1
20180078170 Panescu et al. Mar 2018 A1
20180082480 White et al. Mar 2018 A1
20180085102 Kikuchi Mar 2018 A1
20180098049 Sugano et al. Apr 2018 A1
20180098816 Govari et al. Apr 2018 A1
20180108438 Ryan et al. Apr 2018 A1
20180110523 Shelton, IV Apr 2018 A1
20180116662 Shelton, IV et al. May 2018 A1
20180116735 Tierney et al. May 2018 A1
20180122506 Grantcharov et al. May 2018 A1
20180125590 Giordano et al. May 2018 A1
20180132895 Silver May 2018 A1
20180144243 Hsieh et al. May 2018 A1
20180144314 Miller May 2018 A1
20180153436 Olson Jun 2018 A1
20180153574 Faller et al. Jun 2018 A1
20180153628 Grover et al. Jun 2018 A1
20180153632 Tokarchuk et al. Jun 2018 A1
20180154297 Maletich et al. Jun 2018 A1
20180161062 Kaga et al. Jun 2018 A1
20180161716 Li et al. Jun 2018 A1
20180165780 Romeo Jun 2018 A1
20180168574 Robinson et al. Jun 2018 A1
20180168575 Simms et al. Jun 2018 A1
20180168577 Aronhalt et al. Jun 2018 A1
20180168578 Aronhalt et al. Jun 2018 A1
20180168579 Aronhalt et al. Jun 2018 A1
20180168584 Harris et al. Jun 2018 A1
20180168586 Shelton, IV et al. Jun 2018 A1
20180168590 Overmyer et al. Jun 2018 A1
20180168592 Overmyer et al. Jun 2018 A1
20180168593 Overmyer et al. Jun 2018 A1
20180168597 Fanelli et al. Jun 2018 A1
20180168598 Shelton, IV et al. Jun 2018 A1
20180168608 Shelton, IV et al. Jun 2018 A1
20180168609 Fanelli et al. Jun 2018 A1
20180168610 Shelton, IV et al. Jun 2018 A1
20180168614 Shelton, IV et al. Jun 2018 A1
20180168615 Shelton, IV et al. Jun 2018 A1
20180168617 Shelton, IV et al. Jun 2018 A1
20180168618 Scott et al. Jun 2018 A1
20180168619 Scott et al. Jun 2018 A1
20180168623 Simms et al. Jun 2018 A1
20180168625 Posada et al. Jun 2018 A1
20180168627 Weaner et al. Jun 2018 A1
20180168628 Hunter et al. Jun 2018 A1
20180168633 Shelton, IV et al. Jun 2018 A1
20180168647 Shelton, IV et al. Jun 2018 A1
20180168648 Shelton, IV et al. Jun 2018 A1
20180168649 Shelton, IV et al. Jun 2018 A1
20180168650 Shelton, IV et al. Jun 2018 A1
20180168651 Shelton, IV et al. Jun 2018 A1
20180172420 Hein et al. Jun 2018 A1
20180177383 Noonan et al. Jun 2018 A1
20180182475 Cossler et al. Jun 2018 A1
20180183684 Jacobson et al. Jun 2018 A1
20180193579 Hanrahan et al. Jul 2018 A1
20180206884 Beaupre Jul 2018 A1
20180206905 Batchelor et al. Jul 2018 A1
20180211726 Courtemanche et al. Jul 2018 A1
20180214025 Homyk et al. Aug 2018 A1
20180221005 Hamel et al. Aug 2018 A1
20180221598 Silver Aug 2018 A1
20180228557 Darisse et al. Aug 2018 A1
20180233222 Daley et al. Aug 2018 A1
20180233235 Angelides Aug 2018 A1
20180235719 Jarc Aug 2018 A1
20180235722 Baghdadi et al. Aug 2018 A1
20180242967 Meade Aug 2018 A1
20180247128 Alvi et al. Aug 2018 A1
20180247711 Terry Aug 2018 A1
20180250086 Grubbs Sep 2018 A1
20180250825 Hashimoto et al. Sep 2018 A1
20180263699 Murphy Sep 2018 A1
20180263710 Sakaguchi et al. Sep 2018 A1
20180268320 Shekhar Sep 2018 A1
20180271520 Shelton, IV et al. Sep 2018 A1
20180271603 Nir et al. Sep 2018 A1
20180289427 Griffiths et al. Oct 2018 A1
20180294060 Kassab Oct 2018 A1
20180296286 Peine et al. Oct 2018 A1
20180296289 Rodriguez-Navarro et al. Oct 2018 A1
20180300506 Kawakami et al. Oct 2018 A1
20180303552 Ryan et al. Oct 2018 A1
20180304471 Tokuchi Oct 2018 A1
20180310935 Wixey Nov 2018 A1
20180310986 Batchelor et al. Nov 2018 A1
20180315492 Bishop et al. Nov 2018 A1
20180317826 Muhsin et al. Nov 2018 A1
20180317916 Wixey Nov 2018 A1
20180325619 Rauniyar et al. Nov 2018 A1
20180333188 Nott et al. Nov 2018 A1
20180333207 Moctezuma De la Barrera Nov 2018 A1
20180333209 Frushour et al. Nov 2018 A1
20180351987 Patel et al. Dec 2018 A1
20180353186 Mozdzierz et al. Dec 2018 A1
20180357383 Allen et al. Dec 2018 A1
20180360454 Shelton, IV et al. Dec 2018 A1
20180360456 Shelton, IV et al. Dec 2018 A1
20180366213 Fidone et al. Dec 2018 A1
20180368930 Esterberg et al. Dec 2018 A1
20180369511 Zergiebel et al. Dec 2018 A1
20190000446 Shelton, IV et al. Jan 2019 A1
20190000478 Messerly et al. Jan 2019 A1
20190000565 Shelton, IV et al. Jan 2019 A1
20190000569 Crawford et al. Jan 2019 A1
20190001079 Zergiebel et al. Jan 2019 A1
20190005641 Yamamoto Jan 2019 A1
20190006047 Gorek et al. Jan 2019 A1
20190025040 Andreason et al. Jan 2019 A1
20190036688 Wasily et al. Jan 2019 A1
20190038335 Mohr et al. Feb 2019 A1
20190038364 Enoki Feb 2019 A1
20190045515 Kwasnick et al. Feb 2019 A1
20190046198 Stokes et al. Feb 2019 A1
20190053801 Wixey et al. Feb 2019 A1
20190053866 Seow et al. Feb 2019 A1
20190059986 Shelton, IV et al. Feb 2019 A1
20190059997 Frushour Feb 2019 A1
20190069949 Vrba et al. Mar 2019 A1
20190069964 Hagn Mar 2019 A1
20190069966 Petersen et al. Mar 2019 A1
20190070550 Lalomia et al. Mar 2019 A1
20190070731 Bowling et al. Mar 2019 A1
20190083190 Graves et al. Mar 2019 A1
20190087544 Peterson Mar 2019 A1
20190090969 Jarc et al. Mar 2019 A1
20190099221 Schmidt et al. Apr 2019 A1
20190099226 Hallen Apr 2019 A1
20190104919 Shelton, IV et al. Apr 2019 A1
20190105468 Kato et al. Apr 2019 A1
20190110828 Despatie Apr 2019 A1
20190110855 Barral et al. Apr 2019 A1
20190110856 Barral et al. Apr 2019 A1
20190115108 Hegedus et al. Apr 2019 A1
20190122330 Saget et al. Apr 2019 A1
20190125320 Shelton, IV et al. May 2019 A1
20190125321 Shelton, IV et al. May 2019 A1
20190125324 Scheib et al. May 2019 A1
20190125335 Shelton, IV et al. May 2019 A1
20190125336 Deck et al. May 2019 A1
20190125337 Shelton, IV et al. May 2019 A1
20190125338 Shelton, IV et al. May 2019 A1
20190125339 Shelton, IV et al. May 2019 A1
20190125347 Stokes et al. May 2019 A1
20190125348 Shelton, IV et al. May 2019 A1
20190125352 Shelton, IV et al. May 2019 A1
20190125353 Shelton, IV et al. May 2019 A1
20190125354 Deck et al. May 2019 A1
20190125355 Shelton, IV et al. May 2019 A1
20190125356 Shelton, IV et al. May 2019 A1
20190125357 Shelton, IV et al. May 2019 A1
20190125358 Shelton, IV et al. May 2019 A1
20190125359 Shelton, IV et al. May 2019 A1
20190125360 Shelton, IV et al. May 2019 A1
20190125361 Shelton, IV et al. May 2019 A1
20190125377 Shelton, IV May 2019 A1
20190125378 Shelton, IV et al. May 2019 A1
20190125379 Shelton, IV et al. May 2019 A1
20190125380 Hunter et al. May 2019 A1
20190125383 Scheib et al. May 2019 A1
20190125384 Scheib et al. May 2019 A1
20190125385 Scheib et al. May 2019 A1
20190125386 Shelton, IV et al. May 2019 A1
20190125387 Parihar et al. May 2019 A1
20190125388 Shelton, IV et al. May 2019 A1
20190125389 Shelton, IV et al. May 2019 A1
20190125430 Shelton, IV et al. May 2019 A1
20190125431 Shelton, IV et al. May 2019 A1
20190125432 Shelton, IV et al. May 2019 A1
20190125454 Stokes et al. May 2019 A1
20190125455 Shelton, IV et al. May 2019 A1
20190125456 Shelton, IV et al. May 2019 A1
20190125457 Parihar et al. May 2019 A1
20190125458 Shelton, IV et al. May 2019 A1
20190125459 Shelton, IV et al. May 2019 A1
20190125476 Shelton, IV et al. May 2019 A1
20190133703 Seow et al. May 2019 A1
20190142449 Shelton, IV et al. May 2019 A1
20190142535 Seow et al. May 2019 A1
20190145942 Dutriez et al. May 2019 A1
20190150975 Kawasaki et al. May 2019 A1
20190159777 Ehrenfels et al. May 2019 A1
20190159778 Shelton, IV et al. May 2019 A1
20190162179 O'Shea et al. May 2019 A1
20190163875 Allen et al. May 2019 A1
20190167296 Tsubuku et al. Jun 2019 A1
20190192044 Ravi et al. Jun 2019 A1
20190192157 Scott et al. Jun 2019 A1
20190192236 Shelton, IV et al. Jun 2019 A1
20190200844 Shelton, IV et al. Jul 2019 A1
20190200863 Shelton, IV et al. Jul 2019 A1
20190200905 Shelton, IV et al. Jul 2019 A1
20190200906 Shelton, IV et al. Jul 2019 A1
20190200977 Shelton, IV et al. Jul 2019 A1
20190200980 Shelton, IV et al. Jul 2019 A1
20190200981 Harris et al. Jul 2019 A1
20190200984 Shelton, IV et al. Jul 2019 A1
20190200985 Shelton, IV et al. Jul 2019 A1
20190200986 Shelton, IV et al. Jul 2019 A1
20190200987 Shelton, IV et al. Jul 2019 A1
20190200988 Shelton, IV Jul 2019 A1
20190200996 Shelton, IV et al. Jul 2019 A1
20190200997 Shelton, IV et al. Jul 2019 A1
20190200998 Shelton, IV et al. Jul 2019 A1
20190201020 Shelton, IV et al. Jul 2019 A1
20190201021 Shelton, IV et al. Jul 2019 A1
20190201023 Shelton, IV et al. Jul 2019 A1
20190201024 Shelton, IV et al. Jul 2019 A1
20190201025 Shelton, IV et al. Jul 2019 A1
20190201026 Shelton, IV et al. Jul 2019 A1
20190201027 Shelton, IV et al. Jul 2019 A1
20190201028 Shelton, IV et al. Jul 2019 A1
20190201029 Shelton, IV et al. Jul 2019 A1
20190201030 Shelton, IV et al. Jul 2019 A1
20190201033 Yates et al. Jul 2019 A1
20190201034 Shelton, IV et al. Jul 2019 A1
20190201036 Nott et al. Jul 2019 A1
20190201037 Houser et al. Jul 2019 A1
20190201038 Yates et al. Jul 2019 A1
20190201039 Widenhouse et al. Jul 2019 A1
20190201040 Messerly et al. Jul 2019 A1
20190201041 Kimball et al. Jul 2019 A1
20190201042 Nott et al. Jul 2019 A1
20190201043 Shelton, IV et al. Jul 2019 A1
20190201044 Shelton, IV et al. Jul 2019 A1
20190201045 Yates et al. Jul 2019 A1
20190201046 Shelton, IV et al. Jul 2019 A1
20190201047 Yates et al. Jul 2019 A1
20190201073 Nott et al. Jul 2019 A1
20190201074 Yates et al. Jul 2019 A1
20190201075 Shelton, IV et al. Jul 2019 A1
20190201076 Honda et al. Jul 2019 A1
20190201077 Yates et al. Jul 2019 A1
20190201079 Shelton, IV et al. Jul 2019 A1
20190201080 Messerly et al. Jul 2019 A1
20190201081 Shelton, IV et al. Jul 2019 A1
20190201082 Shelton, IV et al. Jul 2019 A1
20190201083 Shelton, IV et al. Jul 2019 A1
20190201084 Shelton, IV et al. Jul 2019 A1
20190201085 Shelton, IV et al. Jul 2019 A1
20190201086 Shelton, IV et al. Jul 2019 A1
20190201087 Shelton, IV et al. Jul 2019 A1
20190201090 Shelton, IV et al. Jul 2019 A1
20190201091 Yates et al. Jul 2019 A1
20190201092 Yates et al. Jul 2019 A1
20190201102 Shelton, IV et al. Jul 2019 A1
20190201104 Shelton, IV et al. Jul 2019 A1
20190201105 Shelton, IV et al. Jul 2019 A1
20190201111 Shelton, IV et al. Jul 2019 A1
20190201112 Wiener et al. Jul 2019 A1
20190201113 Shelton, IV et al. Jul 2019 A1
20190201114 Shelton, IV et al. Jul 2019 A1
20190201115 Shelton, IV et al. Jul 2019 A1
20190201116 Shelton, IV et al. Jul 2019 A1
20190201118 Shelton, IV et al. Jul 2019 A1
20190201119 Harris et al. Jul 2019 A1
20190201120 Shelton, IV et al. Jul 2019 A1
20190201123 Shelton, IV et al. Jul 2019 A1
20190201124 Shelton, IV et al. Jul 2019 A1
20190201125 Shelton, IV et al. Jul 2019 A1
20190201126 Shelton, IV et al. Jul 2019 A1
20190201127 Shelton, IV et al. Jul 2019 A1
20190201128 Yates et al. Jul 2019 A1
20190201129 Shelton, IV et al. Jul 2019 A1
20190201130 Shelton, IV et al. Jul 2019 A1
20190201135 Shelton, IV et al. Jul 2019 A1
20190201136 Shelton, IV et al. Jul 2019 A1
20190201137 Shelton, IV et al. Jul 2019 A1
20190201138 Yates et al. Jul 2019 A1
20190201139 Shelton, IV et al. Jul 2019 A1
20190201140 Yates et al. Jul 2019 A1
20190201141 Shelton, IV et al. Jul 2019 A1
20190201142 Shelton, IV et al. Jul 2019 A1
20190201143 Shelton, IV et al. Jul 2019 A1
20190201144 Shelton, IV et al. Jul 2019 A1
20190201145 Shelton, IV et al. Jul 2019 A1
20190201146 Shelton, IV et al. Jul 2019 A1
20190201159 Shelton, IV et al. Jul 2019 A1
20190201594 Shelton, IV et al. Jul 2019 A1
20190201597 Shelton, IV et al. Jul 2019 A1
20190204201 Shelton, IV et al. Jul 2019 A1
20190205001 Messerly et al. Jul 2019 A1
20190205441 Shelton, IV et al. Jul 2019 A1
20190205566 Shelton, IV et al. Jul 2019 A1
20190205567 Shelton, IV et al. Jul 2019 A1
20190206003 Harris et al. Jul 2019 A1
20190206004 Shelton, IV et al. Jul 2019 A1
20190206050 Yates et al. Jul 2019 A1
20190206216 Shelton, IV et al. Jul 2019 A1
20190206542 Shelton, IV et al. Jul 2019 A1
20190206551 Yates et al. Jul 2019 A1
20190206555 Morgan et al. Jul 2019 A1
20190206556 Shelton, IV et al. Jul 2019 A1
20190206561 Shelton, IV et al. Jul 2019 A1
20190206562 Shelton, IV et al. Jul 2019 A1
20190206563 Shelton, IV et al. Jul 2019 A1
20190206564 Shelton, IV et al. Jul 2019 A1
20190206565 Shelton, IV Jul 2019 A1
20190206569 Shelton, IV et al. Jul 2019 A1
20190206576 Shelton, IV et al. Jul 2019 A1
20190207911 Wiener et al. Jul 2019 A1
20190208641 Yates et al. Jul 2019 A1
20190224434 Silver et al. Jul 2019 A1
20190254759 Azizian Aug 2019 A1
20190261984 Nelson et al. Aug 2019 A1
20190269476 Bowling et al. Sep 2019 A1
20190272917 Couture et al. Sep 2019 A1
20190274662 Rockman et al. Sep 2019 A1
20190274705 Sawhney et al. Sep 2019 A1
20190274706 Nott et al. Sep 2019 A1
20190274707 Sawhney et al. Sep 2019 A1
20190274708 Boudreaux Sep 2019 A1
20190274709 Scoggins Sep 2019 A1
20190274710 Black Sep 2019 A1
20190274711 Scoggins et al. Sep 2019 A1
20190274712 Faller et al. Sep 2019 A1
20190274713 Scoggins et al. Sep 2019 A1
20190274714 Cuti et al. Sep 2019 A1
20190274716 Nott et al. Sep 2019 A1
20190274717 Nott et al. Sep 2019 A1
20190274718 Denzinger et al. Sep 2019 A1
20190274719 Stulen Sep 2019 A1
20190274720 Gee et al. Sep 2019 A1
20190274749 Brady et al. Sep 2019 A1
20190274750 Jayme et al. Sep 2019 A1
20190274752 Denzinger et al. Sep 2019 A1
20190278262 Taylor et al. Sep 2019 A1
20190282311 Nowlin et al. Sep 2019 A1
20190290389 Kopp Sep 2019 A1
20190298340 Shelton, IV et al. Oct 2019 A1
20190298341 Shelton, IV et al. Oct 2019 A1
20190298342 Shelton, IV et al. Oct 2019 A1
20190298343 Shelton, IV et al. Oct 2019 A1
20190298346 Shelton, IV et al. Oct 2019 A1
20190298347 Shelton, IV et al. Oct 2019 A1
20190298350 Shelton, IV et al. Oct 2019 A1
20190298351 Shelton, IV et al. Oct 2019 A1
20190298352 Shelton, IV et al. Oct 2019 A1
20190298353 Shelton, IV et al. Oct 2019 A1
20190298354 Shelton, IV et al. Oct 2019 A1
20190298355 Shelton, IV et al. Oct 2019 A1
20190298356 Shelton, IV et al. Oct 2019 A1
20190298357 Shelton, IV et al. Oct 2019 A1
20190298464 Abbott Oct 2019 A1
20190298481 Rosenberg et al. Oct 2019 A1
20190307520 Peine et al. Oct 2019 A1
20190311802 Kokubo et al. Oct 2019 A1
20190314015 Shelton, IV et al. Oct 2019 A1
20190314016 Huitema et al. Oct 2019 A1
20190314081 Brogna Oct 2019 A1
20190320929 Spencer et al. Oct 2019 A1
20190321117 Itkowitz et al. Oct 2019 A1
20190333626 Mansi et al. Oct 2019 A1
20190343594 Garcia Kilroy et al. Nov 2019 A1
20190365569 Skovgaard et al. Dec 2019 A1
20190374140 Tucker et al. Dec 2019 A1
20190374292 Barral et al. Dec 2019 A1
20190378610 Barral et al. Dec 2019 A1
20200000470 Du et al. Jan 2020 A1
20200000509 Hayashida et al. Jan 2020 A1
20200038120 Ziraknejad et al. Feb 2020 A1
20200046353 Deck et al. Feb 2020 A1
20200054317 Pisarnwongs et al. Feb 2020 A1
20200054320 Harris et al. Feb 2020 A1
20200054321 Harris et al. Feb 2020 A1
20200054322 Harris et al. Feb 2020 A1
20200054323 Harris et al. Feb 2020 A1
20200054326 Harris et al. Feb 2020 A1
20200054327 Harris et al. Feb 2020 A1
20200054328 Harris et al. Feb 2020 A1
20200054330 Harris et al. Feb 2020 A1
20200078070 Henderson et al. Mar 2020 A1
20200078071 Asher Mar 2020 A1
20200078076 Henderson et al. Mar 2020 A1
20200078077 Henderson et al. Mar 2020 A1
20200078078 Henderson et al. Mar 2020 A1
20200078079 Morgan et al. Mar 2020 A1
20200078080 Henderson et al. Mar 2020 A1
20200078081 Jayme et al. Mar 2020 A1
20200078082 Henderson et al. Mar 2020 A1
20200078089 Henderson et al. Mar 2020 A1
20200078096 Barbagli et al. Mar 2020 A1
20200078106 Henderson et al. Mar 2020 A1
20200078110 Henderson et al. Mar 2020 A1
20200078111 Oberkircher et al. Mar 2020 A1
20200078112 Henderson et al. Mar 2020 A1
20200078113 Sawhney et al. Mar 2020 A1
20200078114 Asher et al. Mar 2020 A1
20200078115 Asher et al. Mar 2020 A1
20200078116 Oberkircher et al. Mar 2020 A1
20200078117 Henderson et al. Mar 2020 A1
20200078118 Henderson et al. Mar 2020 A1
20200078119 Henderson et al. Mar 2020 A1
20200078120 Aldridge et al. Mar 2020 A1
20200081585 Petre et al. Mar 2020 A1
20200090808 Carroll et al. Mar 2020 A1
20200100825 Henderson et al. Apr 2020 A1
20200100830 Henderson et al. Apr 2020 A1
20200106220 Henderson et al. Apr 2020 A1
20200162896 Su et al. May 2020 A1
20200168323 Bullington et al. May 2020 A1
20200178760 Kashima et al. Jun 2020 A1
20200178971 Harris et al. Jun 2020 A1
20200193600 Shameli et al. Jun 2020 A1
20200197027 Hershberger et al. Jun 2020 A1
20200203004 Shanbhag et al. Jun 2020 A1
20200214699 Shelton, IV et al. Jul 2020 A1
20200222079 Swaney et al. Jul 2020 A1
20200222149 Valentine et al. Jul 2020 A1
20200226751 Jin et al. Jul 2020 A1
20200230803 Yamashita et al. Jul 2020 A1
20200237372 Park Jul 2020 A1
20200261075 Boudreaux et al. Aug 2020 A1
20200261076 Boudreaux et al. Aug 2020 A1
20200261077 Shelton, IV et al. Aug 2020 A1
20200261078 Bakos et al. Aug 2020 A1
20200261080 Bakos et al. Aug 2020 A1
20200261081 Boudreaux et al. Aug 2020 A1
20200261082 Boudreaux et al. Aug 2020 A1
20200261083 Bakos et al. Aug 2020 A1
20200261084 Bakos et al. Aug 2020 A1
20200261085 Boudreaux et al. Aug 2020 A1
20200261086 Zeiner et al. Aug 2020 A1
20200261087 Timm et al. Aug 2020 A1
20200261088 Harris et al. Aug 2020 A1
20200261089 Shelton, IV et al. Aug 2020 A1
20200275928 Shelton, IV et al. Sep 2020 A1
20200275930 Harris et al. Sep 2020 A1
20200281665 Kopp Sep 2020 A1
20200305924 Carroll Oct 2020 A1
20200305945 Morgan et al. Oct 2020 A1
20200314569 Morgan et al. Oct 2020 A1
20200348662 Cella et al. Nov 2020 A1
20200352664 King et al. Nov 2020 A1
20200388385 De Los Reyes et al. Dec 2020 A1
20200405304 Mozdzierz et al. Dec 2020 A1
20200405375 Shelton, IV et al. Dec 2020 A1
20210000555 Shelton, IV et al. Jan 2021 A1
20210007760 Reisin Jan 2021 A1
20210015568 Liao et al. Jan 2021 A1
20210022731 Eisinger Jan 2021 A1
20210022738 Weir et al. Jan 2021 A1
20210022809 Crawford et al. Jan 2021 A1
20210059674 Shelton, IV et al. Mar 2021 A1
20210068834 Shelton, IV et al. Mar 2021 A1
20210076966 Grantcharov et al. Mar 2021 A1
20210128149 Whitfield et al. May 2021 A1
20210153889 Nott et al. May 2021 A1
20210169516 Houser et al. Jun 2021 A1
20210176179 Shelton, IV Jun 2021 A1
20210177452 Nott et al. Jun 2021 A1
20210177489 Yates et al. Jun 2021 A1
20210186454 Behzadi et al. Jun 2021 A1
20210192914 Shelton, IV et al. Jun 2021 A1
20210201646 Shelton, IV et al. Jul 2021 A1
20210205020 Shelton, IV et al. Jul 2021 A1
20210205021 Shelton, IV et al. Jul 2021 A1
20210205028 Shelton, IV et al. Jul 2021 A1
20210205029 Wiener et al. Jul 2021 A1
20210205030 Shelton, IV et al. Jul 2021 A1
20210205031 Shelton, IV et al. Jul 2021 A1
20210212602 Shelton, IV et al. Jul 2021 A1
20210212694 Shelton, IV et al. Jul 2021 A1
20210212717 Yates et al. Jul 2021 A1
20210212719 Houser et al. Jul 2021 A1
20210212770 Messerly et al. Jul 2021 A1
20210212771 Shelton, IV et al. Jul 2021 A1
20210212774 Shelton, IV et al. Jul 2021 A1
20210212775 Shelton, IV et al. Jul 2021 A1
20210212782 Shelton, IV et al. Jul 2021 A1
20210219976 DiNardo et al. Jul 2021 A1
20210220058 Messerly et al. Jul 2021 A1
20210240852 Shelton, IV et al. Aug 2021 A1
20210241898 Shelton, IV et al. Aug 2021 A1
20210249125 Morgan et al. Aug 2021 A1
20210251487 Shelton, IV et al. Aug 2021 A1
20210259687 Gonzalez et al. Aug 2021 A1
20210259697 Shelton, IV et al. Aug 2021 A1
20210259698 Shelton, IV et al. Aug 2021 A1
20210282780 Shelton, IV et al. Sep 2021 A1
20210282781 Shelton, IV et al. Sep 2021 A1
20210306176 Park et al. Sep 2021 A1
20210315579 Shelton, IV et al. Oct 2021 A1
20210315580 Shelton, IV et al. Oct 2021 A1
20210315581 Shelton, IV et al. Oct 2021 A1
20210315582 Shelton, IV et al. Oct 2021 A1
20210322014 Shelton, IV et al. Oct 2021 A1
20210322015 Shelton, IV et al. Oct 2021 A1
20210322017 Shelton, IV et al. Oct 2021 A1
20210322018 Shelton, IV et al. Oct 2021 A1
20210322019 Shelton, IV et al. Oct 2021 A1
20210322020 Shelton, IV et al. Oct 2021 A1
20210336939 Wiener et al. Oct 2021 A1
20210353287 Shelton, IV et al. Nov 2021 A1
20210353288 Shelton, IV et al. Nov 2021 A1
20210358599 Alvi et al. Nov 2021 A1
20210361284 Shelton, IV et al. Nov 2021 A1
20220000484 Shelton, IV et al. Jan 2022 A1
20220054158 Shelton, IV et al. Feb 2022 A1
20220079591 Bakos et al. Mar 2022 A1
20220157306 Albrecht et al. May 2022 A1
20220160438 Shelton, IV et al. May 2022 A1
20220175374 Shelton, IV et al. Jun 2022 A1
20220230738 Shelton, IV et al. Jul 2022 A1
20220241027 Shelton, IV et al. Aug 2022 A1
20220249097 Shelton, IV et al. Aug 2022 A1
20220323092 Shelton, IV et al. Oct 2022 A1
20220323150 Yates et al. Oct 2022 A1
20220331011 Shelton, IV et al. Oct 2022 A1
20220331018 Parihar et al. Oct 2022 A1
20220346792 Shelton, IV et al. Nov 2022 A1
20220370117 Messerly et al. Nov 2022 A1
20220370126 Shelton, IV et al. Nov 2022 A1
20220374414 Shelton, IV et al. Nov 2022 A1
20220395276 Yates et al. Dec 2022 A1
20220401099 Shelton, IV et al. Dec 2022 A1
20220406452 Shelton, IV Dec 2022 A1
20220409302 Shelton, IV et al. Dec 2022 A1
20230000518 Nott et al. Jan 2023 A1
20230037577 Kimball et al. Feb 2023 A1
20230064821 Shelton, IV Mar 2023 A1
20230092371 Yates et al. Mar 2023 A1
20230098870 Harris et al. Mar 2023 A1
20230116571 Shelton, IV et al. Apr 2023 A1
20230146947 Shelton, IV et al. May 2023 A1
20230165642 Shelton, IV et al. Jun 2023 A1
20230171266 Brunner et al. Jun 2023 A1
20230171304 Shelton, IV et al. Jun 2023 A1
20230187060 Morgan et al. Jun 2023 A1
Foreign Referenced Citations (135)
Number Date Country
2015201140 Mar 2015 AU
2709634 Jul 2009 CA
2795323 May 2014 CA
101617950 Jan 2010 CN
106027664 Oct 2016 CN
106413578 Feb 2017 CN
104490448 Mar 2017 CN
206097107 Apr 2017 CN
106777916 May 2017 CN
107811710 Mar 2018 CN
108652695 Oct 2018 CN
2037167 Jul 1980 DE
3016131 Oct 1981 DE
3824913 Feb 1990 DE
4002843 Apr 1991 DE
102005051367 Apr 2007 DE
102016207666 Nov 2017 DE
0000756 Oct 1981 EP
0408160 Jan 1991 EP
0473987 Mar 1992 EP
0929263 Jul 1999 EP
1214913 Jun 2002 EP
2730209 May 2014 EP
2732772 May 2014 EP
2942023 Nov 2015 EP
3047806 Jul 2016 EP
3056923 Aug 2016 EP
3095399 Nov 2016 EP
3120781 Jan 2017 EP
3135225 Mar 2017 EP
3141181 Mar 2017 EP
2838234 Oct 2003 FR
2509523 Jul 2014 GB
S5191993 Jul 1976 JP
S5373315 Jun 1978 JP
S57185848 Nov 1982 JP
S58207752 Dec 1983 JP
S63315049 Dec 1988 JP
H06142113 May 1994 JP
H06178780 Jun 1994 JP
H06209902 Aug 1994 JP
H07132122 May 1995 JP
H08071072 Mar 1996 JP
H08332169 Dec 1996 JP
H0928663 Feb 1997 JP
H09154850 Jun 1997 JP
H11151247 Jun 1999 JP
H11197159 Jul 1999 JP
H11309156 Nov 1999 JP
2000058355 Feb 2000 JP
2001029353 Feb 2001 JP
2001195686 Jul 2001 JP
2001314411 Nov 2001 JP
2001340350 Dec 2001 JP
2002272758 Sep 2002 JP
2003061975 Mar 2003 JP
2003070921 Mar 2003 JP
2003153918 May 2003 JP
2004118664 Apr 2004 JP
2005111080 Apr 2005 JP
2005309702 Nov 2005 JP
2005348797 Dec 2005 JP
2006077626 Mar 2006 JP
2006117143 May 2006 JP
2006164251 Jun 2006 JP
2006280804 Oct 2006 JP
2006288431 Oct 2006 JP
2007123394 May 2007 JP
2007139822 Jun 2007 JP
2007300312 Nov 2007 JP
2009039515 Feb 2009 JP
2010057642 Mar 2010 JP
2010131265 Jun 2010 JP
2010269067 Dec 2010 JP
2012065698 Apr 2012 JP
2012239669 Dec 2012 JP
2012240158 Dec 2012 JP
2012533346 Dec 2012 JP
2013044303 Mar 2013 JP
2013081282 May 2013 JP
2013135738 Jul 2013 JP
2013144057 Jul 2013 JP
2014155207 Aug 2014 JP
2015085454 May 2015 JP
2016514017 May 2016 JP
2016528010 Sep 2016 JP
2016174836 Oct 2016 JP
2016214553 Dec 2016 JP
2017047022 Mar 2017 JP
2017096359 Jun 2017 JP
2017513561 Jun 2017 JP
2017526510 Sep 2017 JP
2017532168 Nov 2017 JP
20140104587 Aug 2014 KR
101587721 Jan 2016 KR
WO-9734533 Sep 1997 WO
WO-9808449 Mar 1998 WO
WO-0024322 May 2000 WO
WO-0108578 Feb 2001 WO
WO-0112089 Feb 2001 WO
WO-0120892 Mar 2001 WO
WO-03079909 Oct 2003 WO
WO-2006001264 Jan 2006 WO
WO-2007137304 Nov 2007 WO
WO-2008053485 May 2008 WO
WO-2008056618 May 2008 WO
WO-2008069816 Jun 2008 WO
WO-2008076079 Jun 2008 WO
WO-2008147555 Dec 2008 WO
WO-2011112931 Sep 2011 WO
WO-2013143573 Oct 2013 WO
WO-2014031800 Feb 2014 WO
WO-2014071184 May 2014 WO
WO-2014116961 Jul 2014 WO
WO-2014134196 Sep 2014 WO
WO-2015030157 Mar 2015 WO
WO-2015054665 Apr 2015 WO
WO-2015129395 Sep 2015 WO
WO-2016093049 Jun 2016 WO
WO-2016100719 Jun 2016 WO
WO-2016118752 Jul 2016 WO
WO-2016206015 Dec 2016 WO
WO-2017011382 Jan 2017 WO
WO-2017011646 Jan 2017 WO
WO-2017058617 Apr 2017 WO
WO-2017058695 Apr 2017 WO
WO-2017151996 Sep 2017 WO
WO-2017183353 Oct 2017 WO
WO-2017189317 Nov 2017 WO
WO-2017205308 Nov 2017 WO
WO-2017210499 Dec 2017 WO
WO-2017210501 Dec 2017 WO
WO-2018116247 Jun 2018 WO
WO-2018152141 Aug 2018 WO
WO-2018176414 Oct 2018 WO
Non-Patent Literature Citations (61)
Entry
US 10,504,709 B2, 12/2019, Karancsi et al. (withdrawn)
Flores et al., “Large-scale Offloading in the Internet of Things,” 2017 IEEE International Conference on Pervasive Computing and Communications Workshops (PERCOM Workshops), IEEE, pp. 479-484, Mar. 13, 2017.
Kalantarian et al., "Computation Offloading for Real-Time Health-Monitoring Devices," 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), IEEE, pp. 4971-4974, Aug. 16, 2016.
Yuyi Mao et al., “A Survey on Mobile Edge Computing: The Communication Perspective,” IEEE Communications Surveys & Tutorials, pp. 2322-2358, Jun. 13, 2017.
Khazaei et al., “Health Informatics for Neonatal Intensive Care Units: An Analytical Modeling Perspective,” IEEE Journal of Translational Engineering in Health and Medicine, vol. 3, pp. 1-9, Oct. 21, 2015.
Benkmann et al., "Concept of iterative optimization of minimally invasive surgery," 2017 22nd International Conference on Methods and Models in Automation and Robotics (MMAR), IEEE, pp. 443-446, Aug. 28, 2017.
Trautman, Peter, "Breaking the Human-Robot Deadlock: Surpassing Shared Control Performance Limits with Sparse Human-Robot Interaction," Robotics: Science and Systems XIII, pp. 1-10, Jul. 12, 2017.
Yang et al., "A dynamic strategy for packet scheduling and bandwidth allocation based on channel quality in IEEE 802.16e OFDMA system," Journal of Network and Computer Applications, vol. 39, pp. 52-60, May 2, 2013.
Takahashi et al., “Automatic smoke evacuation in laparoscopic surgery: a simplified method for objective evaluation,” Surgical Endoscopy, vol. 27, No. 8, pp. 2980-2987, Feb. 23, 2013.
Miksch et al., “Utilizing temporal data abstraction for data validation and therapy planning for artificially ventilated newborn infants,” Artificial Intelligence in Medicine, vol. 8, No. 6, pp. 543-576 (1996).
Horn et al., "Effective data validation of high-frequency data: Time-point-, time-interval-, and trend-based methods," Computers in Biology and Medicine, New York, NY, vol. 27, No. 5, pp. 389-409 (1997).
Stacey et al., "Temporal abstraction in intelligent clinical data analysis: A survey," Artificial Intelligence in Medicine, vol. 39, No. 1, pp. 1-24 (2006).
Zoccali, Bruno, “A Method for Approximating Component Temperatures at Altitude Conditions Based on CFD Analysis at Sea Level Conditions,” (white paper), www.tdmginc.com, Dec. 6, 2018 (9 pages).
Slocinski et al., “Distance measure for impedance spectra for quantified evaluations,” Lecture Notes on Impedance Spectroscopy, vol. 3, Taylor and Francis Group (Jul. 2012)—Book not Attached.
Engel et al. “A safe robot system for craniofacial surgery”, 2013 IEEE International Conference on Robotics and Automation (ICRA); May 6-10, 2013; Karlsruhe, Germany, vol. 2, Jan. 1, 2001, pp. 2020-2024.
Bonaci et al., “To Make a Robot Secure: An Experimental Analysis of Cyber Security Threats Against Teleoperated Surgical Robots,” May 13, 2015. Retrieved from the Internet: URL:https://arxiv.org/pdf/1504.04339v2.pdf [retrieved on Aug. 24, 2019].
Homa Alemzadeh et al., “Targeted Attacks on Teleoperated Surgical Robots: Dynamic Model-Based Detection and Mitigation,” 2016 46th Annual IEEE/IFIP International Conference on Dependable Systems and Networks (DSN), IEEE, Jun. 28, 2016, pp. 395-406.
Phumzile Malindi, “5. QoS in Telemedicine,” “Telemedicine,” Jun. 20, 2011, IntechOpen, pp. 119-138.
Staub et al., “Contour-based Surgical Instrument Tracking Supported by Kinematic Prediction,” Proceedings of the 2010 3rd IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics, Sep. 1, 2010, pp. 746-752.
Allan et al., “3-D Pose Estimation of Articulated Instruments in Robotic Minimally Invasive Surgery,” IEEE Transactions on Medical Imaging, vol. 37, No. 5, May 1, 2018, pp. 1204-1213.
Kassahun et al., "Surgical Robotics Beyond Enhanced Dexterity Instrumentation: A Survey of the Machine Learning Techniques and their Role in Intelligent and Autonomous Surgical Actions," International Journal of Computer Assisted Radiology and Surgery, vol. 11, No. 4, Oct. 8, 2015, pp. 553-568.
Weede et al., "An Intelligent and Autonomous Endoscopic Guidance System for Minimally Invasive Surgery," 2013 IEEE International Conference on Robotics and Automation (ICRA), May 6-10, 2013, Karlsruhe, Germany, May 1, 2011, pp. 5762-5768.
Altenberg et al., “Genes of Glycolysis are Ubiquitously Overexpressed in 24 Cancer Classes,” Genomics, vol. 84, pp. 1014-1020 (2004).
Harold I. Brandon and V. Leroy Young, Mar. 1997, Surgical Services Management, vol. 3, No. 3. Retrieved from the internet <https://www.surgimedics.com/Research%20Articles/Electrosurgical%20Plume/Characterization%20And%20Removal%20Of%20Electrosurgical%20Smoke.pdf> (Year: 1997).
Marshall Brain, How Microcontrollers Work, 2006, retrieved from the internet <https://web.archive.org/web/20060221235221/http://electronics.howstuffworks.com/microcontroller.htm/printable> (Year: 2006).
CRC Press, “The Measurement, Instrumentation and Sensors Handbook,” 1999, Section VII, Chapter 41, Peter O'Shea, “Phase Measurement,” pp. 1303-1321, ISBN 0-8493-2145-X.
Jiang, "'Sound of Silence': a secure indoor wireless ultrasonic communication system," Article, 2014, pp. 46-50, Snapshots of Doctoral Research at University College Cork, School of Engineering—Electrical & Electronic Engineering, UCC, Cork, Ireland.
Li, et al., “Short-range ultrasonic communications in air using quadrature modulation,” Journal, Oct. 30, 2009, pp. 2060-2072, IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, vol. 56, No. 10, IEEE.
Salamon, "AI Detects Polyps Better Than Colonoscopists," Online Article, Jun. 3, 2018, Medscape Medical News, Digestive Disease Week (DDW) 2018: Presentation 133.
Misawa, et al., "Artificial Intelligence-Assisted Polyp Detection for Colonoscopy: Initial Experience," Article, Jun. 2018, pp. 2027-2029, vol. 154, Issue 8, American Gastroenterology Association.
Dottorato, "Analysis and Design of the Rectangular Microstrip Patch Antennas for TM0n0 operating mode," Article, Oct. 8, 2010, pp. 1-9, Microwave Journal.
Miller, et al., “Impact of Powered and Tissue-Specific Endoscopic Stapling Technology on Clinical and Economic Outcomes of Video-Assisted Thoracic Surgery Lobectomy Procedures: A Retrospective, Observational Study,” Article, Apr. 2018, pp. 707-723, vol. 35 (Issue 5), Advances in Therapy.
Hsiao-Wei Tang, “ARCM”, Video, Sep. 2012, YouTube, 5 screenshots, Retrieved from internet: <https://www.youtube.com/watch?v=UIdQaxb3fRw&feature=youtu.be>.
Giannios, et al., “Visible to near-infrared refractive properties of freshly-excised human-liver tissues: marking hepatic malignancies,” Article, Jun. 14, 2016, pp. 1-10, Scientific Reports 6, Article No. 27910, Nature.
Vander Heiden, et al., “Understanding the Warburg effect: the metabolic requirements of cell proliferation,” Article, May 22, 2009, pp. 1-12, vol. 324, Issue 5930, Science.
Hirayama et al., “Quantitative Metabolome Profiling of Colon and Stomach Cancer Microenvironment by Capillary Electrophoresis Time-of-Flight Mass Spectrometry,” Article, Jun. 2009, pp. 4918-4925, vol. 69, Issue 11, Cancer Research.
Cengiz, et al., “A Tale of Two Compartments: Interstitial Versus Blood Glucose Monitoring,” Article, Jun. 2009, pp. S11-S16, vol. 11, Supplement 1, Diabetes Technology & Therapeutics.
Shen, et al., “An iridium nanoparticles dispersed carbon based thick film electrochemical biosensor and its application for a single use, disposable glucose biosensor,” Article, Feb. 3, 2007, pp. 106-113, vol. 125, Issue 1, Sensors and Actuators B: Chemical, Science Direct.
“ATM-MPLS Network Interworking Version 2.0, af-aic-0178.001” ATM Standard, The ATM Forum Technical Committee, published Aug. 2003.
IEEE Std 802.3-2012 (Revision of IEEE Std 802.3-2008), published Dec. 28, 2012.
IEEE Std No. 177, “Standard Definitions and Methods of Measurement for Piezoelectric Vibrators,” published May 1966, The Institute of Electrical and Electronics Engineers, Inc., New York, N.Y.
Shi et al., An intuitive control console for robotic surgery system, 2014, IEEE, pp. 404-407 (Year: 2014).
Choi et al., A haptic augmented reality surgeon console for a laparoscopic surgery robot system, 2013, IEEE, pp. 355-357 (Year: 2013).
Xie et al., Development of stereo vision and master-slave controller for a compact surgical robot system, 2015, IEEE, pp. 403-407 (Year: 2015).
Sun et al., Innovative effector design for simulation training in robotic surgery, 2010, IEEE, pp. 1735-1759 (Year: 2010).
Anonymous, “Internet of Things Powers Connected Surgical Device Infrastructure Case Study”, Dec. 31, 2016 (Dec. 31, 2016), Retrieved from the Internet: URL:https://www.cognizant.com/services-resources/150110_IoT_connected_surgical_devices.pdf.
Draijer, Matthijs et al., "Review of laser speckle contrast techniques for visualizing tissue perfusion," Lasers in Medical Science, Springer-Verlag, LO, vol. 24, No. 4, Dec. 3, 2008, pp. 639-651.
Roy D Cullum, "Handbook of Engineering Design", ISBN: 9780408005586, Jan. 1, 1988, XP055578597, pp. 10-20, Chapter 6, p. 138, right-hand column, paragraph 3.
“Surgical instrumentation: the true cost of instrument trays and a potential strategy for optimization”; Mhlaba et al.; Sep. 23, 2015 (Year: 2015).
Nabil Simaan et al., "Intelligent Surgical Robots with Situational Awareness: From Good to Great Surgeons", DOI: 10.1115/1.2015-Sep-6, Sep. 2015, pp. 3-6, Retrieved from the Internet: URL:http://memagazineselect.asmedigitalcollection.asme.org/data/journals/meena/936888/me-2015-sep6.pdf XP055530863.
Anonymous: “Titanium Key Chain Tool 1.1, Ultralight Multipurpose Key Chain Tool, Forward Cutting Can Opener—Vargo Titanium,” vargooutdoors.com, Jul. 5, 2014 (Jul. 5, 2014), retrieved from the internet: https://vargooutdoors.com/titanium-key-chain-tool-1-1.html.
Anonymous: “Screwdriver—Wikipedia”, en.wikipedia.org, Jun. 23, 2019, XP055725151, Retrieved from the Internet: URL:https://en.wikipedia.org/w/index.php?title=Screwdriver&oldid=903111203 [retrieved on Mar. 20, 2021].
Nordlinger, Christopher, “The Internet of Things and the Operating Room of the Future,” May 4, 2015, https://medium.com/@chrisnordlinger/the-internet-of-things-and-the-operating-room-of-the-future-8999a143d7b1, retrieved from the internet on Apr. 27, 2021, 9 pages.
Screen captures from YouTube video clip entitled “Four ways to use the Lego Brick Separator Tool,” 2 pages, uploaded on May 29, 2014 by user “Sarah Lewis”. Retrieved from internet: https://www.youtube.com/watch?v=ucKiRD6U1LU (Year: 2014).
Sorrells, P., "Application Note AN680. Passive RFID Basics," retrieved from http://ww1.microchip.com/downloads/en/AppNotes/00680b.pdf on Feb. 26, 2020, Dec. 31, 1998, pp. 1-7.
Lalys, et al., “Automatic knowledge-based recognition of low-level tasks in ophthalmological procedures”, Int J Cars, vol. 8, No. 1, pp. 1-49, Apr. 19, 2012.
Hu, Jinwen, Simulations of adaptive temperature control with self-focused hyperthermia system for tumor treatment, Jan. 9, 2012, Ultrasonics 53, pp. 171-177 (Year: 2012).
Hussain et al., “A survey on resource allocation in high performance distributed computing systems”, Parallel Computing, vol. 39, No. 11, pp. 709-736 (2013).
Anonymous: “Quality of service—Wikipedia”, Dec. 7, 2017, Retrieved from the Internet: URL:https://en.wikipedia.org/w/index.php?title=Quality_of_service&oldid=814298744#Applications [retrieved on Feb. 14, 2023], pp. 1-12.
Anonymous: “Differentiated services—Wikipedia”, Dec. 14, 2017, Retrieved from the Internet: URL:https://en.wikipedia.org/w/index.php?title=Differentiated_services&oldid=815415620 [retrieved on Feb. 14, 2023], pp. 1-7.
Anonymous: “Cloud computing—Wikipedia”, Dec. 19, 2017, Retrieved from the Internet: URL:https://en.wikipedia.org/w/index.php?title=Cloud_computing&oldid=816206558 [retrieved Feb. 14, 2023], pp. 1-21.
Related Publications (1)
Number Date Country
20190201158 A1 Jul 2019 US
Provisional Applications (14)
Number Date Country
62729176 Sep 2018 US
62692748 Jun 2018 US
62692747 Jun 2018 US
62692768 Jun 2018 US
62659900 Apr 2018 US
62650882 Mar 2018 US
62650898 Mar 2018 US
62650877 Mar 2018 US
62650887 Mar 2018 US
62640415 Mar 2018 US
62640417 Mar 2018 US
62611341 Dec 2017 US
62611339 Dec 2017 US
62611340 Dec 2017 US