Characterization of tissue irregularities through the use of mono-chromatic light refractivity

Information

  • Patent Grant
  • Patent Number
    11,864,728
  • Date Filed
    Thursday, March 29, 2018
  • Date Issued
    Tuesday, January 9, 2024
Abstract
A surgical image acquisition system includes multiple illumination sources, each emitting light at a specified wavelength, a light sensor to receive light reflected from a tissue sample illuminated by each of the illumination sources, and a computing system. The computing system may receive data from the light sensor when the tissue sample is illuminated by the illumination sources and calculate structural data related to one or more characteristics of a structure within the tissue. The structural data may be a surface characteristic, such as a surface roughness, or a structure composition, such as a collagen and elastin composition. The computing system may further transmit the structural data to a smart surgical device, such as a smart stapler, a smart RF sealing device, or a smart ultrasonic cutting device. The system may include a controller and computer-executable instructions to accomplish the above.
Description
BACKGROUND

The present disclosure relates to various surgical systems. Surgical procedures are typically performed in surgical operating theaters or rooms in a healthcare facility such as, for example, a hospital. A sterile field is typically created around the patient. The sterile field may include the scrubbed team members, who are properly attired, and all furniture and fixtures in the area. Various surgical devices and systems are utilized in performance of a surgical procedure.


SUMMARY

In some aspects, a surgical image acquisition system may include a plurality of illumination sources in which each illumination source is configured to emit light having a specified central wavelength, a light sensor configured to receive a portion of the light reflected from a tissue sample when illuminated by one or more of the plurality of illumination sources, and a computing system. The computing system is configured to receive data from the light sensor when the tissue sample is illuminated by each of the plurality of illumination sources, calculate structural data related to a characteristic of a structure within the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each of the illumination sources, and transmit the structural data related to the characteristic of the structure to be received by a smart surgical device. The characteristic of the structure may be a surface characteristic or a structure composition.
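
By way of illustration only, the acquisition sequence described above may be sketched as follows. This is a minimal Python sketch under stated assumptions; the names (IlluminationSource, LightSensor, compute_structural_data) and placeholder bodies are invented for illustration and do not describe any particular implementation of the present disclosure.

    # Illustrative sketch: illuminate the tissue with each source in
    # turn, record the reflected light per wavelength, derive structural
    # data, and transmit it to a smart surgical device.
    from dataclasses import dataclass
    from typing import Callable, Dict, List

    @dataclass
    class IlluminationSource:
        central_wavelength_nm: float  # each source emits at a specified central wavelength
        def on(self) -> None: ...
        def off(self) -> None: ...

    class LightSensor:
        def read_frame(self) -> List[List[float]]:
            # Placeholder: would return reflected-light intensity data.
            return [[0.0]]

    def acquire(sources: List[IlluminationSource], sensor: LightSensor) -> Dict[float, list]:
        """Illuminate the tissue sample with each source in turn and keep
        the reflected-light data keyed by central wavelength."""
        frames = {}
        for src in sources:
            src.on()
            frames[src.central_wavelength_nm] = sensor.read_frame()
            src.off()
        return frames

    def compute_structural_data(frames: Dict[float, list]) -> dict:
        # Placeholder for the per-wavelength analysis, e.g., a surface
        # roughness estimate or a collagen/elastin composition estimate.
        return {"surface_roughness": None, "composition": None}

    def run(sources, sensor, transmit: Callable[[dict], None]) -> None:
        # transmit() would deliver the result to a smart surgical device.
        transmit(compute_structural_data(acquire(sources, sensor)))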


In one aspect of the surgical image acquisition system, the plurality of illumination sources may include at least one of a red light illumination source, a green light illumination source, and a blue light illumination source.


In one aspect of the surgical image acquisition system, the plurality of illumination sources may include at least one of an infrared light illumination source and an ultraviolet light illumination source.


In one aspect of the surgical image acquisition system, the computing system, configured to calculate structural data related to a characteristic of a structure within the tissue, may include a computing system configured to calculate structural data related to a composition of a structure within the tissue.


In one aspect of the surgical image acquisition system, the computing system, configured to calculate structural data related to a characteristic of a structure within the tissue, comprises a computing system configured to calculate structural data related to a surface roughness of a structure within the tissue.


In some aspects, a surgical image acquisition system may include a processor and a memory coupled to the processor. The memory may store instructions executable by the processor to control the operation of a plurality of illumination sources illuminating a tissue sample, in which each illumination source is configured to emit light having a specified central wavelength, to receive data from a light sensor when the tissue sample is illuminated by each of the plurality of illumination sources, to calculate structural data related to a characteristic of a structure within the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each of the illumination sources, and to transmit the structural data related to the characteristic of the structure to be received by a smart surgical device. In some aspects, the characteristic of the structure may be a surface characteristic or a structure composition.


In one aspect of the surgical image acquisition system, the instructions executable by the processor to control the operation of a plurality of illumination sources comprise one or more instructions to illuminate the tissue sample sequentially by each of the plurality of illumination sources.


In one aspect of the surgical image acquisition system, the instructions executable by the processor to calculate structural data related to a characteristic of a structure within the tissue sample based on the data received by the light sensor may include one or more instructions to calculate structural data related to a characteristic of a structure within the tissue sample based on a phase shift in the illumination reflected by the tissue sample.
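
By way of illustration, one way such a phase shift could be estimated is by comparing the phases of the emitted and reflected signals at a known modulation frequency. The sketch below uses an FFT for this purpose; the method, signal model, and all parameters are assumptions for illustration and are not the computation specified by the present disclosure.

    # Illustrative phase-shift estimate between emitted and reflected
    # modulated light, via the FFT phase at the modulation frequency.
    import numpy as np

    def phase_shift(emitted: np.ndarray, reflected: np.ndarray, f_mod: float, fs: float) -> float:
        """Phase difference (radians) between two signals at modulation
        frequency f_mod, sampled at fs Hz."""
        n = len(emitted)
        k = int(round(f_mod * n / fs))        # FFT bin of the modulation frequency
        phi_e = np.angle(np.fft.rfft(emitted)[k])
        phi_r = np.angle(np.fft.rfft(reflected)[k])
        return (phi_r - phi_e + np.pi) % (2 * np.pi) - np.pi  # wrap to [-pi, pi)

    # Synthetic check: a 0.7 rad lag is recovered as a -0.7 rad shift.
    fs, f_mod, n = 1_000_000.0, 10_000.0, 1000
    t = np.arange(n) / fs
    emitted = np.sin(2 * np.pi * f_mod * t)
    reflected = 0.3 * np.sin(2 * np.pi * f_mod * t - 0.7)
    print(round(phase_shift(emitted, reflected, f_mod, fs), 3))  # -0.7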


In one aspect of the surgical image acquisition system, the structure composition may include a relative composition of collagen and elastin in a tissue.


In one aspect of the surgical image acquisition system, the structure composition may include an amount of hydration of a tissue.


In some aspects, a surgical image acquisition system may include a control circuit configured to control the operation of a plurality of illumination sources illuminating a tissue sample, in which each illumination source is configured to emit light having a specified central wavelength, to receive data from a light sensor when the tissue sample is illuminated by each of the plurality of illumination sources, to calculate structural data related to a characteristic of a structure within the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each of the illumination sources, and to transmit the structural data related to the characteristic of the structure to be received by a smart surgical device. In some aspects, the characteristic of the structure may be a surface characteristic or a structure composition.


In one aspect of the surgical image acquisition system, the control circuit is configured to transmit the structural data related to the characteristic of the structure to be received by a smart surgical device wherein the smart surgical device is a smart surgical stapler.


In one aspect of the surgical image acquisition system, the control circuit is further configured to transmit data related to an anvil pressure based on the characteristic of the structure to be received by the smart surgical stapler.


In one aspect of the surgical image acquisition system, the control circuit is configured to transmit the structural data related to the characteristic of the structure to be received by a smart surgical device wherein the smart surgical device is a smart surgical RF sealing device.


In one aspect of the surgical image acquisition system, the control circuit is further configured to transmit data related to an amount of RF power based on the characteristic of the structure to be received by the smart RF sealing device.


In one aspect of the surgical image acquisition system, the control circuit is configured to transmit the structural data related to the characteristic of the structure to be received by a smart surgical device wherein the smart surgical device is a smart ultrasound cutting device.


In one aspect of the surgical image acquisition system, the control circuit is further configured to transmit data related to an amount of power provided to an ultrasonic transducer or a driving frequency of the ultrasonic transducer based on the characteristic of the structure to be received by the ultrasound cutting device.
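
By way of illustration, a mapping from structural data to the device settings described above (anvil pressure, RF power, ultrasonic transducer power and drive frequency) might look like the following sketch. The field names, coefficients, and units are invented for illustration; the present disclosure does not specify any particular mapping.

    # Illustrative translation of structural data into smart-device
    # parameters; all thresholds and units are placeholders.
    def device_parameters(structural_data: dict) -> dict:
        collagen = structural_data.get("collagen_fraction", 0.5)  # assumed range 0..1
        hydration = structural_data.get("hydration", 0.5)         # assumed range 0..1
        return {
            # Stiffer (collagen-rich) tissue -> higher anvil pressure.
            "stapler_anvil_pressure": 50.0 + 100.0 * collagen,    # arbitrary units
            # More hydrated tissue -> more RF power to achieve a seal.
            "rf_power_watts": 20.0 + 40.0 * hydration,
            # Ultrasonic transducer power and drive frequency tuned by composition.
            "ultrasonic_power_watts": 10.0 + 30.0 * collagen,
            "ultrasonic_frequency_hz": 55_500.0,  # placeholder; surgical ultrasonic devices commonly run near 55.5 kHz
        }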


In some aspects, a non-transitory computer readable medium may store computer readable instructions which, when executed, cause a machine to control the operation of a plurality of illumination sources illuminating a tissue sample, wherein each illumination source is configured to emit light having a specified central wavelength, to receive data from a light sensor when the tissue sample is illuminated by each of the plurality of illumination sources, to calculate structural data related to a characteristic of a structure within the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each of the illumination sources, and to transmit the structural data related to the characteristic of the structure to be received by a smart surgical device. In some aspects, the characteristic of the structure is a surface characteristic or a structure composition.





FIGURES

The features of various aspects are set forth with particularity in the appended claims. The various aspects, however, both as to organization and methods of operation, together with further objects and advantages thereof, may best be understood by reference to the following description, taken in conjunction with the accompanying drawings as follows.



FIG. 1 is a block diagram of a computer-implemented interactive surgical system, in accordance with at least one aspect of the present disclosure.



FIG. 2 is a surgical system being used to perform a surgical procedure in an operating room, in accordance with at least one aspect of the present disclosure.



FIG. 3 is a surgical hub paired with a visualization system, a robotic system, and an intelligent instrument, in accordance with at least one aspect of the present disclosure.



FIG. 4 is a partial perspective view of a surgical hub enclosure, and of a combo generator module slidably receivable in a drawer of the surgical hub enclosure, in accordance with at least one aspect of the present disclosure.



FIG. 5 is a perspective view of a combo generator module with bipolar, ultrasonic, and monopolar contacts and a smoke evacuation component, in accordance with at least one aspect of the present disclosure.



FIG. 6 illustrates individual power bus attachments for a plurality of lateral docking ports of a lateral modular housing configured to receive a plurality of modules, in accordance with at least one aspect of the present disclosure.



FIG. 7 illustrates a vertical modular housing configured to receive a plurality of modules, in accordance with at least one aspect of the present disclosure.



FIG. 8 illustrates a surgical data network comprising a modular communication hub configured to connect modular devices located in one or more operating theaters of a healthcare facility, or any room in a healthcare facility specially equipped for surgical operations, to the cloud, in accordance with at least one aspect of the present disclosure.



FIG. 9 illustrates a computer-implemented interactive surgical system, in accordance with at least one aspect of the present disclosure.



FIG. 10 illustrates a surgical hub comprising a plurality of modules coupled to the modular control tower, in accordance with at least one aspect of the present disclosure.



FIG. 11 illustrates one aspect of a Universal Serial Bus (USB) network hub device, in accordance with at least one aspect of the present disclosure.



FIG. 12 illustrates a logic diagram of a control system of a surgical instrument or tool, in accordance with at least one aspect of the present disclosure.



FIG. 13 illustrates a control circuit configured to control aspects of the surgical instrument or tool, in accordance with at least one aspect of the present disclosure.



FIG. 14 illustrates a combinational logic circuit configured to control aspects of the surgical instrument or tool, in accordance with at least one aspect of the present disclosure.



FIG. 15 illustrates a sequential logic circuit configured to control aspects of the surgical instrument or tool, in accordance with at least one aspect of the present disclosure.



FIG. 16 illustrates a surgical instrument or tool comprising a plurality of motors which can be activated to perform various functions, in accordance with at least one aspect of the present disclosure.



FIG. 17 is a schematic diagram of a robotic surgical instrument configured to operate a surgical tool described herein, in accordance with at least one aspect of the present disclosure.



FIG. 18 illustrates a block diagram of a surgical instrument programmed to control the distal translation of a displacement member, in accordance with at least one aspect of the present disclosure.



FIG. 19 is a schematic diagram of a surgical instrument configured to control various functions, in accordance with at least one aspect of the present disclosure.



FIG. 20 is a simplified block diagram of a generator configured to provide inductorless tuning, among other benefits, in accordance with at least one aspect of the present disclosure.



FIG. 21 illustrates an example of a generator, which is one form of the generator of FIG. 20, in accordance with at least one aspect of the present disclosure.



FIG. 22A illustrates a visualization system that may be incorporated into a surgical system, in accordance with at least one aspect of the present disclosure.



FIG. 22B illustrates a top plan view of a hand unit of the visualization system of FIG. 22A, in accordance with at least one aspect of the present disclosure.



FIG. 22C illustrates a side plan view of the hand unit depicted in FIG. 22A along with an imaging sensor disposed therein, in accordance with at least one aspect of the present disclosure.



FIG. 22D illustrates a plurality of the imaging sensors depicted in FIG. 22C, in accordance with at least one aspect of the present disclosure.



FIG. 23A illustrates a plurality of laser emitters that may be incorporated in the visualization system of FIG. 22A, in accordance with at least one aspect of the present disclosure.



FIG. 23B illustrates illumination of an image sensor having a Bayer pattern of color filters, in accordance with at least one aspect of the present disclosure.



FIG. 23C illustrates a graphical representation of the operation of a pixel array for a plurality of frames, in accordance with at least one aspect of the present disclosure.



FIG. 23D illustrates a schematic of an example of an operation sequence of chrominance and luminance frames, in accordance with at least one aspect of the present disclosure.



FIG. 23E illustrates an example of sensor and emitter patterns, in accordance with at least one aspect of the present disclosure.



FIG. 23F illustrates a graphical representation of the operation of a pixel array, in accordance with at least one aspect of the present disclosure.



FIG. 24 illustrates a schematic of one example of instrumentation for NIR spectroscopy, in accordance with at least one aspect of the present disclosure.



FIG. 25 illustrates schematically one example of instrumentation for determining NIRS based on Fourier transform infrared imaging, in accordance with at least one aspect of the present disclosure.



FIGS. 26A-C illustrate a change in wavelength of light scattered from moving blood cells, in accordance with at least one aspect of the present disclosure.



FIG. 27 illustrates an aspect of instrumentation that may be used to detect a Doppler shift in laser light scattered from portions of a tissue, in accordance with at least one aspect of the present disclosure.



FIG. 28 illustrates schematically some optical effects on light impinging on a tissue having subsurface structures, in accordance with at least one aspect of the present disclosure.



FIG. 29 illustrates an example of the effects on a Doppler analysis of light impinging on a tissue sample having subsurface structures, in accordance with at least one aspect of the present disclosure.



FIGS. 30A-C illustrate schematically the detection of moving blood cells at a tissue depth based on a laser Doppler analysis at a variety of laser wavelengths, in accordance with at least one aspect of the present disclosure.



FIG. 30D illustrates the effect of illuminating a CMOS imaging sensor with a plurality of light wavelengths over time, in accordance with at least one aspect of the present disclosure.



FIG. 31 illustrates an example of a use of Doppler imaging to detect the presence of subsurface blood vessels, in accordance with at least one aspect of the present disclosure.



FIG. 32 illustrates a method to identify a subsurface blood vessel based on a Doppler shift of blue light due to blood cells flowing therethrough, in accordance with at least one aspect of the present disclosure.



FIG. 33 illustrates schematically localization of a deep subsurface blood vessel, in accordance with at least one aspect of the present disclosure.



FIG. 34 illustrates schematically localization of a shallow subsurface blood vessel, in accordance with at least one aspect of the present disclosure.



FIG. 35 illustrates a composite image comprising a surface image and an image of a subsurface blood vessel, in accordance with at least one aspect of the present disclosure.



FIG. 36 is a flow chart of a method for determining a depth of a surface feature in a piece of tissue, in accordance with at least one aspect of the present disclosure.



FIG. 37 illustrates the effect of the location and characteristics of non-vascular structures on light impinging on a tissue sample, in accordance with at least one aspect of the present disclosure.



FIG. 38 schematically depicts one example of components used in a full field OCT device, in accordance with at least one aspect of the present disclosure.



FIG. 39 illustrates schematically the effect of tissue anomalies on light reflected from a tissue sample, in accordance with at least one aspect of the present disclosure.



FIG. 40 illustrates an image display derived from a combination of tissue visualization modalities, in accordance with at least one aspect of the present disclosure.



FIGS. 41A-C illustrate several aspects of displays that may be provided to a surgeon for a visual identification of a combination of surface and sub-surface structures of a tissue in a surgical site, in accordance with at least one aspect of the present disclosure.



FIG. 42 is a flow chart of a method for providing information related to a characteristic of a tissue to a smart surgical instrument, in accordance with at least one aspect of the present disclosure.



FIGS. 43A and 43B illustrate a multi-pixel light sensor receiving light reflected by a tissue illuminated by sequential exposure to red, green, blue, and infrared laser light sources, and to red, green, blue, and ultraviolet laser light sources, respectively, in accordance with at least one aspect of the present disclosure.



FIGS. 44A and 44B illustrate the distal end of an elongated camera probe having a single light sensor and two light sensors, respectively, in accordance with at least one aspect of the present disclosure.



FIG. 44C illustrates a perspective view of an example of a monolithic sensor having a plurality of pixel arrays, in accordance with at least one aspect of the present disclosure.



FIG. 45 illustrates one example of a pair of fields of view available to two image sensors of an elongated camera probe, in accordance with at least one aspect of the present disclosure.



FIGS. 46A-D illustrate additional examples of a pair of fields of view available to two image sensors of an elongated camera probe, in accordance with at least one aspect of the present disclosure.



FIGS. 47A-C illustrate an example of the use of an imaging system incorporating the features disclosed in FIG. 46D, in accordance with at least one aspect of the present disclosure.



FIGS. 48A and 48B depict another example of the use of a dual imaging system, in accordance with at least one aspect of the present disclosure.



FIGS. 49A-C illustrate examples of a sequence of surgical steps which may benefit from the use of multi-image analysis at the surgical site, in accordance with at least one aspect of the present disclosure.



FIG. 50 is a timeline depicting situational awareness of a surgical hub, in accordance with at least one aspect of the present disclosure.





DESCRIPTION

Applicant of the present application owns the following U.S. Provisional Patent Applications, filed on Mar. 28, 2018, each of which is herein incorporated by reference in its entirety:

    • U.S. Provisional Patent Application Ser. No. 62/649,302, titled INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATION CAPABILITIES;
    • U.S. Provisional Patent Application Ser. No. 62/649,294, titled DATA STRIPPING METHOD TO INTERROGATE PATIENT RECORDS AND CREATE ANONYMIZED RECORD;
    • U.S. Provisional Patent Application Ser. No. 62/649,300, titled SURGICAL HUB SITUATIONAL AWARENESS;
    • U.S. Provisional Patent Application Ser. No. 62/649,309, titled SURGICAL HUB SPATIAL AWARENESS TO DETERMINE DEVICES IN OPERATING THEATER;
    • U.S. Provisional Patent Application Ser. No. 62/649,310, titled COMPUTER IMPLEMENTED INTERACTIVE SURGICAL SYSTEMS;
    • U.S. Provisional Patent Application Ser. No. 62/649,291, titled USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT;
    • U.S. Provisional Patent Application Ser. No. 62/649,296, titled ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL DEVICES;
    • U.S. Provisional Patent Application Ser. No. 62/649,333, titled CLOUD-BASED MEDICAL ANALYTICS FOR CUSTOMIZATION AND RECOMMENDATIONS TO A USER;
    • U.S. Provisional Patent Application Ser. No. 62/649,327, titled CLOUD-BASED MEDICAL ANALYTICS FOR SECURITY AND AUTHENTICATION TRENDS AND REACTIVE MEASURES;
    • U.S. Provisional Patent Application Ser. No. 62/649,315, titled DATA HANDLING AND PRIORITIZATION IN A CLOUD ANALYTICS NETWORK;
    • U.S. Provisional Patent Application Ser. No. 62/649,313, titled CLOUD INTERFACE FOR COUPLED SURGICAL DEVICES;
    • U.S. Provisional Patent Application Ser. No. 62/649,320, titled DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS;
    • U.S. Provisional Patent Application Ser. No. 62/649,307, titled AUTOMATIC TOOL ADJUSTMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; and
    • U.S. Provisional Patent Application Ser. No. 62/649,323, titled SENSING ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS.


Applicant of the present application owns the following U.S. Patent Applications, filed on Mar. 29, 2018, each of which is herein incorporated by reference in its entirety:

    • U.S. patent application Ser. No. 15/940,641, titled INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATION CAPABILITIES; now U.S. Pat. No. 10,944,728;
    • U.S. patent application Ser. No. 15/940,648, titled INTERACTIVE SURGICAL SYSTEMS WITH CONDITION HANDLING OF DEVICES AND DATA CAPABILITIES; now U.S. Pat. No. 11,069,012;
    • U.S. patent application Ser. No. 15/940,656, titled SURGICAL HUB COORDINATION OF CONTROL AND COMMUNICATION OF OPERATING ROOM DEVICES; now U.S. Pat. No. 11,166,772;
    • U.S. patent application Ser. No. 15/940,666, titled SPATIAL AWARENESS OF SURGICAL HUBS IN OPERATING ROOMS; now U.S. Pat. No. 11,678,881;
    • U.S. patent application Ser. No. 15/940,670, titled COOPERATIVE UTILIZATION OF DATA DERIVED FROM SECONDARY SOURCES BY INTELLIGENT SURGICAL HUBS; now U.S. Pat. No. 11,266,468;
    • U.S. patent application Ser. No. 15/940,677, titled SURGICAL HUB CONTROL ARRANGEMENTS; now U.S. Pat. No. 10,987,178;
    • U.S. patent application Ser. No. 15/940,632, titled DATA STRIPPING METHOD TO INTERROGATE PATIENT RECORDS AND CREATE ANONYMIZED RECORD; now U.S. Pat. No. 11,132,462;
    • U.S. patent application Ser. No. 15/940,640, titled COMMUNICATION HUB AND STORAGE DEVICE FOR STORING PARAMETERS AND STATUS OF A SURGICAL DEVICE TO BE SHARED WITH CLOUD BASED ANALYTICS SYSTEMS; now U.S. Pat. No. 11,202,570;
    • U.S. patent application Ser. No. 15/940,645, titled SELF DESCRIBING DATA PACKETS GENERATED AT AN ISSUING INSTRUMENT; now U.S. Pat. No. 10,892,899;
    • U.S. patent application Ser. No. 15/940,649, titled DATA PAIRING TO INTERCONNECT A DEVICE MEASURED PARAMETER WITH AN OUTCOME; now U.S. Patent Application Publication No. 2019/0205567;
    • U.S. patent application Ser. No. 15/940,654, titled SURGICAL HUB SITUATIONAL AWARENESS; now U.S. Patent Application Publication No. 2019/0201140;
    • U.S. patent application Ser. No. 15/940,663, titled SURGICAL SYSTEM DISTRIBUTED PROCESSING; now U.S. Pat. No. 11,419,630;
    • U.S. patent application Ser. No. 15/940,668, titled AGGREGATION AND REPORTING OF SURGICAL HUB DATA; now U.S. Patent Application Publication No. 2019/0201115;
    • U.S. patent application Ser. No. 15/940,671, titled SURGICAL HUB SPATIAL AWARENESS TO DETERMINE DEVICES IN OPERATING THEATER; now U.S. Patent Application Publication No. 2019/0201104;
    • U.S. patent application Ser. No. 15/940,686, titled DISPLAY OF ALIGNMENT OF STAPLE CARTRIDGE TO PRIOR LINEAR STAPLE LINE; now U.S. Pat. No. 11,026,751;
    • U.S. patent application Ser. No. 15/940,700, titled STERILE FIELD INTERACTIVE CONTROL DISPLAYS; now U.S. Pat. No. 11,672,605;
    • U.S. patent application Ser. No. 15/940,629, titled COMPUTER IMPLEMENTED INTERACTIVE SURGICAL SYSTEMS; now U.S. Patent Application Publication No. 2019/0201112;
    • U.S. patent application Ser. No. 15/940,704; titled USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT; now U.S. Pat. No. 11,100,631; and
    • U.S. patent application Ser. No. 15/940,742, titled DUAL CMOS ARRAY IMAGING; now U.S. Patent Application Publication No. 2019/0200906.


Applicant of the present application owns the following U.S. Patent Applications, filed on Mar. 29, 2018, each of which is herein incorporated by reference in its entirety:

    • U.S. patent application Ser. No. 15/940,636, titled ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL DEVICES; now U.S. Pat. No. 11,410,259;
    • U.S. patent application Ser. No. 15/940,653, titled ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL HUBS; now U.S. Pat. No. 11,076,921;
    • U.S. patent application Ser. No. 15/940,660, titled CLOUD-BASED MEDICAL ANALYTICS FOR CUSTOMIZATION AND RECOMMENDATIONS TO A USER; now U.S. Patent Application Publication No. 2019/0206555;
    • U.S. patent application Ser. No. 15/940,679, titled CLOUD-BASED MEDICAL ANALYTICS FOR LINKING OF LOCAL USAGE TRENDS WITH THE RESOURCE ACQUISITION BEHAVIORS OF LARGER DATA SET; now U.S. Pat. No. 10,932,872;
    • U.S. patent application Ser. No. 15/940,694, titled CLOUD-BASED MEDICAL ANALYTICS FOR MEDICAL FACILITY SEGMENTED INDIVIDUALIZATION OF INSTRUMENT FUNCTION; now U.S. Pat. No. 10,966,791;
    • U.S. patent application Ser. No. 15/940,634, titled CLOUD-BASED MEDICAL ANALYTICS FOR SECURITY AND AUTHENTICATION TRENDS AND REACTIVE MEASURES; now U.S. Pat. No. 11,179,208;
    • U.S. patent application Ser. No. 15/940,706, titled DATA HANDLING AND PRIORITIZATION IN A CLOUD ANALYTICS NETWORK; now U.S. Patent Application Publication No. 2019/0206561; and
    • U.S. patent application Ser. No. 15/940,675, titled CLOUD INTERFACE FOR COUPLED SURGICAL DEVICES; now U.S. Pat. No. 10,849,697.


Applicant of the present application owns the following U.S. Patent Applications, filed on Mar. 29, 2018, each of which is herein incorporated by reference in its entirety:

    • U.S. patent application Ser. No. 15/940,627, titled DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; now U.S. Pat. No. 11,013,563;
    • U.S. patent application Ser. No. 15/940,637, titled COMMUNICATION ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; now U.S. Patent Application Publication No. 2019/0201139;
    • U.S. patent application Ser. No. 15/940,642, titled CONTROLS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; now U.S. Patent Application Publication No. 2019/0201113;
    • U.S. patent application Ser. No. 15/940,676, titled AUTOMATIC TOOL ADJUSTMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; now U.S. Patent Application Publication No. 2019/0201142;
    • U.S. patent application Ser. No. 15/940,680, titled CONTROLLERS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; now U.S. Pat. No. 11,213,359;
    • U.S. patent application Ser. No. 15/940,683, titled COOPERATIVE SURGICAL ACTIONS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; now U.S. Pat. No. 11,058,498;
    • U.S. patent application Ser. No. 15/940,690, titled DISPLAY ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; now U.S. Patent Application Publication No. 2019/0201118; and
    • U.S. patent application Ser. No. 15/940,711, titled SENSING ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; now U.S. Pat. No. 11,432,885.


Before explaining various aspects of surgical devices and generators in detail, it should be noted that the illustrative examples are not limited in application or use to the details of construction and arrangement of parts illustrated in the accompanying drawings and description. The illustrative examples may be implemented or incorporated in other aspects, variations and modifications, and may be practiced or carried out in various ways. Further, unless otherwise indicated, the terms and expressions employed herein have been chosen for the purpose of describing the illustrative examples for the convenience of the reader and are not for the purpose of limitation thereof. Also, it will be appreciated that one or more of the following-described aspects, expressions of aspects, and/or examples, can be combined with any one or more of the other following-described aspects, expressions of aspects and/or examples.


Referring to FIG. 1, a computer-implemented interactive surgical system 100 includes one or more surgical systems 102 and a cloud-based system (e.g., the cloud 104 that may include a remote server 113 coupled to a storage device 105). Each surgical system 102 includes at least one surgical hub 106 in communication with the cloud 104 that may include a remote server 113. In one example, as illustrated in FIG. 1, the surgical system 102 includes a visualization system 108, a robotic system 110, and a handheld intelligent surgical instrument 112, which are configured to communicate with one another and/or the hub 106. In some aspects, a surgical system 102 may include an M number of hubs 106, an N number of visualization systems 108, an O number of robotic systems 110, and a P number of handheld intelligent surgical instruments 112, where M, N, O, and P are integers greater than or equal to one.



FIG. 2 depicts an example of a surgical system 102 being used to perform a surgical procedure on a patient who is lying down on an operating table 114 in a surgical operating room 116. A robotic system 110 is used in the surgical procedure as a part of the surgical system 102. The robotic system 110 includes a surgeon's console 118, a patient side cart 120 (surgical robot), and a surgical robotic hub 122. The patient side cart 120 can manipulate at least one removably coupled surgical tool 117 through a minimally invasive incision in the body of the patient while the surgeon views the surgical site through the surgeon's console 118. An image of the surgical site can be obtained by a medical imaging device 124, which can be manipulated by the patient side cart 120 to orient the imaging device 124. The robotic hub 122 can be used to process the images of the surgical site for subsequent display to the surgeon through the surgeon's console 118.


Other types of robotic systems can be readily adapted for use with the surgical system 102. Various examples of robotic systems and surgical tools that are suitable for use with the present disclosure are described in U.S. Provisional Patent Application Ser. No. 62/611,339, titled ROBOT ASSISTED SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety.


Various examples of cloud-based analytics that are performed by the cloud 104, and are suitable for use with the present disclosure, are described in U.S. Provisional Patent Application Ser. No. 62/611,340, titled CLOUD-BASED MEDICAL ANALYTICS, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety.


In various aspects, the imaging device 124 includes at least one image sensor and one or more optical components. Suitable image sensors include, but are not limited to, Charge-Coupled Device (CCD) sensors and Complementary Metal-Oxide Semiconductor (CMOS) sensors.


The optical components of the imaging device 124 may include one or more illumination sources and/or one or more lenses. The one or more illumination sources may be directed to illuminate portions of the surgical field. The one or more image sensors may receive light reflected or refracted from the surgical field, including light reflected or refracted from tissue and/or surgical instruments.


The one or more illumination sources may be configured to radiate electromagnetic energy in the visible spectrum as well as the invisible spectrum. The visible spectrum, sometimes referred to as the optical spectrum or luminous spectrum, is that portion of the electromagnetic spectrum that is visible to (i.e., can be detected by) the human eye and may be referred to as visible light or simply light. A typical human eye will respond to wavelengths in air that are from about 380 nm to about 750 nm.


The invisible spectrum (i.e., the non-luminous spectrum) is that portion of the electromagnetic spectrum that lies below and above the visible spectrum (i.e., wavelengths below about 380 nm and above about 750 nm). The invisible spectrum is not detectable by the human eye. Wavelengths greater than about 750 nm are longer than the red visible spectrum, and they become invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths less than about 380 nm are shorter than the violet spectrum, and they become invisible ultraviolet, x-ray, and gamma ray electromagnetic radiation.
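
By way of illustration, the spectrum boundaries described above can be expressed as a simple classification. The helper below is a sketch that assumes only the approximate 380 nm and 750 nm limits given in the preceding paragraphs.

    # Simple helper reflecting the boundaries described above: visible
    # light spans roughly 380 nm to 750 nm in air.
    def classify_wavelength(nm: float) -> str:
        if nm < 380.0:
            return "invisible: ultraviolet (or shorter: x-ray, gamma ray)"
        if nm <= 750.0:
            return "visible"
        return "invisible: infrared (or longer: microwave, radio)"

    assert classify_wavelength(532.0) == "visible"  # green laser light
    assert classify_wavelength(1064.0).startswith("invisible: infrared")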


In various aspects, the imaging device 124 is configured for use in a minimally invasive procedure. Examples of imaging devices suitable for use with the present disclosure include, but are not limited to, an arthroscope, angioscope, bronchoscope, choledochoscope, colonoscope, cystoscope, duodenoscope, enteroscope, esophagogastro-duodenoscope (gastroscope), endoscope, laryngoscope, nasopharyngoscope, nephroscope, sigmoidoscope, thoracoscope, and ureteroscope.


In one aspect, the imaging device employs multi-spectrum monitoring to discriminate topography and underlying structures. A multi-spectral image is one that captures image data within specific wavelength ranges across the electromagnetic spectrum. The wavelengths may be separated by filters or by the use of instruments that are sensitive to particular wavelengths, including light from frequencies beyond the visible light range, e.g., IR and ultraviolet. Spectral imaging can allow extraction of additional information the human eye fails to capture with its receptors for red, green, and blue. The use of multi-spectral imaging is described in greater detail under the heading “Advanced Imaging Acquisition Module” in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety. Multi-spectrum monitoring can be a useful tool in relocating a surgical field after a surgical task is completed to perform one or more of the previously described tests on the treated tissue.
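
By way of illustration, one common way image data from two wavelength bands can be combined to discriminate structures is a normalized-difference index, sketched below. The choice of index and bands is an assumption for illustration and is not taken from the present disclosure.

    # Illustrative per-pixel index contrasting reflectance in two
    # wavelength bands of a multi-spectral image stack.
    import numpy as np

    def normalized_difference(band_a: np.ndarray, band_b: np.ndarray) -> np.ndarray:
        """Per-pixel (a - b) / (a + b); values near +1 or -1 mark pixels
        whose reflectance differs strongly between the two bands."""
        a = band_a.astype(float)
        b = band_b.astype(float)
        return (a - b) / np.clip(a + b, 1e-9, None)  # clip avoids divide-by-zero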


It is axiomatic that strict sterilization of the operating room and surgical equipment is required during any surgery. The strict hygiene and sterilization conditions required in a “surgical theater,” i.e., an operating or treatment room, necessitate the highest possible sterility of all medical devices and equipment. Part of that sterilization process is the need to sterilize anything that comes in contact with the patient or penetrates the sterile field, including the imaging device 124 and its attachments and components. It will be appreciated that the sterile field may be considered a specified area, such as within a tray or on a sterile towel, that is considered free of microorganisms, or the sterile field may be considered an area, immediately around a patient, who has been prepared for a surgical procedure. The sterile field may include the scrubbed team members, who are properly attired, and all furniture and fixtures in the area.


In various aspects, the visualization system 108 includes one or more imaging sensors, one or more image processing units, one or more storage arrays, and one or more displays that are strategically arranged with respect to the sterile field, as illustrated in FIG. 2. In one aspect, the visualization system 108 includes an interface for HL7, PACS, and EMR. Various components of the visualization system 108 are described under the heading “Advanced Imaging Acquisition Module” in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety.


As illustrated in FIG. 2, a primary display 119 is positioned in the sterile field to be visible to an operator at the operating table 114. In addition, a visualization tower 111 is positioned outside the sterile field. The visualization tower 111 includes a first non-sterile display 107 and a second non-sterile display 109, which face away from each other. The visualization system 108, guided by the hub 106, is configured to utilize the displays 107, 109, and 119 to coordinate information flow to operators inside and outside the sterile field. For example, the hub 106 may cause the visualization system 108 to display a snap-shot of a surgical site, as recorded by an imaging device 124, on a non-sterile display 107 or 109, while maintaining a live feed of the surgical site on the primary display 119. The snap-shot on the non-sterile display 107 or 109 can permit a non-sterile operator to perform a diagnostic step relevant to the surgical procedure, for example.


In one aspect, the hub 106 is also configured to route a diagnostic input or feedback entered by a non-sterile operator at the visualization tower 111 to the primary display 119 within the sterile field, where it can be viewed by a sterile operator at the operating table. In one example, the input can be in the form of a modification to the snap-shot displayed on the non-sterile display 107 or 109, which can be routed to the primary display 119 by the hub 106.


Referring to FIG. 2, a surgical instrument 112 is being used in the surgical procedure as part of the surgical system 102. The hub 106 is also configured to coordinate information flow to a display of the surgical instrument 112, as described, for example, in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety. A diagnostic input or feedback entered by a non-sterile operator at the visualization tower 111 can be routed by the hub 106 to the surgical instrument display 115 within the sterile field, where it can be viewed by the operator of the surgical instrument 112. Example surgical instruments that are suitable for use with the surgical system 102 are described under the heading “Surgical Instrument Hardware” and in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety, for example.


Referring now to FIG. 3, a hub 106 is depicted in communication with a visualization system 108, a robotic system 110, and a handheld intelligent surgical instrument 112. The hub 106 includes a hub display 135, an imaging module 138, a generator module 140, a communication module 130, a processor module 132, and a storage array 134. In certain aspects, as illustrated in FIG. 3, the hub 106 further includes a smoke evacuation module 126 and/or a suction/irrigation module 128.


During a surgical procedure, energy application to tissue, for sealing and/or cutting, is generally associated with smoke evacuation, suction of excess fluid, and/or irrigation of the tissue. Fluid, power, and/or data lines from different sources are often entangled during the surgical procedure. Valuable time can be lost addressing this issue during a surgical procedure. Detangling the lines may necessitate disconnecting the lines from their respective modules, which may require resetting the modules. The hub modular enclosure 136 offers a unified environment for managing the power, data, and fluid lines, which reduces the frequency of entanglement between such lines.


Aspects of the present disclosure present a surgical hub for use in a surgical procedure that involves energy application to tissue at a surgical site. The surgical hub includes a hub enclosure and a combo generator module 140 slidably receivable in a docking station of the hub enclosure. The docking station includes data and power contacts. The combo generator module 140 includes two or more of an ultrasonic energy generator component 141, a bipolar RF energy generator component 144, and a monopolar RF energy generator component 142 that are housed in a single unit. In one aspect, the combo generator module 140 also includes at least one energy delivery cable for connecting the combo generator module 140 to a surgical instrument, at least one smoke evacuation component configured to evacuate smoke, fluid, and/or particulates generated by the application of therapeutic energy to the tissue, and a fluid line extending from the remote surgical site to the smoke evacuation component.


In one aspect, the fluid line is a first fluid line and a second fluid line extends from the remote surgical site to a suction and irrigation module slidably received in the hub enclosure. In one aspect, the hub enclosure comprises a fluid interface.


Certain surgical procedures may require the application of more than one energy type to the tissue. One energy type may be more beneficial for cutting the tissue, while another different energy type may be more beneficial for sealing the tissue. For example, a bipolar generator 144 can be used to seal the tissue while an ultrasonic generator 141 can be used to cut the sealed tissue. Aspects of the present disclosure present a solution where a hub modular enclosure 136 is configured to accommodate different generators, and facilitate an interactive communication therebetween. One of the advantages of the hub modular enclosure 136 is enabling the quick removal and/or replacement of various modules.


Aspects of the present disclosure present a modular surgical enclosure for use in a surgical procedure that involves energy application to tissue. The modular surgical enclosure includes a first energy-generator module, configured to generate a first energy for application to the tissue, and a first docking station comprising a first docking port that includes first data and power contacts, wherein the first energy-generator module is slidably movable into an electrical engagement with the first power and data contacts and wherein the first energy-generator module is slidably movable out of the electrical engagement with the first power and data contacts.


Further to the above, the modular surgical enclosure also includes a second energy-generator module configured to generate a second energy, different than the first energy, for application to the tissue, and a second docking station comprising a second docking port that includes second data and power contacts, wherein the second energy-generator module is slidably movable into an electrical engagement with the second power and data contacts, and wherein the second energy-generator module is slidably movable out of the electrical engagement with the second power and data contacts.


In addition, the modular surgical enclosure also includes a communication bus between the first docking port and the second docking port, configured to facilitate communication between the first energy-generator module and the second energy-generator module.


Referring to FIGS. 3-7, aspects of the present disclosure are presented for a hub modular enclosure 136 that allows the modular integration of a generator module 140, a smoke evacuation module 126, and a suction/irrigation module 128. The hub modular enclosure 136 further facilitates interactive communication between the modules 140, 126, 128. As illustrated in FIG. 5, the generator module 140 can be a generator module with integrated monopolar 142, bipolar 144, and ultrasonic 141 components supported in a single housing unit 139 slidably insertable into the hub modular enclosure 136. As illustrated in FIG. 5, the generator module 140 can be configured to connect to a monopolar device 146, a bipolar device 147, and an ultrasonic device 148. Alternatively, the generator module 140 may comprise a series of monopolar 142, bipolar 144, and/or ultrasonic 141 generator modules that interact through the hub modular enclosure 136. The hub modular enclosure 136 can be configured to facilitate the insertion of multiple generators and interactive communication between the generators docked into the hub modular enclosure 136 so that the generators would act as a single generator.


In one aspect, the hub modular enclosure 136 comprises a modular power and communication backplane 149 with external and wireless communication headers to enable the removable attachment of the modules 140, 126, 128 and interactive communication therebetween.


In one aspect, the hub modular enclosure 136 includes docking stations 151, herein also referred to as drawers, which are configured to slidably receive the modules 140, 126, 128. FIG. 4 illustrates a partial perspective view of a surgical hub enclosure 136, and a combo generator module 145 slidably receivable in a docking station 151 of the surgical hub enclosure 136. A docking port 152 with power and data contacts on a rear side of the combo generator module 145 is configured to engage a corresponding docking port 150 with power and data contacts of a corresponding docking station 151 of the hub modular enclosure 136 as the combo generator module 145 is slid into position within the corresponding docking station 151 of the hub modular enclosure 136. In one aspect, the combo generator module 145 includes a bipolar, ultrasonic, and monopolar module and a smoke evacuation module integrated together into a single housing unit 139, as illustrated in FIG. 5.


In various aspects, the smoke evacuation module 126 includes a fluid line 154 that conveys captured/collected smoke and/or fluid away from a surgical site to, for example, the smoke evacuation module 126. Vacuum suction originating from the smoke evacuation module 126 can draw the smoke into an opening of a utility conduit at the surgical site. The utility conduit, coupled to the fluid line, can be in the form of a flexible tube terminating at the smoke evacuation module 126. The utility conduit and the fluid line define a fluid path extending toward the smoke evacuation module 126 that is received in the hub enclosure 136.


In various aspects, the suction/irrigation module 128 is coupled to a surgical tool comprising an aspiration fluid line and a suction fluid line. In one example, the aspiration and suction fluid lines are in the form of flexible tubes extending from the surgical site toward the suction/irrigation module 128. One or more drive systems can be configured to cause irrigation and aspiration of fluids to and from the surgical site.


In one aspect, the surgical tool includes a shaft having an end effector at a distal end thereof, at least one energy delivery implement associated with the end effector, an aspiration tube, and an irrigation tube. The aspiration tube can have an inlet port at a distal end thereof and the aspiration tube extends through the shaft. Similarly, an irrigation tube can extend through the shaft and can have an inlet port in proximity to the energy delivery implement. The energy delivery implement is configured to deliver ultrasonic and/or RF energy to the surgical site and is coupled to the generator module 140 by a cable extending initially through the shaft.


The irrigation tube can be in fluid communication with a fluid source, and the aspiration tube can be in fluid communication with a vacuum source. The fluid source and/or the vacuum source can be housed in the suction/irrigation module 128. In one example, the fluid source and/or the vacuum source can be housed in the hub enclosure 136 separately from the suction/irrigation module 128. In such example, a fluid interface can be configured to connect the suction/irrigation module 128 to the fluid source and/or the vacuum source.


In one aspect, the modules 140, 126, 128 and/or their corresponding docking stations on the hub modular enclosure 136 may include alignment features that are configured to align the docking ports of the modules into engagement with their counterparts in the docking stations of the hub modular enclosure 136. For example, as illustrated in FIG. 4, the combo generator module 145 includes side brackets 155 that are configured to slidably engage with corresponding brackets 156 of the corresponding docking station 151 of the hub modular enclosure 136. The brackets cooperate to guide the docking port contacts of the combo generator module 145 into an electrical engagement with the docking port contacts of the hub modular enclosure 136.


In some aspects, the drawers 151 of the hub modular enclosure 136 are the same size, or substantially the same size, and the modules are adjusted in size to be received in the drawers 151. For example, the side brackets 155 and/or 156 can be larger or smaller depending on the size of the module. In other aspects, the drawers 151 are different in size and are each designed to accommodate a particular module.


Furthermore, the contacts of a particular module can be keyed for engagement with the contacts of a particular drawer to avoid inserting a module into a drawer with mismatching contacts.
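
By way of illustration, the keying concept can be sketched as a simple compatibility check; the key values and drawer identifiers below are invented for illustration only.

    # Illustrative keying: a module docks only where its contact key
    # matches the drawer's key, preventing mismatched insertion.
    DRAWER_KEYS = {"drawer_1": "combo_generator", "drawer_2": "smoke_evacuation"}

    def can_dock(module_key: str, drawer_id: str) -> bool:
        return DRAWER_KEYS.get(drawer_id) == module_key

    assert can_dock("combo_generator", "drawer_1")
    assert not can_dock("combo_generator", "drawer_2")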


As illustrated in FIG. 4, the docking port 150 of one drawer 151 can be coupled to the docking port 150 of another drawer 151 through a communications link 157 to facilitate an interactive communication between the modules housed in the hub modular enclosure 136. The docking ports 150 of the hub modular enclosure 136 may alternatively, or additionally, facilitate a wireless interactive communication between the modules housed in the hub modular enclosure 136. Any suitable wireless communication can be employed, such as, for example, Air Titan-Bluetooth.



FIG. 6 illustrates individual power bus attachments for a plurality of lateral docking ports of a lateral modular housing 160 configured to receive a plurality of modules of a surgical hub 206. The lateral modular housing 160 is configured to laterally receive and interconnect the modules 161. The modules 161 are slidably inserted into docking stations 162 of lateral modular housing 160, which includes a backplane for interconnecting the modules 161. As illustrated in FIG. 6, the modules 161 are arranged laterally in the lateral modular housing 160. Alternatively, the modules 161 may be arranged vertically in a lateral modular housing.



FIG. 7 illustrates a vertical modular housing 164 configured to receive a plurality of modules 165 of the surgical hub 106. The modules 165 are slidably inserted into docking stations, or drawers, 167 of vertical modular housing 164, which includes a backplane for interconnecting the modules 165. Although the drawers 167 of the vertical modular housing 164 are arranged vertically, in certain instances, a vertical modular housing 164 may include drawers that are arranged laterally. Furthermore, the modules 165 may interact with one another through the docking ports of the vertical modular housing 164. In the example of FIG. 7, a display 177 is provided for displaying data relevant to the operation of the modules 165. In addition, the vertical modular housing 164 includes a master module 178 housing a plurality of sub-modules that are slidably received in the master module 178.


In various aspects, the imaging module 138 comprises an integrated video processor and a modular light source and is adapted for use with various imaging devices. In one aspect, the imaging device comprises a modular housing that can be assembled with a light source module and a camera module. The housing can be a disposable housing. In at least one example, the disposable housing is removably coupled to a reusable controller, a light source module, and a camera module. The light source module and/or the camera module can be selectively chosen depending on the type of surgical procedure. In one aspect, the camera module comprises a CCD sensor. In another aspect, the camera module comprises a CMOS sensor. In another aspect, the camera module is configured for scanned beam imaging. Likewise, the light source module can be configured to deliver a white light or a different light, depending on the surgical procedure.


During a surgical procedure, removing a surgical device from the surgical field and replacing it with another surgical device that includes a different camera or a different light source can be inefficient. Temporarily losing sight of the surgical field may lead to undesirable consequences. The modular imaging device of the present disclosure is configured to permit the replacement of a light source module or a camera module midstream during a surgical procedure, without having to remove the imaging device from the surgical field.


In one aspect, the imaging device comprises a tubular housing that includes a plurality of channels. A first channel is configured to slidably receive the camera module, which can be configured for a snap-fit engagement with the first channel. A second channel is configured to slidably receive the light source module, which can be configured for a snap-fit engagement with the second channel. In another example, the camera module and/or the light source module can be rotated into a final position within their respective channels. A threaded engagement can be employed in lieu of the snap-fit engagement.


In various examples, multiple imaging devices are placed at different positions in the surgical field to provide multiple views. The imaging module 138 can be configured to switch between the imaging devices to provide an optimal view. In various aspects, the imaging module 138 can be configured to integrate the images from the different imaging devices.
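
By way of illustration, switching between imaging devices and integrating their images might be sketched as follows. The selection rule (a crude sharpness score) and the naive averaging are assumptions for illustration only, not the behavior specified for the imaging module 138.

    # Illustrative view selection and integration across imaging devices.
    import numpy as np

    def best_view(frames):
        # Pick the frame with the most row-to-row variation, a crude
        # sharpness proxy standing in for "optimal view" selection.
        scores = [np.var(np.diff(f.astype(float), axis=0)) for f in frames]
        return frames[int(np.argmax(scores))]

    def integrate(frames):
        # Naive integration: average frames of equal shape, standing in
        # for combining images from different imaging devices.
        return np.mean([f.astype(float) for f in frames], axis=0)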


Various image processors and imaging devices suitable for use with the present disclosure are described in U.S. Pat. No. 7,995,045, titled COMBINED SBI AND CONVENTIONAL IMAGE PROCESSOR, which issued on Aug. 9, 2011, which is herein incorporated by reference in its entirety. In addition, U.S. Pat. No. 7,982,776, titled SBI MOTION ARTIFACT REMOVAL APPARATUS AND METHOD, which issued on Jul. 19, 2011, which is herein incorporated by reference in its entirety, describes various systems for removing motion artifacts from image data. Such systems can be integrated with the imaging module 138. Furthermore, U.S. Patent Application Publication No. 2011/0306840, titled CONTROLLABLE MAGNETIC SOURCE TO FIXTURE INTRACORPOREAL APPARATUS, which published on Dec. 15, 2011, and U.S. Patent Application Publication No. 2014/0243597, titled SYSTEM FOR PERFORMING A MINIMALLY INVASIVE SURGICAL PROCEDURE, which published on Aug. 28, 2014, each of which is herein incorporated by reference in its entirety, describe additional systems that can be integrated with the imaging module 138.



FIG. 8 illustrates a surgical data network 201 comprising a modular communication hub 203 configured to connect modular devices located in one or more operating theaters of a healthcare facility, or any room in a healthcare facility specially equipped for surgical operations, to a cloud-based system (e.g., the cloud 204 that may include a remote server 213 coupled to a storage device 205). In one aspect, the modular communication hub 203 comprises a network hub 207 and/or a network switch 209 in communication with a network router. The modular communication hub 203 also can be coupled to a local computer system 210 to provide local computer processing and data manipulation. The surgical data network 201 may be configured as passive, intelligent, or switching. A passive surgical data network serves as a conduit for the data, enabling it to go from one device (or segment) to another and to the cloud computing resources. An intelligent surgical data network includes additional features to enable the traffic passing through the surgical data network to be monitored and to configure each port in the network hub 207 or network switch 209. An intelligent surgical data network may be referred to as a manageable hub or switch. A switching hub reads the destination address of each packet and then forwards the packet to the correct port.
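
By way of illustration, the difference between hub-style flooding and switch-style forwarding described above can be sketched as follows. The packet format and learned address table are invented for illustration and do not describe any particular network hub 207 or network switch 209.

    # Illustrative switching behavior: forward each packet only to the
    # port associated with its destination address; flood unknown
    # destinations to all ports, as a passive hub does for every packet.
    from typing import Dict, List, NamedTuple

    class Packet(NamedTuple):
        dst: str       # destination device address
        payload: bytes

    class SwitchingHub:
        def __init__(self) -> None:
            self.table: Dict[str, int] = {}   # address -> port, learned over time

        def learn(self, address: str, port: int) -> None:
            self.table[address] = port

        def forward(self, pkt: Packet, n_ports: int) -> List[int]:
            if pkt.dst in self.table:
                return [self.table[pkt.dst]]  # switch: one correct port
            return list(range(n_ports))       # unknown: flood like a hub

    hub = SwitchingHub()
    hub.learn("device_1a", 3)
    assert hub.forward(Packet("device_1a", b""), 8) == [3]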


Modular devices 1a-1n located in the operating theater may be coupled to the modular communication hub 203. The network hub 207 and/or the network switch 209 may be coupled to a network router 211 to connect the devices 1a-1n to the cloud 204 or the local computer system 210. Data associated with the devices 1a-1n may be transferred to cloud-based computers via the router for remote data processing and manipulation. Data associated with the devices 1a-1n may also be transferred to the local computer system 210 for local data processing and manipulation. Modular devices 2a-2m located in the same operating theater also may be coupled to a network switch 209. The network switch 209 may be coupled to the network hub 207 and/or the network router 211 to connect the devices 2a-2m to the cloud 204. Data associated with the devices 2a-2m may be transferred to the cloud 204 via the network router 211 for data processing and manipulation. Data associated with the devices 2a-2m may also be transferred to the local computer system 210 for local data processing and manipulation.


It will be appreciated that the surgical data network 201 may be expanded by interconnecting multiple network hubs 207 and/or multiple network switches 209 with multiple network routers 211. The modular communication hub 203 may be contained in a modular control tower configured to receive multiple devices 1a-1n/2a-2m. The local computer system 210 also may be contained in a modular control tower. The modular communication hub 203 is connected to a display 212 to display images obtained by some of the devices 1a-1n/2a-2m, for example during surgical procedures. In various aspects, the devices 1a-1n/2a-2m may include, for example, various modules such as an imaging module 138 coupled to an endoscope, a generator module 140 coupled to an energy-based surgical device, a smoke evacuation module 126, a suction/irrigation module 128, a communication module 130, a processor module 132, a storage array 134, a surgical device coupled to a display, and/or a non-contact sensor module, among other modular devices that may be connected to the modular communication hub 203 of the surgical data network 201.


In one aspect, the surgical data network 201 may comprise a combination of network hub(s), network switch(es), and network router(s) connecting the devices 1a-1n/2a-2m to the cloud. Any one of or all of the devices 1a-1n/2a-2m coupled to the network hub or network switch may collect data in real time and transfer the data to cloud computers for data processing and manipulation. It will be appreciated that cloud computing relies on sharing computing resources rather than having local servers or personal devices to handle software applications. The word “cloud” may be used as a metaphor for “the Internet,” although the term is not limited as such. Accordingly, the term “cloud computing” may be used herein to refer to “a type of Internet-based computing,” where different services—such as servers, storage, and applications—are delivered to the modular communication hub 203 and/or computer system 210 located in the surgical theater (e.g., a fixed, mobile, temporary, or field operating room or space) and to devices connected to the modular communication hub 203 and/or computer system 210 through the Internet. The cloud infrastructure may be maintained by a cloud service provider. In this context, the cloud service provider may be the entity that coordinates the usage and control of the devices 1a-1n/2a-2m located in one or more operating theaters. The cloud computing services can perform a large number of calculations based on the data gathered by smart surgical instruments, robots, and other computerized devices located in the operating theater. The hub hardware enables multiple devices or connections to be connected to a computer that communicates with the cloud computing resources and storage.


By applying cloud computing data processing techniques to the data collected by the devices 1a-1n/2a-2m, the surgical data network can provide improved surgical outcomes, reduced costs, and improved patient satisfaction. At least some of the devices 1a-1n/2a-2m may be employed to view tissue states to assess leaks or perfusion of sealed tissue after a tissue sealing and cutting procedure. At least some of the devices 1a-1n/2a-2m may be employed to identify pathology, such as the effects of diseases, using the cloud-based computing to examine data including images of samples of body tissue for diagnostic purposes. This includes localization and margin confirmation of tissue and phenotypes. At least some of the devices 1a-1n/2a-2m may be employed to identify anatomical structures of the body using a variety of sensors integrated with imaging devices and techniques such as overlaying images captured by multiple imaging devices. The data gathered by the devices 1a-1n/2a-2m, including image data, may be transferred to the cloud 204 or the local computer system 210 or both for data processing and manipulation, including image processing and manipulation. The data may be analyzed to improve surgical procedure outcomes by determining whether further treatment, such as the application of endoscopic intervention, emerging technologies, targeted radiation, targeted intervention, and precise robotics to tissue-specific sites and conditions, may be pursued. Such data analysis may further employ outcome analytics processing and standardized approaches to provide beneficial feedback that either confirms, or suggests modifications to, the surgical treatments and the behavior of the surgeon.


In one implementation, the operating theater devices 1a-1n may be connected to the modular communication hub 203 over a wired channel or a wireless channel depending on the configuration of the devices 1a-1n to a network hub. The network hub 207 may be implemented, in one aspect, as a local network broadcast device that works on the physical layer of the Open System Interconnection (OSI) model. The network hub provides connectivity to the devices 1a-1n located in the same operating theater network. The network hub 207 collects data in the form of packets and sends them to the router in half duplex mode. The network hub 207 does not store any media access control/Internet protocol (MAC/IP) addresses to transfer the device data. Only one of the devices 1a-1n can send data at a time through the network hub 207. The network hub 207 has no routing tables or intelligence regarding where to send information and broadcasts all network data across each connection and to a remote server 213 (FIG. 9) over the cloud 204. The network hub 207 can detect basic network errors such as collisions, but having all information broadcast to multiple ports can be a security risk and cause bottlenecks.


In another implementation, the operating theater devices 2a-2m may be connected to a network switch 209 over a wired channel or a wireless channel. The network switch 209 works in the data link layer of the OSI model. The network switch 209 is a multiport device for connecting the devices 2a-2m located in the same operating theater to the network. The network switch 209 sends data in the form of frames to the network router 211 and works in full duplex mode. Multiple devices 2a-2m can send data at the same time through the network switch 209. The network switch 209 stores and uses MAC addresses of the devices 2a-2m to transfer data.


The network hub 207 and/or the network switch 209 are coupled to the network router 211 for connection to the cloud 204. The network router 211 works in the network layer of the OSI model. The network router 211 creates a route for transmitting data packets received from the network hub 207 and/or network switch 209 to cloud-based computer resources for further processing and manipulation of the data collected by any one of or all the devices 1a-1n/2a-2m. The network router 211 may be employed to connect two or more different networks located in different locations, such as, for example, different operating theaters of the same healthcare facility or different networks located in different operating theaters of different healthcare facilities. The network router 211 sends data in the form of packets to the cloud 204 and works in full duplex mode. Multiple devices can send data at the same time. The network router 211 uses IP addresses to transfer data.
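

Purely by way of illustration, the contrasting forwarding behaviors of the network hub 207, the network switch 209, and the network router 211 described above can be sketched in software. The device class, the frame and packet structures, and the Python rendering below are illustrative assumptions and are not taken from the disclosure.

    class Device:
        # Stand-in for an operating theater device 1a-1n/2a-2m (illustrative).
        def __init__(self, name):
            self.name = name

        def receive(self, frame):
            print(f"{self.name} received {frame}")

    class NetworkHub:
        # Physical-layer behavior of the hub 207: broadcast to every other port.
        def __init__(self, ports):
            self.ports = ports  # dict: port name -> Device

        def forward(self, frame, ingress_port):
            # No MAC/IP table: every other connected device sees every frame.
            for port, device in self.ports.items():
                if port != ingress_port:
                    device.receive(frame)

    class NetworkSwitch:
        # Data-link-layer behavior of the switch 209: learn MACs, forward per port.
        def __init__(self, ports):
            self.ports = ports
            self.mac_table = {}  # learned MAC address -> port name

        def forward(self, frame, ingress_port):
            self.mac_table[frame["src"]] = ingress_port  # learn the sender
            port = self.mac_table.get(frame["dst"])
            if port is not None:  # known destination: unicast to its port only
                self.ports[port].receive(frame)
            else:                 # unknown destination: flood until learned
                for p, device in self.ports.items():
                    if p != ingress_port:
                        device.receive(frame)

    class NetworkRouter:
        # Network-layer behavior of the router 211: route packets by IP prefix.
        def __init__(self, routes):
            self.routes = routes  # list of (IP prefix, next-hop Device)

        def forward(self, packet):
            for prefix, next_hop in self.routes:
                if packet["dst_ip"].startswith(prefix):
                    next_hop.receive(packet)
                    return

    # Illustrative usage: a packet routed toward the cloud 204.
    cloud = Device("cloud 204")
    router = NetworkRouter([("10.0.", cloud)])
    router.forward({"dst_ip": "10.0.0.7", "payload": "image data"})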


In one example, the network hub 207 may be implemented as a USB hub, which allows multiple USB devices to be connected to a host computer. The USB hub may expand a single USB port into several tiers so that there are more ports available to connect devices to the host system computer. The network hub 207 may include wired or wireless capabilities to receive information over a wired channel or a wireless channel. In one aspect, a wireless USB short-range, high-bandwidth wireless radio communication protocol may be employed for communication between the devices 1a-1n and devices 2a-2m located in the operating theater.


In other examples, the operating theater devices 1a-1n/2a-2m may communicate to the modular communication hub 203 via the Bluetooth wireless technology standard for exchanging data over short distances (using short-wavelength UHF radio waves in the ISM band from 2.4 to 2.485 GHz) from fixed and mobile devices and building personal area networks (PANs). In other aspects, the operating theater devices 1a-1n/2a-2m may communicate to the modular communication hub 203 via a number of wireless or wired communication standards or protocols, including but not limited to Wi-Fi (IEEE 802.11 family), WiMAX (IEEE 802.16 family), IEEE 802.20, long-term evolution (LTE), Ev-DO, HSPA+, HSDPA+, HSUPA+, EDGE, GSM, GPRS, CDMA, TDMA, DECT, and Ethernet, and derivatives thereof, as well as any other wireless and wired protocols that are designated as 3G, 4G, 5G, and beyond. The computing module may include a plurality of communication modules. For instance, a first communication module may be dedicated to shorter-range wireless communications such as Wi-Fi and Bluetooth, and a second communication module may be dedicated to longer-range wireless communications such as GPS, EDGE, GPRS, CDMA, WiMAX, LTE, Ev-DO, and others.


The modular communication hub 203 may serve as a central connection for one or all of the operating theater devices 1a-1n/2a-2m and handles a data type known as frames. Frames carry the data generated by the devices 1a-1n/2a-2m. When a frame is received by the modular communication hub 203, it is amplified and transmitted to the network router 211, which transfers the data to the cloud computing resources by using a number of wireless or wired communication standards or protocols, as described herein.


The modular communication hub 203 can be used as a standalone device or be connected to compatible network hubs and network switches to form a larger network. The modular communication hub 203 is generally easy to install, configure, and maintain, making it a good option for networking the operating theater devices 1a-1n/2a-2m.



FIG. 9 illustrates a computer-implemented interactive surgical system 200. The computer-implemented interactive surgical system 200 is similar in many respects to the computer-implemented interactive surgical system 100. For example, the computer-implemented interactive surgical system 200 includes one or more surgical systems 202, which are similar in many respects to the surgical systems 102. Each surgical system 202 includes at least one surgical hub 206 in communication with a cloud 204 that may include a remote server 213. In one aspect, the computer-implemented interactive surgical system 200 comprises a modular control tower 236 connected to multiple operating theater devices such as, for example, intelligent surgical instruments, robots, and other computerized devices located in the operating theater. As shown in FIG. 10, the modular control tower 236 comprises a modular communication hub 203 coupled to a computer system 210. As illustrated in the example of FIG. 9, the modular control tower 236 is coupled to an imaging module 238 that is coupled to an endoscope 239, a generator module 240 that is coupled to an energy device 241, a smoke evacuator module 226, a suction/irrigation module 228, a communication module 230, a processor module 232, a storage array 234, a smart device/instrument 235 optionally coupled to a display 237, and a non-contact sensor module 242. The operating theater devices are coupled to cloud computing resources and data storage via the modular control tower 236. A robot hub 222 also may be connected to the modular control tower 236 and to the cloud computing resources. The devices/instruments 235, visualization systems 208, among others, may be coupled to the modular control tower 236 via wired or wireless communication standards or protocols, as described herein. The modular control tower 236 may be coupled to a hub display 215 (e.g., monitor, screen) to display and overlay images received from the imaging module, device/instrument display, and/or other visualization systems 208. The hub display also may display data received from devices connected to the modular control tower in conjunction with images and overlaid images.



FIG. 10 illustrates a surgical hub 206 comprising a plurality of modules coupled to the modular control tower 236. The modular control tower 236 comprises a modular communication hub 203, e.g., a network connectivity device, and a computer system 210 to provide local processing, visualization, and imaging, for example. As shown in FIG. 10, the modular communication hub 203 may be connected in a tiered configuration to expand the number of modules (e.g., devices) that may be connected to the modular communication hub 203 and transfer data associated with the modules to the computer system 210, cloud computing resources, or both. As shown in FIG. 10, each of the network hubs/switches in the modular communication hub 203 includes three downstream ports and one upstream port. The upstream network hub/switch is connected to a processor to provide a communication connection to the cloud computing resources and a local display 217. Communication to the cloud 204 may be made either through a wired or a wireless communication channel.


The surgical hub 206 employs a non-contact sensor module 242 to measure the dimensions of the operating theater and generate a map of the surgical theater using either ultrasonic or laser-type non-contact measurement devices. An ultrasound-based non-contact sensor module scans the operating theater by transmitting a burst of ultrasound and receiving the echo when it bounces off the perimeter walls of the operating theater, as described under the heading "Surgical Hub Spatial Awareness Within an Operating Room" in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety. In that application, the sensor module is configured to determine the size of the operating theater and to adjust Bluetooth-pairing distance limits. A laser-based non-contact sensor module scans the operating theater by transmitting laser light pulses, receiving laser light pulses that bounce off the perimeter walls of the operating theater, and comparing the phase of the transmitted pulse to the received pulse to determine the size of the operating theater and to adjust Bluetooth-pairing distance limits, for example.
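

As an illustrative sketch only, the phase-comparison ranging described above can be approximated for a continuous-wave modulated laser, where the measured phase shift of the return signal encodes the round-trip distance. The modulation frequency and phase value below are invented for illustration and are not values from the disclosure.

    import math

    C = 299_792_458.0  # speed of light in m/s

    def distance_from_phase(phase_shift_rad, modulation_freq_hz):
        # The round trip spans (phase shift / 2*pi) modulation wavelengths,
        # so the one-way distance to the perimeter wall is half of that.
        round_trip_m = C * (phase_shift_rad / (2.0 * math.pi)) / modulation_freq_hz
        return round_trip_m / 2.0

    # Illustrative: a 10 MHz modulated beam returning 0.42 rad out of phase.
    print(distance_from_phase(0.42, 10e6))  # approximately 1.0 m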


The computer system 210 comprises a processor 244 and a network interface 245. The processor 244 is coupled to a communication module 247, storage 248, memory 249, non-volatile memory 250, and input/output interface 251 via a system bus. The system bus can be any of several types of bus structure(s) including the memory bus or memory controller, a peripheral bus or external bus, and/or a local bus using any variety of available bus architectures including, but not limited to, 9-bit bus, Industrial Standard Architecture (ISA), Micro-Channel Architecture (MSA), Extended ISA (EISA), Intelligent Drive Electronics (IDE), VESA Local Bus (VLB), Peripheral Component Interconnect (PCI), USB, Advanced Graphics Port (AGP), Personal Computer Memory Card International Association bus (PCMCIA), Small Computer Systems Interface (SCSI), or any other proprietary bus.


The processor 244 may be any single-core or multicore processor such as those known under the trade name ARM Cortex by Texas Instruments. In one aspect, the processor may be an LM4F230H5QR ARM Cortex-M4F Processor Core, available from Texas Instruments, for example, comprising an on-chip memory of 256 KB single-cycle flash memory, or other non-volatile memory, up to 40 MHz, a prefetch buffer to improve performance above 40 MHz, a 32 KB single-cycle static random access memory (SRAM), an internal read-only memory (ROM) loaded with StellarisWare® software, a 2 KB electrically erasable programmable read-only memory (EEPROM), and/or one or more pulse width modulation (PWM) modules, one or more quadrature encoder inputs (QEI) analogs, one or more 12-bit analog-to-digital converters (ADCs) with 12 analog input channels, details of which are available in the product datasheet.


In one aspect, the processor 244 may comprise a safety controller comprising two controller-based families such as TMS570 and RM4x, known under the trade name Hercules ARM Cortex R4, also by Texas Instruments. The safety controller may be configured specifically for IEC 61508 and ISO 26262 safety critical applications, among others, to provide advanced integrated safety features while delivering scalable performance, connectivity, and memory options.


The system memory includes volatile memory and non-volatile memory. The basic input/output system (BIOS), containing the basic routines to transfer information between elements within the computer system, such as during start-up, is stored in non-volatile memory. For example, the non-volatile memory can include ROM, programmable ROM (PROM), electrically programmable ROM (EPROM), EEPROM, or flash memory. Volatile memory includes random-access memory (RAM), which acts as external cache memory. Moreover, RAM is available in many forms such as SRAM, dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and direct Rambus RAM (DRRAM).


The computer system 210 also includes removable/non-removable, volatile/non-volatile computer storage media, such as for example disk storage. The disk storage includes, but is not limited to, devices like a magnetic disk drive, floppy disk drive, tape drive, Jaz drive, Zip drive, LS-60 drive, flash memory card, or memory stick. In addition, the disk storage can include storage media separately or in combination with other storage media including, but not limited to, an optical disc drive such as a compact disc ROM device (CD-ROM), compact disc recordable drive (CD-R Drive), compact disc rewritable drive (CD-RW Drive), or a digital versatile disc ROM drive (DVD-ROM). To facilitate the connection of the disk storage devices to the system bus, a removable or non-removable interface may be employed.


It is to be appreciated that the computer system 210 includes software that acts as an intermediary between users and the basic computer resources described in a suitable operating environment. Such software includes an operating system. The operating system, which can be stored on the disk storage, acts to control and allocate resources of the computer system. System applications take advantage of the management of resources by the operating system through program modules and program data stored either in the system memory or on the disk storage. It is to be appreciated that various components described herein can be implemented with various operating systems or combinations of operating systems.


A user enters commands or information into the computer system 210 through input device(s) coupled to the I/O interface 251. The input devices include, but are not limited to, a pointing device such as a mouse, trackball, stylus, touch pad, keyboard, microphone, joystick, game pad, satellite dish, scanner, TV tuner card, digital camera, digital video camera, web camera, and the like. These and other input devices connect to the processor through the system bus via interface port(s). The interface port(s) include, for example, a serial port, a parallel port, a game port, and a USB. The output device(s) use some of the same types of ports as input device(s). Thus, for example, a USB port may be used to provide input to the computer system and to output information from the computer system to an output device. An output adapter is provided to illustrate that there are some output devices like monitors, displays, speakers, and printers, among other output devices that require special adapters. The output adapters include, by way of illustration and not limitation, video and sound cards that provide a means of connection between the output device and the system bus. It should be noted that other devices and/or systems of devices, such as remote computer(s), provide both input and output capabilities.


The computer system 210 can operate in a networked environment using logical connections to one or more remote computers, such as cloud computer(s), or local computers. The remote cloud computer(s) can be a personal computer, server, router, network PC, workstation, microprocessor-based appliance, peer device, or other common network node, and the like, and typically includes many or all of the elements described relative to the computer system. For purposes of brevity, only a memory storage device is illustrated with the remote computer(s). The remote computer(s) is logically connected to the computer system through a network interface and then physically connected via a communication connection. The network interface encompasses communication networks such as local area networks (LANs) and wide area networks (WANs). LAN technologies include Fiber Distributed Data Interface (FDDI), Copper Distributed Data Interface (CDDI), Ethernet/IEEE 802.3, Token Ring/IEEE 802.5 and the like. WAN technologies include, but are not limited to, point-to-point links, circuit-switching networks like Integrated Services Digital Networks (ISDN) and variations thereon, packet-switching networks, and Digital Subscriber Lines (DSL).


In various aspects, the computer system 210 of FIG. 10, the imaging module 238 and/or visualization system 208, and/or the processor module 232 of FIGS. 9-10, may comprise an image processor, image processing engine, media processor, or any specialized digital signal processor (DSP) used for the processing of digital images. The image processor may employ parallel computing with single instruction, multiple data (SIMD) or multiple instruction, multiple data (MIMD) technologies to increase speed and efficiency. The digital image processing engine can perform a range of tasks. The image processor may be a system on a chip with multicore processor architecture.


The communication connection(s) refers to the hardware/software employed to connect the network interface to the bus. While the communication connection is shown for illustrative clarity inside the computer system, it can also be external to the computer system 210. The hardware/software necessary for connection to the network interface includes, for illustrative purposes only, internal and external technologies such as modems, including regular telephone-grade modems, cable modems, and DSL modems, ISDN adapters, and Ethernet cards.



FIG. 11 illustrates a functional block diagram of one aspect of a USB network hub 300 device, according to one aspect of the present disclosure. In the illustrated aspect, the USB network hub device 300 employs a TUSB2036 integrated circuit hub by Texas Instruments. The USB network hub 300 is a CMOS device that provides an upstream USB transceiver port 302 and up to three downstream USB transceiver ports 304, 306, 308 in compliance with the USB 2.0 specification. The upstream USB transceiver port 302 is a differential root data port comprising a differential data minus (DM0) input paired with a differential data plus (DP0) input. The three downstream USB transceiver ports 304, 306, 308 are differential data ports where each port includes differential data plus (DP1-DP3) outputs paired with differential data minus (DM1-DM3) outputs.


The USB network hub 300 device is implemented with a digital state machine instead of a microcontroller, and no firmware programming is required. Fully compliant USB transceivers are integrated into the circuit for the upstream USB transceiver port 302 and all downstream USB transceiver ports 304, 306, 308. The downstream USB transceiver ports 304, 306, 308 support both full-speed and low-speed devices by automatically setting the slew rate according to the speed of the device attached to the ports. The USB network hub 300 device may be configured either in bus-powered or self-powered mode and includes a hub power logic 312 to manage power.


The USB network hub 300 device includes a serial interface engine 310 (SIE). The SIE 310 is the front end of the USB network hub 300 hardware and handles most of the protocol described in chapter 8 of the USB specification. The SIE 310 typically comprehends signaling up to the transaction level. The functions that it handles could include: packet recognition, transaction sequencing, SOP, EOP, RESET, and RESUME signal detection/generation, clock/data separation, non-return-to-zero invert (NRZI) data encoding/decoding and bit-stuffing, CRC generation and checking (token and data), packet ID (PID) generation and checking/decoding, and/or serial-parallel/parallel-serial conversion. The SIE 310 receives a clock input 314 and is coupled to a suspend/resume logic and frame timer 316 circuit and a hub repeater circuit 318 to control communication between the upstream USB transceiver port 302 and the downstream USB transceiver ports 304, 306, 308 through port logic circuits 320, 322, 324. The SIE 310 is coupled to a command decoder 326 via interface logic to control commands from a serial EEPROM via a serial EEPROM interface 330.
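

For illustration only, two of the SIE functions named above, bit stuffing and NRZI encoding, are sketched below in software; the SIE 310 performs these operations in hardware, and the Python rendering is an assumption made for readability.

    def bit_stuff(bits):
        # USB bit stuffing: insert a 0 after any run of six consecutive 1s
        # so the line never goes too long without a transition.
        out, run = [], 0
        for b in bits:
            out.append(b)
            run = run + 1 if b == 1 else 0
            if run == 6:
                out.append(0)
                run = 0
        return out

    def nrzi_encode(bits, level=1):
        # USB NRZI: a 0 bit toggles the line level, a 1 bit holds it.
        out = []
        for b in bits:
            if b == 0:
                level ^= 1
            out.append(level)
        return out

    # Illustrative payload: stuffed first, then NRZI-encoded for the wire.
    payload = [1, 1, 1, 1, 1, 1, 1, 0, 1]
    print(nrzi_encode(bit_stuff(payload)))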


In various aspects, the USB network hub 300 can connect 127 functions configured in up to six logical layers (tiers) to a single computer. Further, the USB network hub 300 can connect to all peripherals using a standardized four-wire cable that provides both communication and power distribution. The power configurations are bus-powered and self-powered modes. The USB network hub 300 may be configured to support four modes of power management: a bus-powered hub, with either individual-port power management or ganged-port power management, and the self-powered hub, with either individual-port power management or ganged-port power management. In one aspect, using a USB cable, the upstream USB transceiver port 302 of the USB network hub 300 is plugged into a USB host controller, and the downstream USB transceiver ports 304, 306, 308 are exposed for connecting USB-compatible devices, and so forth.


Surgical Instrument Hardware


FIG. 12 illustrates a logic diagram of a control system 470 of a surgical instrument or tool in accordance with one or more aspects of the present disclosure. The system 470 comprises a control circuit. The control circuit includes a microcontroller 461 comprising a processor 462 and a memory 468. One or more of sensors 472, 474, 476, for example, provide real-time feedback to the processor 462. A motor 482, driven by a motor driver 492, operably couples a longitudinally movable displacement member to drive the I-beam knife element. A tracking system 480 is configured to determine the position of the longitudinally movable displacement member. The position information is provided to the processor 462, which can be programmed or configured to determine the position of the longitudinally movable drive member as well as the position of a firing member, firing bar, and I-beam knife element. Additional motors may be provided at the tool driver interface to control I-beam firing, closure tube travel, shaft rotation, and articulation. A display 473 displays a variety of operating conditions of the instruments and may include touch screen functionality for data input. Information displayed on the display 473 may be overlaid with images acquired via endoscopic imaging modules.


In one aspect, the microcontroller 461 may be any single-core or multicore processor such as those known under the trade name ARM Cortex by Texas Instruments. In one aspect, the main microcontroller 461 may be an LM4F230H5QR ARM Cortex-M4F Processor Core, available from Texas Instruments, for example, comprising an on-chip memory of 256 KB single-cycle flash memory, or other non-volatile memory, up to 40 MHz, a prefetch buffer to improve performance above 40 MHz, a 32 KB single-cycle SRAM, an internal ROM loaded with StellarisWare® software, a 2 KB EEPROM, one or more PWM modules, one or more QEI analogs, and/or one or more 12-bit ADCs with 12 analog input channels, details of which are available in the product datasheet.


In one aspect, the microcontroller 461 may comprise a safety controller comprising two controller-based families such as TMS570 and RM4x, known under the trade name Hercules ARM Cortex R4, also by Texas Instruments. The safety controller may be configured specifically for IEC 61508 and ISO 26262 safety critical applications, among others, to provide advanced integrated safety features while delivering scalable performance, connectivity, and memory options.


The microcontroller 461 may be programmed to perform various functions such as precise control over the speed and position of the knife and articulation systems. In one aspect, the microcontroller 461 includes a processor 462 and a memory 468. The electric motor 482 may be a brushed direct current (DC) motor with a gearbox and mechanical links to an articulation or knife system. In one aspect, a motor driver 492 may be an A3941 available from Allegro Microsystems, Inc. Other motor drivers may be readily substituted for use in the tracking system 480 comprising an absolute positioning system. A detailed description of an absolute positioning system is described in U.S. Patent Application Publication No. 2017/0296213, titled SYSTEMS AND METHODS FOR CONTROLLING A SURGICAL STAPLING AND CUTTING INSTRUMENT, which published on Oct. 19, 2017, which is herein incorporated by reference in its entirety.


The microcontroller 461 may be programmed to provide precise control over the speed and position of displacement members and articulation systems. The microcontroller 461 may be configured to compute a response in the software of the microcontroller 461. The computed response is compared to a measured response of the actual system to obtain an “observed” response, which is used for actual feedback decisions. The observed response is a favorable, tuned value that balances the smooth, continuous nature of the simulated response with the measured response, which can detect outside influences on the system.
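

As a minimal sketch, and assuming a simple weighted average as the combining rule, the "observed" response described above may be formed as follows; the 0.7 weighting is an illustrative tuning value, not a value from the disclosure.

    def observed_response(computed, measured, weight=0.7):
        # Blend the smooth simulated response with the noisier measured
        # response; the result is used for actual feedback decisions.
        return weight * computed + (1.0 - weight) * measured

    # Illustrative values: simulated 10.0 mm of travel, measured 9.4 mm.
    print(observed_response(10.0, 9.4))  # 9.82 mm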


In one aspect, the motor 482 may be controlled by the motor driver 492 and can be employed by the firing system of the surgical instrument or tool. In various forms, the motor 482 may be a brushed DC driving motor having a maximum rotational speed of approximately 25,000 RPM. In other arrangements, the motor 482 may include a brushless motor, a cordless motor, a synchronous motor, a stepper motor, or any other suitable electric motor. The motor driver 492 may comprise an H-bridge driver comprising field-effect transistors (FETs), for example. The motor 482 can be powered by a power assembly releasably mounted to the handle assembly or tool housing for supplying control power to the surgical instrument or tool. The power assembly may comprise a battery which may include a number of battery cells connected in series that can be used as the power source to power the surgical instrument or tool. In certain circumstances, the battery cells of the power assembly may be replaceable and/or rechargeable. In at least one example, the battery cells can be lithium-ion batteries which can be couplable to and separable from the power assembly.


The motor driver 492 may be an A3941 available from Allegro Microsystems, Inc. The A3941 492 is a full-bridge controller for use with external N-channel power metal-oxide semiconductor field-effect transistors (MOSFETs) specifically designed for inductive loads, such as brush DC motors. The driver 492 comprises a unique charge pump regulator that provides full (>10 V) gate drive for battery voltages down to 7 V and allows the A3941 to operate with a reduced gate drive, down to 5.5 V. A bootstrap capacitor may be employed to provide the above-battery supply voltage required for N-channel MOSFETs. An internal charge pump for the high-side drive allows DC (100% duty cycle) operation. The full bridge can be driven in fast or slow decay modes using diode or synchronous rectification. In the slow decay mode, current recirculation can be through the high-side or the low-side FETs. The power FETs are protected from shoot-through by resistor-adjustable dead time. Integrated diagnostics provide indications of undervoltage, overtemperature, and power bridge faults and can be configured to protect the power MOSFETs under most short circuit conditions. Other motor drivers may be readily substituted for use in the tracking system 480 comprising an absolute positioning system.


The tracking system 480 comprises a controlled motor drive circuit arrangement comprising a position sensor 472 according to one aspect of this disclosure. The position sensor 472 for an absolute positioning system provides a unique position signal corresponding to the location of a displacement member. In one aspect, the displacement member represents a longitudinally movable drive member comprising a rack of drive teeth for meshing engagement with a corresponding drive gear of a gear reducer assembly. In other aspects, the displacement member represents the firing member, which could be adapted and configured to include a rack of drive teeth. In yet another aspect, the displacement member represents a firing bar or the I-beam, each of which can be adapted and configured to include a rack of drive teeth. Accordingly, as used herein, the term displacement member is used generically to refer to any movable member of the surgical instrument or tool such as the drive member, the firing member, the firing bar, the I-beam, or any element that can be displaced. In one aspect, the longitudinally movable drive member is coupled to the firing member, the firing bar, and the I-beam. Accordingly, the absolute positioning system can, in effect, track the linear displacement of the I-beam by tracking the linear displacement of the longitudinally movable drive member. In various other aspects, the displacement member may be coupled to any position sensor 472 suitable for measuring linear displacement. Thus, the longitudinally movable drive member, the firing member, the firing bar, or the I-beam, or combinations thereof, may be coupled to any suitable linear displacement sensor. Linear displacement sensors may include contact or non-contact displacement sensors. Linear displacement sensors may comprise linear variable differential transformers (LVDT), differential variable reluctance transducers (DVRT), a slide potentiometer, a magnetic sensing system comprising a movable magnet and a series of linearly arranged Hall effect sensors, a magnetic sensing system comprising a fixed magnet and a series of movable, linearly arranged Hall effect sensors, an optical sensing system comprising a movable light source and a series of linearly arranged photo diodes or photo detectors, an optical sensing system comprising a fixed light source and a series of movable, linearly arranged photo diodes or photo detectors, or any combination thereof.


The electric motor 482 can include a rotatable shaft that operably interfaces with a gear assembly that is mounted in meshing engagement with a set, or rack, of drive teeth on the displacement member. A sensor element may be operably coupled to a gear assembly such that a single revolution of the position sensor 472 element corresponds to some linear longitudinal translation of the displacement member. An arrangement of gearing and sensors can be connected to the linear actuator, via a rack and pinion arrangement, or a rotary actuator, via a spur gear or other connection. A power source supplies power to the absolute positioning system and an output indicator may display the output of the absolute positioning system. The displacement member represents the longitudinally movable drive member comprising a rack of drive teeth formed thereon for meshing engagement with a corresponding drive gear of the gear reducer assembly. The displacement member represents the longitudinally movable firing member, firing bar, I-beam, or combinations thereof.


A single revolution of the sensor element associated with the position sensor 472 is equivalent to a longitudinal linear displacement d1 of the displacement member, where d1 is the longitudinal linear distance that the displacement member moves from point "a" to point "b" after a single revolution of the sensor element coupled to the displacement member. The sensor arrangement may be connected via a gear reduction that results in the position sensor 472 completing one or more revolutions for the full stroke of the displacement member. The position sensor 472 may complete multiple revolutions for the full stroke of the displacement member.


A series of n switches, where n is an integer greater than one, may be employed alone or in combination with a gear reduction to provide a unique position signal for more than one revolution of the position sensor 472. The states of the switches are fed back to the microcontroller 461, which applies logic to determine a unique position signal corresponding to the longitudinal linear displacement d1+d2+ . . . +dn of the displacement member. The output of the position sensor 472 is provided to the microcontroller 461. The position sensor 472 of the sensor arrangement may comprise a magnetic sensor, an analog rotary sensor like a potentiometer, or an array of analog Hall-effect elements, which output a unique combination of position signals or values.
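

By way of example only, the conversion from sensed rotation to linear displacement described above can be sketched as follows; the revolution count would be recovered from the switch states, the within-revolution angle from the position sensor 472, and the values below are invented for illustration.

    def absolute_displacement(revolutions, angle_deg, d1_mm):
        # Each full revolution of the sensor element corresponds to d1 of
        # linear travel; the current angle resolves the fractional part.
        return revolutions * d1_mm + (angle_deg / 360.0) * d1_mm

    # Illustrative: 2 full revolutions plus 90 degrees with d1 = 5 mm.
    print(absolute_displacement(2, 90.0, 5.0))  # 11.25 mm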


The position sensor 472 may comprise any number of magnetic sensing elements, such as, for example, magnetic sensors classified according to whether they measure the total magnetic field or the vector components of the magnetic field. The techniques used to produce both types of magnetic sensors encompass many aspects of physics and electronics. The technologies used for magnetic field sensing include search coil, fluxgate, optically pumped, nuclear precession, SQUID, Hall-effect, anisotropic magnetoresistance, giant magnetoresistance, magnetic tunnel junctions, giant magnetoimpedance, magnetostrictive/piezoelectric composites, magnetodiode, magnetotransistor, fiber-optic, magneto-optic, and microelectromechanical systems-based magnetic sensors, among others.


In one aspect, the position sensor 472 for the tracking system 480 comprising an absolute positioning system comprises a magnetic rotary absolute positioning system. The position sensor 472 may be implemented as an AS5055EQFT single-chip magnetic rotary position sensor available from Austria Microsystems, AG. The position sensor 472 is interfaced with the microcontroller 461 to provide an absolute positioning system. The position sensor 472 is a low-voltage and low-power component and includes four Hall-effect elements in an area of the position sensor 472 that is located above a magnet. A high-resolution ADC and a smart power management controller are also provided on the chip. A coordinate rotation digital computer (CORDIC) processor, also known as the digit-by-digit method or Volder's algorithm, is provided to implement a simple and efficient algorithm to calculate hyperbolic and trigonometric functions that require only addition, subtraction, bitshift, and table lookup operations. The angle position, alarm bits, and magnetic field information are transmitted over a standard serial communication interface, such as a serial peripheral interface (SPI), to the microcontroller 461. The position sensor 472 provides 12 or 14 bits of resolution. The position sensor 472 may be an AS5055 chip provided in a small QFN 16-pin 4×4×0.85 mm package.
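

For illustration only, the CORDIC technique named above can be sketched as follows. The iteration count is an assumption, and a fixed-point hardware implementation would replace each multiplication by 2**-i with a bit shift, leaving only addition, subtraction, shifts, and table lookup as described.

    import math

    ITERATIONS = 16
    ANGLES = [math.atan(2.0 ** -i) for i in range(ITERATIONS)]
    GAIN = 1.0
    for i in range(ITERATIONS):
        GAIN /= math.sqrt(1.0 + 2.0 ** (-2 * i))  # pre-scale offsets rotation gain

    def cordic_sin_cos(theta):
        # Rotation mode: drive the residual angle z toward zero one
        # micro-rotation at a time; (x, y) converges to (cos, sin) of theta.
        x, y, z = GAIN, 0.0, theta
        for i in range(ITERATIONS):
            d = 1.0 if z >= 0.0 else -1.0
            x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
            z -= d * ANGLES[i]
        return x, y

    print(cordic_sin_cos(math.pi / 6))  # approximately (0.866, 0.5)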


The tracking system 480 comprising an absolute positioning system may comprise and/or be programmed to implement a feedback controller, such as a PID, state feedback, or adaptive controller. A power source converts the signal from the feedback controller into a physical input to the system: in this case the voltage. Other examples include a PWM of the voltage, current, and force. Other sensor(s) may be provided to measure physical parameters of the physical system in addition to the position measured by the position sensor 472. In some aspects, the other sensor(s) can include sensor arrangements such as those described in U.S. Pat. No. 9,345,481, titled STAPLE CARTRIDGE TISSUE THICKNESS SENSOR SYSTEM, which issued on May 24, 2016, which is herein incorporated by reference in its entirety; U.S. Patent Application Publication No. 2014/0263552, titled STAPLE CARTRIDGE TISSUE THICKNESS SENSOR SYSTEM, which published on Sep. 18, 2014, which is herein incorporated by reference in its entirety; and U.S. patent application Ser. No. 15/628,175, titled TECHNIQUES FOR ADAPTIVE CONTROL OF MOTOR VELOCITY OF A SURGICAL STAPLING AND CUTTING INSTRUMENT, filed Jun. 20, 2017, which is herein incorporated by reference in its entirety. In a digital signal processing system, an absolute positioning system is coupled to a digital data acquisition system where the output of the absolute positioning system will have a finite resolution and sampling frequency. The absolute positioning system may comprise a compare-and-combine circuit to combine a computed response with a measured response using algorithms, such as a weighted average and a theoretical control loop, that drive the computed response towards the measured response. The computed response of the physical system takes into account properties like mass, inertia, viscous friction, inductance, resistance, etc., to predict what the states and outputs of the physical system will be by knowing the input.
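

As a minimal sketch, assuming a textbook discrete-time form, the PID feedback controller named above may be written as follows; the gains and time step are illustrative tuning values, not values from the disclosure.

    class PIDController:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, setpoint, measured):
            # Classic PID: proportional + integral + derivative of the error.
            error = setpoint - measured
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            # The power source converts this output into a physical input to
            # the system, e.g., a voltage or a PWM duty cycle.
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    controller = PIDController(kp=2.0, ki=0.5, kd=0.1, dt=0.001)
    print(controller.update(setpoint=10.0, measured=9.4))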


The absolute positioning system provides an absolute position of the displacement member upon power-up of the instrument, without retracting or advancing the displacement member to a reset (zero or home) position as may be required with conventional rotary encoders that merely count the number of steps forwards or backwards that the motor 482 has taken to infer the position of a device actuator, drive bar, knife, or the like.


A sensor 474, such as, for example, a strain gauge or a micro-strain gauge, is configured to measure one or more parameters of the end effector, such as, for example, the amplitude of the strain exerted on the anvil during a clamping operation, which can be indicative of the closure forces applied to the anvil. The measured strain is converted to a digital signal and provided to the processor 462. Alternatively, or in addition to the sensor 474, a sensor 476, such as, for example, a load sensor, can measure the closure force applied by the closure drive system to the anvil. The sensor 476, such as, for example, a load sensor, can measure the firing force applied to an I-beam in a firing stroke of the surgical instrument or tool. The I-beam is configured to engage a wedge sled, which is configured to upwardly cam staple drivers to force out staples into deforming contact with an anvil. The I-beam also includes a sharpened cutting edge that can be used to sever tissue as the I-beam is advanced distally by the firing bar. Alternatively, a current sensor 478 can be employed to measure the current drawn by the motor 482. The force required to advance the firing member can correspond to the current drawn by the motor 482, for example. The measured force is converted to a digital signal and provided to the processor 462.


In one form, the strain gauge sensor 474 can be used to measure the force applied to the tissue by the end effector. A strain gauge can be coupled to the end effector to measure the force on the tissue being treated by the end effector. A system for measuring forces applied to the tissue grasped by the end effector comprises a strain gauge sensor 474, such as, for example, a micro-strain gauge, that is configured to measure one or more parameters of the end effector, for example. In one aspect, the strain gauge sensor 474 can measure the amplitude or magnitude of the strain exerted on a jaw member of an end effector during a clamping operation, which can be indicative of the tissue compression. The measured strain is converted to a digital signal and provided to a processor 462 of the microcontroller 461. A load sensor 476 can measure the force used to operate the knife element, for example, to cut the tissue captured between the anvil and the staple cartridge. A magnetic field sensor can be employed to measure the thickness of the captured tissue. The measurement of the magnetic field sensor also may be converted to a digital signal and provided to the processor 462.


The measurements of the tissue compression, the tissue thickness, and/or the force required to close the end effector on the tissue, as respectively measured by the sensors 474, 476, can be used by the microcontroller 461 to characterize the selected position of the firing member and/or the corresponding value of the speed of the firing member. In one instance, a memory 468 may store a technique, an equation, and/or a lookup table which can be employed by the microcontroller 461 in the assessment.
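

Purely as an example, a lookup table of the kind the memory 468 might store could map measured tissue thickness to a firing member speed; the thresholds and speeds below are invented for illustration and are not values from the disclosure.

    import bisect

    THICKNESS_MM = [1.0, 2.0, 3.0, 4.0]        # breakpoints (illustrative)
    SPEED_MM_S = [15.0, 12.0, 8.0, 5.0, 3.0]   # one more entry than breakpoints

    def firing_speed(thickness_mm):
        # Thicker (or more compressed) tissue maps to a slower firing speed.
        return SPEED_MM_S[bisect.bisect_left(THICKNESS_MM, thickness_mm)]

    print(firing_speed(2.5))  # 8.0 mm/s for tissue between 2 mm and 3 mm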


The control system 470 of the surgical instrument or tool also may comprise wired or wireless communication circuits to communicate with the modular communication hub as shown in FIGS. 8-11.



FIG. 13 illustrates a control circuit 500 configured to control aspects of the surgical instrument or tool according to one aspect of this disclosure. The control circuit 500 can be configured to implement various processes described herein. The control circuit 500 may comprise a microcontroller comprising one or more processors 502 (e.g., microprocessor, microcontroller) coupled to at least one memory circuit 504. The memory circuit 504 stores machine-executable instructions that, when executed by the processor 502, cause the processor 502 to implement various processes described herein. The processor 502 may be any one of a number of single-core or multicore processors known in the art. The memory circuit 504 may comprise volatile and non-volatile storage media. The processor 502 may include an instruction processing unit 506 and an arithmetic unit 508. The instruction processing unit may be configured to receive instructions from the memory circuit 504.



FIG. 14 illustrates a combinational logic circuit 510 configured to control aspects of the surgical instrument or tool according to one aspect of this disclosure. The combinational logic circuit 510 can be configured to implement various processes described herein. The combinational logic circuit 510 may comprise a finite state machine comprising a combinational logic 512 configured to receive data associated with the surgical instrument or tool at an input 514, process the data by the combinational logic 512, and provide an output 516.



FIG. 15 illustrates a sequential logic circuit 520 configured to control aspects of the surgical instrument or tool according to one aspect of this disclosure. The sequential logic circuit 520 or the combinational logic 522 can be configured to implement various processes described herein. The sequential logic circuit 520 may comprise a finite state machine. The sequential logic circuit 520 may comprise a combinational logic 522, at least one memory circuit 524, and a clock 529, for example. The at least one memory circuit 524 can store a current state of the finite state machine. In certain instances, the sequential logic circuit 520 may be synchronous or asynchronous. The combinational logic 522 is configured to receive data associated with the surgical instrument or tool from an input 526, process the data by the combinational logic 522, and provide an output 528. In other aspects, the circuit may comprise a combination of a processor (e.g., processor 502, FIG. 13) and a finite state machine to implement various processes herein. In other aspects, the finite state machine may comprise a combination of a combinational logic circuit (e.g., combinational logic circuit 510, FIG. 14) and the sequential logic circuit 520.
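

For illustration only, a synchronous finite state machine in the spirit of FIG. 15 is sketched below: the stored state plays the role of the memory circuit 524, and each call to step() stands in for a clock edge. The state and event names are illustrative assumptions.

    # (current state, input event) -> (next state, output)
    TRANSITIONS = {
        ("idle",    "clamp"): ("clamped", "close_anvil"),
        ("clamped", "fire"):  ("firing",  "advance_ibeam"),
        ("firing",  "done"):  ("idle",    "retract_ibeam"),
    }

    class SequentialLogic:
        def __init__(self):
            self.state = "idle"  # held by the memory circuit between clocks

        def step(self, event):
            # Combinational logic: next state and output from state + input.
            next_state, output = TRANSITIONS.get((self.state, event),
                                                 (self.state, "no_op"))
            self.state = next_state  # state register updates on the clock edge
            return output

    fsm = SequentialLogic()
    print(fsm.step("clamp"))  # close_anvil
    print(fsm.step("fire"))   # advance_ibeam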



FIG. 16 illustrates a surgical instrument or tool comprising a plurality of motors which can be activated to perform various functions. In certain instances, a first motor can be activated to perform a first function, a second motor can be activated to perform a second function, a third motor can be activated to perform a third function, a fourth motor can be activated to perform a fourth function, and so on. In certain instances, the plurality of motors of robotic surgical instrument 600 can be individually activated to cause firing, closure, and/or articulation motions in the end effector. The firing, closure, and/or articulation motions can be transmitted to the end effector through a shaft assembly, for example.


In certain instances, the surgical instrument system or tool may include a firing motor 602. The firing motor 602 may be operably coupled to a firing motor drive assembly 604 which can be configured to transmit firing motions, generated by the motor 602, to the end effector, in particular to displace the I-beam element. In certain instances, the firing motions generated by the motor 602 may cause the staples to be deployed from the staple cartridge into tissue captured by the end effector and/or the cutting edge of the I-beam element to be advanced to cut the captured tissue, for example. The I-beam element may be retracted by reversing the direction of the motor 602.


In certain instances, the surgical instrument or tool may include a closure motor 603. The closure motor 603 may be operably coupled to a closure motor drive assembly 605 which can be configured to transmit closure motions, generated by the motor 603, to the end effector, in particular to displace a closure tube to close the anvil and compress tissue between the anvil and the staple cartridge. The closure motions may cause the end effector to transition from an open configuration to an approximated configuration to capture tissue, for example. The end effector may be transitioned to an open position by reversing the direction of the motor 603.


In certain instances, the surgical instrument or tool may include one or more articulation motors 606a, 606b, for example. The motors 606a, 606b may be operably coupled to respective articulation motor drive assemblies 608a, 608b, which can be configured to transmit articulation motions generated by the motors 606a, 606b to the end effector. In certain instances, the articulation motions may cause the end effector to articulate relative to the shaft, for example.


As described above, the surgical instrument or tool may include a plurality of motors which may be configured to perform various independent functions. In certain instances, the plurality of motors of the surgical instrument or tool can be individually or separately activated to perform one or more functions while the other motors remain inactive. For example, the articulation motors 606a, 606b can be activated to cause the end effector to be articulated while the firing motor 602 remains inactive. Alternatively, the firing motor 602 can be activated to fire the plurality of staples, and/or to advance the cutting edge, while the articulation motors 606a, 606b remain inactive. Furthermore, the closure motor 603 may be activated simultaneously with the firing motor 602 to cause the closure tube and the I-beam element to advance distally as described in more detail hereinbelow.


In certain instances, the surgical instrument or tool may include a common control module 610 which can be employed with a plurality of motors of the surgical instrument or tool. In certain instances, the common control module 610 may accommodate one of the plurality of motors at a time. For example, the common control module 610 can be couplable to and separable from the plurality of motors of the robotic surgical instrument individually. In certain instances, a plurality of the motors of the surgical instrument or tool may share one or more common control modules such as the common control module 610. In certain instances, a plurality of motors of the surgical instrument or tool can be individually and selectively engaged with the common control module 610. In certain instances, the common control module 610 can be selectively switched from interfacing with one of a plurality of motors of the surgical instrument or tool to interfacing with another one of the plurality of motors of the surgical instrument or tool.


In at least one example, the common control module 610 can be selectively switched between operable engagement with the articulation motors 606a, 606b and operable engagement with either the firing motor 602 or the closure motor 603. In at least one example, as illustrated in FIG. 16, a switch 614 can be moved or transitioned between a plurality of positions and/or states. In a first position 616, the switch 614 may electrically couple the common control module 610 to the firing motor 602; in a second position 617, the switch 614 may electrically couple the common control module 610 to the closure motor 603; in a third position 618a, the switch 614 may electrically couple the common control module 610 to the first articulation motor 606a; and in a fourth position 618b, the switch 614 may electrically couple the common control module 610 to the second articulation motor 606b, for example. In certain instances, separate common control modules 610 can be electrically coupled to the firing motor 602, the closure motor 603, and the articulation motors 606a, 606b at the same time. In certain instances, the switch 614 may be a mechanical switch, an electromechanical switch, a solid-state switch, or any suitable switching mechanism.


Each of the motors 602, 603, 606a, 606b may comprise a torque sensor to measure the output torque on the shaft of the motor. The force on an end effector may be sensed in any conventional manner, such as by force sensors on the outer sides of the jaws or by a torque sensor for the motor actuating the jaws.


In various instances, as illustrated in FIG. 16, the common control module 610 may comprise a motor driver 626 which may comprise one or more H-Bridge FETs. The motor driver 626 may modulate the power transmitted from a power source 628 to a motor coupled to the common control module 610 based on input from a microcontroller 620 (the “controller”), for example. In certain instances, the microcontroller 620 can be employed to determine the current drawn by the motor, for example, while the motor is coupled to the common control module 610, as described above.
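

As a minimal sketch, assuming a linear average-voltage model for the H-bridge and a sense resistor for current measurement, the power modulation and current determination described above might look as follows; the function names and values are illustrative only.

    def pwm_duty(voltage_command, battery_voltage):
        # An H-bridge delivers an average voltage of duty * V_battery, so the
        # requested duty cycle is the command ratio clamped to [0, 1].
        return max(0.0, min(1.0, voltage_command / battery_voltage))

    def motor_current(shunt_voltage, shunt_resistance_ohm):
        # Ohm's-law estimate from a sense resistor in the bridge leg.
        return shunt_voltage / shunt_resistance_ohm

    print(pwm_duty(6.0, 12.0))        # 0.5 duty cycle (illustrative)
    print(motor_current(0.05, 0.01))  # 5.0 A (illustrative)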


In certain instances, the microcontroller 620 may include a microprocessor 622 (the “processor”) and one or more non-transitory computer-readable mediums or memory units 624 (the “memory”). In certain instances, the memory 624 may store various program instructions, which when executed may cause the processor 622 to perform a plurality of functions and/or calculations described herein. In certain instances, one or more of the memory units 624 may be coupled to the processor 622, for example.


In certain instances, the power source 628 can be employed to supply power to the microcontroller 620, for example. In certain instances, the power source 628 may comprise a battery (or “battery pack” or “power pack”), such as a lithium-ion battery, for example. In certain instances, the battery pack may be configured to be releasably mounted to a handle for supplying power to the surgical instrument 600. A number of battery cells connected in series may be used as the power source 628. In certain instances, the power source 628 may be replaceable and/or rechargeable, for example.


In various instances, the processor 622 may control the motor driver 626 to control the position, direction of rotation, and/or velocity of a motor that is coupled to the common control module 610. In certain instances, the processor 622 can signal the motor driver 626 to stop and/or disable a motor that is coupled to the common control module 610. It should be understood that the term “processor” as used herein includes any suitable microprocessor, microcontroller, or other basic computing device that incorporates the functions of a computer's central processing unit (CPU) on an integrated circuit or, at most, a few integrated circuits. The processor is a multipurpose, programmable device that accepts digital data as input, processes it according to instructions stored in its memory, and provides results as output. It is an example of sequential digital logic, as it has internal memory. Processors operate on numbers and symbols represented in the binary numeral system.


In one instance, the processor 622 may be any single-core or multicore processor, such as those based on the ARM Cortex core and available from Texas Instruments. In certain instances, the microcontroller 620 may be an LM4F230H5QR, available from Texas Instruments, for example. In at least one example, the Texas Instruments LM4F230H5QR is an ARM Cortex-M4F processor core comprising 256 KB of single-cycle flash memory, or other non-volatile memory, operable up to 40 MHz, a prefetch buffer to improve performance above 40 MHz, a 32 KB single-cycle SRAM, an internal ROM loaded with StellarisWare® software, a 2 KB EEPROM, one or more PWM modules, one or more quadrature encoder inputs (QEIs), and one or more 12-bit ADCs with 12 analog input channels, among other features detailed in the product datasheet. Other microcontrollers may be readily substituted for use with the common control module 610. Accordingly, the present disclosure should not be limited in this context.


In certain instances, the memory 624 may include program instructions for controlling each of the motors of the surgical instrument 600 that are couplable to the common control module 610. For example, the memory 624 may include program instructions for controlling the firing motor 602, the closure motor 603, and the articulation motors 606a, 606b. Such program instructions may cause the processor 622 to control the firing, closure, and articulation functions in accordance with inputs from algorithms or control programs of the surgical instrument or tool.


In certain instances, one or more mechanisms and/or sensors such as, for example, sensors 630 can be employed to alert the processor 622 to the program instructions that should be used in a particular setting. For example, the sensors 630 may alert the processor 622 to use the program instructions associated with firing, closing, and articulating the end effector. In certain instances, the sensors 630 may comprise position sensors which can be employed to sense the position of the switch 614, for example. Accordingly, the processor 622 may use the program instructions associated with firing the I-beam of the end effector upon detecting, through the sensors 630 for example, that the switch 614 is in the first position 616; the processor 622 may use the program instructions associated with closing the anvil upon detecting, through the sensors 630 for example, that the switch 614 is in the second position 617; and the processor 622 may use the program instructions associated with articulating the end effector upon detecting, through the sensors 630 for example, that the switch 614 is in the third or fourth position 618a, 618b.
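
This position-based selection of program instructions can be pictured as a simple dispatch on the sensed switch position. The following C sketch is purely illustrative; the enum values and handler functions are hypothetical names standing in for the stored program instructions, not taken from this disclosure:

```c
/* Hypothetical sketch: dispatch control programs by sensed switch position. */
#include <stdio.h>

typedef enum {
    SWITCH_POS_FIRING,          /* first position 616  -> firing motor 602         */
    SWITCH_POS_CLOSURE,         /* second position 617 -> closure motor 603        */
    SWITCH_POS_ARTICULATION_A,  /* third position 618a -> articulation motor 606a  */
    SWITCH_POS_ARTICULATION_B   /* fourth position 618b -> articulation motor 606b */
} switch_position_t;

/* Stand-ins for the program instructions executed by the processor. */
static void run_firing_program(void)       { puts("firing the I-beam"); }
static void run_closure_program(void)      { puts("closing the anvil"); }
static void run_articulation_program(void) { puts("articulating the end effector"); }

/* Select the control program according to the switch position reported
 * by the position sensors. */
static void dispatch_program(switch_position_t pos)
{
    switch (pos) {
    case SWITCH_POS_FIRING:         run_firing_program();       break;
    case SWITCH_POS_CLOSURE:        run_closure_program();      break;
    case SWITCH_POS_ARTICULATION_A: /* fall through */
    case SWITCH_POS_ARTICULATION_B: run_articulation_program(); break;
    }
}

int main(void)
{
    dispatch_program(SWITCH_POS_CLOSURE); /* e.g., switch sensed in position 617 */
    return 0;
}
```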



FIG. 17 is a schematic diagram of a robotic surgical instrument 700 configured to operate a surgical tool described herein according to one aspect of this disclosure. The robotic surgical instrument 700 may be programmed or configured to control distal/proximal translation of a displacement member, distal/proximal displacement of a closure tube, shaft rotation, and articulation, either with single or multiple articulation drive links. In one aspect, the surgical instrument 700 may be programmed or configured to individually control a firing member, a closure member, a shaft member, and/or one or more articulation members. The surgical instrument 700 comprises a control circuit 710 configured to control motor-driven firing members, closure members, shaft members, and/or one or more articulation members.


In one aspect, the robotic surgical instrument 700 comprises a control circuit 710 configured to control an anvil 716 and an I-beam 714 (including a sharp cutting edge) portion of an end effector 702, a removable staple cartridge 718, a shaft 740, and one or more articulation members 742a, 742b via a plurality of motors 704a-704e. A position sensor 734 may be configured to provide position feedback of the I-beam 714 to the control circuit 710. Other sensors 738 may be configured to provide feedback to the control circuit 710. A timer/counter 731 provides timing and counting information to the control circuit 710. An energy source 712 may be provided to operate the motors 704a-704e, and a current sensor 736 provides motor current feedback to the control circuit 710. The motors 704a-704e can be operated individually by the control circuit 710 in an open-loop or closed-loop feedback control.


In one aspect, the control circuit 710 may comprise one or more microcontrollers, microprocessors, or other suitable processors for executing instructions that cause the processor or processors to perform one or more tasks. In one aspect, a timer/counter 731 provides an output signal, such as the elapsed time or a digital count, to the control circuit 710 to correlate the position of the I-beam 714 as determined by the position sensor 734 with the output of the timer/counter 731 such that the control circuit 710 can determine the position of the I-beam 714 at a specific time (t) relative to a starting position or the time (t) when the I-beam 714 is at a specific position relative to a starting position. The timer/counter 731 may be configured to measure elapsed time, count external events, or time external events.


In one aspect, the control circuit 710 may be programmed to control functions of the end effector 702 based on one or more tissue conditions. The control circuit 710 may be programmed to sense tissue conditions, such as thickness, either directly or indirectly, as described herein. The control circuit 710 may be programmed to select a firing control program or closure control program based on tissue conditions. A firing control program may describe the distal motion of the displacement member. Different firing control programs may be selected to better treat different tissue conditions. For example, when thicker tissue is present, the control circuit 710 may be programmed to translate the displacement member at a lower velocity and/or with lower power. When thinner tissue is present, the control circuit 710 may be programmed to translate the displacement member at a higher velocity and/or with higher power. A closure control program may control the closure force applied to the tissue by the anvil 716. Other control programs control the rotation of the shaft 740 and the articulation members 742a, 742b.
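
As a rough illustration of this tissue-dependent parameter selection, the sketch below maps a sensed thickness to firing parameters; the threshold and the velocity and power values are hypothetical, not taken from the disclosure:

```c
/* Hypothetical mapping from sensed tissue thickness to firing parameters. */
typedef struct {
    float velocity_mm_s;  /* displacement-member translation velocity */
    float power_limit_w;  /* motor power ceiling                      */
} firing_program_t;

static firing_program_t select_firing_program(float tissue_thickness_mm)
{
    firing_program_t p;
    if (tissue_thickness_mm > 2.0f) {   /* illustrative "thick tissue" threshold */
        p.velocity_mm_s = 5.0f;         /* thicker tissue: lower velocity        */
        p.power_limit_w = 10.0f;        /* and lower power                       */
    } else {
        p.velocity_mm_s = 12.0f;        /* thinner tissue: higher velocity       */
        p.power_limit_w = 25.0f;        /* and higher power                      */
    }
    return p;
}
```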


In one aspect, the control circuit 710 may generate motor set point signals. The motor set point signals may be provided to various motor controllers 708a-708e. The motor controllers 708a-708e may comprise one or more circuits configured to provide motor drive signals to the motors 704a-704e to drive the motors 704a-704e as described herein. In some examples, the motors 704a-704e may be brushed DC electric motors. For example, the velocity of the motors 704a-704e may be proportional to the respective motor drive signals. In some examples, the motors 704a-704e may be brushless DC electric motors, and the respective motor drive signals may comprise a PWM signal provided to one or more stator windings of the motors 704a-704e. Also, in some examples, the motor controllers 708a-708e may be omitted and the control circuit 710 may generate the motor drive signals directly.


In one aspect, the control circuit 710 may initially operate each of the motors 704a-704e in an open-loop configuration for a first open-loop portion of a stroke of the displacement member. Based on the response of the robotic surgical instrument 700 during the open-loop portion of the stroke, the control circuit 710 may select a firing control program in a closed-loop configuration. The response of the instrument may include a translation distance of the displacement member during the open-loop portion, a time elapsed during the open-loop portion, the energy provided to one of the motors 704a-704e during the open-loop portion, a sum of pulse widths of a motor drive signal, etc. After the open-loop portion, the control circuit 710 may implement the selected firing control program for a second portion of the displacement member stroke. For example, during a closed-loop portion of the stroke, the control circuit 710 may modulate one of the motors 704a-704e based on translation data describing a position of the displacement member in a closed-loop manner to translate the displacement member at a constant velocity.
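
One way to picture this two-phase strategy is sketched below: a short open-loop window characterizes the load, the measured travel selects a target velocity, and the remainder of the stroke runs closed-loop. The helper functions, gains, thresholds, and timings are all assumptions for illustration only:

```c
/* Hypothetical two-phase firing stroke: open-loop characterization followed
 * by closed-loop constant-velocity control. All values are illustrative. */
extern float read_position_mm(void);      /* position sensor read-out (assumed) */
extern void  set_motor_duty(float duty);  /* motor drive output, 0..1 (assumed) */
extern void  delay_ms(int ms);            /* timing primitive (assumed)         */

static float clampf(float x, float lo, float hi)
{
    return (x < lo) ? lo : (x > hi) ? hi : x;
}

void fire_stroke(void)
{
    const float dt_s = 0.001f;            /* 1 kHz control loop                 */

    /* Phase 1: fixed open-loop drive for the first portion of the stroke. */
    float start_mm = read_position_mm();
    set_motor_duty(0.5f);
    delay_ms(100);                        /* 100 ms open-loop window            */
    float traveled_mm = read_position_mm() - start_mm;

    /* Little travel under fixed drive suggests a heavy (thick-tissue) load,
     * so select a slower firing program. */
    float target_v = (traveled_mm < 1.0f) ? 5.0f : 12.0f;   /* mm/s */

    /* Phase 2: closed loop; proportional control toward constant velocity. */
    float prev_mm = read_position_mm();
    for (int i = 0; i < 5000; i++) {      /* remainder of the stroke            */
        delay_ms(1);
        float pos_mm = read_position_mm();
        float v = (pos_mm - prev_mm) / dt_s;
        prev_mm = pos_mm;
        set_motor_duty(clampf(0.5f + 0.02f * (target_v - v), 0.0f, 1.0f));
    }
    set_motor_duty(0.0f);                 /* end of stroke                      */
}
```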


In one aspect, the motors 704a-704e may receive power from an energy source 712. The energy source 712 may be a DC power supply driven by a main alternating current power source, a battery, a super capacitor, or any other suitable energy source. The motors 704a-704e may be mechanically coupled to individual movable mechanical elements such as the I-beam 714, anvil 716, shaft 740, articulation 742a, and articulation 742b via respective transmissions 706a-706e. The transmissions 706a-706e may include one or more gears or other linkage components to couple the motors 704a-704e to movable mechanical elements. A position sensor 734 may sense a position of the I-beam 714. The position sensor 734 may be or include any type of sensor that is capable of generating position data that indicate a position of the I-beam 714. In some examples, the position sensor 734 may include an encoder configured to provide a series of pulses to the control circuit 710 as the I-beam 714 translates distally and proximally. The control circuit 710 may track the pulses to determine the position of the I-beam 714. Other suitable position sensors may be used, including, for example, a proximity sensor. Other types of position sensors may provide other signals indicating motion of the I-beam 714. Also, in some examples, the position sensor 734 may be omitted. Where any of the motors 704a-704e is a stepper motor, the control circuit 710 may track the position of the I-beam 714 by aggregating the number and direction of steps that the motor has been instructed to execute. The position sensor 734 may be located in the end effector 702 or at any other portion of the instrument. The output of each of the motors 704a-704e includes a respective torque sensor 744a-744e to sense force and an encoder to sense rotation of the drive shaft.
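
The two position-tracking schemes mentioned here, counting encoder pulses or aggregating commanded stepper steps when the sensor is omitted, reduce to simple signed counters, as in this illustrative sketch (scale factors are hypothetical):

```c
/* Illustrative position tracking: encoder pulse counting and stepper step
 * aggregation. Counts are signed; +1 means distal, -1 means proximal. */
#include <stdint.h>

static int32_t encoder_count;  /* pulses accumulated from the encoder        */
static int32_t stepper_count;  /* net steps the motor was instructed to take */

/* Called on each encoder pulse interrupt. */
void on_encoder_pulse(int direction) { encoder_count += direction; }

/* Called whenever the control circuit commands stepper steps. */
void on_steps_commanded(int32_t steps, int direction)
{
    stepper_count += direction * steps;
}

/* Convert either count to a linear position (hypothetical scale factors). */
float encoder_position_mm(void) { return encoder_count * 0.01f;  /* mm/pulse */ }
float stepper_position_mm(void) { return stepper_count * 0.005f; /* mm/step  */ }
```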


In one aspect, the control circuit 710 is configured to drive a firing member such as the I-beam 714 portion of the end effector 702. The control circuit 710 provides a motor set point to a motor control 708a, which provides a drive signal to the motor 704a. The output shaft of the motor 704a is coupled to a torque sensor 744a. The torque sensor 744a is coupled to a transmission 706a which is coupled to the I-beam 714. The transmission 706a comprises movable mechanical elements such as rotating elements and a firing member to control the movement of the I-beam 714 distally and proximally along a longitudinal axis of the end effector 702. In one aspect, the motor 704a may be coupled to the knife gear assembly, which includes a knife gear reduction set that includes a first knife drive gear and a second knife drive gear. A torque sensor 744a provides a firing force feedback signal to the control circuit 710. The firing force signal represents the force required to fire or displace the I-beam 714. A position sensor 734 may be configured to provide the position of the I-beam 714 along the firing stroke or the position of the firing member as a feedback signal to the control circuit 710. The end effector 702 may include additional sensors 738 configured to provide feedback signals to the control circuit 710. When ready to use, the control circuit 710 may provide a firing signal to the motor control 708a. In response to the firing signal, the motor 704a may drive the firing member distally along the longitudinal axis of the end effector 702 from a proximal stroke start position to a stroke end position distal to the stroke start position. As the firing member translates distally, an I-beam 714, with a cutting element positioned at a distal end, advances distally to cut tissue located between the staple cartridge 718 and the anvil 716.


In one aspect, the control circuit 710 is configured to drive a closure member such as the anvil 716 portion of the end effector 702. The control circuit 710 provides a motor set point to a motor control 708b, which provides a drive signal to the motor 704b. The output shaft of the motor 704b is coupled to a torque sensor 744b. The torque sensor 744b is coupled to a transmission 706b which is coupled to the anvil 716. The transmission 706b comprises movable mechanical elements such as rotating elements and a closure member to control the movement of the anvil 716 between the open and closed positions. In one aspect, the motor 704b is coupled to a closure gear assembly, which includes a closure reduction gear set that is supported in meshing engagement with the closure spur gear. The torque sensor 744b provides a closure force feedback signal to the control circuit 710. The closure force feedback signal represents the closure force applied to the anvil 716. The position sensor 734 may be configured to provide the position of the closure member as a feedback signal to the control circuit 710. Additional sensors 738 in the end effector 702 may provide the closure force feedback signal to the control circuit 710. The pivotable anvil 716 is positioned opposite the staple cartridge 718. When ready to use, the control circuit 710 may provide a closure signal to the motor control 708b. In response to the closure signal, the motor 704b advances a closure member to grasp tissue between the anvil 716 and the staple cartridge 718.


In one aspect, the control circuit 710 is configured to rotate a shaft member such as the shaft 740 to rotate the end effector 702. The control circuit 710 provides a motor set point to a motor control 708c, which provides a drive signal to the motor 704c. The output shaft of the motor 704c is coupled to a torque sensor 744c. The torque sensor 744c is coupled to a transmission 706c which is coupled to the shaft 740. The transmission 706c comprises movable mechanical elements such as rotating elements to control the rotation of the shaft 740 clockwise or counterclockwise up to and over 360°. In one aspect, the motor 704c is coupled to the rotational transmission assembly, which includes a tube gear segment that is formed on (or attached to) the proximal end of the proximal closure tube for operable engagement by a rotational gear assembly that is operably supported on the tool mounting plate. The torque sensor 744c provides a rotation force feedback signal to the control circuit 710. The rotation force feedback signal represents the rotation force applied to the shaft 740. The position sensor 734 may be configured to provide the rotational position of the shaft 740 as a feedback signal to the control circuit 710. Additional sensors 738 such as a shaft encoder may provide the rotational position of the shaft 740 to the control circuit 710.


In one aspect, the control circuit 710 is configured to articulate the end effector 702. The control circuit 710 provides a motor set point to a motor control 708d, which provides a drive signal to the motor 704d. The output shaft of the motor 704d is coupled to a torque sensor 744d. The torque sensor 744d is coupled to a transmission 706d which is coupled to an articulation member 742a. The transmission 706d comprises movable mechanical elements such as articulation elements to control the articulation of the end effector 702 ±65°. In one aspect, the motor 704d is coupled to an articulation nut, which is rotatably journaled on the proximal end portion of the distal spine portion and is rotatably driven thereon by an articulation gear assembly. The torque sensor 744d provides an articulation force feedback signal to the control circuit 710. The articulation force feedback signal represents the articulation force applied to the end effector 702. Sensors 738, such as an articulation encoder, may provide the articulation position of the end effector 702 to the control circuit 710.


In another aspect, the articulation function of the robotic surgical system 700 may comprise two articulation members, or links, 742a, 742b. These articulation members 742a, 742b are driven by separate disks on the robot interface (the rack), which are driven by the two motors 704d, 704e. When the separate firing motor 704a is provided, each of the articulation links 742a, 742b can be antagonistically driven with respect to the other link in order to provide a resistive holding motion and a load to the head when it is not moving and to provide an articulation motion as the head is articulated. The articulation members 742a, 742b attach to the head at a fixed radius as the head is rotated. Accordingly, the mechanical advantage of the push-and-pull link changes as the head is rotated. This change in the mechanical advantage may be more pronounced with other articulation link drive systems.


In one aspect, the one or more motors 704a-704e may comprise a brushed DC motor with a gearbox and mechanical links to a firing member, closure member, or articulation member. Another example includes electric motors 704a-704e that operate the movable mechanical elements such as the displacement member, articulation links, closure tube, and shaft. An outside influence is an unmeasured, unpredictable influence of things like tissue, surrounding bodies, and friction on the physical system. Such outside influence can be referred to as drag, which acts in opposition to one of electric motors 704a-704e. The outside influence, such as drag, may cause the operation of the physical system to deviate from a desired operation of the physical system.


In one aspect, the position sensor 734 may be implemented as an absolute positioning system. In one aspect, the position sensor 734 may comprise a magnetic rotary absolute positioning system implemented as an AS5055EQFT single-chip magnetic rotary position sensor available from Austria Microsystems, AG. The position sensor 734 may interface with the control circuit 710 to provide an absolute positioning system. The position sensor 734 may include multiple Hall-effect elements located above a magnet and coupled to a CORDIC processor (the coordinate rotation digital computer, also known as the digit-by-digit method or Volder's algorithm), which implements a simple and efficient algorithm for calculating hyperbolic and trigonometric functions using only addition, subtraction, bitshift, and table lookup operations.
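
For readers unfamiliar with CORDIC, the floating-point sketch below shows the vectoring-mode iteration that recovers the magnet angle from two orthogonal Hall-element components using only additions, subtractions, halvings (bit shifts in fixed-point hardware), and a small arctangent table. It is a generic textbook illustration, not the AS5055EQFT's actual firmware:

```c
/* Generic vectoring-mode CORDIC sketch: rotate (x, y) onto the +x axis,
 * accumulating the rotation angle, which equals atan2(y, x). */
#include <math.h>
#include <stdio.h>

#define CORDIC_ITERS 16

double cordic_atan2(double y, double x)
{
    static const double HALF_PI = 1.57079632679489662;
    double atan_tab[CORDIC_ITERS];            /* a ROM table in hardware     */
    for (int i = 0; i < CORDIC_ITERS; i++)
        atan_tab[i] = atan(ldexp(1.0, -i));   /* atan(2^-i)                  */

    double angle = 0.0;
    if (x < 0.0) {                            /* pre-rotate by +/-90 degrees */
        double t = x;
        if (y >= 0.0) { x = y;  y = -t; angle =  HALF_PI; }
        else          { x = -y; y =  t; angle = -HALF_PI; }
    }
    for (int i = 0; i < CORDIC_ITERS; i++) {
        double dx = ldexp(x, -i), dy = ldexp(y, -i);  /* >> i in fixed point */
        if (y > 0.0) { x += dy; y -= dx; angle += atan_tab[i]; }
        else         { x -= dy; y += dx; angle -= atan_tab[i]; }
    }
    return angle;                             /* sensed magnet angle         */
}

int main(void)
{
    printf("%f vs %f\n", cordic_atan2(1.0, 1.0), atan2(1.0, 1.0)); /* ~0.7854 */
    return 0;
}
```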


In one aspect, the control circuit 710 may be in communication with one or more sensors 738. The sensors 738 may be positioned on the end effector 702 and adapted to operate with the robotic surgical instrument 700 to measure the various derived parameters such as the gap distance versus time, tissue compression versus time, and anvil strain versus time. The sensors 738 may comprise a magnetic sensor, a magnetic field sensor, a strain gauge, a load cell, a pressure sensor, a force sensor, a torque sensor, an inductive sensor such as an eddy current sensor, a resistive sensor, a capacitive sensor, an optical sensor, and/or any other suitable sensor for measuring one or more parameters of the end effector 702. The sensors 738 may include one or more sensors. The sensors 738 may be located on the staple cartridge 718 deck to determine tissue location using segmented electrodes. The torque sensors 744a-744e may be configured to sense force such as firing force, closure force, and/or articulation force, among others. Accordingly, the control circuit 710 can sense (1) the closure load experienced by the distal closure tube and its position, (2) the firing member at the rack and its position, (3) what portion of the staple cartridge 718 has tissue on it, and (4) the load and position on both articulation rods.


In one aspect, the one or more sensors 738 may comprise a strain gauge, such as a micro-strain gauge, configured to measure the magnitude of the strain in the anvil 716 during a clamped condition. The strain gauge provides an electrical signal whose amplitude varies with the magnitude of the strain. The sensors 738 may comprise a pressure sensor configured to detect a pressure generated by the presence of compressed tissue between the anvil 716 and the staple cartridge 718. The sensors 738 may be configured to detect impedance of a tissue section located between the anvil 716 and the staple cartridge 718 that is indicative of the thickness and/or fullness of tissue located therebetween.


In one aspect, the sensors 738 may be implemented as one or more limit switches, electromechanical devices, solid-state switches, Hall-effect devices, magneto-resistive (MR) devices, giant magneto-resistive (GMR) devices, magnetometers, among others. In other implementations, the sensors 738 may be implemented as solid-state switches that operate under the influence of light, such as optical sensors, IR sensors, ultraviolet sensors, among others. Still, the switches may be solid-state devices such as transistors (e.g., FET, junction FET, MOSFET, bipolar, and the like). In other implementations, the sensors 738 may include electrical conductorless switches, ultrasonic switches, accelerometers, and inertial sensors, among others.


In one aspect, the sensors 738 may be configured to measure forces exerted on the anvil 716 by the closure drive system. For example, one or more sensors 738 can be at an interaction point between the closure tube and the anvil 716 to detect the closure forces applied by the closure tube to the anvil 716. The forces exerted on the anvil 716 can be representative of the tissue compression experienced by the tissue section captured between the anvil 716 and the staple cartridge 718. The one or more sensors 738 can be positioned at various interaction points along the closure drive system to detect the closure forces applied to the anvil 716 by the closure drive system. The one or more sensors 738 may be sampled in real time during a clamping operation by the processor of the control circuit 710. The control circuit 710 receives real-time sample measurements to provide and analyze time-based information and assess, in real time, closure forces applied to the anvil 716.
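
A minimal view of this real-time sampling is a fixed-rate loop that timestamps each force reading and checks it as it arrives; the sample rate, buffer size, force limit, and helper functions below are illustrative assumptions:

```c
/* Hypothetical clamping-phase sampling loop with a simple real-time check. */
#include <stdbool.h>
#include <stdint.h>

#define N_SAMPLES 256

extern float read_closure_force_n(void);  /* force sensor read-out (assumed) */
extern void  delay_ms(int ms);            /* timing primitive (assumed)      */

typedef struct { uint32_t t_ms; float force_n; } force_sample_t;

bool sample_clamping(force_sample_t log[N_SAMPLES])
{
    const float limit_n = 80.0f;          /* illustrative closure-force limit */
    for (uint32_t i = 0; i < N_SAMPLES; i++) {
        log[i].t_ms    = i;               /* 1 ms sample period               */
        log[i].force_n = read_closure_force_n();
        if (log[i].force_n > limit_n)     /* assess closure force in real time */
            return false;                 /* flag an excessive-force condition */
        delay_ms(1);
    }
    return true;                          /* clamping completed within limits */
}
```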


In one aspect, a current sensor 736 can be employed to measure the current drawn by each of the motors 704a-704e. The force required to advance any of the movable mechanical elements such as the I-beam 714 corresponds to the current drawn by one of the motors 704a-704e. The force is converted to a digital signal and provided to the control circuit 710. The control circuit 710 can be configured to simulate the response of the actual system of the instrument in the software of the controller. A displacement member can be actuated to move an I-beam 714 in the end effector 702 at or near a target velocity. The robotic surgical instrument 700 can include a feedback controller, which can be any suitable feedback controller, including, but not limited to, a PID controller, a state feedback controller, a linear-quadratic regulator (LQR), and/or an adaptive controller, for example. The robotic surgical instrument 700 can include a power source to convert the signal from the feedback controller into a physical input such as case voltage, PWM voltage, frequency modulated voltage, current, torque, and/or force, for example. Additional details are disclosed in U.S. patent application Ser. No. 15/636,829, titled CLOSED LOOP VELOCITY CONTROL TECHNIQUES FOR ROBOTIC SURGICAL INSTRUMENT, filed Jun. 29, 2017, which is herein incorporated by reference in its entirety.
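
Of the feedback controllers named above, PID is the simplest to sketch. The discrete-time form below, with hypothetical gains and units, would take a velocity error and produce a drive command such as a PWM duty cycle:

```c
/* Generic discrete PID controller sketch; gains and units are illustrative. */
typedef struct {
    float kp, ki, kd;       /* proportional, integral, derivative gains */
    float integral;         /* accumulated error                        */
    float prev_err;         /* previous error, for the derivative term  */
} pid_ctrl_t;

float pid_step(pid_ctrl_t *c, float setpoint, float measured, float dt)
{
    float err = setpoint - measured;          /* e.g., velocity error     */
    c->integral += err * dt;
    float deriv = (err - c->prev_err) / dt;
    c->prev_err = err;
    /* The output becomes the physical input (e.g., PWM voltage) after scaling. */
    return c->kp * err + c->ki * c->integral + c->kd * deriv;
}
```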



FIG. 18 illustrates a block diagram of a surgical instrument 750 programmed to control the distal translation of a displacement member according to one aspect of this disclosure. In one aspect, the surgical instrument 750 is programmed to control the distal translation of a displacement member such as the I-beam 764. The surgical instrument 750 comprises an end effector 752 that may comprise an anvil 766, an I-beam 764 (including a sharp cutting edge), and a removable staple cartridge 768.


The position, movement, displacement, and/or translation of a linear displacement member, such as the I-beam 764, can be measured by an absolute positioning system, sensor arrangement, and position sensor 784. Because the I-beam 764 is coupled to a longitudinally movable drive member, the position of the I-beam 764 can be determined by measuring the position of the longitudinally movable drive member employing the position sensor 784. Accordingly, in the following description, the position, displacement, and/or translation of the I-beam 764 can be determined by the position sensor 784 as described herein. A control circuit 760 may be programmed to control the translation of the displacement member, such as the I-beam 764. The control circuit 760, in some examples, may comprise one or more microcontrollers, microprocessors, or other suitable processors for executing instructions that cause the processor or processors to control the displacement member, e.g., the I-beam 764, in the manner described. In one aspect, a timer/counter 781 provides an output signal, such as the elapsed time or a digital count, to the control circuit 760 to correlate the position of the I-beam 764 as determined by the position sensor 784 with the output of the timer/counter 781 such that the control circuit 760 can determine the position of the I-beam 764 at a specific time (t) relative to a starting position. The timer/counter 781 may be configured to measure elapsed time, count external events, or time external events.


The control circuit 760 may generate a motor set point signal 772. The motor set point signal 772 may be provided to a motor controller 758. The motor controller 758 may comprise one or more circuits configured to provide a motor drive signal 774 to the motor 754 to drive the motor 754 as described herein. In some examples, the motor 754 may be a brushed DC electric motor. For example, the velocity of the motor 754 may be proportional to the motor drive signal 774. In some examples, the motor 754 may be a brushless DC electric motor and the motor drive signal 774 may comprise a PWM signal provided to one or more stator windings of the motor 754. Also, in some examples, the motor controller 758 may be omitted, and the control circuit 760 may generate the motor drive signal 774 directly.


The motor 754 may receive power from an energy source 762. The energy source 762 may be or include a battery, a super capacitor, or any other suitable energy source. The motor 754 may be mechanically coupled to the I-beam 764 via a transmission 756. The transmission 756 may include one or more gears or other linkage components to couple the motor 754 to the I-beam 764. A position sensor 784 may sense a position of the I-beam 764. The position sensor 784 may be or include any type of sensor that is capable of generating position data that indicate a position of the I-beam 764. In some examples, the position sensor 784 may include an encoder configured to provide a series of pulses to the control circuit 760 as the I-beam 764 translates distally and proximally. The control circuit 760 may track the pulses to determine the position of the I-beam 764. Other suitable position sensors may be used, including, for example, a proximity sensor. Other types of position sensors may provide other signals indicating motion of the I-beam 764. Also, in some examples, the position sensor 784 may be omitted. Where the motor 754 is a stepper motor, the control circuit 760 may track the position of the I-beam 764 by aggregating the number and direction of steps that the motor 754 has been instructed to execute. The position sensor 784 may be located in the end effector 752 or at any other portion of the instrument.


The control circuit 760 may be in communication with one or more sensors 788. The sensors 788 may be positioned on the end effector 752 and adapted to operate with the surgical instrument 750 to measure the various derived parameters such as gap distance versus time, tissue compression versus time, and anvil strain versus time. The sensors 788 may comprise a magnetic sensor, a magnetic field sensor, a strain gauge, a pressure sensor, a force sensor, an inductive sensor such as an eddy current sensor, a resistive sensor, a capacitive sensor, an optical sensor, and/or any other suitable sensor for measuring one or more parameters of the end effector 752. The sensors 788 may include one or more sensors.


The one or more sensors 788 may comprise a strain gauge, such as a micro-strain gauge, configured to measure the magnitude of the strain in the anvil 766 during a clamped condition. The strain gauge provides an electrical signal whose amplitude varies with the magnitude of the strain. The sensors 788 may comprise a pressure sensor configured to detect a pressure generated by the presence of compressed tissue between the anvil 766 and the staple cartridge 768. The sensors 788 may be configured to detect impedance of a tissue section located between the anvil 766 and the staple cartridge 768 that is indicative of the thickness and/or fullness of tissue located therebetween.


The sensors 788 may be configured to measure forces exerted on the anvil 766 by a closure drive system. For example, one or more sensors 788 can be at an interaction point between a closure tube and the anvil 766 to detect the closure forces applied by the closure tube to the anvil 766. The forces exerted on the anvil 766 can be representative of the tissue compression experienced by the tissue section captured between the anvil 766 and the staple cartridge 768. The one or more sensors 788 can be positioned at various interaction points along the closure drive system to detect the closure forces applied to the anvil 766 by the closure drive system. The one or more sensors 788 may be sampled in real time during a clamping operation by a processor of the control circuit 760. The control circuit 760 receives real-time sample measurements to provide and analyze time-based information and assess, in real time, closure forces applied to the anvil 766.


A current sensor 786 can be employed to measure the current drawn by the motor 754. The force required to advance the I-beam 764 corresponds to the current drawn by the motor 754. The force is converted to a digital signal and provided to the control circuit 760.


The control circuit 760 can be configured to simulate the response of the actual system of the instrument in the software of the controller. A displacement member can be actuated to move an I-beam 764 in the end effector 752 at or near a target velocity. The surgical instrument 750 can include a feedback controller, which can be any suitable feedback controller, including, but not limited to, a PID controller, a state feedback controller, a linear-quadratic regulator (LQR), and/or an adaptive controller, for example. The surgical instrument 750 can include a power source to convert the signal from the feedback controller into a physical input such as case voltage, PWM voltage, frequency modulated voltage, current, torque, and/or force, for example.


The actual drive system of the surgical instrument 750 is configured to drive the displacement member, cutting member, or I-beam 764 by a brushed DC motor with a gearbox and mechanical links to an articulation and/or knife system. Another example is the electric motor 754 that operates the displacement member and the articulation driver, for example, of an interchangeable shaft assembly. An outside influence is an unmeasured, unpredictable influence of things like tissue, surrounding bodies, and friction on the physical system. Such outside influence can be referred to as drag, which acts in opposition to the electric motor 754. The outside influence, such as drag, may cause the operation of the physical system to deviate from a desired operation of the physical system.


Various example aspects are directed to a surgical instrument 750 comprising an end effector 752 with motor-driven surgical stapling and cutting implements. For example, a motor 754 may drive a displacement member distally and proximally along a longitudinal axis of the end effector 752. The end effector 752 may comprise a pivotable anvil 766 and, when configured for use, a staple cartridge 768 positioned opposite the anvil 766. A clinician may grasp tissue between the anvil 766 and the staple cartridge 768, as described herein. When ready to use the instrument 750, the clinician may provide a firing signal, for example by depressing a trigger of the instrument 750. In response to the firing signal, the motor 754 may drive the displacement member distally along the longitudinal axis of the end effector 752 from a proximal stroke begin position to a stroke end position distal of the stroke begin position. As the displacement member translates distally, an I-beam 764 with a cutting element positioned at a distal end, may cut the tissue between the staple cartridge 768 and the anvil 766.


In various examples, the surgical instrument 750 may comprise a control circuit 760 programmed to control the distal translation of the displacement member, such as the I-beam 764, for example, based on one or more tissue conditions. The control circuit 760 may be programmed to sense tissue conditions, such as thickness, either directly or indirectly, as described herein. The control circuit 760 may be programmed to select a firing control program based on tissue conditions. A firing control program may describe the distal motion of the displacement member. Different firing control programs may be selected to better treat different tissue conditions. For example, when thicker tissue is present, the control circuit 760 may be programmed to translate the displacement member at a lower velocity and/or with lower power. When thinner tissue is present, the control circuit 760 may be programmed to translate the displacement member at a higher velocity and/or with higher power.


In some examples, the control circuit 760 may initially operate the motor 754 in an open-loop configuration for a first open-loop portion of a stroke of the displacement member. Based on a response of the instrument 750 during the open-loop portion of the stroke, the control circuit 760 may select a firing control program. The response of the instrument may include a translation distance of the displacement member during the open-loop portion, a time elapsed during the open-loop portion, the energy provided to the motor 754 during the open-loop portion, a sum of pulse widths of a motor drive signal, etc. After the open-loop portion, the control circuit 760 may implement the selected firing control program for a second portion of the displacement member stroke. For example, during the closed-loop portion of the stroke, the control circuit 760 may modulate the motor 754 based on translation data describing a position of the displacement member in a closed-loop manner to translate the displacement member at a constant velocity. Additional details are disclosed in U.S. patent application Ser. No. 15/720,852, titled SYSTEM AND METHODS FOR CONTROLLING A DISPLAY OF A SURGICAL INSTRUMENT, filed Sep. 29, 2017, which is herein incorporated by reference in its entirety.



FIG. 19 is a schematic diagram of a surgical instrument 790 configured to control various functions according to one aspect of this disclosure. In one aspect, the surgical instrument 790 is programmed to control distal translation of a displacement member such as the I-beam 764. The surgical instrument 790 comprises an end effector 792 that may comprise an anvil 766, an I-beam 764, and a removable staple cartridge 768 which may be interchanged with an RF cartridge 796 (shown in dashed line).


In one aspect, the sensors 788 may be implemented as limit switches, electromechanical devices, solid-state switches, Hall-effect devices, MR devices, GMR devices, or magnetometers, among others. In other implementations, the sensors 788 may be solid-state switches that operate under the influence of light, such as optical sensors, IR sensors, and ultraviolet sensors, among others. Still, the switches may be solid-state devices such as transistors (e.g., FET, junction FET, MOSFET, bipolar, and the like). In other implementations, the sensors 788 may include electrical conductorless switches, ultrasonic switches, accelerometers, and inertial sensors, among others.


In one aspect, the position sensor 784 may be implemented as an absolute positioning system comprising a magnetic rotary absolute positioning system implemented as an AS5055EQFT single-chip magnetic rotary position sensor available from Austria Microsystems, AG. The position sensor 784 may interface with the control circuit 760 to provide an absolute positioning system. The position sensor 784 may include multiple Hall-effect elements located above a magnet and coupled to a CORDIC processor (the coordinate rotation digital computer, also known as the digit-by-digit method or Volder's algorithm), which implements a simple and efficient algorithm for calculating hyperbolic and trigonometric functions using only addition, subtraction, bitshift, and table lookup operations.


In one aspect, the I-beam 764 may be implemented as a knife member comprising a knife body that operably supports a tissue cutting blade thereon and may further include anvil engagement tabs or features and channel engagement features or a foot. In one aspect, the staple cartridge 768 may be implemented as a standard (mechanical) surgical fastener cartridge. In one aspect, the RF cartridge 796 may be implemented as a radio frequency (RF) energy cartridge. These and other sensor arrangements are described in commonly-owned U.S. patent application Ser. No. 15/628,175, titled TECHNIQUES FOR ADAPTIVE CONTROL OF MOTOR VELOCITY OF A SURGICAL STAPLING AND CUTTING INSTRUMENT, filed Jun. 20, 2017, which is herein incorporated by reference in its entirety.


The position, movement, displacement, and/or translation of a linear displacement member, such as the I-beam 764, can be measured by an absolute positioning system, sensor arrangement, and position sensor represented as position sensor 784. Because the I-beam 764 is coupled to the longitudinally movable drive member, the position of the I-beam 764 can be determined by measuring the position of the longitudinally movable drive member employing the position sensor 784. Accordingly, in the following description, the position, displacement, and/or translation of the I-beam 764 can be determined by the position sensor 784 as described herein. A control circuit 760 may be programmed to control the translation of the displacement member, such as the I-beam 764, as described herein. The control circuit 760, in some examples, may comprise one or more microcontrollers, microprocessors, or other suitable processors for executing instructions that cause the processor or processors to control the displacement member, e.g., the I-beam 764, in the manner described. In one aspect, a timer/counter 781 provides an output signal, such as the elapsed time or a digital count, to the control circuit 760 to correlate the position of the I-beam 764 as determined by the position sensor 784 with the output of the timer/counter 781 such that the control circuit 760 can determine the position of the I-beam 764 at a specific time (t) relative to a starting position. The timer/counter 781 may be configured to measure elapsed time, count external events, or time external events.


The control circuit 760 may generate a motor set point signal 772. The motor set point signal 772 may be provided to a motor controller 758. The motor controller 758 may comprise one or more circuits configured to provide a motor drive signal 774 to the motor 754 to drive the motor 754 as described herein. In some examples, the motor 754 may be a brushed DC electric motor. For example, the velocity of the motor 754 may be proportional to the motor drive signal 774. In some examples, the motor 754 may be a brushless DC electric motor and the motor drive signal 774 may comprise a PWM signal provided to one or more stator windings of the motor 754. Also, in some examples, the motor controller 758 may be omitted, and the control circuit 760 may generate the motor drive signal 774 directly.


The motor 754 may receive power from an energy source 762. The energy source 762 may be or include a battery, a super capacitor, or any other suitable energy source. The motor 754 may be mechanically coupled to the I-beam 764 via a transmission 756. The transmission 756 may include one or more gears or other linkage components to couple the motor 754 to the I-beam 764. A position sensor 784 may sense a position of the I-beam 764. The position sensor 784 may be or include any type of sensor that is capable of generating position data that indicate a position of the I-beam 764. In some examples, the position sensor 784 may include an encoder configured to provide a series of pulses to the control circuit 760 as the I-beam 764 translates distally and proximally. The control circuit 760 may track the pulses to determine the position of the I-beam 764. Other suitable position sensors may be used, including, for example, a proximity sensor. Other types of position sensors may provide other signals indicating motion of the I-beam 764. Also, in some examples, the position sensor 784 may be omitted. Where the motor 754 is a stepper motor, the control circuit 760 may track the position of the I-beam 764 by aggregating the number and direction of steps that the motor has been instructed to execute. The position sensor 784 may be located in the end effector 792 or at any other portion of the instrument.


The control circuit 760 may be in communication with one or more sensors 788. The sensors 788 may be positioned on the end effector 792 and adapted to operate with the surgical instrument 790 to measure the various derived parameters such as gap distance versus time, tissue compression versus time, and anvil strain versus time. The sensors 788 may comprise a magnetic sensor, a magnetic field sensor, a strain gauge, a pressure sensor, a force sensor, an inductive sensor such as an eddy current sensor, a resistive sensor, a capacitive sensor, an optical sensor, and/or any other suitable sensor for measuring one or more parameters of the end effector 792. The sensors 788 may include one or more sensors.


The one or more sensors 788 may comprise a strain gauge, such as a micro-strain gauge, configured to measure the magnitude of the strain in the anvil 766 during a clamped condition. The strain gauge provides an electrical signal whose amplitude varies with the magnitude of the strain. The sensors 788 may comprise a pressure sensor configured to detect a pressure generated by the presence of compressed tissue between the anvil 766 and the staple cartridge 768. The sensors 788 may be configured to detect impedance of a tissue section located between the anvil 766 and the staple cartridge 768 that is indicative of the thickness and/or fullness of tissue located therebetween.


The sensors 788 may be configured to measure forces exerted on the anvil 766 by the closure drive system. For example, one or more sensors 788 can be at an interaction point between a closure tube and the anvil 766 to detect the closure forces applied by the closure tube to the anvil 766. The forces exerted on the anvil 766 can be representative of the tissue compression experienced by the tissue section captured between the anvil 766 and the staple cartridge 768. The one or more sensors 788 can be positioned at various interaction points along the closure drive system to detect the closure forces applied to the anvil 766 by the closure drive system. The one or more sensors 788 may be sampled in real time during a clamping operation by a processor portion of the control circuit 760. The control circuit 760 receives real-time sample measurements to provide and analyze time-based information and assess, in real time, closure forces applied to the anvil 766.


A current sensor 786 can be employed to measure the current drawn by the motor 754. The force required to advance the I-beam 764 corresponds to the current drawn by the motor 754. The force is converted to a digital signal and provided to the control circuit 760.


An RF energy source 794 is coupled to the end effector 792 and is applied to the RF cartridge 796 when the RF cartridge 796 is loaded in the end effector 792 in place of the staple cartridge 768. The control circuit 760 controls the delivery of the RF energy to the RF cartridge 796.


Additional details are disclosed in U.S. patent application Ser. No. 15/636,096, titled SURGICAL SYSTEM COUPLABLE WITH STAPLE CARTRIDGE AND RADIO FREQUENCY CARTRIDGE, AND METHOD OF USING SAME, filed Jun. 28, 2017, which is herein incorporated by reference in its entirety.


Generator Hardware


FIG. 20 is a simplified block diagram of a generator 800 configured to provide inductorless tuning, among other benefits. Additional details of the generator 800 are described in U.S. Pat. No. 9,060,775, titled SURGICAL GENERATOR FOR ULTRASONIC AND ELECTROSURGICAL DEVICES, which issued on Jun. 23, 2015, which is herein incorporated by reference in its entirety. The generator 800 may comprise a patient isolated stage 802 in communication with a non-isolated stage 804 via a power transformer 806. A secondary winding 808 of the power transformer 806 is contained in the isolated stage 802 and may comprise a tapped configuration (e.g., a center-tapped or a non-center-tapped configuration) to define drive signal outputs 810a, 810b, 810c for delivering drive signals to different surgical instruments, such as, for example, an ultrasonic surgical instrument, an RF electrosurgical instrument, and a multifunction surgical instrument which includes ultrasonic and RF energy modes that can be delivered alone or simultaneously. In particular, drive signal outputs 810a, 810c may output an ultrasonic drive signal (e.g., a 420V root-mean-square (RMS) drive signal) to an ultrasonic surgical instrument, and drive signal outputs 810b, 810c may output an RF electrosurgical drive signal (e.g., a 100V RMS drive signal) to an RF electrosurgical instrument, with the drive signal output 810b corresponding to the center tap of the power transformer 806.


In certain forms, the ultrasonic and electrosurgical drive signals may be provided simultaneously to distinct surgical instruments and/or to a single surgical instrument, such as the multifunction surgical instrument, having the capability to deliver both ultrasonic and electrosurgical energy to tissue. It will be appreciated that the electrosurgical signal, provided either to a dedicated electrosurgical instrument or to a combined multifunction ultrasonic/electrosurgical instrument, may be either a therapeutic- or sub-therapeutic-level signal, where the sub-therapeutic signal can be used, for example, to monitor tissue or instrument conditions and provide feedback to the generator. For example, the ultrasonic and RF signals can be delivered separately or simultaneously from a generator with a single output port in order to provide the desired output signal to the surgical instrument, as will be discussed in more detail below. Accordingly, the generator can combine the ultrasonic and electrosurgical RF energies and deliver the combined energies to the multifunction ultrasonic/electrosurgical instrument. Bipolar electrodes can be placed on one or both jaws of the end effector. One jaw may be driven by ultrasonic energy in addition to electrosurgical RF energy, working simultaneously. The ultrasonic energy may be employed to dissect tissue, while the electrosurgical RF energy may be employed for vessel sealing.


The non-isolated stage 804 may comprise a power amplifier 812 having an output connected to a primary winding 814 of the power transformer 806. In certain forms, the power amplifier 812 may comprise a push-pull amplifier. For example, the non-isolated stage 804 may further comprise a logic device 816 for supplying a digital output to a digital-to-analog converter (DAC) circuit 818, which in turn supplies a corresponding analog signal to an input of the power amplifier 812. In certain forms, the logic device 816 may comprise a programmable gate array (PGA), a field-programmable gate array (FPGA), or a programmable logic device (PLD), among other logic circuits, for example. The logic device 816, by virtue of controlling the input of the power amplifier 812 via the DAC circuit 818, may therefore control any of a number of parameters (e.g., frequency, waveform shape, waveform amplitude) of drive signals appearing at the drive signal outputs 810a, 810b, 810c. In certain forms and as discussed below, the logic device 816, in conjunction with a processor (e.g., a DSP discussed below), may implement a number of DSP-based and/or other control algorithms to control parameters of the drive signals output by the generator 800.


Power may be supplied to a power rail of the power amplifier 812 by a switch-mode regulator 820, e.g., a power converter. In certain forms, the switch-mode regulator 820 may comprise an adjustable buck regulator, for example. The non-isolated stage 804 may further comprise a first processor 822, which in one form may comprise a DSP processor such as an Analog Devices ADSP-21469 SHARC DSP, available from Analog Devices, Norwood, MA, for example, although in various forms any suitable processor may be employed. In certain forms, the DSP processor 822 may control the operation of the switch-mode regulator 820 responsive to voltage feedback data received from the power amplifier 812 by the DSP processor 822 via an ADC circuit 824. In one form, for example, the DSP processor 822 may receive as input, via the ADC circuit 824, the waveform envelope of a signal (e.g., an RF signal) being amplified by the power amplifier 812. The DSP processor 822 may then control the switch-mode regulator 820 (e.g., via a PWM output) such that the rail voltage supplied to the power amplifier 812 tracks the waveform envelope of the amplified signal. By dynamically modulating the rail voltage of the power amplifier 812 based on the waveform envelope, the efficiency of the power amplifier 812 may be significantly improved relative to fixed rail voltage amplifier schemes.
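
This envelope-tracking idea reduces to a short control step: read the envelope, add headroom, and command the buck regulator. The sketch below assumes hypothetical read and setpoint functions and an illustrative headroom margin:

```c
/* Hypothetical envelope-tracking step for the amplifier rail voltage. */
extern float adc_read_envelope_v(void);    /* via ADC circuit 824 (assumed)   */
extern void  set_buck_setpoint_v(float v); /* PWM to regulator 820 (assumed)  */

void track_rail_voltage(void)
{
    const float headroom_v = 5.0f;         /* illustrative margin above peaks */
    float envelope_v = adc_read_envelope_v();
    /* The rail follows the waveform envelope instead of a fixed worst-case
     * rail, reducing the power dissipated across the amplifier. */
    set_buck_setpoint_v(envelope_v + headroom_v);
}
```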


In certain forms, the logic device 816, in conjunction with the DSP processor 822, may implement a digital synthesis circuit such as a direct digital synthesizer (DDS) control scheme to control the waveform shape, frequency, and/or amplitude of drive signals output by the generator 800. In one form, for example, the logic device 816 may implement a DDS control algorithm by recalling waveform samples stored in a dynamically updated lookup table (LUT), such as a RAM LUT, which may be embedded in an FPGA. This control algorithm is particularly useful for ultrasonic applications in which an ultrasonic transducer may be driven by a clean sinusoidal current at its resonant frequency. Because other frequencies may excite parasitic resonances, minimizing or reducing the total distortion of the motional branch current may correspondingly minimize or reduce undesirable resonance effects. Because the waveform shape of a drive signal output by the generator 800 is impacted by various sources of distortion present in the output drive circuit (e.g., the power transformer 806, the power amplifier 812), voltage and current feedback data based on the drive signal may be input into an algorithm, such as an error control algorithm implemented by the DSP processor 822, which compensates for distortion by suitably pre-distorting or modifying the waveform samples stored in the LUT on a dynamic, ongoing basis (e.g., in real time). In one form, the amount or degree of pre-distortion applied to the LUT samples may be based on the error between a computed motional branch current and a desired current waveform shape, with the error being determined on a sample-by-sample basis. In this way, the pre-distorted LUT samples, when processed through the drive circuit, may result in a motional branch drive signal having the desired waveform shape (e.g., sinusoidal) for optimally driving the ultrasonic transducer. In such forms, the LUT waveform samples will therefore not represent the desired waveform shape of the drive signal, but rather the waveform shape that is required to ultimately produce the desired waveform shape of the motional branch drive signal when distortion effects are taken into account.
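
A DDS with a dynamically pre-distorted LUT can be pictured as a phase accumulator indexing a RAM table whose entries are nudged by a per-sample error term. The sketch below is a generic illustration under assumed table size and a hypothetical correction gain, not the generator's actual algorithm:

```c
/* Generic DDS sketch: a 32-bit phase accumulator indexes a RAM LUT whose
 * samples are pre-distorted on an ongoing, sample-by-sample basis. */
#include <stdint.h>
#include <math.h>

#define LUT_BITS 10
#define LUT_SIZE (1 << LUT_BITS)

static float    lut[LUT_SIZE];             /* dynamically updated RAM LUT    */
static uint32_t phase_acc, phase_inc;      /* phase accumulator and step     */

void dds_init(double f_out_hz, double f_clk_hz)
{
    const double TWO_PI = 6.28318530717958648;
    for (int i = 0; i < LUT_SIZE; i++)     /* start from a pure sine table   */
        lut[i] = (float)sin(TWO_PI * i / LUT_SIZE);
    phase_inc = (uint32_t)(f_out_hz / f_clk_hz * 4294967296.0); /* f/fclk * 2^32 */
}

/* One DDS tick: emit the next sample, then nudge the LUT entry using the
 * error between the desired and the computed motional branch current. */
float dds_tick(float desired_i, float measured_i)
{
    uint32_t idx = phase_acc >> (32 - LUT_BITS);
    float out = lut[idx];
    lut[idx] -= 0.01f * (measured_i - desired_i);  /* hypothetical gain       */
    phase_acc += phase_inc;                        /* wraps naturally at 2^32 */
    return out;
}
```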


The non-isolated stage 804 may further comprise a first ADC circuit 826 and a second ADC circuit 828 coupled to the output of the power transformer 806 via respective isolation transformers 830, 832 for respectively sampling the voltage and current of drive signals output by the generator 800. In certain forms, the ADC circuits 826, 828 may be configured to sample at high speeds (e.g., 80 megasamples per second (MSPS)) to enable oversampling of the drive signals. In one form, for example, the sampling speed of the ADC circuits 826, 828 may enable approximately 200× (depending on frequency) oversampling of the drive signals. In certain forms, the sampling operations of the ADC circuits 826, 828 may be performed by a single ADC circuit receiving input voltage and current signals via a two-way multiplexer. The use of high-speed sampling in forms of the generator 800 may enable, among other things, calculation of the complex current flowing through the motional branch (which may be used in certain forms to implement the DDS-based waveform shape control described above), accurate digital filtering of the sampled signals, and calculation of real power consumption with a high degree of precision. Voltage and current feedback data output by the ADC circuits 826, 828 may be received and processed (e.g., via a first-in-first-out (FIFO) buffer and multiplexer) by the logic device 816 and stored in data memory for subsequent retrieval by, for example, the DSP processor 822. As noted above, voltage and current feedback data may be used as input to an algorithm for pre-distorting or modifying LUT waveform samples on a dynamic and ongoing basis. In certain forms, this may require each stored voltage and current feedback data pair to be indexed based on, or otherwise associated with, a corresponding LUT sample that was output by the logic device 816 when the voltage and current feedback data pair was acquired. Synchronization of the LUT samples and the voltage and current feedback data in this manner contributes to the correct timing and stability of the pre-distortion algorithm.
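
One of the quantities this oversampling enables, real power, is simply the mean of the sample-by-sample product of voltage and current. A minimal sketch of that computation:

```c
/* Real (average) power from synchronously sampled voltage and current:
 * P = (1/N) * sum over n of v[n] * i[n]. */
float real_power_w(const float *v, const float *i, int n)
{
    float acc = 0.0f;
    for (int k = 0; k < n; k++)
        acc += v[k] * i[k];   /* instantaneous power at each sample */
    return acc / (float)n;    /* mean over the window -> real power */
}
```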


In certain forms, the voltage and current feedback data may be used to control the frequency and/or amplitude (e.g., current amplitude) of the drive signals. In one form, for example, voltage and current feedback data may be used to determine impedance phase. The frequency of the drive signal may then be controlled to minimize or reduce the difference between the determined impedance phase and an impedance phase setpoint (e.g., 0°), thereby minimizing or reducing the effects of harmonic distortion and correspondingly enhancing impedance phase measurement accuracy. The determination of impedance phase and of a frequency control signal may be implemented in the DSP processor 822, for example, with the frequency control signal being supplied as input to a DDS control algorithm implemented by the logic device 816.
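
A toy version of this phase-locking step is shown below: the impedance phase is estimated from in-phase and quadrature components of the voltage-current relationship, and the drive frequency is nudged toward the 0° setpoint. The loop gain and the helper function are assumptions:

```c
/* Hypothetical impedance-phase frequency control step. */
#include <math.h>

extern void set_drive_frequency_hz(double f);  /* to the DDS (assumed)       */

double update_drive_frequency(double f_hz, double vi_inphase, double vi_quad)
{
    const double phase_setpoint_rad = 0.0;     /* 0 degree setpoint          */
    const double k_hz_per_rad = 50.0;          /* illustrative loop gain     */
    double phase = atan2(vi_quad, vi_inphase); /* measured impedance phase   */
    f_hz -= k_hz_per_rad * (phase - phase_setpoint_rad);
    set_drive_frequency_hz(f_hz);              /* feeds the DDS algorithm    */
    return f_hz;
}
```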


In another form, for example, the current feedback data may be monitored in order to maintain the current amplitude of the drive signal at a current amplitude setpoint. The current amplitude setpoint may be specified directly or determined indirectly based on specified voltage amplitude and power setpoints. In certain forms, control of the current amplitude may be implemented by a control algorithm, such as, for example, a proportional-integral-derivative (PID) control algorithm, in the DSP processor 822. Variables controlled by the control algorithm to suitably control the current amplitude of the drive signal may include, for example, the scaling of the LUT waveform samples stored in the logic device 816 and/or the full-scale output voltage of the DAC circuit 818 (which supplies the input to the power amplifier 812) via a DAC circuit 834.
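
Since a generic PID form was sketched earlier, the fragment below shows only the amplitude-specific part: a velocity-form PI update trimming the LUT scale factor so the measured current amplitude tracks its setpoint. The gains and clamping are illustrative assumptions:

```c
/* Hypothetical current-amplitude loop acting on the LUT scale factor. */
static float lut_scale = 1.0f;  /* multiplies the LUT waveform samples */
static float prev_amp_err;

float update_current_amplitude(float i_setpoint, float i_measured, float dt)
{
    const float kp = 0.05f, ki = 0.5f;         /* illustrative gains         */
    float err = i_setpoint - i_measured;
    /* Velocity-form PI: adjust the scale incrementally each control step. */
    lut_scale += kp * (err - prev_amp_err) + ki * err * dt;
    prev_amp_err = err;
    if (lut_scale < 0.0f) lut_scale = 0.0f;    /* keep the scale physical    */
    return lut_scale;  /* alternatively applied to the DAC full-scale output */
}
```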


The non-isolated stage 804 may further comprise a second processor 836 for providing, among other things, user interface (UI) functionality. In one form, the UI processor 836 may comprise an Atmel AT91SAM9263 processor having an ARM926EJ-S core, available from Atmel Corporation, San Jose, California, for example. Examples of UI functionality supported by the UI processor 836 may include audible and visual user feedback, communication with peripheral devices (e.g., via a USB interface), communication with a foot switch, communication with an input device (e.g., a touch screen display) and communication with an output device (e.g., a speaker). The UI processor 836 may communicate with the DSP processor 822 and the logic device 816 (e.g., via SPI buses). Although the UI processor 836 may primarily support UI functionality, it may also coordinate with the DSP processor 822 to implement hazard mitigation in certain forms. For example, the UI processor 836 may be programmed to monitor various aspects of user input and/or other inputs (e.g., touch screen inputs, foot switch inputs, temperature sensor inputs) and may disable the drive output of the generator 800 when an erroneous condition is detected.


In certain forms, both the DSP processor 822 and the UI processor 836, for example, may determine and monitor the operating state of the generator 800. For the DSP processor 822, the operating state of the generator 800 may dictate, for example, which control and/or diagnostic processes are implemented by the DSP processor 822. For the UI processor 836, the operating state of the generator 800 may dictate, for example, which elements of a UI (e.g., display screens, sounds) are presented to a user. The respective DSP and UI processors 822, 836 may independently maintain the current operating state of the generator 800 and recognize and evaluate possible transitions out of the current operating state. The DSP processor 822 may function as the master in this relationship and determine when transitions between operating states are to occur. The UI processor 836 may be aware of valid transitions between operating states and may confirm if a particular transition is appropriate. For example, when the DSP processor 822 instructs the UI processor 836 to transition to a specific state, the UI processor 836 may verify that the requested transition is valid. In the event that a requested transition between states is determined to be invalid by the UI processor 836, the UI processor 836 may cause the generator 800 to enter a failure mode.
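
A minimal sketch of the transition-validation idea, assuming a hypothetical set of operating states; the state names and the transition table are illustrative, not the device's actual states.

```python
# Hypothetical operating states and the transitions the UI processor
# would treat as valid; the real state set is device-specific.
VALID_TRANSITIONS = {
    "IDLE": {"READY", "FAILURE"},
    "READY": {"ACTIVE", "IDLE", "FAILURE"},
    "ACTIVE": {"READY", "FAILURE"},
    "FAILURE": set(),
}

def request_transition(current_state, requested_state):
    """UI-processor-side check of a transition commanded by the DSP master.

    Returns the new state, or "FAILURE" if the request is invalid.
    """
    if requested_state in VALID_TRANSITIONS.get(current_state, set()):
        return requested_state
    return "FAILURE"   # invalid request forces the failure mode

state = "READY"
state = request_transition(state, "ACTIVE")   # valid -> "ACTIVE"
state = request_transition(state, "IDLE")     # invalid -> "FAILURE"
```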


The non-isolated stage 804 may further comprise a controller 838 for monitoring input devices (e.g., a capacitive touch sensor used for turning the generator 800 on and off, a capacitive touch screen). In certain forms, the controller 838 may comprise at least one processor and/or other controller device in communication with the UI processor 836. In one form, for example, the controller 838 may comprise a processor (e.g., a Mega168 8-bit controller available from Atmel) configured to monitor user input provided via one or more capacitive touch sensors. In one form, the controller 838 may comprise a touch screen controller (e.g., a QT5480 touch screen controller available from Atmel) to control and manage the acquisition of touch data from a capacitive touch screen.


In certain forms, when the generator 800 is in a “power off” state, the controller 838 may continue to receive operating power (e.g., via a line from a power supply of the generator 800, such as the power supply 854 discussed below). In this way, the controller 838 may continue to monitor an input device (e.g., a capacitive touch sensor located on a front panel of the generator 800) for turning the generator 800 on and off. When the generator 800 is in the power off state, the controller 838 may wake the power supply (e.g., enable operation of one or more DC/DC voltage converters 856 of the power supply 854) if activation of the “on/off” input device by a user is detected. The controller 838 may therefore initiate a sequence for transitioning the generator 800 to a “power on” state. Conversely, the controller 838 may initiate a sequence for transitioning the generator 800 to the power off state if activation of the “on/off” input device is detected when the generator 800 is in the power on state. In certain forms, for example, the controller 838 may report activation of the “on/off” input device to the UI processor 836, which in turn implements the necessary process sequence for transitioning the generator 800 to the power off state. In such forms, the controller 838 may have no independent ability for causing the removal of power from the generator 800 after its power on state has been established.


In certain forms, the controller 838 may cause the generator 800 to provide audible or other sensory feedback for alerting the user that a power on or power off sequence has been initiated. Such an alert may be provided at the beginning of a power on or power off sequence and prior to the commencement of other processes associated with the sequence.


In certain forms, the isolated stage 802 may comprise an instrument interface circuit 840 to, for example, provide a communication interface between a control circuit of a surgical instrument (e.g., a control circuit comprising handpiece switches) and components of the non-isolated stage 804, such as, for example, the logic device 816, the DSP processor 822, and/or the UI processor 836. The instrument interface circuit 840 may exchange information with components of the non-isolated stage 804 via a communication link that maintains a suitable degree of electrical isolation between the isolated and non-isolated stages 802, 804, such as, for example, an IR-based communication link. Power may be supplied to the instrument interface circuit 840 using, for example, a low-dropout voltage regulator powered by an isolation transformer driven from the non-isolated stage 804.


In one form, the instrument interface circuit 840 may comprise a logic circuit 842 (e.g., logic circuit, programmable logic circuit, PGA, FPGA, PLD) in communication with a signal conditioning circuit 844. The signal conditioning circuit 844 may be configured to receive a periodic signal from the logic circuit 842 (e.g., a 2 kHz square wave) to generate a bipolar interrogation signal having an identical frequency. The interrogation signal may be generated, for example, using a bipolar current source fed by a differential amplifier. The interrogation signal may be communicated to a surgical instrument control circuit (e.g., by using a conductive pair in a cable that connects the generator 800 to the surgical instrument) and monitored to determine a state or configuration of the control circuit. The control circuit may comprise a number of switches, resistors, and/or diodes to modify one or more characteristics (e.g., amplitude, rectification) of the interrogation signal such that a state or configuration of the control circuit is uniquely discernable based on the one or more characteristics. In one form, for example, the signal conditioning circuit 844 may comprise an ADC circuit for generating samples of a voltage signal appearing across inputs of the control circuit resulting from passage of the interrogation signal therethrough. The logic circuit 842 (or a component of the non-isolated stage 804) may then determine the state or configuration of the control circuit based on the ADC circuit samples.
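
As a rough illustration of how a state might be discerned from the modified interrogation signal, the sketch below classifies ADC samples by two assumed features, peak amplitude and half-cycle asymmetry; the thresholds and state names are hypothetical, since the real decoding depends on the instrument's particular resistor/diode network.

```python
def classify_control_state(samples):
    """Infer a switch state from ADC samples of the interrogation signal.

    Uses two illustrative features: peak amplitude (set by series
    resistance) and asymmetry between half-cycles (set by diode
    rectification). Thresholds are hypothetical.
    """
    peak_pos = max(samples)
    peak_neg = -min(samples)
    amplitude = max(peak_pos, peak_neg)
    rectified = abs(peak_pos - peak_neg) > 0.2 * amplitude

    if amplitude < 0.5:
        return "switch_a_closed"   # low amplitude: low-value resistor
    if rectified:
        return "switch_b_closed"   # one polarity clipped by a diode
    return "no_switch_closed"

samples = [0.9, 0.7, 0.0, -0.2, -0.1]    # asymmetric: diode in circuit
state = classify_control_state(samples)   # -> "switch_b_closed"
```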


In one form, the instrument interface circuit 840 may comprise a first data circuit interface 846 to enable information exchange between the logic circuit 842 (or other element of the instrument interface circuit 840) and a first data circuit disposed in or otherwise associated with a surgical instrument. In certain forms, for example, a first data circuit may be disposed in a cable integrally attached to a surgical instrument handpiece or in an adaptor for interfacing a specific surgical instrument type or model with the generator 800. The first data circuit may be implemented in any suitable manner and may communicate with the generator according to any suitable protocol, including, for example, the protocols described herein. In certain forms, the first data circuit may comprise a non-volatile storage device, such as an EEPROM device. In certain forms, the first data circuit interface 846 may be implemented separately from the logic circuit 842 and comprise suitable circuitry (e.g., discrete logic devices, a processor) to enable communication between the logic circuit 842 and the first data circuit. In other forms, the first data circuit interface 846 may be integral with the logic circuit 842.


In certain forms, the first data circuit may store information pertaining to the particular surgical instrument with which it is associated. Such information may include, for example, a model number, a serial number, a number of operations in which the surgical instrument has been used, and/or any other type of information. This information may be read by the instrument interface circuit 840 (e.g., by the logic circuit 842) and transferred to a component of the non-isolated stage 804 (e.g., to the logic device 816, the DSP processor 822, and/or the UI processor 836) for presentation to a user via an output device and/or for controlling a function or operation of the generator 800. Additionally, any type of information may be communicated to the first data circuit for storage therein via the first data circuit interface 846 (e.g., using the logic circuit 842). Such information may comprise, for example, an updated number of operations in which the surgical instrument has been used and/or dates and/or times of its usage.


As discussed previously, a surgical instrument may be detachable from a handpiece (e.g., the multifunction surgical instrument may be detachable from the handpiece) to promote instrument interchangeability and/or disposability. In such cases, conventional generators may be limited in their ability to recognize particular instrument configurations being used and to optimize control and diagnostic processes accordingly. The addition of readable data circuits to surgical instruments to address this issue is problematic from a compatibility standpoint, however. For example, designing a surgical instrument to remain backwardly compatible with generators that lack the requisite data reading functionality may be impractical due to, for example, differing signal schemes, design complexity, and cost. Forms of instruments discussed herein address these concerns by using data circuits that may be implemented in existing surgical instruments economically and with minimal design changes to preserve compatibility of the surgical instruments with current generator platforms.


Additionally, forms of the generator 800 may enable communication with instrument-based data circuits. For example, the generator 800 may be configured to communicate with a second data circuit contained in an instrument (e.g., the multifunction surgical instrument). In some forms, the second data circuit may be implemented in a manner similar to that of the first data circuit described herein. The instrument interface circuit 840 may comprise a second data circuit interface 848 to enable this communication. In one form, the second data circuit interface 848 may comprise a tri-state digital interface, although other interfaces may also be used. In certain forms, the second data circuit may generally be any circuit for transmitting and/or receiving data. In one form, for example, the second data circuit may store information pertaining to the particular surgical instrument with which it is associated. Such information may include, for example, a model number, a serial number, a number of operations in which the surgical instrument has been used, and/or any other type of information.


In some forms, the second data circuit may store information about the electrical and/or ultrasonic properties of an associated ultrasonic transducer, end effector, or ultrasonic drive system. For example, the second data circuit may indicate a burn-in frequency slope, as described herein. Additionally or alternatively, any type of information may be communicated to the second data circuit for storage therein via the second data circuit interface 848 (e.g., using the logic circuit 842). Such information may comprise, for example, an updated number of operations in which the instrument has been used and/or dates and/or times of its usage. In certain forms, the second data circuit may transmit data acquired by one or more sensors (e.g., an instrument-based temperature sensor). In certain forms, the second data circuit may receive data from the generator 800 and provide an indication to a user (e.g., a light emitting diode indication or other visible indication) based on the received data.


In certain forms, the second data circuit and the second data circuit interface 848 may be configured such that communication between the logic circuit 842 and the second data circuit can be effected without the need to provide additional conductors for this purpose (e.g., dedicated conductors of a cable connecting a handpiece to the generator 800). In one form, for example, information may be communicated to and from the second data circuit using a one-wire bus communication scheme implemented on existing cabling, such as one of the conductors used to transmit interrogation signals from the signal conditioning circuit 844 to a control circuit in a handpiece. In this way, design changes or modifications to the surgical instrument that might otherwise be necessary are minimized or reduced. Moreover, because different types of communications implemented over a common physical channel can be frequency-band separated, the presence of a second data circuit may be "invisible" to generators that do not have the requisite data reading functionality, thus enabling backward compatibility of the surgical instrument.


In certain forms, the isolated stage 802 may comprise at least one blocking capacitor 850-1 connected to the drive signal output 810b to prevent passage of DC current to a patient. A single blocking capacitor may be required to comply with medical regulations or standards, for example. While failure in single-capacitor designs is relatively uncommon, such failure may nonetheless have negative consequences. In one form, a second blocking capacitor 850-2 may be provided in series with the blocking capacitor 850-1, with current leakage from a point between the blocking capacitors 850-1, 850-2 being monitored by, for example, an ADC circuit 852 for sampling a voltage induced by leakage current. The samples may be received by the logic circuit 842, for example. Based on changes in the leakage current (as indicated by the voltage samples), the generator 800 may determine when at least one of the blocking capacitors 850-1, 850-2 has failed, thus providing a benefit over single-capacitor designs having a single point of failure.
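
A minimal sketch of the leakage-monitoring idea, assuming simple threshold detection on the sampled midpoint voltage; the baseline and threshold values are illustrative, and a real design would derive them from the applicable patient leakage-current limits.

```python
def check_blocking_capacitors(voltage_samples, baseline_v=0.02,
                              fault_threshold_v=0.25):
    """Flag a probable blocking-capacitor failure from leakage voltage.

    voltage_samples: ADC readings of the voltage induced by leakage
    current at the midpoint between the series capacitors. Threshold
    values here are illustrative assumptions only.
    """
    mean_v = sum(voltage_samples) / len(voltage_samples)
    return (mean_v - baseline_v) > fault_threshold_v

samples = [0.31, 0.29, 0.33, 0.30]          # hypothetical readings
if check_blocking_capacitors(samples):
    print("Blocking capacitor fault detected: disable drive output")
```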


In certain forms, the non-isolated stage 804 may comprise a power supply 854 for delivering DC power at a suitable voltage and current. The power supply may comprise, for example, a 400 W power supply for delivering a 48 VDC system voltage. The power supply 854 may further comprise one or more DC/DC voltage converters 856 for receiving the output of the power supply to generate DC outputs at the voltages and currents required by the various components of the generator 800. As discussed above in connection with the controller 838, one or more of the DC/DC voltage converters 856 may receive an input from the controller 838 when activation of the “on/off” input device by a user is detected by the controller 838 to enable operation of, or wake, the DC/DC voltage converters 856.



FIG. 21 illustrates an example of a generator 900, which is one form of the generator 800 (FIG. 20). The generator 900 is configured to deliver multiple energy modalities to a surgical instrument. The generator 900 provides RF and ultrasonic signals for delivering energy to a surgical instrument; the signals may be provided independently, in combination, or simultaneously. As noted above, at least one generator output can deliver multiple energy modalities (e.g., ultrasonic, bipolar or monopolar RF, irreversible and/or reversible electroporation, and/or microwave energy, among others) through a single port, and these signals can be delivered separately or simultaneously to the end effector to treat tissue.


The generator 900 comprises a processor 902 coupled to a waveform generator 904. The processor 902 and waveform generator 904 are configured to generate a variety of signal waveforms based on information stored in a memory coupled to the processor 902, not shown for clarity of disclosure. The digital information associated with a waveform is provided to the waveform generator 904, which includes one or more DAC circuits to convert the digital input into an analog output. The analog output is fed to an amplifier 906 for signal conditioning and amplification. The conditioned and amplified output of the amplifier 906 is coupled to a power transformer 908. The signals are coupled across the power transformer 908 to the secondary side, which is in the patient isolation side. A first signal of a first energy modality is provided to the surgical instrument between the terminals labeled ENERGY1 and RETURN. A second signal of a second energy modality is coupled across a capacitor 910 and is provided to the surgical instrument between the terminals labeled ENERGY2 and RETURN. It will be appreciated that more than two energy modalities may be output and thus the subscript "n" may be used to designate that up to n ENERGYn terminals may be provided, where n is a positive integer greater than 1. It also will be appreciated that up to "n" return paths RETURNn may be provided without departing from the scope of the present disclosure.


A first voltage sensing circuit 912 is coupled across the terminals labeled ENERGY1 and the RETURN path to measure the output voltage therebetween. A second voltage sensing circuit 924 is coupled across the terminals labeled ENERGY2 and the RETURN path to measure the output voltage therebetween. A current sensing circuit 914 is disposed in series with the RETURN leg of the secondary side of the power transformer 908 as shown to measure the output current for either energy modality. If different return paths are provided for each energy modality, then a separate current sensing circuit should be provided in each return leg. The outputs of the first and second voltage sensing circuits 912, 924 are provided to respective isolation transformers 916, 922 and the output of the current sensing circuit 914 is provided to another isolation transformer 918. The outputs of the isolation transformers 916, 918, 922 on the primary side of the power transformer 908 (non-patient isolated side) are provided to one or more ADC circuits 926. The digitized output of the ADC circuit 926 is provided to the processor 902 for further processing and computation. The output voltages and output current feedback information can be employed to adjust the output voltage and current provided to the surgical instrument and to compute output impedance, among other parameters. Input/output communications between the processor 902 and patient isolated circuits are provided through an interface circuit 920. Sensors also may be in electrical communication with the processor 902 by way of the interface circuit 920.


In one aspect, the impedance may be determined by the processor 902 by dividing the output of either the first voltage sensing circuit 912 coupled across the terminals labeled ENERGY1/RETURN or the second voltage sensing circuit 924 coupled across the terminals labeled ENERGY2/RETURN by the output of the current sensing circuit 914 disposed in series with the RETURN leg of the secondary side of the power transformer 908. The outputs of the first and second voltage sensing circuits 912, 924 are provided to separate isolation transformers 916, 922 and the output of the current sensing circuit 914 is provided to another isolation transformer 918. The digitized voltage and current sensing measurements from the ADC circuit 926 are provided to the processor 902 for computing impedance. As an example, the first energy modality ENERGY1 may be ultrasonic energy and the second energy modality ENERGY2 may be RF energy. Nevertheless, in addition to ultrasonic and bipolar or monopolar RF energy modalities, other energy modalities include irreversible and/or reversible electroporation and/or microwave energy, among others. Also, although the example illustrated in FIG. 21 shows that a single return path RETURN may be provided for two or more energy modalities, in other aspects, multiple return paths RETURNn may be provided for each energy modality ENERGYn. Thus, as described herein, the ultrasonic transducer impedance may be measured by dividing the output of the first voltage sensing circuit 912 by the output of the current sensing circuit 914, and the tissue impedance may be measured by dividing the output of the second voltage sensing circuit 924 by the output of the current sensing circuit 914.
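
The division described above can be sketched as follows; this computes only the impedance magnitude from RMS values of hypothetical digitized samples, whereas phase extraction would additionally require the relative timing of the two waveforms.

```python
import math

def rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def impedance_magnitude(voltage_samples, current_samples):
    """|Z| from digitized voltage and current sensing measurements.

    Mirrors the division described above: the output of a voltage
    sensing circuit divided by the output of the current sensing
    circuit.
    """
    return rms(voltage_samples) / rms(current_samples)

# Hypothetical digitized samples for the ENERGY1/RETURN (ultrasonic) path.
v = [85.0 * math.sin(2 * math.pi * i / 64) for i in range(64)]
cur = [0.30 * math.sin(2 * math.pi * i / 64) for i in range(64)]
z_transducer = impedance_magnitude(v, cur)   # about 283 ohms
```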


As shown in FIG. 21, the generator 900 comprising at least one output port can include a power transformer 908 with a single output and with multiple taps to provide power in the form of one or more energy modalities, such as ultrasonic, bipolar or monopolar RF, irreversible and/or reversible electroporation, and/or microwave energy, among others, for example, to the end effector depending on the type of treatment of tissue being performed. For example, the generator 900 can deliver energy with higher voltage and lower current to drive an ultrasonic transducer, with lower voltage and higher current to drive RF electrodes for sealing tissue, or with a coagulation waveform for spot coagulation using either monopolar or bipolar RF electrosurgical electrodes. The output waveform from the generator 900 can be steered, switched, or filtered to provide the frequency to the end effector of the surgical instrument. The connection of an ultrasonic transducer to the generator 900 output would preferably be located between the output labeled ENERGY1 and RETURN as shown in FIG. 21. In one example, a connection of RF bipolar electrodes to the generator 900 output would preferably be located between the output labeled ENERGY2 and RETURN. In the case of monopolar output, the preferred connections would be active electrode (e.g., pencil or other probe) to the ENERGY2 output and a suitable return pad connected to the RETURN output.


Additional details are disclosed in U.S. Patent Application Publication No. 2017/0086914, titled TECHNIQUES FOR OPERATING GENERATOR FOR DIGITALLY GENERATING ELECTRICAL SIGNAL WAVEFORMS AND SURGICAL INSTRUMENTS, which published on Mar. 30, 2017, which is herein incorporated by reference in its entirety.


As used throughout this description, the term "wireless" and its derivatives may be used to describe circuits, devices, systems, methods, techniques, communications channels, etc., that may communicate data through the use of modulated electromagnetic radiation through a non-solid medium. The term does not imply that the associated devices do not contain any wires, although in some aspects they might not. The communication module may implement any of a number of wireless or wired communication standards or protocols, including but not limited to Wi-Fi (IEEE 802.11 family), WiMAX (IEEE 802.16 family), IEEE 802.20, long term evolution (LTE), Ev-DO, HSPA+, HSDPA+, HSUPA+, EDGE, GSM, GPRS, CDMA, TDMA, DECT, Bluetooth, Ethernet, derivatives thereof, as well as any other wireless and wired protocols that are designated as 3G, 4G, 5G, and beyond. The computing module may include a plurality of communication modules. For instance, a first communication module may be dedicated to shorter range wireless communications such as Wi-Fi and Bluetooth and a second communication module may be dedicated to longer range wireless communications such as GPS, EDGE, GPRS, CDMA, WiMAX, LTE, Ev-DO, and others.


As used herein a processor or processing unit is an electronic circuit which performs operations on some external data source, usually memory or some other data stream. The term is used herein to refer to the central processor (central processing unit) in a system or computer systems (especially systems on a chip (SoCs)) that combine a number of specialized “processors.”


As used herein, a system on a chip or system on chip (SoC or SOC) is an integrated circuit (also known as an "IC" or "chip") that integrates all components of a computer or other electronic system. It may contain digital, analog, mixed-signal, and often radio-frequency functions, all on a single substrate. A SoC integrates a microcontroller (or microprocessor) with advanced peripherals like a graphics processing unit (GPU), a Wi-Fi module, or a coprocessor. A SoC may or may not contain built-in memory.


As used herein, a microcontroller or controller is a system that integrates a microprocessor with peripheral circuits and memory. A microcontroller (or MCU for microcontroller unit) may be implemented as a small computer on a single integrated circuit. It may be similar to a SoC; an SoC may include a microcontroller as one of its components. A microcontroller may contain one or more central processing units (CPUs) along with memory and programmable input/output peripherals. Program memory in the form of ferroelectric RAM, NOR flash, or OTP ROM is also often included on chip, as well as a small amount of RAM. Microcontrollers may be employed for embedded applications, in contrast to the microprocessors used in personal computers or other general purpose applications consisting of various discrete chips.


As used herein, the term controller or microcontroller may be a stand-alone IC or chip device that interfaces with a peripheral device. This may be a link between two parts of a computer or a controller on an external device that manages the operation of (and connection with) that device.


Any of the processors or microcontrollers described herein may be implemented by any single core or multicore processor such as those known under the trade name ARM Cortex by Texas Instruments. In one aspect, the processor may be an LM4F230H5QR ARM Cortex-M4F Processor Core, available from Texas Instruments, for example, comprising an on-chip memory of 256 KB single-cycle flash memory, or other non-volatile memory, up to 40 MHz, a prefetch buffer to improve performance above 40 MHz, a 32 KB single-cycle serial random access memory (SRAM), internal read-only memory (ROM) loaded with StellarisWare® software, 2 KB electrically erasable programmable read-only memory (EEPROM), one or more pulse width modulation (PWM) modules, one or more quadrature encoder inputs (QEI), and one or more 12-bit analog-to-digital converters (ADCs) with 12 analog input channels, details of which are available in the product datasheet.


In one aspect, the processor may comprise a safety controller comprising two controller-based families such as TMS570 and RM4x known under the trade name Hercules ARM Cortex R4, also by Texas Instruments. The safety controller may be configured specifically for IEC 61508 and ISO 26262 safety critical applications, among others, to provide advanced integrated safety features while delivering scalable performance, connectivity, and memory options.


Modular devices include the modules (as described in connection with FIGS. 3 and 9, for example) that are receivable within a surgical hub and the surgical devices or instruments that can be connected to the various modules in order to connect or pair with the corresponding surgical hub. The modular devices include, for example, intelligent surgical instruments, medical imaging devices, suction/irrigation devices, smoke evacuators, energy generators, ventilators, insufflators, and displays. The modular devices described herein can be controlled by control algorithms. The control algorithms can be executed on the modular device itself, on the surgical hub to which the particular modular device is paired, or on both the modular device and the surgical hub (e.g., via a distributed computing architecture). In some exemplifications, the modular devices' control algorithms control the devices based on data sensed by the modular device itself (i.e., by sensors in, on, or connected to the modular device). This data can be related to the patient being operated on (e.g., tissue properties or insufflation pressure) or the modular device itself (e.g., the rate at which a knife is being advanced, motor current, or energy levels). For example, a control algorithm for a surgical stapling and cutting instrument can control the rate at which the instrument's motor drives its knife through tissue according to resistance encountered by the knife as it advances.
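
As a sketch of such a control algorithm, the function below maps sensed motor current (a proxy for tissue resistance) to a commanded knife advance rate; the curve shape, limits, and operating points are illustrative assumptions, not a disclosed control law.

```python
def knife_velocity_mm_s(motor_current_a, v_max=3.0, v_min=0.5,
                        i_low=0.5, i_high=2.0):
    """Map sensed motor current to a commanded knife advance rate.

    High current (tough or thick tissue) slows the knife; low
    resistance allows the maximum rate. All values are hypothetical.
    """
    if motor_current_a <= i_low:
        return v_max
    if motor_current_a >= i_high:
        return v_min
    # Linear interpolation between the two operating points.
    frac = (motor_current_a - i_low) / (i_high - i_low)
    return v_max - frac * (v_max - v_min)

print(knife_velocity_mm_s(0.4))   # 3.0 mm/s in easy tissue
print(knife_velocity_mm_s(1.6))   # about 1.17 mm/s in denser tissue
```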


Visualization System

During a surgical procedure, a surgeon may be required to manipulate tissues to effect a desired medical outcome. The actions of the surgeon are limited by what is visually observable in the surgical site. Thus, the surgeon may not be aware, for example, of the disposition of vascular structures that underlie the tissues being manipulated during the procedure. Since the surgeon is unable to visualize the vasculature beneath a surgical site, the surgeon may accidentally sever one or more critical blood vessels during the procedure. The solution is a surgical visualization system that can acquire imaging data of the surgical site for presentation to a surgeon, in which the presentation can include information related to the presence and depth of vascular structures located beneath the surface of a surgical site.


In one aspect, the surgical hub 106 incorporates a visualization system 108 to acquire imaging data during a surgical procedure. The visualization system 108 may include one or more illumination sources and one or more light sensors. The one or more illumination sources and one or more light sensors may be incorporated together into a single device or may comprise one or more separate devices. The one or more illumination sources may be directed to illuminate portions of the surgical field. The one or more light sensors may receive light reflected or refracted from the surgical field, including light reflected or refracted from tissue and/or surgical instruments. The following description encompasses the hardware and software processing techniques disclosed above and in the applications incorporated herein by reference.


In some aspects, the visualization system 108 may be integrated into a surgical system 100 as disclosed above and depicted in FIGS. 1 and 2. In addition to the visualization system 108, the surgical system 100 may include one or more hand-held intelligent instruments 112, a multi-functional robotic system 110, one or more visualization systems 108, and a centralized surgical hub system 106, among other components. The centralized surgical hub system 106 may control several functions as disclosed above and also depicted in FIG. 3. In one non-limiting example, such functions may include supplying and controlling power to any number of powered surgical devices. In another non-limiting example, such functions may include controlling fluid supplied to and evacuated from the surgical site. The centralized surgical hub system 106 may also be configured to manage and analyze data received from any of the surgical system components as well as communicate data and other information among and between the components of the surgical system. The centralized surgical hub system 106 may also be in data communication with a cloud computing system 104 as disclosed above and depicted, for example, in FIG. 1.


In some non-limiting examples, imaging data generated by the visualization system 108 may be analyzed by on-board computational components of the visualization system 108, and analysis results may be communicated to the centralized surgical hub 106. In alternative non-limiting examples, the imaging data generated by the visualization system 108 may be communicated directly to the centralized surgical hub 106 where the data may be analyzed by computational components in the hub system 106. The centralized surgical hub 106 may communicate the image analysis results to any one or more of the other components of the surgical system. In some other non-limiting examples, the centralized surgical hub may communicate the image data and/or the image analysis results to the cloud computing system 104.



FIGS. 22A-D and FIGS. 23A-F depict various aspects of one example of a visualization system 2108 that may be incorporated into a surgical system. The visualization system 2108 may include an imaging control unit 2002 and a hand unit 2020. The imaging control unit 2002 may include one or more illumination sources, a power supply for the one or more illumination sources, one or more types of data communication interfaces (including USB, Ethernet, or wireless interfaces 2004), and one or more video outputs 2006. The imaging control unit 2002 may further include an interface, such as a USB interface 2010, configured to transmit integrated video and image capture data to a USB enabled device. The imaging control unit 2002 may also include one or more computational components including, without limitation, a processor unit, a transitory memory unit, a non-transitory memory unit, an image processing unit, a bus structure to form data links among the computational components, and any interface (e.g., input and/or output) devices necessary to receive information from and transmit information to components not included in the imaging control unit. The non-transitory memory may further contain instructions that, when executed by the processor unit, may perform any number of manipulations of data that may be received from the hand unit 2020 and/or computational devices not included in the imaging control unit.


The illumination sources may include a white light source 2012 and one or more laser light sources. The imaging control unit 2002 may include one or more optical and/or electrical interfaces for optical and/or electrical communication with the hand unit 2020. The one or more laser light sources may include, as non-limiting examples, any one or more of a red laser light source, a green laser light source, a blue laser light source, an infrared laser light source, and an ultraviolet laser light source. In some non-limiting examples, the red laser light source may source illumination having a peak wavelength that may range between 635 nm and 660 nm, inclusive. Non-limiting examples of a red laser peak wavelength may include about 635 nm, about 640 nm, about 645 nm, about 650 nm, about 655 nm, about 660 nm, or any value or range of values therebetween. In some non-limiting examples, the green laser light source may source illumination having a peak wavelength that may range between 520 nm and 532 nm, inclusive. Non-limiting examples of a green laser peak wavelength may include about 520 nm, about 522 nm, about 524 nm, about 526 nm, about 528 nm, about 530 nm, about 532 nm, or any value or range of values therebetween. In some non-limiting examples, the blue laser light source may source illumination having a peak wavelength that may range between 405 nm and 445 nm, inclusive. Non-limiting examples of a blue laser peak wavelength may include about 405 nm, about 410 nm, about 415 nm, about 420 nm, about 425 nm, about 430 nm, about 435 nm, about 440 nm, about 445 nm, or any value or range of values therebetween. In some non-limiting examples, the infrared laser light source may source illumination having a peak wavelength that may range between 750 nm and 3000 nm, inclusive. Non-limiting examples of an infrared laser peak wavelength may include about 750 nm, about 1000 nm, about 1250 nm, about 1500 nm, about 1750 nm, about 2000 nm, about 2250 nm, about 2500 nm, about 2750 nm, 3000 nm, or any value or range of values therebetween. In some non-limiting examples, the ultraviolet laser light source may source illumination having a peak wavelength that may range between 200 nm and 360 nm, inclusive. Non-limiting examples of an ultraviolet laser peak wavelength may include about 200 nm, about 220 nm, about 240 nm, about 260 nm, about 280 nm, about 300 nm, about 320 nm, about 340 nm, about 360 nm, or any value or range of values therebetween.


In one non-limiting aspect, the hand unit 2020 may include a body 2021, a camera scope cable 2015 attached to the body 2021, and an elongated camera probe 2024. The body 2021 of the hand unit 2020 may include hand unit control buttons 2022 or other controls to permit a health professional using the hand unit 2020 to control the operations of the hand unit 2020 or other components of the imaging control unit 2002, including, for example, the light sources. The camera scope cable 2015 may include one or more electrical conductors and one or more optical fibers. The camera scope cable 2015 may terminate with a camera head connector 2008 at a proximal end in which the camera head connector 2008 is configured to mate with the one or more optical and/or electrical interfaces of the imaging control unit 2002. The electrical conductors may supply power to the hand unit 2020, including the body 2021 and the elongated camera probe 2024, and/or to any electrical components internal to the hand unit 2020 including the body 2021 and/or elongated camera probe 2024. The electrical conductors may also serve to provide bi-directional data communication between any one or more components of the hand unit 2020 and the imaging control unit 2002. The one or more optical fibers may conduct illumination from the one or more illumination sources in the imaging control unit 2002 through the hand unit body 2021 and to a distal end of the elongated camera probe 2024. In some non-limiting aspects, the one or more optical fibers may also conduct light reflected or refracted from the surgical site to one or more optical sensors disposed in the elongated camera probe 2024, the hand unit body 2021, and/or the imaging control unit 2002.



FIG. 22B (a top plan view) depicts in more detail some aspects of a hand unit 2020 of the visualization system 2108. The hand unit body 2021 may be constructed of a plastic material. The hand unit control buttons 2022 or other controls may have a rubber overmolding to protect the controls while permitting them to be manipulated by the surgeon. The camera scope cable 2015 may have optical fibers integrated with electrical conductors, and the camera scope cable 2015 may have a protective and flexible overcoating such as PVC. In some non-limiting examples, the camera scope cable 2015 may be about 10 ft. long to permit ease of use during a surgical procedure. The length of the camera scope cable 2015 may range from about 5 ft. to about 15 ft. Non-limiting examples of a length of the camera scope cable 2015 may be about 5 ft., about 6 ft., about 7 ft., about 8 ft., about 9 ft., about 10 ft., about 11 ft., about 12 ft., about 13 ft., about 14 ft., about 15 ft., or any length or range of lengths therebetween. The elongated camera probe 2024 may be fabricated from a rigid material such as stainless steel. In some aspects, the elongated camera probe 2024 may be joined with the hand unit body 2021 via a rotatable collar 2026. The rotatable collar 2026 may permit the elongated camera probe 2024 to be rotated with respect to the hand unit body 2021. In some aspects, the elongated camera probe 2024 may terminate at a distal end with a plastic window 2028 sealed with epoxy.


The side plan view of the hand unit, depicted in FIG. 22C, illustrates that a light or image sensor 2030 may be disposed at a distal end 2032a of the elongated camera probe or within the hand unit body 2032b. In some alternative aspects, the light or image sensor 2030 may be disposed with additional optical elements in the imaging control unit 2002. FIG. 22C further depicts an example of a light sensor 2030 comprising a CMOS image sensor 2034 disposed within a mount 2036 having a radius of about 4 mm. FIG. 22D illustrates aspects of the CMOS image sensor 2034, depicting the active area 2038 of the image sensor. Although the CMOS image sensor in FIG. 22C is depicted to be disposed within a mount 2036 having a radius of about 4 mm, it may be recognized that such a sensor and mount combination may be of any useful size to be disposed within the elongated camera probe 2024, the hand unit body 2021, or in the image control unit 2002. Some non-limiting examples of such alternative mounts may include a 5.5 mm mount 2136a, a 4 mm mount 2136b, a 2.7 mm mount 2136c, and a 2 mm mount 2136d. It may be recognized that the image sensor may also comprise a CCD image sensor. The CMOS or CCD sensor may comprise an array of individual light sensing elements (pixels).



FIGS. 23A-23F depict various aspects of some examples of illumination sources and their control that may be incorporated into the visualization system 2108.



FIG. 23A illustrates an aspect of a laser illumination system having a plurality of laser bundles emitting a plurality of wavelengths of electromagnetic energy. As can be seen in the figure, the illumination system 2700 may comprise a red laser bundle 2720, a green laser bundle 2730, and a blue laser bundle 2740 that are all optically coupled together through fiber optics 2755. As can be seen in the figure, each of the laser bundles may have a corresponding light sensing element or electromagnetic sensor 2725, 2735, 2745, respectively, for sensing the output of the specific laser bundle or wavelength.


Additional disclosures regarding the laser illumination system depicted in FIG. 23A for use in a surgical visualization system 2108 may be found in U.S. Patent Application Publication No. 2014/0268860, titled CONTROLLING THE INTEGRAL LIGHT ENERGY OF A LASER PULSE, filed on Mar. 15, 2014, which issued on Oct. 3, 2017 as U.S. Pat. No. 9,777,913, the contents thereof being incorporated by reference herein in its entirety and for all purposes.



FIG. 23B illustrates the operational cycles of a sensor used in rolling readout mode. It will be appreciated that the x direction corresponds to time and the diagonal lines 2202 indicate the activity of an internal pointer that reads out each frame of data, one line at a time. The same pointer is responsible for resetting each row of pixels for the next exposure period. The net integration time for each row 2219a-c is equivalent, but they are staggered in time with respect to one another due to the rolling reset and read process. Therefore, for any scenario in which adjacent frames are required to represent different constitutions of light, the only option for having each row be consistent is to pulse the light between the readout cycles 2230a-c. More specifically, the maximum available period corresponds to the sum of the blanking time plus any time during which optical black or optically blind (OB) rows (2218, 2220) are serviced at the start or end of the frame.



FIG. 23B illustrates the operational cycles of a sensor used in rolling readout mode or during the sensor readout 2200. The frame readout may start at, and may be represented by, the vertical line 2210. The readout period is represented by the diagonal or slanted line 2202. The sensor may be read out on a row by row basis, the top of the downwards slanted edge being the sensor top row 2212 and the bottom of the downwards slanted edge being the sensor bottom row 2214. The time between the last row readout and the next readout cycle may be called the blanking time 2216a-d. It may be understood that the blanking time 2216a-d may be the same between successive readout cycles or may differ between successive readout cycles. It should be noted that some of the sensor pixel rows might be covered with a light shield (e.g., a metal coating or any other substantially black layer of another material type). These covered pixel rows may be referred to as optical black rows 2218 and 2220. Optical black rows 2218 and 2220 may be used as input for correction algorithms.


As shown in FIG. 23B, these optical black rows 2218 and 2220 may be located on the top of the pixel array or at the bottom of the pixel array or at the top and the bottom of the pixel array. In some aspects, it may be desirable to control the amount of electromagnetic radiation, e.g., light, that is exposed to a pixel, thereby integrated or accumulated by the pixel. It will be appreciated that photons are elementary particles of electromagnetic radiation. Photons are integrated, absorbed, or accumulated by each pixel and converted into an electrical charge or current. In some aspects, an electronic shutter or rolling shutter may be used to start the integration time (2219a-c) by resetting the pixel. The light will then integrate until the next readout phase. In some aspects, the position of the electronic shutter can be moved between two readout cycles 2202 in order to control the pixel saturation for a given amount of light. In some alternative aspects lacking an electronic shutter, the integration time 2219a-c of the incoming light may start during a first readout cycle 2202 and may end at the next readout cycle 2202, which also defines the start of the next integration. In some alternative aspects, the amount of light accumulated by each pixel may be controlled by a time during which light is pulsed 2230a-d during the blanking times 2216a-d. This ensures that all rows see the same light issued from the same light pulse 2230a-c. In other words, each row will start its integration in a first dark environment 2231, which may be at the optical black back row 2220 of read out frame (m) for a maximum light pulse width, and will then receive a light strobe and will end its integration in a second dark environment 2232, which may be at the optical black front row 2218 of the next succeeding read out frame (m+1) for a maximum light pulse width. Thus, the image generated from the light pulse 2230a-c will be solely available during frame (m+1) readout without any interference with frames (m) and (m+2).


It should be noted that the condition for a light pulse 2230a-c to be read out in only one frame, without interfering with neighboring frames, is that the given light pulse 2230a-c fire during the blanking time 2216. Because the optical black rows 2218, 2220 are insensitive to light, the optical black back rows 2220 time of frame (m) and the optical black front rows 2218 time of frame (m+1) can be added to the blanking time 2216 to determine the maximum range of the firing time of the light pulse 2230.
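
That maximum firing window can be expressed as a simple sum, sketched below with hypothetical row counts and timings.

```python
def max_pulse_window_us(blanking_us, ob_back_rows, ob_front_rows,
                        row_time_us):
    """Maximum firing window for a light pulse confined to one frame.

    As described above, the window is the blanking time plus the
    readout time of the optical black back rows of frame (m) and the
    optical black front rows of frame (m+1), since those rows are
    insensitive to light. Row counts and times here are illustrative.
    """
    return blanking_us + (ob_back_rows + ob_front_rows) * row_time_us

# Hypothetical sensor: 500 us blanking, 8 + 8 OB rows at 15 us per row.
window = max_pulse_window_us(500.0, 8, 8, 15.0)   # 740 us
```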


In some aspects, FIG. 23B depicts an example of a timing diagram for sequential frame captures by a conventional CMOS sensor. Such a CMOS sensor may incorporate a Bayer pattern of color filters, as depicted in FIG. 23C. It is recognized that the Bayer pattern provides for greater luminance detail than chrominance. It may further be recognized that the sensor has a reduced spatial resolution since a total of 4 adjacent pixels are required to produce the color information for the aggregate spatial portion of the image. In an alternative approach, the color image may be constructed by rapidly strobing the visualized area at high speed with a variety of optical sources (either laser or light-emitting diodes) having different central optical wavelengths.


The optical strobing system may be under the control of the camera system, and may include a specially designed CMOS sensor with high speed readout. The principal benefit is that the sensor can accomplish the same spatial resolution with significantly fewer pixels compared with conventional Bayer or 3-sensor cameras. Therefore, the physical space occupied by the pixel array may be reduced. The actual pulse periods (2230a-c) may differ within the repeating pattern, as illustrated in FIG. 23B. This is useful for, e.g., apportioning greater time to the components that require the greater light energy or those having the weaker sources. As long as the average captured frame rate is an integer multiple of the requisite final system frame rate, the data may simply be buffered in the signal processing chain as appropriate.


The facility to reduce the CMOS sensor chip-area to the extent allowed by combining all of these methods is particularly attractive for small diameter (˜3-10 mm) endoscopy. In particular, it allows for endoscope designs in which the sensor is located in the space-constrained distal end, thereby greatly reducing the complexity and cost of the optical section, while providing high definition video. A consequence of this approach is that reconstructing each final, full color image requires data to be fused from three separate snapshots in time. Any motion within the scene, relative to the optical frame of reference of the endoscope, will generally degrade the perceived resolution, since the edges of objects appear at slightly different locations within each captured component. In this disclosure, a means of diminishing this issue is described which exploits the fact that spatial resolution is much more important for luminance information than for chrominance.


The basis of the approach is that, instead of firing monochromatic light during each frame, combinations of the three wavelengths are used to provide all of the luminance information within a single image. The chrominance information is derived from separate frames with, e.g., a repeating pattern such as Y-Cb-Y-Cr (FIG. 23D). While it is possible to provide pure luminance data by a shrewd choice of pulse ratios, the same is not true of chrominance.


In one aspect, as illustrated in FIG. 23D, an endoscopic system 2300a may comprise a pixel array 2302a having uniform pixels and the system 2300a may be operated to receive Y (luminance pulse) 2304a, Cb (ChromaBlue) 2306a and Cr (ChromaRed) 2308a pulses.


To complete a full color image requires that the two components of chrominance also be provided. However, the same algorithm that was applied for luminance cannot be directly applied for chrominance images since chrominance is signed, as reflected in the fact that some of the RGB coefficients are negative. The solution to this is to add a degree of luminance of sufficient magnitude that all of the final pulse energies become positive. As long as the color fusion process in the ISP is aware of the composition of the chrominance frames, they can be decoded by subtracting the appropriate amount of luminance from a neighboring frame. The pulse energy proportions are given by:

Y=0.183·R+0.614·G+0.062·B
Cb=λ·Y−0.101·R−0.339·G+0.439·B
Cr=δ·Y+0.439·R−0.399·G−0.040·B
where
λ≥0.339/0.614=0.552
δ≥0.399/0.614=0.650


It turns out that if the λ factor is equal to 0.552, both the red and the green components are exactly cancelled, in which case the Cb information can be provided with pure blue light. Similarly, setting δ=0.650 cancels out the blue and green components for Cr, which becomes pure red. This particular example is illustrated in FIG. 23E, which also depicts λ and δ as integer multiples of 1/2^8 (i.e., 1/256). This is a convenient approximation for the digital frame reconstruction.
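
The cancellation can be verified numerically from the pulse-energy equations above; the short sketch below substitutes λ = 0.339/0.614 and δ = 0.399/0.614 into the Cb and Cr expressions.

```python
# Luminance coefficients from the pulse-energy equations above.
Y_R, Y_G, Y_B = 0.183, 0.614, 0.062

def chroma_pulse(mix, r, g, b):
    """Net (R, G, B) pulse energies for a chrominance frame that mixes
    in `mix` units of luminance, per the equations above."""
    return (mix * Y_R + r, mix * Y_G + g, mix * Y_B + b)

lam = 0.339 / 0.614            # about 0.552: cancels R and G for Cb
delta = 0.399 / 0.614          # about 0.650: cancels G and B for Cr

cb = chroma_pulse(lam, -0.101, -0.339, 0.439)
cr = chroma_pulse(delta, 0.439, -0.399, -0.040)
print(cb)   # (~0.0, 0.0, 0.473): red and green cancel, leaving pure blue
print(cr)   # (0.558, 0.0, ~0.0): green and blue cancel, leaving pure red
```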


In the case of the Y-Cb-Y-Cr pulsing scheme, the image data is already in the YCbCr space following the color fusion. Therefore, in this case it makes sense to perform luminance and chrominance based operations up front, before converting back to linear RGB to perform the color correction etc.


The color fusion process is more straightforward than the de-mosaic process necessitated by the Bayer pattern (see FIG. 23C), since there is no spatial interpolation. It does, however, require buffering of frames in order to have all of the necessary information available for each pixel. In one general aspect, data for the Y-Cb-Y-Cr pattern may be pipelined to yield one full color image per two raw captured images. This is accomplished by using each chrominance sample twice. In FIG. 23F the specific example of a 120 Hz frame capture rate providing 60 Hz final video is depicted.
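
A minimal sketch of that pipelining, using each chrominance sample twice so that, once the pipeline has filled, one fused (Y, Cb, Cr) image is produced per two raw captures; the frame labels and data below are placeholders.

```python
def fuse_ycbcr_stream(frames):
    """Pipeline a Y-Cb-Y-Cr capture stream into full-color images.

    frames: list of (label, data) tuples in capture order, e.g.
    [("Y", y0), ("Cb", cb0), ("Y", y1), ("Cr", cr0), ...].
    Each chrominance sample is reused until it is refreshed, so after
    the pipeline fills, each Y frame yields one fused image.
    """
    last = {"Cb": None, "Cr": None}
    fused = []
    for label, data in frames:
        if label == "Y":
            if last["Cb"] is not None and last["Cr"] is not None:
                fused.append((data, last["Cb"], last["Cr"]))
        else:
            last[label] = data
    return fused

stream = [("Y", "y0"), ("Cb", "cb0"), ("Y", "y1"), ("Cr", "cr0"),
          ("Y", "y2"), ("Cb", "cb1"), ("Y", "y3"), ("Cr", "cr1")]
images = fuse_ycbcr_stream(stream)
# Two fused images here; steady state yields one per two raw captures.
```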


Additional disclosures regarding the control of the laser components of an illumination system as depicted in FIGS. 23B-23F for use in a surgical visualization system 108 may be found in U.S. Patent Application Publication No. 2014/0160318, titled YCBCR PULSED ILLUMINATION SCHEME IN A LIGHT DEFICIENT ENVIRONMENT, filed on Jul. 26, 2013, which issued on Dec. 6, 2016 as U.S. Pat. No. 9,516,239, and U.S. Patent Application Publication No. 2014/0160319, titled CONTINUOUS VIDEO IN A LIGHT DEFICIENT ENVIRONMENT, filed on Jul. 26, 2013, which issued on Aug. 22, 2017 as U.S. Pat. No. 9,743,016, the contents thereof being incorporated by reference herein in their entirety and for all purposes.


Subsurface Vascular Imaging

During a surgical procedure, a surgeon may be required to manipulate tissues to effect a desired medical outcome. The actions of the surgeon are limited by what is visually observable in the surgical site. Thus, the surgeon may not be aware, for example, of the disposition of vascular structures that underlie the tissues being manipulated during the procedure.


Since the surgeon is unable to visualize the vasculature beneath a surgical site, the surgeon may accidentally sever one or more critical blood vessels during the procedure.


Therefore, it is desirable to have a surgical visualization system that can acquire imaging data of the surgical site for presentation to a surgeon in which the presentation can include information related to the presence of vascular structures located beneath the surface of a surgical site.


Some aspects of the present disclosure further provide for a control circuit configured to control the illumination of a surgical site using one or more illumination sources such as laser light sources and to receive imaging data from one or more image sensors. In some aspects, the present disclosure provides for a non-transitory computer readable medium storing computer readable instructions that, when executed, cause a device to detect a blood vessel in a tissue and determine its depth below the surface of the tissue.


In some aspects, a surgical image acquisition system may include a plurality of illumination sources wherein each illumination source is configured to emit light having a specified central wavelength, a light sensor configured to receive a portion of the light reflected from a tissue sample when illuminated by the one or more of the plurality of illumination sources, and a computing system. The computing system may be configured to: receive data from the light sensor when the tissue sample is illuminated by each of the plurality of illumination sources; determine a depth location of a structure within the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each of the plurality of illumination sources; and calculate visualization data regarding the structure and the depth location of the structure. In some aspects, the visualization data may have a data format that may be used by a display system, and the structure may comprise one or more vascular tissues.


Vascular Imaging Using NIR Spectroscopy

In one aspect, a surgical image acquisition system may include an independent color cascade of illumination sources comprising visible light and light outside of the visible range to image one or more tissues within a surgical site at different times and at different depths. The surgical image acquisition system may further detect or calculate characteristics of the light reflected and/or refracted from the surgical site. The characteristics of the light may be used to provide a composite image of the tissue within the surgical site as well as provide an analysis of underlying tissue not directly visible at the surface of the surgical site. The surgical image acquisition system may determine tissue depth location without the need for separate measurement devices.


In one aspect, the characteristic of the light reflected and/or refracted from the surgical site may be an amount of absorbance of light at one or more wavelengths. Various chemical components of individual tissues may result in specific patterns of light absorption that are wavelength dependent.


In one aspect, the illumination sources may comprise a red laser source and a near infrared laser source, wherein the one or more tissues to be imaged may include vascular tissue such as veins or arteries. In some aspects, red laser sources (in the visible range) may be used to image some aspects of underlying vascular tissue based on spectroscopy in the visible red range. In some non-limiting examples, a red laser light source may source illumination having a peak wavelength that may range between 635 nm and 660 nm, inclusive. Non-limiting examples of a red laser peak wavelength may include about 635 nm, about 640 nm, about 645 nm, about 650 nm, about 655 nm, about 660 nm, or any value or range of values therebetween. In some other aspects, near infrared laser sources may be used to image underlying vascular tissue based on near infrared spectroscopy. In some non-limiting examples, a near infrared laser source may emit illumination having a wavelength that may range between 750 nm and 3000 nm, inclusive. Non-limiting examples of an infrared laser peak wavelength may include about 750 nm, about 1000 nm, about 1250 nm, about 1500 nm, about 1750 nm, about 2000 nm, about 2250 nm, about 2500 nm, about 2750 nm, 3000 nm, or any value or range of values therebetween. It may be recognized that underlying vascular tissue may be probed using a combination of red and infrared spectroscopy. In some examples, vascular tissue may be probed using a red laser source having a peak wavelength at about 660 nm and a near IR laser source having a peak wavelength at about 750 nm or at about 850 nm.


Near infrared spectroscopy (NIRS) is a non-invasive technique that allows determination of tissue oxygenation based on spectro-photometric quantitation of oxy- and deoxyhemoglobin within a tissue. In some aspects, NIRS can be used to image vascular tissue directly based on the difference in illumination absorbance between the vascular tissue and non-vascular tissue. Alternatively, vascular tissue can be indirectly visualized based on a difference of illumination absorbance of blood flow in the tissue before and after the application of physiological interventions, such as arterial and venous occlusion methods.
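
To make the quantitation step concrete, the following is a minimal sketch of the two-wavelength Beer-Lambert calculation underlying NIRS oximetry. The extinction coefficients, wavelengths, absorbance readings, and path length below are illustrative assumptions, not published values:

```python
import numpy as np

# Hypothetical molar extinction coefficients for oxy- and deoxyhemoglobin
# at two NIR wavelengths; a real system would use published absorption tables.
EXTINCTION = {
    750: {"HbO2": 0.15, "Hb": 0.40},   # deoxy-Hb dominates below ~800 nm
    850: {"HbO2": 0.27, "Hb": 0.18},   # oxy-Hb dominates above ~800 nm
}

def hemoglobin_concentrations(absorbance_750, absorbance_850, path_length_cm):
    """Solve the Beer-Lambert system A(lambda) = L * (eps_HbO2 * C_HbO2 +
    eps_Hb * C_Hb) at two wavelengths for the two unknown concentrations."""
    eps = np.array([
        [EXTINCTION[750]["HbO2"], EXTINCTION[750]["Hb"]],
        [EXTINCTION[850]["HbO2"], EXTINCTION[850]["Hb"]],
    ])
    a = np.array([absorbance_750, absorbance_850]) / path_length_cm
    c_hbo2, c_hb = np.linalg.solve(eps, a)
    return c_hbo2, c_hb

c_hbo2, c_hb = hemoglobin_concentrations(0.30, 0.35, path_length_cm=1.0)
saturation = c_hbo2 / (c_hbo2 + c_hb)   # tissue oxygen saturation estimate
```

Given absorbance measurements at the two wavelengths, the 2x2 linear solve yields the two chromophore concentrations, from which an oxygen saturation estimate follows.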


Instrumentation for near-IR (NIR) spectroscopy may be similar to instruments for the UV-visible and mid-IR ranges. Such spectroscopic instruments may include an illumination source, a detector, and a dispersive element to select a specific near-IR wavelength for illuminating the tissue sample. In some aspects, the source may comprise an incandescent light source or a quartz halogen light source. In some aspects, the detector may comprise a semiconductor photodiode or photodiode array (for example, InGaAs). In some aspects, the dispersive element may comprise a prism or, more commonly, a diffraction grating. Fourier transform NIR instruments using an interferometer are also common, especially for wavelengths greater than about 1000 nm. Depending on the sample, the spectrum can be measured in either reflection or transmission mode.



FIG. 24 depicts schematically one example of instrumentation 2400 similar to instruments for the UV-visible and mid-IR ranges for NIR spectroscopy. A light source 2402 may emit a broad spectral range of illumination 2404 that may impinge upon a dispersive element 2406 (such as a prism or a diffraction grating). The dispersive element 2406 may operate to select a narrow wavelength portion 2408 of the light emitted by the broad spectrum light source 2402, and the selected portion 2408 of the light may illuminate the tissue 2410. The light reflected from the tissue 2412 may be directed to a detector 2416 (for example, by means of a dichroic mirror 2414) and the intensity of the reflected light 2412 may be recorded. The wavelength of the light illuminating the tissue 2410 may be selected by the dispersive element 2406. In some aspects, the tissue 2410 may be illuminated only by a single narrow wavelength portion 2408 selected by the dispersive element 2406 from the light source 2402. In other aspects, the tissue 2410 may be scanned with a variety of narrow wavelength portions 2408 selected by the dispersive element 2406. In this manner, a spectroscopic analysis of the tissue 2410 may be obtained over a range of NIR wavelengths.



FIG. 25 depicts schematically one example of instrumentation 2430 for determining NIRS based on Fourier transform infrared imaging. In FIG. 25, a laser source 2432 emitting light in the near IR range 2434 illuminates a tissue sample 2440. The light reflected 2436 by the tissue 2440 is reflected 2442 by a mirror, such as a dichroic mirror 2444, to a beam splitter 2446. The beam splitter 2446 directs one portion of the light 2448 reflected 2436 by the tissue 2440 to a stationary mirror 2450 and one portion of the light 2452 reflected 2436 by the tissue 2440 to a moving mirror 2454. The moving mirror 2454 may oscillate in position based on an affixed piezoelectric transducer activated by a sinusoidal voltage having a defined frequency. The position of the moving mirror 2454 in space corresponds to the frequency of the sinusoidal activation voltage of the piezoelectric transducer. The light reflected from the moving mirror and the stationary mirror may be recombined 2458 at the beam splitter 2446 and directed to a detector 2456. Computational components may receive the signal output of the detector 2456 and perform a Fourier transform (in time) of the received signal. Because the wavelength of the light received from the moving mirror 2454 varies in time with respect to the wavelength of the light received from the stationary mirror 2450, the time-based Fourier transform of the recombined light corresponds to a wavelength-based Fourier transform of the recombined light 2458. In this manner, a wavelength-based spectrum of the light reflected from the tissue 2440 may be determined and spectral characteristics of the light reflected 2436 from the tissue 2440 may be obtained. Changes in the absorbance of the illumination in spectral components from the light reflected from the tissue 2440 may thus indicate the presence or absence of tissue having specific light absorbing properties (such as hemoglobin).
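
As a rough illustration of the Fourier-transform step described above, the following sketch simulates a detector signal recorded against the optical path difference between the two mirrors and recovers the spectrum with an FFT. The sample count, scan range, and the two spectral lines are illustrative assumptions:

```python
import numpy as np

n_samples = 4096
opd_cm = np.linspace(0.0, 0.05, n_samples)       # optical path difference, cm

# Simulated interferogram containing two spectral lines (wavenumbers, cm^-1),
# e.g. a strong source line and a weaker, absorption-attenuated line.
wavenumbers_cm = np.array([12500.0, 11765.0])    # ~800 nm and ~850 nm
amplitudes = np.array([1.0, 0.4])
interferogram = sum(a * np.cos(2.0 * np.pi * k * opd_cm)
                    for a, k in zip(amplitudes, wavenumbers_cm))

# FFT over the path-difference axis recovers the wavenumber spectrum.
spectrum = np.abs(np.fft.rfft(interferogram))
axis_cm = np.fft.rfftfreq(n_samples, d=opd_cm[1] - opd_cm[0])

dominant = axis_cm[np.argmax(spectrum)]          # strongest component, cm^-1
print(f"dominant line near {dominant:.0f} cm^-1 (~{1e7 / dominant:.0f} nm)")
```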


An alternative to near infrared light for determining hemoglobin oxygenation is the use of monochromatic red light to determine the red light absorbance characteristics of hemoglobin. The absorbance of red light having a central wavelength of about 660 nm by hemoglobin may indicate whether the hemoglobin is oxygenated (arterial blood) or deoxygenated (venous blood).


In some alternative surgical procedures, contrast agents can be used to improve the data that is collected on oxygenation and tissue oxygen consumption. In one non-limiting example, NIRS techniques may be used in conjunction with a bolus injection of a near-IR contrast agent such as indocyanine green (ICG), which has a peak absorbance at about 800 nm. ICG has been used in some medical procedures to measure cerebral blood flow.


Vascular Imaging Using Laser Doppler Flowmetry

In one aspect, the characteristic of the light reflected and/or refracted from the surgical site may be a Doppler shift of the light wavelength from its illumination source.


Laser Doppler flowmetry may be used to visualize and characterize a flow of particles moving relative to an effectively stationary background. Thus, laser light scattered by moving particles, such as blood cells, may have a different wavelength than that of the original illuminating laser source. In contrast, laser light scattered by the effectively stationary background (for example, the vascular tissue) may have the same wavelength as that of the original illuminating laser source. The change in wavelength of the scattered light from the blood cells may reflect both the direction of the flow of the blood cells relative to the laser source as well as the blood cell velocity. FIGS. 26A-C illustrate the change in wavelength of light scattered from blood cells that may be moving away from (FIG. 26A) or towards (FIG. 26C) the laser light source.


In each of FIGS. 26A-C, the original illuminating light 2502 is depicted having a relative central wavelength of 0. It may be observed from FIG. 26A that light scattered from blood cells moving away from the laser source 2504 has a wavelength shifted by some amount 2506 to a greater wavelength relative to that of the laser source (and is thus red shifted). It may also be observed from FIG. 26C that light scattered from blood cells moving towards the laser source 2508 has a wavelength shifted by some amount 2510 to a shorter wavelength relative to that of the laser source (and is thus blue shifted). The amount of wavelength shift (for example 2506 or 2510) may be dependent on the velocity of the motion of the blood cells. In some aspects, an amount of a red shift (2506) of some blood cells may be about the same as the amount of blue shift (2510) of some other blood cells. Alternatively, an amount of a red shift (2506) of some blood cells may differ from the amount of blue shift (2510) of some other blood cells. Thus, the velocity of the blood cells flowing away from the laser source as depicted in FIG. 26A may be less than the velocity of the blood cells flowing towards the laser source as depicted in FIG. 26C based on the relative magnitude of the wavelength shifts (2506 and 2510). In contrast, and as depicted in FIG. 26B, light scattered from tissue not moving relative to the laser light source (for example blood vessels 2512 or non-vascular tissue 2514) may not demonstrate any change in wavelength.
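
The Doppler relationship described above can be sketched numerically. The helper below assumes the standard round-trip form of the laser Doppler shift (a factor of two for back-scatter); the 10 mm/s example velocity and sign conventions are assumptions for illustration:

```python
import math

C = 299_792_458.0   # speed of light, m/s

def doppler_frequency_shift(velocity_m_s, wavelength_m, angle_rad=0.0):
    """Frequency shift (Hz) of back-scattered light; velocity is positive
    when the scatterer moves toward the source, giving a blue shift. The
    factor of two reflects the round trip to and from the moving cell."""
    return 2.0 * velocity_m_s * math.cos(angle_rad) / wavelength_m

def wavelength_shift(velocity_m_s, wavelength_m, angle_rad=0.0):
    """Equivalent wavelength shift (m); negative means blue-shifted."""
    df = doppler_frequency_shift(velocity_m_s, wavelength_m, angle_rad)
    return -wavelength_m ** 2 * df / C

# Blood cells moving at 10 mm/s toward a 660 nm red laser:
df = doppler_frequency_shift(0.010, 660e-9)   # ~3.0e4 Hz beat frequency
dl = wavelength_shift(0.010, 660e-9)          # ~ -4.4e-17 m: tiny shift,
                                              # hence interferometric detection
```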



FIG. 27 depicts an aspect of instrumentation 2530 that may be used to detect a Doppler shift in laser light scattered from portions of a tissue 2540. Light 2534 originating from a laser 2532 may pass through a beam splitter 2544. Some portion of the laser light 2536 may be transmitted by the beam splitter 2544 and may illuminate tissue 2540. Another portion of the laser light may be reflected 2546 by the beam splitter 2544 to impinge on a detector 2550. The light back-scattered 2542 by the tissue 2540 may be directed by the beam splitter 2544 and also impinge on the detector 2550. The combination of the light 2534 originating from the laser 2532 with the light back-scattered 2542 by the tissue 2540 may result in an interference pattern detected by the detector 2550. The interference pattern received by the detector 2550 may include interference fringes resulting from the combination of the light 2534 originating from the laser 2532 and the Doppler shifted (and thus wavelength shifted) light back-scattered 2542 from the tissue 2540.


It may be recognized that back-scattered light 2542 from the tissue 2540 may also include back scattered light from boundary layers within the tissue 2540 and/or wavelength-specific light absorption by material within the tissue 2540. As a result, the interference pattern observed at the detector 2550 may incorporate interference fringe features from these additional optical effects and may therefore confound the calculation of the Doppler shift unless properly analyzed.



FIG. 28 depicts some of these additional optical effects. It is well known that light traveling through a first optical medium having a first refractive index, n1, may be reflected at an interface with a second optical medium having a second refractive index, n2. The light transmitted through the second optical medium will have a transmission angle relative to the interface that differs from the angle of the incident light based on a difference between the refractive indices n1 and n2 (Snell's Law). FIG. 28 illustrates the effect of Snell's Law on light impinging on the surface of a multi-component tissue 2150, as may be presented in a surgical field. The multi-component tissue 2150 may be composed of an outer tissue layer 2152 having a refractive index n1 and a buried tissue, such as a blood vessel having a vessel wall 2156. The blood vessel wall 2156 may be characterized by a refractive index n2. Blood may flow within the lumen of the blood vessel 2160. In some aspects, it may be important during a surgical procedure to determine the position of the blood vessel 2160 below the surface 2154 of the outer tissue layer 2152 and to characterize the blood flow using Doppler shift techniques.


An incident laser light 2170a may be used to probe for the blood vessel 2160 and may be directed on the top surface 2154 of the outer tissue layer 2152. A portion 2172 of the incident laser light 2170a may be reflected at the top surface 2154. Another portion 2170b of the incident laser light 2170a may penetrate the outer tissue layer 2152. The reflected portion 2172 at the top surface 2154 of the outer tissue layer 2152 has the same path length as the incident light 2170a, and therefore has the same wavelength and phase as the incident light 2170a. However, the portion 2170b of light transmitted into the outer tissue layer 2152 will have a transmission angle that differs from the incidence angle of the light impinging on the tissue surface because the outer tissue layer 2152 has an index of refraction n1 that differs from the index of refraction of air.


If the portion of light transmitted through the outer tissue layer 2152 impinges on a second tissue surface 2158, for example of the blood vessel wall 2156, some portion 2174a,b of light will be reflected back towards the source of the incident light 2170a. The light thus reflected 2174a at the interface between the outer tissue layer 2152 and the blood vessel wall 2156 will have the same wavelength as the incident light 2170a, but will be phase shifted due to the change in the light path length. Projecting the light reflected 2174a,b from the interface between the outer tissue layer 2152 and the blood vessel wall 2156, along with the incident light, on the sensor will produce an interference pattern based on the phase difference between the two light sources.
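
A minimal numerical sketch of the two effects just described, assuming illustrative refractive indices and a 1 mm outer layer: Snell's-law refraction at the tissue interface, and the additional round-trip phase accumulated while traversing the outer tissue layer.

```python
import math

def refraction_angle(theta_incident_rad, n1, n2):
    """Transmission angle at an n1 -> n2 interface (Snell's law)."""
    return math.asin(n1 * math.sin(theta_incident_rad) / n2)

def round_trip_phase(thickness_m, n, wavelength_m):
    """Phase (radians, wrapped to [0, 2*pi)) gained over a round trip
    through a layer of refractive index n."""
    optical_path = 2.0 * thickness_m * n          # down and back
    return (2.0 * math.pi * optical_path / wavelength_m) % (2.0 * math.pi)

# Illustrative values: air (n=1.00) into tissue (assumed n=1.38), 660 nm light.
theta_t = refraction_angle(math.radians(20.0), n1=1.00, n2=1.38)
phi = round_trip_phase(1.0e-3, n=1.38, wavelength_m=660e-9)   # 1 mm layer
```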


Further, a portion of the incident light 2170c may be transmitted through the blood vessel wall 2156 and penetrate into the blood vessel lumen 2160. This portion of the incident light 2170c may interact with the moving blood cells in the blood vessel lumen 2160 and may be reflected back 2176a-c towards the source of the impinging light having a wavelength Doppler shifted according to the velocity of the blood cells, as disclosed above. The Doppler shifted light reflected 2176a-c from the moving blood cells may be projected along with the incident light on the sensor, resulting in an interference pattern having a fringe pattern based on the wavelength difference between the two light sources.


In FIG. 28, a light path 2178 is depicted for light impinging on the red blood cells in the blood vessel lumen 2160 if there are no changes in refractive index between the emitted light and the light reflected by the moving blood cells. In this example, only a Doppler shift in the reflected light wavelength can be detected. However, the light reflected by the blood cells (2176a-c) may incorporate phase changes due to the variation in the tissue refractive indices in addition to the wavelength changes due to the Doppler effect.


Thus, it may be understood that if the light sensor receives the incident light, the light reflected from one or more tissue interfaces (2172, and 2174a,b) and the Doppler shifted light from the blood cells (2176a-c), the interference pattern thus produced on the light sensor may include the effects due to the Doppler shift (change in wavelength) as well as the effects due to the change in refractive index within the tissue (change in phase). As a result, a Doppler analysis of the light reflected by the tissue sample may produce erroneous results if the effects due to changes in the refractive index within the sample are not compensated for.



FIG. 29 illustrates an example of the effects on a Doppler analysis of light that impinges 2250 on a tissue sample to determine the depth and location of an underlying blood vessel. If there is no intervening tissue between the blood vessel and the tissue surface, the interference pattern detected at the sensor may be due primarily to the change in wavelength reflected from the moving blood cells. As a result, a spectrum 2252 derived from the interference pattern may generally reflect only the Doppler shift of the blood cells. However, if there is intervening tissue between the blood vessel and the tissue surface, the interference pattern detected at the sensor may be due to a combination of the change in wavelength reflected from the moving blood cells and the phase shift due to the refractive index of the intervening tissue. A spectrum 2254 derived from such an interference pattern may result in a calculated Doppler shift that is confounded by the additional phase change in the reflected light. In some aspects, if information regarding the characteristics (thickness and refractive index) of the intervening tissue is known, the resulting spectrum 2256 may be corrected to provide a more accurate calculation of the change in wavelength.


It is recognized that the tissue penetration depth of light is dependent on the wavelength of the light used. Thus, the wavelength of the laser source light may be chosen to detect particle motion (such as blood cells) at a specific range of tissue depth. FIGS. 30A-C depict schematically a means for detecting moving particles such as blood cells at a variety of tissue depths based on the laser light wavelength. As illustrated in FIG. 30A, a laser source 2340 may direct an incident beam of laser light 2342 onto a surface 2344 of a surgical site. A blood vessel 2346 (such as a vein or artery) may be disposed within the tissue 2348 at some depth δ from the tissue surface. The penetration depth 2350 of a laser into a tissue 2348 may be dependent at least in part on the laser wavelength. Thus, laser light having a wavelength in the red range of about 635 nm to about 660 nm may penetrate the tissue 2351a to a depth of about 1 mm. Laser light having a wavelength in the green range of about 520 nm to about 532 nm may penetrate the tissue 2351b to a depth of about 2-3 mm. Laser light having a wavelength in the blue range of about 405 nm to about 445 nm may penetrate the tissue 2351c to a depth of about 4 mm or greater. In the example depicted in FIGS. 30A-C, a blood vessel 2346 may be located at a depth δ of about 2-3 mm below the tissue surface. Red laser light will not penetrate to this depth and thus will not detect blood cells flowing within this vessel. However, both green and blue laser light can penetrate this depth. Therefore, scattered green and blue laser light from the blood cells within the blood vessel 2346 may demonstrate a Doppler shift in wavelength.
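
The wavelength-to-depth reasoning above lends itself to a simple bracketing rule: the shallowest wavelength that registers a Doppler shift gives an upper bound on vessel depth, and the deepest wavelength that does not gives a lower bound. A minimal sketch, using the approximate penetration depths stated in the text:

```python
# Approximate penetration depths from the text (mm); illustrative values.
PENETRATION_MM = {"red": 1.0, "green": 3.0, "blue": 4.0}

def bracket_vessel_depth(shift_detected):
    """shift_detected maps color -> bool (Doppler shift observed at that
    wavelength); returns (min_depth_mm, max_depth_mm)."""
    colors = sorted(PENETRATION_MM, key=PENETRATION_MM.get)  # shallow -> deep
    lower = 0.0
    for color in colors:
        if shift_detected[color]:
            return lower, PENETRATION_MM[color]
        lower = PENETRATION_MM[color]
    return lower, None   # deeper than any probed wavelength can reach

# Vessel visible to green and blue but not red: depth between ~1 and ~3 mm.
print(bracket_vessel_depth({"red": False, "green": True, "blue": True}))
```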



FIG. 30B illustrates how a Doppler shift 2355 in the wavelength of reflected laser light may appear. The emitted light (or laser source light 2342) impinging on a tissue surface 2344 may have a central wavelength 2352. For example, light from a green laser may have a central wavelength 2352 within a range of about 520 nm to about 532 nm. The reflected green light may have a central wavelength 2354 shifted to a longer wavelength (red shifted) if the light was reflected from a particle such as a red blood cell that is moving away from the detector. The difference between the central wavelength 2352 of the emitted laser light and the central wavelength 2354 of the reflected laser light comprises the Doppler shift 2355.


As disclosed above with respect to FIGS. 28 and 29, laser light reflected from structures within a tissue 2348 may also show a phase shift in the reflected light due to changes in the index of refraction arising from changes in tissue structure or composition. The emitted light (or laser source light 2342) impinging on a tissue surface 2344 may have a first phase characteristic 2356. The reflected laser light may have a second phase characteristic 2358. It may be recognized that blue laser light that can penetrate tissue to a depth of about 4 mm or greater 2351c may encounter a greater variety of tissue structures than red laser light (about 1 mm 2351a) or green laser light (about 2-3 mm 2351b). Consequently, as illustrated in FIG. 30C, the phase shift 2358 of reflected blue laser light may be significant at least due to the depth of penetration.



FIG. 30D illustrates aspects of illuminating tissue by red 2360a, green 2360b and blue 2360c laser light in a sequential manner. In some aspects, a tissue may be probed by red 2360a, green 2360b and blue 2360c laser illumination in a sequential manner. In some alternative examples, one or more combinations of red 2360a, green 2360b, and blue 2360c laser light, as depicted in FIGS. 23D-23F and disclosed above, may be used to illuminate the tissue according to a defined illumination sequence. FIG. 30D illustrates the effect of such illumination on a CMOS imaging sensor 2362a-d over time. Thus, at a first time t1, the CMOS sensor 2362a may be illuminated by the red 2360a laser. At a second time t2, the CMOS sensor 2362b may be illuminated by the green 2360b laser. At a third time t3, the CMOS sensor 2362c may be illuminated by the blue 2360c laser. The illumination cycle may then be repeated starting at a fourth time t4 in which the CMOS sensor 2362d may be illuminated by the red 2360a laser again. It may be recognized that sequential illumination of the tissue by laser illumination at differing wavelengths may permit a Doppler analysis at varying tissue depths over time. Although red 2360a, green 2360b and blue 2360c laser sources may be used to illuminate the surgical site, it may be recognized that other wavelengths outside of visible light (such as in the infrared or ultraviolet regions) may be used to illuminate the surgical site for Doppler analysis.
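
A minimal sketch of such an illumination sequence is shown below. The laser-driver and sensor interfaces (lasers[color].pulse, sensor.read_frame) are hypothetical placeholders, and the frame rate is an assumption; a real system would synchronize the sources to the CMOS frame clock:

```python
import itertools

FRAME_PERIOD_S = 1.0 / 60.0          # assumed sensor frame rate
SEQUENCE = ("red", "green", "blue")  # t1, t2, t3, then repeat at t4

def acquire_cycles(lasers, sensor, n_frames):
    """Fire one laser per frame in a repeating R, G, B sequence and tag
    each captured frame with the color that produced it, so each frame can
    be Doppler-analyzed at that color's penetration depth."""
    frames = []
    for color in itertools.islice(itertools.cycle(SEQUENCE), n_frames):
        lasers[color].pulse(FRAME_PERIOD_S)          # hypothetical driver call
        frames.append((color, sensor.read_frame()))  # hypothetical sensor read
    return frames
```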



FIG. 31 illustrates an example of a use of Doppler imaging to detect the presence of blood vessels not otherwise viewable at a surgical site 2600. In FIG. 31, a surgeon may wish to excise a tumor 2602 found in the right superior posterior lobe 2604 of a lung. Because the lungs are highly vascular, care must be taken to identify only those blood vessels associated with the tumor and to seal only those vessels without compromising the blood flow to the non-affected portions of the lung. In FIG. 31, the surgeon has identified the margin 2606 of the tumor 2602. The surgeon may then cut an initial dissected area 2608 in the margin region 2606, and exposed blood vessels 2610 may be observed for cutting and sealing. The Doppler imaging detector 2620 may be used to locate and identify blood vessels 2612 not observable in the dissected area. An imaging system may receive data from the Doppler imaging detector 2620 for analysis and display of the data obtained from the surgical site 2600. In some aspects, the imaging system may include a display to illustrate the surgical site 2600 including a visible image of the surgical site 2600 along with an image overlay of the hidden blood vessels 2612 on the image of the surgical site 2600.


In the scenario disclosed above regarding FIG. 31, a surgeon wishes to sever blood vessels that supply oxygen and nutrients to a tumor while sparing blood vessels associated with non-cancerous tissue. Additionally, the blood vessels may be disposed at different depths in or around the surgical site 2600. The surgeon must therefore identify the position (depth) of the blood vessels as well as determine if they are appropriate for resection. FIG. 32 illustrates one method for identifying deep blood vessels based on a Doppler shift of light from blood cells flowing therethrough. As disclosed above, red laser light has a penetration depth of about 1 mm and green laser light has a penetration depth of about 2-3 mm. However, a blood vessel having a below-surface depth of 4 mm or more will be outside the penetration depths at these wavelengths. Blue laser light, however, can detect such blood vessels based on their blood flow.



FIG. 32 depicts the Doppler shift of laser light reflected from a blood vessel at a specific depth below a surgical site. The site may be illuminated by red laser light, green laser light, and blue laser light. The central wavelength 2630 of the illuminating light may be normalized to a relative central wavelength 2631. If the blood vessel lies at a depth of 4 or more mm below the surface of the surgical site, neither the red laser light nor the green laser light will be reflected by the blood vessel. Consequently, the central wavelength 2632 of the reflected red light and the central wavelength 2634 of the reflected green light will not differ much from the central wavelength 2630 of the illuminating red light or green light, respectively. However, if the site is illuminated by blue laser light, the central wavelength 2638 of the reflected blue light 2636 will differ from the central wavelength 2630 of the illuminating blue light. In some instances, the amplitude of the reflected blue light 2636 may also be significantly reduced from the amplitude of the illuminating blue light. A surgeon may thus determine the presence of a deep lying blood vessel along with its approximate depth, and thereby avoid the deep blood vessel during surface tissue dissection.



FIGS. 33 and 34 illustrate schematically the use of laser sources having differing central wavelengths (colors) for determining the approximate depth of a blood vessel beneath the surface of a surgical site. FIG. 33 depicts a first surgical site 2650 having a surface 2654 and a blood vessel 2656 disposed below the surface 2654. In one method, the blood vessel 2656 may be identified based on a Doppler shift of light impinging on the flow 2658 of blood cells within the blood vessel 2656. The surgical site 2650 may be illuminated by light from a number of lasers 2670, 2676, 2682, each laser being characterized by emitting light at one of several different central wavelengths. As noted above, illumination by a red laser 2670 can only penetrate tissue by about 1 mm. Thus, if the blood vessel 2656 were located at a depth of less than 1 mm 2672 below the surface 2654, the red laser illumination would be reflected 2674 and a Doppler shift of the reflected red illumination 2674 may be determined. Further, as noted above, illumination by a green laser 2676 can only penetrate tissue by about 2-3 mm. If the blood vessel 2656 were located at a depth of about 2-3 mm 2678 below the surface 2654, the green laser illumination would be reflected 2680 while the red laser illumination 2670 would not, and a Doppler shift of the reflected green illumination 2680 may be determined. However, as depicted in FIG. 33, the blood vessel 2656 is located at a depth of about 4 mm 2684 below the surface 2654. Therefore, neither the red laser illumination 2670 nor the green laser illumination 2676 would be reflected. Instead, only the blue laser illumination would be reflected 2686 and a Doppler shift of the reflected blue illumination 2686 may be determined.


In contrast to the blood vessel 2656 depicted in FIG. 33, the blood vessel 2656′ depicted in FIG. 34 is located closer to the surface of the tissue at the surgical site. Blood vessel 2656′ may also be distinguished from blood vessel 2656 in that blood vessel 2656′ is illustrated to have a much thicker wall 2657. Thus, blood vessel 2656′ may be an example of an artery while blood vessel 2656 may be an example of a vein because arterial walls are known to be thicker than venous walls. In some examples, arterial walls may have a thickness of about 1.3 mm. As disclosed above, red laser illumination 2670′ can penetrate tissue to a depth of about 1 mm 2672′. Thus, even if a blood vessel 2656′ is exposed at a surgical site (see 2610 at FIG. 31), red laser light that is reflected 2674′ from the surface of the blood vessel 2656′ may not be able to visualize blood flow 2658′ within the blood vessel 2656′ under a Doppler analysis due to the thickness of the blood vessel wall 2657. However, as disclosed above, green laser light impinging 2676′ on the surface of a tissue may penetrate to a depth of about 2-3 mm 2678′. Further, blue laser light impinging 2682′ on the surface of a tissue may penetrate to a depth of about 4 mm 2684′. Consequently, green laser light may be reflected 2680′ from the blood cells flowing 2658′ within the blood vessel 2656′ and blue laser light may be reflected 2686′ from the blood cells flowing 2658′ within the blood vessel 2656′. As a result, a Doppler analysis of the reflected green light 2680′ and reflected blue light 2686′ may provide information regarding blood flow in a near-surface blood vessel, including the approximate depth of the blood vessel.


As disclosed above, the depth of blood vessels below the surgical site may be probed based on wavelength-dependent Doppler imaging. The amount of blood flow through such a blood vessel may also be determined by speckle contrast (interference) analysis. Doppler shift may indicate a moving particle with respect to a stationary light source. As disclosed above, the Doppler wavelength shift may be an indication of the velocity of the particle motion. Individual particles such as blood cells may not be separately observable. However, the velocity of each blood cell will produce a proportional Doppler shift. An interference pattern may be generated by the combination of the light back-scattered from multiple blood cells due to the differences in the Doppler shift of the back-scattered light from each of the blood cells. The interference pattern may be an indication of the number density of blood cells within a visualization frame. The interference pattern may be termed speckle contrast. Speckle contrast analysis may be calculated using a full frame 300×300 CMOS imaging array, and the speckle contrast may be directly related to the amount of moving particles (for example blood cells) interacting with the laser light over a given exposure period.
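
Speckle contrast is conventionally computed as K = σ/⟨I⟩ over small windows of the intensity frame; lower contrast over an exposure generally corresponds to more motion blurring of the speckle pattern, i.e. more moving scatterers. A minimal sketch over a synthetic 300×300 frame, with the window size as an assumption:

```python
import numpy as np

def speckle_contrast_map(frame, window=7):
    """frame: 2-D intensity array (e.g. 300x300); returns K = std/mean
    computed over non-overlapping windows."""
    h, w = frame.shape
    rows, cols = h // window, w // window
    k_map = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            patch = frame[r*window:(r+1)*window, c*window:(c+1)*window]
            mean = patch.mean()
            k_map[r, c] = patch.std() / mean if mean > 0 else 0.0
    return k_map

rng = np.random.default_rng(0)
frame = rng.gamma(shape=2.0, scale=100.0, size=(300, 300))  # synthetic speckle
k = speckle_contrast_map(frame)
```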


A CMOS image sensor may be coupled to a digital signal processor (DSP). Each pixel of the sensor may be multiplexed and digitized. The Doppler shift in the light may be analyzed by looking at the source laser light in comparison to the Doppler shifted light. A greater Doppler shift and speckle may be related to a greater number of blood cells and their velocity in the blood vessel.



FIG. 35 depicts an aspect of a composite visual display 2800 that may be presented to a surgeon during a surgical procedure. The composite visual display 2800 may be constructed by overlaying a white light image 2830 of the surgical site with a Doppler analysis image 2850.


In some aspects, the white light image 2830 may portray the surgical site 2832, one or more surgical incisions 2834, and the tissue 2836 readily visible within the surgical incision 2834. The white light image 2830 may be generated by illuminating 2840 the surgical site 2832 with a white light source 2838 and receiving the reflected white light 2842 by an optical detector. Although a white light source 2838 may be used to illuminate the surface of the surgical site, in one aspect, the surface of the surgical site may be visualized using appropriate combinations of red 2854, green 2856, and blue 2858 laser light as disclosed above with respect to FIGS. 23C-23F.


In some aspects, the Doppler analysis image 2850 may include blood vessel depth information along with blood flow information 2852 (from speckle analysis). As disclosed above, blood vessel depth and blood flow velocity may be obtained by illuminating the surgical site with laser light of multiple wavelengths, and determining the blood vessel depth and blood flow based on the known penetration depth of the light of a particular wavelength. In general, the surgical site 2832 may be illuminated by light emitted by one or more lasers such as a red laser 2854, a green laser 2856, and a blue laser 2858. A CMOS detector 2872 may receive the light reflected back (2862, 2866, 2870) from the surgical site 2832 and its surrounding tissue. The Doppler analysis image 2850 may be constructed 2874 based on an analysis of the multiple pixel data from the CMOS detector 2872.


In one aspect, a red laser 2854 may emit red laser illumination 2860 on the surgical site 2832 and the reflected light 2862 may reveal surface or minimally subsurface structures. In one aspect, a green laser 2856 may emit green laser illumination 2864 on the surgical site 2832 and the reflected light 2866 may reveal deeper subsurface characteristics. In another aspect, a blue laser 2858 may emit blue laser illumination 2868 on the surgical site 2832 and the reflected light 2870 may reveal, for example, blood flow within deeper vascular structures. In addition, the speckle contrast analysis may present the surgeon with information regarding the amount and velocity of blood flow through the deeper vascular structures.
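
A minimal sketch of composing such a display, assuming the white-light image and the Doppler/speckle result are already registered arrays; the alpha value, mask threshold, and red-channel encoding are illustrative choices:

```python
import numpy as np

def compose_display(white_light_rgb, doppler_map, alpha=0.4):
    """white_light_rgb: HxWx3 floats in [0, 1]; doppler_map: HxW in [0, 1].
    Returns the white-light image with flow information blended on top."""
    overlay = np.zeros_like(white_light_rgb)
    overlay[..., 0] = doppler_map           # encode flow strength in red channel
    mask = (doppler_map > 0.05)[..., None]  # blend only where flow was detected
    return np.where(mask,
                    (1.0 - alpha) * white_light_rgb + alpha * overlay,
                    white_light_rgb)
```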


Although not depicted in FIG. 35, it may be understood that the imaging system may also illuminate the surgical site with light outside of the visible range. Such light may include infrared light and ultraviolet light. In some aspects, sources of the infrared light or ultraviolet light may include broad-band wavelength sources (such as a tungsten source, a tungsten-halogen source, or a deuterium source). In some other aspects, the sources of the infrared or ultraviolet light may include narrow-band wavelength sources (IR diode lasers, UV gas lasers, or dye lasers).



FIG. 36 is a flow chart 2900 of a method for determining a depth of a surface feature in a piece of tissue. An image acquisition system may illuminate 2910 a tissue with a first light beam having a first central wavelength and receive 2912 a first reflected light from the tissue illuminated by the first light beam. The image acquisition system may then calculate 2914 a first Doppler shift based on the first light beam and the first reflected light. The image acquisition system may then illuminate 2916 the tissue with a second light beam having a second central wavelength and receive 2918 a second reflected light from the tissue illuminated by the second light beam. The image acquisition system may then calculate 2920 a second Doppler shift based on the second light beam and the second reflected light. The image acquisition system may then calculate 2922 a depth of a tissue feature based at least in part on the first central wavelength, the first Doppler shift, the second central wavelength, and the second Doppler shift. In some aspects, the tissue features may include the presence of moving particles, such as blood cells moving within a blood vessel, and a direction and velocity of flow of the moving particles. It may be understood that the method may be extended to include illumination of the tissue by any one or more additional light beams. Further, the system may calculate an image comprising a combination of an image of the tissue surface and an image of the structure disposed within the tissue.
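
A minimal sketch of the method of FIG. 36, combining the Doppler and penetration-depth ideas above: compute a per-wavelength Doppler shift from the emitted and reflected central wavelengths, convert it to a line-of-sight velocity, and bracket the feature depth by which wavelengths registered a shift. The noise floor and example readings are assumed values (in practice the tiny wavelength differences are obtained interferometrically):

```python
C = 299_792_458.0   # speed of light, m/s

def doppler_velocity(lambda_emitted_m, lambda_reflected_m):
    """Line-of-sight velocity from the round-trip relation
    d_lambda / lambda ~ 2 v / c; positive = moving away (red shift)."""
    d_lambda = lambda_reflected_m - lambda_emitted_m
    return C * d_lambda / (2.0 * lambda_emitted_m)

def analyze(readings):
    """readings: list of (lambda_emitted_m, lambda_reflected_m,
    penetration_mm), ordered shallowest- to deepest-penetrating."""
    lower = 0.0
    for lam_e, lam_r, depth_mm in readings:
        v = doppler_velocity(lam_e, lam_r)
        if abs(v) > 1.0e-4:                 # assumed 0.1 mm/s noise floor
            return {"velocity_m_s": v, "depth_mm": (lower, depth_mm)}
        lower = depth_mm
    return None                             # no moving feature detected

# Red (1 mm) sees no shift; green (3 mm) sees ~10 mm/s receding flow,
# so the vessel lies at roughly 1-3 mm depth.
result = analyze([(660e-9, 660e-9, 1.0),
                  (532e-9, 532e-9 + 3.5e-17, 3.0)])
```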


In some aspects, multiple visual displays may be used. For example, a 3D display may provide a composite image displaying the combined white light (or an appropriate combination of red, green, and blue laser light) and laser Doppler image. Additional displays may provide only the white light display, or a display showing a composite white light display and an NIRS display to visualize only the blood oxygenation response of the tissue. However, the NIRS display may not be required every cycle, allowing time for the tissue response to develop.


Subsurface Tissue Characterization Using Multispectral OCT

During a surgical procedure, the surgeon may employ "smart" surgical devices for the manipulation of tissue. Such devices may be considered "smart" in that they include automated features to direct, control, and/or vary the actions of the devices based on parameters relevant to their uses. The parameters may include the type and/or composition of the tissue being manipulated. If the type and/or composition of the tissue being manipulated is unknown, the actions of the smart devices may be inappropriate for the tissue being manipulated. As a result, tissues may be damaged or the manipulation of the tissue may be ineffective due to inappropriate settings of the smart device.


The surgeon may manually attempt to vary the parameters of the smart device in a trial-and-error manner, resulting in an inefficient and lengthy surgical procedure.


Therefore, it is desirable to have a surgical visualization system that can probe tissue structures underlying a surgical site to determine their structural and compositional characteristics, and to provide such data to smart surgical instruments being used in a surgical procedure.


Some aspects of the present disclosure further provide for a control circuit configured to control the illumination of a surgical site using one or more illumination sources such as laser light sources and to receive imaging data from one or more image sensors. In some aspects, the present disclosure provides for a non-transitory computer readable medium storing computer readable instructions that, when executed, cause a device to characterize structures below the surface at a surgical site and determine the depth of the structures below the surface of the tissue.


In some aspects, a surgical image acquisition system may comprise a plurality of illumination sources wherein each illumination source is configured to emit light having a specified central wavelength, a light sensor configured to receive a portion of the light reflected from a tissue sample when illuminated by the one or more of the plurality of illumination sources, and a computing system. The computing system may be configured to receive data from the light sensor when the tissue sample is illuminated by each of the plurality of illumination sources, calculate structural data related to a characteristic of a structure within the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each of the illumination sources, and transmit the structural data related to the characteristic of the structure to be received by a smart surgical device. In some aspects, the characteristic of the structure is a surface characteristic or a structure composition.


In one aspect, a surgical system may include multiple laser light sources and may receive laser light reflected from a tissue. The light reflected from the tissue may be used by the system to calculate surface characteristics of components disposed within the tissue. The characteristics of the components disposed within the tissue may include a composition of the components and/or a metric related to surface irregularities of the components.


In one aspect, the surgical system may transmit data related to the composition of the components and/or metrics related to surface irregularities of the components to a second instrument to be used on the tissue to modify the control parameters of the second instrument.


In some aspects, the second device may be an advanced energy device and the modifications of the control parameters may include a clamp pressure, an operational power level, an operational frequency, and a transducer signal amplitude.


As disclosed above, blood vessels may be detected under the surface of a surgical site based on the Doppler shift in light reflected by the blood cells moving within the blood vessels.


Laser Doppler flowmetry may be used to visualize and characterize a flow of particles moving relative to an effectively stationary background. Thus, laser light scattered by moving particles, such as blood cells, may have a different wavelength than that of the original illuminating laser source. In contrast, laser light scattered by the effectively stationary background (for example, the vascular tissue) may have the same wavelength as that of the original illuminating laser source. The change in wavelength of the scattered light from the blood cells may reflect both the direction of the flow of the blood cells relative to the laser source as well as the blood cell velocity. As previously disclosed, FIGS. 26A-C illustrate the change in wavelength of light scattered from blood cells that may be moving away from (FIG. 26A) or towards (FIG. 26C) the laser light source.


In each of FIGS. 26A-C, the original illuminating light 2502 is depicted having a relative central wavelength of 0. It may be observed from FIG. 26A that light scattered from blood cells moving away from the laser source 2504 has a wavelength shifted by some amount 2506 to a greater wavelength relative to that of the laser source (and is thus red shifted). It may also be observed from FIG. 26C that light scattered from blood cells moving towards the laser source 2508 has a wavelength shifted by some amount 2510 to a shorter wavelength relative to that of the laser source (and is thus blue shifted). The amount of wavelength shift (for example 2506 or 2510) may be dependent on the velocity of the motion of the blood cells. In some aspects, an amount of a red shift (2506) of some blood cells may be about the same as the amount of blue shift (2510) of some other blood cells. Alternatively, an amount of a red shift (2506) of some blood cells may differ from the amount of blue shift (2510) of some other blood cells. Thus, the velocity of the blood cells flowing away from the laser source as depicted in FIG. 26A may be less than the velocity of the blood cells flowing towards the laser source as depicted in FIG. 26C based on the relative magnitude of the wavelength shifts (2506 and 2510). In contrast, and as depicted in FIG. 26B, light scattered from tissue not moving relative to the laser light source (for example blood vessels 2512 or non-vascular tissue 2514) may not demonstrate any change in wavelength.


As previously disclosed, FIG. 27 depicts an aspect of instrumentation 2530 that may be used to detect a Doppler shift in laser light scattered from portions of a tissue 2540. Light 2534 originating from a laser 2532 may pass through a beam splitter 2544. Some portion of the laser light 2536 may be transmitted by the beam splitter 2544 and may illuminate tissue 2540. Another portion of the laser light may be reflected 2546 by the beam splitter 2544 to impinge on a detector 2550. The light back-scattered 2542 by the tissue 2540 may be directed by the beam splitter 2544 and also impinge on the detector 2550. The combination of the light 2534 originating from the laser 2532 with the light back-scattered 2542 by the tissue 2540 may result in an interference pattern detected by the detector 2550. The interference pattern received by the detector 2550 may include interference fringes resulting from the combination of the light 2534 originating from the laser 2532 and the Doppler shifted (and thus wavelength shifted) light back-scattered 2542 from the tissue 2540.


It may be recognized that back-scattered light 2542 from the tissue 2540 may also include back scattered light from boundary layers within the tissue 2540 and/or wavelength-specific light absorption by material within the tissue 2540. As a result, the interference pattern observed at the detector 2550 may incorporate interference fringe features from these additional optical effects and may therefore confound the calculation of the Doppler shift unless properly analyzed.


As previously disclosed, FIG. 28 depicts some of these additional optical effects. It is well known that light traveling through a first optical medium having a first refractive index, n1, may be reflected at an interface with a second optical medium having a second refractive index, n2. The light transmitted through the second optical medium will have a transmission angle relative to the interface that differs from the angle of the incident light based on a difference between the refractive indices n1 and n2 (Snell's Law). FIG. 28 illustrates the effect of Snell's Law on light impinging on the surface of a multi-component tissue 2150, as may be presented in a surgical field. The multi-component tissue 2150 may be composed of an outer tissue layer 2152 having a refractive index n1 and a buried tissue, such as a blood vessel having a vessel wall 2156. The blood vessel wall 2156 may be characterized by a refractive index n2. Blood may flow within the lumen of the blood vessel 2160. In some aspects, it may be important during a surgical procedure to determine the position of the blood vessel 2160 below the surface 2154 of the outer tissue layer 2152 and to characterize the blood flow using Doppler shift techniques.


An incident laser light 2170a may be used to probe for the blood vessel 2160 and may be directed on the top surface 2154 of the outer tissue layer 2152. A portion 2172 of the incident laser light 2170a may be reflected at the top surface 2154. Another portion 2170b of the incident laser light 2170a may penetrate the outer tissue layer 2152. The reflected portion 2172 at the top surface 2154 of the outer tissue layer 2152 has the same path length as the incident light 2170a, and therefore has the same wavelength and phase as the incident light 2170a. However, the portion 2170b of light transmitted into the outer tissue layer 2152 will have a transmission angle that differs from the incidence angle of the light impinging on the tissue surface because the outer tissue layer 2152 has an index of refraction n1 that differs from the index of refraction of air.


If the portion of light transmitted through the outer tissue layer 2152 impinges on a second tissue surface 2158, for example of the blood vessel wall 2156, some portion 2174a,b of light will be reflected back towards the source of the incident light 2170a. The light thus reflected 2174a at the interface between the outer tissue layer 2152 and the blood vessel wall 2156 will have the same wavelength as the incident light 2170a, but will be phase shifted due to the change in the light path length. Projecting the light reflected 2174a,b from the interface between the outer tissue layer 2152 and the blood vessel wall 2156, along with the incident light, on the sensor will produce an interference pattern based on the phase difference between the two light sources.


Further, a portion of the incident light 2170c may be transmitted through the blood vessel wall 2156 and penetrate into the blood vessel lumen 2160. This portion of the incident light 2170c may interact with the moving blood cells in the blood vessel lumen 2160 and may be reflected back 2176a-c towards the source of the impinging light having a wavelength Doppler shifted according to the velocity of the blood cells, as disclosed above. The Doppler shifted light reflected 2176a-c from the moving blood cells may be projected along with the incident light on the sensor, resulting in an interference pattern having a fringe pattern based on the wavelength difference between the two light sources.


In FIG. 28, a light path 2178 is depicted for light impinging on the red blood cells in the blood vessel lumen 2160 if there are no changes in refractive index between the emitted light and the light reflected by the moving blood cells. In this example, only a Doppler shift in the reflected light wavelength can be detected. However, the light reflected by the blood cells (2176a-c) may incorporate phase changes due to the variation in the tissue refractive indices in addition to the wavelength changes due to the Doppler effect.


Thus, it may be understood that if the light sensor receives the incident light, the light reflected from one or more tissue interfaces (2172, and 2174a,b) and the Doppler shifted light from the blood cells (2176a-c), the interference pattern thus produced on the light sensor may include the effects due to the Doppler shift (change in wavelength) as well as the effects due to the change in refractive index within the tissue (change in phase). As a result, a Doppler analysis of the light reflected by the tissue sample may produce erroneous results if the effects due to changes in the refractive index within the sample are not compensated for.


As previously disclosed, FIG. 29 illustrates an example of the effects on a Doppler analysis of light that impinges 2250 on a tissue sample to determine the depth and location of an underlying blood vessel. If there is no intervening tissue between the blood vessel and the tissue surface, the interference pattern detected at the sensor may be due primarily to the change in wavelength reflected from the moving blood cells. As a result, a spectrum 2252 derived from the interference pattern may generally reflect only the Doppler shift of the blood cells. However, if there is intervening tissue between the blood vessel and the tissue surface, the interference pattern detected at the sensor may be due to a combination of the change in wavelength reflected from the moving blood cells and the phase shift due to the refractive index of the intervening tissue. A spectrum 2254 derived from such an interference pattern may result in a calculated Doppler shift that is confounded by the additional phase change in the reflected light. In some aspects, if information regarding the characteristics (thickness and refractive index) of the intervening tissue is known, the resulting spectrum 2256 may be corrected to provide a more accurate calculation of the change in wavelength.


It may be recognized that the phase shift in the reflected light from a tissue may provide additional information regarding underlying tissue structures, regardless of Doppler effects.



FIG. 37 illustrates that the location and characteristics of non-vascular structures may be determined based on the phase difference between the incident light 2372 and the light reflected from the deep tissue structures (2374, 2376, 2378). As noted above, the penetration depth of light impinging on a tissue is dependent on the wavelength of the impinging illumination. Red laser light (having a wavelength in the range of about 635 nm to about 660 nm) may penetrate the tissue to a depth of about 1 mm. Green laser light (having a wavelength in the range of about 520 nm to about 532 nm) may penetrate the tissue to a depth of about 2-3 mm. Blue laser light (having a wavelength in the range of about 405 nm to about 445 nm) may penetrate the tissue to a depth of about 4 mm or greater. In one aspect, an interface 2381a between two tissues differing in refractive index that is located less than or about 1 mm below a tissue surface 2380 may reflect 2374 red, green, or blue laser light. The phase of the reflected light 2374 may be compared to the incident light 2372 and thus the difference in the refractive index of the tissues at the interface 2381a may be determined. In another aspect, an interface 2381b between two tissues differing in refractive index that is located between 2 and 3 mm 2381b below a tissue surface 2380 may reflect 2376 green or blue laser light, but not red light. The phase of the reflected light 2376 may be compared to the incident light 2372 and thus the difference in the refractive index of the tissues at the interface 2381b may be determined. In yet another aspect, an interface 2381c between two tissues differing in refractive index that is located between 3 and 4 mm 2381c below a tissue surface 2380 may reflect 2378 only blue laser light, but not red or green light. The phase of the reflected light 2378 may be compared to the incident light 2372 and thus the difference in the refractive index of the tissues at the interface 2381c may be determined.


A phase interference measure of a tissue illuminated by light having different wavelengths may therefore provide information regarding the relative indices of refraction of the reflecting tissue as well as the depth of the tissue. The indices of refraction of the tissue may be assessed using the multiple laser sources and their intensity, and thereby relative indices of refraction may be calculated for the tissue. It is recognized that different tissues may have different refractive indices. For example, the refractive index may be related to the relative composition of collagen and elastin in a tissue or the amount of hydration of the tissue. Therefore, a technique to measure relative tissue index of refraction may result in the identification of a composition of the tissue.


In some aspects, smart surgical instruments include algorithms to determine parameters associated with the function of the instruments. One non-limiting example of such parameters may be the pressure of an anvil against a tissue for a smart stapling device. The amount of pressure of an anvil against a tissue may depend on the type and composition of the tissue. For example, less pressure may be required to staple a highly compressible tissue, while a greater amount of pressure may be required to staple a less compressible tissue. Another non-limiting example of a parameter associated with a smart surgical device may include a rate of firing of an i-beam knife to cut the tissue. For example, a stiff tissue may require more force and a slower cutting rate than a less stiff tissue. Another non-limiting example of such parameters may be the amount of current provided to an electrode in a smart cauterizing or RF sealing device. Tissue composition, such as percent tissue hydration, may determine an amount of current necessary to heat seal the tissue. Yet another non-limiting example of such parameters may be the amount of power provided to an ultrasonic transducer of a smart ultrasound cutting device or the driving frequency of the cutting device. A stiff tissue may require more power for cutting, and contact of the ultrasonic cutting tool with a stiff tissue may shift the resonance frequency of the cutter.
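
A minimal sketch of how characterized tissue properties might map to smart-device parameters of the kinds listed above. The tissue classes, metrics, and numerical values are illustrative assumptions, not device specifications:

```python
# Hypothetical parameter table keyed by a coarse tissue class.
PARAMETER_TABLE = {
    # tissue class: anvil pressure (kPa), knife rate (mm/s), RF power (W)
    "compressible":     {"anvil_kpa": 80.0,  "knife_mm_s": 8.0, "rf_w": 30.0},
    "non_compressible": {"anvil_kpa": 140.0, "knife_mm_s": 4.0, "rf_w": 45.0},
}

def select_parameters(stiffness_index, hydration_fraction):
    """Choose a parameter set from simple tissue metrics (assumed 0-1 scales)."""
    tissue = "non_compressible" if stiffness_index > 0.5 else "compressible"
    params = dict(PARAMETER_TABLE[tissue])
    # Assumption: drier tissue needs more RF power to reach sealing temperature.
    params["rf_w"] *= 1.0 + 0.5 * (1.0 - hydration_fraction)
    return params
```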


It may be recognized that a tissue visualization system that can identify tissue type and depth may provide such data to one or more smart surgical devices. The identification and location data may then be used by the smart surgical devices to adjust one or more of their operating parameters thereby allowing them to optimize their manipulation of the tissue. It may be understood that an optical method to characterize a type of tissue may permit automation of the operating parameters of the smart surgical devices. Such automation of the operation of smart surgical instruments may be preferable to relying on human estimation to determine the operational parameters of the instruments.


In one aspect, Optical Coherence Tomography (OCT) is a technique that can visualize subsurface tissue structures based on the phase difference between an illuminating light source and light reflected from structures located within the tissue. FIG. 38 depicts schematically one example of instrumentation 2470 for Optical Coherence Tomography. In FIG. 38, a laser source 2472 may emit light 2482 according to any optical wavelength of interest (red, green, blue, infrared, or ultraviolet). The light 2482 may be directed to a beam splitter 2486. The beam splitter 2486 directs one portion of the light 2488 to a tissue sample 2480. The beam splitter 2486 may also direct a portion of the light 2492 to a stationary reference mirror 2494. The light reflected from the tissue sample 2480 and from the stationary mirror 2494 may be recombined 2498 at the beam splitter 2486 and directed to a detector 2496. The phase difference between the light from the reference mirror 2494 and from the tissue sample 2480 may be detected at the detector 2496 as an interference pattern. Appropriate computing devices may then calculate phase information from the interference pattern. Additional computation may then provide information regarding structures below the surface of the tissue sample. Additional depth information may also be obtained by comparing the interference patterns generated from the sample when illuminated at different wavelengths of laser light.
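
A minimal sketch of the phase-to-depth relation underlying such a measurement: the round-trip phase is 2π(2nd)/λ, so a single measured phase maps to a family of candidate depths (it wraps every half wavelength in the medium), which is one reason the text notes that comparing interference patterns at several wavelengths helps resolve depth. The refractive index and wavelength are illustrative assumptions:

```python
import math

def depth_from_phase(phase_rad, wavelength_m, n_tissue=1.38, order=0):
    """Invert the round-trip phase 2*pi*(2*n*d)/lambda = phase + 2*pi*order
    for the reflector depth d; `order` is the unknown integer wrap count."""
    total_phase = phase_rad + 2.0 * math.pi * order
    return total_phase * wavelength_m / (4.0 * math.pi * n_tissue)

# The same measured phase corresponds to a family of candidate depths;
# a second wavelength would be used to select among them.
candidates = [depth_from_phase(1.2, 532e-9, order=m) for m in range(3)]
```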


As disclosed above, depth information regarding subsurface tissue structures may be ascertained from a combination of laser light wavelength and the phase of light reflected from a deep tissue structure. Additionally, local tissue surface inhomogeneity may be ascertained by comparing the phase and amplitude differences of light reflected from different portions of the same sub-surface tissues. Measurements of a difference in the tissue surface properties at a defined location compared to those at a neighboring location may be indicative of adhesions, disorganization of the tissue layers, infection, or a neoplasm in the tissue being probed.



FIG. 39 illustrates this effect. The surface characteristics of a tissue determine the angle of reflection of light impinging on the surface. A smooth surface 2551a reflects the light essentially with the same spread 2544 as the light impinging on the surface 2542 (specular reflection). Consequently, the amount of light received by a light detector having a known fixed aperture may effectively receive the entire amount of light reflected 2544 from the smooth surface 2551a. However, increased surface roughness at a tissue surface may result in an increased spread in the reflected light with respect to the incident light (diffuse reflection).


Some amount of the reflected light 2546 from a tissue surface having some amount of surface irregularities 2551b will fall outside the fixed aperture of the light detector due to the increased spread of the reflected light 2546. As a result, the light detector will detect less light (shown in FIG. 39 as a decrease in the amplitude of the reflected light signal 2546). It may be understood that the amount of reflected light spread will increase as the surface roughness of a tissue increases. Thus, as depicted in FIG. 39, the amplitude of light reflected 2548 from a surface 2551c having significant surface roughness may have a smaller amplitude than the light reflected 2544 from a smooth surface 2551a, or light reflected 2546 from a surface having only a moderate amount of surface roughness 2551b. Therefore, in some aspects, a single laser source may be used to investigate the quality of a tissue surface or subsurface by comparing the optical properties of reflected light from the tissue with the optical properties of reflected light from adjacent surfaces.
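
A minimal sketch of turning the amplitude loss described above into a simple roughness metric and classification; the smooth-surface reference amplitude and the class thresholds are illustrative assumptions:

```python
def roughness_metric(reflected_amplitude, reference_amplitude):
    """0 = fully specular (smooth); approaches 1 as more of the diffusely
    reflected light spreads outside the detector's fixed aperture."""
    return 1.0 - (reflected_amplitude / reference_amplitude)

def classify(metric):
    """Map the metric onto the three regimes of FIG. 39 (assumed cutoffs)."""
    if metric < 0.1:
        return "smooth"
    if metric < 0.4:
        return "moderate roughness"
    return "significant roughness"

print(classify(roughness_metric(0.55, 1.0)))   # -> "significant roughness"
```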


In other aspects, light from multiple laser sources (for example, lasers emitting light having different central wavelengths) may be used sequentially to probe tissue surface characteristics at a variety of depths below the surface 2550. As disclosed above (with reference to FIG. 37), the absorbance profile of laser light in a tissue depends on the central wavelength of the laser light. Laser light having a shorter (more blue) central wavelength can penetrate tissue deeper than laser light having a longer (more red) central wavelength. Therefore, measurements of diffuse reflection made at different light wavelengths can indicate both the amount of surface roughness and the depth of the surface being measured.



FIG. 40 illustrates one method of displaying image processing data related to a combination of tissue visualization modalities. Data used in the display may be derived from image phase data related to tissue layer composition, image intensity (amplitude) data related to tissue surface features, and image wavelength data related to tissue mobility (such as blood cell transport) as well as tissue depth. As one example, light emitted by a laser in the blue optical region 2562 may impinge on blood flowing at a depth of about 4 mm below the surface of the tissue. The reflected light 2564 may be red-shifted due to the Doppler effect of the blood flow. As a result, information may be obtained regarding the existence of a blood vessel and its depth below the surface.
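
The red shift described here follows the ordinary first-order Doppler relation Δλ = λ·(v/c). A minimal sketch with assumed values (the flow speed and wavelength are illustrative only):

```python
c = 3.0e8                    # speed of light (m/s)
lambda_emitted = 450e-9      # blue illumination wavelength (m), assumed
v_blood = 0.05               # blood flow speed along the line of sight (m/s), assumed

delta_lambda = lambda_emitted * (v_blood / c)   # first-order Doppler wavelength shift
print(delta_lambda)          # positive (red) shift for flow away from the probe
```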


In another example, a layer of tissue may lie at a depth of about 2-3 mm below the surface of the surgical site. This tissue may include surface irregularities indicative of scarring or other pathologies. Emitted red light 2572 may not penetrate to the 2-3 mm depth; consequently, the reflected red light 2580 may have about the same amplitude as the emitted red light 2572 because it is unable to probe structures more than 1 mm below the top surface of the surgical site. However, green light reflected from the tissue 2578 may reveal the existence of the surface irregularities at that depth in that the amplitude of the reflected green light 2578 may be less than the amplitude of the emitted green light 2570. Similarly, blue light reflected from the tissue 2574 may reveal the existence of the surface irregularities at that depth in that the amplitude of the reflected blue light 2574 may be less than the amplitude of the emitted blue light 2562. In one example of an image processing step, the image 2582 may be smoothed using a moving window filter 2584 to reduce inter-pixel noise as well as to suppress small local tissue anomalies 2586 that may hide more important features 2588.
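
A moving-window filter of the kind referenced as 2584 can be as simple as a box (mean) filter over each pixel's neighborhood. The sketch below is one minimal implementation; the window size is an assumption, and a production system might instead use a median or Gaussian kernel.

```python
import numpy as np

def moving_window_smooth(image, window=5):
    """Smooth a 2D image by averaging over a window x window neighborhood."""
    pad = window // 2
    padded = np.pad(image, pad, mode="edge")       # replicate edges to keep the size
    out = np.empty_like(image, dtype=float)
    h, w = image.shape
    for r in range(h):
        for c in range(w):
            out[r, c] = padded[r:r + window, c:c + window].mean()
    return out

noisy = np.random.default_rng(0).normal(1.0, 0.2, (64, 64))
smoothed = moving_window_smooth(noisy)             # inter-pixel noise is reduced
```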



FIGS. 41A-C illustrate several aspects of displays that may be provided to a surgeon for a visual identification of surface and sub-surface structures of a tissue in a surgical site. FIG. 41A may represent a surface map of the surgical site with color coding to indicate structures located at varying depths below the surface of the surgical site. FIG. 41B depicts an example of one of several horizontal slices through the tissue at varying depths, which may be color coded to indicate depth and further include data associated with differences in tissue surface anomalies (for example, as displayed in a 3D bar graph). FIG. 41C depicts yet another visual display in which surface irregularities as well as Doppler shift flowmetry data may indicate sub-surface vascular structures as well as tissue surface characteristics.



FIG. 42 is a flow chart 2950 of a method for providing information related to a characteristic of a tissue to a smart surgical instrument. An image acquisition system may illuminate 2960 a tissue with a first light beam having a first central frequency and receive 2962 a first reflected light from the tissue illuminated by the first light beam. The image acquisition system may then calculate 2964 a first tissue surface characteristic at a first depth based on the first emitted light beam and the first reflected light from the tissue. The image acquisition system may then illuminate 2966 the tissue with a second light beam having a second central frequency and receive 2968 a second reflected light from the tissue illuminated by the second light beam. The image acquisition system may then calculate 2970 a second tissue surface characteristic at a second depth based on the second emitted light beam and the second reflected light from the tissue. Tissue features that may include a tissue type, a tissue composition, and a tissue surface roughness metric may be determined from the first central light frequency, the second central light frequency, the first reflected light from the tissue, and the second reflected light from the tissue. These tissue characteristics may be used to calculate 2972 one or more parameters related to the function of a smart surgical instrument, such as jaw pressure, power to effect tissue cauterization, or current amplitude and/or frequency to drive a piezoelectric actuator to cut a tissue. In some additional examples, the parameter may be transmitted 2974 either directly or indirectly to the smart surgical instrument, which may modify its operating characteristics in response to the tissue being manipulated.
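
The flow of FIG. 42 can be summarized in code. The sketch below is a schematic of the sequence only; every function, the surface metric, and the parameter mapping are hypothetical placeholders rather than an actual device API.

```python
def acquire_characteristic(illuminate, receive, wavelength):
    """Probe the tissue at one wavelength (steps 2960/2966 and 2962/2968) and
    return a stand-in surface characteristic (steps 2964/2970)."""
    emitted = illuminate(wavelength)
    reflected = receive()
    return reflected / emitted          # assumed metric: reflected/emitted amplitude

def run_pipeline(illuminate, receive, transmit):
    shallow = acquire_characteristic(illuminate, receive, 650e-9)  # red, shallower probe
    deep = acquire_characteristic(illuminate, receive, 450e-9)     # blue, deeper probe
    # Step 2972: map the tissue characteristics to an instrument parameter.
    # The linear mapping below is an illustrative assumption.
    jaw_pressure_n = 10.0 + 5.0 * (1.0 - min(shallow, deep))
    transmit({"jaw_pressure_n": jaw_pressure_n})                   # step 2974

# Demo with dummy hardware callables
run_pipeline(lambda wl: 1.0, lambda: 0.8, print)   # -> {'jaw_pressure_n': 11.0}
```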


Multifocal Minimally Invasive Camera

In a minimally invasive procedure (e.g., a laparoscopic procedure), a surgeon may visualize the surgical site using imaging instruments including a light source and a camera. The imaging instruments may allow the surgeon to visualize the end effector of a surgical device during the procedure. However, the surgeon may need to visualize tissue away from the end effector to prevent unintended damage during the surgery. Such distant tissue may lie outside the field of view of the camera system when focused on the end effector. The imaging instrument may be moved in order to change the field of view of the camera, but it may be difficult to return the camera system back to its original position after being moved.


The surgeon may attempt to move the imaging system within the surgical site to visualize different portions of the site during the procedure. Repositioning of the imaging system is time-consuming, and the surgeon is not guaranteed to visualize the same field of view of the surgical site when the imaging system is returned to its original location.


It is therefore desirable to have a medical imaging visualization system that can provide multiple fields of view of the surgical site without the need to reposition the visualization system. Medical imaging devices include, without limitation, laparoscopes, endoscopes, thoracoscopes, and the like, as described herein. In some aspects, a single display system may display each of the multiple fields of view of the surgical site at about the same time. The display of each of the multiple fields of view may be independently updated depending on a display control system composed of one or more hardware modules, one or more software modules, one or more firmware modules, or any combination or combinations thereof.


Some aspects of the present disclosure further provide for a control circuit configured to control the illumination of a surgical site using one or more illumination sources such as laser light sources and to receive imaging data from one or more image sensors. In some aspects, the control circuit may be configured to control the operation of one or more light sensor modules to adjust a field of view. In some aspects, the present disclosure provides for a non-transitory computer readable medium storing computer readable instructions that, when executed, cause a device to adjust one or more components of the one or more light sensor modules and to process an image from each of the one or more light sensor modules.


An aspect of a minimally invasive image acquisition system may comprise a plurality of illumination sources wherein each illumination source is configured to emit light having a specified central wavelength, a first light sensing element having a first field of view and configured to receive illumination reflected from a first portion of the surgical site when the first portion of the surgical site is illuminated by at least one of the plurality of illumination sources, a second light sensing element having a second field of view and configured to receive illumination reflected from a second portion of the surgical site when the second portion of the surgical site is illuminated by at least one of the plurality of illumination sources, wherein the second field of view overlaps at least a portion of the first field of view; and a computing system.


The computing system may be configured to receive data from the first light sensing element, receive data from the second light sensing element, compute imaging data based on the data received from the first light sensing element and the data received from the second light sensing element, and transmit the imaging data for receipt by a display system.


A variety of surgical visualization systems have been disclosed above. Such systems provide for visualizing tissue and sub-tissue structures that may be encountered during one or more surgical procedures. Non-limiting examples of such systems may include: systems to determine the location and depth of subsurface vascular tissue such as veins and arteries; systems to determine an amount of blood flowing through the subsurface vascular tissue; systems to determine the depth of non-vascular tissue structures; systems to characterize the composition of such non-vascular tissue structures; and systems to characterize one or more surface characteristics of such tissue structures.


It may be recognized that a single surgical visualization system may incorporate components of any one or more of these visualization modalities. FIGS. 22A-D depict some examples of such a surgical visualization system 2108.


As disclosed above, in one non-limiting aspect, a surgical visualization system 2108 may include an imaging control unit 2002 and a hand unit 2020. The hand unit 2020 may include a body 2021, a camera scope cable 2015 attached to the body 2021, and an elongated camera probe 2024. The elongated camera probe 2024 may also terminate at its distal end with at least one window. In some non-limiting examples, a light sensor 2030 may be incorporated in the hand unit 2020, for example, either in the body of the hand unit 2032b or at a distal end 2032a of the elongated camera probe, as depicted in FIG. 22C. The light sensor 2030 may be fabricated using a CMOS sensor array or a CCD sensor array. As illustrated in FIG. 23C, a typical CMOS or CCD sensor array may generate an RGB (red-green-blue) image from light impinging on a mosaic of sensor elements, each sensor element having one of a red, green, or blue optical filter.


Alternatively, the illumination of the surgical site may be cycled among visible illumination sources as depicted in FIG. 30D. In some examples, the illumination sources may include any one or more of a red laser 2360a, a green laser 2360b, or a blue laser 2360c. In some non-limiting examples, a red laser 2360a light source may source illumination having a peak wavelength that may range between 635 nm and 660 nm, inclusive. Non-limiting examples of a red laser peak wavelength may include about 635 nm, about 640 nm, about 645 nm, about 650 nm, about 655 nm, about 660 nm, or any value or range of values therebetween. In some non-limiting examples, a green laser 2360b light source may source illumination having a peak wavelength that may range between 520 nm and 532 nm, inclusive. Non-limiting examples of a green laser peak wavelength may include about 520 nm, about 522 nm, about 524 nm, about 526 nm, about 528 nm, about 530 nm, about 532 nm, or any value or range of values therebetween. In some non-limiting examples, the blue laser 2360c light source may source illumination having a peak wavelength that may range between 405 nm and 445 nm, inclusive. Non-limiting examples of a blue laser peak wavelength may include about 405 nm, about 410 nm, about 415 nm, about 420 nm, about 425 nm, about 430 nm, about 435 nm, about 440 nm, about 445 nm, or any value or range of values therebetween.


Additionally, illumination of the surgical site may be cycled to include non-visible illumination sources that may supply infrared or ultraviolet illumination. In some non-limiting examples, an infrared laser light source may source illumination having a peak wavelength that may range between 750 nm and 3000 nm, inclusive. Non-limiting examples of an infrared laser peak wavelength may include about 750 nm, about 1000 nm, about 1250 nm, about 1500 nm, about 1750 nm, about 2000 nm, about 2250 nm, about 2500 nm, about 2750 nm, about 3000 nm, or any value or range of values therebetween. In some non-limiting examples, an ultraviolet laser light source may source illumination having a peak wavelength that may range between 200 nm and 360 nm, inclusive. Non-limiting examples of an ultraviolet laser peak wavelength may include about 200 nm, about 220 nm, about 240 nm, about 260 nm, about 280 nm, about 300 nm, about 320 nm, about 340 nm, about 360 nm, or any value or range of values therebetween.


The outputs of the sensor array under the different illumination wavelengths may be combined to form the RGB image, for example, if the illumination cycle time is sufficiently fast and the laser light is in the visible range. FIGS. 43A and 43B illustrate a multi-pixel light sensor receiving light reflected by a tissue illuminated, for example, by sequential exposure to red, green, blue, and infrared laser light sources (FIG. 43A) or red, green, blue, and ultraviolet laser light sources (FIG. 43B).
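
When the illumination is cycled quickly enough, the per-wavelength frames can simply be stacked into an RGB image. A minimal sketch, assuming three already-registered monochrome exposures from the same sensor:

```python
import numpy as np

def assemble_rgb(red_frame, green_frame, blue_frame):
    """Stack three sequential monochrome exposures into one H x W x 3 RGB image."""
    return np.stack([red_frame, green_frame, blue_frame], axis=-1)

r = np.ones((4, 4)); g = np.zeros((4, 4)); b = np.zeros((4, 4))
print(assemble_rgb(r, g, b).shape)   # (4, 4, 3): a pure-red test frame
```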



FIG. 44A depicts the distal end of a flexible elongated camera probe 2120 having a flexible camera probe shaft 2122 and a single light sensor module 2124 disposed at the distal end 2123 of the flexible camera probe shaft 2122. In some non-limiting examples, the flexible camera probe shaft 2122 may have an outer diameter of about 5 mm. The outer diameter of the flexible camera probe shaft 2122 may depend on geometric factors that may include, without limitation, the amount of allowable bend in the shaft at the distal end 2123. As depicted in FIG. 44A, the distal end 2123 of the flexible camera probe shaft 2122 may bend about 90° with respect to a longitudinal axis of an un-bent portion of the flexible camera probe shaft 2122 located at a proximal end of the elongated camera probe 2120. It may be recognized that the distal end 2123 of the flexible camera probe shaft 2122 may bend any appropriate amount as may be required for its function. Thus, as non-limiting examples, the distal end 2123 of the flexible camera probe shaft 2122 may bend any amount between about 0° and about 90°. Non-limiting examples of the bend angle of the distal end 2123 of the flexible camera probe shaft 2122 may include about 0°, about 10°, about 20°, about 30°, about 40°, about 50°, about 60°, about 70°, about 80°, about 90°, or any value or range of values therebetween. In some examples, the bend angle of the distal end 2123 of the flexible camera probe shaft 2122 may be set by a surgeon or other health care professional prior to or during a surgical procedure. In some other examples, the bend angle of the distal end 2123 of the flexible camera probe shaft 2122 may be a fixed angle set at a manufacturing site.


The single light sensor module 2124 may receive light reflected from the tissue when illuminated by light emitted by one or more illumination sources 2126 disposed at the distal end of the elongated camera probe. In some examples, the light sensor module 2124 may be a 4 mm sensor module, such as the 4 mm mount 2136b depicted in FIG. 22D. It may be recognized that the light sensor module 2124 may have any appropriate size for its intended function. Thus, the light sensor module 2124 may include a 5.5 mm mount 2136a, a 2.7 mm mount 2136c, or a 2 mm mount 2136d as depicted in FIG. 22D.


It may be recognized that the one or more illumination sources 2126 may include any number of illumination sources 2126 including, without limitation, one illumination source, two illumination sources, three illumination sources, four illumination sources, or more than four illumination sources. It may be further understood that each illumination source may source illumination having any central wavelength including a central red illumination wavelength, a central green illumination wavelength, a central blue illumination wavelength, a central infrared illumination wavelength, a central ultraviolet illumination wavelength, or any other wavelength. In some examples, the one or more illumination sources 2126 may include a white light source, which may illuminate tissue with light having wavelengths that may span the range of optical white light from about 390 nm to about 700 nm.



FIG. 44B depicts the distal end 2133 of an alternative elongated camera probe 2130 having multiple light sensor modules, for example the two light sensor modules 2134a,b, each disposed at the distal end 2133 of the elongated camera probe 2130. In some non-limiting examples, the alternative elongated camera probe 2130 may have an outer diameter of about 7 mm. In some examples, the light sensor modules 2134a,b may each comprise a 4 mm sensor module, similar to light sensor module 2124 in FIG. 44A. Alternatively, each of the light sensor modules 2134a,b may comprise a 5.5 mm light sensor module, a 2.7 mm light sensor module, or a 2 mm light sensor module as depicted in FIG. 22D. In some examples, both light sensor modules 2134a,b may have the same size. In some examples, the light sensor modules 2134a,b may have different sizes. As one non-limiting example, an alternative elongated camera probe 2130 may have a first 4 mm light sensor and two additional 2 mm light sensors. In some aspects, a visualization system may combine the optical outputs from the multiple light sensor modules 2134a,b to form a 3D or quasi-3D image of the surgical site. In some other aspects, the outputs of the multiple light sensor modules 2134a,b may be combined in such a manner as to enhance the optical resolution of the surgical site, which may not be otherwise practical with only a single light sensor module.
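
One common way to form a quasi-3D image from two laterally offset sensor modules is block-matching disparity followed by triangulation. The sketch below illustrates the idea only; the baseline, focal length, patch size, and search range are all assumptions, and a production system would use a calibrated stereo pipeline.

```python
import numpy as np

def disparity_depth(left, right, baseline_mm=3.0, focal_px=400, patch=8):
    """Coarse depth map (mm) from two offset views via block-matching disparity."""
    h, w = left.shape
    depth = np.zeros((h // patch, w // patch))
    for i in range(depth.shape[0]):
        for j in range(depth.shape[1]):
            blk = left[i*patch:(i+1)*patch, j*patch:(j+1)*patch]
            best, best_d = np.inf, 0
            for d in range(min(32, j * patch + 1)):      # search leftward shifts
                cand = right[i*patch:(i+1)*patch, j*patch-d:(j+1)*patch-d]
                err = float(np.sum((blk - cand) ** 2))
                if err < best:
                    best, best_d = err, d
            # Triangulate: depth = baseline * focal_length / disparity
            depth[i, j] = baseline_mm * focal_px / best_d if best_d else 0.0
    return depth

scene = np.random.default_rng(1).random((64, 64))
shifted = np.roll(scene, -4, axis=1)                 # second view, 4 px disparity
print(disparity_depth(scene, shifted)[4, 4])         # ~300 mm for the assumed geometry
```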


Each of the multiple light sensor modules 2134a,b may receive light reflected from the tissue when illuminated by light emitted by one or more illumination sources 2136a,b disposed at the distal end 2133 of the alternative elongated camera probe 2130. In some non-limiting examples, the light emitted by all of the illumination sources 2136a,b may be derived from the same light source (such as a laser). In other non-limiting examples, the illumination sources 2136a surrounding a first light sensor module 2134a may emit light at a first wavelength and the illumination sources 2136b surrounding a second light sensor module 2134b may emit light at a second wavelength. It may be further understood that each illumination source 2136a,b may source illumination having any central wavelength including a central red illumination wavelength, a central green illumination wavelength, a central blue illumination wavelength, a central infrared illumination wavelength, a central ultraviolet illumination wavelength, or any other wavelength. In some examples, the one or more illumination sources 2136a,b may include a white light source, which may illuminate tissue with light having wavelengths that may span the range of optical white light from about 390 nm to about 700 nm.


In some additional aspects, the distal end 2133 of the alternative elongated camera probe 2130 may include one or more working channels 2138. Such working channels 2138 may be in fluid communication with an aspiration port of a device to aspirate material from the surgical site, thereby permitting the removal of material that may potentially obscure the field of view of the light sensor modules 2134a,b. Alternatively, such working channels 2138 may be in fluid communication with a fluid source port of a device to provide a fluid to the surgical site, to flush debris or material away from the surgical site. Such fluids may be used to clear material from the field of view of the light sensor modules 2134a,b.



FIG. 44C depicts a perspective view of an aspect of a monolithic sensor 2160 having a plurality of pixel arrays for producing a three-dimensional image in accordance with the teachings and principles of the disclosure. Such an implementation may be desirable for three-dimensional image capture, wherein the two pixel arrays 2162 and 2164 may be offset during use. In another implementation, a first pixel array 2162 and a second pixel array 2164 may be dedicated to receiving a predetermined range of wavelengths of electromagnetic radiation, wherein the first pixel array 2162 is dedicated to a different range of wavelengths of electromagnetic radiation than the second pixel array 2164.


Additional disclosures regarding a dual sensor array may be found in U.S. Patent Application Publication No. 2014/0267655, titled SUPER RESOLUTION AND COLOR MOTION ARTIFACT CORRECTION IN A PULSED COLOR IMAGING SYSTEM, filed on Mar. 14, 2014, which issued on May 2, 2017 as U.S. Pat. No. 9,641,815, the contents thereof being incorporated by reference herein in its entirety and for all purposes.


In some aspects, a light sensor module may comprise a multi-pixel light sensor such as a CMOS array in addition to one or more additional optical elements such as a lens, a reticle, and a filter.


In some alternative aspects, the one or more light sensors may be located within the body 2021 of the hand unit 2020. Light reflected from the tissue may be acquired at a light receiving surface of one or more optical fibers at the distal end of the elongated camera probe 2024. The one or more optical fibers may conduct the light from the distal end of the elongated camera probe 2024 to the one or more light sensors, or to additional optical elements housed in the body of the hand unit 2020 or in the imaging control unit 2002. The additional optical elements may include, without limitation, one or more dichroic mirrors, one or more reference mirrors, one or more moving mirrors, one or more beam splitters and/or combiners, and one or more optical shutters. In such alternative aspects, the light sensor module may include any one or more of a lens, a reticle, and a filter, disposed at the distal end of the elongated camera probe 2024.


Images obtained from each of the multiple light sensors (for example, 2134a,b) may be combined or processed in several different manners, either in combination or separately, and then displayed in a manner that allows a surgeon to visualize different aspects of the surgical site.


In one non-limiting example, each light sensor may have an independent field of view. In some additional examples, the field of view of a first light sensor may partially or completely overlap the field of view of a second light sensor.


As disclosed above, an imaging system may include a hand unit 2020 having an elongated camera probe 2024 with one or more light sensor modules 2124, 2134a,b disposed at its distal end 2123, 2133. As an example, the elongated camera probe 2024 may have two light sensor modules 2134a,b, although it may be recognized that there may be three, four, five, or more light sensor modules at the distal end of the elongated camera probe 2024. Although FIGS. 45 and 46A-D depict examples of the distal end of an elongated camera probe having two light sensor modules, it may be recognized that the description of the operation of the light sensor modules is not limited to solely two light sensor modules. As depicted in FIGS. 45 and 46A-D, the light sensor modules may include an image sensor, such as a CCD or CMOS sensor that may be composed of an array of light sensing elements (pixels). The light sensor modules may also include additional optical elements, such as lenses. Each lens may be adapted to provide a field of view for the light sensor of the respective light sensor module.



FIG. 45 depicts a generalized view of a distal end 2143 of an elongated camera probe having multiple light sensor modules 2144a,b. Each light sensor module 2144a,b may be composed of a CCD or CMOS sensor and one or more optical elements such as filters, lenses, shutters, and the like. In some aspects, the components of the light sensor modules 2144a,b may be fixed within the elongated camera probe. In some other aspects, one or more of the components of the light sensor modules 2144a,b may be adjustable. For example, the CCD or CMOS sensor of a light sensor module 2144a,b may be mounted on a movable mount to permit automated adjustment of the center 2145a,b of a field of view 2147a,b of the CCD or CMOS sensor. In some other aspects, the CCD or CMOS sensor may be fixed, but a lens in each light sensor module 2144a,b may be adjustable to change the focus. In some aspects, the light sensor modules 2144a,b may include adjustable irises to permit changes in the visual aperture of the sensor modules 2144a,b.


As depicted in FIG. 45, each of the sensor modules 2144a,b may have a field of view 2147a,b with an acceptance angle. In some examples, the acceptance angle of each sensor module 2144a,b may be greater than 90°. In some examples, the acceptance angle may be about 100°. In some examples, the acceptance angle may be about 120°. In some examples, if the sensor modules 2144a,b have an acceptance angle of greater than 90° (for example, 100°), the fields of view 2147a and 2147b may form an overlap region 2150a,b. In some aspects, an optical field of view having an acceptance angle of 100° or greater may be called a "fish-eyed" field of view. A visualization system control system associated with such an elongated camera probe may include computer readable instructions that may permit the display of the overlap region 2150a,b in such a manner that the extreme curvature of the overlapping fish-eyed fields of view is corrected, and a sharpened and flattened image may be displayed. In FIG. 45, the overlap region 2150a may represent a region wherein the overlapping fields of view 2147a,b of the sensor modules 2144a,b have their respective centers 2145a,b directed in a forward direction. However, if any one or more components of the sensor modules 2144a,b is adjustable, it may be recognized that the overlap region 2150b may be directed to any attainable angle within the fields of view 2147a,b of the sensor modules 2144a,b.
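
The distance at which two forward-facing fields of view begin to overlap follows from simple geometry: for sensor modules spaced s apart, each with acceptance half-angle α, the overlap region begins at roughly s/(2·tan α). A sketch with assumed numbers (the sensor spacing and acceptance angle are illustrative, not specified by the disclosure):

```python
import math

def overlap_start_distance(sensor_spacing_mm, acceptance_angle_deg):
    """Distance (mm) at which two parallel, forward-facing fields of view overlap."""
    half_angle = math.radians(acceptance_angle_deg / 2)
    return sensor_spacing_mm / (2 * math.tan(half_angle))

print(overlap_start_distance(3.0, 100.0))   # ~1.26 mm for a 100-degree acceptance angle
```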



FIGS. 46A-D depict a variety of examples of an elongated light probe having two light sensor modules 2144a,b with a variety of fields of view. The elongated light probe may be directed to visualize a surface 2152 of a surgical site.


In FIG. 46A, the first light sensor module 2144a has a first sensor field of view 2147a of a tissue surface 2154a, and the second light sensor module 2144b has a second sensor field of view 2147b of a tissue surface 2154b. As depicted in FIG. 46A, the first field of view 2147a and the second field of view 2147b have approximately the same angle of view. Additionally, the first sensor field of view 2147a is adjacent to but does not overlap the second sensor field of view 2147b. The image received by the first light sensor module 2144a may be displayed separately from the image received by the second light sensor module 2144b, or the images may be combined to form a single image. In some non-limiting examples, the angle of view of a lens associated with the first light sensor module 2144a and the angle of view of a lens associated with the second light sensor module 2144b may be somewhat narrow, and image distortion may not be great at the periphery of their respective images. Therefore, the images may be easily combined edge to edge.


As depicted in FIG. 46B, the first field of view 2147a and the second field of view 2147b have approximately the same angular field of view, and the first sensor field of view 2147a completely overlaps the second sensor field of view 2147b. This may result in a first sensor field of view 2147a of a tissue surface 2154a being identical to the view of a tissue surface 2154b obtained by the second light sensor module 2144b from the second sensor field of view 2147b. This configuration may be useful for applications in which the image from the first light sensor module 2144a may be processed differently than the image from the second light sensor module 2144b. The information in the first image may complement the information in the second image and refer to the same portion of tissue.


As depicted in FIG. 46C, the first field of view 2147a and the second field of view 2147b have approximately the same angular field of view, and the first sensor field of view 2147a partially overlaps the second sensor field of view 2147b. In some non-limiting examples, a lens associated with the first light sensor module 2144a and a lens associated with the second light sensor module 2144b may be wide angle lenses. These lenses may permit the visualization of a wider field of view than that depicted in FIG. 46A. Wide angle lenses are known to have significant optical distortion at their periphery. Appropriate image processing of the images obtained by the first light sensor module 2144a and the second light sensor module 2144b may permit the formation of a combined image in which the central portion of the combined image is corrected for any distortion induced by either the first lens or the second lens. It may be understood that a portion of the first sensor field of view 2147a of a tissue surface 2154a may thus have some distortion due to the wide angle nature of a lens associated with the first light sensor module 2144a and a portion of the second sensor field of view 2147b of a tissue surface 2154b may thus have some distortion due to the wide angle nature of a lens associated with the second light sensor module 2144b. However, a portion of the tissue viewed in the overlap region 2150′ of the two light sensor modules 2144a,b may be corrected for any distortion induced by either of the light sensor modules 2144a,b. The configuration depicted in FIG. 46C may be useful for applications in which it is desired to have a wide field of view of the tissue around a portion of a surgical instrument during a surgical procedure. In some examples, lenses associated with each light sensor module 2144a,b may be independently controllable, thereby controlling the location of the overlap region 2150′ of view within the combined image.
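
Correcting the peripheral distortion of a wide-angle lens before combining overlapping views is often approximated with a radial model. The sketch below applies a single-coefficient inverse radial remap; the model and the k1 value are illustrative assumptions, not the disclosed processing, and a calibrated system would estimate its coefficients per lens.

```python
import numpy as np

def undistort(image, k1=-0.25):
    """Remap a 2D image with a one-coefficient radial distortion model."""
    h, w = image.shape
    cy, cx = (h - 1) / 2, (w - 1) / 2
    out = np.zeros_like(image)
    for r in range(h):
        for c in range(w):
            y, x = (r - cy) / cy, (c - cx) / cx      # normalized output coordinates
            rad2 = x * x + y * y
            xs, ys = x * (1 + k1 * rad2), y * (1 + k1 * rad2)   # source sample point
            sr, sc = int(round(ys * cy + cy)), int(round(xs * cx + cx))
            if 0 <= sr < h and 0 <= sc < w:
                out[r, c] = image[sr, sc]
    return out

corrected = undistort(np.random.default_rng(0).random((64, 64)))
```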


As depicted in FIG. 46D, the first light sensor module 2144a may have a first angular field of view 2147a that is wider than the second angular field of view 2147b of the second light sensor module 2144b. In some non-limiting examples, the second sensor field of view 2147b may be totally disposed within the first sensor field of view 2147a. In alternative examples, the second sensor field of view may lie outside of or tangent to the wide angle field of view 2147a of the first sensor 2144a. A display system that may use the configuration depicted in FIG. 46D may display a wide angle portion of tissue 2154a imaged by the first sensor module 2144a along with a magnified second portion of tissue 2154b imaged by the second sensor module 2144b and located in an overlap region 2150″ of the first field of view 2147a and the second field of view 2147b. This configuration may be useful to present a surgeon with a close-up image of tissue proximate to a surgical instrument (for example, embedded in the second portion of tissue 2154b) and a wide-field image of the tissue surrounding the immediate vicinity of the medical instrument (for example, the proximal first portion of tissue 2154a). In some non-limiting examples, the image presented by the narrower second field of view 2147b of the second light sensor module 2144b may be a surface image of the surgical site. In some additional examples, the image presented in the first wide field view 2147a of the first light sensor module 2144a may include a display based on a hyperspectral analysis of the tissue visualized in the wide field view.



FIGS. 47A-C illustrate an example of the use of an imaging system incorporating the features disclosed in FIG. 46D. FIG. 47A illustrates schematically a proximal view 2170 at the distal end of the elongated camera probe depicting the light sensor arrays 2172a,b of the two light sensor modules 2174a,b. A first light sensor module 2174a may include a wide angle lens, and the second light sensor module 2174b may include a narrow angle lens. In some aspects, the second light sensor module 2174b may have a narrow aperture lens. In other aspects, the second light sensor module 2174b may have a magnifying lens. The tissue may be illuminated by the illumination sources disposed at the distal end of the elongated camera probe. The light sensor arrays 2172′ (either light sensor array 2172a or 2172b, or both 2172a and 2172b) may receive the light reflected from the tissue upon illumination. The tissue may be illuminated by light from a red laser source, a green laser source, a blue laser source, an infrared laser source, and/or an ultraviolet laser source. In some aspects, the light sensor arrays 2172′ may sequentially receive the red laser light 2175a, green laser light 2175b, blue laser light 2175c, infrared laser light 2175d, and the ultraviolet laser light 2175e. The tissue may be illuminated by any combination of such laser sources simultaneously, as depicted in FIGS. 23E and 23F. Alternatively, the illuminating light may be cycled among any combination of such laser sources, as depicted for example in FIG. 23D, and FIGS. 43A and 43B.



FIG. 47B schematically depicts a portion of lung tissue 2180 which may contain a tumor 2182. The tumor 2182 may be in communication with blood vessels including one or more veins 2184 and/or arteries 2186. In some surgical procedures, the blood vessels (veins 2184 and arteries 2186) associated with the tumor 2182 may require resection and/or cauterization prior to the removal of the tumor.



FIG. 47C illustrates the use of a dual imaging system as disclosed above with respect to FIG. 47A. The first light sensor module 2174a may acquire a wide angle image of the tissue surrounding a blood vessel 2187 to be severed with a surgical knife 2190. The wide angle image may permit the surgeon to verify the blood vessel to be severed 2187. In addition, the second light sensor module 2174b may acquire a narrow angle image of the specific blood vessel 2187 to be manipulated. The narrow angle image may show the surgeon the progress of the manipulation of the blood vessel 2187. In this manner, the surgeon is presented with the image of the vascular tissue to be manipulated as well as its environs to assure that the correct blood vessel is being manipulated.



FIGS. 48A and 48B depict another example of the use of a dual imaging system. FIG. 48A depicts a primary surgical display providing an image of a section of a surgical site. The primary surgical display may depict a wide view image 2800 of a section of intestine 2802 along with its vasculature 2804. The wide view image 2800 may include a portion of the surgical field 2809 that may be separately displayed as a magnified view 2810 in a secondary surgical display (FIG. 48B). As disclosed above with respect to surgery to remove a tumor from a lung (FIGS. 47A-C), it may be necessary to dissect blood vessels supplying a tumor 2806 before removing the cancerous tissue. The vasculature 2804 supplying the intestines 2802 is complex and highly ramified. It may be necessary to determine which blood vessels supply the tumor 2806 and to identify blood vessels supplying blood to healthy intestinal tissue. The wide view image 2800 permits a surgeon to determine which blood vessel may supply the tumor 2806. The surgeon may then test a blood vessel using a clamping device 2812 to determine if the blood vessel supplies the tumor 2806 or not.



FIG. 48B depicts a secondary surgical display that may only display a narrow magnified view image 2810 of one portion of the surgical field 2809. The narrow magnified view image 2810 may present a close-up view of the vascular tree 2814 so that the surgeon can focus on dissecting only the blood vessel of interest 2815. For resecting the blood vessel of interest 2815, a surgeon may use a smart RF cautery device 2816. It may be understood that any image obtained by the visualization system may include not only images of the tissue in the surgical site but also images of the surgical instruments inserted therein. In some aspects, such a surgical display (either the primary display in FIG. 48A or the secondary display in FIG. 48B) may also include indicia 2817 related to functions or settings of any surgical device used during the surgical procedure. For example, the indicia 2817 may include a power setting of the smart RF cautery device 2816. In some aspects, such smart medical devices may transmit data related to their operating parameters to the visualization system to incorporate in display data to be transmitted to one or more display devices.



FIGS. 49A-C illustrate examples of a sequence of surgical steps for the removal of an intestinal/colon tumor and which may benefit from the use of multi-image analysis at the surgical site. FIG. 49A depicts a portion of the surgical site, including the intestines 2932 and the ramified vasculature 2934 supplying blood and nutrients to the intestines 2932. The intestines 2932 may have a tumor 2936 surrounded by a tumor margin 2937. A first light sensor module of a visualization system may have a wide field of view 2930, and it may provide imaging data of the wide field of view 2930 to a display system. A second light sensor module of the visualization system may have a narrow or standard field of view 2940, and it may provide imaging data of the narrow field of view 2940 to the display system. In some aspects, the wide field image and the narrow field image may be displayed by the same display device. In another aspect, the wide field image and the narrow field image may be displayed by separate display devices.


During the surgical procedure, it may be important to remove not just the tumor 2936 but also the margin 2937 surrounding it to assure complete removal of the tumor. A wide angle field of view 2930 may be used to image both the vasculature 2934 as well as the section of the intestines 2932 surrounding the tumor 2936 and the margin 2937. As noted above, the vasculature feeding the tumor 2936 and the margin 2937 should be removed, but the vasculature feeding the surrounding intestinal tissue must be preserved to provide oxygen and nutrients to the surrounding tissue. Transection of the vasculature feeding the surrounding colon tissue will remove oxygen and nutrients from the tissue, leading to necrosis. In some examples, laser Doppler imaging of the tissue visualized in the wide angle field 2930 may be analyzed to provide a speckle contrast analysis 2933, indicating the blood flow within the intestinal tissue.
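
Laser speckle contrast analysis such as 2933 typically computes the local contrast K = σ/μ over a small window of the speckle image: static tissue keeps a high-contrast speckle pattern (K near 1), while moving blood blurs the speckle and lowers K. A minimal sketch, with the window size as an assumption:

```python
import numpy as np

def speckle_contrast(image, window=7):
    """Per-pixel speckle contrast K = std/mean over a window x window neighborhood."""
    pad = window // 2
    padded = np.pad(image.astype(float), pad, mode="reflect")
    h, w = image.shape
    k = np.empty((h, w))
    for r in range(h):
        for c in range(w):
            patch = padded[r:r + window, c:c + window]
            k[r, c] = patch.std() / max(patch.mean(), 1e-9)
    return k   # K ~ 1: static tissue; K << 1: perfused (flowing) regions
```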



FIG. 49B illustrates a step during the surgical procedure. The surgeon may be uncertain which part of the vascular tree supplies blood to the tumor 2936. The surgeon may test a blood vessel 2944 to determine if it feeds the tumor 2936 or the healthy tissue. The surgeon may clamp a blood vessel 2944 with a clamping device 2812 and determine the section of the intestinal tissue 2943 that is no longer perfused by means of the speckle contrast analysis. The narrow field of view 2940 displayed on an imaging device may assist the surgeon in the close-up and detailed work required to visualize the single blood vessel 2944 to be tested. When the suspected blood vessel 2944 is clamped, a portion of the intestinal tissue 2943 is determined to lack perfusion based on the Doppler imaging speckle contrast analysis. As depicted in FIG. 49B, the suspected blood vessel 2944 does not supply blood to the tumor 2936 or the tumor margin 2937, and therefore is recognized as a blood vessel to be spared during the surgical procedure.



FIG. 49C depicts a following stage of the surgical procedure. In this stage, a supply blood vessel 2984 has been identified as supplying blood to the margin 2937 of the tumor. When this supply blood vessel 2984 has been severed, blood is no longer supplied to a section of the intestine 2987 that may include at least a portion of the margin 2937 of the tumor 2936. In some aspects, the lack of perfusion to the section 2987 of the intestines may be determined by means of a speckle contrast analysis based on a Doppler analysis of blood flow into the intestines. The non-perfused section 2987 of the intestines may then be isolated by a seal 2985 applied to the intestine. In this manner, only those blood vessels perfusing the tissue indicated for surgical removal may be identified and sealed, thereby sparing healthy tissue from unintended surgical consequences.


In some additional aspects, a surgical visualization system may permit imaging analysis of the surgical site.


In some aspects, the surgical site may be inspected for the effectiveness of surgical manipulation of a tissue. Non-limiting examples of such inspection may include the inspection of surgical staples or welds used to seal tissue at a surgical site. Cone beam coherent tomography using one or more illumination sources may be used for such inspection.


In some additional aspects, an image of a surgical site may have landmarks denoted in the image. In some examples, the landmarks may be determined through image analysis techniques. In some alternative examples, the landmarks may be denoted through a manual intervention of the image by the surgeon.


In some additional aspects, non-smart-ready visualization methods may be imported for use in Hub image fusion techniques.


In additional aspects, instruments that are not integrated in the Hub system may be identified and tracked during their use within the surgical site. In this aspect, computational and/or storage components of the Hub or of any of its components (including, for example, the cloud system) may include a database of images of EES and competitive surgical instruments, so that such alternative instruments are identifiable from one or more images acquired through any image acquisition system or through visual analytics. The imaging analysis of such devices may further permit identification of when an instrument is replaced with a different instrument to do the same or a similar job. The identification of the replacement of an instrument during a surgical procedure may indicate that the instrument was not performing its intended function or that the device failed.


Situational Awareness

Situational awareness is the ability of some aspects of a surgical system to determine or infer information related to a surgical procedure from data received from databases and/or instruments. The information can include the type of procedure being undertaken, the type of tissue being operated on, or the body cavity that is the subject of the procedure. With the contextual information related to the surgical procedure, the surgical system can, for example, improve the manner in which it controls the modular devices (e.g. a robotic arm and/or robotic surgical tool) that are connected to it and provide contextualized information or suggestions to the surgeon during the course of the surgical procedure.


Referring now to FIG. 50, a timeline 5200 depicting situational awareness of a hub, such as the surgical hub 106 or 206, for example, is depicted. The timeline 5200 is an illustrative surgical procedure and the contextual information that the surgical hub 106, 206 can derive from the data received from the data sources at each step in the surgical procedure. The timeline 5200 depicts the typical steps that would be taken by the nurses, surgeons, and other medical personnel during the course of a lung segmentectomy procedure, beginning with setting up the operating theater and ending with transferring the patient to a post-operative recovery room.


The situationally aware surgical hub 106, 206 receives data from the data sources throughout the course of the surgical procedure, including data generated each time medical personnel utilize a modular device that is paired with the surgical hub 106, 206. The surgical hub 106, 206 can receive this data from the paired modular devices and other data sources and continually derive inferences (i.e., contextual information) about the ongoing procedure as new data is received, such as which step of the procedure is being performed at any given time. The situational awareness system of the surgical hub 106, 206 is able to, for example, record data pertaining to the procedure for generating reports, verify the steps being taken by the medical personnel, provide data or prompts (e.g., via a display screen) that may be pertinent for the particular procedural step, adjust modular devices based on the context (e.g., activate monitors, adjust the field of view (FOV) of the medical imaging device, or change the energy level of an ultrasonic surgical instrument or RF electrosurgical instrument), and take any other such action described above.


As the first step 5202 in this illustrative procedure, the hospital staff members retrieve the patient's EMR from the hospital's EMR database. Based on select patient data in the EMR, the surgical hub 106, 206 determines that the procedure to be performed is a thoracic procedure.


Second step 5204, the staff members scan the incoming medical supplies for the procedure. The surgical hub 106, 206 cross-references the scanned supplies with a list of supplies that are utilized in various types of procedures and confirms that the mix of supplies corresponds to a thoracic procedure. Further, the surgical hub 106, 206 is also able to determine that the procedure is not a wedge procedure (because the incoming supplies either lack certain supplies that are necessary for a thoracic wedge procedure or do not otherwise correspond to a thoracic wedge procedure).


Third step 5206, the medical personnel scan the patient band via a scanner that is communicably connected to the surgical hub 106, 206. The surgical hub 106, 206 can then confirm the patient's identity based on the scanned data.


Fourth step 5208, the medical staff turns on the auxiliary equipment. The auxiliary equipment being utilized can vary according to the type of surgical procedure and the techniques to be used by the surgeon, but in this illustrative case they include a smoke evacuator, insufflator, and medical imaging device. When activated, the auxiliary equipment that are modular devices can automatically pair with the surgical hub 106, 206 that is located within a particular vicinity of the modular devices as part of their initialization process. The surgical hub 106, 206 can then derive contextual information about the surgical procedure by detecting the types of modular devices that pair with it during this pre-operative or initialization phase. In this particular example, the surgical hub 106, 206 determines that the surgical procedure is a VATS procedure based on this particular combination of paired modular devices. Based on the combination of the data from the patient's EMR, the list of medical supplies to be used in the procedure, and the type of modular devices that connect to the hub, the surgical hub 106, 206 can generally infer the specific procedure that the surgical team will be performing. Once the surgical hub 106, 206 knows what specific procedure is being performed, the surgical hub 106, 206 can then retrieve the steps of that procedure from a memory or from the cloud and then cross-reference the data it subsequently receives from the connected data sources (e.g., modular devices and patient monitoring devices) to infer what step of the surgical procedure the surgical team is performing.
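
The cross-referencing described in the preceding paragraph can be viewed as matching incoming device events against an ordered list of expected procedure steps. The sketch below illustrates that matching in the simplest form; the event strings and step list are hypothetical placeholders, not part of any hub implementation.

```python
# Ordered (step name, expected device event) pairs for an inferred procedure; assumed
EXPECTED_STEPS = [
    ("collapse_lung", "ventilator:one_lung"),
    ("dissection", "generator:energy_fired"),
    ("ligation", "stapler:fired"),
]

def infer_step(event, last_step_idx):
    """Return the most plausible current step given a new device event,
    scanning forward from the last confirmed step."""
    for idx in range(last_step_idx, len(EXPECTED_STEPS)):
        step, expected_event = EXPECTED_STEPS[idx]
        if event == expected_event:
            return idx, step
    return last_step_idx, EXPECTED_STEPS[last_step_idx][0]   # no advance

print(infer_step("generator:energy_fired", 0))   # -> (1, 'dissection')
```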


Fifth step 5210, the staff members attach the EKG electrodes and other patient monitoring devices to the patient. The EKG electrodes and other patient monitoring devices are able to pair with the surgical hub 106, 206. As the surgical hub 106, 206 begins receiving data from the patient monitoring devices, the surgical hub 106, 206 thus confirms that the patient is in the operating theater.


Sixth step 5212, the medical personnel induce anesthesia in the patient. The surgical hub 106, 206 can infer that the patient is under anesthesia based on data from the modular devices and/or patient monitoring devices, including EKG data, blood pressure data, ventilator data, or combinations thereof, for example. Upon completion of the sixth step 5212, the pre-operative portion of the lung segmentectomy procedure is completed and the operative portion begins.


Seventh step 5214, the patient's lung that is being operated on is collapsed (while ventilation is switched to the contralateral lung). The surgical hub 106, 206 can infer from the ventilator data that the patient's lung has been collapsed, for example. The surgical hub 106, 206 can infer that the operative portion of the procedure has commenced as it can compare the detection of the patient's lung collapsing to the expected steps of the procedure (which can be accessed or retrieved previously) and thereby determine that collapsing the lung is the first operative step in this particular procedure.


Eighth step 5216, the medical imaging device (e.g., a scope) is inserted and video from the medical imaging device is initiated. The surgical hub 106, 206 receives the medical imaging device data (i.e., video or image data) through its connection to the medical imaging device. Upon receipt of the medical imaging device data, the surgical hub 106, 206 can determine that the laparoscopic portion of the surgical procedure has commenced. Further, the surgical hub 106, 206 can determine that the particular procedure being performed is a segmentectomy, as opposed to a lobectomy (note that a wedge procedure has already been discounted by the surgical hub 106, 206 based on data received at the second step 5204 of the procedure). The data from the medical imaging device 124 (FIG. 2) can be utilized to determine contextual information regarding the type of procedure being performed in a number of different ways, including by determining the angle at which the medical imaging device is oriented with respect to the visualization of the patient's anatomy, monitoring the number of medical imaging devices being utilized (i.e., that are activated and paired with the surgical hub 106, 206), and monitoring the types of visualization devices utilized. For example, one technique for performing a VATS lobectomy places the camera in the lower anterior corner of the patient's chest cavity above the diaphragm, whereas one technique for performing a VATS segmentectomy places the camera in an anterior intercostal position relative to the segmental fissure. Using pattern recognition or machine learning techniques, for example, the situational awareness system can be trained to recognize the positioning of the medical imaging device according to the visualization of the patient's anatomy. As another example, one technique for performing a VATS lobectomy utilizes a single medical imaging device, whereas another technique for performing a VATS segmentectomy utilizes multiple cameras. As yet another example, one technique for performing a VATS segmentectomy utilizes an infrared light source (which can be communicably coupled to the surgical hub as part of the visualization system) to visualize the segmental fissure, which is not utilized in a VATS lobectomy. By tracking any or all of this data from the medical imaging device, the surgical hub 106, 206 can thereby determine the specific type of surgical procedure being performed and/or the technique being used for a particular type of surgical procedure.


Ninth step 5218, the surgical team begins the dissection step of the procedure. The surgical hub 106, 206 can infer that the surgeon is in the process of dissecting to mobilize the patient's lung because it receives data from the RF or ultrasonic generator indicating that an energy instrument is being fired. The surgical hub 106, 206 can cross-reference the received data with the retrieved steps of the surgical procedure to determine that an energy instrument being fired at this point in the process (i.e., after the completion of the previously discussed steps of the procedure) corresponds to the dissection step. In certain instances, the energy instrument can be an energy tool mounted to a robotic arm of a robotic surgical system.


Tenth step 5220, the surgical team proceeds to the ligation step of the procedure. The surgical hub 106, 206 can infer that the surgeon is ligating arteries and veins because it receives data from the surgical stapling and cutting instrument indicating that the instrument is being fired. Similarly to the prior step, the surgical hub 106, 206 can derive this inference by cross-referencing the receipt of data from the surgical stapling and cutting instrument with the retrieved steps in the process. In certain instances, the surgical instrument can be a surgical tool mounted to a robotic arm of a robotic surgical system.


Eleventh step 5222, the segmentectomy portion of the procedure is performed. The surgical hub 106, 206 can infer that the surgeon is transecting the parenchyma based on data from the surgical stapling and cutting instrument, including data from its cartridge. The cartridge data can correspond to the size or type of staple being fired by the instrument, for example. As different types of staples are utilized for different types of tissues, the cartridge data can thus indicate the type of tissue being stapled and/or transected. In this case, the type of staple being fired is utilized for parenchyma (or other similar tissue types), which allows the surgical hub 106, 206 to infer that the segmentectomy portion of the procedure is being performed.


Twelfth step 5224, the node dissection step is then performed. The surgical hub 106, 206 can infer that the surgical team is dissecting the node and performing a leak test based on data received from the generator indicating that an RF or ultrasonic instrument is being fired. For this particular procedure, an RF or ultrasonic instrument being utilized after parenchyma was transected corresponds to the node dissection step, which allows the surgical hub 106, 206 to make this inference. It should be noted that surgeons regularly switch back and forth between surgical stapling/cutting instruments and surgical energy (i.e., RF or ultrasonic) instruments depending upon the particular step in the procedure because different instruments are better adapted for particular tasks. Therefore, the particular sequence in which the stapling/cutting instruments and surgical energy instruments are used can indicate what step of the procedure the surgeon is performing. Moreover, in certain instances, robotic tools can be utilized for one or more steps in a surgical procedure and/or handheld surgical instruments can be utilized for one or more steps in the surgical procedure. The surgeon(s) can alternate between robotic tools and handheld surgical instruments and/or can use the devices concurrently, for example. Upon completion of the twelfth step 5224, the incisions are closed up and the post-operative portion of the procedure begins.


Thirteenth step 5226, the patient's anesthesia is reversed. The surgical hub 106, 206 can infer that the patient is emerging from the anesthesia based on the ventilator data (i.e., the patient's breathing rate begins increasing), for example.


Lastly, the fourteenth step 5228 is that the medical personnel remove the various patient monitoring devices from the patient. The surgical hub 106, 206 can thus infer that the patient is being transferred to a recovery room when the hub loses EKG, BP, and other data from the patient monitoring devices. As can be seen from the description of this illustrative procedure, the surgical hub 106, 206 can determine or infer when each step of a given surgical procedure is taking place according to data received from the various data sources that are communicably coupled to the surgical hub 106, 206.


Situational awareness is further described in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is incorporated by reference herein in its entirety. In certain instances, operation of a robotic surgical system, including the various robotic surgical systems disclosed herein, for example, can be controlled by the hub 106, 206 based on its situational awareness and/or feedback from the components thereof and/or based on information from the cloud 102.


Various aspects of the subject matter described herein are set out in the following numbered examples.


Example 1

A surgical image acquisition system comprising: a plurality of illumination sources, wherein each illumination source is configured to emit light having a specified central wavelength; a light sensor configured to receive a portion of the light reflected from a tissue sample when illuminated by one or more of the plurality of illumination sources; and a computing system, wherein the computing system is configured to: receive data from the light sensor when the tissue sample is illuminated by each of the plurality of illumination sources; calculate structural data related to a characteristic of a structure within the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each of the illumination sources; and transmit the structural data related to the characteristic of the structure to be received by a smart surgical device, wherein the characteristic of the structure is a surface characteristic or a structure composition.
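
By way of a non-limiting illustration of the Example 1 data flow, the sketch below assumes a hypothetical sensor readout (read_frame) and uses the variance of reflected intensity as a placeholder surface-roughness estimate; it is not the disclosure's actual algorithm:

```python
# Hypothetical sketch: capture one frame per illumination source, then derive
# a placeholder structural characteristic for a smart surgical device.
import statistics

SOURCES = ["red", "green", "blue"]  # each emitting a specified central wavelength

def read_frame(source):
    # Placeholder light-sensor readout under the given illumination.
    return {"red": [0.61, 0.58, 0.64],
            "green": [0.42, 0.44, 0.41],
            "blue": [0.30, 0.35, 0.28]}[source]

def acquire_structural_data():
    frames = {s: read_frame(s) for s in SOURCES}  # illuminate and capture per source
    # Illustrative stand-in for a surface characteristic: mean intensity spread.
    roughness = statistics.mean(statistics.pstdev(f) for f in frames.values())
    return {"surface_roughness": roughness}

print(acquire_structural_data())  # structural data to transmit to a smart device
```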


Example 2

The surgical image acquisition system of Example 1, wherein the plurality of illumination sources comprises at least one of a red light illumination source, a green light illumination source, and a blue light illumination source.


Example 3

The surgical image acquisition system of any one of Examples 1-2, wherein the plurality of illumination sources comprises at least one of an infrared light illumination source and an ultraviolet light illumination source.


Example 4

The surgical image acquisition system of any one of Examples 1-3, wherein the computing system, configured to calculate structural data related to a characteristic of a structure within the tissue, comprises a computing system configured to calculate structural data related to a composition of a structure within the tissue.


Example 5

The surgical image acquisition system of any one of Examples 1-4, wherein the computing system, configured to calculate structural data related to a characteristic of a structure within the tissue, comprises a computing system configured to calculate structural data related to a surface roughness of a structure within the tissue.


Example 6

A surgical image acquisition system comprising: a processor; and a memory coupled to the processor, the memory storing instructions executable by the processor to: control the operation of a plurality of illumination sources directed at a tissue sample, wherein each illumination source is configured to emit light having a specified central wavelength; receive data from a light sensor when the tissue sample is illuminated by each of the plurality of illumination sources; calculate structural data related to a characteristic of a structure within the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each of the illumination sources; and transmit the structural data related to the characteristic of the structure to be received by a smart surgical device, wherein the characteristic of the structure is a surface characteristic or a structure composition.


Example 7

The surgical image acquisition system of Example 6, wherein the instructions executable by the processor to control the operation of a plurality of illumination sources comprise one or more instructions to illuminate the tissue sample sequentially with each of the plurality of illumination sources.
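
A minimal sketch of the sequential illumination of Example 7, assuming hypothetical driver calls (set_source, capture) and an arbitrary settling delay:

```python
# Hypothetical sketch: enable one illumination source at a time, capture,
# then disable it before moving to the next source.
import time

def set_source(source, on):  # placeholder illumination-driver call
    print(f"{source} {'ON' if on else 'OFF'}")

def capture():  # placeholder light-sensor readout
    return [0.5, 0.5, 0.5]

def sequential_capture(sources, settle_s=0.01):
    frames = {}
    for s in sources:
        set_source(s, True)
        time.sleep(settle_s)  # let the illumination settle before sampling
        frames[s] = capture()
        set_source(s, False)
    return frames

sequential_capture(["red", "green", "blue", "infrared", "ultraviolet"])
```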


Example 8

The surgical image acquisition system of any one of Examples 6-7, wherein the instructions executable by the processor to calculate structural data related to a characteristic of a structure within the tissue sample based on the data received by the light sensor comprise one or more instructions to calculate structural data related to a characteristic of a structure within the tissue sample based on a phase shift in the illumination reflected by the tissue sample.
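
By way of a non-limiting illustration of the phase-shift calculation of Example 8, the sketch below demodulates a reflected signal against the source modulation and reads the phase from the in-phase/quadrature components; the signal values are synthetic:

```python
# Hypothetical sketch: estimate the phase shift of a reflected, modulated
# illumination signal relative to the emitted modulation.
import math

def phase_shift(samples, freq_hz, rate_hz):
    """Phase (radians) of `samples` relative to cos(2*pi*freq_hz*t)."""
    i_sum = q_sum = 0.0
    for n, x in enumerate(samples):
        t = n / rate_hz
        i_sum += x * math.cos(2 * math.pi * freq_hz * t)  # in-phase component
        q_sum += x * math.sin(2 * math.pi * freq_hz * t)  # quadrature component
    return math.atan2(q_sum, i_sum)

rate, f, true_phase = 10_000.0, 100.0, 0.7
reflected = [math.cos(2 * math.pi * f * n / rate - true_phase) for n in range(1000)]
print(round(phase_shift(reflected, f, rate), 3))  # -> 0.7
```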


Example 9

The surgical image acquisition system of any one of Examples 6-8, wherein the structure composition comprises a relative composition of collagen and elastin in a tissue.
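
A minimal two-wavelength unmixing sketch for a relative collagen/elastin estimate as in Example 9; the absorption coefficients below are hypothetical placeholders, not published tissue constants:

```python
# Hypothetical sketch: solve a 2x2 linear system to estimate the relative
# collagen/elastin composition from absorbance at two wavelengths.

# A[i][j]: assumed absorbance of component j (collagen, elastin) at wavelength i.
A = [[0.9, 0.3],
     [0.4, 0.8]]

def relative_composition(measured):
    (a, b), (c, d) = A
    det = a * d - b * c
    m0, m1 = measured
    collagen = (d * m0 - b * m1) / det
    elastin = (a * m1 - c * m0) / det
    total = collagen + elastin
    return collagen / total, elastin / total

print(relative_composition([0.75, 0.60]))  # -> approximately (0.64, 0.36)
```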


Example 10

The surgical image acquisition system of any one of Examples 6-9, wherein the structure composition comprises an amount of hydration of a tissue.


Example 11

A surgical image acquisition system comprising: a control circuit configured to: control the operation of a plurality of illumination sources directed at a tissue sample, wherein each illumination source is configured to emit light having a specified central wavelength; receive data from a light sensor when the tissue sample is illuminated by each of the plurality of illumination sources; calculate structural data related to a characteristic of a structure within the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each of the illumination sources; and transmit the structural data related to the characteristic of the structure to be received by a smart surgical device, wherein the characteristic of the structure is a surface characteristic or a structure composition.


Example 12

The surgical image acquisition system of Example 11, wherein the control circuit is configured to transmit the structural data related to the characteristic of the structure to be received by a smart surgical device, wherein the smart surgical device is a smart surgical stapler.


Example 13

The surgical image acquisition system of Example 12, wherein the control circuit is further configured to transmit data related to an anvil pressure based on the characteristic of the structure to be received by the smart surgical stapler.


Example 14

The surgical image acquisition system of any one of Examples 11-13, wherein the control circuit is configured to transmit the structural data related to the characteristic of the structure to be received by a smart surgical device, wherein the smart surgical device is a smart surgical RF sealing device.


Example 15

The surgical image acquisition system of Example 14, wherein the control circuit is further configured to transmit data related to an amount of RF power based on the characteristic of the structure to be received by the smart RF sealing device.


Example 16

The surgical image acquisition system of any one of Examples 11-15, wherein the control circuit is configured to transmit the structural data related to the characteristic of the structure to be received by a smart surgical device, wherein the smart surgical device is a smart ultrasound cutting device.


Example 17

The surgical image acquisition system of Example 16, wherein the control circuit is further configured to transmit data related to an amount of power provided to an ultrasonic transducer or a driving frequency of the ultrasonic transducer based on the characteristic of the structure to be received by the ultrasound cutting device.
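
By way of a non-limiting illustration spanning Examples 12 through 17, the dispatch sketch below maps a tissue characteristic onto a device-specific operating parameter before transmission; every scaling rule and number is a hypothetical placeholder:

```python
# Hypothetical sketch: translate structural data into the parameter each
# smart device consumes (anvil pressure, RF power, or ultrasonic drive).

def device_parameters(device, structural):
    stiffness = structural.get("relative_collagen", 0.5)  # illustrative proxy
    if device == "smart_stapler":
        return {"anvil_pressure_kpa": 60 + 80 * stiffness}
    if device == "smart_rf_sealer":
        return {"rf_power_w": 30 + 40 * stiffness}
    if device == "smart_ultrasonic_cutter":
        return {"transducer_power_w": 20 + 25 * stiffness,
                "drive_freq_khz": 55.5}
    raise ValueError(f"unknown device: {device}")

print(device_parameters("smart_stapler", {"relative_collagen": 0.64}))
```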


Example 18

A non-transitory computer readable medium storing computer readable instructions which, when executed, cause a machine to: control the operation of a plurality of illumination sources directed at a tissue sample, wherein each illumination source is configured to emit light having a specified central wavelength; receive data from a light sensor when the tissue sample is illuminated by each of the plurality of illumination sources; calculate structural data related to a characteristic of a structure within the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each of the illumination sources; and transmit the structural data related to the characteristic of the structure to be received by a smart surgical device, wherein the characteristic of the structure is a surface characteristic or a structure composition.


While several forms have been illustrated and described, it is not the intention of the applicant to restrict or limit the scope of the appended claims to such detail. Numerous modifications, variations, changes, substitutions, combinations, and equivalents to those forms may be implemented and will occur to those skilled in the art without departing from the scope of the present disclosure. Moreover, the structure of each element associated with the described forms can be alternatively described as a means for providing the function performed by the element. Also, where materials are disclosed for certain components, other materials may be used. It is therefore to be understood that the foregoing description and the appended claims are intended to cover all such modifications, variations, changes, substitutions, combinations, and equivalents as falling within the scope of the disclosed forms.


The foregoing detailed description has set forth various forms of the devices and/or processes via the use of block diagrams, flowcharts, and/or examples. Insofar as such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, and/or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. Those skilled in the art will recognize that some aspects of the forms disclosed herein, in whole or in part, can be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one skilled in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein are capable of being distributed as one or more program products in a variety of forms, and that an illustrative form of the subject matter described herein applies regardless of the particular type of signal bearing medium used to actually carry out the distribution.


Instructions used to program logic to perform various disclosed aspects can be stored within a memory in the system, such as dynamic random access memory (DRAM), cache, flash memory, or other storage. Furthermore, the instructions can be distributed via a network or by way of other computer readable media. Thus a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, floppy diskettes, optical disks, compact disc read-only memory (CD-ROM), magneto-optical disks, read-only memory (ROM), random access memory (RAM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic or optical cards, flash memory, or a tangible, machine-readable storage used in the transmission of information over the Internet via electrical, optical, acoustical or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Accordingly, the non-transitory computer-readable medium includes any type of tangible machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).


As used in any aspect herein, the term “control circuit” may refer to, for example, hardwired circuitry, programmable circuitry (e.g., a computer processor comprising one or more individual instruction processing cores, processing unit, processor, microcontroller, microcontroller unit, controller, digital signal processor (DSP), programmable logic device (PLD), programmable logic array (PLA), or field programmable gate array (FPGA)), state machine circuitry, firmware that stores instructions executed by programmable circuitry, and any combination thereof. The control circuit may, collectively or individually, be embodied as circuitry that forms part of a larger system, for example, an integrated circuit (IC), an application-specific integrated circuit (ASIC), a system on-chip (SoC), desktop computers, laptop computers, tablet computers, servers, smart phones, etc. Accordingly, as used herein “control circuit” includes, but is not limited to, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program which at least partially carries out processes and/or devices described herein), electrical circuitry forming a memory device (e.g., forms of random access memory), and/or electrical circuitry forming a communications device (e.g., a modem, communications switch, or optical-electrical equipment). Those having skill in the art will recognize that the subject matter described herein may be implemented in an analog or digital fashion or some combination thereof.


As used in any aspect herein, the term “logic” may refer to an app, software, firmware and/or circuitry configured to perform any of the aforementioned operations. Software may be embodied as a software package, code, instructions, instruction sets and/or data recorded on non-transitory computer readable storage medium. Firmware may be embodied as code, instructions or instruction sets and/or data that are hard-coded (e.g., nonvolatile) in memory devices.


As used in any aspect herein, the terms “component,” “system,” “module” and the like can refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution.


As used in any aspect herein, an “algorithm” refers to a self-consistent sequence of steps leading to a desired result, where a “step” refers to a manipulation of physical quantities and/or logic states which may, though need not necessarily, take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It is common usage to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. These and similar terms may be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities and/or states.


A network may include a packet switched network. The communication devices may be capable of communicating with each other using a selected packet switched network communications protocol. One example communications protocol may include an Ethernet communications protocol which may be capable of permitting communication using a Transmission Control Protocol/Internet Protocol (TCP/IP). The Ethernet protocol may comply or be compatible with the Ethernet standard published by the Institute of Electrical and Electronics Engineers (IEEE) titled “IEEE 802.3 Standard”, published in December 2008 and/or later versions of this standard. Alternatively or additionally, the communication devices may be capable of communicating with each other using an X.25 communications protocol. The X.25 communications protocol may comply or be compatible with a standard promulgated by the International Telecommunication Union-Telecommunication Standardization Sector (ITU-T). Alternatively or additionally, the communication devices may be capable of communicating with each other using a frame relay communications protocol. The frame relay communications protocol may comply or be compatible with a standard promulgated by the Consultative Committee for International Telegraph and Telephone (CCITT) and/or the American National Standards Institute (ANSI). Alternatively or additionally, the transceivers may be capable of communicating with each other using an Asynchronous Transfer Mode (ATM) communications protocol. The ATM communications protocol may comply or be compatible with an ATM standard published by the ATM Forum titled “ATM-MPLS Network Interworking 2.0” published August 2001, and/or later versions of this standard. Of course, different and/or after-developed connection-oriented network communication protocols are equally contemplated herein.
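
As a non-limiting illustration of one such connection-oriented option, the sketch below sends structural data over a TCP/IP socket; the host, port, and payload shape are hypothetical:

```python
# Hypothetical sketch: transmit structural data to a smart device over TCP/IP.
import json
import socket

def send_structural_data(data, host="192.0.2.10", port=5000):
    payload = json.dumps(data).encode("utf-8")
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(payload)

# Example (requires a listening receiver at the hypothetical address):
# send_structural_data({"surface_roughness": 0.031, "relative_collagen": 0.64})
```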


Unless specifically stated otherwise as apparent from the foregoing disclosure, it is appreciated that, throughout the foregoing disclosure, discussions using terms such as “processing,” “computing,” “calculating,” “determining,” “displaying,” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.


One or more components may be referred to herein as “configured to,” “configurable to,” “operable/operative to,” “adapted/adaptable,” “able to,” “conformable/conformed to,” etc. Those skilled in the art will recognize that “configured to” can generally encompass active-state components and/or inactive-state components and/or standby-state components, unless context requires otherwise.


The terms “proximal” and “distal” are used herein with reference to a clinician manipulating the handle portion of the surgical instrument. The term “proximal” refers to the portion closest to the clinician and the term “distal” refers to the portion located away from the clinician. It will be further appreciated that, for convenience and clarity, spatial terms such as “vertical”, “horizontal”, “up”, and “down” may be used herein with respect to the drawings. However, surgical instruments are used in many orientations and positions, and these terms are not intended to be limiting and/or absolute.


Those skilled in the art will recognize that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as “open” terms (e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to claims containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an” (e.g., “a” and/or “an” should typically be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations.


In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of “two recitations,” without other modifiers, typically means at least two recitations, or two or more recitations). Furthermore, in those instances where a convention analogous to “at least one of A, B, and C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, and C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). In those instances where a convention analogous to “at least one of A, B, or C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, or C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that typically a disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms unless context dictates otherwise. For example, the phrase “A or B” will be typically understood to include the possibilities of “A” or “B” or “A and B.”


With respect to the appended claims, those skilled in the art will appreciate that recited operations therein may generally be performed in any order. Also, although various operational flow diagrams are presented in a sequence(s), it should be understood that the various operations may be performed in other orders than those which are illustrated, or may be performed concurrently. Examples of such alternate orderings may include overlapping, interleaved, interrupted, reordered, incremental, preparatory, supplemental, simultaneous, reverse, or other variant orderings, unless context dictates otherwise. Furthermore, terms like “responsive to,” “related to,” or other past-tense adjectives are generally not intended to exclude such variants, unless context dictates otherwise.


It is worthy to note that any reference to “one aspect,” “an aspect,” “an exemplification,” “one exemplification,” and the like means that a particular feature, structure, or characteristic described in connection with the aspect is included in at least one aspect. Thus, appearances of the phrases “in one aspect,” “in an aspect,” “in an exemplification,” and “in one exemplification” in various places throughout the specification are not necessarily all referring to the same aspect. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner in one or more aspects.


Any patent application, patent, non-patent publication, or other disclosure material referred to in this specification and/or listed in any Application Data Sheet is incorporated by reference herein, to the extent that the incorporated material is not inconsistent herewith. As such, and to the extent necessary, the disclosure as explicitly set forth herein supersedes any conflicting material incorporated herein by reference. Any material, or portion thereof, that is said to be incorporated by reference herein, but which conflicts with existing definitions, statements, or other disclosure material set forth herein, will only be incorporated to the extent that no conflict arises between that incorporated material and the existing disclosure material.


In summary, numerous benefits have been described which result from employing the concepts described herein. The foregoing description of the one or more forms has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise form disclosed. Modifications or variations are possible in light of the above teachings. The one or more forms were chosen and described in order to illustrate principles and practical application, thereby enabling one of ordinary skill in the art to utilize the various forms, with various modifications, as are suited to the particular use contemplated. It is intended that the claims submitted herewith define the overall scope.

Claims
  • 1. A surgical image acquisition system comprising: an imaging device comprising: an illumination source to emit light having a specified central wavelength; and a light sensor to receive a portion of light reflected from a tissue sample; a surgical hub comprising a situational awareness module; and a computing system comprising a processor and a memory coupled to the processor, wherein the memory stores machine executable instructions that, when executed by the processor, cause the processor to: receive, from the imaging device, imaging data based on the light reflected from the tissue sample; calculate tissue refractive index data from the imaging data; calculate structural data related to a characteristic of a structure within the tissue sample based on the tissue refractive index data; and transmit the imaging data to the situational awareness module, wherein the situational awareness module comprises an artificial intelligence module trained to determine a position of the imaging device based on the imaging data, and wherein the characteristic is a surface characteristic or a structure composition.
  • 2. The surgical image acquisition system of claim 1, wherein the illumination source comprises a red light illumination source, a green light illumination source, a blue light illumination source, an infrared light illumination source, or an ultraviolet light illumination source.
  • 3. The surgical image acquisition system of claim 1, wherein the computing system calculates structural data related to a percent tissue hydration.
  • 4. The surgical image acquisition system of claim 1, wherein the computing system calculates structural data related to a surface roughness of a structure within the tissue sample.
  • 5. The surgical image acquisition system of claim 1, wherein the artificial intelligence module is trained to determine an orientation of the imaging device based on the imaging data.
  • 6. A surgical image acquisition system comprising: an imaging device comprising an illumination source to emit light having a specified central wavelength; a surgical hub comprising a situational awareness module; a processor; and a memory coupled to the processor, the memory storing instructions executable by the processor to: control an operation of the illumination source; receive, from a light sensor, imaging data based on light reflected from a tissue sample; calculate tissue refractive index data from the imaging data; calculate structural data related to a characteristic of a structure within the tissue sample based on the tissue refractive index data; and transmit the imaging data to the situational awareness module, wherein the situational awareness module comprises an artificial intelligence module trained to determine a position of the imaging device based on the imaging data, and wherein the characteristic of the structure is a surface characteristic or a structure composition.
  • 7. The surgical image acquisition system of claim 6, wherein the instructions executable by the processor further comprise one or more instructions to calculate the structural data related to the characteristic of the structure within the tissue sample based on a phase shift of the light reflected from the tissue sample.
  • 8. The surgical image acquisition system of claim 6, wherein the structure composition comprises a relative composition of collagen and elastin in the tissue sample.
  • 9. The surgical image acquisition system of claim 6, wherein the structure composition comprises an amount of hydration of the tissue sample.
  • 10. The surgical image acquisition system of claim 6, wherein the artificial intelligence module is trained to determine an orientation of the imaging device based on the imaging data.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of priority under 35 U.S.C. 119(e) to U.S. Provisional Patent Application Ser. No. 62/649,291, titled USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT, filed Mar. 28, 2018, the disclosure of which is herein incorporated by reference in its entirety. This application also claims the benefit of priority under 35 U.S.C. 119(e) to U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, to U.S. Provisional Patent Application Ser. No. 62/611,340, titled CLOUD-BASED MEDICAL ANALYTICS, filed Dec. 28, 2017, and to U.S. Provisional Patent Application Ser. No. 62/611,339, titled ROBOT ASSISTED SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of each of which is herein incorporated by reference in its entirety.

9737301 Baber et al. Aug 2017 B2
9737310 Whitfield et al. Aug 2017 B2
9737335 Butler et al. Aug 2017 B2
9737355 Yates et al. Aug 2017 B2
9737371 Romo et al. Aug 2017 B2
9740826 Raghavan et al. Aug 2017 B2
9743016 Nestares et al. Aug 2017 B2
9743929 Leimbach et al. Aug 2017 B2
9743946 Faller et al. Aug 2017 B2
9743947 Price et al. Aug 2017 B2
9750499 Leimbach et al. Sep 2017 B2
9750500 Malkowski Sep 2017 B2
9750522 Scheib et al. Sep 2017 B2
9750523 Tsubuku Sep 2017 B2
9750560 Ballakur et al. Sep 2017 B2
9750563 Shikhman et al. Sep 2017 B2
9753135 Bosch Sep 2017 B2
9753568 McMillen Sep 2017 B2
9757126 Cappola Sep 2017 B2
9757128 Baber et al. Sep 2017 B2
9757142 Shimizu Sep 2017 B2
9757152 Ogilvie et al. Sep 2017 B2
9763741 Alvarez et al. Sep 2017 B2
9764164 Wiener et al. Sep 2017 B2
9770541 Carr et al. Sep 2017 B2
9775611 Kostrzewski Oct 2017 B2
9775623 Zammataro et al. Oct 2017 B2
9777913 Talbert et al. Oct 2017 B2
9782164 Mumaw et al. Oct 2017 B2
9782169 Kimsey et al. Oct 2017 B2
9782212 Wham et al. Oct 2017 B2
9782214 Houser et al. Oct 2017 B2
9788835 Morgan et al. Oct 2017 B2
9788836 Overmyer et al. Oct 2017 B2
9788851 Dannaher et al. Oct 2017 B2
9788902 Inoue et al. Oct 2017 B2
9788907 Alvi et al. Oct 2017 B1
9795436 Yates et al. Oct 2017 B2
9797486 Zergiebel et al. Oct 2017 B2
9801531 Morita et al. Oct 2017 B2
9801626 Parihar et al. Oct 2017 B2
9801627 Harris et al. Oct 2017 B2
9801679 Trees et al. Oct 2017 B2
9802033 Hibner et al. Oct 2017 B2
9804618 Leimbach et al. Oct 2017 B2
9805472 Chou et al. Oct 2017 B2
9808244 Leimbach et al. Nov 2017 B2
9808245 Richard et al. Nov 2017 B2
9808246 Shelton, IV et al. Nov 2017 B2
9808248 Hoffman Nov 2017 B2
9808249 Shelton, IV Nov 2017 B2
9808305 Hareyama et al. Nov 2017 B2
9814457 Martin et al. Nov 2017 B2
9814460 Kimsey et al. Nov 2017 B2
9814462 Woodard, Jr. et al. Nov 2017 B2
9814463 Williams et al. Nov 2017 B2
9820699 Bingley et al. Nov 2017 B2
9820738 Lytle, IV et al. Nov 2017 B2
9820741 Kostrzewski Nov 2017 B2
9820768 Gee et al. Nov 2017 B2
9826976 Parihar et al. Nov 2017 B2
9826977 Leimbach et al. Nov 2017 B2
9827054 Richmond et al. Nov 2017 B2
9827059 Robinson et al. Nov 2017 B2
9830424 Dixon et al. Nov 2017 B2
9833241 Huitema et al. Dec 2017 B2
9833254 Barral et al. Dec 2017 B1
9839419 Deck et al. Dec 2017 B2
9839424 Zergiebel et al. Dec 2017 B2
9839428 Baxter, III et al. Dec 2017 B2
9839467 Harper et al. Dec 2017 B2
9839470 Gilbert et al. Dec 2017 B2
9839487 Dachs, II Dec 2017 B2
9844321 Ekvall et al. Dec 2017 B1
9844368 Boudreaux et al. Dec 2017 B2
9844369 Huitema et al. Dec 2017 B2
9844374 Lytle, IV et al. Dec 2017 B2
9844375 Overmyer et al. Dec 2017 B2
9844376 Baxter, III et al. Dec 2017 B2
9844379 Shelton, IV et al. Dec 2017 B2
9848058 Johnson et al. Dec 2017 B2
9848877 Shelton, IV et al. Dec 2017 B2
9861354 Saliman et al. Jan 2018 B2
9861363 Chen et al. Jan 2018 B2
9861428 Trees et al. Jan 2018 B2
9864839 Baym et al. Jan 2018 B2
9867612 Parihar et al. Jan 2018 B2
9867651 Wham Jan 2018 B2
9867670 Brannan et al. Jan 2018 B2
9867914 Bonano et al. Jan 2018 B2
9872609 Levy Jan 2018 B2
9872683 Hopkins et al. Jan 2018 B2
9877718 Weir et al. Jan 2018 B2
9877721 Schellin et al. Jan 2018 B2
9883860 Leimbach Feb 2018 B2
9888864 Rondoni et al. Feb 2018 B2
9888914 Martin et al. Feb 2018 B2
9888919 Leimbach et al. Feb 2018 B2
9888921 Williams et al. Feb 2018 B2
9888975 Auld Feb 2018 B2
9895148 Shelton, IV et al. Feb 2018 B2
9900787 Ou Feb 2018 B2
9901342 Shelton, IV et al. Feb 2018 B2
9901406 State et al. Feb 2018 B2
9901411 Gombert et al. Feb 2018 B2
9905000 Chou et al. Feb 2018 B2
9907196 Susini et al. Feb 2018 B2
9907550 Sniffin et al. Mar 2018 B2
9913642 Leimbach et al. Mar 2018 B2
9913645 Zerkle et al. Mar 2018 B2
9918326 Gilson et al. Mar 2018 B2
9918730 Trees et al. Mar 2018 B2
9918778 Walberg et al. Mar 2018 B2
9918788 Paul et al. Mar 2018 B2
9922304 DeBusk et al. Mar 2018 B2
9924941 Burbank Mar 2018 B2
9924944 Shelton, IV et al. Mar 2018 B2
9924961 Shelton, IV et al. Mar 2018 B2
9931040 Homyk et al. Apr 2018 B2
9931118 Shelton, IV et al. Apr 2018 B2
9931124 Gokharu Apr 2018 B2
9936863 Tesar Apr 2018 B2
9936942 Chin et al. Apr 2018 B2
9936955 Miller et al. Apr 2018 B2
9936961 Chien et al. Apr 2018 B2
9937012 Hares et al. Apr 2018 B2
9937014 Bowling et al. Apr 2018 B2
9937626 Rockrohr Apr 2018 B2
9938972 Walley Apr 2018 B2
9943230 Kaku et al. Apr 2018 B2
9943309 Shelton, IV et al. Apr 2018 B2
9943312 Posada et al. Apr 2018 B2
9943377 Yates et al. Apr 2018 B2
9943379 Gregg, II et al. Apr 2018 B2
9943918 Grogan et al. Apr 2018 B2
9943964 Hares Apr 2018 B2
9949785 Price et al. Apr 2018 B2
9962157 Sapre May 2018 B2
9968355 Shelton, IV et al. May 2018 B2
9974595 Anderson et al. May 2018 B2
9976259 Tan et al. May 2018 B2
9980140 Spencer et al. May 2018 B1
9980769 Trees et al. May 2018 B2
9980778 Ohline et al. May 2018 B2
9987000 Shelton, IV et al. Jun 2018 B2
9987068 Anderson et al. Jun 2018 B2
9987072 McPherson Jun 2018 B2
9990856 Kuchenbecker et al. Jun 2018 B2
9993248 Shelton, IV et al. Jun 2018 B2
9993258 Shelton, IV et al. Jun 2018 B2
9993305 Andersson Jun 2018 B2
10004491 Martin et al. Jun 2018 B2
10004497 Overmyer et al. Jun 2018 B2
10004500 Shelton, IV et al. Jun 2018 B2
10004501 Shelton, IV et al. Jun 2018 B2
10004527 Gee et al. Jun 2018 B2
10004557 Gross Jun 2018 B2
D822206 Shelton, IV et al. Jul 2018 S
10010322 Shelton, IV et al. Jul 2018 B2
10010324 Huitema et al. Jul 2018 B2
10013049 Leimbach et al. Jul 2018 B2
10016199 Baber et al. Jul 2018 B2
10016538 Locke et al. Jul 2018 B2
10021318 Hugosson et al. Jul 2018 B2
10022090 Whitman Jul 2018 B2
10022120 Martin et al. Jul 2018 B2
10022391 Ruderman Chen et al. Jul 2018 B2
10022568 Messerly et al. Jul 2018 B2
10028402 Walker Jul 2018 B1
10028744 Shelton, IV et al. Jul 2018 B2
10028761 Leimbach et al. Jul 2018 B2
10028788 Kang Jul 2018 B2
10034704 Asher et al. Jul 2018 B2
10037641 Hyde et al. Jul 2018 B2
10037715 Toly et al. Jul 2018 B2
D826405 Shelton, IV et al. Aug 2018 S
10039546 Williams et al. Aug 2018 B2
10039564 Hibner et al. Aug 2018 B2
10039565 Vezzu Aug 2018 B2
10039589 Virshek et al. Aug 2018 B2
10041822 Zemlok Aug 2018 B2
10044791 Kamen et al. Aug 2018 B2
10045704 Fagin et al. Aug 2018 B2
10045776 Shelton, IV et al. Aug 2018 B2
10045779 Savage et al. Aug 2018 B2
10045781 Cropper et al. Aug 2018 B2
10045782 Murthy Aravalli Aug 2018 B2
10045813 Mueller Aug 2018 B2
10048379 Markendorf et al. Aug 2018 B2
10052044 Shelton, IV et al. Aug 2018 B2
10052102 Baxter, III et al. Aug 2018 B2
10052104 Shelton, IV et al. Aug 2018 B2
10054441 Schorr et al. Aug 2018 B2
10058393 Bonutti et al. Aug 2018 B2
10069633 Gulati et al. Sep 2018 B2
10076326 Yates et al. Sep 2018 B2
10080618 Marshall et al. Sep 2018 B2
10084833 McDonnell et al. Sep 2018 B2
D831209 Huitema et al. Oct 2018 S
10085748 Morgan et al. Oct 2018 B2
10085749 Cappola et al. Oct 2018 B2
10092355 Hannaford et al. Oct 2018 B1
10095942 Mentese et al. Oct 2018 B2
10097578 Baldonado et al. Oct 2018 B2
10098527 Weisenburgh, II et al. Oct 2018 B2
10098635 Burbank Oct 2018 B2
10098642 Baxter, III et al. Oct 2018 B2
10098705 Brisson et al. Oct 2018 B2
10102926 Leonardi Oct 2018 B1
10105140 Malinouskas et al. Oct 2018 B2
10105142 Baxter, III et al. Oct 2018 B2
10105470 Reasoner et al. Oct 2018 B2
10111658 Chowaniec et al. Oct 2018 B2
10111665 Aranyi et al. Oct 2018 B2
10111679 Baber et al. Oct 2018 B2
10111703 Cosman, Jr. et al. Oct 2018 B2
D834541 You et al. Nov 2018 S
10117649 Baxter et al. Nov 2018 B2
10117651 Whitman et al. Nov 2018 B2
10117702 Danziger et al. Nov 2018 B2
10118119 Sappok et al. Nov 2018 B2
10130359 Hess et al. Nov 2018 B2
10130360 Olson et al. Nov 2018 B2
10130361 Yates et al. Nov 2018 B2
10130367 Cappola et al. Nov 2018 B2
10130373 Castro et al. Nov 2018 B2
10130432 Auld et al. Nov 2018 B2
10133248 Fitzsimmons et al. Nov 2018 B2
10135242 Baber et al. Nov 2018 B2
10136246 Yamada Nov 2018 B2
10136887 Shelton, IV et al. Nov 2018 B2
10136891 Shelton, IV et al. Nov 2018 B2
10136949 Felder et al. Nov 2018 B2
10136954 Johnson et al. Nov 2018 B2
10137245 Melker et al. Nov 2018 B2
10143526 Walker et al. Dec 2018 B2
10143948 Bonifas et al. Dec 2018 B2
10147148 Wu et al. Dec 2018 B2
10149680 Parihar et al. Dec 2018 B2
10152789 Carnes et al. Dec 2018 B2
10154841 Weaner et al. Dec 2018 B2
10159044 Hrabak Dec 2018 B2
10159481 Whitman et al. Dec 2018 B2
10159483 Beckman et al. Dec 2018 B2
10164466 Calderoni Dec 2018 B2
10166025 Leimbach et al. Jan 2019 B2
10166061 Berry et al. Jan 2019 B2
10169862 Andre et al. Jan 2019 B2
10172618 Shelton, IV et al. Jan 2019 B2
10172687 Garbus et al. Jan 2019 B2
10175096 Dickerson Jan 2019 B2
10175127 Collins et al. Jan 2019 B2
10178992 Wise et al. Jan 2019 B2
10179413 Rockrohr Jan 2019 B2
10180463 Beckman et al. Jan 2019 B2
10182814 Okoniewski Jan 2019 B2
10182816 Shelton, IV et al. Jan 2019 B2
10182818 Hensel et al. Jan 2019 B2
10187742 Dor et al. Jan 2019 B2
10188385 Kerr et al. Jan 2019 B2
10189157 Schlegel et al. Jan 2019 B2
10190888 Hryb et al. Jan 2019 B2
10194891 Jeong et al. Feb 2019 B2
10194907 Marczyk et al. Feb 2019 B2
10194913 Nalagatla et al. Feb 2019 B2
10194972 Yates et al. Feb 2019 B2
10197803 Badiali et al. Feb 2019 B2
10198965 Hart Feb 2019 B2
10201311 Chou et al. Feb 2019 B2
10201349 Leimbach et al. Feb 2019 B2
10201364 Leimbach et al. Feb 2019 B2
10201365 Boudreaux et al. Feb 2019 B2
10205708 Fletcher et al. Feb 2019 B1
10206605 Shelton, IV et al. Feb 2019 B2
10206752 Hares et al. Feb 2019 B2
10213201 Shelton, IV et al. Feb 2019 B2
10213203 Swayze et al. Feb 2019 B2
10213266 Zemlok et al. Feb 2019 B2
10213268 Dachs, II Feb 2019 B2
10219491 Stiles, Jr. et al. Mar 2019 B2
10220522 Rockrohr Mar 2019 B2
10222750 Bang et al. Mar 2019 B2
10226249 Jaworek et al. Mar 2019 B2
10226250 Beckman et al. Mar 2019 B2
10226254 Cabrera et al. Mar 2019 B2
10226302 Lacal et al. Mar 2019 B2
10231634 Zand et al. Mar 2019 B2
10231733 Ehrenfels et al. Mar 2019 B2
10231775 Shelton, IV et al. Mar 2019 B2
10238413 Hibner et al. Mar 2019 B2
10245027 Shelton, IV et al. Apr 2019 B2
10245028 Shelton, IV et al. Apr 2019 B2
10245029 Hunter et al. Apr 2019 B2
10245030 Hunter et al. Apr 2019 B2
10245033 Overmyer et al. Apr 2019 B2
10245037 Conklin et al. Apr 2019 B2
10245038 Hopkins et al. Apr 2019 B2
10245040 Milliman Apr 2019 B2
10251661 Collings et al. Apr 2019 B2
10251725 Valentine et al. Apr 2019 B2
10255995 Ingmanson Apr 2019 B2
10258331 Shelton, IV et al. Apr 2019 B2
10258359 Kapadia Apr 2019 B2
10258362 Conlon Apr 2019 B2
10258363 Worrell et al. Apr 2019 B2
10258415 Harrah et al. Apr 2019 B2
10258418 Shelton, IV et al. Apr 2019 B2
10258425 Mustufa et al. Apr 2019 B2
10263171 Wiener et al. Apr 2019 B2
10265004 Yamaguchi et al. Apr 2019 B2
10265035 Fehre et al. Apr 2019 B2
10265066 Measamer et al. Apr 2019 B2
10265068 Harris et al. Apr 2019 B2
10265072 Shelton, IV et al. Apr 2019 B2
10265090 Ingmanson et al. Apr 2019 B2
10265130 Hess et al. Apr 2019 B2
10271840 Sapre Apr 2019 B2
10271844 Valentine et al. Apr 2019 B2
10271846 Shelton, IV et al. Apr 2019 B2
10271850 Williams Apr 2019 B2
10271851 Shelton, IV et al. Apr 2019 B2
D847989 Shelton, IV et al. May 2019 S
10278698 Racenet May 2019 B2
10278778 State et al. May 2019 B2
10282963 Fahey May 2019 B2
10283220 Azizian et al. May 2019 B2
10285694 Viola et al. May 2019 B2
10285698 Cappola et al. May 2019 B2
10285700 Scheib May 2019 B2
10285705 Shelton, IV et al. May 2019 B2
10292610 Srivastava May 2019 B2
10292704 Harris et al. May 2019 B2
10292707 Shelton, IV et al. May 2019 B2
10292758 Boudreaux et al. May 2019 B2
10292769 Yu May 2019 B1
10292771 Wood et al. May 2019 B2
10293129 Fox et al. May 2019 B2
10299792 Huitema et al. May 2019 B2
10299868 Tsuboi et al. May 2019 B2
10299870 Connolly et al. May 2019 B2
10305926 Mihan et al. May 2019 B2
D850617 Shelton, IV et al. Jun 2019 S
10307159 Harris et al. Jun 2019 B2
10307170 Parfett et al. Jun 2019 B2
10307199 Farritor et al. Jun 2019 B2
10311036 Hussam et al. Jun 2019 B1
10313137 Aarnio et al. Jun 2019 B2
10314577 Laurent et al. Jun 2019 B2
10314582 Shelton, IV et al. Jun 2019 B2
10321907 Shelton, IV et al. Jun 2019 B2
10321964 Grover et al. Jun 2019 B2
10327764 Harris et al. Jun 2019 B2
10327779 Richard et al. Jun 2019 B2
10335042 Schoenle et al. Jul 2019 B2
10335147 Rector et al. Jul 2019 B2
10335149 Baxter, III et al. Jul 2019 B2
10335180 Johnson et al. Jul 2019 B2
10335227 Heard Jul 2019 B2
10339496 Matson et al. Jul 2019 B2
10342543 Shelton, IV et al. Jul 2019 B2
10342602 Strobl et al. Jul 2019 B2
10342623 Huelman et al. Jul 2019 B2
10343102 Reasoner et al. Jul 2019 B2
10349824 Claude et al. Jul 2019 B2
10349939 Shelton, IV et al. Jul 2019 B2
10349941 Marczyk et al. Jul 2019 B2
10350016 Burbank et al. Jul 2019 B2
10357184 Crawford et al. Jul 2019 B2
10357246 Shelton, IV et al. Jul 2019 B2
10357247 Shelton, IV et al. Jul 2019 B2
10362179 Harris Jul 2019 B2
10363032 Scheib et al. Jul 2019 B2
10363037 Aronhalt et al. Jul 2019 B2
10368861 Baxter, III et al. Aug 2019 B2
10368865 Harris et al. Aug 2019 B2
10368867 Harris et al. Aug 2019 B2
10368876 Bhatnagar et al. Aug 2019 B2
10368894 Madan et al. Aug 2019 B2
10368903 Morales et al. Aug 2019 B2
10376263 Morgan et al. Aug 2019 B2
10376305 Yates et al. Aug 2019 B2
10376337 Kilroy et al. Aug 2019 B2
10376338 Taylor et al. Aug 2019 B2
10378893 Mankovskii Aug 2019 B2
10383518 Abu-Tarif et al. Aug 2019 B2
10383699 Kilroy et al. Aug 2019 B2
10384021 Koeth et al. Aug 2019 B2
10386990 Shikhman et al. Aug 2019 B2
10390718 Chen et al. Aug 2019 B2
10390794 Kuroiwa et al. Aug 2019 B2
10390825 Shelton, IV et al. Aug 2019 B2
10390831 Holsten et al. Aug 2019 B2
10390895 Henderson et al. Aug 2019 B2
10398348 Osadchy et al. Sep 2019 B2
10398434 Shelton, IV et al. Sep 2019 B2
10398517 Eckert et al. Sep 2019 B2
10398521 Itkowitz et al. Sep 2019 B2
10404521 McChord et al. Sep 2019 B2
10404801 Martch Sep 2019 B2
10405857 Shelton, IV et al. Sep 2019 B2
10405859 Harris et al. Sep 2019 B2
10405863 Wise et al. Sep 2019 B2
10413291 Worthington et al. Sep 2019 B2
10413293 Shelton, IV et al. Sep 2019 B2
10413297 Harris et al. Sep 2019 B2
10417446 Takeyama Sep 2019 B2
10420552 Shelton, IV et al. Sep 2019 B2
10420558 Nalagatla et al. Sep 2019 B2
10420559 Marczyk et al. Sep 2019 B2
10420620 Rockrohr Sep 2019 B2
10420865 Reasoner et al. Sep 2019 B2
10422727 Pliskin Sep 2019 B2
10426466 Contini et al. Oct 2019 B2
10426467 Miller et al. Oct 2019 B2
10426468 Contini et al. Oct 2019 B2
10426471 Shelton, IV et al. Oct 2019 B2
10426481 Aronhalt et al. Oct 2019 B2
10433837 Worthington et al. Oct 2019 B2
10433844 Shelton, IV et al. Oct 2019 B2
10433849 Shelton, IV et al. Oct 2019 B2
10433918 Shelton, IV et al. Oct 2019 B2
10441279 Shelton, IV et al. Oct 2019 B2
10441281 Shelton, IV et al. Oct 2019 B2
10441344 Notz et al. Oct 2019 B2
10441345 Aldridge et al. Oct 2019 B2
10448948 Shelton, IV et al. Oct 2019 B2
10448950 Shelton, IV et al. Oct 2019 B2
10456137 Vendely et al. Oct 2019 B2
10456140 Shelton, IV et al. Oct 2019 B2
10456193 Yates et al. Oct 2019 B2
10463365 Williams Nov 2019 B2
10463367 Kostrzewski et al. Nov 2019 B2
10463371 Kostrzewski Nov 2019 B2
10463436 Jackson et al. Nov 2019 B2
10470684 Toth et al. Nov 2019 B2
10470762 Leimbach et al. Nov 2019 B2
10470764 Baxter, III et al. Nov 2019 B2
10470768 Harris et al. Nov 2019 B2
10470791 Houser Nov 2019 B2
10471254 Sano et al. Nov 2019 B2
10478181 Shelton, IV et al. Nov 2019 B2
10478182 Taylor Nov 2019 B2
10478185 Nicholas Nov 2019 B2
10478189 Bear et al. Nov 2019 B2
10478190 Miller et al. Nov 2019 B2
10478544 Friederichs et al. Nov 2019 B2
10485450 Gupta et al. Nov 2019 B2
10485542 Shelton, IV et al. Nov 2019 B2
10485543 Shelton, IV et al. Nov 2019 B2
10492783 Shelton, IV et al. Dec 2019 B2
10492784 Beardsley et al. Dec 2019 B2
10492785 Overmyer et al. Dec 2019 B2
10496788 Amarasingham et al. Dec 2019 B2
10498269 Zemlok et al. Dec 2019 B2
10499847 Latimer et al. Dec 2019 B2
10499891 Chaplin et al. Dec 2019 B2
10499914 Huang et al. Dec 2019 B2
10499915 Aranyi Dec 2019 B2
10499994 Luks et al. Dec 2019 B2
10507068 Kopp et al. Dec 2019 B2
10507278 Gao et al. Dec 2019 B2
10510267 Jarc et al. Dec 2019 B2
10512413 Schepis et al. Dec 2019 B2
10512461 Gupta et al. Dec 2019 B2
10512499 McHenry et al. Dec 2019 B2
10512509 Bowling et al. Dec 2019 B2
10512514 Nowlin et al. Dec 2019 B2
10517588 Gupta et al. Dec 2019 B2
10517595 Hunter et al. Dec 2019 B2
10517596 Hunter et al. Dec 2019 B2
10517686 Vokrot et al. Dec 2019 B2
10524789 Swayze et al. Jan 2020 B2
10531579 Hsiao et al. Jan 2020 B2
10531874 Morgan et al. Jan 2020 B2
10531929 Widenhouse et al. Jan 2020 B2
10532330 Diallo et al. Jan 2020 B2
10536617 Liang et al. Jan 2020 B2
10537324 Shelton, IV et al. Jan 2020 B2
10537325 Bakos et al. Jan 2020 B2
10537351 Shelton, IV et al. Jan 2020 B2
10537396 Zingaretti et al. Jan 2020 B2
10542978 Chowaniec et al. Jan 2020 B2
10542979 Shelton, IV et al. Jan 2020 B2
10542982 Beckman et al. Jan 2020 B2
10542991 Shelton, IV et al. Jan 2020 B2
D876466 Kobayashi et al. Feb 2020 S
10548504 Shelton, IV et al. Feb 2020 B2
10548612 Martinez et al. Feb 2020 B2
10548673 Harris et al. Feb 2020 B2
10552574 Sweeney Feb 2020 B2
10555675 Satish et al. Feb 2020 B2
10555748 Yates et al. Feb 2020 B2
10555750 Conlon et al. Feb 2020 B2
10555769 Worrell et al. Feb 2020 B2
10561349 Wedekind et al. Feb 2020 B2
10561422 Schellin et al. Feb 2020 B2
10561470 Hourtash et al. Feb 2020 B2
10561471 Nichogi Feb 2020 B2
10561560 Boutoussov et al. Feb 2020 B2
10561753 Thompson et al. Feb 2020 B2
10565170 Walling et al. Feb 2020 B2
10568625 Harris et al. Feb 2020 B2
10568626 Shelton, IV et al. Feb 2020 B2
10568632 Miller et al. Feb 2020 B2
10568704 Savall et al. Feb 2020 B2
10575868 Hall et al. Mar 2020 B2
10582928 Hunter et al. Mar 2020 B2
10582931 Mujawar Mar 2020 B2
10582962 Friedrichs et al. Mar 2020 B2
10582964 Weinberg et al. Mar 2020 B2
10586074 Rose et al. Mar 2020 B2
10588623 Schmid et al. Mar 2020 B2
10588625 Weaner et al. Mar 2020 B2
10588629 Malinouskas et al. Mar 2020 B2
10588630 Shelton, IV et al. Mar 2020 B2
10588631 Shelton, IV et al. Mar 2020 B2
10588632 Shelton, IV et al. Mar 2020 B2
10588711 DiCarlo et al. Mar 2020 B2
10592067 Merdan et al. Mar 2020 B2
10595844 Nawana et al. Mar 2020 B2
10595882 Parfett et al. Mar 2020 B2
10595887 Shelton, IV et al. Mar 2020 B2
10595930 Scheib et al. Mar 2020 B2
10595952 Forrest et al. Mar 2020 B2
10602007 Takano Mar 2020 B2
10602848 Magana Mar 2020 B2
10603036 Hunter et al. Mar 2020 B2
10603128 Zergiebel et al. Mar 2020 B2
10610223 Wellman et al. Apr 2020 B2
10610224 Shelton, IV et al. Apr 2020 B2
10610286 Wiener et al. Apr 2020 B2
10610313 Bailey et al. Apr 2020 B2
10617412 Shelton, IV et al. Apr 2020 B2
10617413 Shelton, IV et al. Apr 2020 B2
10617414 Shelton, IV et al. Apr 2020 B2
10617482 Houser et al. Apr 2020 B2
10617484 Kilroy et al. Apr 2020 B2
10624635 Harris et al. Apr 2020 B2
10624667 Faller et al. Apr 2020 B2
10624691 Wiener et al. Apr 2020 B2
10631423 Collins et al. Apr 2020 B2
10631858 Burbank Apr 2020 B2
10631912 McFarlin et al. Apr 2020 B2
10631916 Horner et al. Apr 2020 B2
10631917 Ineson Apr 2020 B2
10631939 Dachs, II et al. Apr 2020 B2
10639027 Shelton, IV et al. May 2020 B2
10639034 Harris et al. May 2020 B2
10639035 Shelton, IV et al. May 2020 B2
10639036 Yates et al. May 2020 B2
10639037 Shelton, IV et al. May 2020 B2
10639039 Vendely et al. May 2020 B2
10639098 Cosman et al. May 2020 B2
10639111 Kopp May 2020 B2
10639185 Agrawal et al. May 2020 B2
10653413 Worthington et al. May 2020 B2
10653476 Ross May 2020 B2
10653489 Kopp May 2020 B2
10656720 Holz May 2020 B1
10660705 Piron et al. May 2020 B2
10667809 Bakos et al. Jun 2020 B2
10667810 Shelton, IV et al. Jun 2020 B2
10667811 Harris et al. Jun 2020 B2
10667877 Kapadia Jun 2020 B2
10674897 Levy Jun 2020 B2
10675021 Harris et al. Jun 2020 B2
10675023 Cappola Jun 2020 B2
10675024 Shelton, IV et al. Jun 2020 B2
10675025 Swayze et al. Jun 2020 B2
10675026 Harris et al. Jun 2020 B2
10675035 Zingman Jun 2020 B2
10675100 Frushour Jun 2020 B2
10675104 Kapadia Jun 2020 B2
10677764 Ross et al. Jun 2020 B2
10679758 Fox et al. Jun 2020 B2
10682136 Harris et al. Jun 2020 B2
10682138 Shelton, IV et al. Jun 2020 B2
10686805 Reybok, Jr. et al. Jun 2020 B2
10687806 Shelton, IV et al. Jun 2020 B2
10687809 Shelton, IV et al. Jun 2020 B2
10687810 Shelton, IV et al. Jun 2020 B2
10687884 Wiener et al. Jun 2020 B2
10687905 Kostrzewski Jun 2020 B2
10695055 Shelton, IV et al. Jun 2020 B2
10695081 Shelton, IV et al. Jun 2020 B2
10695134 Barral et al. Jun 2020 B2
10702270 Shelton, IV et al. Jul 2020 B2
10702271 Aranyi et al. Jul 2020 B2
10709446 Harris et al. Jul 2020 B2
10716473 Greiner Jul 2020 B2
10716489 Kalvoy et al. Jul 2020 B2
10716583 Look et al. Jul 2020 B2
10716615 Shelton, IV et al. Jul 2020 B2
10716639 Kapadia et al. Jul 2020 B2
10717194 Griffiths et al. Jul 2020 B2
10722222 Aranyi Jul 2020 B2
10722233 Wellman Jul 2020 B2
10722292 Arya et al. Jul 2020 B2
D893717 Messerly et al. Aug 2020 S
10729458 Stoddard et al. Aug 2020 B2
10729509 Shelton, IV et al. Aug 2020 B2
10733267 Pedersen Aug 2020 B2
10736219 Seow et al. Aug 2020 B2
10736498 Watanabe et al. Aug 2020 B2
10736616 Scheib et al. Aug 2020 B2
10736628 Yates et al. Aug 2020 B2
10736629 Shelton, IV et al. Aug 2020 B2
10736636 Baxter, III et al. Aug 2020 B2
10736705 Scheib et al. Aug 2020 B2
10743872 Leimbach et al. Aug 2020 B2
10748115 Laster et al. Aug 2020 B2
10751052 Stokes et al. Aug 2020 B2
10751136 Farritor et al. Aug 2020 B2
10751239 Volek et al. Aug 2020 B2
10751768 Hersey et al. Aug 2020 B2
10755813 Shelton, IV et al. Aug 2020 B2
D896379 Shelton, IV et al. Sep 2020 S
10758229 Shelton, IV et al. Sep 2020 B2
10758230 Shelton, IV et al. Sep 2020 B2
10758294 Jones Sep 2020 B2
10758310 Shelton, IV et al. Sep 2020 B2
10765376 Brown, III et al. Sep 2020 B2
10765424 Baxter, III et al. Sep 2020 B2
10765427 Shelton, IV et al. Sep 2020 B2
10765470 Yates et al. Sep 2020 B2
10772630 Wixey Sep 2020 B2
10772651 Shelton, IV et al. Sep 2020 B2
10772673 Allen, IV et al. Sep 2020 B2
10772688 Peine et al. Sep 2020 B2
10779818 Zemlok et al. Sep 2020 B2
10779821 Harris et al. Sep 2020 B2
10779823 Shelton, IV et al. Sep 2020 B2
10779897 Rockrohr Sep 2020 B2
10779900 Pedros et al. Sep 2020 B2
10783634 Nye et al. Sep 2020 B2
10786298 Johnson Sep 2020 B2
10786317 Zhou et al. Sep 2020 B2
10786327 Anderson et al. Sep 2020 B2
10792038 Becerra et al. Oct 2020 B2
10792118 Prpa et al. Oct 2020 B2
10792422 Douglas et al. Oct 2020 B2
10799304 Kapadia et al. Oct 2020 B2
10803977 Sanmugalingham Oct 2020 B2
10806445 Penna et al. Oct 2020 B2
10806453 Chen et al. Oct 2020 B2
10806454 Kopp Oct 2020 B2
10806499 Castaneda et al. Oct 2020 B2
10806506 Gaspredes et al. Oct 2020 B2
10806532 Grubbs et al. Oct 2020 B2
10811131 Schneider et al. Oct 2020 B2
10813638 Shelton, IV et al. Oct 2020 B2
10813703 Swayze et al. Oct 2020 B2
10818383 Sharifi Sedeh et al. Oct 2020 B2
10828028 Harris et al. Nov 2020 B2
10828030 Weir et al. Nov 2020 B2
10835206 Bell et al. Nov 2020 B2
10835245 Swayze et al. Nov 2020 B2
10835246 Shelton, IV et al. Nov 2020 B2
10835247 Shelton, IV et al. Nov 2020 B2
10842473 Scheib et al. Nov 2020 B2
10842490 DiNardo et al. Nov 2020 B2
10842492 Shelton, IV et al. Nov 2020 B2
10842522 Messerly et al. Nov 2020 B2
10842523 Shelton, IV et al. Nov 2020 B2
10842575 Panescu et al. Nov 2020 B2
10842897 Schwartz et al. Nov 2020 B2
D904612 Wynn et al. Dec 2020 S
10849697 Yates et al. Dec 2020 B2
10849700 Kopp et al. Dec 2020 B2
10856768 Osadchy et al. Dec 2020 B2
10856867 Shelton, IV et al. Dec 2020 B2
10856868 Shelton, IV et al. Dec 2020 B2
10856870 Harris et al. Dec 2020 B2
10863984 Shelton, IV et al. Dec 2020 B2
10864037 Mun et al. Dec 2020 B2
10864050 Tabandeh et al. Dec 2020 B2
10872684 McNutt et al. Dec 2020 B2
10874396 Moore et al. Dec 2020 B2
10881399 Shelton, IV et al. Jan 2021 B2
10881401 Baber et al. Jan 2021 B2
10881446 Strobl Jan 2021 B2
10881464 Odermatt et al. Jan 2021 B2
10888321 Shelton, IV et al. Jan 2021 B2
10888322 Morgan et al. Jan 2021 B2
10892899 Shelton, IV et al. Jan 2021 B2
10892995 Shelton, IV et al. Jan 2021 B2
10893863 Shelton, IV et al. Jan 2021 B2
10893864 Harris et al. Jan 2021 B2
10893884 Stoddard et al. Jan 2021 B2
10898105 Weprin et al. Jan 2021 B2
10898183 Shelton, IV et al. Jan 2021 B2
10898186 Bakos et al. Jan 2021 B2
10898189 McDonald, II Jan 2021 B2
10898256 Yates et al. Jan 2021 B2
10898280 Kopp Jan 2021 B2
10898622 Shelton, IV et al. Jan 2021 B2
10902944 Casey et al. Jan 2021 B1
10903685 Yates et al. Jan 2021 B2
10905415 DiNardo et al. Feb 2021 B2
10905418 Shelton, IV et al. Feb 2021 B2
10905420 Jasemian et al. Feb 2021 B2
10912559 Harris et al. Feb 2021 B2
10912567 Shelton, IV et al. Feb 2021 B2
10912580 Green et al. Feb 2021 B2
10912619 Jarc et al. Feb 2021 B2
10916415 Karancsi et al. Feb 2021 B2
D914878 Shelton, IV et al. Mar 2021 S
10932784 Mozdzierz et al. Mar 2021 B2
10950982 Regnier et al. Mar 2021 B2
10952732 Binmoeller et al. Mar 2021 B2
10962449 Unuma et al. Mar 2021 B2
10966590 Takahashi et al. Apr 2021 B2
10980595 Wham Apr 2021 B2
11000276 Shelton, IV et al. May 2021 B2
11020115 Scheib et al. Jun 2021 B2
11051817 Shelton, IV et al. Jul 2021 B2
11051902 Kruecker et al. Jul 2021 B2
11058501 Tokarchuk et al. Jul 2021 B2
11071595 Johnson et al. Jul 2021 B2
11103246 Marczyk et al. Aug 2021 B2
11141213 Yates et al. Oct 2021 B2
11183293 Lu et al. Nov 2021 B2
11185325 Shelton, IV et al. Nov 2021 B2
11197731 Hoffman et al. Dec 2021 B2
11218822 Morgan et al. Jan 2022 B2
11273290 Kowshik Mar 2022 B2
11289188 Mabotuwana et al. Mar 2022 B2
D950728 Bakos et al. May 2022 S
D952144 Boudreaux May 2022 S
11322248 Grantcharov et al. May 2022 B2
11350932 Shelton, IV et al. Jun 2022 B2
11373755 Shelton, IV et al. Jun 2022 B2
11376098 Shelton, IV et al. Jul 2022 B2
11382715 Arai et al. Jul 2022 B2
D964564 Boudreaux Sep 2022 S
11464514 Yates et al. Oct 2022 B2
11464971 Schepis et al. Oct 2022 B2
11504191 Mccloud et al. Nov 2022 B2
11571212 Yates et al. Feb 2023 B2
20010056237 Cane Dec 2001 A1
20020049551 Friedman et al. Apr 2002 A1
20020052616 Wiener et al. May 2002 A1
20020072746 Lingenfelder et al. Jun 2002 A1
20020138642 Miyazawa et al. Sep 2002 A1
20020144147 Basson et al. Oct 2002 A1
20020169584 Fu et al. Nov 2002 A1
20020194023 Turley et al. Dec 2002 A1
20030009111 Cory et al. Jan 2003 A1
20030009154 Whitman Jan 2003 A1
20030018329 Hooven Jan 2003 A1
20030028183 Sanchez et al. Feb 2003 A1
20030046109 Uchikubo Mar 2003 A1
20030050654 Whitman et al. Mar 2003 A1
20030069573 Kadhiresan et al. Apr 2003 A1
20030093503 Yamaki et al. May 2003 A1
20030114851 Truckai et al. Jun 2003 A1
20030120284 Palacios et al. Jun 2003 A1
20030130711 Pearson et al. Jul 2003 A1
20030210812 Khamene et al. Nov 2003 A1
20030223877 Anstine et al. Dec 2003 A1
20040015053 Bieger et al. Jan 2004 A1
20040078236 Stoodley et al. Apr 2004 A1
20040082850 Bonner et al. Apr 2004 A1
20040092992 Adams et al. May 2004 A1
20040108825 Lee et al. Jun 2004 A1
20040199180 Knodel et al. Oct 2004 A1
20040199659 Ishikawa et al. Oct 2004 A1
20040206365 Knowlton Oct 2004 A1
20040215131 Sakurai Oct 2004 A1
20040229496 Robinson et al. Nov 2004 A1
20040243147 Lipow Dec 2004 A1
20040243148 Wasielewski Dec 2004 A1
20040243435 Williams Dec 2004 A1
20050020909 Moctezuma de la Barrera et al. Jan 2005 A1
20050020918 Wilk et al. Jan 2005 A1
20050021027 Shields et al. Jan 2005 A1
20050023324 Doll et al. Feb 2005 A1
20050033108 Sawyer Feb 2005 A1
20050063575 Ma et al. Mar 2005 A1
20050065438 Miller Mar 2005 A1
20050070800 Takahashi Mar 2005 A1
20050100867 Hilscher et al. May 2005 A1
20050131390 Heinrich et al. Jun 2005 A1
20050139629 Schwemberger et al. Jun 2005 A1
20050143759 Kelly Jun 2005 A1
20050148854 Ito et al. Jul 2005 A1
20050149001 Uchikubo et al. Jul 2005 A1
20050149356 Cyr et al. Jul 2005 A1
20050165390 Mauti et al. Jul 2005 A1
20050182655 Merzlak et al. Aug 2005 A1
20050192633 Montpetit Sep 2005 A1
20050203380 Sauer et al. Sep 2005 A1
20050203384 Sati et al. Sep 2005 A1
20050203504 Wham et al. Sep 2005 A1
20050213832 Schofield et al. Sep 2005 A1
20050222631 Dalal et al. Oct 2005 A1
20050228246 Lee et al. Oct 2005 A1
20050228425 Boukhny et al. Oct 2005 A1
20050236474 Onuma et al. Oct 2005 A1
20050251233 Kanzius Nov 2005 A1
20050277913 McCary Dec 2005 A1
20050288425 Lee et al. Dec 2005 A1
20060020272 Gildenberg Jan 2006 A1
20060025816 Shelton Feb 2006 A1
20060039105 Smith et al. Feb 2006 A1
20060059018 Shiobara et al. Mar 2006 A1
20060069388 Truckai et al. Mar 2006 A1
20060079872 Eggleston Apr 2006 A1
20060079874 Faller et al. Apr 2006 A1
20060116908 Dew et al. Jun 2006 A1
20060122558 Sherman et al. Jun 2006 A1
20060136622 Rouvelin et al. Jun 2006 A1
20060142739 DiSilestro et al. Jun 2006 A1
20060184160 Ozaki et al. Aug 2006 A1
20060241399 Fabian Oct 2006 A1
20060282009 Oberg et al. Dec 2006 A1
20060287645 Tashiro et al. Dec 2006 A1
20070005002 Millman et al. Jan 2007 A1
20070010838 Shelton et al. Jan 2007 A1
20070016235 Tanaka et al. Jan 2007 A1
20070016979 Damaj et al. Jan 2007 A1
20070027459 Horvath et al. Feb 2007 A1
20070038080 Salisbury et al. Feb 2007 A1
20070049947 Menn et al. Mar 2007 A1
20070066970 Ineson Mar 2007 A1
20070078678 DiSilvestro et al. Apr 2007 A1
20070084896 Doll et al. Apr 2007 A1
20070085528 Govari et al. Apr 2007 A1
20070156019 Larkin et al. Jul 2007 A1
20070161979 McPherson Jul 2007 A1
20070167702 Hasser et al. Jul 2007 A1
20070168461 Moore Jul 2007 A1
20070173803 Wham et al. Jul 2007 A1
20070175951 Shelton et al. Aug 2007 A1
20070175955 Shelton et al. Aug 2007 A1
20070179482 Anderson Aug 2007 A1
20070179508 Arndt Aug 2007 A1
20070191713 Eichmann et al. Aug 2007 A1
20070192139 Cookson et al. Aug 2007 A1
20070203744 Scholl Aug 2007 A1
20070225556 Ortiz et al. Sep 2007 A1
20070225690 Sekiguchi et al. Sep 2007 A1
20070239028 Houser et al. Oct 2007 A1
20070244478 Bahney Oct 2007 A1
20070249990 Cosmescu Oct 2007 A1
20070270660 Caylor et al. Nov 2007 A1
20070270688 Gelbart et al. Nov 2007 A1
20070282195 Masini et al. Dec 2007 A1
20070282321 Shah et al. Dec 2007 A1
20070282333 Fortson et al. Dec 2007 A1
20070293218 Meylan et al. Dec 2007 A1
20080013460 Allen et al. Jan 2008 A1
20080015664 Podhajsky Jan 2008 A1
20080015912 Rosenthal et al. Jan 2008 A1
20080019393 Yamaki Jan 2008 A1
20080033404 Romoda et al. Feb 2008 A1
20080039742 Hashimshony et al. Feb 2008 A1
20080040151 Moore Feb 2008 A1
20080058593 Gu et al. Mar 2008 A1
20080059658 Williams Mar 2008 A1
20080077158 Haider et al. Mar 2008 A1
20080083414 Messerges Apr 2008 A1
20080091071 Kumar et al. Apr 2008 A1
20080114212 Messerges May 2008 A1
20080114350 Park et al. May 2008 A1
20080129465 Rao Jun 2008 A1
20080140090 Aranyi et al. Jun 2008 A1
20080164296 Shelton et al. Jul 2008 A1
20080167644 Shelton et al. Jul 2008 A1
20080177258 Govari et al. Jul 2008 A1
20080177362 Phillips et al. Jul 2008 A1
20080200940 Eichmann et al. Aug 2008 A1
20080223904 Marczyk Sep 2008 A1
20080234708 Houser et al. Sep 2008 A1
20080235052 Node-Langlois et al. Sep 2008 A1
20080245841 Smith et al. Oct 2008 A1
20080255413 Zemlok et al. Oct 2008 A1
20080262654 Omori et al. Oct 2008 A1
20080272172 Zemlok et al. Nov 2008 A1
20080281301 DeBoer et al. Nov 2008 A1
20080281678 Keuls et al. Nov 2008 A1
20080296346 Shelton, IV et al. Dec 2008 A1
20080306759 Ilkin et al. Dec 2008 A1
20080312953 Claus Dec 2008 A1
20090017910 Rofougaran et al. Jan 2009 A1
20090030437 Houser et al. Jan 2009 A1
20090036750 Weinstein et al. Feb 2009 A1
20090036794 Stubhaug et al. Feb 2009 A1
20090043253 Podaima Feb 2009 A1
20090046146 Hoyt Feb 2009 A1
20090048589 Takashino et al. Feb 2009 A1
20090048595 Mihori et al. Feb 2009 A1
20090048611 Funda et al. Feb 2009 A1
20090076409 Wu et al. Mar 2009 A1
20090090763 Zemlok et al. Apr 2009 A1
20090099866 Newman Apr 2009 A1
20090114699 Viola May 2009 A1
20090128084 Johnson et al. May 2009 A1
20090157072 Wham et al. Jun 2009 A1
20090182577 Squilla et al. Jul 2009 A1
20090188094 Cunningham et al. Jul 2009 A1
20090192591 Ryan et al. Jul 2009 A1
20090206131 Weisenburgh, II et al. Aug 2009 A1
20090217932 Voegele Sep 2009 A1
20090234352 Behnke et al. Sep 2009 A1
20090259149 Tahara et al. Oct 2009 A1
20090259221 Tahara et al. Oct 2009 A1
20090259489 Kimura et al. Oct 2009 A1
20090270678 Scott et al. Oct 2009 A1
20090281541 Ibrahim et al. Nov 2009 A1
20090299214 Wu et al. Dec 2009 A1
20090306581 Claus Dec 2009 A1
20090307681 Armado et al. Dec 2009 A1
20090326321 Jacobsen et al. Dec 2009 A1
20090326336 Lemke et al. Dec 2009 A1
20100036374 Ward Feb 2010 A1
20100036405 Giordano et al. Feb 2010 A1
20100038403 D'Arcangelo Feb 2010 A1
20100057106 Sorrentino et al. Mar 2010 A1
20100065604 Weng Mar 2010 A1
20100069939 Konishi Mar 2010 A1
20100069942 Shelton, IV Mar 2010 A1
20100070417 Flynn et al. Mar 2010 A1
20100120266 Rimborg May 2010 A1
20100132334 Duclos et al. Jun 2010 A1
20100137845 Ramstein et al. Jun 2010 A1
20100137886 Zergiebel et al. Jun 2010 A1
20100168561 Anderson Jul 2010 A1
20100179831 Brown et al. Jul 2010 A1
20100191100 Anderson et al. Jul 2010 A1
20100194574 Monk et al. Aug 2010 A1
20100198200 Horvath Aug 2010 A1
20100198248 Vakharia Aug 2010 A1
20100204717 Knodel Aug 2010 A1
20100217991 Choi Aug 2010 A1
20100234996 Schreiber et al. Sep 2010 A1
20100235689 Tian et al. Sep 2010 A1
20100250571 Pierce et al. Sep 2010 A1
20100258327 Esenwein et al. Oct 2010 A1
20100280247 Mutti et al. Nov 2010 A1
20100292535 Paskar Nov 2010 A1
20100292684 Cybulski et al. Nov 2010 A1
20100301095 Shelton, IV et al. Dec 2010 A1
20110006876 Moberg et al. Jan 2011 A1
20110015649 Anvari et al. Jan 2011 A1
20110022032 Zemlok et al. Jan 2011 A1
20110036890 Ma Feb 2011 A1
20110043612 Keller et al. Feb 2011 A1
20110046618 Minar et al. Feb 2011 A1
20110071530 Carson Mar 2011 A1
20110077512 Boswell Mar 2011 A1
20110087238 Wang et al. Apr 2011 A1
20110087502 Yelton et al. Apr 2011 A1
20110105277 Shauli May 2011 A1
20110105895 Kornblau et al. May 2011 A1
20110112569 Friedman et al. May 2011 A1
20110118708 Burbank et al. May 2011 A1
20110119075 Dhoble May 2011 A1
20110125149 El-Galley et al. May 2011 A1
20110152712 Cao et al. Jun 2011 A1
20110163147 Laurent et al. Jul 2011 A1
20110166883 Palmer et al. Jul 2011 A1
20110196398 Robertson et al. Aug 2011 A1
20110209128 Nikara et al. Aug 2011 A1
20110218526 Mathur Sep 2011 A1
20110237883 Chun Sep 2011 A1
20110238079 Hannaford et al. Sep 2011 A1
20110251612 Faller et al. Oct 2011 A1
20110264000 Paul et al. Oct 2011 A1
20110264078 Lipow et al. Oct 2011 A1
20110264086 Ingle Oct 2011 A1
20110265311 Kondo et al. Nov 2011 A1
20110273465 Konishi et al. Nov 2011 A1
20110278343 Knodel et al. Nov 2011 A1
20110290024 Lefler Dec 2011 A1
20110295270 Giordano et al. Dec 2011 A1
20110306840 Allen et al. Dec 2011 A1
20110307284 Thompson et al. Dec 2011 A1
20120012638 Huang et al. Jan 2012 A1
20120021684 Schultz et al. Jan 2012 A1
20120022519 Huang et al. Jan 2012 A1
20120029354 Mark et al. Feb 2012 A1
20120046662 Gilbert Feb 2012 A1
20120059684 Hampapur et al. Mar 2012 A1
20120078247 Worrell et al. Mar 2012 A1
20120080336 Shelton, IV et al. Apr 2012 A1
20120080498 Shelton, IV et al. Apr 2012 A1
20120083786 Artale et al. Apr 2012 A1
20120100517 Bowditch et al. Apr 2012 A1
20120101488 Aldridge et al. Apr 2012 A1
20120116265 Houser et al. May 2012 A1
20120116381 Houser et al. May 2012 A1
20120116394 Timm et al. May 2012 A1
20120130217 Kauphusman et al. May 2012 A1
20120145714 Farascioni et al. Jun 2012 A1
20120172696 Kallback et al. Jul 2012 A1
20120190981 Harris et al. Jul 2012 A1
20120191091 Allen Jul 2012 A1
20120191162 Villa Jul 2012 A1
20120197619 Namer Yelin et al. Aug 2012 A1
20120203067 Higgins et al. Aug 2012 A1
20120203785 Awada Aug 2012 A1
20120211542 Racenet Aug 2012 A1
20120226150 Balicki et al. Sep 2012 A1
20120232549 Willyard et al. Sep 2012 A1
20120245958 Lawrence et al. Sep 2012 A1
20120253329 Zemlok et al. Oct 2012 A1
20120253847 Dell'Anno et al. Oct 2012 A1
20120265555 Cappuzzo et al. Oct 2012 A1
20120292367 Morgan et al. Nov 2012 A1
20120319859 Taub et al. Dec 2012 A1
20130001121 Metzger Jan 2013 A1
20130006241 Takashino Jan 2013 A1
20130008677 Huifu Jan 2013 A1
20130024213 Poon Jan 2013 A1
20130046182 Hegg et al. Feb 2013 A1
20130046279 Niklewski et al. Feb 2013 A1
20130046295 Kerr et al. Feb 2013 A1
20130066647 Andrie et al. Mar 2013 A1
20130090526 Suzuki et al. Apr 2013 A1
20130090755 Kiryu et al. Apr 2013 A1
20130093829 Rosenblatt et al. Apr 2013 A1
20130096597 Anand et al. Apr 2013 A1
20130116218 Kaplan et al. May 2013 A1
20130131845 Guilleminot May 2013 A1
20130144284 Behnke, II et al. Jun 2013 A1
20130153635 Hodgkinson Jun 2013 A1
20130165776 Blomqvist Jun 2013 A1
20130168435 Huang et al. Jul 2013 A1
20130178853 Hyink et al. Jul 2013 A1
20130190755 Deborski et al. Jul 2013 A1
20130191647 Ferrara, Jr. et al. Jul 2013 A1
20130193188 Shelton, IV et al. Aug 2013 A1
20130196703 Masoud et al. Aug 2013 A1
20130197531 Boukhny et al. Aug 2013 A1
20130201356 Kennedy et al. Aug 2013 A1
20130206813 Nalagatla Aug 2013 A1
20130214025 Zemlok et al. Aug 2013 A1
20130253480 Kimball et al. Sep 2013 A1
20130256373 Schmid et al. Oct 2013 A1
20130267874 Marcotte et al. Oct 2013 A1
20130268283 Vann et al. Oct 2013 A1
20130277410 Fernandez et al. Oct 2013 A1
20130317837 Ballantyne et al. Nov 2013 A1
20130321425 Greene et al. Dec 2013 A1
20130325809 Kim et al. Dec 2013 A1
20130331873 Ross et al. Dec 2013 A1
20130331875 Ross et al. Dec 2013 A1
20140001231 Shelton, IV et al. Jan 2014 A1
20140001234 Shelton, IV et al. Jan 2014 A1
20140005640 Shelton, IV et al. Jan 2014 A1
20140006132 Barker Jan 2014 A1
20140006943 Robbins et al. Jan 2014 A1
20140009894 Yu Jan 2014 A1
20140013565 MacDonald et al. Jan 2014 A1
20140018788 Engelman et al. Jan 2014 A1
20140029411 Nayak et al. Jan 2014 A1
20140033926 Fassel et al. Feb 2014 A1
20140035762 Shelton, IV et al. Feb 2014 A1
20140039491 Bakos et al. Feb 2014 A1
20140058407 Tsekos et al. Feb 2014 A1
20140066700 Wilson et al. Mar 2014 A1
20140073893 Bencini Mar 2014 A1
20140074076 Gertner Mar 2014 A1
20140081255 Johnson et al. Mar 2014 A1
20140081659 Nawana et al. Mar 2014 A1
20140084949 Smith et al. Mar 2014 A1
20140087999 Kaplan et al. Mar 2014 A1
20140092089 Kasuya et al. Apr 2014 A1
20140107697 Patani et al. Apr 2014 A1
20140108035 Akbay et al. Apr 2014 A1
20140108983 William R. et al. Apr 2014 A1
20140117256 Mueller et al. May 2014 A1
20140121669 Claus May 2014 A1
20140142963 Hill et al. May 2014 A1
20140148729 Schmitz et al. May 2014 A1
20140148803 Taylor May 2014 A1
20140163359 Sholev et al. Jun 2014 A1
20140166724 Schellin et al. Jun 2014 A1
20140171778 Tsusaka et al. Jun 2014 A1
20140171787 Garbey et al. Jun 2014 A1
20140176576 Spencer Jun 2014 A1
20140187856 Holoien et al. Jul 2014 A1
20140188440 Donhowe et al. Jul 2014 A1
20140194864 Martin et al. Jul 2014 A1
20140195052 Tsusaka et al. Jul 2014 A1
20140204190 Rosenblatt, III et al. Jul 2014 A1
20140226572 Thota et al. Aug 2014 A1
20140243799 Parihar Aug 2014 A1
20140243809 Gelfand et al. Aug 2014 A1
20140243811 Reschke et al. Aug 2014 A1
20140246475 Hall et al. Sep 2014 A1
20140249557 Koch et al. Sep 2014 A1
20140252064 Mozdzierz et al. Sep 2014 A1
20140263541 Leimbach et al. Sep 2014 A1
20140263552 Hall et al. Sep 2014 A1
20140275760 Lee et al. Sep 2014 A1
20140276748 Ku et al. Sep 2014 A1
20140276749 Johnson Sep 2014 A1
20140278219 Canavan et al. Sep 2014 A1
20140287393 Kumar et al. Sep 2014 A1
20140296694 Jaworski Oct 2014 A1
20140303660 Boyden et al. Oct 2014 A1
20140303990 Schoenefeld et al. Oct 2014 A1
20140336943 Pellini et al. Nov 2014 A1
20140337052 Pellini et al. Nov 2014 A1
20140364691 Krivopisk et al. Dec 2014 A1
20150006201 Pait et al. Jan 2015 A1
20150012010 Adler et al. Jan 2015 A1
20150025549 Kilroy et al. Jan 2015 A1
20150032150 Ishida et al. Jan 2015 A1
20150051452 Ciaccio Feb 2015 A1
20150051598 Orszulak et al. Feb 2015 A1
20150051617 Takemura et al. Feb 2015 A1
20150053737 Leimbach et al. Feb 2015 A1
20150053743 Yates et al. Feb 2015 A1
20150053746 Shelton, IV et al. Feb 2015 A1
20150053749 Shelton, IV et al. Feb 2015 A1
20150057675 Akeel et al. Feb 2015 A1
20150062316 Haraguchi et al. Mar 2015 A1
20150066000 An et al. Mar 2015 A1
20150070187 Wiesner et al. Mar 2015 A1
20150073400 Sverdlik et al. Mar 2015 A1
20150077528 Awdeh Mar 2015 A1
20150083774 Measamer et al. Mar 2015 A1
20150099458 Weisner et al. Apr 2015 A1
20150108198 Estrella Apr 2015 A1
20150133945 Dushyant et al. May 2015 A1
20150136833 Shelton, IV et al. May 2015 A1
20150140982 Postrel May 2015 A1
20150141980 Jadhav et al. May 2015 A1
20150145682 Harris May 2015 A1
20150148830 Stulen et al. May 2015 A1
20150157354 Bales, Jr. et al. Jun 2015 A1
20150168126 Nevet Jun 2015 A1
20150173673 Toth et al. Jun 2015 A1
20150173756 Baxter, III et al. Jun 2015 A1
20150182220 Yates et al. Jul 2015 A1
20150196295 Shelton, IV et al. Jul 2015 A1
20150199109 Lee Jul 2015 A1
20150201918 Kumar et al. Jul 2015 A1
20150202014 Kim et al. Jul 2015 A1
20150208934 Sztrubel et al. Jul 2015 A1
20150223725 Engel et al. Aug 2015 A1
20150223868 Brandt et al. Aug 2015 A1
20150237502 Schmidt et al. Aug 2015 A1
20150238118 Legassey et al. Aug 2015 A1
20150238355 Vezzu et al. Aug 2015 A1
20150257783 Levine et al. Sep 2015 A1
20150272557 Overmyer et al. Oct 2015 A1
20150272571 Leimbach et al. Oct 2015 A1
20150272580 Leimbach et al. Oct 2015 A1
20150272582 Leimbach et al. Oct 2015 A1
20150272694 Charles Oct 2015 A1
20150282733 Fielden et al. Oct 2015 A1
20150282821 Look et al. Oct 2015 A1
20150289925 Voegele et al. Oct 2015 A1
20150296042 Aoyama Oct 2015 A1
20150297200 Fitzsimmons et al. Oct 2015 A1
20150297222 Huitema et al. Oct 2015 A1
20150297228 Huitema et al. Oct 2015 A1
20150297233 Huitema et al. Oct 2015 A1
20150297311 Tesar Oct 2015 A1
20150302157 Collar et al. Oct 2015 A1
20150305828 Park et al. Oct 2015 A1
20150310174 Coudert et al. Oct 2015 A1
20150313538 Bechtel et al. Nov 2015 A1
20150317899 Dumbauld et al. Nov 2015 A1
20150320423 Aranyi Nov 2015 A1
20150324114 Hurley et al. Nov 2015 A1
20150328474 Flyash et al. Nov 2015 A1
20150331995 Zhao et al. Nov 2015 A1
20150332003 Stamm et al. Nov 2015 A1
20150332196 Stiller et al. Nov 2015 A1
20150335344 Aljuri et al. Nov 2015 A1
20150374259 Garbey et al. Dec 2015 A1
20160000437 Giordano et al. Jan 2016 A1
20160001411 Alberti Jan 2016 A1
20160005169 Sela et al. Jan 2016 A1
20160015471 Piron et al. Jan 2016 A1
20160019346 Boston et al. Jan 2016 A1
20160022374 Haider et al. Jan 2016 A1
20160034648 Mohlenbrock et al. Feb 2016 A1
20160038224 Couture et al. Feb 2016 A1
20160038253 Piron et al. Feb 2016 A1
20160048780 Sethumadhavan et al. Feb 2016 A1
20160051315 Boudreaux Feb 2016 A1
20160058439 Shelton, IV et al. Mar 2016 A1
20160066913 Swayze et al. Mar 2016 A1
20160078190 Greene et al. Mar 2016 A1
20160100837 Huang et al. Apr 2016 A1
20160106516 Mesallum Apr 2016 A1
20160106934 Hiraga et al. Apr 2016 A1
20160121143 Mumaw et al. May 2016 A1
20160143659 Glutz et al. May 2016 A1
20160157717 Gaster Jun 2016 A1
20160158468 Tang et al. Jun 2016 A1
20160166336 Razzaque et al. Jun 2016 A1
20160174998 Lal et al. Jun 2016 A1
20160175025 Strobl Jun 2016 A1
20160180045 Syed Jun 2016 A1
20160182637 Adriaens et al. Jun 2016 A1
20160184054 Lowe Jun 2016 A1
20160192960 Bueno et al. Jul 2016 A1
20160203599 Gillies et al. Jul 2016 A1
20160206202 Frangioni Jul 2016 A1
20160206362 Mehta et al. Jul 2016 A1
20160224760 Petak et al. Aug 2016 A1
20160225551 Shedletsky Aug 2016 A1
20160228061 Kallback et al. Aug 2016 A1
20160228204 Quaid et al. Aug 2016 A1
20160235303 Fleming et al. Aug 2016 A1
20160242836 Eggers et al. Aug 2016 A1
20160249910 Shelton, IV et al. Sep 2016 A1
20160249920 Gupta et al. Sep 2016 A1
20160270732 Källbäck et al. Sep 2016 A1
20160270842 Strobl et al. Sep 2016 A1
20160270861 Guru et al. Sep 2016 A1
20160275259 Nolan et al. Sep 2016 A1
20160278841 Panescu et al. Sep 2016 A1
20160287312 Tegg et al. Oct 2016 A1
20160287316 Worrell et al. Oct 2016 A1
20160287337 Aram et al. Oct 2016 A1
20160287912 Warnking Oct 2016 A1
20160292456 Dubey et al. Oct 2016 A1
20160296246 Schaller Oct 2016 A1
20160302210 Thornton et al. Oct 2016 A1
20160310055 Zand et al. Oct 2016 A1
20160310204 McHenry et al. Oct 2016 A1
20160314716 Grubbs Oct 2016 A1
20160314717 Grubbs Oct 2016 A1
20160317172 Kumada et al. Nov 2016 A1
20160321400 Durrant et al. Nov 2016 A1
20160323283 Kang et al. Nov 2016 A1
20160331460 Cheatham, III et al. Nov 2016 A1
20160331473 Yamamura Nov 2016 A1
20160338685 Nawana et al. Nov 2016 A1
20160342753 Feazell Nov 2016 A1
20160342916 Arceneaux et al. Nov 2016 A1
20160345857 Jensrud et al. Dec 2016 A1
20160345976 Gonzalez et al. Dec 2016 A1
20160350490 Martinez et al. Dec 2016 A1
20160354160 Crowley et al. Dec 2016 A1
20160354162 Yen et al. Dec 2016 A1
20160361070 Ardel et al. Dec 2016 A1
20160367305 Hareland Dec 2016 A1
20160367401 Claus Dec 2016 A1
20160374710 Sinelnikov et al. Dec 2016 A1
20160374723 Frankhouser et al. Dec 2016 A1
20160374762 Case et al. Dec 2016 A1
20160379504 Bailey et al. Dec 2016 A1
20170000516 Stulen et al. Jan 2017 A1
20170000553 Wiener et al. Jan 2017 A1
20170005911 Kasargod et al. Jan 2017 A1
20170007247 Shelton, IV et al. Jan 2017 A1
20170027603 Pandey Feb 2017 A1
20170042604 McFarland et al. Feb 2017 A1
20170049522 Kapadia Feb 2017 A1
20170068792 Reiner Mar 2017 A1
20170079530 DiMaio et al. Mar 2017 A1
20170079730 Azizian et al. Mar 2017 A1
20170086829 Vendely et al. Mar 2017 A1
20170086906 Tsuruta Mar 2017 A1
20170086930 Thompson et al. Mar 2017 A1
20170105754 Boudreaux et al. Apr 2017 A1
20170105787 Witt et al. Apr 2017 A1
20170116873 Lendvay et al. Apr 2017 A1
20170119477 Amiot et al. May 2017 A1
20170127499 Unoson et al. May 2017 A1
20170132374 Lee et al. May 2017 A1
20170132385 Hunter et al. May 2017 A1
20170132785 Wshah et al. May 2017 A1
20170143284 Sehnert et al. May 2017 A1
20170143366 Groene et al. May 2017 A1
20170143442 Tesar et al. May 2017 A1
20170147759 Iyer et al. May 2017 A1
20170154156 Sevenster et al. Jun 2017 A1
20170156076 Eom et al. Jun 2017 A1
20170164996 Honda et al. Jun 2017 A1
20170164997 Johnson et al. Jun 2017 A1
20170165008 Finley Jun 2017 A1
20170165012 Chaplin et al. Jun 2017 A1
20170172550 Mukherjee et al. Jun 2017 A1
20170172565 Heneveld Jun 2017 A1
20170172614 Scheib et al. Jun 2017 A1
20170172674 Hanuschik et al. Jun 2017 A1
20170172676 Itkowitz et al. Jun 2017 A1
20170173262 Veltz Jun 2017 A1
20170177807 Fabian Jun 2017 A1
20170178069 Paterra et al. Jun 2017 A1
20170185732 Niklewski et al. Jun 2017 A1
20170196583 Sugiyama Jul 2017 A1
20170196637 Shelton, IV et al. Jul 2017 A1
20170202591 Shelton, IV et al. Jul 2017 A1
20170202595 Shelton, IV Jul 2017 A1
20170202607 Shelton, IV et al. Jul 2017 A1
20170202608 Shelton, IV et al. Jul 2017 A1
20170209145 Swayze et al. Jul 2017 A1
20170215944 Keffeler Aug 2017 A1
20170224332 Hunter et al. Aug 2017 A1
20170224334 Worthington et al. Aug 2017 A1
20170224428 Kopp Aug 2017 A1
20170231553 Igarashi et al. Aug 2017 A1
20170231627 Shelton, IV et al. Aug 2017 A1
20170231628 Shelton, IV et al. Aug 2017 A1
20170245809 Ma et al. Aug 2017 A1
20170249431 Shelton, IV et al. Aug 2017 A1
20170249432 Grantcharov Aug 2017 A1
20170262604 Francois Sep 2017 A1
20170265864 Hessler et al. Sep 2017 A1
20170265943 Sela et al. Sep 2017 A1
20170273715 Piron et al. Sep 2017 A1
20170281171 Shelton, IV et al. Oct 2017 A1
20170281173 Shelton, IV et al. Oct 2017 A1
20170281186 Shelton, IV et al. Oct 2017 A1
20170281189 Nalagatla et al. Oct 2017 A1
20170289617 Song et al. Oct 2017 A1
20170290585 Shelton, IV et al. Oct 2017 A1
20170296169 Yates et al. Oct 2017 A1
20170296173 Shelton, IV et al. Oct 2017 A1
20170296185 Swensgard et al. Oct 2017 A1
20170296213 Swensgard et al. Oct 2017 A1
20170303984 Malackowski Oct 2017 A1
20170304007 Piron et al. Oct 2017 A1
20170304020 Ng et al. Oct 2017 A1
20170311777 Hirayama et al. Nov 2017 A1
20170312456 Phillips Nov 2017 A1
20170319268 Akagane Nov 2017 A1
20170325876 Nakadate et al. Nov 2017 A1
20170325878 Messerly et al. Nov 2017 A1
20170333147 Bernstein Nov 2017 A1
20170333152 Wade Nov 2017 A1
20170337043 Brincat et al. Nov 2017 A1
20170348047 Reiter et al. Dec 2017 A1
20170360358 Amiot et al. Dec 2017 A1
20170360499 Greep et al. Dec 2017 A1
20170367583 Black et al. Dec 2017 A1
20170367695 Shelton, IV et al. Dec 2017 A1
20170367754 Narisawa Dec 2017 A1
20170367771 Tako et al. Dec 2017 A1
20170367772 Gunn et al. Dec 2017 A1
20170370710 Chen et al. Dec 2017 A1
20180008359 Randle Jan 2018 A1
20180011983 Zuhars et al. Jan 2018 A1
20180014764 Bechtel et al. Jan 2018 A1
20180021058 Meglan Jan 2018 A1
20180042659 Rupp et al. Feb 2018 A1
20180050196 Pawsey et al. Feb 2018 A1
20180052971 Hanina et al. Feb 2018 A1
20180055529 Messerly et al. Mar 2018 A1
20180056496 Rubens et al. Mar 2018 A1
20180065248 Barral et al. Mar 2018 A1
20180078170 Panescu et al. Mar 2018 A1
20180082480 White et al. Mar 2018 A1
20180085102 Kikuchi Mar 2018 A1
20180098049 Sugano et al. Apr 2018 A1
20180098816 Govari et al. Apr 2018 A1
20180108438 Ryan et al. Apr 2018 A1
20180110523 Shelton, IV Apr 2018 A1
20180116662 Shelton, IV et al. May 2018 A1
20180116735 Tierney et al. May 2018 A1
20180122506 Grantcharov et al. May 2018 A1
20180125590 Giordano et al. May 2018 A1
20180132895 Silver May 2018 A1
20180144243 Hsieh et al. May 2018 A1
20180144314 Miller May 2018 A1
20180153436 Olson Jun 2018 A1
20180153574 Faller et al. Jun 2018 A1
20180153628 Grover et al. Jun 2018 A1
20180153632 Tokarchuk et al. Jun 2018 A1
20180154297 Maletich et al. Jun 2018 A1
20180161062 Kaga et al. Jun 2018 A1
20180161716 Li et al. Jun 2018 A1
20180165780 Romeo Jun 2018 A1
20180168574 Robinson et al. Jun 2018 A1
20180168575 Simms et al. Jun 2018 A1
20180168577 Aronhalt et al. Jun 2018 A1
20180168578 Aronhalt et al. Jun 2018 A1
20180168579 Aronhalt et al. Jun 2018 A1
20180168584 Harris et al. Jun 2018 A1
20180168586 Shelton, IV et al. Jun 2018 A1
20180168590 Overmyer et al. Jun 2018 A1
20180168592 Overmyer et al. Jun 2018 A1
20180168593 Overmyer et al. Jun 2018 A1
20180168597 Fanelli et al. Jun 2018 A1
20180168598 Shelton, IV et al. Jun 2018 A1
20180168608 Shelton, IV et al. Jun 2018 A1
20180168609 Fanelli et al. Jun 2018 A1
20180168610 Shelton, IV et al. Jun 2018 A1
20180168614 Shelton, IV et al. Jun 2018 A1
20180168615 Shelton, IV et al. Jun 2018 A1
20180168617 Shelton, IV et al. Jun 2018 A1
20180168618 Scott et al. Jun 2018 A1
20180168619 Scott et al. Jun 2018 A1
20180168623 Simms et al. Jun 2018 A1
20180168625 Posada et al. Jun 2018 A1
20180168627 Weaner et al. Jun 2018 A1
20180168628 Hunter et al. Jun 2018 A1
20180168633 Shelton, IV et al. Jun 2018 A1
20180168647 Shelton, IV et al. Jun 2018 A1
20180168648 Shelton, IV et al. Jun 2018 A1
20180168649 Shelton, IV et al. Jun 2018 A1
20180168650 Shelton, IV et al. Jun 2018 A1
20180168651 Shelton, IV et al. Jun 2018 A1
20180172420 Hein et al. Jun 2018 A1
20180177383 Noonan et al. Jun 2018 A1
20180182475 Cossler et al. Jun 2018 A1
20180183684 Jacobson et al. Jun 2018 A1
20180193579 Hanrahan et al. Jul 2018 A1
20180206884 Beaupre Jul 2018 A1
20180206905 Batchelor et al. Jul 2018 A1
20180211726 Courtemanche et al. Jul 2018 A1
20180214025 Homyk et al. Aug 2018 A1
20180221005 Hamel et al. Aug 2018 A1
20180221598 Silver Aug 2018 A1
20180228557 Darisse et al. Aug 2018 A1
20180233222 Daley et al. Aug 2018 A1
20180233235 Angelides Aug 2018 A1
20180235719 Jarc Aug 2018 A1
20180235722 Baghdadi et al. Aug 2018 A1
20180242967 Meade Aug 2018 A1
20180247128 Alvi et al. Aug 2018 A1
20180247711 Terry Aug 2018 A1
20180250086 Grubbs Sep 2018 A1
20180250825 Hashimoto et al. Sep 2018 A1
20180263699 Murphy et al. Sep 2018 A1
20180263710 Sakaguchi et al. Sep 2018 A1
20180268320 Shekhar Sep 2018 A1
20180271520 Shelton, IV et al. Sep 2018 A1
20180271603 Nir et al. Sep 2018 A1
20180289427 Griffiths et al. Oct 2018 A1
20180294060 Kassab Oct 2018 A1
20180296286 Peine et al. Oct 2018 A1
20180296289 Rodriguez-Navarro et al. Oct 2018 A1
20180300506 Kawakami et al. Oct 2018 A1
20180303552 Ryan et al. Oct 2018 A1
20180304471 Tokuchi Oct 2018 A1
20180310935 Wixey Nov 2018 A1
20180310986 Batchelor et al. Nov 2018 A1
20180315492 Bishop et al. Nov 2018 A1
20180317826 Muhsin et al. Nov 2018 A1
20180317916 Wixey Nov 2018 A1
20180325619 Rauniyar et al. Nov 2018 A1
20180333188 Nott et al. Nov 2018 A1
20180333207 Moctezuma De la Barrera Nov 2018 A1
20180333209 Frushour et al. Nov 2018 A1
20180349721 Agrawal Dec 2018 A1
20180351987 Patel et al. Dec 2018 A1
20180353186 Mozdzierz et al. Dec 2018 A1
20180357383 Allen et al. Dec 2018 A1
20180360454 Shelton, IV et al. Dec 2018 A1
20180360456 Shelton, IV et al. Dec 2018 A1
20180366213 Fidone et al. Dec 2018 A1
20180368930 Esterberg et al. Dec 2018 A1
20180369511 Zergiebel et al. Dec 2018 A1
20190000446 Shelton, IV et al. Jan 2019 A1
20190000478 Messerly et al. Jan 2019 A1
20190000565 Shelton, IV et al. Jan 2019 A1
20190000569 Crawford et al. Jan 2019 A1
20190001079 Zergiebel et al. Jan 2019 A1
20190005641 Yamamoto Jan 2019 A1
20190006047 Gorek et al. Jan 2019 A1
20190025040 Andreason et al. Jan 2019 A1
20190036688 Wasily et al. Jan 2019 A1
20190038335 Mohr et al. Feb 2019 A1
20190038364 Enoki Feb 2019 A1
20190045515 Kwasnick et al. Feb 2019 A1
20190046198 Stokes et al. Feb 2019 A1
20190053801 Wixey et al. Feb 2019 A1
20190053866 Seow et al. Feb 2019 A1
20190059986 Shelton, IV et al. Feb 2019 A1
20190059997 Frushour Feb 2019 A1
20190069949 Vrba et al. Mar 2019 A1
20190069964 Hagn Mar 2019 A1
20190069966 Petersen et al. Mar 2019 A1
20190070550 Lalomia et al. Mar 2019 A1
20190070731 Bowling et al. Mar 2019 A1
20190083190 Graves et al. Mar 2019 A1
20190083809 Zhang Mar 2019 A1
20190087544 Peterson Mar 2019 A1
20190099221 Schmidt et al. Apr 2019 A1
20190099226 Hallen Apr 2019 A1
20190104919 Shelton, IV et al. Apr 2019 A1
20190105468 Kato et al. Apr 2019 A1
20190110828 Despatie Apr 2019 A1
20190110855 Barral et al. Apr 2019 A1
20190110856 Barral et al. Apr 2019 A1
20190115108 Hegedus et al. Apr 2019 A1
20190122330 Saget et al. Apr 2019 A1
20190125320 Shelton, IV et al. May 2019 A1
20190125321 Shelton, IV et al. May 2019 A1
20190125324 Scheib et al. May 2019 A1
20190125335 Shelton, IV et al. May 2019 A1
20190125336 Deck et al. May 2019 A1
20190125337 Shelton, IV et al. May 2019 A1
20190125338 Shelton, IV et al. May 2019 A1
20190125339 Shelton, IV et al. May 2019 A1
20190125347 Stokes et al. May 2019 A1
20190125348 Shelton, IV et al. May 2019 A1
20190125352 Shelton, IV et al. May 2019 A1
20190125353 Shelton, IV et al. May 2019 A1
20190125354 Deck et al. May 2019 A1
20190125355 Shelton, IV et al. May 2019 A1
20190125356 Shelton, IV et al. May 2019 A1
20190125357 Shelton, IV et al. May 2019 A1
20190125358 Shelton, IV et al. May 2019 A1
20190125359 Shelton, IV et al. May 2019 A1
20190125360 Shelton, IV et al. May 2019 A1
20190125361 Shelton, IV et al. May 2019 A1
20190125377 Shelton, IV May 2019 A1
20190125378 Shelton, IV et al. May 2019 A1
20190125379 Shelton, IV et al. May 2019 A1
20190125380 Hunter et al. May 2019 A1
20190125383 Scheib et al. May 2019 A1
20190125384 Scheib et al. May 2019 A1
20190125385 Scheib et al. May 2019 A1
20190125386 Shelton, IV et al. May 2019 A1
20190125387 Parihar et al. May 2019 A1
20190125388 Shelton, IV et al. May 2019 A1
20190125389 Shelton, IV et al. May 2019 A1
20190125430 Shelton, IV et al. May 2019 A1
20190125431 Shelton, IV et al. May 2019 A1
20190125432 Shelton, IV et al. May 2019 A1
20190125454 Stokes et al. May 2019 A1
20190125455 Shelton, IV et al. May 2019 A1
20190125456 Shelton, IV et al. May 2019 A1
20190125457 Parihar et al. May 2019 A1
20190125458 Shelton, IV et al. May 2019 A1
20190125459 Shelton, IV et al. May 2019 A1
20190125476 Shelton, IV et al. May 2019 A1
20190133703 Seow et al. May 2019 A1
20190142449 Shelton, IV et al. May 2019 A1
20190142535 Seow et al. May 2019 A1
20190145942 Dutriez et al. May 2019 A1
20190150975 Kawasaki et al. May 2019 A1
20190159777 Ehrenfels et al. May 2019 A1
20190159778 Shelton, IV et al. May 2019 A1
20190162179 O'Shea et al. May 2019 A1
20190163875 Allen et al. May 2019 A1
20190167296 Tsubuku et al. Jun 2019 A1
20190192044 Ravi et al. Jun 2019 A1
20190192157 Scott et al. Jun 2019 A1
20190192236 Shelton, IV et al. Jun 2019 A1
20190200844 Shelton, IV et al. Jul 2019 A1
20190200863 Shelton, IV et al. Jul 2019 A1
20190200906 Shelton, IV et al. Jul 2019 A1
20190200977 Shelton, IV et al. Jul 2019 A1
20190200980 Shelton, IV et al. Jul 2019 A1
20190200981 Harris et al. Jul 2019 A1
20190200984 Shelton, IV et al. Jul 2019 A1
20190200985 Shelton, IV et al. Jul 2019 A1
20190200986 Shelton, IV et al. Jul 2019 A1
20190200987 Shelton, IV et al. Jul 2019 A1
20190200988 Shelton, IV Jul 2019 A1
20190200996 Shelton, IV et al. Jul 2019 A1
20190200997 Shelton, IV et al. Jul 2019 A1
20190200998 Shelton, IV et al. Jul 2019 A1
20190201020 Shelton, IV et al. Jul 2019 A1
20190201021 Shelton, IV et al. Jul 2019 A1
20190201023 Shelton, IV et al. Jul 2019 A1
20190201024 Shelton, IV et al. Jul 2019 A1
20190201025 Shelton, IV et al. Jul 2019 A1
20190201026 Shelton, IV et al. Jul 2019 A1
20190201027 Shelton, IV et al. Jul 2019 A1
20190201028 Shelton, IV et al. Jul 2019 A1
20190201029 Shelton, IV et al. Jul 2019 A1
20190201030 Shelton, IV et al. Jul 2019 A1
20190201033 Yates et al. Jul 2019 A1
20190201034 Shelton, IV et al. Jul 2019 A1
20190201036 Nott et al. Jul 2019 A1
20190201037 Houser et al. Jul 2019 A1
20190201038 Yates et al. Jul 2019 A1
20190201039 Widenhouse et al. Jul 2019 A1
20190201040 Messerly et al. Jul 2019 A1
20190201041 Kimball et al. Jul 2019 A1
20190201042 Nott et al. Jul 2019 A1
20190201043 Shelton, IV et al. Jul 2019 A1
20190201044 Shelton, IV et al. Jul 2019 A1
20190201045 Yates et al. Jul 2019 A1
20190201046 Shelton, IV et al. Jul 2019 A1
20190201047 Yates et al. Jul 2019 A1
20190201073 Nott et al. Jul 2019 A1
20190201074 Yates et al. Jul 2019 A1
20190201075 Shelton, IV et al. Jul 2019 A1
20190201076 Honda et al. Jul 2019 A1
20190201077 Yates et al. Jul 2019 A1
20190201079 Shelton, IV et al. Jul 2019 A1
20190201080 Messerly et al. Jul 2019 A1
20190201081 Shelton, IV et al. Jul 2019 A1
20190201082 Shelton, IV et al. Jul 2019 A1
20190201083 Shelton, IV et al. Jul 2019 A1
20190201084 Shelton, IV et al. Jul 2019 A1
20190201085 Shelton, IV et al. Jul 2019 A1
20190201086 Shelton, IV et al. Jul 2019 A1
20190201087 Shelton, IV et al. Jul 2019 A1
20190201090 Shelton, IV et al. Jul 2019 A1
20190201091 Yates et al. Jul 2019 A1
20190201092 Yates et al. Jul 2019 A1
20190201102 Shelton, IV et al. Jul 2019 A1
20190201104 Shelton, IV et al. Jul 2019 A1
20190201105 Shelton, IV et al. Jul 2019 A1
20190201111 Shelton, IV et al. Jul 2019 A1
20190201112 Wiener et al. Jul 2019 A1
20190201113 Shelton, IV et al. Jul 2019 A1
20190201114 Shelton, IV et al. Jul 2019 A1
20190201115 Shelton, IV et al. Jul 2019 A1
20190201116 Shelton, IV et al. Jul 2019 A1
20190201118 Shelton, IV et al. Jul 2019 A1
20190201119 Harris et al. Jul 2019 A1
20190201120 Shelton, IV et al. Jul 2019 A1
20190201123 Shelton, IV et al. Jul 2019 A1
20190201124 Shelton, IV et al. Jul 2019 A1
20190201125 Shelton, IV et al. Jul 2019 A1
20190201126 Shelton, IV et al. Jul 2019 A1
20190201127 Shelton, IV et al. Jul 2019 A1
20190201128 Yates et al. Jul 2019 A1
20190201129 Shelton, IV et al. Jul 2019 A1
20190201130 Shelton, IV et al. Jul 2019 A1
20190201135 Shelton, IV et al. Jul 2019 A1
20190201136 Shelton, IV et al. Jul 2019 A1
20190201137 Shelton, IV et al. Jul 2019 A1
20190201138 Yates et al. Jul 2019 A1
20190201139 Shelton, IV et al. Jul 2019 A1
20190201140 Yates et al. Jul 2019 A1
20190201141 Shelton, IV et al. Jul 2019 A1
20190201142 Shelton, IV et al. Jul 2019 A1
20190201143 Shelton, IV et al. Jul 2019 A1
20190201144 Shelton, IV et al. Jul 2019 A1
20190201145 Shelton, IV et al. Jul 2019 A1
20190201146 Shelton, IV et al. Jul 2019 A1
20190201158 Shelton, IV et al. Jul 2019 A1
20190201159 Shelton, IV et al. Jul 2019 A1
20190201594 Shelton, IV et al. Jul 2019 A1
20190201597 Shelton, IV et al. Jul 2019 A1
20190204201 Shelton, IV et al. Jul 2019 A1
20190205001 Messerly et al. Jul 2019 A1
20190205441 Shelton, IV et al. Jul 2019 A1
20190205566 Shelton, IV et al. Jul 2019 A1
20190205567 Shelton, IV et al. Jul 2019 A1
20190206003 Harris et al. Jul 2019 A1
20190206004 Shelton, IV et al. Jul 2019 A1
20190206050 Yates et al. Jul 2019 A1
20190206216 Shelton, IV et al. Jul 2019 A1
20190206542 Shelton, IV et al. Jul 2019 A1
20190206551 Yates et al. Jul 2019 A1
20190206555 Morgan et al. Jul 2019 A1
20190206556 Shelton, IV et al. Jul 2019 A1
20190206561 Shelton, IV et al. Jul 2019 A1
20190206562 Shelton, IV et al. Jul 2019 A1
20190206563 Shelton, IV et al. Jul 2019 A1
20190206564 Shelton, IV et al. Jul 2019 A1
20190206565 Shelton, IV Jul 2019 A1
20190206569 Shelton, IV et al. Jul 2019 A1
20190206576 Shelton, IV et al. Jul 2019 A1
20190207911 Wiener et al. Jul 2019 A1
20190208641 Yates et al. Jul 2019 A1
20190224434 Silver et al. Jul 2019 A1
20190254759 Azizian Aug 2019 A1
20190261984 Nelson et al. Aug 2019 A1
20190269476 Bowling et al. Sep 2019 A1
20190272917 Couture et al. Sep 2019 A1
20190274662 Rockman et al. Sep 2019 A1
20190274705 Sawhney et al. Sep 2019 A1
20190274706 Nott et al. Sep 2019 A1
20190274707 Sawhney et al. Sep 2019 A1
20190274708 Boudreaux Sep 2019 A1
20190274709 Scoggins Sep 2019 A1
20190274710 Black Sep 2019 A1
20190274711 Scoggins et al. Sep 2019 A1
20190274712 Faller et al. Sep 2019 A1
20190274713 Scoggins et al. Sep 2019 A1
20190274714 Cuti et al. Sep 2019 A1
20190274716 Nott et al. Sep 2019 A1
20190274717 Nott et al. Sep 2019 A1
20190274718 Denzinger et al. Sep 2019 A1
20190274719 Stulen Sep 2019 A1
20190274720 Gee et al. Sep 2019 A1
20190274749 Brady et al. Sep 2019 A1
20190274750 Jayme et al. Sep 2019 A1
20190274752 Denzinger et al. Sep 2019 A1
20190278262 Taylor et al. Sep 2019 A1
20190282311 Nowlin et al. Sep 2019 A1
20190290389 Kopp Sep 2019 A1
20190298340 Shelton, IV et al. Oct 2019 A1
20190298341 Shelton, IV et al. Oct 2019 A1
20190298342 Shelton, IV et al. Oct 2019 A1
20190298343 Shelton, IV et al. Oct 2019 A1
20190298346 Shelton, IV et al. Oct 2019 A1
20190298347 Shelton, IV et al. Oct 2019 A1
20190298350 Shelton, IV et al. Oct 2019 A1
20190298351 Shelton, IV et al. Oct 2019 A1
20190298352 Shelton, IV et al. Oct 2019 A1
20190298353 Shelton, IV et al. Oct 2019 A1
20190298354 Shelton, IV et al. Oct 2019 A1
20190298355 Shelton, IV et al. Oct 2019 A1
20190298356 Shelton, IV et al. Oct 2019 A1
20190298357 Shelton, IV et al. Oct 2019 A1
20190298464 Abbott Oct 2019 A1
20190298481 Rosenberg et al. Oct 2019 A1
20190307520 Peine et al. Oct 2019 A1
20190311802 Kokubo et al. Oct 2019 A1
20190314015 Shelton, IV et al. Oct 2019 A1
20190314016 Huitema et al. Oct 2019 A1
20190314081 Brogna Oct 2019 A1
20190320929 Spencer et al. Oct 2019 A1
20190321117 Itkowitz et al. Oct 2019 A1
20190333626 Mansi et al. Oct 2019 A1
20190343594 Garcia Kilroy et al. Nov 2019 A1
20190365569 Skovgaard et al. Dec 2019 A1
20190374140 Tucker et al. Dec 2019 A1
20190374292 Barral et al. Dec 2019 A1
20190378610 Barral et al. Dec 2019 A1
20200000470 Du et al. Jan 2020 A1
20200000509 Hayashida et al. Jan 2020 A1
20200038120 Ziraknejad et al. Feb 2020 A1
20200046353 Deck et al. Feb 2020 A1
20200054317 Pisarnwongs et al. Feb 2020 A1
20200054320 Harris et al. Feb 2020 A1
20200054321 Harris et al. Feb 2020 A1
20200054322 Harris et al. Feb 2020 A1
20200054323 Harris et al. Feb 2020 A1
20200054326 Harris et al. Feb 2020 A1
20200054328 Harris et al. Feb 2020 A1
20200054330 Harris et al. Feb 2020 A1
20200078070 Henderson et al. Mar 2020 A1
20200078071 Asher Mar 2020 A1
20200078076 Henderson et al. Mar 2020 A1
20200078077 Henderson et al. Mar 2020 A1
20200078078 Henderson et al. Mar 2020 A1
20200078079 Morgan et al. Mar 2020 A1
20200078080 Henderson et al. Mar 2020 A1
20200078081 Jayme et al. Mar 2020 A1
20200078082 Henderson et al. Mar 2020 A1
20200078089 Henderson et al. Mar 2020 A1
20200078096 Barbagli et al. Mar 2020 A1
20200078106 Henderson et al. Mar 2020 A1
20200078110 Henderson et al. Mar 2020 A1
20200078111 Oberkircher et al. Mar 2020 A1
20200078112 Henderson et al. Mar 2020 A1
20200078113 Sawhney et al. Mar 2020 A1
20200078114 Asher et al. Mar 2020 A1
20200078115 Asher et al. Mar 2020 A1
20200078116 Oberkircher et al. Mar 2020 A1
20200078117 Henderson et al. Mar 2020 A1
20200078118 Henderson et al. Mar 2020 A1
20200078119 Henderson et al. Mar 2020 A1
20200078120 Aldridge et al. Mar 2020 A1
20200081585 Petre et al. Mar 2020 A1
20200090808 Carroll et al. Mar 2020 A1
20200100825 Henderson et al. Apr 2020 A1
20200100830 Henderson et al. Apr 2020 A1
20200106220 Henderson et al. Apr 2020 A1
20200162896 Su et al. May 2020 A1
20200168323 Bullington et al. May 2020 A1
20200178760 Kashima et al. Jun 2020 A1
20200178971 Harris et al. Jun 2020 A1
20200193600 Shameli et al. Jun 2020 A1
20200197027 Hershberger et al. Jun 2020 A1
20200203004 Shanbhag et al. Jun 2020 A1
20200214699 Shelton, IV et al. Jul 2020 A1
20200222079 Swaney et al. Jul 2020 A1
20200222149 Valentine et al. Jul 2020 A1
20200226751 Jin et al. Jul 2020 A1
20200230803 Yamashita et al. Jul 2020 A1
20200237372 Park Jul 2020 A1
20200261075 Boudreaux et al. Aug 2020 A1
20200261076 Boudreaux et al. Aug 2020 A1
20200261077 Shelton, IV et al. Aug 2020 A1
20200261078 Bakos et al. Aug 2020 A1
20200261080 Bakos et al. Aug 2020 A1
20200261081 Boudreaux et al. Aug 2020 A1
20200261082 Boudreaux et al. Aug 2020 A1
20200261083 Bakos et al. Aug 2020 A1
20200261084 Bakos et al. Aug 2020 A1
20200261085 Boudreaux et al. Aug 2020 A1
20200261086 Zeiner et al. Aug 2020 A1
20200261087 Timm et al. Aug 2020 A1
20200261088 Harris et al. Aug 2020 A1
20200261089 Shelton, IV et al. Aug 2020 A1
20200275928 Shelton, IV et al. Sep 2020 A1
20200275930 Harris et al. Sep 2020 A1
20200281665 Kopp Sep 2020 A1
20200305924 Carroll Oct 2020 A1
20200305945 Morgan et al. Oct 2020 A1
20200348662 Cella et al. Nov 2020 A1
20200352664 King et al. Nov 2020 A1
20200388385 De Los Reyes et al. Dec 2020 A1
20200405304 Mozdzierz et al. Dec 2020 A1
20200405375 Shelton, IV et al. Dec 2020 A1
20210000555 Shelton, IV et al. Jan 2021 A1
20210007760 Reisin Jan 2021 A1
20210015568 Liao et al. Jan 2021 A1
20210022731 Eisinger Jan 2021 A1
20210022738 Weir et al. Jan 2021 A1
20210022809 Crawford et al. Jan 2021 A1
20210059674 Shelton, IV et al. Mar 2021 A1
20210068834 Shelton, IV et al. Mar 2021 A1
20210076966 Grantcharov et al. Mar 2021 A1
20210128149 Whitfield et al. May 2021 A1
20210153889 Nott et al. May 2021 A1
20210169516 Houser et al. Jun 2021 A1
20210176179 Shelton, IV Jun 2021 A1
20210177452 Nott et al. Jun 2021 A1
20210177489 Yates et al. Jun 2021 A1
20210186454 Behzadi et al. Jun 2021 A1
20210192914 Shelton, IV et al. Jun 2021 A1
20210201646 Shelton, IV et al. Jul 2021 A1
20210205020 Shelton, IV et al. Jul 2021 A1
20210205021 Shelton, IV et al. Jul 2021 A1
20210205028 Shelton, IV et al. Jul 2021 A1
20210205029 Wiener et al. Jul 2021 A1
20210205030 Shelton, IV et al. Jul 2021 A1
20210205031 Shelton, IV et al. Jul 2021 A1
20210212602 Shelton, IV et al. Jul 2021 A1
20210212694 Shelton, IV et al. Jul 2021 A1
20210212717 Yates et al. Jul 2021 A1
20210212719 Houser et al. Jul 2021 A1
20210212770 Messerly et al. Jul 2021 A1
20210212771 Shelton, IV et al. Jul 2021 A1
20210212774 Shelton, IV et al. Jul 2021 A1
20210212775 Shelton, IV et al. Jul 2021 A1
20210212782 Shelton, IV et al. Jul 2021 A1
20210219976 DiNardo et al. Jul 2021 A1
20210220058 Messerly et al. Jul 2021 A1
20210240852 Shelton, IV et al. Aug 2021 A1
20210241898 Shelton, IV et al. Aug 2021 A1
20210249125 Morgan et al. Aug 2021 A1
20210251487 Shelton, IV et al. Aug 2021 A1
20210259687 Gonzalez et al. Aug 2021 A1
20210259697 Shelton, IV et al. Aug 2021 A1
20210259698 Shelton, IV et al. Aug 2021 A1
20210282780 Shelton, IV et al. Sep 2021 A1
20210282781 Shelton, IV et al. Sep 2021 A1
20210306176 Park et al. Sep 2021 A1
20210315579 Shelton, IV et al. Oct 2021 A1
20210315580 Shelton, IV et al. Oct 2021 A1
20210315581 Shelton, IV et al. Oct 2021 A1
20210315582 Shelton, IV et al. Oct 2021 A1
20210322014 Shelton, IV et al. Oct 2021 A1
20210322015 Shelton, IV et al. Oct 2021 A1
20210322017 Shelton, IV et al. Oct 2021 A1
20210322018 Shelton, IV et al. Oct 2021 A1
20210322019 Shelton, IV et al. Oct 2021 A1
20210322020 Shelton, IV et al. Oct 2021 A1
20210336939 Wiener et al. Oct 2021 A1
20210353287 Shelton, IV et al. Nov 2021 A1
20210353288 Shelton, IV et al. Nov 2021 A1
20210358599 Alvi et al. Nov 2021 A1
20210361284 Shelton, IV et al. Nov 2021 A1
20220000484 Shelton, IV et al. Jan 2022 A1
20220054158 Shelton, IV et al. Feb 2022 A1
20220079591 Bakos et al. Mar 2022 A1
20220157306 Albrecht et al. May 2022 A1
20220160438 Shelton, IV et al. May 2022 A1
20220175374 Shelton, IV et al. Jun 2022 A1
20220230738 Shelton, IV et al. Jul 2022 A1
20220241027 Shelton, IV et al. Aug 2022 A1
20220249097 Shelton, IV et al. Aug 2022 A1
20220323092 Shelton, IV et al. Oct 2022 A1
20220323095 Nott et al. Oct 2022 A1
20220323150 Yates et al. Oct 2022 A1
20220331011 Shelton, IV et al. Oct 2022 A1
20220331018 Parihar et al. Oct 2022 A1
20220346792 Shelton, IV et al. Nov 2022 A1
20220370117 Messerly et al. Nov 2022 A1
20220370126 Shelton, IV et al. Nov 2022 A1
20220374414 Shelton, IV et al. Nov 2022 A1
20220395276 Yates et al. Dec 2022 A1
20220401099 Shelton, IV et al. Dec 2022 A1
20220406452 Shelton, IV Dec 2022 A1
20220409302 Shelton, IV et al. Dec 2022 A1
20230000518 Nott et al. Jan 2023 A1
20230037577 Kimball et al. Feb 2023 A1
20230064821 Shelton, IV Mar 2023 A1
20230092371 Yates et al. Mar 2023 A1
20230098870 Harris et al. Mar 2023 A1
20230116571 Shelton, IV et al. Apr 2023 A1
20230146947 Shelton, IV et al. May 2023 A1
20230165642 Shelton, IV et al. Jun 2023 A1
20230171266 Brunner et al. Jun 2023 A1
20230171304 Shelton, IV et al. Jun 2023 A1
20230187060 Morgan et al. Jun 2023 A1
20230190390 Shelton, IV et al. Jun 2023 A1
20230200889 Shelton, IV et al. Jun 2023 A1
20230210611 Shelton, IV et al. Jul 2023 A1
Foreign Referenced Citations (137)
Number Date Country
2015201140 Mar 2015 AU
2709634 Jul 2009 CA
2795323 May 2014 CA
101617950 Jan 2010 CN
106027664 Oct 2016 CN
106413578 Feb 2017 CN
106456169 Feb 2017 CN
104490448 Mar 2017 CN
206097107 Apr 2017 CN
106777916 May 2017 CN
107811710 Mar 2018 CN
108652695 Oct 2018 CN
2037167 Jul 1980 DE
3016131 Oct 1981 DE
3824913 Feb 1990 DE
4002843 Apr 1991 DE
102005051367 Apr 2007 DE
102016207666 Nov 2017 DE
0000756 Oct 1981 EP
0408160 Jan 1991 EP
0473987 Mar 1992 EP
0929263 Jul 1999 EP
1214913 Jun 2002 EP
2730209 May 2014 EP
2732772 May 2014 EP
2942023 Nov 2015 EP
3047806 Jul 2016 EP
3056923 Aug 2016 EP
3095399 Nov 2016 EP
3120781 Jan 2017 EP
3135225 Mar 2017 EP
3141181 Mar 2017 EP
2838234 Oct 2003 FR
2509523 Jul 2014 GB
S5373315 Jun 1978 JP
S57185848 Nov 1982 JP
S58207752 Dec 1983 JP
S63315049 Dec 1988 JP
H06142113 May 1994 JP
H06178780 Jun 1994 JP
H06209902 Aug 1994 JP
H07132122 May 1995 JP
H08071072 Mar 1996 JP
H08332169 Dec 1996 JP
H0928663 Feb 1997 JP
H09154850 Jun 1997 JP
H11151247 Jun 1999 JP
H11197159 Jul 1999 JP
H11309156 Nov 1999 JP
2000058355 Feb 2000 JP
2001029353 Feb 2001 JP
2001195686 Jul 2001 JP
2001314411 Nov 2001 JP
2001340350 Dec 2001 JP
2002272758 Sep 2002 JP
2003061975 Mar 2003 JP
2003070921 Mar 2003 JP
2003153918 May 2003 JP
2004118664 Apr 2004 JP
2005111080 Apr 2005 JP
2005309702 Nov 2005 JP
2005348797 Dec 2005 JP
2006077626 Mar 2006 JP
2006117143 May 2006 JP
2006164251 Jun 2006 JP
2006280804 Oct 2006 JP
2006288431 Oct 2006 JP
2007123394 May 2007 JP
2007139822 Jun 2007 JP
2007300312 Nov 2007 JP
2009039515 Feb 2009 JP
2010057642 Mar 2010 JP
2010131265 Jun 2010 JP
2010269067 Dec 2010 JP
2012065698 Apr 2012 JP
2012239669 Dec 2012 JP
2012240158 Dec 2012 JP
2012533346 Dec 2012 JP
2013044303 Mar 2013 JP
2013081282 May 2013 JP
5191993 May 2013 JP
2013135738 Jul 2013 JP
2013144057 Jul 2013 JP
2014155207 Aug 2014 JP
2015085454 May 2015 JP
2016514017 May 2016 JP
2016528010 Sep 2016 JP
2016174836 Oct 2016 JP
2016214553 Dec 2016 JP
2017047022 Mar 2017 JP
2017096359 Jun 2017 JP
2017513561 Jun 2017 JP
2017526510 Sep 2017 JP
2017532168 Nov 2017 JP
20140104587 Aug 2014 KR
101587721 Jan 2016 KR
2020860 Oct 1994 RU
WO-9734533 Sep 1997 WO
WO-9808449 Mar 1998 WO
WO-0024322 May 2000 WO
WO-0108578 Feb 2001 WO
WO-0112089 Feb 2001 WO
WO-0120892 Mar 2001 WO
WO-03079909 Oct 2003 WO
WO-2006001264 Jan 2006 WO
WO-2007137304 Nov 2007 WO
WO-2008053485 May 2008 WO
WO-2008056618 May 2008 WO
WO-2008069816 Jun 2008 WO
WO-2008076079 Jun 2008 WO
WO-2008147555 Dec 2008 WO
WO-2011112931 Sep 2011 WO
WO-2013143573 Oct 2013 WO
WO-2014031800 Feb 2014 WO
WO-2014071184 May 2014 WO
WO-2014116961 Jul 2014 WO
WO-2014134196 Sep 2014 WO
WO-2015030157 Mar 2015 WO
WO-2015054665 Apr 2015 WO
WO-2015129395 Sep 2015 WO
WO-2016093049 Jun 2016 WO
WO-2016100719 Jun 2016 WO
WO-2016118752 Jul 2016 WO
WO-2016206015 Dec 2016 WO
WO-2017011382 Jan 2017 WO
WO-2017011646 Jan 2017 WO
WO-2017058617 Apr 2017 WO
WO-2017058695 Apr 2017 WO
WO-2017151996 Sep 2017 WO
WO-2017183353 Oct 2017 WO
WO-2017189317 Nov 2017 WO
WO-2017205308 Nov 2017 WO
WO-2017210499 Dec 2017 WO
WO-2017210501 Dec 2017 WO
WO-2018116247 Jun 2018 WO
WO-2018152141 Aug 2018 WO
WO-2018176414 Oct 2018 WO
Non-Patent Literature Citations (62)
Rajai et al. “Simultaneous measurement of refractive index and thickness of multilayer systems using Fourier domain optical coherence tomography, part 1: theory.” Journal of Biomedical Optics 22(1): Jan. 2017.
V. V. Tuchin. “Tissue Optics and Photonics: Light-Tissue Interaction.” J of Biomedical Photonics & Eng 1(2): Jun. 2015.
Anonymous: “Screwdriver—Wikipedia”, en.wikipedia.org, Jun. 23, 2019, XP055725151, Retrieved from the Internet: URL:https://en.wikipedia.org/w/index.php?title=Screwdriver&oldid=903111203 [retrieved on Mar. 20, 2021].
Nordlinger, Christopher, “The Internet of Things and the Operating Room of the Future,” May 4, 2015, https://medium.com/@chrisnordlinger/the-internet-of-things-and-the-operating-room-of-the-future-8999a143d7b1, retrieved from the internet on Apr. 27, 2021, 9 pages.
Sorrells, P., “Application Note AN680. Passive RFID Basics,” retrieved from http://ww1.microchip.com/downloads/en/AppNotes/00680b.pdf on Feb. 26, 2020, Dec. 31, 1998, pp. 1-7.
US 10,504,709, 8/2018, Karancsi et al. (withdrawn)
Flores et al., “Large-scale Offloading in the Internet of Things,” 2017 IEEE International Conference on Pervasive Computing and Communications Workshops (PERCOM Workshops), IEEE, pp. 479-484, Mar. 13, 2017.
Kalantarian et al., “Computation Offloading for Real-Time Health-Monitoring Devices,” 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), IEEE, pp. 4971-4974, Aug. 16, 2016.
Yuyi Mao et al., “A Survey on Mobile Edge Computing: The Communication Perspective,” IEEE Communications Surveys & Tutorials, pp. 2322-2358, Jun. 13, 2017.
Benkmann et al., “Concept of iterative optimization of minimally invasive surgery,” 2017 22nd International Conference on Methods and Models in Automation and Robotics (MMAR), IEEE, pp. 443-446, Aug. 28, 2017.
Trautman, Peter, “Breaking the Human-Robot Deadlock: Surpassing Shared Control Performance Limits with Sparse Human-Robot Interaction,” Robotics: Science and Systems XIII, pp. 1-10, Jul. 12, 2017.
Khazaei et al., “Health Informatics for Neonatal Intensive Care Units: An Analytical Modeling Perspective,” IEEE Journal of Translational Engineering in Health and Medicine, vol. 3, pp. 1-9, Oct. 21, 2015.
Yang et al., “A dynamic strategy for packet scheduling and bandwidth allocation based on channel quality in IEEE 802.16e OFDMA system,” Journal of Network and Computer Applications, vol. 39, pp. 52-60, May 2, 2013.
Takahashi et al., “Automatic smoke evacuation in laparoscopic surgery: a simplified method for objective evaluation,” Surgical Endoscopy, vol. 27, No. 8, pp. 2980-2987, Feb. 23, 2013.
Miksch et al., “Utilizing temporal data abstraction for data validation and therapy planning for artificially ventilated newborn infants,” Artificial Intelligence in Medicine, vol. 8, No. 6, pp. 543-576 (1996).
Horn et al., “Effective data validation of high-frequency data: Time-point-, time-interval-, and trend-based methods,” Computers in Biology and Medicine, New York, NY, vol. 27, No. 5, pp. 389-409 (1997).
Stacey et al., “Temporal abstraction in intelligent clinical data analysis: A survey,” Artificial Intelligence in Medicine, vol. 39, No. 1, pp. 1-24 (2006).
Zoccali, Bruno, “A Method for Approximating Component Temperatures at Altitude Conditions Based on CFD Analysis at Sea Level Conditions,” (white paper), www.tdmginc.com, Dec. 6, 2018 (9 pages).
Slocinski et al., “Distance measure for impedance spectra for quantified evaluations,” Lecture Notes on Impedance Spectroscopy, vol. 3, Taylor and Francis Group (Jul. 2012)—Book Not Attached.
Engel et al., “A safe robot system for craniofacial surgery,” Proceedings of the 2001 IEEE International Conference on Robotics and Automation (ICRA), May 21-26, 2001, Seoul, Korea, vol. 2, pp. 2020-2024.
Bonaci et al., “To Make a Robot Secure: An Experimental Analysis of Cyber Security Threats Against Teleoperated Surgical Robots,” May 13, 2015. Retrieved from the Internet: URL:https://arxiv.org/pdf/1504.04339v2.pdf [retrieved on Aug. 24, 2019].
Homa Alemzadeh et al., “Targeted Attacks on Teleoperated Surgical Robots: Dynamic Model-Based Detection and Mitigation,” 2016 46th Annual IEEE/IFIP International Conference on Dependable Systems and Networks (DSN), IEEE, Jun. 28, 2016, pp. 395-406.
Phumzile Malindi, “5. QoS in Telemedicine,” “Telemedicine,” Jun. 20, 2011, IntechOpen, pp. 119-138.
Staub et al., “Contour-based Surgical Instrument Tracking Supported by Kinematic Prediction,” Proceedings of the 2010 3rd IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics, Sep. 1, 2010, pp. 746-752.
Allan et al., “3-D Pose Estimation of Articulated Instruments in Robotic Minimally Invasive Surgery,” IEEE Transactions on Medical Imaging, vol. 37, No. 5, May 1, 2018, pp. 1204-1213.
Kassahun et al., “Surgical Robotics Beyond Enhanced Dexterity Instrumentation: A Survey of the Machine Learning Techniques and their Role in Intelligent and Autonomous Surgical Actions.” International Journal of Computer Assisted Radiology and Surgery, vol. 11, No. 4, Oct. 8, 2015, pp. 553-568.
Weede et al., “An Intelligent and Autonomous Endoscopic Guidance System for Minimally Invasive Surgery,” 2011 IEEE International Conference on Robotics and Automation (ICRA), May 9-13, 2011, Shanghai, China, pp. 5762-5768.
Altenberg et al., “Genes of Glycolysis are Ubiquitously Overexpressed in 24 Cancer Classes,” Genomics, vol. 84, pp. 1014-1020 (2004).
Harold I. Brandon and V. Leroy Young, Surgical Services Management, vol. 3, No. 3, Mar. 1997, retrieved from the internet <https://www.surgimedics.com/Research%20Articles/Electrosurgical%20Plume/Characterization%20And%20Removal%20Of%20Electrosurgical%20Smoke.pdf> (Year: 1997).
Marshall Brain, How Microcontrollers Work, 2006, retrieved from the internet <https://web.archive.org/web/20060221235221/http://electronics.howstuffworks.com/microcontroller.htm/printable> (Year: 2006).
CRC Press, “The Measurement, Instrumentation and Sensors Handbook,” 1999, Section VII, Chapter 41, Peter O'Shea, “Phase Measurement,” pp. 1303-1321, ISBN 0-8493-2145-X.
Jiang, “‘Sound of Silence’: a secure indoor wireless ultrasonic communication system,” Article, 2014, pp. 46-50, Snapshots of Doctoral Research at University College Cork, School of Engineering—Electrical & Electronic Engineering, UCC, Cork, Ireland.
Li, et al., “Short-range ultrasonic communications in air using quadrature modulation,” Journal, Oct. 30, 2009, pp. 2060-2072, IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, vol. 56, No. 10, IEEE.
Salamon, “AI Detects Polyps Better Than Colonoscopists,” Online Article, Jun. 3, 2018, Medscape Medical News, Digestive Disease Week (DDW) 2018: Presentation 133.
Misawa, et al., “Artificial Intelligence-Assisted Polyp Detection for Colonoscopy: Initial Experience,” Article, Jun. 2018, pp. 2027-2029, vol. 154, Issue 8, American Gastroenterological Association.
Dottorato, “Analysis and Design of the Rectangular Microstrip Patch Antennas for TM0n0 operating mode,” Article, Oct. 8, 2010, pp. 1-9, Microwave Journal.
Miller, et al., “Impact of Powered and Tissue-Specific Endoscopic Stapling Technology on Clinical and Economic Outcomes of Video-Assisted Thoracic Surgery Lobectomy Procedures: A Retrospective, Observational Study,” Article, Apr. 2018, pp. 707-723, vol. 35 (Issue 5), Advances in Therapy.
Hsiao-Wei Tang, “ARCM”, Video, Sep. 2012, YouTube, 5 screenshots, Retrieved from internet: <https://www.youtube.com/watch?v=UldQaxb3fRw&feature=youtu.be>.
Giannios, et al., “Visible to near-infrared refractive properties of freshly-excised human-liver tissues: marking hepatic malignancies,” Article, Jun. 14, 2016, pp. 1-10, Scientific Reports 6, Article No. 27910, Nature.
Vander Heiden, et al., “Understanding the Warburg effect: the metabolic requirements of cell proliferation,” Article, May 22, 2009, pp. 1-12, vol. 324, Issue 5930, Science.
Hirayama et al., “Quantitative Metabolome Profiling of Colon and Stomach Cancer Microenvironment by Capillary Electrophoresis Time-of-Flight Mass Spectrometry,” Article, Jun. 2009, pp. 4918-4925, vol. 69, Issue 11, Cancer Research.
Cengiz, et al., “A Tale of Two Compartments: Interstitial Versus Blood Glucose Monitoring,” Article, Jun. 2009, pp. S11-S16, vol. 11, Supplement 1, Diabetes Technology & Therapeutics.
Shen, et al., “An iridium nanoparticles dispersed carbon based thick film electrochemical biosensor and its application for a single use, disposable glucose biosensor,” Article, Feb. 3, 2007, pp. 106-113, vol. 125, Issue 1, Sensors and Actuators B: Chemical, Science Direct.
“ATM-MPLS Network Interworking Version 2.0, af-aic-0178.001” ATM Standard, The ATM Forum Technical Committee, published Aug. 2003.
IEEE Std 802.3-2012 (Revision of IEEE Std 802.3-2008), published Dec. 28, 2012.
IEEE Std No. 177, “Standard Definitions and Methods of Measurement for Piezoelectric Vibrators,” published May 1966, The Institute of Electrical and Electronics Engineers, Inc., New York, N.Y.
Shi et al., An intuitive control console for robotic surgery system, 2014, IEEE, pp. 404-407 (Year: 2014).
Choi et al., A haptic augmented reality surgeon console for a laparoscopic surgery robot system, 2013, IEEE, pp. 355-357 (Year: 2013).
Xie et al., Development of stereo vision and master-slave controller for a compact surgical robot system, 2015, IEEE, pp. 403-407 (Year: 2015).
Sun et al., Innovative effector design for simulation training in robotic surgery, 2010, IEEE, pp. 1735-1759 (Year: 2010).
Anonymous, “Internet of Things Powers Connected Surgical Device Infrastructure Case Study”, Dec. 31, 2016 (Dec. 31, 2016), Retrieved from the Internet: URL:https://www.cognizant.com/services-resources/150110_IoT_connected_surgical_devices.pdf.
Draijer, Matthijs et al., “Review of laser speckle contrast techniques for visualizing tissue perfusion,” Lasers in Medical Science, Springer-Verlag, vol. 24, No. 4, Dec. 3, 2008, pp. 639-651.
Roy D. Cullum, “Handbook of Engineering Design,” ISBN: 9780408005586, Jan. 1, 1988, XP055578597, 10-20, Chapter 6, p. 138, right-hand column, paragraph 3.
Nabil Simaan et al., “Intelligent Surgical Robots with Situational Awareness: From Good to Great Surgeons,” DOI: 10.1115/1.2015-Sep-6, Sep. 2015, pp. 3-6, Retrieved from the Internet: URL:http://memagazineselect.asmedigitalcollection.asme.org/data/journals/meena/936888/me-2015-sep6.pdf XP055530863.
Anonymous: “Titanium Key Chain Tool 1.1, Ultralight Multipurpose Key Chain Tool, Forward Cutting Can Opener—Vargo Titanium,” vargooutdoors.com, Jul. 5, 2014 (Jul. 5, 2014), retrieved from the internet: https://vargooutdoors.com/titanium-key-chain-tool-1-1.html.
“Surgical instrumentation: the true cost of instrument trays and a potential strategy for optimization”; Mhlaba et al.; Sep. 23, 2015 (Year: 2015).
Hu, Jinwen, Simulations of adaptive temperature control with self-focused hyperthermia system for tumor treatment, Jan. 9, 2012, Ultrasonics 53, pp. 171-177 (Year: 2012).
Hussain et al., “A survey on resource allocation in high performance distributed computing systems”, Parallel Computing, vol. 39, No. 11, pp. 709-736 (2013).
Lalys, et al., “Automatic knowledge-based recognition of low-level tasks in ophthalmological procedures”, Int J CARS, vol. 8, No. 1, pp. 1-49, Apr. 19, 2012.
Anonymous: “Quality of service—Wikipedia”, Dec. 7, 2017, Retrieved from the Internet: URL:https://en.wikipedia.org/w/index.php?title=Quality_of_service&oldid=814298744#Applications [retrieved on Feb. 14, 2023], pp. 1-12.
Anonymous: “Differentiated services—Wikipedia”, Dec. 14, 2017, Retrieved from the Internet: URL:https://en.wikipedia.org/w/index.php?title=Differentiated_services&oldid=815415620 [retrieved on Feb. 14, 2023], pp. 1-7.
Anonymous: “Cloud computing—Wikipedia”, Dec. 19, 2017, Retrieved from the Internet: URL:https://en.wikipedia.org/w/index.php?title=Cloud_computing&oldid=816206558 [retrieved Feb. 14, 2023], pp. 1-21.
Related Publications (1)
Number Date Country
20190200905 A1 Jul 2019 US
Provisional Applications (4)
Number Date Country
62649291 Mar 2018 US
62611339 Dec 2017 US
62611341 Dec 2017 US
62611340 Dec 2017 US