The present disclosure relates to various surgical systems. Surgical procedures are typically performed in surgical operating theaters or rooms in a healthcare facility such as, for example, a hospital. A sterile field is typically created around the patient. The sterile field may include the scrubbed team members, who are properly attired, and all furniture and fixtures in the area. Various surgical devices and systems are utilized in performance of a surgical procedure.
In one aspect the present disclosure provides a computer-implemented method for collecting data within a facility. The method comprises: receiving, by a computer system, perioperative data from a plurality of surgical devices located within the facility, the perioperative data associated with a plurality of surgical procedures performed in the facility; determining, by the computer system, procedural context data associated with the plurality of surgical procedures based at least in part on the perioperative data; aggregating, by the computer system, the perioperative data according to the procedural context data; and determining, by the computer system, trends associated with the surgical procedures performed in the facility according to the perioperative data and the procedural context data.
In another aspect the present disclosure provides a computer-implemented method for collecting data within a facility. The method comprises: receiving, by a computer system, perioperative data from a plurality of surgical devices located within the facility, the perioperative data associated with a plurality of surgical procedures performed in the facility; receiving, by the computer system, images of the facility and any staff members or surgical devices located therein from a plurality of cameras located within the facility; determining, by the computer system, procedural context data associated with the plurality of surgical procedures based at least in part on the perioperative data and the images; aggregating, by the computer system, the perioperative data according to the procedural context data; and determining, by the computer system, trends associated with the surgical procedures performed in the facility according to the perioperative data and the procedural context data.
In another aspect the present disclosure provides a computer-implemented method for collecting data within a facility. The method comprises: receiving, by a computer system, perioperative data from a plurality of surgical devices located within the facility, the perioperative data associated with a plurality of surgical procedures performed in the facility; receiving, by the computer system, images of the facility and any staff members or surgical devices located therein from a plurality of cameras located within the facility; receiving, by the computer system, patient data from a patient database; receiving, by the computer system, physiological data from a plurality of patient monitors; determining, by the computer system, procedural context data associated with the plurality of surgical procedures based at least in part on the perioperative data, the images, the patient data, and the physiological data; aggregating, by the computer system, the perioperative data according to the procedural context data; and determining, by the computer system, trends associated with the surgical procedures performed in the facility according to the perioperative data and the procedural context data.
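By way of a non-limiting illustration, the data flow recited in the foregoing aspects may be sketched in Python as follows. The record fields, function names, and grouping key below are illustrative assumptions and do not form part of the method itself; the sketch merely shows one way a computer system might infer procedural context, aggregate perioperative data by that context, and derive per-context trends.

```python
from collections import defaultdict
from statistics import mean

# Illustrative sketch only; record fields and grouping keys are assumptions.
def determine_context(record):
    """Infer procedural context (e.g., procedure type) from perioperative data."""
    # In practice this could combine device data, images, patient data, and
    # physiological data; here we simply read a reported procedure type.
    return record.get("procedure_type", "unknown")

def aggregate_by_context(records):
    """Group perioperative records by their inferred procedural context."""
    groups = defaultdict(list)
    for record in records:
        groups[determine_context(record)].append(record)
    return groups

def determine_trends(groups):
    """Compute a simple per-context trend, e.g., average procedure duration."""
    return {
        context: mean(r["duration_min"] for r in recs)
        for context, recs in groups.items()
    }

if __name__ == "__main__":
    perioperative_data = [
        {"procedure_type": "colorectal", "duration_min": 95},
        {"procedure_type": "colorectal", "duration_min": 110},
        {"procedure_type": "thoracic", "duration_min": 180},
    ]
    trends = determine_trends(aggregate_by_context(perioperative_data))
    print(trends)  # e.g., {'colorectal': 102.5, 'thoracic': 180}
```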
The various aspects described herein, both as to organization and methods of operation, together with further objects and advantages thereof, may best be understood by reference to the following description, taken in conjunction with the accompanying drawings as follows.
Applicant of the present application owns the following U.S. patent applications, filed on Dec. 4, 2018, the disclosure of each of which is herein incorporated by reference in its entirety:
Applicant of the present application owns the following U.S. patent applications, filed on Nov. 6, 2018, the disclosure of each of which is herein incorporated by reference in its entirety:
Applicant of the present application owns the following U.S. patent applications that were filed on Oct. 26, 2018, the disclosure of each of which is herein incorporated by reference in its entirety:
Applicant of the present application owns the following U.S. patent applications, filed on Aug. 28, 2018, the disclosure of each of which is herein incorporated by reference in its entirety:
Applicant of the present application owns the following U.S. patent applications, filed on Aug. 24, 2018, the disclosure of each of which is herein incorporated by reference in its entirety:
Applicant of the present application owns the following U.S. patent applications, filed on Jun. 29, 2018, the disclosure of each of which is herein incorporated by reference in its entirety:
Applicant of the present application owns the following U.S. patent applications, filed on Mar. 29, 2018, the disclosure of each of which is herein incorporated by reference in its entirety:
Applicant of the present application owns the following U.S. Provisional Patent Applications, filed on Mar. 8, 2018, the disclosure of each of which is herein incorporated by reference in its entirety:
Before explaining various aspects of surgical devices and generators in detail, it should be noted that the illustrative examples are not limited in application or use to the details of construction and arrangement of parts illustrated in the accompanying drawings and description. The illustrative examples may be implemented or incorporated in other aspects, variations and modifications, and may be practiced or carried out in various ways. Further, unless otherwise indicated, the terms and expressions employed herein have been chosen for the purpose of describing the illustrative examples for the convenience of the reader and are not for the purpose of limitation thereof. Also, it will be appreciated that one or more of the following-described aspects, expressions of aspects, and/or examples, can be combined with any one or more of the other following-described aspects, expressions of aspects and/or examples.
Referring to
Other types of robotic systems can be readily adapted for use with the surgical system 102. Various examples of robotic systems and surgical tools that are suitable for use with the present disclosure are described in U.S. Provisional Patent Application Ser. No. 62/611,339, titled ROBOT ASSISTED SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety.
Various examples of cloud-based analytics that are performed by the cloud 104, and are suitable for use with the present disclosure, are described in U.S. Provisional Patent Application Ser. No. 62/611,340, titled CLOUD-BASED MEDICAL ANALYTICS, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety.
In various aspects, the imaging device 124 includes at least one image sensor and one or more optical components. Suitable image sensors include, but are not limited to, Charge-Coupled Device (CCD) sensors and Complementary Metal-Oxide Semiconductor (CMOS) sensors.
The optical components of the imaging device 124 may include one or more illumination sources and/or one or more lenses. The one or more illumination sources may be directed to illuminate portions of the surgical field. The one or more image sensors may receive light reflected or refracted from the surgical field, including light reflected or refracted from tissue and/or surgical instruments.
The one or more illumination sources may be configured to radiate electromagnetic energy in the visible spectrum as well as the invisible spectrum. The visible spectrum, sometimes referred to as the optical spectrum or luminous spectrum, is that portion of the electromagnetic spectrum that is visible to (i.e., can be detected by) the human eye and may be referred to as visible light or simply light. A typical human eye will respond to wavelengths in air that are from about 380 nm to about 750 nm.
The invisible spectrum (i.e., the non-luminous spectrum) is that portion of the electromagnetic spectrum that lies below and above the visible spectrum (i.e., wavelengths below about 380 nm and above about 750 nm). The invisible spectrum is not detectable by the human eye. Wavelengths greater than about 750 nm are longer than the red visible spectrum, and they become invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths less than about 380 nm are shorter than the violet spectrum, and they become invisible ultraviolet, x-ray, and gamma ray electromagnetic radiation.
In various aspects, the imaging device 124 is configured for use in a minimally invasive procedure. Examples of imaging devices suitable for use with the present disclosure include, but are not limited to, an arthroscope, angioscope, bronchoscope, choledochoscope, colonoscope, cystoscope, duodenoscope, enteroscope, esophagogastro-duodenoscope (gastroscope), endoscope, laryngoscope, nasopharyngoscope, nephroscope, sigmoidoscope, thoracoscope, and ureteroscope.
In one aspect, the imaging device employs multi-spectrum monitoring to discriminate topography and underlying structures. A multi-spectral image is one that captures image data within specific wavelength ranges across the electromagnetic spectrum. The wavelengths may be separated by filters or by the use of instruments that are sensitive to particular wavelengths, including light from frequencies beyond the visible light range, e.g., IR and ultraviolet. Spectral imaging can allow extraction of additional information the human eye fails to capture with its receptors for red, green, and blue. The use of multi-spectral imaging is described in greater detail under the heading “Advanced Imaging Acquisition Module” in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of which is herein incorporated by reference in its entirety. Multi-spectrum monitoring can be a useful tool in relocating a surgical field after a surgical task is completed to perform one or more of the previously described tests on the treated tissue.
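By way of a non-limiting illustration only, the band-selection idea underlying multi-spectral imaging may be sketched as follows; the cube dimensions, wavelength values, and false-color mapping are assumptions for illustration and are unrelated to the actual construction of the imaging device 124.

```python
import numpy as np

# Hypothetical multi-spectral cube: height x width x bands, with one image
# plane per wavelength band (values and shapes are illustrative assumptions).
wavelengths_nm = [450, 550, 650, 850]          # blue, green, red, near-IR
cube = np.random.rand(480, 640, len(wavelengths_nm))

def band(cube, wavelengths, target_nm):
    """Return the image plane whose band is closest to the requested wavelength."""
    idx = min(range(len(wavelengths)), key=lambda i: abs(wavelengths[i] - target_nm))
    return cube[:, :, idx]

# A false-color composite that maps a near-IR band (invisible to the eye)
# into the red display channel to highlight structures the eye cannot capture.
false_color = np.dstack([band(cube, wavelengths_nm, 850),
                         band(cube, wavelengths_nm, 550),
                         band(cube, wavelengths_nm, 450)])
print(false_color.shape)  # (480, 640, 3)
```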
It is axiomatic that strict sterilization of the operating room and surgical equipment is required during any surgery. The strict hygiene and sterilization conditions required in a “surgical theater,” i.e., an operating or treatment room, necessitate the highest possible sterility of all medical devices and equipment. Part of that sterilization process is the need to sterilize anything that comes in contact with the patient or penetrates the sterile field, including the imaging device 124 and its attachments and components. It will be appreciated that the sterile field may be considered a specified area, such as within a tray or on a sterile towel, that is considered free of microorganisms, or the sterile field may be considered an area, immediately around a patient, who has been prepared for a surgical procedure. The sterile field may include the scrubbed team members, who are properly attired, and all furniture and fixtures in the area.
In various aspects, the visualization system 108 includes one or more imaging sensors, one or more image-processing units, one or more storage arrays, and one or more displays that are strategically arranged with respect to the sterile field, as illustrated in
As illustrated in
In one aspect, the hub 106 is also configured to route a diagnostic input or feedback entered by a non-sterile operator at the visualization tower 111 to the primary display 119 within the sterile field, where it can be viewed by a sterile operator at the operating table. In one example, the input can be in the form of a modification to the snapshot displayed on the non-sterile display 107 or 109, which can be routed to the primary display 119 by the hub 106.
Referring to
Referring now to
During a surgical procedure, energy application to tissue, for sealing and/or cutting, is generally associated with smoke evacuation, suction of excess fluid, and/or irrigation of the tissue. Fluid, power, and/or data lines from different sources are often entangled during the surgical procedure. Valuable time can be lost addressing this issue during a surgical procedure. Detangling the lines may necessitate disconnecting the lines from their respective modules, which may require resetting the modules. The hub modular enclosure 136 offers a unified environment for managing the power, data, and fluid lines, which reduces the frequency of entanglement between such lines.
Aspects of the present disclosure present a surgical hub for use in a surgical procedure that involves energy application to tissue at a surgical site. The surgical hub includes a hub enclosure and a combo generator module slidably receivable in a docking station of the hub enclosure. The docking station includes data and power contacts. The combo generator module includes two or more of an ultrasonic energy generator component, a bipolar RF energy generator component, and a monopolar RF energy generator component that are housed in a single unit. In one aspect, the combo generator module also includes a smoke evacuation component, at least one energy delivery cable for connecting the combo generator module to a surgical instrument, at least one smoke evacuation component configured to evacuate smoke, fluid, and/or particulates generated by the application of therapeutic energy to the tissue, and a fluid line extending from the remote surgical site to the smoke evacuation component.
In one aspect, the fluid line is a first fluid line and a second fluid line extends from the remote surgical site to a suction and irrigation module slidably received in the hub enclosure. In one aspect, the hub enclosure comprises a fluid interface.
Certain surgical procedures may require the application of more than one energy type to the tissue. One energy type may be more beneficial for cutting the tissue, while another different energy type may be more beneficial for sealing the tissue. For example, a bipolar generator can be used to seal the tissue while an ultrasonic generator can be used to cut the sealed tissue. Aspects of the present disclosure present a solution where a hub modular enclosure 136 is configured to accommodate different generators, and facilitate an interactive communication therebetween. One of the advantages of the hub modular enclosure 136 is enabling the quick removal and/or replacement of various modules.
Aspects of the present disclosure present a modular surgical enclosure for use in a surgical procedure that involves energy application to tissue. The modular surgical enclosure includes a first energy-generator module, configured to generate a first energy for application to the tissue, and a first docking station comprising a first docking port that includes first data and power contacts, wherein the first energy-generator module is slidably movable into an electrical engagement with the first power and data contacts and wherein the first energy-generator module is slidably movable out of the electrical engagement with the first power and data contacts.
Further to the above, the modular surgical enclosure also includes a second energy-generator module configured to generate a second energy, different than the first energy, for application to the tissue, and a second docking station comprising a second docking port that includes second data and power contacts, wherein the second energy-generator module is slidably movable into an electrical engagement with the second power and data contacts, and wherein the second energy-generator module is slidably movable out of the electrical engagement with the second power and data contacts.
In addition, the modular surgical enclosure also includes a communication bus between the first docking port and the second docking port, configured to facilitate communication between the first energy-generator module and the second energy-generator module.
Referring to
In one aspect, the hub modular enclosure 136 comprises a modular power and communication backplane 149 with external and wireless communication headers to enable the removable attachment of the modules 140, 126, 128 and interactive communication therebetween.
In one aspect, the hub modular enclosure 136 includes docking stations 151, herein also referred to as drawers, which are configured to slidably receive the modules 140, 126, 128.
In various aspects, the smoke evacuation module 126 includes a fluid line 154 that conveys captured/collected smoke and/or fluid away from a surgical site and to, for example, the smoke evacuation module 126. Vacuum suction originating from the smoke evacuation module 126 can draw the smoke into an opening of a utility conduit at the surgical site. The utility conduit, coupled to the fluid line, can be in the form of a flexible tube terminating at the smoke evacuation module 126. The utility conduit and the fluid line define a fluid path extending toward the smoke evacuation module 126 that is received in the hub enclosure 136.
In various aspects, the suction/irrigation module 128 is coupled to a surgical tool comprising an aspiration fluid line and a suction fluid line. In one example, the aspiration and suction fluid lines are in the form of flexible tubes extending from the surgical site toward the suction/irrigation module 128. One or more drive systems can be configured to cause irrigation and aspiration of fluids to and from the surgical site.
In one aspect, the surgical tool includes a shaft having an end effector at a distal end thereof and at least one energy delivery implement associated with the end effector, an aspiration tube, and an irrigation tube. The aspiration tube can have an inlet port at a distal end thereof and the aspiration tube extends through the shaft. Similarly, an irrigation tube can extend through the shaft and can have an inlet port in proximity to the energy delivery implement. The energy delivery implement is configured to deliver ultrasonic and/or RF energy to the surgical site and is coupled to the generator module 140 by a cable extending initially through the shaft.
The irrigation tube can be in fluid communication with a fluid source, and the aspiration tube can be in fluid communication with a vacuum source. The fluid source and/or the vacuum source can be housed in the suction/irrigation module 128. In one example, the fluid source and/or the vacuum source can be housed in the hub enclosure 136 separately from the suction/irrigation module 128. In such example, a fluid interface can be configured to connect the suction/irrigation module 128 to the fluid source and/or the vacuum source.
In one aspect, the modules 140, 126, 128 and/or their corresponding docking stations on the hub modular enclosure 136 may include alignment features that are configured to align the docking ports of the modules into engagement with their counterparts in the docking stations of the hub modular enclosure 136. For example, as illustrated in
In some aspects, the drawers 151 of the hub modular enclosure 136 are the same, or substantially the same size, and the modules are adjusted in size to be received in the drawers 151. For example, the side brackets 155 and/or 156 can be larger or smaller depending on the size of the module. In other aspects, the drawers 151 are different in size and are each designed to accommodate a particular module.
Furthermore, the contacts of a particular module can be keyed for engagement with the contacts of a particular drawer to avoid inserting a module into a drawer with mismatching contacts.
As illustrated in
In various aspects, the imaging module 138 comprises an integrated video processor and a modular light source and is adapted for use with various imaging devices. In one aspect, the imaging device is comprised of a modular housing that can be assembled with a light source module and a camera module. The housing can be a disposable housing. In at least one example, the disposable housing is removably coupled to a reusable controller, a light source module, and a camera module. The light source module and/or the camera module can be selectively chosen depending on the type of surgical procedure. In one aspect, the camera module comprises a CCD sensor. In another aspect, the camera module comprises a CMOS sensor. In another aspect, the camera module is configured for scanned beam imaging. Likewise, the light source module can be configured to deliver a white light or a different light, depending on the surgical procedure.
During a surgical procedure, removing a surgical device from the surgical field and replacing it with another surgical device that includes a different camera or a different light source can be inefficient. Temporarily losing sight of the surgical field may lead to undesirable consequences. The modular imaging device of the present disclosure is configured to permit the replacement of a light source module or a camera module midstream during a surgical procedure, without having to remove the imaging device from the surgical field.
In one aspect, the imaging device comprises a tubular housing that includes a plurality of channels. A first channel is configured to slidably receive the camera module, which can be configured for a snap-fit engagement with the first channel. A second channel is configured to slidably receive the light source module, which can be configured for a snap-fit engagement with the second channel. In another example, the camera module and/or the light source module can be rotated into a final position within their respective channels. A threaded engagement can be employed in lieu of the snap-fit engagement.
In various examples, multiple imaging devices are placed at different positions in the surgical field to provide multiple views. The imaging module 138 can be configured to switch between the imaging devices to provide an optimal view. In various aspects, the imaging module 138 can be configured to integrate the images from the different imaging devices.
Various image processors and imaging devices suitable for use with the present disclosure are described in U.S. Pat. No. 7,995,045, titled COMBINED SBI AND CONVENTIONAL IMAGE PROCESSOR, which issued on Aug. 9, 2011, and which is herein incorporated by reference in its entirety. In addition, U.S. Pat. No. 7,982,776, titled SBI MOTION ARTIFACT REMOVAL APPARATUS AND METHOD, which issued on Jul. 19, 2011, and which is herein incorporated by reference in its entirety, describes various systems for removing motion artifacts from image data. Such systems can be integrated with the imaging module 138. Furthermore, U.S. Patent Application Publication No. 2011/0306840, titled CONTROLLABLE MAGNETIC SOURCE TO FIXTURE INTRACORPOREAL APPARATUS, which published on Dec. 15, 2011, and U.S. Pat. No. 10,098,527, titled SYSTEM FOR PERFORMING A MINIMALLY INVASIVE SURGICAL PROCEDURE, which issued on Oct. 16, 2018, are each herein incorporated by reference in its entirety.
Modular devices 1a-1n located in the operating theater may be coupled to the modular communication hub 203. The network hub 207 and/or the network switch 209 may be coupled to a network router 211 to connect the devices 1a-1n to the cloud 204 or the local computer system 210. Data associated with the devices 1a-1n may be transferred to cloud-based computers via the router for remote data processing and manipulation. Data associated with the devices 1a-1n may also be transferred to the local computer system 210 for local data processing and manipulation. Modular devices 2a-2m located in the same operating theater also may be coupled to a network switch 209. The network switch 209 may be coupled to the network hub 207 and/or the network router 211 to connect the devices 2a-2m to the cloud 204. Data associated with the devices 2a-2m may be transferred to the cloud 204 via the network router 211 for data processing and manipulation. Data associated with the devices 2a-2m may also be transferred to the local computer system 210 for local data processing and manipulation.
It will be appreciated that the surgical data network 201 may be expanded by interconnecting multiple network hubs 207 and/or multiple network switches 209 with multiple network routers 211. The modular communication hub 203 may be contained in a modular control tower configured to receive multiple devices 1a-1n/2a-2m. The local computer system 210 also may be contained in a modular control tower. The modular communication hub 203 is connected to a display 212 to display images obtained by some of the devices 1a-1n/2a-2m, for example during surgical procedures. In various aspects, the devices 1a-1n/2a-2m may include, for example, various modules such as an imaging module 138 coupled to an endoscope, a generator module 140 coupled to an energy-based surgical device, a smoke evacuation module 126, a suction/irrigation module 128, a communication module 130, a processor module 132, a storage array 134, a surgical device coupled to a display, and/or a non-contact sensor module, among other modular devices that may be connected to the modular communication hub 203 of the surgical data network 201.
In one aspect, the surgical data network 201 may comprise a combination of network hub(s), network switch(es), and network router(s) connecting the devices 1a-1n/2a-2m to the cloud. Any one of or all of the devices 1a-1n/2a-2m coupled to the network hub or network switch may collect data in real time and transfer the data to cloud computers for data processing and manipulation. It will be appreciated that cloud computing relies on sharing computing resources rather than having local servers or personal devices to handle software applications. The word “cloud” may be used as a metaphor for “the Internet,” although the term is not limited as such. Accordingly, the term “cloud computing” may be used herein to refer to “a type of Internet-based computing,” where different services—such as servers, storage, and applications—are delivered to the modular communication hub 203 and/or computer system 210 located in the surgical theater (e.g., a fixed, mobile, temporary, or field operating room or space) and to devices connected to the modular communication hub 203 and/or computer system 210 through the Internet. The cloud infrastructure may be maintained by a cloud service provider. In this context, the cloud service provider may be the entity that coordinates the usage and control of the devices 1a-1n/2a-2m located in one or more operating theaters. The cloud computing services can perform a large number of calculations based on the data gathered by smart surgical instruments, robots, and other computerized devices located in the operating theater. The hub hardware enables multiple devices or connections to be connected to a computer that communicates with the cloud computing resources and storage.
By applying cloud-based data processing techniques to the data collected by the devices 1a-1n/2a-2m, the surgical data network can provide improved surgical outcomes, reduced costs, and improved patient satisfaction. At least some of the devices 1a-1n/2a-2m may be employed to view tissue states to assess leaks or perfusion of sealed tissue after a tissue sealing and cutting procedure. At least some of the devices 1a-1n/2a-2m may be employed to identify pathology, such as the effects of diseases, using cloud-based computing to examine data including images of samples of body tissue for diagnostic purposes. This includes localization and margin confirmation of tissue and phenotypes. At least some of the devices 1a-1n/2a-2m may be employed to identify anatomical structures of the body using a variety of sensors integrated with imaging devices and techniques such as overlaying images captured by multiple imaging devices. The data gathered by the devices 1a-1n/2a-2m, including image data, may be transferred to the cloud 204 or the local computer system 210 or both for data processing and manipulation, including image processing and manipulation. The data may be analyzed to improve surgical procedure outcomes by determining if further treatment, such as the application of endoscopic intervention, emerging technologies, targeted radiation, targeted intervention, and precise robotics to tissue-specific sites and conditions, may be pursued. Such data analysis may further employ outcome analytics processing and, using standardized approaches, may provide beneficial feedback to either confirm surgical treatments and the behavior of the surgeon or suggest modifications to surgical treatments and the behavior of the surgeon.
In one implementation, the operating theater devices 1a-1n may be connected to the modular communication hub 203 over a wired channel or a wireless channel depending on the configuration of the devices 1a-1n to a network hub. The network hub 207 may be implemented, in one aspect, as a local network broadcast device that works on the physical layer of the Open System Interconnection (OSI) model. The network hub provides connectivity to the devices 1a-1n located in the same operating theater network. The network hub 207 collects data in the form of packets and sends them to the router in half duplex mode. The network hub 207 does not store any media access control/Internet Protocol (MAC/IP) to transfer the device data. Only one of the devices 1a-1n can send data at a time through the network hub 207. The network hub 207 has no routing tables or intelligence regarding where to send information and broadcasts all network data across each connection and to a remote server 213 (
In another implementation, the operating theater devices 2a-2m may be connected to a network switch 209 over a wired channel or a wireless channel. The network switch 209 works in the data link layer of the OSI model. The network switch 209 is a multicast device for connecting the devices 2a-2m located in the same operating theater to the network. The network switch 209 sends data in the form of frames to the network router 211 and works in full duplex mode. Multiple devices 2a-2m can send data at the same time through the network switch 209. The network switch 209 stores and uses MAC addresses of the devices 2a-2m to transfer data.
The network hub 207 and/or the network switch 209 are coupled to the network router 211 for connection to the cloud 204. The network router 211 works in the network layer of the OSI model. The network router 211 creates a route for transmitting data packets received from the network hub 207 and/or the network switch 209 to cloud-based computer resources for further processing and manipulation of the data collected by any one of or all the devices 1a-1n/2a-2m. The network router 211 may be employed to connect two or more different networks located in different locations, such as, for example, different operating theaters of the same healthcare facility or different networks located in different operating theaters of different healthcare facilities. The network router 211 sends data in the form of packets to the cloud 204 and works in full duplex mode. Multiple devices can send data at the same time. The network router 211 uses IP addresses to transfer data.
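A simplified, non-limiting sketch of the three forwarding behaviors described above follows; the device names, addresses, and tables are hypothetical, and the model ignores duplexing, timing, and error handling.

```python
# Toy model of hub, switch, and router forwarding (illustrative only).

def hub_forward(frame, ports):
    """A hub (OSI layer 1) broadcasts every frame to all connected ports."""
    return {port: frame for port in ports}

def switch_forward(frame, mac_table):
    """A switch (OSI layer 2) forwards a frame only to the port associated
    with the destination MAC address it has learned."""
    port = mac_table.get(frame["dst_mac"])
    return {port: frame} if port else hub_forward(frame, mac_table.values())

def router_forward(packet, routing_table):
    """A router (OSI layer 3) chooses a next hop by destination IP prefix."""
    for prefix, next_hop in routing_table:
        if packet["dst_ip"].startswith(prefix):
            return next_hop
    return None

mac_table = {"aa:bb:cc:01": "port1", "aa:bb:cc:02": "port2"}
routing_table = [("10.0.1.", "cloud-gateway"), ("10.0.", "local-computer")]
print(switch_forward({"dst_mac": "aa:bb:cc:02", "payload": b"data"}, mac_table))
print(router_forward({"dst_ip": "10.0.1.7"}, routing_table))  # 'cloud-gateway'
```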
In one example, the network hub 207 may be implemented as a USB hub, which allows multiple USB devices to be connected to a host computer. The USB hub may expand a single USB port into several tiers so that there are more ports available to connect devices to the host system computer. The network hub 207 may include wired or wireless capabilities to receive information over a wired channel or a wireless channel. In one aspect, a wireless USB short-range, high-bandwidth wireless radio communication protocol may be employed for communication between the devices 1a-1n and devices 2a-2m located in the operating theater.
In other examples, the operating theater devices 1a-1n/2a-2m may communicate to the modular communication hub 203 via the Bluetooth wireless technology standard for exchanging data over short distances (using short-wavelength UHF radio waves in the ISM band from 2.4 to 2.485 GHz) from fixed and mobile devices and building personal area networks (PANs). In other aspects, the operating theater devices 1a-1n/2a-2m may communicate to the modular communication hub 203 via a number of wireless or wired communication standards or protocols, including but not limited to Wi-Fi (IEEE 802.11 family), WiMAX (IEEE 802.16 family), IEEE 802.20, long-term evolution (LTE), Ev-DO, HSPA+, HSDPA+, HSUPA+, EDGE, GSM, GPRS, CDMA, TDMA, DECT, and Ethernet derivatives thereof, as well as any other wireless and wired protocols that are designated as 3G, 4G, 5G, and beyond. The computing module may include a plurality of communication modules. For instance, a first communication module may be dedicated to shorter-range wireless communications such as Wi-Fi and Bluetooth, and a second communication module may be dedicated to longer-range wireless communications such as GPS, EDGE, GPRS, CDMA, WiMAX, LTE, Ev-DO, and others.
The modular communication hub 203 may serve as a central connection for one or all of the operating theater devices 1a-1n/2a-2m and handles a data type known as frames. Frames carry the data generated by the devices 1a-1n/2a-2m. When a frame is received by the modular communication hub 203, it is amplified and transmitted to the network router 211, which transfers the data to the cloud computing resources by using a number of wireless or wired communication standards or protocols, as described herein.
The modular communication hub 203 can be used as a standalone device or be connected to compatible network hubs and network switches to form a larger network. The modular communication hub 203 is generally easy to install, configure, and maintain, making it a good option for networking the operating theater devices 1a-1n/2a-2m.
The surgical hub 206 employs a non-contact sensor module 242 to measure the dimensions of the operating theater and generate a map of the surgical theater using either ultrasonic or laser-type non-contact measurement devices. An ultrasound-based non-contact sensor module scans the operating theater by transmitting a burst of ultrasound and receiving the echo when it bounces off the perimeter walls of an operating theater as described under the heading “Surgical Hub Spatial Awareness Within an Operating Room” in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, which is herein incorporated by reference in its entirety, in which the sensor module is configured to determine the size of the operating theater and to adjust Bluetooth-pairing distance limits. A laser-based non-contact sensor module scans the operating theater by transmitting laser light pulses, receiving laser light pulses that bounce off the perimeter walls of the operating theater, and comparing the phase of the transmitted pulse to the received pulse to determine the size of the operating theater and to adjust Bluetooth pairing distance limits, for example.
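For illustration only, the distance measurement underlying either the ultrasonic or the laser-based approach reduces to a time-of-flight calculation; the propagation speeds, echo times, and pairing-limit rule below are assumptions rather than values taken from the referenced application.

```python
SPEED_OF_SOUND_M_S = 343.0        # in air at roughly room temperature
SPEED_OF_LIGHT_M_S = 299_792_458.0

def distance_from_echo(round_trip_time_s, propagation_speed_m_s):
    """One-way distance to a wall from a round-trip echo time."""
    return propagation_speed_m_s * round_trip_time_s / 2.0

def bluetooth_pairing_limit(wall_distances_m, margin_m=0.5):
    """Hypothetical rule: limit pairing range to the nearest wall plus a margin."""
    return min(wall_distances_m) + margin_m

# Ultrasonic example: a 23.3 ms round trip corresponds to roughly 4 m to the wall.
walls = [distance_from_echo(t, SPEED_OF_SOUND_M_S) for t in (0.0233, 0.0291, 0.0175)]
print([round(d, 2) for d in walls])            # ~[4.0, 4.99, 3.0]
print(round(bluetooth_pairing_limit(walls), 2))
```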
The computer system 210 comprises a processor 244 and a network interface 245. The processor 244 is coupled to a communication module 247, storage 248, memory 249, non-volatile memory 250, and input/output interface 251 via a system bus. The system bus can be any of several types of bus structure(s) including the memory bus or memory controller, a peripheral bus or external bus, and/or a local bus using any variety of available bus architectures including, but not limited to, 9-bit bus, Industrial Standard Architecture (ISA), Micro-Channel Architecture (MCA), Extended ISA (EISA), Intelligent Drive Electronics (IDE), VESA Local Bus (VLB), Peripheral Component Interconnect (PCI), USB, Advanced Graphics Port (AGP), Personal Computer Memory Card International Association bus (PCMCIA), Small Computer Systems Interface (SCSI), or any other proprietary bus.
The processor 244 may be any single-core or multicore processor such as those known under the trade name ARM Cortex by Texas Instruments. In one aspect, the processor may be an LM4F230H5QR ARM Cortex-M4F Processor Core, available from Texas Instruments, for example, comprising an on-chip memory of 256 KB single-cycle flash memory, or other non-volatile memory, up to 40 MHz, a prefetch buffer to improve performance above 40 MHz, a 32 KB single-cycle serial random access memory (SRAM), an internal read-only memory (ROM) loaded with StellarisWare® software, a 2 KB electrically erasable programmable read-only memory (EEPROM), one or more pulse width modulation (PWM) modules, one or more quadrature encoder inputs (QEIs), and one or more 12-bit analog-to-digital converters (ADCs) with 12 analog input channels, details of which are available in the product datasheet.
In one aspect, the processor 244 may comprise a safety controller comprising two controller-based families such as TMS570 and RM4x, known under the trade name Hercules ARM Cortex R4, also by Texas Instruments. The safety controller may be configured specifically for IEC 61508 and ISO 26262 safety critical applications, among others, to provide advanced integrated safety features while delivering scalable performance, connectivity, and memory options.
The system memory includes volatile memory and non-volatile memory. The basic input/output system (BIOS), containing the basic routines to transfer information between elements within the computer system, such as during start-up, is stored in non-volatile memory. For example, the non-volatile memory can include ROM, programmable ROM (PROM), electrically programmable ROM (EPROM), EEPROM, or flash memory. Volatile memory includes random-access memory (RAM), which acts as external cache memory. Moreover, RAM is available in many forms such as SRAM, dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and direct Rambus RAM (DRRAM).
The computer system 210 also includes removable/non-removable, volatile/non-volatile computer storage media, such as for example disk storage. The disk storage includes, but is not limited to, devices like a magnetic disk drive, floppy disk drive, tape drive, Jaz drive, Zip drive, LS-60 drive, flash memory card, or memory stick. In addition, the disk storage can include storage media separately or in combination with other storage media including, but not limited to, an optical disc drive such as a compact disc ROM device (CD-ROM), compact disc recordable drive (CD-R Drive), compact disc rewritable drive (CD-RW Drive), or a digital versatile disc ROM drive (DVD-ROM). To facilitate the connection of the disk storage devices to the system bus, a removable or non-removable interface may be employed.
It is to be appreciated that the computer system 210 includes software that acts as an intermediary between users and the basic computer resources described in a suitable operating environment. Such software includes an operating system. The operating system, which can be stored on the disk storage, acts to control and allocate resources of the computer system. System applications take advantage of the management of resources by the operating system through program modules and program data stored either in the system memory or on the disk storage. It is to be appreciated that various components described herein can be implemented with various operating systems or combinations of operating systems.
A user enters commands or information into the computer system 210 through input device(s) coupled to the I/O interface 251. The input devices include, but are not limited to, a pointing device such as a mouse, trackball, stylus, touch pad, keyboard, microphone, joystick, game pad, satellite dish, scanner, TV tuner card, digital camera, digital video camera, web camera, and the like. These and other input devices connect to the processor through the system bus via interface port(s). The interface port(s) include, for example, a serial port, a parallel port, a game port, and a USB. The output device(s) use some of the same types of ports as input device(s). Thus, for example, a USB port may be used to provide input to the computer system and to output information from the computer system to an output device. An output adapter is provided to illustrate that there are some output devices like monitors, displays, speakers, and printers, among other output devices that require special adapters. The output adapters include, by way of illustration and not limitation, video and sound cards that provide a means of connection between the output device and the system bus. It should be noted that other devices and/or systems of devices, such as remote computer(s), provide both input and output capabilities.
The computer system 210 can operate in a networked environment using logical connections to one or more remote computers, such as cloud computer(s), or local computers. The remote cloud computer(s) can be a personal computer, server, router, network PC, workstation, microprocessor-based appliance, peer device, or other common network node, and the like, and typically includes many or all of the elements described relative to the computer system. For purposes of brevity, only a memory storage device is illustrated with the remote computer(s). The remote computer(s) is logically connected to the computer system through a network interface and then physically connected via a communication connection. The network interface encompasses communication networks such as local area networks (LANs) and wide area networks (WANs). LAN technologies include Fiber Distributed Data Interface (FDDI), Copper Distributed Data Interface (CDDI), Ethernet/IEEE 802.3, Token Ring/IEEE 802.5 and the like. WAN technologies include, but are not limited to, point-to-point links, circuit-switching networks like Integrated Services Digital Networks (ISDN) and variations thereon, packet-switching networks, and Digital Subscriber Lines (DSL).
In various aspects, the computer system 210 of
The communication connection(s) refers to the hardware/software employed to connect the network interface to the bus. While the communication connection is shown for illustrative clarity inside the computer system, it can also be external to the computer system 210. The hardware/software necessary for connection to the network interface includes, for illustrative purposes only, internal and external technologies such as modems, including regular telephone-grade modems, cable modems, and DSL modems, ISDN adapters, and Ethernet cards.
The USB network hub 300 device is implemented with a digital state machine instead of a microcontroller, and no firmware programming is required. Fully compliant USB transceivers are integrated into the circuit for the upstream USB transceiver port 302 and all downstream USB transceiver ports 304, 306, 308. The downstream USB transceiver ports 304, 306, 308 support both full-speed and low-speed devices by automatically setting the slew rate according to the speed of the device attached to the ports. The USB network hub 300 device may be configured either in bus-powered or self-powered mode and includes a hub power logic 312 to manage power.
The USB network hub 300 device includes a serial interface engine 310 (SIE). The SIE 310 is the front end of the USB network hub 300 hardware and handles most of the protocol described in chapter 8 of the USB specification. The SIE 310 typically comprehends signaling up to the transaction level. The functions that it handles could include: packet recognition, transaction sequencing, SOP, EOP, RESET, and RESUME signal detection/generation, clock/data separation, non-return-to-zero invert (NRZI) data encoding/decoding and bit-stuffing, CRC generation and checking (token and data), packet ID (PID) generation and checking/decoding, and/or serial-parallel/parallel-serial conversion. The SIE 310 receives a clock input 314 and is coupled to a suspend/resume logic and frame timer 316 circuit and a hub repeater circuit 318 to control communication between the upstream USB transceiver port 302 and the downstream USB transceiver ports 304, 306, 308 through port logic circuits 320, 322, 324. The SIE 310 is coupled to a command decoder 326 via interface logic 328 to control commands from a serial EEPROM via a serial EEPROM interface 330.
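Two of the SIE functions listed above, bit-stuffing and NRZI encoding, may be illustrated with a short sketch. This is a simplified software illustration of the USB line-coding rules only; as noted below, the USB network hub 300 itself is implemented as a digital state machine with no firmware.

```python
def bit_stuff(bits):
    """USB bit-stuffing: insert a 0 after every run of six consecutive 1s."""
    out, run = [], 0
    for b in bits:
        out.append(b)
        run = run + 1 if b == 1 else 0
        if run == 6:
            out.append(0)
            run = 0
    return out

def nrzi_encode(bits, level=1):
    """NRZI: a 0 bit toggles the line level, a 1 bit leaves it unchanged."""
    out = []
    for b in bits:
        if b == 0:
            level ^= 1
        out.append(level)
    return out

data = [1, 1, 1, 1, 1, 1, 1, 0, 1]
stuffed = bit_stuff(data)
print(stuffed)              # [1, 1, 1, 1, 1, 1, 0, 1, 0, 1]
print(nrzi_encode(stuffed))
```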
In various aspects, the USB network hub 300 can connect 127 functions configured in up to six logical layers (tiers) to a single computer. Further, the USB network hub 300 can connect to all peripherals using a standardized four-wire cable that provides both communication and power distribution. The power configurations are bus-powered and self-powered modes. The USB network hub 300 may be configured to support four modes of power management: a bus-powered hub, with either individual-port power management or ganged-port power management, and a self-powered hub, with either individual-port power management or ganged-port power management. In one aspect, using a USB cable, the upstream USB transceiver port 302 of the USB network hub 300 is plugged into a USB host controller, and the downstream USB transceiver ports 304, 306, 308 are exposed for connecting USB compatible devices, and so forth.
Additional details regarding the structure and function of the surgical hub and/or surgical hub networks can be found in U.S. Provisional Patent Application No. 62/659,900, titled METHOD OF HUB COMMUNICATION, filed Apr. 19, 2018, which is hereby incorporated by reference herein in its entirety.
In addition, surgical instruments 7012 may comprise transceivers for data transmission to and from their corresponding surgical hubs 7006 (which may also comprise transceivers). Combinations of surgical instruments 7012 and corresponding hubs 7006 may indicate particular locations, such as operating theaters in healthcare facilities (e.g., hospitals), for providing medical operations. For example, the memory of a surgical hub 7006 may store location data. As shown in
Based on connections to various surgical hubs 7006 via the network 7001, the cloud 7004 can aggregate the data generated by various surgical instruments 7012 and their corresponding hubs 7006. Such aggregated data may be stored within the aggregated medical data databases 7011 of the cloud 7004. In particular, the cloud 7004 may advantageously perform data analysis and operations on the aggregated data to yield insights and/or perform functions that individual hubs 7006 could not achieve on their own. To this end, as shown in
The particular cloud computing system configuration described in the present disclosure is specifically designed to address various issues arising in the context of medical operations and procedures performed using medical devices, such as the surgical instruments 7012, 112. In particular, the surgical instruments 7012 may be digital surgical devices configured to interact with the cloud 7004 for implementing techniques to improve the performance of surgical operations. Various surgical instruments 7012 and/or surgical hubs 7006 may comprise touch controlled user interfaces such that clinicians may control aspects of interaction between the surgical instruments 7012 and the cloud 7004. Other suitable user interfaces for control such as auditory controlled user interfaces can also be used.
For example, the data collection and aggregation module 7022 could be used to generate self-describing data (e.g., metadata) including identification of notable features or configuration (e.g., trends), management of redundant data sets, and storage of the data in paired data sets which can be grouped by surgery but not necessarily keyed to actual surgical dates and surgeons. In particular, paired data sets generated from operations of surgical instruments 7012 can comprise applying a binary classification, e.g., a bleeding or a non-bleeding event. More generally, the binary classification may be characterized as either a desirable event (e.g., a successful surgical procedure) or an undesirable event (e.g., a misfired or misused surgical instrument 7012). The aggregated self-describing data may correspond to individual data received from various groups or subgroups of surgical hubs 7006. Accordingly, the data collection and aggregation module 7022 can generate aggregated metadata or other organized data based on raw data received from the surgical hubs 7006. To this end, the processors 7008 can be operationally coupled to the hub applications 7014 and aggregated medical data databases 7011 for executing the data analytics modules 7034. The data collection and aggregation module 7022 may store the aggregated organized data into the aggregated medical data databases 7011.
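A non-limiting sketch of the grouping-and-classification step follows; the record fields, event labels, and counting scheme are illustrative assumptions about what such aggregation could look like, not the actual implementation of the data collection and aggregation module 7022.

```python
from collections import Counter, defaultdict

# Illustrative records; field names are assumptions for this sketch only.
records = [
    {"procedure": "sleeve_gastrectomy", "event": "bleeding"},
    {"procedure": "sleeve_gastrectomy", "event": "non_bleeding"},
    {"procedure": "lobectomy", "event": "non_bleeding"},
    {"procedure": "lobectomy", "event": "non_bleeding"},
]

def aggregate_binary_events(records):
    """Group records by surgery type (not by date or surgeon) and count the
    binary classification assigned to each paired data set."""
    groups = defaultdict(Counter)
    for r in records:
        groups[r["procedure"]][r["event"]] += 1
    return {proc: dict(counts) for proc, counts in groups.items()}

print(aggregate_binary_events(records))
# {'sleeve_gastrectomy': {'bleeding': 1, 'non_bleeding': 1},
#  'lobectomy': {'non_bleeding': 2}}
```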
The resource optimization module 7020 can be configured to analyze this aggregated data to determine an optimal usage of resources for a particular healthcare facility or group of healthcare facilities. For example, the resource optimization module 7020 may determine an optimal order point of surgical stapling instruments 7012 for a group of healthcare facilities based on corresponding predicted demand of such instruments 7012. The resource optimization module 7020 might also assess the resource usage or other operational configurations of various healthcare facilities to determine whether resource usage could be improved. Similarly, the recommendations module 7030 can be configured to analyze aggregated organized data from the data collection and aggregation module 7022 to provide recommendations. For example, the recommendations module 7030 could recommend to healthcare facilities (e.g., medical service providers such as hospitals) that a particular surgical instrument 7012 should be upgraded to an improved version based on a higher than expected error rate. Additionally, the recommendations module 7030 and/or resource optimization module 7020 could recommend better supply chain parameters, such as product reorder points, and provide suggestions of different surgical instruments 7012, uses thereof, or procedure steps to improve surgical outcomes. The healthcare facilities can receive such recommendations via corresponding surgical hubs 7006. More specific recommendations regarding parameters or configurations of various surgical instruments 7012 can also be provided. Hubs 7006 and/or surgical instruments 7012 each could also have display screens that display data or recommendations provided by the cloud 7004.
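By way of illustration, one conventional way to convert a predicted demand into an order point is the standard reorder-point formula sketched below; the formula and the numbers used are a generic inventory-management example and are not attributed to the resource optimization module 7020.

```python
def reorder_point(daily_demand_forecast, lead_time_days, safety_stock):
    """Classic reorder point: expected demand over the lead time plus safety stock."""
    return daily_demand_forecast * lead_time_days + safety_stock

# Hypothetical numbers: a facility group predicted to use 12 stapler reloads per
# day, with a 5-day resupply lead time and 20 reloads held as safety stock.
print(reorder_point(daily_demand_forecast=12, lead_time_days=5, safety_stock=20))  # 80
```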
The patient outcome analysis module 7028 can analyze surgical outcomes associated with currently used operational parameters of surgical instruments 7012. The patient outcome analysis module 7028 may also analyze and assess other potential operational parameters. In this connection, the recommendations module 7030 could recommend using these other potential operational parameters based on yielding better surgical outcomes, such as better sealing or less bleeding. For example, the recommendations module 7030 could transmit recommendations to a surgical hub 7006 regarding when to use a particular cartridge for a corresponding stapling surgical instrument 7012. Thus, the cloud-based analytics system, while controlling for common variables, may be configured to analyze the large collection of raw data and to provide centralized recommendations over multiple healthcare facilities (advantageously determined based on aggregated data). For example, the cloud-based analytics system could analyze, evaluate, and/or aggregate data based on type of medical practice, type of patient, number of patients, geographic similarity between medical providers, which medical providers/facilities use similar types of instruments, etc., in a way that no single healthcare facility alone would be able to analyze independently.
The control program updating module 7026 could be configured to implement various surgical instrument 7012 recommendations when corresponding control programs are updated. For example, the patient outcome analysis module 7028 could identify correlations linking specific control parameters with successful (or unsuccessful) results. Such correlations may be addressed when updated control programs are transmitted to surgical instruments 7012 via the control program updating module 7026. Updates to instruments 7012 that are transmitted via a corresponding hub 7006 may incorporate aggregated performance data that was gathered and analyzed by the data collection and aggregation module 7022 of the cloud 7004. Additionally, the patient outcome analysis module 7028 and recommendations module 7030 could identify improved methods of using instruments 7012 based on aggregated performance data.
The cloud-based analytics system may include security features implemented by the cloud 7004. These security features may be managed by the authorization and security module 7024. Each surgical hub 7006 can have associated unique credentials such as username, password, and other suitable security credentials. These credentials could be stored in the memory 7010 and be associated with a permitted cloud access level. For example, based on providing accurate credentials, a surgical hub 7006 may be granted access to communicate with the cloud to a predetermined extent (e.g., may only engage in transmitting or receiving certain defined types of information). To this end, the aggregated medical data databases 7011 of the cloud 7004 may comprise a database of authorized credentials for verifying the accuracy of provided credentials. Different credentials may be associated with varying levels of permission for interaction with the cloud 7004, such as a predetermined access level for receiving the data analytics generated by the cloud 7004.
Furthermore, for security purposes, the cloud could maintain a database of hubs 7006, instruments 7012, and other devices that may comprise a “black list” of prohibited devices. In particular, a surgical hub 7006 listed on the black list may not be permitted to interact with the cloud, while surgical instruments 7012 listed on the black list may not have functional access to a corresponding hub 7006 and/or may be prevented from fully functioning when paired to its corresponding hub 7006. Additionally or alternatively, the cloud 7004 may flag instruments 7012 based on incompatibility or other specified criteria. In this manner, counterfeit medical devices and improper reuse of such devices throughout the cloud-based analytics system can be identified and addressed.
The surgical instruments 7012 may use wireless transceivers to transmit wireless signals that may represent, for example, authorization credentials for access to corresponding hubs 7006 and the cloud 7004. Wired transceivers may also be used to transmit signals. Such authorization credentials can be stored in the respective memory devices of the surgical instruments 7012. The authorization and security module 7024 can determine whether the authorization credentials are accurate or counterfeit. The authorization and security module 7024 may also dynamically generate authorization credentials for enhanced security. The credentials could also be encrypted, such as by using hash-based encryption. Upon transmitting proper authorization, the surgical instruments 7012 may transmit a signal to the corresponding hubs 7006 and ultimately the cloud 7004 to indicate that the instruments 7012 are ready to obtain and transmit medical data. In response, the cloud 7004 may transition into a state enabled for receiving medical data for storage into the aggregated medical data databases 7011. This data transmission readiness could be indicated by a light indicator on the instruments 7012, for example. The cloud 7004 can also transmit signals to surgical instruments 7012 for updating their associated control programs. The cloud 7004 can transmit signals that are directed to a particular class of surgical instruments 7012 (e.g., electrosurgical instruments) so that software updates to control programs are only transmitted to the appropriate surgical instruments 7012. Moreover, the cloud 7004 could be used to implement system-wide solutions to address local or global problems based on selective data transmission and authorization credentials. For example, if a group of surgical instruments 7012 is identified as having a common manufacturing defect, the cloud 7004 may change the authorization credentials corresponding to this group to implement an operational lockout of the group.
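A minimal sketch of hash-based credential verification combined with a black-list check is shown below; it relies on standard-library hashing primitives and is an illustration of the general approach only, since the authorization and security module 7024 is not described at this level of detail.

```python
import hashlib
import hmac
import secrets

def hash_credential(secret: str, salt: bytes) -> bytes:
    """Store only a salted hash of a device credential, never the secret itself."""
    return hashlib.pbkdf2_hmac("sha256", secret.encode(), salt, 100_000)

def verify_device(device_id, presented_secret, credential_db, blacklist):
    """Reject black-listed devices, then compare hashes in constant time."""
    if device_id in blacklist:
        return False
    entry = credential_db.get(device_id)
    if entry is None:
        return False
    salt, stored_hash = entry
    return hmac.compare_digest(stored_hash, hash_credential(presented_secret, salt))

# Hypothetical enrollment of one surgical hub credential.
salt = secrets.token_bytes(16)
credential_db = {"hub-7006-A": (salt, hash_credential("correct horse battery", salt))}
blacklist = {"hub-7006-X"}
print(verify_device("hub-7006-A", "correct horse battery", credential_db, blacklist))  # True
print(verify_device("hub-7006-X", "anything", credential_db, blacklist))               # False
```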
The cloud-based analytics system may allow for monitoring multiple healthcare facilities (e.g., medical facilities like hospitals) to determine improved practices and recommend changes (via the recommendations module 7030, for example) accordingly. Thus, the processors 7008 of the cloud 7004 can analyze data associated with an individual healthcare facility to identify the facility and aggregate the data with other data associated with other healthcare facilities in a group. Groups could be defined based on similar operating practices or geographical location, for example. In this way, the cloud 7004 may provide healthcare facility group-wide analysis and recommendations. The cloud-based analytics system could also be used for enhanced situational awareness. For example, the processors 7008 may predictively model the effects of recommendations on the cost and effectiveness for a particular facility (relative to overall operations and/or various medical procedures). The cost and effectiveness associated with that particular facility can also be compared to a corresponding local region of other facilities or any other comparable facilities.
The data sorting and prioritization module 7032 may prioritize and sort data based on criticality (e.g., the severity of a medical event associated with the data, unexpectedness, suspiciousness). This sorting and prioritization may be used in conjunction with the functions of the other data analytics modules 7034 described above to improve the cloud-based analytics and operations described herein. For example, the data sorting and prioritization module 7032 can assign a priority to the data analysis performed by the data collection and aggregation module 7022 and patient outcome analysis modules 7028. Different prioritization levels can result in particular responses from the cloud 7004 (corresponding to a level of urgency) such as escalation for an expedited response, special processing, exclusion from the aggregated medical data databases 7011, or other suitable responses. Moreover, if necessary, the cloud 7004 can transmit a request (e.g., a push message) through the hub application servers for additional data from corresponding surgical instruments 7012. The push message can result in a notification displayed on the corresponding hubs 7006 for requesting supporting or additional data. This push message may be required in situations in which the cloud detects a significant irregularity or outlier and the cloud cannot determine the cause of the irregularity. The central servers 7013 may be programmed to trigger this push message in certain significant circumstances, such as when data is determined to differ from an expected value beyond a predetermined threshold or when it appears that security has been compromised, for example.
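A minimal sketch of this kind of criticality-based sorting and outlier-triggered push request is shown below; the severity labels, threshold, and record fields are assumptions invented for the example rather than the disclosed logic.

```python
# Illustrative sketch of sorting incoming records by criticality and flagging
# large outliers for a follow-up push message requesting additional data.
from dataclasses import dataclass

PRIORITY = {"critical": 0, "suspicious": 1, "unexpected": 2, "routine": 3}
OUTLIER_THRESHOLD = 3.0  # assumed: relative deviation that triggers a push request


@dataclass
class Record:
    device_id: str
    value: float
    expected: float
    severity: str  # one of the PRIORITY keys


def sort_by_criticality(records):
    """Most critical records first; unknown severities sort last."""
    return sorted(records, key=lambda r: PRIORITY.get(r.severity, len(PRIORITY)))


def needs_push_message(record: Record) -> bool:
    """True when the value deviates from expectation beyond the threshold."""
    if record.expected == 0:
        return record.value != 0
    return abs(record.value - record.expected) / abs(record.expected) > OUTLIER_THRESHOLD


records = [
    Record("stapler_12", 4.1, 4.0, "routine"),
    Record("generator_7", 95.0, 20.0, "unexpected"),
]
for r in sort_by_criticality(records):
    print(r.device_id, "push request" if needs_push_message(r) else "store")
```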
Additional details regarding the cloud analysis system can be found in U.S. Provisional Patent Application No. 62/659,900, titled METHOD OF HUB COMMUNICATION, filed Apr. 19, 2018, which is hereby incorporated by reference herein in its entirety.
Although an “intelligent” device including control algorithms that respond to sensed data can be an improvement over a “dumb” device that operates without accounting for sensed data, some sensed data can be incomplete or inconclusive when considered in isolation, i.e., without the context of the type of surgical procedure being performed or the type of tissue that is being operated on. Without knowing the procedural context (e.g., knowing the type of tissue being operated on or the type of procedure being performed), the control algorithm may control the modular device incorrectly or suboptimally given the particular context-free sensed data. For example, the optimal manner for a control algorithm to control a surgical instrument in response to a particular sensed parameter can vary according to the particular tissue type being operated on. This is due to the fact that different tissue types have different properties (e.g., resistance to tearing) and thus respond differently to actions taken by surgical instruments. Therefore, it may be desirable for a surgical instrument to take different actions even when the same measurement for a particular parameter is sensed. As one specific example, the optimal manner in which to control a surgical stapling and cutting instrument in response to the instrument sensing an unexpectedly high force to close its end effector will vary depending upon whether the tissue type is susceptible or resistant to tearing. For tissues that are susceptible to tearing, such as lung tissue, the instrument's control algorithm would optimally ramp down the motor in response to an unexpectedly high force to close to avoid tearing the tissue. For tissues that are resistant to tearing, such as stomach tissue, the instrument's control algorithm would optimally ramp up the motor in response to an unexpectedly high force to close to ensure that the end effector is clamped properly on the tissue. Without knowing whether lung or stomach tissue has been clamped, the control algorithm may make a suboptimal decision.
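As a rough sketch of the kind of context-dependent response just described, the snippet below adjusts a motor rate differently for tear-susceptible and tear-resistant tissue when an unexpectedly high force to close is sensed; the threshold, scaling factors, and tissue labels are invented for the example and are not taken from any actual control algorithm.

```python
# A minimal sketch, assuming a generic motor-rate interface, of responding to an
# unexpectedly high force-to-close differently depending on the tissue context.

FORCE_TO_CLOSE_LIMIT_N = 80.0  # assumed threshold for "unexpectedly high"

TISSUE_RESPONSE = {
    "lung": "ramp_down",    # tear-susceptible: slow the motor to avoid tearing
    "stomach": "ramp_up",   # tear-resistant: speed up to clamp fully
}


def adjust_motor(tissue_type: str, force_to_close_n: float, current_rate: float) -> float:
    """Return an adjusted motor rate for the sensed force and tissue context."""
    if force_to_close_n <= FORCE_TO_CLOSE_LIMIT_N:
        return current_rate
    action = TISSUE_RESPONSE.get(tissue_type)
    if action == "ramp_down":
        return current_rate * 0.5
    if action == "ramp_up":
        return current_rate * 1.5
    return current_rate  # unknown tissue: without context, keep the rate unchanged


print(adjust_motor("lung", 95.0, 100.0))     # 50.0
print(adjust_motor("stomach", 95.0, 100.0))  # 150.0
```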
One solution utilizes a surgical hub including a system that is configured to derive information about the surgical procedure being performed based on data received from various data sources and then control the paired modular devices accordingly. In other words, the surgical hub is configured to infer information about the surgical procedure from received data and then control the modular devices paired to the surgical hub based upon the inferred context of the surgical procedure.
A surgical hub 5104, which may be similar to the hub 106 in many respects, can be configured to derive the contextual information pertaining to the surgical procedure from the data based upon, for example, the particular combination(s) of received data or the particular order in which the data is received from the data sources 5126. The contextual information inferred from the received data can include, for example, the type of surgical procedure being performed, the particular step of the surgical procedure that the surgeon is performing, the type of tissue being operated on, or the body cavity that is the subject of the procedure. This ability by some aspects of the surgical hub 5104 to derive or infer information related to the surgical procedure from received data can be referred to as “situational awareness.” In one exemplification, the surgical hub 5104 can incorporate a situational awareness system, which is the hardware and/or programming associated with the surgical hub 5104 that derives contextual information pertaining to the surgical procedure from the received data.
The situational awareness system of the surgical hub 5104 can be configured to derive the contextual information from the data received from the data sources 5126 in a variety of different ways. In one exemplification, the situational awareness system includes a pattern recognition system, or machine learning system (e.g., an artificial neural network), that has been trained on training data to correlate various inputs (e.g., data from databases 5122, patient monitoring devices 5124, and/or modular devices 5102) to corresponding contextual information regarding a surgical procedure. In other words, a machine learning system can be trained to accurately derive contextual information regarding a surgical procedure from the provided inputs. In another exemplification, the situational awareness system can include a lookup table storing pre-characterized contextual information regarding a surgical procedure in association with one or more inputs (or ranges of inputs) corresponding to the contextual information. In response to a query with one or more inputs, the lookup table can return the corresponding contextual information for the situational awareness system for controlling the modular devices 5102. In one exemplification, the contextual information received by the situational awareness system of the surgical hub 5104 is associated with a particular control adjustment or set of control adjustments for one or more modular devices 5102. In another exemplification, the situational awareness system includes a further machine learning system, lookup table, or other such system, which generates or retrieves one or more control adjustments for one or more modular devices 5102 when provided the contextual information as input.
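The lookup-table variant can be illustrated with a small sketch in which predicates over the received inputs map to pre-characterized contextual information, which in turn maps to control adjustments for the modular devices; every entry below is hypothetical.

```python
# Hedged sketch of a lookup-table situational awareness system; the input
# names, contexts, and control adjustments are placeholders for illustration.

CONTEXT_TABLE = [
    # (predicate over inputs, pre-characterized contextual information)
    (lambda d: d.get("insufflator") == "active" and d.get("scope") == "thoracic",
     {"procedure": "VATS", "body_cavity": "thoracic", "tissue": "lung"}),
    (lambda d: d.get("insufflator") == "active" and d.get("scope") == "abdominal",
     {"procedure": "laparoscopic", "body_cavity": "abdominal", "tissue": "stomach"}),
]

CONTROL_ADJUSTMENTS = {
    "thoracic": {"smoke_evacuator_rate": "low", "stapler_compression_rate": "slow"},
    "abdominal": {"smoke_evacuator_rate": "high", "stapler_compression_rate": "normal"},
}


def derive_context(inputs: dict) -> dict:
    """Return the first pre-characterized context whose predicate matches the inputs."""
    for predicate, context in CONTEXT_TABLE:
        if predicate(inputs):
            return context
    return {}


def control_adjustments(context: dict) -> dict:
    return CONTROL_ADJUSTMENTS.get(context.get("body_cavity"), {})


ctx = derive_context({"insufflator": "active", "scope": "thoracic"})
print(ctx, control_adjustments(ctx))
```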
A surgical hub 5104 incorporating a situational awareness system provides a number of benefits for the surgical system 5100. One benefit includes improving the interpretation of sensed and collected data, which would in turn improve the processing accuracy and/or the usage of the data during the course of a surgical procedure. To return to a previous example, a situationally aware surgical hub 5104 could determine what type of tissue was being operated on; therefore, when an unexpectedly high force to close the surgical instrument's end effector is detected, the situationally aware surgical hub 5104 could correctly ramp up or ramp down the motor of the surgical instrument for the type of tissue.
As another example, the type of tissue being operated on can affect the adjustments that are made to the compression rate and load thresholds of a surgical stapling and cutting instrument for a particular tissue gap measurement. A situationally aware surgical hub 5104 could infer whether a surgical procedure being performed is a thoracic or an abdominal procedure, allowing the surgical hub 5104 to determine whether the tissue clamped by an end effector of the surgical stapling and cutting instrument is lung (for a thoracic procedure) or stomach (for an abdominal procedure) tissue. The surgical hub 5104 could then adjust the compression rate and load thresholds of the surgical stapling and cutting instrument appropriately for the type of tissue.
As yet another example, the type of body cavity being operated in during an insufflation procedure can affect the function of a smoke evacuator. A situationally aware surgical hub 5104 could determine whether the surgical site is under pressure (by determining that the surgical procedure is utilizing insufflation) and determine the procedure type. As a procedure type is generally performed in a specific body cavity, the surgical hub 5104 could then control the motor rate of the smoke evacuator appropriately for the body cavity being operated in. Thus, a situationally aware surgical hub 5104 could provide a consistent amount of smoke evacuation for both thoracic and abdominal procedures.
As yet another example, the type of procedure being performed can affect the optimal energy level for an ultrasonic surgical instrument or radio frequency (RF) electrosurgical instrument to operate at. Arthroscopic procedures, for example, require higher energy levels because the end effector of the ultrasonic surgical instrument or RF electrosurgical instrument is immersed in fluid. A situationally aware surgical hub 5104 could determine whether the surgical procedure is an arthroscopic procedure. The surgical hub 5104 could then adjust the RF power level or the ultrasonic amplitude of the generator (i.e., “energy level”) to compensate for the fluid filled environment. Relatedly, the type of tissue being operated on can affect the optimal energy level for an ultrasonic surgical instrument or RF electrosurgical instrument to operate at. A situationally aware surgical hub 5104 could determine what type of surgical procedure is being performed and then customize the energy level for the ultrasonic surgical instrument or RF electrosurgical instrument, respectively, according to the expected tissue profile for the surgical procedure. Furthermore, a situationally aware surgical hub 5104 can be configured to adjust the energy level for the ultrasonic surgical instrument or RF electrosurgical instrument throughout the course of a surgical procedure, rather than just on a procedure-by-procedure basis. A situationally aware surgical hub 5104 could determine what step of the surgical procedure is being performed or will subsequently be performed and then update the control algorithms for the generator and/or ultrasonic surgical instrument or RF electrosurgical instrument to set the energy level at a value appropriate for the expected tissue type according to the surgical procedure step.
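A per-step energy schedule of the sort described here might be represented, in simplified form, as a table keyed by procedure and procedure step; the procedure names, step names, and energy values below are placeholders used only to illustrate updating the generator level as the situational awareness system advances through the procedure.

```python
# Hypothetical per-step energy schedule; values are arbitrary illustration units.
ENERGY_SCHEDULE = {
    # (procedure, step) -> generator energy level
    ("arthroscopic_shoulder", "ablation"): 90,   # fluid-immersed site: higher level
    ("lung_segmentectomy", "dissection"): 60,
    ("lung_segmentectomy", "node_dissection"): 50,
}
DEFAULT_LEVEL = 40


def energy_level(procedure: str, step: str) -> int:
    """Return the scheduled energy level, falling back to a default."""
    return ENERGY_SCHEDULE.get((procedure, step), DEFAULT_LEVEL)


print(energy_level("arthroscopic_shoulder", "ablation"))   # 90
print(energy_level("lung_segmentectomy", "dissection"))    # 60
print(energy_level("lung_segmentectomy", "unknown_step"))  # 40
```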
As yet another example, data can be drawn from additional data sources 5126 to improve the conclusions that the surgical hub 5104 draws from one data source 5126. A situationally aware surgical hub 5104 could augment data that it receives from the modular devices 5102 with contextual information that it has built up regarding the surgical procedure from other data sources 5126. For example, a situationally aware surgical hub 5104 can be configured to determine whether hemostasis has occurred (i.e., whether bleeding at a surgical site has stopped) according to video or image data received from a medical imaging device. However, in some cases the video or image data can be inconclusive. Therefore, in one exemplification, the surgical hub 5104 can be further configured to compare a physiologic measurement (e.g., blood pressure sensed by a BP monitor communicably connected to the surgical hub 5104) with the visual or image data of hemostasis (e.g., from a medical imaging device 124 (
Another benefit includes proactively and automatically controlling the paired modular devices 5102 according to the particular step of the surgical procedure that is being performed to reduce the number of times that medical personnel are required to interact with or control the surgical system 5100 during the course of a surgical procedure. For example, a situationally aware surgical hub 5104 could proactively activate the generator to which an RF electrosurgical instrument is connected if it determines that a subsequent step of the procedure requires the use of the instrument. Proactively activating the energy source allows the instrument to be ready for use as soon as the preceding step of the procedure is completed.
As another example, a situationally aware surgical hub 5104 could determine whether the current or subsequent step of the surgical procedure requires a different view or degree of magnification on the display according to the feature(s) at the surgical site that the surgeon is expected to need to view. The surgical hub 5104 could then proactively change the displayed view (supplied by, e.g., a medical imaging device for the visualization system 108) accordingly so that the display automatically adjusts throughout the surgical procedure.
As yet another example, a situationally aware surgical hub 5104 could determine which step of the surgical procedure is being performed or will subsequently be performed and whether particular data or comparisons between data will be required for that step of the surgical procedure. The surgical hub 5104 can be configured to automatically call up data screens based upon the step of the surgical procedure being performed, without waiting for the surgeon to ask for the particular information.
Another benefit includes checking for errors during the setup of the surgical procedure or during the course of the surgical procedure. For example, a situationally aware surgical hub 5104 could determine whether the operating theater is set up properly or optimally for the surgical procedure to be performed. The surgical hub 5104 can be configured to determine the type of surgical procedure being performed, retrieve the corresponding checklists, product locations, or setup needs (e.g., from a memory), and then compare the current operating theater layout to the standard layout for the type of surgical procedure that the surgical hub 5104 determines is being performed. In one exemplification, the surgical hub 5104 can be configured to compare the list of items for the procedure (scanned by a suitable scanner, for example) and/or a list of devices paired with the surgical hub 5104 to a recommended or anticipated manifest of items and/or devices for the given surgical procedure. If there are any discrepancies between the lists, the surgical hub 5104 can be configured to provide an alert indicating that a particular modular device 5102, patient monitoring device 5124, and/or other surgical item is missing. In one exemplification, the surgical hub 5104 can be configured to determine the relative distance or position of the modular devices 5102 and patient monitoring devices 5124 via proximity sensors, for example. The surgical hub 5104 can compare the relative positions of the devices to a recommended or anticipated layout for the particular surgical procedure. If there are any discrepancies between the layouts, the surgical hub 5104 can be configured to provide an alert indicating that the current layout for the surgical procedure deviates from the recommended layout.
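One simplified way to express this manifest check is sketched below, assuming in-memory sets for the scanned items and paired devices and an invented expected manifest; a real implementation would draw on checklists retrieved from memory or a connected database.

```python
# A minimal sketch of comparing scanned items and paired devices against an
# expected manifest for the inferred procedure; item names are illustrative.

EXPECTED_MANIFEST = {
    "lung_segmentectomy": {"surgical stapler", "ultrasonic instrument",
                           "smoke evacuator", "insufflator", "medical imaging device"},
}


def check_setup(procedure: str, scanned_items: set, paired_devices: set) -> list:
    """Return alert strings for items expected for the procedure but not present."""
    expected = EXPECTED_MANIFEST.get(procedure, set())
    present = scanned_items | paired_devices
    return [f"ALERT: missing {item}" for item in sorted(expected - present)]


alerts = check_setup(
    "lung_segmentectomy",
    scanned_items={"surgical stapler", "ultrasonic instrument"},
    paired_devices={"insufflator", "medical imaging device"},
)
print(alerts)  # ['ALERT: missing smoke evacuator']
```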
As another example, a situationally aware surgical hub 5104 could determine whether the surgeon (or other medical personnel) was making an error or otherwise deviating from the expected course of action during the course of a surgical procedure. For example, the surgical hub 5104 can be configured to determine the type of surgical procedure being performed, retrieve the corresponding list of steps or order of equipment usage (e.g., from a memory), and then compare the steps being performed or the equipment being used during the course of the surgical procedure to the expected steps or equipment for the type of surgical procedure that the surgical hub 5104 determined is being performed. In one exemplification, the surgical hub 5104 can be configured to provide an alert indicating that an unexpected action is being performed or an unexpected device is being utilized at the particular step in the surgical procedure.
Overall, the situational awareness system for the surgical hub 5104 improves surgical procedure outcomes by adjusting the surgical instruments (and other modular devices 5102) for the particular context of each surgical procedure (such as adjusting to different tissue types) and validating actions during a surgical procedure. The situational awareness system also improves surgeons' efficiency in performing surgical procedures by automatically suggesting next steps, providing data, and adjusting displays and other modular devices 5102 in the surgical theater according to the specific context of the procedure.
Referring now to
The situationally aware surgical hub 106, 206 receives data from the data sources throughout the course of the surgical procedure, including data generated each time medical personnel utilize a modular device that is paired with the surgical hub 106, 206. The surgical hub 106, 206 can receive this data from the paired modular devices and other data sources and continually derive inferences (i.e., contextual information) about the ongoing procedure as new data is received, such as which step of the procedure is being performed at any given time. The situational awareness system of the surgical hub 106, 206 is able to, for example, record data pertaining to the procedure for generating reports, verify the steps being taken by the medical personnel, provide data or prompts (e.g., via a display screen) that may be pertinent for the particular procedural step, adjust modular devices based on the context (e.g., activate monitors, adjust the field of view (FOV) of the medical imaging device, or change the energy level of an ultrasonic surgical instrument or RF electrosurgical instrument), and take any other such action described above.
As the first step S202 in this illustrative procedure, the hospital staff members retrieve the patient's EMR from the hospital's EMR database. Based on select patient data in the EMR, the surgical hub 106, 206 determines that the procedure to be performed is a thoracic procedure.
Second step S204, the staff members scan the incoming medical supplies for the procedure. The surgical hub 106, 206 cross-references the scanned supplies with a list of supplies that are utilized in various types of procedures and confirms that the mix of supplies corresponds to a thoracic procedure. Further, the surgical hub 106, 206 is also able to determine that the procedure is not a wedge procedure (because the incoming supplies either lack certain supplies that are necessary for a thoracic wedge procedure or do not otherwise correspond to a thoracic wedge procedure).
Third step S206, the medical personnel scan the patient band via a scanner that is communicably connected to the surgical hub 106, 206. The surgical hub 106, 206 can then confirm the patient's identity based on the scanned data.
Fourth step S208, the medical staff turns on the auxiliary equipment. The auxiliary equipment being utilized can vary according to the type of surgical procedure and the techniques to be used by the surgeon, but in this illustrative case they include a smoke evacuator, insufflator, and medical imaging device. When activated, the auxiliary equipment that are modular devices can automatically pair with the surgical hub 106, 206 that is located within a particular vicinity of the modular devices as part of their initialization process. The surgical hub 106, 206 can then derive contextual information about the surgical procedure by detecting the types of modular devices that pair with it during this pre-operative or initialization phase. In this particular example, the surgical hub 106, 206 determines that the surgical procedure is a VATS procedure based on this particular combination of paired modular devices. Based on the combination of the data from the patient's EMR, the list of medical supplies to be used in the procedure, and the type of modular devices that connect to the hub, the surgical hub 106, 206 can generally infer the specific procedure that the surgical team will be performing. Once the surgical hub 106, 206 knows what specific procedure is being performed, the surgical hub 106, 206 can then retrieve the steps of that procedure from a memory or from the cloud and then cross-reference the data it subsequently receives from the connected data sources (e.g., modular devices and patient monitoring devices) to infer what step of the surgical procedure the surgical team is performing.
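The combination-based inference described here can be pictured as matching the received data against stored procedure signatures; the signatures, flags, and supply names in the sketch below are invented for illustration and do not represent the actual inference logic.

```python
# Hedged sketch of narrowing down the specific procedure from the EMR data,
# scanned supplies, and paired modular devices; all entries are hypothetical.

PROCEDURE_SIGNATURES = {
    "VATS lung segmentectomy": {
        "emr": {"thoracic"},
        "supplies": {"stapler reloads", "scope drape"},
        "devices": {"smoke evacuator", "insufflator", "medical imaging device"},
    },
    "laparoscopic cholecystectomy": {
        "emr": {"abdominal"},
        "supplies": {"clip applier", "scope drape"},
        "devices": {"insufflator", "medical imaging device"},
    },
}


def infer_procedure(emr_flags, supplies, devices):
    """Return the first procedure whose signature is fully contained in the received data."""
    for name, sig in PROCEDURE_SIGNATURES.items():
        if sig["emr"] <= emr_flags and sig["supplies"] <= supplies and sig["devices"] <= devices:
            return name
    return None


print(infer_procedure(
    {"thoracic"},
    {"stapler reloads", "scope drape", "sutures"},
    {"smoke evacuator", "insufflator", "medical imaging device"},
))  # VATS lung segmentectomy
```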
Fifth step S210, the staff members attach the EKG electrodes and other patient monitoring devices to the patient. The EKG electrodes and other patient monitoring devices are able to pair with the surgical hub 106, 206. As the surgical hub 106, 206 begins receiving data from the patient monitoring devices, the surgical hub 106, 206 thus confirms that the patient is in the operating theater.
Sixth step S212, the medical personnel induce anesthesia in the patient. The surgical hub 106, 206 can infer that the patient is under anesthesia based on data from the modular devices and/or patient monitoring devices, including EKG data, blood pressure data, ventilator data, or combinations thereof, for example. Upon completion of the sixth step S212, the pre-operative portion of the lung segmentectomy procedure is completed and the operative portion begins.
Seventh step S214, the patient's lung that is being operated on is collapsed (while ventilation is switched to the contralateral lung). The surgical hub 106, 206 can infer from the ventilator data that the patient's lung has been collapsed, for example. The surgical hub 106, 206 can infer that the operative portion of the procedure has commenced as it can compare the detection of the patient's lung collapsing to the expected steps of the procedure (which can be accessed or retrieved previously) and thereby determine that collapsing the lung is the first operative step in this particular procedure.
Eighth step S216, the medical imaging device (e.g., a scope) is inserted and video from the medical imaging device is initiated. The surgical hub 106, 206 receives the medical imaging device data (i.e., video or image data) through its connection to the medical imaging device. Upon receipt of the medical imaging device data, the surgical hub 106, 206 can determine that the laparoscopic portion of the surgical procedure has commenced. Further, the surgical hub 106, 206 can determine that the particular procedure being performed is a segmentectomy, as opposed to a lobectomy (note that a wedge procedure has already been discounted by the surgical hub 106, 206 based on data received at the second step S204 of the procedure). The data from the medical imaging device 124 (
Ninth step S218, the surgical team begins the dissection step of the procedure. The surgical hub 106, 206 can infer that the surgeon is in the process of dissecting to mobilize the patient's lung because it receives data from the RF or ultrasonic generator indicating that an energy instrument is being fired. The surgical hub 106, 206 can cross-reference the received data with the retrieved steps of the surgical procedure to determine that an energy instrument being fired at this point in the process (i.e., after the completion of the previously discussed steps of the procedure) corresponds to the dissection step. In certain instances, the energy instrument can be an energy tool mounted to a robotic arm of a robotic surgical system.
Tenth step S220, the surgical team proceeds to the ligation step of the procedure. The surgical hub 106, 206 can infer that the surgeon is ligating arteries and veins because it receives data from the surgical stapling and cutting instrument indicating that the instrument is being fired. Similarly to the prior step, the surgical hub 106, 206 can derive this inference by cross-referencing the receipt of data from the surgical stapling and cutting instrument with the retrieved steps in the process. In certain instances, the surgical instrument can be a surgical tool mounted to a robotic arm of a robotic surgical system.
Eleventh step S222, the segmentectomy portion of the procedure is performed. The surgical hub 106, 206 can infer that the surgeon is transecting the parenchyma based on data from the surgical stapling and cutting instrument, including data from its cartridge. The cartridge data can correspond to the size or type of staple being fired by the instrument, for example. As different types of staples are utilized for different types of tissues, the cartridge data can thus indicate the type of tissue being stapled and/or transected. In this case, the type of staple being fired is utilized for parenchyma (or other similar tissue types), which allows the surgical hub 106, 206 to infer that the segmentectomy portion of the procedure is being performed.
Twelfth step S224, the node dissection step is then performed. The surgical hub 106, 206 can infer that the surgical team is dissecting the node and performing a leak test based on data received from the generator indicating that an RF or ultrasonic instrument is being fired. For this particular procedure, an RF or ultrasonic instrument being utilized after parenchyma was transected corresponds to the node dissection step, which allows the surgical hub 106, 206 to make this inference. It should be noted that surgeons regularly switch back and forth between surgical stapling/cutting instruments and surgical energy (i.e., RF or ultrasonic) instruments depending upon the particular step in the procedure because different instruments are better adapted for particular tasks. Therefore, the particular sequence in which the stapling/cutting instruments and surgical energy instruments are used can indicate what step of the procedure the surgeon is performing. Moreover, in certain instances, robotic tools can be utilized for one or more steps in a surgical procedure and/or handheld surgical instruments can be utilized for one or more steps in the surgical procedure. The surgeon(s) can alternate between robotic tools and handheld surgical instruments and/or can use the devices concurrently, for example. Upon completion of the twelfth step S224, the incisions are closed up and the post-operative portion of the procedure begins.
Thirteenth step S226, the patient's anesthesia is reversed. The surgical hub 106, 206 can infer that the patient is emerging from the anesthesia based on the ventilator data (i.e., the patient's breathing rate begins increasing), for example.
Lastly, the fourteenth step S228 is that the medical personnel remove the various patient monitoring devices from the patient. The surgical hub 106, 206 can thus infer that the patient is being transferred to a recovery room when the hub loses EKG, BP, and other data from the patient monitoring devices. As can be seen from the description of this illustrative procedure, the surgical hub 106, 206 can determine or infer when each step of a given surgical procedure is taking place according to data received from the various data sources that are communicably coupled to the surgical hub 106, 206.
Situational awareness is further described in U.S. Provisional Patent Application Ser. No. 62/659,900, titled METHOD OF HUB COMMUNICATION, filed Apr. 19, 2018, which is herein incorporated by reference in its entirety. In certain instances, operation of a robotic surgical system, including the various robotic surgical systems disclosed herein, for example, can be controlled by the hub 106, 206 based on its situational awareness and/or feedback from the components thereof and/or based on information from the cloud 104.
A variety of computer systems have been described herein, including surgical hubs 106, 206 (
A variety of paradigms or techniques can be utilized to efficiently share data between interrelated or connected databases, such as implementing relational database models or utilizing consistent data formats so that data is portable across the different computer systems in a network. Two general structured data-sharing paradigms described herein are referred to as “data interoperability” and “data fluidity.” These data-sharing paradigms can be characterized as rulesets executed by each of the computer systems within a computer network that define how and in what ways data is shared by and between the computer systems within the computer network. The rule set can be embodied as a set of computer-executable instructions stored in a memory of a computer system (e.g., memory 249 of the surgical hub 206 illustrated in
Data interoperability is defined as the ability of computer or database systems to work cooperatively by having a database automatically transmit particular data to recipient databases according to predefined rules. For each type of data generated by or at a computer system, the rules of the data interoperability paradigm delineate to which recipient database(s) the computer should transmit each type of data and, in some cases, the data format each type of data is to be transmitted in to each particular recipient database. In some aspects, data interoperability can be characterized as a one-way communication of data between computer systems. Further, in some aspects, the computer system transmitting data through the one-way communication channel can lack the ability to accept data of the same type back from the receiving computer system. These aspects can be beneficial in order to, for example, have one database drive or control the data that is stored or presented in another database.
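A data interoperability ruleset of this kind could be sketched as a simple one-way routing table keyed by source database and data type; the database names, data types, and formats below are assumptions chosen to mirror the examples that follow.

```python
# Minimal sketch of a one-way data interoperability ruleset; names are illustrative.

INTEROPERABILITY_RULES = {
    # (source database, data type) -> list of (recipient database, transmission format)
    ("ehr_db", "new_prescription"): [("pharmacy_db", "prescription_request")],
    ("pharmacy_db", "prescription_filled"): [("ehr_db", "billing_update")],
    ("or_scheduling_db", "new_procedure"): [("medical_supply_db", "supply_order")],
    ("lab_db", "lab_result"): [("ehr_db", "patient_record_update")],
}


def route(source: str, data_type: str, payload: dict) -> list:
    """Return the outbound messages this update generates under the ruleset."""
    messages = []
    for recipient, fmt in INTEROPERABILITY_RULES.get((source, data_type), []):
        messages.append({"to": recipient, "format": fmt, "payload": payload})
    return messages


print(route("ehr_db", "new_prescription", {"patient": "12345", "drug": "example"}))
```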
Illustrative of these concepts,
For example, the first database 212002 can include an EHR database, and the second database 212004 can include a pharmacy database. In this implementation, the data interoperability ruleset can dictate that when a patient's EHR is updated in the EHR database to indicate that a new medication has been prescribed to the patient, the relevant prescription data can be automatically transmitted to the pharmacy database as a new prescription request for processing by the pharmacy department. Accordingly, the first database 212002 can be programmed to transmit 212006 data representing a prescription request to the second database 212004. The data in the prescription request can include, for example, drug interaction data and a current drug list from the associated patient's EHRs. Further, the data interoperability ruleset can dictate that when a prescription is prepared in response to a received prescription request, a billing update can be automatically transmitted to the EHR database. Accordingly, the second database 212004 can be programmed to transmit 212008 data representing a billing update to the first database 212002 in response to or upon fulfillment of the prescription request. The transmission of each of these types of data can be unidirectional with respect to the respective databases 212002, 212004.
As another example, the first database 212002 can include an OR scheduling database, and the second database 212004 can include a medical supply database. In this implementation, the data interoperability ruleset can dictate that when a new operation is scheduled or input into the OR scheduling database, relevant data for the scheduled operation can be automatically transmitted to the medical supply database to indicate which supplies should be prepared by the medical supply department and by what time and date they should be prepared. Accordingly, the OR scheduling database can automatically transmit 212006 data representing a procedure to the medical supply database when a new procedure is scheduled. In turn, the employees with access to the medical supply database can automatically receive updates so that they can have the products and instruments needed for the scheduled procedure prepared at the scheduled time.
As yet another example, the first database 212002 can include a lab database, and the second database 212004 can include an EHR database. In this implementation, the data interoperability ruleset can dictate that when a patient's lab results are uploaded to the lab database, the lab results data can be automatically transmitted to the EHR database to be associated with the patient's EHR. Accordingly, the lab database can automatically populate the EHR database with data representing test results and labs when they are completed. As a result, physicians and any other individuals with access to the patient EHR can immediately access the results of any ordered tests and labs without the need to take any further action.
As yet another example, the first database 212002 can include a prescription-entering or EHR database, and the second database 212004 can include a medication-dispensing or pharmacy database. In this implementation, the data interoperability ruleset can dictate that when a new prescription is entered for a patient, the relevant prescription data can be automatically transmitted to the pharmacy database as a new prescription request for processing by the pharmacy department. Accordingly, the medication-dispensing database can automatically receive the prescription when entered by the practitioner so that the prescription can be ready as needed.
As yet another example, the first database 212002 can include a pathology database, and the second database 212004 can include an OR database (e.g., stored in a surgical hub 106, 206). In this implementation, the data interoperability ruleset can dictate that when new pathology results are received for a patient, the relevant pathology data can automatically be transmitted to the OR database for review by the surgical staff. Accordingly, data including updates or results stored in the pathology database can be automatically transmitted 212006 to the OR through an update to the OR database. The data can be transmitted 212006 between the pathology database and the OR database in real time, such as during the course of a surgical procedure to inform subsequent steps of the procedure. As a specific illustration, during a wedge resection procedure to remove a small tumor in a patient's lung, the surgical staff sends the resected specimen to the pathology department to check for malignancy while the patient is still in the OR. If the pathology department confirms malignancy, the surgical staff often elects to complete a lobectomy procedure on the lobe from which the wedge was taken. Accordingly, this process of providing notifications from other departments to the surgical staff during the course of a surgical procedure via the surgical hub can be automated by utilizing a data interoperability paradigm between the pathology database and the surgical hubs, as described above.
Data fluidity is defined as the ability of data to flow from one database to another database according to predefined rules that delineate bidirectional relationships between databases for data sets stored therein. In some aspects, the data fluidity paradigm can define whether data is transmitted to particular recipient databases and/or whether data is linked to particular recipient databases. Data can be automatically shared with or transferred to other databases utilizing relational database techniques (i.e., relations defined between the databases), for example. In one aspect, the databases can execute a set of rules that define which types of data are to be automatically transmitted to which particular recipient database. Furthermore, in one aspect, the databases can execute a set of rules that define the format of the data or the database to which the data is transmitted according to surgical contextual data (metadata) associated with the data. The ruleset can be embodied as a set of computer-executable instructions stored in a memory of a computer system (e.g., memory 249 of the surgical hub 206 illustrated in
For example, a surgical hub can utilize situational awareness (described above under the heading SITUATIONAL AWARENESS) to determine the surgical context (e.g., the surgical procedure type or the surgical procedure step being performed) based on the perioperative data received from the surgical instrument, patient monitors, and other surgical devices or databases and then associate the surgical context with the data being generated (e.g., store the surgical context as metadata for the generated data). The determined surgical context can influence which particular database(s) receive particular data, how much of the data is transmitted to the recipient database(s), the data format in which the data is transmitted, and so on. Accordingly, the computer system (e.g., a surgical hub) can then transmit the gathered data (with or without its associated surgical metadata) to particular recipient databases or in particular data formats according to the determined surgical context. In various aspects, the surgical context can influence the bit size, quantity, resolution, and/or time bracket around the transmitted data (e.g., the number of samples of the data captured at a particular sampling rate). Accordingly, the data fluidity paradigm allows interrelated databases to share data relevant to each database according to the needs of each recipient database. In other words, computer systems sharing data according to a data fluidity paradigm can anticipate the potential uses and needs for data received by the computer systems and then automatically route data to recipient databases or computer systems accordingly. Further, the surgical context can dictate the format that a computer system transmits the data in, the breadth of the data transmitted by the computer system, and so on.
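The context-driven selection of recipients, field subsets, and formats might be sketched as follows, with the surgical context acting as the routing key; the database names, contexts, and field lists are invented for the example.

```python
# Hedged sketch of a data fluidity ruleset in which the determined surgical
# context selects both the recipient databases and the fields each receives.

FLUIDITY_RULES = {
    # surgical context -> {recipient database: fields to share}
    "procedure_complete": {
        "billing_db": ["procedure_code", "duration_min", "expendables"],
        "recovery_db": ["patient_id", "procedure_type"],
        "pathology_db": ["patient_id", "specimens"],
    },
}


def share(context: str, record: dict) -> dict:
    """Return, per recipient database, only the fields that database needs."""
    routed = {}
    for recipient, fields in FLUIDITY_RULES.get(context, {}).items():
        routed[recipient] = {f: record[f] for f in fields if f in record}
    return routed


record = {
    "patient_id": "12345", "procedure_type": "VATS segmentectomy",
    "procedure_code": "proc_001", "duration_min": 142,
    "expendables": ["stapler reload x4"], "specimens": ["wedge sample"],
    "full_video": "<large object not shared>",
}
for db, data in share("procedure_complete", record).items():
    print(db, data)
```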
Illustrative of these concepts,
The data fluidity rulesets dictating data flow between different databases can be defined (e.g., by administrators of the database system 212020) according to the relationships between the departments represented by the databases 212022, 212024, 212026. For example, some departments (e.g., OR and pathology or OR and supply) routinely collaborate or consult with each other on medical issues occurring with patients in the medical facility. Accordingly, the data fluidity rules can dictate that when an update is made to a particular data type (or a set of data types) in one of these collaborating databases, a substantial portion or all of the updated data can be transmitted or linked to the other collaborating database. Further, the transmitted data can include contextual metadata determined through surgical situational awareness and other additional or associated data, for example. Alternatively, some departments (e.g., billing) only need a small portion of certain data types. Accordingly, the data fluidity rules can dictate that when an update is made to a particular data type (or a set of data types) in a database, only a small portion of the updated data that is relevant to the recipient database is transmitted or linked to the recipient database. For example, if the recipient database is a billing department database, the data shared with the billing database may only include procedure codes, the time, and the expendables consumed during a medical procedure, because only that data is needed by the billing department. As can be seen, only data that is relevant to the recipient database is actually transmitted or linked to the recipient database, which limits access to sensitive patient data, prevents the recipient from being overwhelmed with unneeded data, and minimizes required data transmission bandwidth, while still allowing all connected databases to be seamlessly updated in accordance with each other.
In one implementation, the first database 212022 can include a laboratory database, the second database 212024 can include an EHR database, and the third database 212026 can include a hospital administration database. In this implementation of a data fluidity paradigm, the laboratory database and the administration database can transmit 212028 data 212029 between each other, the laboratory database and the EHR database can transmit 212030 data 212031 between each other, and the EHR database and the administration database can transmit 212032 data between each other, as dictated by the particular data fluidity ruleset defining the relations between the various databases. For example, the laboratory database could automatically transmit 212030 data 212031 including completed lab results to the EHR database to associate the lab results with the corresponding patient, whereafter the lab results can be retrieved from the EHR database. As another example, the laboratory database could automatically transmit 212028 data 212029 including a list of tests performed and other details to the hospital administration database, which can then be utilized to update billing information, reorder test supplies as needed, and so on. Further, each of the connections between the various aforementioned databases can be bidirectional. For example, if a patient's EHR is updated in the EHR database to include additional test results performed outside the given medical facility, those test results can likewise be automatically transmitted to the laboratory database for consideration and evaluation by the laboratory staff.
In another implementation, a computer system and/or network of linked databases can be configured to automatically collect and compile surgical outcomes resulting from specific treatment regimes by connecting the databases of various departments via a data fluidity paradigm, allowing all of the data pertaining to a patient's treatment to be aggregated and seamlessly integrated together. By automatically compiling patient outcome data with patient treatment data, patient care can be tracked more accurately and improvements can be developed for treatment regimes, surgical procedures, and other patient care. In some aspects, by automatically sharing relevant data across departments in a specific format for that department, the data can be more easily communicated, which can in turn allow the data to be presented more easily to patients, at meetings, in clinical papers, and so on. In some aspects, data can be recorded in each database and transmitted to the other connected databases in a standard format, allowing data from any given database to be seamlessly integrated into another compliant database.
In one aspect, collaboration across multiple departments could be increased by allowing or causing the data collected in any given database to flow easily from one group of specialists to another. The data fluidity paradigm allows data to flow easily between departments at a medical facility by establishing a standard set of rules, utilized by all computer systems within the medical facility to transmit or link data, that dictates the destinations for any given type of data, the format in which the data is to be transmitted to the recipient database, and so on. The structured data-sharing paradigms described herein are beneficial in this and other contexts because they ensure that the correct data is being collected for physicians' uses. By allowing a computer system to automatically retrieve the necessary data from the relevant database(s) and having the databases update in concert with each other when data is added or changed, human errors in transmitting and transcribing data, errors due to receiving partial or incomplete information, and other such errors are avoided.
In one aspect, some or all of the data in particular databases can respond fluidly to requests from users, rather than being automatically transmitted or linked to another database. Accordingly, a first computer system can be programmed to receive data requests from a second computer or database system (which can be initiated by a user, for example) and then transmit the requested data and/or define a relation between the database stored by the first computer system and the second computer system, depending upon the identity or the type of request sent by the second computer system. For example, physicians can make data requests from the computer system, which then proceeds to automatically collect and compile the requested data from the relevant databases that the computer system is linked to. Such aspects can be utilized in a variety of applications, such as personalized cancer medicine. For example, the computer system can link the oncologist, surgeon, and histologist collaborating to treat a patient by allowing any of them to retrieve all of the treatment data related to the given patient. This in turn allows the medical personnel to each track the patient's treatment and allows each individual associated with the patient's care to easily retrieve and analyze data regarding the patient, such as tumor location, margins, nodal dissection, and chemo treatment. By giving each individual associated with the treatment of a patient total access to the patient's data, follow-up and post-surgical treatment can be improved by ensuring that the medical personnel are all fully up to date on the patient's treatment. In some aspects, the computer system can also be programmed to allow users to define, in addition to what information they would like to receive, the format in which they would like the data to be presented. Accordingly, the computer system can retrieve the identified data from the corresponding databases, convert the data to the desired format, and then present the data to the user.
Accordingly, the processor 244 executing the process 212100 receives 212102 perioperative data from the connected surgical devices and determines 212104 the surgical context based at least in part on the received perioperative data, as discussed above under the heading SITUATIONAL AWARENESS.
What the surgical hub 206 does with the collected data is dictated by the structured data ruleset being implemented by the surgical hub 206. Depending upon the surgical context and the type of data, the surgical hub 206 can transmit the data (or a subset thereof) to another database, set a relation between the database stored in the memory 249 of the surgical hub 206 and another database (i.e., link the relevant data fields of the databases), or take other such actions. In the illustrated aspect, the processor 244 transmits 212106 at least a portion of the collected surgical data to one or more recipient databases based on the determined surgical context and the identities of the recipient databases. The surgical data can include, for example, perioperative data received from the surgical devices, surgical contextual data determined via situational awareness (e.g., the surgery type or the step of the surgical procedure being performed), metadata associated with the surgical devices and/or the surgical context, and so on. Further, the processor 244 sets 212108 a relation between at least a portion of the collected surgical data stored in the surgical hub memory 249 and one or more recipient databases based on the determined surgical context and the identities of the recipient databases. In other words, the surgical hub 206 transmits 212106 data and/or sets 212108 relations between its database and other databases according to the structured data-sharing ruleset, which defines which databases are to receive certain types of data or be linked to certain types of data collected by the surgical hub 206 based on the determined surgical context. For example, the surgical hub 206 could determine that a number of nonreusable surgical devices were used during the surgical procedure via situational awareness and accordingly transmit 212106 data indicating the types and numbers of nonreusable devices that were used to a purchasing database communicably connected to the surgical hub 206 for reordering of those nonreusable devices. The structured data-sharing ruleset can thus define that the purchasing database receives data related to consumed nonreusable surgical devices and that such data is to be transmitted to the purchasing database. As another example, the surgical hub 206 could determine that the surgical procedure is completed or will be completed soon and accordingly set 212108 a relation between the data in its database storing the patient's biographical information and the surgical procedure type and a recovery department database, to notify the recovery staff to prepare to receive the patient. The structured data-sharing ruleset can thus define that the recovery department database receives data related to identifying a patient and the surgery type and that such data is to be linked to the recovery department database.
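In outline form, and under assumed database names and data types, the logic just described might look like the following sketch, in which the ruleset maps a (surgical context, data type) pair to either a transmit action or a link (relation-setting) action.

```python
# Illustrative outline of the structured data-sharing decision: collect data,
# determine the surgical context, then transmit or link per the ruleset.

RULESET = {
    # (surgical context, data type) -> ("transmit" | "link", recipient database)
    ("procedure_complete", "nonreusable_devices_used"): ("transmit", "purchasing_db"),
    ("procedure_complete", "patient_and_procedure"): ("link", "recovery_db"),
}


def process_collected_data(perioperative_data: dict, determine_context) -> list:
    """Return the (action, recipient, payload) tuples dictated by the ruleset."""
    context = determine_context(perioperative_data)  # situational awareness step
    actions = []
    for data_type, payload in perioperative_data.items():
        rule = RULESET.get((context, data_type))
        if rule is None:
            continue  # this data type is not shared in this context
        action, recipient = rule
        actions.append((action, recipient, payload))
    return actions


actions = process_collected_data(
    {"nonreusable_devices_used": {"stapler reload": 4},
     "patient_and_procedure": {"patient_id": "12345", "procedure": "segmentectomy"}},
    determine_context=lambda data: "procedure_complete",
)
print(actions)
```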
Another illustrative implementation of the process 212100 is depicted in
As discussed above, databases may share only a subset of the data they store with other connected databases. Further, different subsets of the data stored by each database may be shared with different databases, depending upon the data needed by the recipient databases. In one aspect, data stored within each database can be organized into data categories, and the structured data-sharing ruleset can dictate which data categories are shared with which other databases. For example,
The computer systems storing the databases 212130, 212132, 212134 that define a database system 212020 can be communicably linked together via, for example, a network. In some aspects, the computer systems can be cloud computing systems, as described above under the heading CLOUD SYSTEM HARDWARE AND FUNCTIONAL MODULES. In some aspects, multiple databases can be stored by a single computer system. In some aspects, the computer systems can be connected via a distributed computing communication protocol.
In one aspect, users can also define the types of data that they would like the medical facility's computer systems, such as the surgical hubs 106, 206 (
In various aspects, database systems executing a structured data-sharing paradigm can monitor the activities occurring in an OR through a surgical hub 206 therein and automatically route relevant data to relevant departments in order to improve the efficiency and function of the medical facility. In one aspect, a surgical hub 206 can be configured to monitor the progress of a surgical procedure, surgical device success rate, and other OR data via, for example, situational awareness. The ability of the surgical hub 206 to seamlessly share and communicate data with other databases in the medical facility can have a substantial number of benefits. For example, the surgical hub 206 can automatically share data regarding surgical device utilization with the re-ordering department through structured data sharing so that they know, for example, not to reorder surgical devices that have poor success rates. As another example, the surgical hub 206 can automatically share data regarding surgical outcomes with the pharmacy department so that they know, for example, that the patient may require additional pain medication due to a prolonged surgical procedure. As yet another example, the surgical hub 206 can automatically share data regarding any biopsies taken during the surgical procedure or other tissue samples that require testing with the pathology department so that they know, for example, to prepare to receive the tissue. As yet another example, the surgical hub 206 can automatically share data regarding the depletion of fluids (e.g., blood) during a surgical procedure with the medical supplies department so that they know to place an order for backup supplies as the OR supply is depleted. As yet another example, the surgical hub 206 can automatically share data regarding an impending procedure with the medical supplies department so that they know, for example, to ready OR-specific drugs, hemostatic agents, and healing-impacting agents (e.g., matrix metalloproteinase inhibitors) before the procedure. With the supplies readied ahead of time, they could then be delivered to the OR in a timely manner, allowing the surgical procedure to proceed on time and with the supplies at the correct usage temperature. Usage temperature can be important for certain types of agents, such as fibrin and thrombin. Fibrin and thrombin are refrigerated, biologically active agents that have to be dispensed at room temperature. If the surgical procedure calls for such an agent, it can accordingly be critical for the adjunct to be at the correct temperature for the procedure. Through structured data sharing, a scheduling database can share scheduled surgical procedure times with all other relevant databases in the medical facility, ensuring that all relevant departments are fully up to date as to the start time for each procedure. If an agent is needed at the beginning of the procedure, then the medical facility personnel can be provided the precise time that the surgical procedure is to begin and can thus know to deliver the agent at that time. If an agent is needed during a procedure, a surgical hub 206 executing a situational awareness system can further monitor the progress of the surgical procedure after it has begun and update other relevant databases as to the status of the surgical procedure through structured data sharing so that medical facility personnel know the precise time at which they should bring desired agents to the OR so that they are maintained at the proper usage temperature.
Accordingly, structured data sharing in the OR context can ensure that the agents are ready at the correct time, at the correct temperature, without risking any damage to the agents. As yet another example, the surgical hub 206 could monitor the progress of the surgical procedure (e.g., via situational awareness) and automatically share the procedural progress with the cleaning department so that they know when to expect to turn over the OR for the next procedure, which in turn aids in overall hospital logistics and scheduling by facilitating the process of cleaning and preparing surgical facilities for subsequent procedures.
In one aspect, a computer system (e.g., a surgical hub 206) can be programmed to track the use of surgical devices and their movement through a medical facility to, for example, collect data on the surgical instruments throughout their life cycle. Such data can include the number of times that a surgical device has been sterilized, repaired, and/or held in inventory or the amount of time that a surgical device has been held in each of the respective departments. A computer system can track surgical devices in this manner through structured data sharing by receiving from the databases of each relevant department location data for a surgical device (e.g., when a surgical device is brought to a department, it can be scanned into that department, which generates a record of the location of the surgical device), repair and maintenance records for the surgical device, and so on. Such data can be utilized to evaluate values, costs, and efficiencies of all of the medical products that are utilized in the medical facility.
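A toy sketch of compiling such life-cycle data from department-level event records is shown below; the event records, department names, and counting approach are illustrative only.

```python
# Small sketch of summarizing a surgical device's life cycle from location and
# maintenance events shared by departmental databases; all records are invented.
from collections import Counter

events = [  # (device_id, department, event)
    ("stapler_12", "OR", "used"),
    ("stapler_12", "sterile_processing", "sterilized"),
    ("stapler_12", "repair", "repaired"),
    ("stapler_12", "sterile_processing", "sterilized"),
    ("stapler_12", "inventory", "stored"),
]


def lifecycle_summary(device_id: str, records) -> Counter:
    """Count how many times each life-cycle event occurred for the device."""
    return Counter(event for dev, _dept, event in records if dev == device_id)


print(lifecycle_summary("stapler_12", events))
# Counter({'sterilized': 2, 'used': 1, 'repaired': 1, 'stored': 1})
```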
In one aspect, a computer system can be programmed to allow patients to contribute self-reported data. In various aspects, the self-reported data could be directly entered into a database of a medical facility computer system via a computer terminal or the patient could cause a personal electronic device (or another personal data collection device) to automatically transmit collected information to a designated recipient database. The self-reported data could include, for example, blood sugar logs from testing equipment (such as a continuous blood glucose monitor or an insulin pump), artificial pancreas data, and so on. The self-reported data can also include, for example, data from activity monitors (e.g., Fitbit or Apple Watch) that are configured to collect activity data, location data, and other types of data. The activity monitors can provide, for example, activity level data (e.g., distance traveled, active minutes, number of steps taken, number of flights of stairs traversed), sleep data (e.g., sleep cycles, duration, and stages), heart rate monitoring data (e.g., resting heart rate, percent of time in specified heart rate zones, which can be determined by age, and heart rate variability), nutritional information, water intake, calories burned, and so on. Once the self-reported data is uploaded to a recipient database, the recipient database can then, in some aspects, automatically share relevant self-reported patient data with other connected devices according to a structured data-sharing ruleset.
With structured data sharing, one concern is ensuring that access to data is granted only to appropriate recipients. Accordingly, all data requests and all requests to link databases must be verified and authorized to prevent unauthorized recipients from gaining access to the data.
Accordingly, the structured data-sharing paradigms described herein, i.e., data fluidity and data interoperability, can facilitate the movement of data throughout a medical facility (or a network of interconnected medical facilities). By seamlessly sharing data so that every interconnected database always has access to all of the data generated in the medical facility that is relevant to its department, structured data-sharing paradigms allow medical facilities to operate more efficiently and provide better patient outcomes.
In some aspects, the computer systems described herein are programmed to provide clear, holistic analyses of the total costs associated with any given surgical procedure or treatment, such as by calculating the total cost associated with all of the items that are used during a surgical procedure or treatment. Such functionality can provide a range of benefits, including allowing administrators to understand precisely where and how money is being expended in a medical facility, providing suggestions on cost-effective product mixes for particular types of surgical procedures, identifying when reusable items should be replaced, determining the degree of wear and tear on the surgical instruments and other items used during a procedure, and so on. Further, this economic data can be integrated with data on treatment or surgical outcomes so that users can provide additional analyses or so that the systems can provide recommendations to users. The data on treatment or surgical outcomes can be determined by, for example, the cloud computing system described in connection with
Accordingly, systems and methods are described herein for analyzing the total costs of surgical instruments and devices for surgical procedures, including both in-house costs and servicing costs. In one aspect, a computer system (e.g., a surgical hub) can be programmed to provide real-time analyses of the comprehensive costs of all instruments and devices used in a surgical procedure, including the costs associated with both reusable devices (e.g., maintenance, cleaning, and resterilization costs) and non-reusable devices (i.e., replacement costs). In some aspects, the computer system can utilize the data-sharing paradigms described above under the heading STRUCTURED DATA SHARING to determine the replacement costs of non-reusable surgical devices by, for example, receiving or sharing data with a purchasing database. In some aspects, the computer system can utilize the data-sharing paradigms described above under the heading STRUCTURED DATA SHARING to determine the actual maintenance costs of reusable surgical devices by, for example, receiving or sharing data among a variety of medical facility databases to track the devices throughout the medical facility. By tracking the devices as they are transported throughout the medical facility for stocking, sterilization, and other in-house maintenance processes, the computer system can calculate the maintenance costs according to the time and resources actually expended on maintaining the surgical devices.
In one aspect, the various computer systems (e.g., surgical hubs) throughout a medical facility can generate, store, and share metadata indicating when and how each surgical device has interacted with each of the various computer systems. For example, when a surgical device is brought into an OR and connects to the surgical hub located within that OR, the surgical hub can generate metadata associated with that surgical instrument indicating the date, time, and location of the surgical instrument and then store and share that metadata with other computer systems within the network. Accordingly, the computer systems described herein can track surgical instruments according to their associated metadata. In one aspect, a computer system (e.g., a surgical hub) can be programmed to retrieve or otherwise receive metadata for all of the surgical devices utilized during the course of a surgical procedure to track them throughout their pre- and post-operative processes, including their locations, statuses, replacement parts installed in them, repairs applied, and cleaning times. Accordingly, the computer system can track the cost and utilization of the surgical devices as they are circulated through the medical facility.
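A minimal sketch of the metadata generation described above, assuming hypothetical field names and a placeholder sharing function in place of the actual hub network, might look like the following:

```python
import json
from datetime import datetime, timezone

def on_device_connected(device_id: str, hub_id: str, location: str) -> dict:
    """Generate interaction metadata (date, time, and location) when a surgical
    device pairs with a surgical hub. Field names are illustrative assumptions."""
    return {
        "device_id": device_id,
        "hub_id": hub_id,
        "location": location,                       # e.g., "OR-3"
        "event": "device_connected",
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

def share_metadata(record: dict, peer_endpoints: list[str]) -> None:
    """Placeholder for sharing metadata with other hubs or facility databases;
    a real system would transmit over the hub network rather than print."""
    payload = json.dumps(record)
    for endpoint in peer_endpoints:
        print(f"would send to {endpoint}: {payload}")

meta = on_device_connected("stapler-001", "hub-OR3", "OR-3")
share_metadata(meta, ["purchasing-db", "sterile-processing-db"])
```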
In one aspect, a computer system (e.g., a surgical hub) can be programmed to track the number of uses of a resterilized or otherwise reused device. The computer system can further be programmed to determine when the device has reached the end of its life according to whether the number of uses meets or exceeds a use threshold. In another aspect, a computer system (e.g., a surgical hub) can be programmed to determine the maintenance costs of a surgical device, determine the replacement cost of the surgical device (e.g., by retrieving the replacement cost from a purchasing database), and then determine whether the surgical device should be replaced according to whether the maintenance costs exceed the replacement costs. Accordingly, the computer system can execute cost analysis algorithms for tracking surgical devices throughout medical facilities, analyze the costs associated with the surgical devices, and provide recommendations to users.
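The end-of-life and replace-versus-maintain determinations described above reduce to simple threshold comparisons; the following sketch illustrates them with assumed figures:

```python
def end_of_life(use_count: int, use_threshold: int) -> bool:
    """A device is flagged as end-of-life when its tracked number of uses
    meets or exceeds the use threshold, as described above."""
    return use_count >= use_threshold

def recommend_replacement(cumulative_maintenance_cost: float, replacement_cost: float) -> bool:
    """Recommend replacement when the accumulated maintenance cost exceeds
    the replacement cost retrieved from a purchasing database."""
    return cumulative_maintenance_cost > replacement_cost

# Example with assumed figures.
print(end_of_life(use_count=12, use_threshold=10))                    # True
print(recommend_replacement(cumulative_maintenance_cost=450.0,
                            replacement_cost=400.0))                  # True
```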
Tracking all of the various costs associated with the total care and maintenance associated with each surgical device allows the cost analysis module 210502 to provide true one-for-one comparisons between different mixes of surgical products. Accordingly, users can utilize the cost analysis module 210502 to perform cost analyses, or the cost analysis module 210502 can automatically perform such analyses and make recommendations to users to more efficiently utilize hospital resources, identify bottlenecks within the medical facility's systems and provide suggestions on how to improve them, identify when there are too few or too many of specific products that are costing time or money, and so on.
As mentioned above, the various computer systems (e.g., surgical hubs) within a medical facility can track each individual surgical device as it is processed through the medical facility's workflow by generating, storing, and sharing metadata indicating when and how each surgical device has interacted with each of the various computer systems. For example,
Additional processes or algorithms can then utilize this surgical device location metadata. For example, a computer system 210704 can determine when a particular surgical device 210702 is at a preceding department in the workflow for the surgical device 210702 and then automatically provide a prompt or notification for the staff to prepare to receive the surgical device 210702 (e.g., prepare sterilization supplies when the surgical device 210702 is in the surgical department 210706 and is expected to then be sent to the sterilization department 210708). As another example, a computer system 210704 can determine when a surgical device 210702 has been used in a surgical procedure or cleaned a threshold number of times and then provide a notification for the staff to order replacement parts for the surgical device 210702 or dispose of the surgical device 210702. Alternatively, the computer system 210704 can automatically order replacement parts for the surgical device 210702 after a threshold number of uses. Such processes reduce or eliminate the need for the medical facility 210700 to excessively stock replacement parts, cleaning products, and other such products onsite.
In another aspect, the computer systems 210704 can be programmed to compare and analyze actual postoperative outcomes to predicted postoperative outcomes, incorporating the economic data generated by the cost analysis module 210502. For example, predicted reoperation costs can be calculated based on predicted surgical outcomes. More particularly, the computer systems 210704 can be programmed to retrieve data (e.g., medical literature data on surgical outcomes that are uploaded to a database accessible by the computer systems 210704) or determine (e.g., by the cloud computing system described in connection with
As described above under the heading SURGICAL HUBS, surgical hubs 206 can be connected to a variety of surgical devices, such as surgical instruments, generators, smoke evacuators, displays, and so on. Through their connections to these surgical devices, the surgical hubs 206 can receive an array of perioperative data from these paired surgical devices while the devices are in use during a surgical procedure. Further, as described above under the heading SITUATIONAL AWARENESS, surgical hubs 206 can determine the context of the surgical procedure being performed (e.g., the procedure type or the step of the procedure being performed) based on perioperative data received, at least in part, from these connected surgical devices. Accordingly, the processor 244 executing the process 210600 determines 210602 whether a surgical procedure is being performed via, for example, a situational awareness system executed by the surgical hub 206. Accordingly, the processor 244 determines 210604 what surgical devices are being utilized during the surgical procedure. In one aspect, the processor 244 can determine 210604 what surgical devices are being used at any given time by detecting which surgical devices are connected to the surgical hub 206, which devices are actively being powered (e.g., whether energy is being supplied to an ultrasonic or RF electrosurgical instrument), by visually identifying which devices are being held or manipulated by the surgeon through camera systems set up throughout the OR, by determining which step of the procedure the surgical staff is performing and thereby inferring what devices are currently being utilized, and so on.
Accordingly, for each surgical device that is or was used during the surgical procedure, the processor 244 determines 210606 whether the surgical device is reusable or non-reusable. The processor 244 can determine 210606 whether a surgical device is reusable by querying a database listing whether each particular item is reusable, retrieving manufacturer's specifications for the surgical device, or retrieving the metadata associated with the surgical device to ascertain whether the item has previously been or is intended to be used multiple times, for example. If the given surgical device is reusable, then the process proceeds along the YES branch and the processor 244 determines 210608 the maintenance cost for the device. The maintenance cost can include repair costs, resterilization costs, cleaning costs, and so on. The processor 244 can determine 210608 the maintenance cost using the techniques discussed above, i.e., tracking the metadata associated with the given surgical device to determine how often, and through what types of maintenance steps, the surgical device is processed during its workflow. If the given surgical device is not reusable, then the process proceeds along the NO branch and the processor 244 determines 210610 the replacement cost for the device. The processor 244 can determine 210610 the replacement cost by querying a purchasing database associated with the medical facility 210700 to retrieve the purchase price of the given surgical device, for example.
In various aspects, the process 210600 calculates the costs associated with each surgical device used during the surgical procedure in order to calculate a complete cost associated with the surgical procedure. Accordingly, the processor 244 determines 210612 whether the surgical procedure is completed via, for example, a situational awareness system, as discussed above. If the procedure is not completed, then the process 210600 proceeds along the NO branch and the processor 244 continues a loop of monitoring which surgical devices are being utilized or consumed until the procedure is completed. If the procedure is completed, then the process 210600 proceeds along the YES branch and the processor 244 determines 210614 the total cost for the surgical procedure based on the aggregated maintenance and replacement costs of the surgical devices utilized during the surgical procedure.
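Taken together, the branches of the process 210600 can be illustrated with the following sketch, in which the reusable flags, maintenance costs, and replacement costs are assumed lookup tables standing in for the purchasing database and the metadata-derived maintenance history:

```python
# Assumed lookup tables standing in for the purchasing database and the
# metadata-derived maintenance history described above.
REUSABLE = {"grasper-07": True, "stapler-cartridge-12": False, "ultrasonic-03": True}
MAINTENANCE_COST = {"grasper-07": 35.0, "ultrasonic-03": 60.0}   # per-use share of upkeep
REPLACEMENT_COST = {"stapler-cartridge-12": 180.0}

def device_cost(device_id: str) -> float:
    """Return the maintenance cost for reusable devices and the replacement cost
    otherwise, following the YES/NO branches of the process described above."""
    if REUSABLE.get(device_id, False):
        return MAINTENANCE_COST.get(device_id, 0.0)
    return REPLACEMENT_COST.get(device_id, 0.0)

def total_procedure_cost(devices_used: list[str]) -> float:
    """Aggregate the per-device costs once the procedure is determined to be complete."""
    return sum(device_cost(d) for d in devices_used)

print(total_procedure_cost(["grasper-07", "stapler-cartridge-12", "ultrasonic-03"]))  # 275.0
```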
In some aspects, the computer systems described herein are programmed to evaluate the surgical staff during the course of a surgical procedure (e.g., how they are using surgical instruments) and propose suggestions to improve the surgical staff members' techniques or actions. In one aspect, the computer systems described herein, such as the surgical hubs 106, 206 (
As described above under the heading SURGICAL HUBS, computer systems, such as surgical hubs 211801, can be connected to or paired with a variety of surgical devices, such as surgical instruments, generators, smoke evacuators, displays, and so on. Through their connections to these surgical devices, the surgical hubs 211801 can receive an array of perioperative data from these paired surgical devices while the devices are in use during a surgical procedure. Further, as described above under the heading SITUATIONAL AWARENESS, surgical hubs 211801 can determine the context of the surgical procedure being performed (e.g., the procedure type or the step of the procedure being performed) based, at least in part, on perioperative data received from these connected surgical devices. Accordingly, the processor 244 executing the process 211000 receives 211002 perioperative data from the surgical device(s) connected or paired with the surgical hub 211801 and determines 211004 the surgical context based at least in part on the received perioperative data utilizing situational awareness. The surgical context determined by the surgical hub 211801 through situational awareness can be utilized to inform evaluations of the surgical staff performing the surgical procedure.
Accordingly, the processor 244 captures 211006 image(s) of the surgical staff performing the surgical procedure via, for example, cameras 211802 positioned within the OR 211800. The captured image(s) can include static images or moving images (i.e., video). The images of the surgical staff can be captured at a variety of angles and magnifications, utilize different filters, and so on. In one implementation, the cameras 211802 are arranged within the OR 211800 so that they can collectively visualize each surgical staff member performing the procedure.
Accordingly, the processor 244 determines 211008 a physical characteristic of one or more surgical staff members from the captured image(s). For example, the physical characteristic can include posture, as discussed in connection with
Accordingly, the processor 244 evaluates 211010 the determined physical characteristic of the surgical staff member against a baseline. In one aspect, the baseline can correspond to the surgical context determined via situational awareness. The processor 244 can retrieve the baselines for various physical characteristics from a memory (e.g., the memory 249 illustrated in
In one aspect, the processor 244 can provide feedback to the surgical staff members in real time during the surgical procedure. The real-time feedback can include a graphical notification or recommendation displayed on a display 211806 within the OR 211800, audio feedback emitted by the surgical hub 211801 or a surgical instrument 211810, and so on. Further, the feedback can include suggestions that trocar port placements be shifted, that a surgical instrument be moved from one trocar port to another port, that the positioning of the patient being operated on be adjusted (e.g., situated at an increased table angle or rolled), and other such suggestions to improve access to the surgical site and minimize non-ideal surgical technique exhibited by the surgical staff. In another aspect, the processor 244 can provide postoperative feedback to the surgical staff members. The postoperative feedback can include graphical overlays or notifications displayed on the captured video of the procedure that can be reviewed by the surgical staff for learning purposes, a post-surgery report indicating times or particular surgical steps where the surgical staff deviated from the baselines, and so on. Any visually identifiable physical characteristic (or combination of physical characteristics) can be utilized as the basis for suggesting improvements in the technique exhibited by the surgical staff.
In one aspect, one or more of the steps of the process 211000 can be executed by a second or remote computer system, such as the cloud computing systems described under the heading CLOUD SYSTEM HARDWARE AND FUNCTIONAL MODULES. For example, the surgical hub 211801 can receive 211002 perioperative data from the connected surgical devices, determine 211004 the surgical context based at least in part on the perioperative data, capture 211006 or receive images of a surgical staff member 211803 via the cameras 211802, and determine 211008 a physical characteristic of the surgical staff member 211803, as described above. However, in this aspect, instead of performing the evaluation onboard the surgical hub 211801, the surgical hub 211801 can instead transmit data regarding the physical characteristic and the determined surgical context to a second computer system, such as a cloud computing system. The cloud computing system can then perform the evaluation by determining whether the determined physical characteristic deviates from the baseline physical characteristic that corresponds to the surgical context. In some aspects, the baseline physical characteristic can be determined or calculated from data aggregated from all of the surgical hubs 211801 that are communicably connected to the cloud computing system, which allows for the cloud computing system to compare surgical staff members' 211803 techniques across a number of medical facilities. Accordingly, the cloud computing system can transmit the results of the comparison between the physical characteristic determined by the surgical hub 211801 and the corresponding baseline stored on or determined by the cloud computing system. Upon receiving the results, the surgical hub 211801 can then take appropriate action (e.g., displaying a notification if the surgical staff members' 211803 technique is deviating from the baseline, as described above). In other aspects, one or more additional or different steps of the process 211000 can be performed by other computing systems that are communicably coupled to the first computing system. Such connected computer systems can, in some aspects, be embodied as distributed computing systems.
Referring to
In one aspect, the posture of the individual being evaluated by the computer system can be quantified as a metric corresponding to the deviation in position of one or more locations of the individual's body from corresponding initial or threshold positions. For example,
In one aspect, the surgical hub 211801 executing the process 211000 can compare the calculated posture metric to one or more thresholds and then take various actions accordingly. In the depicted implementation, the surgical hub 211801 compares the posture metric to a first threshold 211110 and a second threshold 211112. If the normalized posture metric, represented by the second line 211106, exceeds the first threshold 211110, then the surgical hub 211801 can be configured to provide a first notification or warning to the surgical staff in the OR 211800 that indicates that there is a potential risk with the particular individual's form. Further, if the normalized posture metric, represented by the second line 211106, exceeds the second threshold 211112, then the surgical hub 211801 can be configured to provide a second notification or warning to the users in the OR 211800 that indicates that there is a high degree of risk with the particular individual's form. For example, at time t4, the posture metric for the evaluated surgical staff member, as represented by the fourth model 211050d, exceeds the first threshold 211110; accordingly, the surgical hub 211801 can be configured to provide a first or initial warning to the surgical staff.
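One way such a posture metric and its two-level warning scheme could be computed is sketched below; the tracked body locations, normalization constant, and threshold values are illustrative assumptions:

```python
import math

def posture_metric(tracked: list[tuple[float, float, float]],
                   baseline: list[tuple[float, float, float]],
                   normalization: float) -> float:
    """Posture metric as the summed Euclidean deviation of tracked body locations
    (e.g., head, shoulders, elbows) from their baseline positions, normalized to 0..1."""
    deviation = sum(math.dist(p, q) for p, q in zip(tracked, baseline))
    return min(deviation / normalization, 1.0)

def posture_action(metric: float, first_threshold: float = 0.5, second_threshold: float = 0.8) -> str:
    """Map the normalized metric to the two-level warning scheme described above."""
    if metric > second_threshold:
        return "second warning: high degree of risk with individual's form"
    if metric > first_threshold:
        return "first warning: potential risk with individual's form"
    return "no action"

# Example usage with assumed baseline and tracked positions (in meters).
baseline_pts = [(0.0, 1.7, 0.0), (-0.2, 1.5, 0.0), (0.2, 1.5, 0.0)]
tracked_pts  = [(0.1, 1.6, 0.1), (-0.1, 1.4, 0.1), (0.3, 1.4, 0.1)]
m = posture_metric(tracked_pts, baseline_pts, normalization=1.0)
print(round(m, 3), posture_action(m))
```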
Accordingly, the surgical hub 211801 executing the process 211000 can analyze the wrist angle of a surgical staff member's hand holding a surgical instrument 211654 and provide recommendations if the staff member's wrist angle deviates from the baseline. Awkwardly holding a surgical instrument, as evidenced by an extreme wrist angle relative to the surgical instrument, can indicate, for example, that the surgeon is utilizing the surgical instrument incorrectly, has positioned the surgical instrument incorrectly, is utilizing an incorrect surgical instrument for the particular procedural step, or is otherwise acting in a potentially risky manner that could create danger.
In this particular implementation, the angle of the individual's wrist 211650 is defined as the angle α between the longitudinal axis 211656 of the surgical instrument 211654 being held by the surgeon and the longitudinal axis 211652 (i.e., the proximal-to-distal axis) of the individual's hand. In other implementations, wrist angle can be defined as the angle between the individual's hand and forearm, for example. In the scatterplot 211700 of
In one aspect, the surgical hub 211801 executing the process 211000 can compare the calculated wrist angle α to one or more thresholds and then take various actions accordingly. In the depicted implementation, the surgical hub 211801 determines whether the surgeon's wrist angle α falls within a first zone, which is delineated by a first threshold 211708a and a second threshold 211708b, within a second zone, which is delineated by a third threshold 211706a and a fourth threshold 211706b, or outside the second zone. If the wrist angle α measured by the surgical hub 211801 during the course of a surgical procedure falls between the first and second thresholds 211708a, 211708b, then the surgical hub 211801 can be configured to determine that the wrist angle α is within acceptable parameters and take no action. If the surgeon's wrist angle α falls outside of the first and second thresholds 211708a, 211708b but within the third and fourth thresholds 211706a, 211706b, then the surgical hub 211801 can be configured to provide a first notification or warning to the surgical staff in the OR 211800 that indicates that there is a potential risk with the particular individual's form. Further, if the surgeon's wrist angle α falls outside of the third and fourth thresholds 211706a, 211706b, then the surgical hub 211801 can be configured to provide a second notification or warning to the users in the OR 211800 that indicates that there is a high degree of risk with the particular individual's form.
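The zone-based classification of the wrist angle α can be expressed as a pair of nested range checks; the threshold values in the following sketch are illustrative assumptions:

```python
def classify_wrist_angle(alpha_deg: float,
                         first: float = 80.0, second: float = 100.0,   # inner (acceptable) zone
                         third: float = 60.0, fourth: float = 120.0    # outer zone bounds
                         ) -> str:
    """Classify the wrist angle alpha against nested threshold pairs:
    within [first, second] -> acceptable; within [third, fourth] but outside the
    inner zone -> first warning; outside [third, fourth] -> second warning."""
    if first <= alpha_deg <= second:
        return "acceptable: no action"
    if third <= alpha_deg <= fourth:
        return "first warning: potential risk with individual's form"
    return "second warning: high degree of risk with individual's form"

for angle in (90.0, 70.0, 130.0):
    print(angle, "->", classify_wrist_angle(angle))
```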
In some aspects, the various thresholds or baselines against which the monitored physical characteristic is compared can be determined empirically. The surgical hubs 211801 and/or cloud computing system described above under the heading CLOUD SYSTEM HARDWARE AND FUNCTIONAL MODULES can capture data related to various physical characteristics of the surgical staff members from a sample population of surgical procedures for analysis. In one aspect, the computer system can correlate those physical characteristics with various surgical outcomes and then set the thresholds or baselines according to the particular physical characteristics of the surgeon or other surgical staff members that are correlated most highly with positive surgical outcomes. Accordingly, a surgical hub 211801 executing the process 211000 can provide notifications or warnings when the surgical staff members are deviating from best practices. In another aspect, the computer system can set the thresholds or baselines according to the physical characteristics that are exhibited most often within the sample population. Accordingly, a surgical hub 211801 executing the process 211000 can provide notifications or warnings when the surgical staff members are deviating from the most common practices. For example, in
In one aspect, the physical characteristic being tracked by the surgical hub 211801 can be differentiated according to product type. Accordingly, the surgical hub 211801 can be configured to notify the surgical staff members when the particular physical characteristic being tracked corresponds to a different product type. For example, the surgical hub 211801 can be configured to notify the surgeon when the surgeon's arm and/or wrist posture deviates from the baseline for the particular surgical instrument currently being utilized and thus indicates that a different surgical instrument would be more appropriate.
In one aspect, the surgical hub 211801 can be configured to compare the external orientation of a surgical instrument 211810 to the internal access orientation of its end effector. The external orientation of the surgical instrument 211810 can be determined via the cameras 211802 and optical systems described above. The internal orientation of the end effector of the surgical instrument 211810 can be determined via an endoscope or another scope utilized to visualize the surgical site. By comparing the external and internal orientations of the surgical instrument 211810, the surgical hub 211801 can then determine whether a different type of surgical instrument 211810 would be more appropriate. For example, the surgical hub 211801 can be configured to provide a notification to the surgical staff if the external orientation of the surgical instrument 211810 deviates from the internal orientation of the end effector of the surgical instrument 211810 to more than a threshold degree.
In sum, computer systems, such as a surgical hub 211801, can be configured to provide recommendations to a surgical staff member (e.g., a surgeon) as the surgical staff member's technique starts to drift from best or common practices. In some aspects, the computer system can be configured to only provide notifications or feedback when the individual has repeatedly exhibited suboptimal behavior during the course of a given surgical procedure. The notifications provided by the computer systems can suggest, for example, that the surgical staff member adjust their technique to coincide with the optimal technique for the procedure type, utilize a more appropriate instrument, and so on.
In one aspect, the computer system (e.g., a surgical hub 211801) can be configured to allow surgical staff members to compare their technique to themselves, rather than to the baselines established by the sampled population or pre-programmed into the computer system. In other words, the baseline against which the computer system compares a surgical staff member can be the surgical staff member's prior performance in a particular surgical procedure type or a prior instance of utilizing a particular type of surgical instrument. Such aspects can be useful to allow surgeons to track improvements in their surgical techniques or document trial periods for new surgical products. Accordingly, the surgical hub 211801 can be configured to evaluate products during a trial period and provide highlights of the use of the products during the given period. In one aspect, the surgical hub 211801 can be programmed to be especially sensitive to deviations between the surgical staff member's performance and the corresponding baselines so that the surgical hub 211801 can reinforce the proper techniques for using the surgical device while the trial period is ongoing. In one aspect, the surgical hub 211801 could be configured to record the use of the new surgical products and compare and contrast the new products with the previous baseline product use. The surgical hub 211801 could further provide a post-analysis review to highlight similarities and differences noted between the surgeon's tracked physical characteristics when utilizing the two different products. Further, the surgical hub 211801 can allow the surgeon to compare populations of procedures between the new and old surgical products. The recommendations provided by the surgical hub 211801 can include, for example, comparative videos demonstrating the use of the new products.
In one aspect, the computer system (e.g., a surgical hub 211801) can be configured to allow surgical staff members to compare their technique directly to other surgeons, rather than to the baselines established by the sampled population or pre-programmed into the computer system.
In one aspect, the computer system (e.g., a surgical hub 211801) can be configured to analyze trends in surgical device usage as surgeons become more experienced in performing particular surgical procedures (or performing surgical procedures generally) or using new surgical instruments. For example, the computer system could identify motions, behaviors, and other physical characteristics that change dramatically as the surgeons become more experienced. Accordingly, the computer system can recognize when a surgeon is exhibiting suboptimal techniques early in the surgeon's learning curve and can provide recommendations about the optimal approach, prior to the suboptimal technique becoming ingrained in the surgeon.
Accordingly, the processor 244 executing the process 211600 captures 211602 image(s) (which can include static images or video) of the OR 211800 via an assembly of cameras 211802 situated therein. Any captured images that include surgical staff members 211803 and/or surgical devices can be analyzed by the process 211600 to ascertain information about the surgical staff members 211803 and/or surgical devices for controlling the surgical devices. Targets to be tracked or monitored (i.e., the surgical staff members 211803 and surgical devices) can be recognized from images captured by the assembly of cameras 211802 utilizing a variety of image or object recognition techniques, including appearance and feature-based techniques. For example, the captured images can be processed utilizing an edge detection algorithm (e.g., a Canny edge detector algorithm) to generate outlines of the various objects within each image. An algorithm can then compare the templates of target objects to the images containing the outlined objects to determine whether any of the target objects are located within the images. As another example, an algorithm can extract features from the captured images. The extracted features can then be fed to a machine learning model (e.g., an artificial neural network or a support vector machine) trained via supervised or unsupervised learning techniques to correlate a feature vector to the targets. The features can include edges (extracted via a Canny edge detector algorithm, for example), curvature, corners (extracted via a Harris & Stephens corner detector algorithm, for example), and so on.
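A minimal sketch of this edge-detection and template-comparison approach is shown below; the use of the OpenCV library, the Canny thresholds, and the synthetic images are illustrative assumptions rather than features of any particular hub:

```python
import cv2
import numpy as np

def find_target(frame: np.ndarray, template: np.ndarray, score_threshold: float = 0.7):
    """Outline objects in a captured frame with a Canny edge detector and test
    whether an edge template of a target object (e.g., a surgical instrument)
    appears in the frame via normalized template matching."""
    frame_edges = cv2.Canny(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), 50, 150)
    template_edges = cv2.Canny(cv2.cvtColor(template, cv2.COLOR_BGR2GRAY), 50, 150)
    result = cv2.matchTemplate(frame_edges, template_edges, cv2.TM_CCOEFF_NORMED)
    _, max_score, _, max_loc = cv2.minMaxLoc(result)
    return (max_loc, max_score) if max_score >= score_threshold else (None, max_score)

# Example usage with synthetic images (a real system would use camera frames
# and stored templates of target objects).
frame = np.zeros((240, 320, 3), dtype=np.uint8)
cv2.rectangle(frame, (100, 80), (180, 140), (255, 255, 255), 2)
template = frame[70:150, 90:190].copy()
location, score = find_target(frame, template)
print(location, round(float(score), 2))
```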
Accordingly, the processor 244 determines 211604 a characteristic or condition of the surgical staff and/or surgical devices captured by the images. Such characteristics or conditions can include physical properties, actions, interactions between other objects or individuals, and so on. More particularly, characteristics or conditions of the surgical staff members 211803 can include whether a surgical staff member 211803 is performing a gesture 211804 (as shown in
Accordingly, the processor 244 controls 211606 a surgical device that is paired with the surgical hub 211801 in a manner that depends upon the particular determined characteristic or condition. For example, if the processor 244 determines 211604 that a surgical staff member 211803 is making a “change instrument mode” gesture, then the processor 244 can transmit a signal to or otherwise control 211606 a particular surgical instrument 211810 (or its associated generator) connected to the surgical hub 211801 to change the operational mode of the surgical instrument 211810 (e.g., change an electrosurgical surgical instrument from a sealing mode to a cutting mode). This would allow the surgical staff to control the surgical instruments 211810 without the need to directly interact with the surgical instruments 211810 themselves. As another example, if the processor 244 determines 211604 that a surgical instrument 211810 is being passed (or is being prepared to be passed) from one surgical staff member 211803 (e.g., a nurse) to another surgical staff member 211803 (e.g., a surgeon), then the processor 244 can transmit a signal to or otherwise control 211606 the energy generator to activate and begin supplying energy to the connected surgical instrument 211810. This would allow the surgical hub 211801 to preemptively activate surgical instruments 211810 so that they are ready for use without the surgeon needing to take any affirmative action. As yet another example, if the processor 244 determines 211604 that a surgical instrument 211810 is at a particular orientation when being (or as it is about to be) fired, the processor 244 can transmit a signal to or otherwise control 211606 the surgical instrument 211810 to modify the operational parameters of the surgical instrument 211810 (e.g., force to fire or maximum permitted articulation angle) accordingly. This would allow the surgical hub 211801 to control the functions of the surgical instruments 211810 to account for differences in placements and orientations of the surgical instruments 211810.
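One way to organize such condition-dependent control is a dispatch table mapping determined characteristics or conditions to control callbacks; the condition labels and callback behaviors in the following sketch are illustrative assumptions standing in for the signals a hub would actually send to connected devices:

```python
from typing import Callable

# Hypothetical control callbacks standing in for signals the hub would send
# to connected instruments or generators.
def change_instrument_mode(device: str) -> str:
    return f"{device}: switched from sealing mode to cutting mode"

def preactivate_generator(device: str) -> str:
    return f"{device}: generator energized ahead of instrument hand-off"

def adjust_firing_parameters(device: str) -> str:
    return f"{device}: force-to-fire and articulation limits adjusted for orientation"

# Determined characteristic or condition -> control action, mirroring the examples above.
CONTROL_DISPATCH: dict[str, Callable[[str], str]] = {
    "gesture:change_instrument_mode": change_instrument_mode,
    "event:instrument_handoff": preactivate_generator,
    "state:atypical_orientation_at_firing": adjust_firing_parameters,
}

def control_device(condition: str, device: str) -> str:
    action = CONTROL_DISPATCH.get(condition)
    return action(device) if action else f"{device}: no control action for '{condition}'"

print(control_device("gesture:change_instrument_mode", "electrosurgical-instrument"))
print(control_device("event:instrument_handoff", "ultrasonic-generator"))
```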
In another aspect, the surgical hub 211801 can include a voice recognition system in addition to or in lieu of the gesture recognition system 211500, described below. In this aspect, the surgical hub 211801 can be programmed to identify and respond to a variety of voice commands and control the functions of any connected surgical devices accordingly.
In another aspect,
In one implementation of the processes 211600, 211620 described in connection with
The gesture recognition system 211500 is programmed to receive image or video data from the image recognition hardware (e.g., the cameras 211802), recognize various gestures 211804 that can be performed by the surgical staff members 211803 (i.e., determine 211604, 211624 whether a gesture is being performed in the processes 211600, 211620 described in connection with
Upon recognizing a gesture via the gesture recognition module 211504, the gesture recognition system 211500 can take an action 211510 or make a response that corresponds to the identified gesture. In one aspect, the action 211510 taken by the computer system includes controlling a surgical device within the OR 211800, as discussed above in connection with
In another aspect, the action 211510 taken by the computer system includes saving the gestures made by the surgical staff as metadata associated with or linked to the perioperative data generated by the surgical devices during the course of the surgical procedure, as discussed above in connection with
In another aspect, the gesture recognition system 211500 utilizes a magnetic sensing system for receiving non-contact input from users, in addition to or in lieu of cameras 211802 to visually identify gestures. In this aspect, the gesture recognition system 211500 can include, for example, a magnetic sensing array that can be positioned within the OR 211800. The magnetic sensing array can be configured to monitor for the positions of magnetic elements that can be controlled by the surgical staff members 211803. In one aspect, the magnetic elements can be built into a surgical glove or another such article of clothing. In another aspect, the magnetic elements can be located within an object or token that is manipulable by the surgical staff members 211803. Accordingly, the magnetic sensing array can be configured to detect the position of the magnetic sensing elements over time and identify any gestures that are performed by the individual controlling the magnetic elements. As with the gesture recognition system 211500, users can scroll through menus or select items from menus displayed on displays 211806 within the OR 211800 or make other gestures to control the functions of various surgical devices within the OR 211800. Accordingly, the position, movement, and/or orientation of the magnetic element can be utilized as a tracking marker for controlling displays 211806 or other surgical devices that are connected by the surgical hub 211801, whether they are located within or outside of the sterile field.
In one prophetic implementation of the processes 211600, 211620 described in connection with
In one aspect of the process 211620 described in
Further, this data can then be utilized to establish thresholds or baselines, which can in turn be utilized to provide recommendations to surgical staff members 211803 during or after the completion of a surgical procedure, as described in U.S. Patent Application Publication No. 2019/0201126, titled USAGE AND TECHNIQUE ANALYSIS OF SURGEON/STAFF PERFORMANCE AGAINST A BASELINE TO OPTIMIZE DEVICE UTILIZATION AND PERFORMANCE FOR BOTH CURRENT AND FUTURE PROCEDURES, which published on Jul. 4, 2019. For example, as illustrated in
In one aspect of the process 211600 described in
In one aspect, the computer system can be programmed to create an orientation index that defines the pose of a surgical instrument 211810 with respect to a predefined or normalized reference frame. This can allow data captured in ORs of differing dimensions to be compared seamlessly. The orientation index can be defined when the surgical hub 206 scans its surroundings utilizing a non-contact sensor module 242, as described under the heading SURGICAL HUBS, for example. Accordingly, the computer system can detect and save the pose of the surgical instrument 211810 as a function of the predefined reference frame.
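A sketch of such an orientation index, in which a 4x4 homogeneous transform (assumed here for illustration) maps instrument poses from a particular OR's coordinate frame into the normalized reference frame, is shown below:

```python
import numpy as np

def pose_to_reference_frame(instrument_pose_room: np.ndarray,
                            room_to_reference: np.ndarray) -> np.ndarray:
    """Re-express a 4x4 instrument pose, measured in a particular OR's coordinate
    frame, in a predefined normalized reference frame so that poses captured in
    differently sized ORs can be compared directly."""
    return room_to_reference @ instrument_pose_room

# Assumed transform from this OR's frame to the normalized reference frame
# (e.g., derived when the hub scans the room with its non-contact sensor module).
theta = np.deg2rad(90.0)
room_to_reference = np.array([
    [np.cos(theta), -np.sin(theta), 0.0, 1.5],   # rotate about z and translate
    [np.sin(theta),  np.cos(theta), 0.0, 0.5],
    [0.0,            0.0,           1.0, 0.0],
    [0.0,            0.0,           0.0, 1.0],
])

# Assumed instrument pose in the OR frame (identity orientation, offset position).
instrument_pose = np.eye(4)
instrument_pose[:3, 3] = [2.0, 1.0, 1.1]

print(np.round(pose_to_reference_frame(instrument_pose, room_to_reference), 3))
```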
In other implementations, the computer system can track the locations and orientations of trocars utilized for a particular surgical procedure type, which can then be saved as metadata and/or utilized to control the displays 211806 or other surgical devices to provide recommendations to the surgical staff. The trocar positions can be analyzed to determine which range of positions (or combination of positions for surgical procedures utilizing multiple trocars) is correlated most highly with positive procedural outcomes. Accordingly, the computer system can then provide recommendations for trocar placements in future surgical procedures.
In other implementations, the computer system can track the location of the handle with respect to surrounding objects (e.g., the surgical table or other equipment), which can then be saved as metadata and/or utilized to control the displays 211806 or other surgical devices to provide recommendations to the surgical staff. For example, the computer system can provide recommendations on the placement of trocars to avoid issues in previous procedures where particular placements caused the surgical instruments 211810 inserted through those trocars to be obstructed by various objects, resulting in more challenging procedures (which can be correlated with worse surgical outcomes or longer procedure times, for example).
In other implementations, the computer system can identify the surgical instruments 211810 and other surgical devices in the setup located on the preoperative back table to provide additional context to the surgical procedure data and/or the inferences made by the situational awareness system, as described under the heading SITUATIONAL AWARENESS. Identifying which surgical devices are (or are not) in the preoperative setup can inform the later inferences made by the situational awareness system.
In other implementations, the computer system can identify the circulating nurses and/or scrub nurses from the surgical staff members 211803 and track their locations and activities to assist in informing what the next step of the surgical procedure may be. The activities of the scrub nurse can be informative because the scrub nurse usually retrieves the surgical instrument 211810 that is expected to be needed next and then transfers that surgical instrument 211810 to the surgeon when needed. Further, some surgical instruments 211810 or other devices need preparation before they are utilized (e.g., when dictated by the tissue conditions, buttress may be placed on a surgical stapler). Accordingly, when the scrub nurse is holding a surgical instrument 211810, which surgical instrument 211810 is being held by the scrub nurse and what preparations are being performed by the scrub nurse can assist in inferring which steps of the surgical procedure are being performed or will be performed. Still further, new equipment being transferred from the circulating nurse to the scrub nurse can generally inform how the procedure is going, inform which procedure steps are being performed, and indicate the possibility of complications. For example, if additional adjunctive hemostats are being transferred to the scrub nurse, that can indicate that the surgical procedure is not proceeding well because there is more bleeding than was initially anticipated. Still further, circulating nurses bring materials into the OR, adjust the settings of surgical devices outside the sterile field, and so on. Accordingly, these activities can be monitored and also be used to inform which steps of the surgical procedure are being performed.
In various aspects, computer systems, such as the surgical hubs 106, 206 described in connection with
As described above under the heading SURGICAL HUBS, computer systems, such as the surgical hubs 106, 206 (
Accordingly, the processor 244 executing the process 210000 receives 210002 perioperative data from the surgical device(s) connected or paired with the surgical hub 206 and determines 210004 the surgical context based at least in part on the received perioperative data utilizing situational awareness. The surgical context determined by the surgical hub 206 through situational awareness can be utilized to inform evaluations of the surgical staff performing the surgical procedure.
Accordingly, the processor 244 determines 210006 a procedural variable associated with the surgical procedure based on the surgical context and the perioperative data from the connected surgical devices. The procedural variable can include any aspect or characteristic of the surgical procedure that can vary between individual performances of the surgical procedure type. For example, the procedural variable can include the length of time for the surgical procedure as a whole, the length of time for a particular step of the surgical procedure, the type of surgical instrument being utilized, the costs (e.g., maintenance costs and replacement costs) associated with surgical devices, the type of staple cartridge being utilized in a surgical stapler, the power level or mode of an ultrasonic surgical instrument or an electrosurgical instrument, and the surgical device setup for the procedure (i.e., the preoperative assortment of surgical devices selected for the procedure). The processor 244 can monitor a single procedural variable or multiple procedural variables. In one aspect, the surgical hub 206 can monitor the status of every procedural variable that it has been programmed to determine. In another aspect, the surgical hub 206 can monitor one or more procedural variables that have been selected or programmed by users.
Accordingly, the processor 244 compares 210008 the determined procedural variable (or variables) to a corresponding baseline (or baselines). Further, the baseline can correspond to or otherwise depend upon the determined surgical context. In one aspect, the surgical hub 206 can retrieve the baseline corresponding to the procedural variable and the surgical context from its memory 249. In another aspect, the surgical hub 206 can retrieve the baseline from a cloud computing system, as is described under the heading CLOUD SYSTEM HARDWARE AND FUNCTIONAL MODULES, that is communicably connected to the surgical hub 206. In one aspect, the cloud computing system can aggregate data across all of the surgical hubs 206 connected thereto and calculate corresponding baselines for the procedural variables of various types of surgical procedures. The baselines for different procedural variables can be determined from the data aggregated from individual surgical hubs 206 or networks of surgical hubs 206 by averaging the data (e.g., the average length of time to complete a step of a surgical procedure), determining the most common instance of the procedural variable (e.g., the most common type of surgical instrument utilized for a surgical procedure or a step thereof), determining which instance of the procedural variable is most correlated with positive procedural outcomes (e.g., the force to fire a surgical stapling and cutting instrument that is associated with the least amount of bleeding for a given tissue type), and so on. Further, the baselines can be aggregated according to surgical metadata (e.g., surgical contextual data determined via situational awareness or patient data from an electronic medical record (EMR)), such as tissue thickness for firings of a surgical instrument and comorbidities suffered by the patient at the time of the surgical procedure, in order to ensure that the determined procedural variables are compared to relevant baselines.
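The baseline calculations described above (averages, most common instances, and outcome correlations, grouped by surgical context) can be sketched as follows; the aggregated records are illustrative assumptions:

```python
from collections import Counter, defaultdict
from statistics import mean

# Assumed aggregated records: (surgical context, step duration in minutes,
# instrument used, True if the procedural outcome was positive).
records = [
    ("lobectomy:vessel_transection", 14.0, "curved-tip vascular stapler", True),
    ("lobectomy:vessel_transection", 22.0, "straight-tip vascular stapler", False),
    ("lobectomy:vessel_transection", 15.5, "curved-tip vascular stapler", True),
    ("lobectomy:vessel_transection", 19.0, "straight-tip vascular stapler", True),
]

def baselines_by_context(rows):
    """Group aggregated data by surgical context and compute example baselines:
    mean step duration, most common instrument, and the instrument most
    correlated with positive outcomes."""
    grouped = defaultdict(list)
    for context, duration, instrument, positive in rows:
        grouped[context].append((duration, instrument, positive))
    baselines = {}
    for context, items in grouped.items():
        durations = [d for d, _, _ in items]
        instruments = Counter(i for _, i, _ in items)
        positive_rate = {
            inst: mean(1.0 if p else 0.0 for _, i, p in items if i == inst)
            for inst in instruments
        }
        baselines[context] = {
            "mean_step_duration_min": round(mean(durations), 1),
            "most_common_instrument": instruments.most_common(1)[0][0],
            "instrument_best_outcomes": max(positive_rate, key=positive_rate.get),
        }
    return baselines

print(baselines_by_context(records))
```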
Based on the results of the comparison between the determined procedural variable and its corresponding baseline, the surgical hub 206 can take various actions in response. In one aspect, the processor 244 provides 210010 a notification or recommendation according to whether the determined procedural variable deviates from its corresponding baseline. The recommendation can vary depending upon the particular type of procedural variable that is being compared. The recommendation can be for the user to utilize a different surgical instrument (e.g., an instrument with smaller jaws, a larger maximum articulation angle, or a longer shaft), utilize a different trocar or other access point, change the patient's position on the surgical table (e.g., roll the patient), and so on.
In one aspect, the recommendations provided 210010 by the process 210000 illustrated in
In the first image 210100a, the surgeon is raising a vessel for transection during a lobectomy procedure. In a lobectomy procedure, the pulmonary vessels that supply blood to the lobe of the lung that is the subject of the procedure must be dissected out and transected. These pulmonary vessels are typically fragile and have a very high volume of blood flowing through them. Therefore, the dissection is delicate and a mistake can be fatal for the patient. Further, the orientation of the pulmonary vessels is not always predictable, which makes trocar placement difficult to optimize. If the orientation of the pulmonary vessels is poor with respect to the location of the trocars, the surgical procedure may be awkward or especially challenging for the surgeon. Therefore, it would be highly beneficial for a computer system, such as the surgical hub 206, to recognize when the surgeon performing the particular procedure is having difficulty and provide intraoperative recommendations to assist the surgeon.
In the second image 210100b, the surgeon is attempting to transect the vessel with a straight-tipped vascular stapler 210106. As discussed above in connection with
In one aspect, the surgical hub 206 is programmed to determine the recommendation based on the surgical context and the given procedural variables and then access the inventory database of the medical facility to determine whether the recommended alternative is available to the medical facility. If the recommended alternative is not available, the surgical hub 206 can be programmed to not make the recommendation intraoperatively or otherwise record that the alternative surgical device would have been recommended had it been available. If the recommended alternative is available, the surgical hub 206 can be programmed to provide the intraoperative recommendation, as discussed above. In one aspect, the recommendation can be provided as an icon 210104 or graphical overlay on the displayed live video feed. In other aspects, the recommendation can be provided on a separate display, via audio through a speaker, and so on. After receiving the recommendation, the surgeon switches to the recommended curved-tip vascular stapler 210108, as indicated by the third image 210100c, and then completes the given step of the surgical procedure, as indicated by the fourth image 210100d of the video feed.
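The inventory gate described above can be sketched as a simple availability check; the inventory contents and device names are illustrative assumptions:

```python
from typing import Optional

# Assumed inventory lookup standing in for the medical facility's inventory database.
INVENTORY = {"curved-tip vascular stapler": 3, "straight-tip vascular stapler": 12}

deferred_recommendations: list[str] = []

def recommend_if_available(recommended_device: str) -> Optional[str]:
    """Provide an intraoperative recommendation only when the alternative device is
    in stock; otherwise record that it would have been recommended had it been available."""
    if INVENTORY.get(recommended_device, 0) > 0:
        return f"recommend switching to {recommended_device}"
    deferred_recommendations.append(
        f"{recommended_device} would have been recommended but is not in inventory")
    return None

print(recommend_if_available("curved-tip vascular stapler"))
print(recommend_if_available("articulating grasper, long shaft"))
print(deferred_recommendations)
```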
In another prophetic implementation of the process 210000 where recommendations are provided intraoperatively, the surgical hub 206 could be programmed to determine when the surgical staff is preparing to place the trocars for a laparoscopic procedure (e.g., through situational awareness) or if the surgical staff is having difficulties in performing a procedure due to poor trocar placement. Accordingly, the surgical hub 206 can then recommend placement locations for the trocars and/or specific types of trocars to use on a display coupled to the surgical hub 206. The recommended placement locations and trocar types can be selected to maximize accessibility to target tissue for the given surgical procedure. The recommendations provided by this implementation of the process 210000 can be based on, for example, preoperative imaging of the target tissue (which can indicate whether a target tissue, such as a lymph node, will be difficult to access), an inference as to the location of the target tissue from outside the body by aligning the preoperative imaging data with the patient, intraoperative imaging of the target tissue (e.g., image data captured via an endoscope 239), imaging of the operating room (OR) via a camera assembly (which can include cameras positioned around the OR or cameras worn by the surgical staff, for example), available surgical device types (e.g., whether particular surgical devices are already opened on the prep table or what surgical devices are available in the medical facility), and whether the surgical hub 206 has determined that the surgical staff has been having difficulty with an ongoing procedure (e.g., the duration of time spent on a procedural step or the number of instrument exchanges for the procedural step is deviating from a baseline). Imaging the OR via a camera assembly can be beneficial to, for example, determine the positioning of the patient on the OR table (e.g., whether the patient is in the Trendelenburg position, supine, or in the lithotomy position), where trocars are already positioned in the patient, potential positions where the surgeons and/or assistants could be located when performing the surgical procedure, and locations of potential obstructions within the OR (which could affect optimal trocar positioning). In addition to suggesting particular trocar positions and trocar types, the surgical hub 206 can also be programmed to provide other types of recommendations based on these procedural variables, such as an alternative surgical device (e.g., a different grasper with smaller jaws, a surgical instrument having a larger maximum articulation angle, or a surgical instrument having a longer shaft), shifting the position of a surgical instrument with respect to the currently placed trocars (e.g., moving a surgical instrument from one trocar to another trocar), shifting the position of the patient on the surgical table (e.g., increase the angle of the table or roll the patient), and so on.
In another aspect, the recommendations provided 210010 by the process 210000 illustrated in
In one aspect, the graphical user interface 210200 can be configured to overlay a recommended alternative surgical device over the surgical device shown in the video feed to demonstrate to the surgeon how the step of the procedure could have proceeded differently with the alternative surgical device. Further, the graphical user interface 210200 can combine data from recorded video feeds of multiple surgical procedures to show the surgeon his or her movement or technique patterns, where the surgeon differs from peers, where the surgeon can change his or her patterns to optimize outcomes relative to peers, and/or when and where particular surgical device types, techniques, or positions are strongly correlated with outcomes for the given procedure type or step thereof.
In another aspect, the recommendations provided 210010 by the process 210000 illustrated in
The historical data underlying the recommendations provided by the surgical hub 206 can be provided in a number of different graphical formats, including as graphs, charts, raw data, and so on. In the illustrated implementation, the graphical user interface 210300 is displaying a recommendation for which particular type of surgical instrument should be utilized during a particular surgical procedure and the historical data on which the recommendation is based. The graphical user interface 210300 can display a graph 210302 including a vertical axis 210304 indicating the number of instances various types of surgical instruments have been utilized for the surgical procedure and a horizontal axis 210306 indicating the surgical instrument types. Further, the uses for each surgical instrument type can be subdivided by the number of positive and negative procedural outcomes. This allows users to visualize whether each surgical instrument type is correlated with positive or negative outcomes, in addition to visualizing the total number of times that the instrument was utilized in a surgical procedure. Accordingly, the recommendation, which can be indicated by an icon 210308 within the graphical user interface 210300, can correspond to the surgical instrument that has been utilized the most times during surgical procedures, is most correlated with positive procedural outcomes, is least correlated with negative procedural outcomes, and so on.
In one aspect, the historical data illustrated in the graphical user interface 210300 in
In some cases, the recommendations that the surgical hub 206 is programmed to provide can be predetermined or set by administrators of the computer network to which the surgical hub 206 is connected, rather than being determined by the computer system itself from the aggregated data. For example, Daniel L. Miller et al., Impact of Powered and Tissue-Specific Endoscopic Stapling Technology on Clinical and Economic Outcomes of Video Assisted Thoracic Surgery Lobectomy Procedures: A Retrospective, Observational Study, Advances in Therapy, May 2018, 35(5), p. 707-23, demonstrates that a tissue-specific stapler is associated with better patient outcomes and lower hospital costs. Accordingly, a network administrator could program or set a rule that causes any surgical hubs 206 to provide recommendations in accordance with this research. Such predetermined recommendations can be based on external research, such as white papers. In one aspect, the surgical hub 206 can be programmed to provide access to the external research on which the particular recommendations are based via, for example, a link or widget supplied by the graphical user interface providing the intraoperative or postoperative recommendations.
In one aspect, a computer system can be configured to collect, analyze, and compare published external research and other data sets against outcomes in the medical facility or the network of surgical hubs 206. The computer system can, in some aspects, mimic the analytical procedure performed by the particular piece of research in order to confirm the research against local data. If the research is confirmed, then the computer system can provide recommendations corresponding to the research. For example, the Miller et al. paper referenced above shows that particular instrument types (e.g., powered staplers) are associated with fewer hemostasis-related complications and lower procedure costs than other instrument types (e.g., manual staplers), and that the effect size is larger in patients with chronic obstructive pulmonary disease (COPD). Accordingly, when this research is confirmed, the computer system can automatically implement the corresponding recommendations dictated by the research throughout the network of surgical hubs 206.
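By way of illustration only, the confirmation step described above could be approximated by re-running a simplified version of the published comparison against locally aggregated outcomes, as in the hypothetical Python sketch below; the record fields, the 2% effect-size margin, and the sample data are assumptions.

# Hypothetical re-analysis of a published comparison (powered vs. manual staplers,
# hemostasis-related complications) against locally aggregated outcomes before the
# corresponding recommendation is enabled. Record fields, the effect-size margin,
# and the sample data are illustrative assumptions.
def complication_rate(records, stapler_type, copd_only=False):
    group = [r for r in records
             if r["stapler_type"] == stapler_type and (r["copd"] or not copd_only)]
    if not group:
        return float("nan")
    return sum(r["hemostasis_complication"] for r in group) / len(group)

def research_confirmed(records, min_effect=0.02):
    """Confirm the finding locally if powered staplers show a complication rate at
    least min_effect lower than manual staplers (an assumed margin)."""
    powered = complication_rate(records, "powered")
    manual = complication_rate(records, "manual")
    return (manual - powered) >= min_effect

local_records = [
    {"stapler_type": "powered", "copd": True, "hemostasis_complication": 0},
    {"stapler_type": "powered", "copd": False, "hemostasis_complication": 0},
    {"stapler_type": "manual", "copd": True, "hemostasis_complication": 1},
    {"stapler_type": "manual", "copd": False, "hemostasis_complication": 0},
]
if research_confirmed(local_records):
    print("Local data consistent with the published finding; recommendation enabled.")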
In one aspect, the surgical hub 206 can be programmed to highlight the specific feature of an alternative product that makes the alternative product superior. Returning to the example discussed in connection with
As discussed above with respect to
In various aspects, a computer system executing the process 210000 illustrated in
In one aspect, the computer system executing the process 210000 can be configured to determine or quantify the effects of inauthentic surgical devices. The computer system can detect whether a surgical device has been reprocessed in an authorized manner in a variety of different ways, including whether a usage counter (e.g., stored in the memory of the surgical device) exceeds a limit, which indicates that the surgical device is being used beyond its intended lifespan. The computer system can detect whether a surgical device is a knockoff in a variety of different ways, including whether the surgical device is able to transmit a security key properly identifying the surgical device to the computer system. Accordingly, when the surgical hub 206 detects an unauthorized surgical device in use during a surgical procedure, the surgical hub 206 can record the functions of the unauthorized surgical device and the corresponding outcomes. The functions and outcomes of using unauthorized surgical devices can then be compared with those resulting from authorized surgical devices and presented to users as reports or as evidence supporting a recommendation to utilize an authorized surgical device. For example, variances between the performance of unauthorized surgical devices and authorized surgical devices could be highlighted in a regularly generated (e.g., compiled weekly) report on the medical facility. As another example, the computer system could track the number of times that a surgical device has been resterilized and identify the number of resterilizations after which the performance of the surgical device begins to degrade. As another example, the computer system could show which steps or operations of the procedure were adversely affected by the unauthorized surgical device. As yet another example, the computer system could identify the number of damaged or replaced products in the OR resulting from the unauthorized products. In one aspect, if a surgical device is reprocessed through authorized reprocessing channels, the functions and outcomes of the surgical device could be monitored and highlighted in a regularly generated (e.g., compiled weekly) report on the medical facility.
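By way of illustration only, the two detection checks described above could be implemented as in the hypothetical Python sketch below; the device fields, the shared-secret key scheme, and the usage limit are assumptions.

# Hypothetical implementation of the two checks described above: a usage counter
# read from device memory compared against a lifespan limit, and a security-key
# check for knockoff detection. The device fields and the shared-secret key scheme
# are illustrative assumptions.
import hashlib
import hmac

FACILITY_SECRET = b"example-shared-secret"  # placeholder; real systems would manage keys securely

def expected_key(serial_number):
    return hmac.new(FACILITY_SECRET, serial_number.encode(), hashlib.sha256).hexdigest()

def is_authorized(device):
    """Flag a device as unauthorized if it is past its use limit or its key fails."""
    within_lifespan = device["usage_counter"] <= device["usage_limit"]
    key_valid = hmac.compare_digest(device.get("security_key", ""),
                                    expected_key(device["serial_number"]))
    return within_lifespan and key_valid

device = {"serial_number": "SN-0001", "usage_counter": 12, "usage_limit": 10,
          "security_key": expected_key("SN-0001")}
if not is_authorized(device):
    print("Unauthorized device detected; recording its functions and outcomes for reporting.")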
In one aspect, the functions and outcomes associated with an individual surgical device can be compared against itself, rather than a baseline defined for the surgical device type, to determine whether there is any degradation in the performance of the surgical device over time. Such analyses can be useful in order to determine when a surgical device should be replaced or undergo maintenance, for example. Reports on functions and outcomes associated with individual surgical devices could be highlighted in a regularly generated (e.g., compiled weekly) report on the medical facility, for example.
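By way of illustration only, such a self-referential degradation check could be implemented as in the hypothetical Python sketch below; the performance metric, window size, and 10% threshold are assumptions.

# Hypothetical self-referential degradation check: compare a device's recent
# performance metric against its own earlier history rather than a device-type
# baseline. The metric, window size, and 10% threshold are illustrative assumptions.
from statistics import mean

def degradation_detected(metric_history, recent_window=5, threshold=0.10):
    """Return True if the mean of the most recent uses has fallen more than
    threshold below the mean of the device's earlier uses."""
    if len(metric_history) <= recent_window:
        return False
    baseline = mean(metric_history[:-recent_window])
    recent = mean(metric_history[-recent_window:])
    return baseline > 0 and (baseline - recent) / baseline > threshold

# e.g., a firing-efficiency metric recorded after each use of a single stapler
history = [0.98, 0.97, 0.99, 0.96, 0.97, 0.95, 0.88, 0.86, 0.85, 0.84, 0.83]
if degradation_detected(history):
    print("Performance degradation detected; recommend maintenance or replacement.")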
In one aspect, the computer system can be configured to compare different brands of products and provide recommendations accordingly. For example, the computer system could show when another brand's product delivers the same or better performance at a lower cost than the brand of a given product utilized during a surgical procedure or that is to be used during a surgical procedure.
Various aspects of the subject matter described herein are set out in the following numbered examples:
Example 1: A computer-implemented method for collecting data within a facility. The method comprises: receiving, by a computer system, perioperative data from a plurality of surgical devices located within the facility, the perioperative data associated with a plurality of surgical procedures performed in the facility; determining, by the computer system, procedural context data associated with the plurality of surgical procedures based at least in part on the perioperative data; aggregating, by the computer system, the perioperative data according to the procedural context data; and determining, by the computer system, trends associated with the surgical procedures performed in the facility according to the perioperative data and the procedural context data.
Example 2: The computer-implemented method of Example 1, wherein the computer system comprises a plurality of surgical hubs located within the facility.
Example 3: The computer-implemented method of Example 2, wherein the computer system further comprises a cloud analytics system communicatively coupled to the plurality of surgical hubs.
Example 4: The computer-implemented method of Example 3, further comprising: determining, by the cloud analytics system, recommendations for the surgical procedures based on the trends associated with the surgical procedures; transmitting, by the cloud analytics system, the recommendations to the plurality of surgical hubs according to the trends associated with the surgical procedures; and providing, by the computer system, one or more of the recommendations to users during a surgical procedure of a type to which the one or more of the recommendations correspond.
Example 5: The computer-implemented method of any one of Examples 1-4, further comprising: determining, by the computer system, whether the trends associated with the surgical procedures correspond to positive or negative procedural outcomes; and determining, by the computer system, recommendations for the surgical procedures based on whether the trends correspond to positive or negative procedural outcomes.
Example 6: The computer-implemented method of any one of Examples 1-5, wherein the procedural context data comprises at least one of types of the surgical procedures, steps of the surgical procedures, tissue types being operated on, body cavities being operated on, orientations of the surgical devices, or combinations thereof.
Example 7: A computer-implemented method for collecting data within a facility. The method comprises: receiving, by a computer system, perioperative data from a plurality of surgical devices located within the facility, the perioperative data associated with a plurality of surgical procedures performed in the facility; receiving, by the computer system, images of the facility and any staff members or surgical devices located therein from a plurality of cameras located within the facility; determining, by the computer system, procedural context data associated with the plurality of surgical procedures based at least in part on the perioperative data and the images; aggregating, by the computer system, the perioperative data according to the procedural context data; and determining, by the computer system, trends associated with the surgical procedures performed in the facility according to the perioperative data and the procedural context data.
Example 8: The computer-implemented method of Example 7, wherein the computer system comprises a plurality of surgical hubs located within the facility.
Example 9: The computer-implemented method of Example 8, wherein the computer system further comprises a cloud analytics system communicatively coupled to the plurality of surgical hubs.
Example 10: The computer-implemented method of Example 9, further comprising: determining, by the cloud analytics system, recommendations for the surgical procedures based on the trends associated with the surgical procedures; transmitting, by the cloud analytics system, the recommendations to the plurality of surgical hubs according to the trends associated with the surgical procedures; and providing, by the computer system, one or more of the recommendations to users during a surgical procedure of a type to which the one or more of the recommendations correspond.
Example 11: The computer-implemented method of any one of Examples 7-10, further comprising: determining, by the computer system, whether the trends associated with the surgical procedures correspond to positive or negative procedural outcomes; and determining, by the computer system, recommendations for the surgical procedures based on whether the trends correspond to positive or negative procedural outcomes.
Example 12: The computer-implemented method of any one of Examples 7-11, wherein the procedural context data comprises at least one of types of the surgical procedures, steps of the surgical procedures, tissue types being operated on, body cavities being operated on, orientations of the surgical devices, or combinations thereof.
Example 13: A computer-implemented method for collecting data within a facility. The method comprises: receiving, by a computer system, perioperative data from a plurality of surgical devices located within the facility, the perioperative data associated with a plurality of surgical procedures performed in the facility; receiving, by the computer system, images of the facility and any staff members or surgical devices located therein from a plurality of cameras located within the facility; receiving, by the computer system, patient data from a patient database; receiving, by the computer system, physiological data from a plurality of patient monitors; determining, by the computer system, procedural context data associated with the plurality of surgical procedures based at least in part on the perioperative data, the images, the patient data, and the physiological data; aggregating, by the computer system, the perioperative data according to the procedural context data; and determining, by the computer system, trends associated with the surgical procedures performed in the facility according to the perioperative data and the procedural context data.
Example 14: The computer-implemented method of Example 13, wherein the computer system comprises a plurality of surgical hubs located within the facility.
Example 15: The computer-implemented method of Example 14, wherein the computer system further comprises a cloud analytics system communicatively coupled to the plurality of surgical hubs.
Example 16: The computer-implemented method of Example 15, further comprising: determining, by the cloud analytics system, recommendations for the surgical procedures based on the trends associated with the surgical procedures; transmitting, by the cloud analytics system, the recommendations to the plurality of surgical hubs according to the trends associated with the surgical procedures; and providing, by the computer system, one or more of the recommendations to users during a surgical procedure of a type to which the one or more of the recommendations correspond.
Example 17: The computer-implemented method of any one of Examples 13-16, further comprising: determining, by the computer system, whether the trends associated with the surgical procedures correspond to positive or negative procedural outcomes; and determining, by the computer system, recommendations for the surgical procedures based on whether the trends correspond to positive or negative procedural outcomes.
Example 18: The computer-implemented method of any one of Examples 13-17, wherein the procedural context data comprises at least one of types of the surgical procedures, steps of the surgical procedures, tissue types being operated on, body cavities being operated on, orientations of the surgical devices, or combinations thereof.
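By way of illustration only, a minimal, non-limiting Python sketch of the aggregation and trend-determination steps recited in the preceding examples is provided below; the record fields, the grouping keys, and the trend statistic are assumptions.

# Hypothetical aggregation of perioperative records by procedural context (here,
# procedure type and step) and computation of a simple per-context trend (mean step
# duration). The record fields and the trend statistic are illustrative assumptions.
from collections import defaultdict
from statistics import mean

def aggregate_by_context(records):
    groups = defaultdict(list)
    for record in records:
        context = (record["procedure_type"], record["procedure_step"])
        groups[context].append(record)
    return dict(groups)

def trends(groups):
    return {context: {"mean_step_duration_s": mean(r["duration_s"] for r in recs),
                      "count": len(recs)}
            for context, recs in groups.items()}

perioperative_data = [
    {"procedure_type": "sigmoidectomy", "procedure_step": "vessel transection", "duration_s": 310},
    {"procedure_type": "sigmoidectomy", "procedure_step": "vessel transection", "duration_s": 290},
    {"procedure_type": "lobectomy", "procedure_step": "vessel transection", "duration_s": 410},
]
print(trends(aggregate_by_context(perioperative_data)))

In practice, the grouping key could be extended with tissue type, body cavity, or device orientation, consistent with the procedural context data recited above.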
Various aspects of the subject matter described herein are set out in the following numbered examples:
Example 1: A computer system configured to be communicably coupled to a plurality of surgical devices. The computer system comprises a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: determine which of the plurality of surgical devices are utilized during a surgical procedure based at least in part on perioperative data received from the one or more of the plurality of surgical devices; determine whether each of the plurality of surgical devices utilized during the surgical procedure is a reusable surgical device or a non-reusable surgical device; determine a maintenance cost for each reusable surgical device; determine a replacement cost for each non-reusable surgical device; and determine a total cost of the plurality of surgical devices for the surgical procedure according to the maintenance cost for each reusable surgical device and the replacement cost for each non-reusable surgical device.
Example 2: The computer system of Example 1, wherein the maintenance cost comprises at least one of a cleaning cost, a resterilization cost, a repair cost, or any combination thereof.
Example 3: The computer system of Example 1 or 2, wherein the memory further stores instructions that, when executed by the processor, cause the computer system to: determine whether the maintenance cost exceeds the replacement cost for each reusable surgical device; and provide a replacement recommendation for each reusable surgical device where the maintenance cost exceeds the replacement cost.
Example 4: The computer system of any one of Examples 1-3, wherein the memory further stores instructions that, when executed by the processor, cause the computer system to: determine a number of uses for each reusable surgical device; and provide a replacement recommendation for each reusable surgical device where the number of uses exceeds a threshold.
Example 5: The computer system of any one of Examples 1-4, wherein the memory further stores instructions that, when executed by the processor, cause the computer system to: retrieve metadata associated with each reusable surgical device, the metadata storing at least one of locations of the reusable surgical device, lengths of time for the locations, a number of uses of the reusable surgical device, or any combination thereof; and determine the maintenance cost for each reusable surgical device according to the metadata.
Example 6: The computer system of any one of Examples 1-5, wherein the memory further stores instructions that, when executed by the processor, cause the computer system to retrieve a purchase price associated with each non-reusable surgical device from a purchasing database, wherein the replacement cost corresponds to the purchase price.
Example 7: The computer system of any one of Examples 1-6, wherein the computer system comprises a surgical hub.
Example 8: A computer system comprising a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: identify one or more surgical devices utilized during a surgical procedure according to perioperative data received from the one or more surgical devices; and determine a total cost of the one or more surgical devices for the surgical procedure according to a maintenance cost or a replacement cost associated with each of the one or more surgical devices.
Example 9: The computer system of Example 8, wherein the maintenance cost comprises at least one of a cleaning cost, a resterilization cost, a repair cost, or any combination thereof.
Example 10: The computer system of Example 8 or 9, wherein the memory further stores instructions that, when executed by the processor, cause the computer system to: determine whether the maintenance cost exceeds the replacement cost for each reusable surgical device; and provide a replacement recommendation for each reusable surgical device where the maintenance cost exceeds the replacement cost.
Example 11: The computer system of any one of Examples 8-10, wherein the memory further stores instructions that, when executed by the processor, cause the computer system to: determine a number of uses for each reusable surgical device; and provide a replacement recommendation for each reusable surgical device where the number of uses exceeds a threshold.
Example 12: The computer system of any one of Examples 8-11, wherein the memory further stores instructions that, when executed by the processor, cause the computer system to: retrieve metadata associated with each reusable surgical device, the metadata storing at least one of locations of the reusable surgical device, lengths of time for the locations, a number of uses of the reusable surgical device, or any combination thereof; and determine the maintenance cost for each reusable surgical device according to the metadata.
Example 13: The computer system of any one of Examples 8-12, wherein the memory further stores instructions that, when executed by the processor, cause the computer system to retrieve a purchase price associated with each non-reusable surgical device from a purchasing database, wherein the replacement cost corresponds to the purchase price.
Example 14: The computer system of any one of Examples 8-13, wherein the computer system comprises a surgical hub.
Example 15: A computer-implemented method for determining a surgical device cost for a surgical procedure. The method comprises: determining, by a computer system, which of a plurality of surgical devices are utilized during the surgical procedure based at least in part on perioperative data received from one or more of the plurality of surgical devices; determining, by the computer system, whether each of the plurality of surgical devices utilized during the surgical procedure is a reusable surgical device or a non-reusable surgical device; determining, by the computer system, a maintenance cost for each reusable surgical device; determining, by the computer system, a replacement cost for each non-reusable surgical device; and determining, by the computer system, a total cost of the plurality of surgical devices for the surgical procedure according to the maintenance cost for each reusable surgical device and the replacement cost for each non-reusable surgical device.
Example 16: The computer-implemented method of Example 15, wherein the maintenance cost comprises at least one of a cleaning cost, a resterilization cost, a repair cost, or any combination thereof.
Example 17: The computer-implemented method of Example 15 or 16, further comprising: determining, by the computer system, whether the maintenance cost exceeds the replacement cost for each reusable surgical device; and providing, by the computer system, a replacement recommendation for each reusable surgical device where the maintenance cost exceeds the replacement cost.
Example 18: The computer-implemented method of any one of Examples 15-17, further comprising: determining, by the computer system, a number of uses for each reusable surgical device; and providing, by the computer system, a replacement recommendation for each reusable surgical device where the number of uses exceeds a threshold.
Example 19: The computer-implemented method of any one of Examples 15-18, further comprising: retrieving, by the computer system, metadata associated with each reusable surgical device, the metadata storing at least one of locations of the reusable surgical device, lengths of time for the locations, a number of uses of the reusable surgical device, or any combination thereof; and determining, by the computer system, the maintenance cost for each reusable surgical device according to the metadata.
Example 20: The computer-implemented method of any one of Examples 15-19, further comprising retrieving, by the computer system, a purchase price associated with each non-reusable surgical device from a purchasing database, wherein the replacement cost corresponds to the purchase price.
Example 21: The computer-implemented method of any one of Examples 15-20, wherein the computer system comprises a surgical hub.
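By way of illustration only, a minimal, non-limiting Python sketch of the device-cost accounting recited in the preceding examples is provided below; the cost fields, device records, and the assumed 50-use threshold are hypothetical.

# Hypothetical per-procedure device-cost accounting: reusable devices contribute a
# maintenance cost, non-reusable devices a replacement cost. Cost fields, device
# records, and the 50-use threshold are illustrative assumptions.
def device_cost(device):
    if device["reusable"]:
        return (device.get("cleaning_cost", 0.0)
                + device.get("resterilization_cost", 0.0)
                + device.get("repair_cost", 0.0))
    return device["replacement_cost"]

def total_procedure_cost(devices_used):
    return sum(device_cost(d) for d in devices_used)

def replacement_recommended(device, use_limit=50):
    """Recommend replacement when maintenance cost exceeds replacement cost or the
    number of uses exceeds an assumed threshold."""
    if not device["reusable"]:
        return False
    return (device_cost(device) > device.get("replacement_cost", float("inf"))
            or device.get("number_of_uses", 0) > use_limit)

devices = [
    {"name": "ultrasonic generator", "reusable": True, "resterilization_cost": 40.0,
     "replacement_cost": 25000.0, "number_of_uses": 212},
    {"name": "stapler reload", "reusable": False, "replacement_cost": 150.0},
]
print(total_procedure_cost(devices), [replacement_recommended(d) for d in devices])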
Various additional aspects of the subject matter described herein are set out in the following numbered examples:
Example 1: A computer system configured to be communicably coupled to a surgical device and a camera. The computer system comprises a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive perioperative data from the surgical device; determine a surgical context based at least in part on the perioperative data; receive an image of an individual via the camera; determine a physical characteristic of the individual from the image; retrieve a baseline physical characteristic corresponding to the surgical context; and determine whether the physical characteristic of the individual deviates from the baseline physical characteristic.
Example 2: The computer system of Example 1, wherein the physical characteristic comprises a posture of the individual.
Example 3: The computer system of Example 2, wherein the posture of the individual corresponds to a deviation between at least one body part position and a reference position.
Example 4: The computer system of Example 1, wherein the physical characteristic comprises a wrist orientation of the individual.
Example 5: The computer system of Example 4, wherein the wrist orientation of the individual corresponds to an angle between a wrist of the individual and a surgical instrument held by the individual.
Example 6: The computer system of any one of Examples 1-5, wherein the baseline physical characteristic comprises a previously recorded instance of the physical characteristic for the individual.
Example 7: The computer system of any one of Examples 1-6, wherein the memory further stores instructions that, when executed by the processor, cause the computer system to provide a notification according to whether the physical characteristic deviates from the baseline physical characteristic.
Example 8: The computer system of Example 7, wherein the computer system provides the notification during a surgical procedure in which the perioperative data is received.
Example 9: A computer-implemented method for tracking a physical characteristic of an individual. The method comprises: receiving, by a computer system, perioperative data from a surgical device; determining, by the computer system, a surgical context based at least in part on the perioperative data; receiving, by the computer system, an image of the individual via a camera communicably coupled to the computer system; determining, by the computer system, a physical characteristic of the individual from the image; retrieving, by the computer system, a baseline physical characteristic corresponding to the surgical context; and determining, by the computer system, whether the physical characteristic of the individual deviates from the baseline physical characteristic.
Example 10: The computer-implemented method of Example 9, wherein the physical characteristic comprises a posture of the individual.
Example 11: The computer-implemented method of Example 10, wherein the posture of the individual corresponds to a deviation between at least one body part position and a reference position.
Example 12: The computer-implemented method of Example 9, wherein the physical characteristic comprises a wrist orientation of the individual.
Example 13: The computer-implemented method of Example 12, wherein the wrist orientation of the individual corresponds to an angle between a wrist of the individual and a surgical instrument held by the individual.
Example 14: The computer-implemented method of any one of Examples 9-13, wherein the baseline physical characteristic comprises a previously recorded instance of the physical characteristic for the individual.
Example 15: The computer-implemented method of any one of Examples 9-14, further comprising providing, by the computer system, a notification on a display according to whether the physical characteristic deviates from the baseline physical characteristic.
Example 16: A computer system configured to be communicably coupled to a surgical device and a camera. The computer system comprises a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive perioperative data from the surgical device; determine a surgical context based at least in part on the perioperative data; receive an image of an individual via the camera; determine a physical characteristic of the individual from the image; transmit data identifying the physical characteristic and the surgical context to a remote computer system; wherein the remote computer system determines a baseline physical characteristic corresponding to the surgical context and the physical characteristic according to data aggregated from a plurality of computer systems connected to the remote computer system; and receive, from the remote computer system, an indication of whether the physical characteristic of the individual deviates from the baseline physical characteristic.
Example 17: The computer system of Example 16, wherein the remote computer system comprises a cloud computing system.
Example 18: The computer system of Example 16 or 17, wherein the physical characteristic comprises a posture of the individual.
Example 19: The computer system of Example 18, wherein the posture of the individual corresponds to a deviation between at least one body part position and a reference position.
Example 20: The computer system of Example 16 or 17, wherein the physical characteristic comprises a wrist orientation of the individual.
Example 21: The computer system of Example 20, wherein the wrist orientation of the individual corresponds to an angle between a wrist of the individual and a surgical instrument held by the individual.
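By way of illustration only, a minimal, non-limiting Python sketch of the baseline comparison recited in the preceding examples is provided below, using wrist orientation as the physical characteristic; the baseline values, tolerance, and the stubbed pose-estimator output are assumptions.

# Hypothetical baseline comparison for a physical characteristic, here the angle
# between the clinician's wrist and the held instrument. The baseline values, the
# tolerance, and the stubbed pose-estimator output are illustrative assumptions.
def deviates_from_baseline(measured_angle_deg, baseline_angle_deg, tolerance_deg=15.0):
    """Return True if the measured wrist-to-instrument angle deviates from the
    context-specific baseline by more than an assumed tolerance."""
    return abs(measured_angle_deg - baseline_angle_deg) > tolerance_deg

# Baselines keyed by surgical context, e.g. (procedure type, procedure step)
BASELINE_WRIST_ANGLE_DEG = {("lobectomy", "vessel transection"): 20.0}

context = ("lobectomy", "vessel transection")
measured = 42.0  # assumed output of an image-based pose estimator
if deviates_from_baseline(measured, BASELINE_WRIST_ANGLE_DEG[context]):
    print("Wrist orientation deviates from baseline; providing a notification.")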
Various additional aspects of the subject matter described herein are set out in the following numbered examples:
Example 1: A computer system configured to be communicably coupled to a surgical device and a camera configured to view an operating room. The computer system comprises a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive an image of an individual within the operating room via the camera; determine whether the individual is making a gesture based on the image; and control the surgical device according to the gesture.
Example 2: The computer system of Example 1, wherein the surgical device comprises a display and the instructions stored in the memory, when executed by the processor, cause the computer system to control information displayed on the display according to the gesture.
Example 3: The computer system of Example 2, wherein the information displayed on the display corresponds to a surgical instrument controlled by the individual.
Example 4: The computer system of Example 1, wherein the surgical device comprises a surgical instrument and the instructions stored in the memory, when executed by the processor, cause the computer system to change an operation of the surgical instrument according to the gesture.
Example 5: The computer system of Example 4, wherein the surgical instrument is selected from the group consisting of an electrosurgical instrument, an ultrasonic surgical instrument, and a surgical stapling instrument.
Example 6: The computer system of any one of Examples 1-5, wherein the instructions stored in the memory, when executed by the processor, cause the computer system to extract features from the image received from the camera and determine whether the individual is making the gesture according to whether the extracted features correspond to the gesture.
Example 7: A computer system configured to be communicably coupled to a surgical device and a camera configured to view an operating room. The computer system comprises a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive an image of the surgical device within the operating room via the camera; determine a pose of the surgical device based on the image; and control the surgical device according to the pose of the surgical device.
Example 8: The computer system of Example 7, wherein the instructions stored in the memory, when executed by the processor, cause the computer system to change an operation of the surgical device according to the pose.
Example 9: The computer system of Example 8, wherein the surgical device comprises an end effector and the operation comprises an orientation of the end effector.
Example 10: The computer system of Example 8, wherein the surgical device comprises an end effector configured to staple or deliver energy to a tissue according to a control algorithm and the operation comprises the control algorithm.
Example 11: The computer system of Example 7, wherein the instructions stored in the memory, when executed by the processor, cause the computer system to cause the surgical device to display information corresponding to the pose.
Example 12: The computer system of Example 11, wherein the displayed information corresponds to a surgical context.
Example 13: The computer system of Example 12, wherein the instructions stored in the memory, when executed by the processor, cause the computer system to: receive perioperative data from one or more surgical devices, the one or more surgical devices comprising the surgical device; and determine the surgical context based at least in part on the perioperative data from the one or more surgical devices.
Example 14: The computer system of any one of Examples 7-13, wherein the instructions stored in the memory, when executed by the processor, cause the computer system to determine the pose of the surgical device according to a static reference frame associated with the operating room.
Example 15: A computer system configured to be communicably coupled to a surgical device and a camera configured to view an operating room. The computer system comprises a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive an image of a surgical device or an individual within the operating room via the camera; determine a pose of the surgical device based on the image according to whether the image is of the surgical device; determine whether the individual is making a gesture based on the image according to whether the image is of the individual; and control the surgical device according to at least one of the pose of the surgical device or the gesture.
Example 16: The computer system of Example 15, wherein the surgical device comprises a display and the instructions stored in the memory, when executed by the processor, cause the computer system to control information displayed on the display according to the gesture.
Example 17: The computer system of Example 15, wherein the surgical device comprises a surgical instrument and the instructions stored in the memory, when executed by the processor, cause the computer system to change an operation of the surgical instrument according to the gesture.
Example 18: The computer system of any one of Examples 15-17, wherein the instructions stored in the memory, when executed by the processor, cause the computer system to change an operation of the surgical device according to the pose.
Example 19: The computer system of any one of Examples 15-17, wherein the instructions stored in the memory, when executed by the processor, cause the computer system to cause the surgical device to display information corresponding to the pose.
Example 20: The computer system of any one of Examples 15-19, wherein the instructions stored in the memory, when executed by the processor, cause the computer system to determine the pose of the surgical device according to a static reference frame associated with the operating room.
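By way of illustration only, a minimal, non-limiting Python sketch of the gesture-dispatch step recited in the preceding examples is provided below; the gesture labels, the mapped device actions, and the stubbed classifier output are assumptions.

# Hypothetical gesture dispatch: a label produced by an image classifier is mapped
# onto a surgical-device control action. The classifier is stubbed; the gesture
# labels and mapped actions are illustrative assumptions.
from typing import Optional

def next_display_page():
    print("Advancing the information shown on the surgical device display.")

def reduce_energy_level():
    print("Reducing the energy level of the electrosurgical instrument.")

GESTURE_ACTIONS = {
    "swipe_left": next_display_page,
    "closed_fist": reduce_energy_level,
}

def handle_frame(extracted_gesture: Optional[str]):
    """Dispatch a recognized gesture, if any, to the corresponding device control."""
    action = GESTURE_ACTIONS.get(extracted_gesture or "")
    if action is not None:
        action()

handle_frame("swipe_left")  # e.g., classifier output for the current camera frame
handle_frame(None)          # no gesture recognized in this frame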
Various additional aspects of the subject matter described herein are set out in the following numbered examples:
Example 1: A computer system configured to be communicably coupled to a surgical device and a database system. The computer system comprises a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive perioperative data from the surgical device; determine a surgical context based at least in part on the perioperative data, the surgical context corresponding to surgical contextual data; transmit a first subset of surgical data to one or more databases of the database system for storage thereon, the surgical data comprising at least a portion of the perioperative data or the surgical contextual data; and define a relation between a second subset of the surgical data stored in the memory and one or more databases of the database system; wherein the first subset and the second subset of the surgical data correspond to the surgical context and an identity of each of the one or more databases.
Example 2: The computer system of Example 1, wherein the perioperative data comprises metadata associated with the surgical device.
Example 3: The computer system of Example 1 or 2, wherein the surgical contextual data is selected from the group consisting of a procedure type, a procedure step, and a combination thereof.
Example 4: The computer system of any one of Examples 1-3, wherein a property of the first subset of surgical data transmitted to the database corresponds to the surgical context.
Example 5: The computer system of Example 4, wherein the property is selected from the group consisting of a bit size, a quantity, a resolution, a time bracket, and any combination thereof.
Example 6: The computer system of any one of Examples 1-5, wherein the computer system transmits the first subset of the surgical data and defines the relation for the second subset of the surgical data without requiring action by a user.
Example 7: The computer system of any one of Examples 1-6, wherein the identity of each of the one or more databases corresponds to a department of a medical facility.
Example 8: A computer-implemented method for sharing data between a computer system and a database system, wherein the computer system is configured to be communicably coupled to a surgical device. The method comprises: receiving, by the computer system, perioperative data from the surgical device; determining, by the computer system, a surgical context based at least in part on the perioperative data, the surgical context corresponding to surgical contextual data; transmitting, by the computer system, a first subset of surgical data to one or more databases of the database system for storage thereon, the surgical data comprising at least a portion of the perioperative data or the surgical contextual data; and defining, by the computer system, a relation between a second subset of the surgical data stored in a memory of the computer system and one or more databases of the database system; wherein the first subset and the second subset of the surgical data correspond to the surgical context and an identity of each of the one or more databases.
Example 9: The computer-implemented method of Example 8, wherein the perioperative data comprises metadata associated with the surgical device.
Example 10: The computer-implemented method of Example 8 or 9, wherein the surgical contextual data is selected from the group consisting of a procedure type, a procedure step, and a combination thereof.
Example 11: The computer-implemented method of any one of Examples 8-10, wherein a property of the first subset of surgical data transmitted to the database corresponds to the surgical context.
Example 12: The computer-implemented method of Example 11, wherein the property is selected from the group consisting of a bit size, a quantity, a resolution, a time bracket, and any combination thereof.
Example 13: The computer-implemented method of any one of Examples 8-12, wherein the computer system transmits the first subset of the surgical data and defines the relation for the second subset of the surgical data without requiring action by a user.
Example 14: The computer-implemented method of any one of Examples 8-13, wherein the identity of each of the one or more databases corresponds to a department of a medical facility.
Example 15: A computer system configured to be communicably coupled to a plurality of surgical devices and a database. The computer system comprises a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive perioperative data from the plurality of surgical devices; determine a surgical context based at least in part on the perioperative data, the surgical context corresponding to surgical contextual data; receive a request for surgical data from the database, the surgical data comprising at least a portion of the perioperative data or the surgical contextual data; transmit the surgical data to the database according to an identity of the database; and define a relation between the surgical data stored in the memory and the database according to the identity of the database.
Example 16: The computer system of Example 15, wherein the memory stores instructions that, when executed by the processor, cause the computer system to: receive a security key in association with the request; authenticate the security key; transmit the surgical data to the database according to whether the security key is authentic; and define a relation between the surgical data stored in the memory and the database according to whether the security key is authentic.
Example 17: The computer system of Example 15 or 16, wherein the perioperative data comprises metadata associated with the plurality of surgical devices.
Example 18: The computer system of any one of Examples 15-17, wherein the surgical contextual data is selected from the group consisting of a procedure type, a procedure step, and a combination thereof.
Example 19: The computer system of any one of Examples 15-18, wherein a property of the surgical data transmitted to the database corresponds to the surgical context.
Example 20: The computer system of Example 19, wherein the property is selected from the group consisting of a bit size, a quantity, a resolution, a time bracket, and any combination thereof.
Example 21: The computer system of any one of Examples 15-20, wherein the identity of the database corresponds to a department of a medical facility.
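By way of illustration only, a minimal, non-limiting Python sketch of the context-driven routing recited in the preceding examples is provided below; the database identities, routing rules, and payload fields are assumptions.

# Hypothetical context-driven routing: a first subset of the surgical data is split
# into per-database payloads for transmission, and a relation table is defined for
# fields retained locally. Database identities, routing rules, and payload fields
# are illustrative assumptions.
ROUTING_RULES = {
    "billing": ["procedure_type", "device_costs"],
    "infection_control": ["procedure_type", "resterilization_events"],
}

def route_surgical_data(surgical_data):
    """Return (payloads to transmit per database, relations for locally retained fields)."""
    transmitted = {db: {k: surgical_data[k] for k in fields if k in surgical_data}
                   for db, fields in ROUTING_RULES.items()}
    routed_fields = {k for fields in ROUTING_RULES.values() for k in fields}
    retained_fields = set(surgical_data) - routed_fields
    relations = {field: list(ROUTING_RULES) for field in retained_fields}
    return transmitted, relations

data = {"procedure_type": "lobectomy", "device_costs": 190.0,
        "resterilization_events": 2, "raw_waveforms": "<large payload>"}
print(route_surgical_data(data))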
Various aspects of the subject matter described herein are set out in the following numbered examples:
Example 1: A computer system configured to be communicably coupled to a surgical device. The computer system comprises a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive perioperative data from the surgical device; determine a surgical context based at least in part on the perioperative data; determine a procedural variable associated with the surgical context; compare the procedural variable to a baseline for the procedural variable, the baseline corresponding to the surgical context; and provide a notification according to whether the procedural variable deviates from the baseline for the procedural variable.
Example 2: The computer system of Example 1, wherein the procedural variable comprises a surgical instrument type being utilized during a surgical procedure.
Example 3: The computer system of Example 2, wherein the notification comprises a recommendation for an alternative surgical instrument type for the surgical procedure.
Example 4: The computer system of Example 3, wherein the alternative surgical instrument type is associated with improved procedural outcomes for the surgical procedure.
Example 5: The computer system of Example 1, wherein the procedural variable comprises a surgical procedure length.
Example 6: The computer system of Example 5, wherein the notification comprises a recommendation for an alternative surgical device setup.
Example 7: The computer system of Example 1, wherein the procedural variable comprises a cost of surgical devices utilized during a surgical procedure.
Example 8: The computer system of Example 7, wherein the notification comprises a recommendation for a less-expensive surgical device setup.
Example 9: A computer-implemented method for providing recommendations associated with a surgical procedure, the method comprising: receiving, by a computer system, perioperative data from a surgical device; determining, by the computer system, a surgical context based at least in part on the perioperative data; determining, by the computer system, a procedural variable associated with the surgical context; comparing, by the computer system, the procedural variable to a baseline for the procedural variable, the baseline corresponding to the surgical context; and providing, by the computer system, a notification according to whether the procedural variable deviates from the baseline for the procedural variable.
Example 10: The method of Example 9, wherein the procedural variable comprises a surgical instrument type being utilized during a surgical procedure.
Example 11: The method of Example 10, wherein the notification comprises a recommendation for an alternative surgical instrument type for the surgical procedure.
Example 12: The method of Example 11, wherein the alternative surgical instrument type is associated with improved procedural outcomes for the surgical procedure.
Example 13: The method of Example 9, wherein the procedural variable comprises a surgical procedure length.
Example 14: The method of Example 13, wherein the notification comprises a recommendation for an alternative surgical device setup.
Example 15: The method of Example 9, wherein the procedural variable comprises a cost of surgical devices utilized during a surgical procedure.
Example 16: The method of Example 15, wherein the notification comprises a recommendation for a less-expensive surgical device setup.
Example 17: A computer system configured to be communicably coupled to a surgical device and a video camera. The computer system comprises a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: record a surgical procedure via the video camera; receive perioperative data from the surgical device; determine a surgical context based at least in part on the perioperative data; determine a procedural variable associated with the surgical context; compare the procedural variable to a baseline for the procedural variable, the baseline corresponding to the surgical context; and replay a recording of the surgical procedure, the recording including a notification according to whether the procedural variable deviates from the baseline for the procedural variable.
Example 18: The computer system of Example 17, wherein the procedural variable comprises a surgical instrument type being utilized during a surgical procedure.
Example 19: The computer system of Example 18, wherein the notification comprises a recommendation for an alternative surgical instrument type for the surgical procedure.
Example 20: The computer system of Example 19, wherein the alternative surgical instrument type is associated with improved procedural outcomes for the surgical procedure.
Example 21: The computer system of Example 17, wherein the procedural variable comprises a surgical procedure length.
Example 22: The computer system of Example 21, wherein the notification comprises a recommendation for an alternative surgical device setup.
Example 23: The computer system of Example 17, wherein the procedural variable comprises a cost of surgical devices utilized during a surgical procedure.
Example 24: The computer system of Example 23, wherein the notification comprises a recommendation for a less-expensive surgical device setup.
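By way of illustration only, a minimal, non-limiting Python sketch of the baseline comparison recited in the preceding examples is provided below, using procedure length as the procedural variable; the baseline values and the 20% deviation threshold are assumptions.

# Hypothetical baseline comparison for a procedural variable, here procedure length.
# The baseline values and the 20% deviation threshold are illustrative assumptions.
BASELINE_PROCEDURE_LENGTH_MIN = {"lobectomy": 180.0, "sigmoidectomy": 150.0}

def length_notification(procedure_type, observed_min, threshold=0.20):
    """Return a notification string when the observed procedure length deviates
    from the context-specific baseline by more than the assumed threshold."""
    baseline = BASELINE_PROCEDURE_LENGTH_MIN.get(procedure_type)
    if baseline is None:
        return None
    deviation = (observed_min - baseline) / baseline
    if deviation > threshold:
        return (f"Procedure ran {deviation:.0%} longer than baseline; "
                "consider an alternative surgical device setup.")
    return None

print(length_notification("lobectomy", observed_min=230.0))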
Various additional aspects of the subject matter described herein are set out in the following numbered examples:
Example 1: A surgical system comprising a first device comprising a first control circuit and a second device configured to effect a surgical function. The second device comprises a second control circuit in signal communication with the first control circuit. The second control circuit is configured to selectively toggle the second device between a secondary operating mode, in which the second device is configured to control the first device, and a primary operating mode, in which the second device is configured to control the surgical function.
Example 2: The surgical system of Example 1, wherein the first device comprises a display, wherein the second device comprises an end effector positioned within a sterile field, and wherein the end effector is viewable on the display.
Example 3: The surgical system of Examples 1 or 2, wherein the secondary operating mode comprises a cursor mode, and wherein the primary operating mode comprises a tissue treatment mode.
Example 4: The surgical system of any one of Examples 1-3, wherein the second device comprises a handle comprising an input switch movable between a first position and a second position, and wherein the first position corresponds to the primary operating mode and the second position corresponds to the secondary operating mode.
Example 5: The surgical system of any one of Examples 1-4, wherein the second control circuit is configured to toggle between the primary operating mode and the secondary operating mode in response to an audible command by a clinician.
Example 6: The surgical system of Example 3, wherein the end effector is configured to drag and drop an icon across the display in the cursor mode.
Example 7: The surgical system of Examples 3 or 6, wherein the end effector is configured to select an anatomical feature on the display in the cursor mode.
Example 8: The surgical system of any one of Examples 1-7, wherein the second device comprises an ultrasonic instrument configured to apply ultrasonic vibrations to tissue, wherein the ultrasonic instrument comprises a first actuation button and a second actuation button, wherein, in the primary operating mode, the first actuation button is configured to actuate a first energy level and the second actuation button is configured to actuate a second energy level, and wherein, in the secondary operating mode, the first actuation button comprises a first cursor button and the second actuation button comprises a second cursor button.
Example 9: A surgical system comprising an imaging system comprising a camera and a display screen. The surgical system further comprises a surgical device configured to effect a surgical function. The surgical device comprises a control circuit comprising a processor and a memory communicatively coupled to the processor, the memory storing instructions executable by the processor to: receive an input signal; in response to the input signal, switch between a first operational mode and a second operational mode; in the first operational mode, actuate the surgical function; and, in the second operational mode, control the display screen.
Example 10: The surgical system of Example 9, wherein the surgical device is configured to control the display screen through a surgical barrier.
Example 11: The surgical system of Examples 9 or 10, wherein the display screen comprises a video monitor in an operating room, and wherein the surgical device comprises a laparoscopic device comprising an end effector positioned in a patient in the operating room.
Example 12: The surgical system of Example 11, wherein the surgical device comprises an end effector, and wherein the camera is configured to track the end effector in the patient.
Example 13: The surgical system of Examples 11 or 12, wherein, in the second operational mode, the end effector is configured to interact with one or more icons on the video monitor as a cursor.
Example 14: The surgical system of any one of Examples 11-13, wherein, in the second operational mode, the end effector is configured to interact as a cursor with a video feed on the video monitor.
Example 15: The surgical system of any one of Examples 9-14, wherein the surgical device comprises a handle comprising an input switch movable between a first position and a second position, and wherein the first position corresponds to the first operational mode and the second position corresponds to the second operational mode.
Example 16: The surgical system of any one of Examples 9-14, wherein the control circuit is configured to toggle between the first operational mode and the second operational mode in response to an audible command by a clinician.
Example 17: A non-transitory computer readable medium storing computer readable instructions which, when executed, cause a surgical system to: receive an input signal; in response to the input signal, switch between a first operational mode and a second operational mode; in the first operational mode, actuate a surgical function; and, in the second operational mode, interact with a display screen through a surgical barrier.
Example 18: The non-transitory computer readable medium of Example 17, wherein the surgical system is configured to interact with the display screen through the surgical barrier by clicking on an icon on the display screen.
Example 19: The non-transitory computer readable medium of Examples 17 or 18, wherein the surgical system is configured to interact with the display screen through the surgical barrier by dragging and dropping an icon on the display screen.
Example 20: The non-transitory computer readable medium of any one of Examples 17-19, wherein the surgical system is configured to interact with the display screen through the surgical barrier by selecting a portion of a video.
While several forms have been illustrated and described, it is not the intention of Applicant to restrict or limit the scope of the appended claims to such detail. Numerous modifications, variations, changes, substitutions, combinations, and equivalents to those forms may be implemented and will occur to those skilled in the art without departing from the scope of the present disclosure. Moreover, the structure of each element associated with the described forms can be alternatively described as a means for providing the function performed by the element. Also, where materials are disclosed for certain components, other materials may be used. It is therefore to be understood that the foregoing description and the appended claims are intended to cover all such modifications, combinations, and variations as falling within the scope of the disclosed forms. The appended claims are intended to cover all such modifications, variations, changes, substitutions, combinations, and equivalents.
The foregoing detailed description has set forth various forms of the devices and/or processes via the use of block diagrams, flowcharts, and/or examples. Insofar as such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, and/or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. Those skilled in the art will recognize that some aspects of the forms disclosed herein, in whole or in part, can be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one skilled in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein are capable of being distributed as one or more program products in a variety of forms, and that an illustrative form of the subject matter described herein applies regardless of the particular type of signal bearing medium used to actually carry out the distribution.
Instructions used to program logic to perform various disclosed aspects can be stored within a memory in the system, such as dynamic random access memory (DRAM), cache, flash memory, or other storage. Furthermore, the instructions can be distributed via a network or by way of other computer readable media. Thus a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, floppy diskettes, optical disks, compact disc read-only memory (CD-ROMs), magneto-optical disks, read-only memory (ROM), random access memory (RAM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic or optical cards, flash memory, or a tangible, machine-readable storage used in the transmission of information over the Internet via electrical, optical, acoustical or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Accordingly, the non-transitory computer-readable medium includes any type of tangible machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
As used in any aspect herein, the term “control circuit” may refer to, for example, hardwired circuitry, programmable circuitry (e.g., a computer processor including one or more individual instruction processing cores, processing unit, processor, microcontroller, microcontroller unit, controller, digital signal processor (DSP), programmable logic device (PLD), programmable logic array (PLA), or field programmable gate array (FPGA)), state machine circuitry, firmware that stores instructions executed by programmable circuitry, and any combination thereof. The control circuit may, collectively or individually, be embodied as circuitry that forms part of a larger system, for example, an integrated circuit (IC), an application-specific integrated circuit (ASIC), a system on-chip (SoC), desktop computers, laptop computers, tablet computers, servers, smart phones, etc. Accordingly, as used herein “control circuit” includes, but is not limited to, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program which at least partially carries out processes and/or devices described herein), electrical circuitry forming a memory device (e.g., forms of random access memory), and/or electrical circuitry forming a communications device (e.g., a modem, communications switch, or optical-electrical equipment). Those having skill in the art will recognize that the subject matter described herein may be implemented in an analog or digital fashion or some combination thereof.
As used in any aspect herein, the term “logic” may refer to an app, software, firmware and/or circuitry configured to perform any of the aforementioned operations. Software may be embodied as a software package, code, instructions, instruction sets and/or data recorded on non-transitory computer readable storage medium. Firmware may be embodied as code, instructions or instruction sets and/or data that are hard-coded (e.g., nonvolatile) in memory devices.
As used in any aspect herein, the terms “component,” “system,” “module” and the like can refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution.
As used in any aspect herein, an “algorithm” refers to a self-consistent sequence of steps leading to a desired result, where a “step” refers to a manipulation of physical quantities and/or logic states which may, though need not necessarily, take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It is common usage to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. These and similar terms may be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities and/or states.
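As a purely illustrative and non-limiting example of such a self-consistent sequence of steps, the following sketch computes the running mean of a series of stored quantities, with each step combining stored values to yield an intermediate result; the sample values are hypothetical.

```python
def running_mean(samples):
    """Compute the running mean of a sequence of numeric samples.

    Each step combines the accumulated total with the next sample and
    divides by the count so far, yielding one intermediate result per step.
    """
    total = 0.0
    means = []
    for count, value in enumerate(samples, start=1):
        total += value                 # combine the stored quantity with the new one
        means.append(total / count)    # derive the result of this step
    return means


if __name__ == "__main__":
    # Hypothetical readings used only to illustrate the stepwise manipulation.
    print(running_mean([2.0, 4.0, 6.0, 8.0]))  # [2.0, 3.0, 4.0, 5.0]
```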
A network may include a packet switched network. The communication devices may be capable of communicating with each other using a selected packet switched network communications protocol. One example communications protocol may include an Ethernet communications protocol which may be capable of permitting communication using a Transmission Control Protocol/Internet Protocol (TCP/IP). The Ethernet protocol may comply or be compatible with the Ethernet standard published by the Institute of Electrical and Electronics Engineers (IEEE) titled “IEEE 802.3 Standard”, published in December 2008, and/or later versions of this standard. Alternatively or additionally, the communication devices may be capable of communicating with each other using an X.25 communications protocol. The X.25 communications protocol may comply or be compatible with a standard promulgated by the International Telecommunication Union-Telecommunication Standardization Sector (ITU-T). Alternatively or additionally, the communication devices may be capable of communicating with each other using a frame relay communications protocol. The frame relay communications protocol may comply or be compatible with a standard promulgated by the Consultative Committee for International Telegraph and Telephone (CCITT) and/or the American National Standards Institute (ANSI). Alternatively or additionally, the transceivers may be capable of communicating with each other using an Asynchronous Transfer Mode (ATM) communications protocol. The ATM communications protocol may comply or be compatible with an ATM standard published by the ATM Forum titled “ATM-MPLS Network Interworking 2.0” published August 2001, and/or later versions of this standard. Of course, different and/or after-developed connection-oriented network communication protocols are equally contemplated herein.
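By way of illustration only, and not limitation, two communication devices on a packet switched network may exchange information over a TCP/IP connection as in the following sketch. The loopback address, port number, and message payload are hypothetical and chosen solely for illustration; the example uses standard operating system socket facilities rather than any particular protocol implementation described herein.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 50007  # hypothetical loopback endpoint, for illustration only

# Bind and listen before starting the client so the connection cannot race the server.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind((HOST, PORT))
listener.listen(1)


def echo_once() -> None:
    """Accept one TCP connection and echo back whatever is received."""
    conn, _addr = listener.accept()
    with conn:
        conn.sendall(conn.recv(1024))


if __name__ == "__main__":
    threading.Thread(target=echo_once, daemon=True).start()
    # Client side: open a TCP/IP connection and send an illustrative message.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
        client.connect((HOST, PORT))
        client.sendall(b"status: ok")   # hypothetical payload
        print(client.recv(1024))        # b'status: ok'
    listener.close()
```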
Unless specifically stated otherwise as apparent from the foregoing disclosure, it is appreciated that, throughout the foregoing disclosure, discussions using terms such as “processing,” “computing,” “calculating,” “determining,” “displaying,” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
One or more components may be referred to herein as “configured to,” “configurable to,” “operable/operative to,” “adapted/adaptable,” “able to,” “conformable/conformed to,” etc. Those skilled in the art will recognize that “configured to” can generally encompass active-state components and/or inactive-state components and/or standby-state components, unless context requires otherwise.
The terms “proximal” and “distal” are used herein with reference to a clinician manipulating the handle portion of the surgical instrument. The term “proximal” refers to the portion closest to the clinician and the term “distal” refers to the portion located away from the clinician. It will be further appreciated that, for convenience and clarity, spatial terms such as “vertical”, “horizontal”, “up”, and “down” may be used herein with respect to the drawings. However, surgical instruments are used in many orientations and positions, and these terms are not intended to be limiting and/or absolute.
Those skilled in the art will recognize that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as “open” terms (e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to claims containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an” (e.g., “a” and/or “an” should typically be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations.
In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of “two recitations,” without other modifiers, typically means at least two recitations, or two or more recitations). Furthermore, in those instances where a convention analogous to “at least one of A, B, and C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, and C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). In those instances where a convention analogous to “at least one of A, B, or C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, or C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that typically a disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms unless context dictates otherwise. For example, the phrase “A or B” will be typically understood to include the possibilities of “A” or “B” or “A and B.”
With respect to the appended claims, those skilled in the art will appreciate that recited operations therein may generally be performed in any order. Also, although various operational flow diagrams are presented in a sequence(s), it should be understood that the various operations may be performed in other orders than those which are illustrated, or may be performed concurrently. Examples of such alternate orderings may include overlapping, interleaved, interrupted, reordered, incremental, preparatory, supplemental, simultaneous, reverse, or other variant orderings, unless context dictates otherwise. Furthermore, terms like “responsive to,” “related to,” or other past-tense adjectives are generally not intended to exclude such variants, unless context dictates otherwise.
It is worthy to note that any reference to “one aspect,” “an aspect,” “an exemplification,” “one exemplification,” and the like means that a particular feature, structure, or characteristic described in connection with the aspect is included in at least one aspect. Thus, appearances of the phrases “in one aspect,” “in an aspect,” “in an exemplification,” and “in one exemplification” in various places throughout the specification are not necessarily all referring to the same aspect. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner in one or more aspects.
Any patent application, patent, non-patent publication, or other disclosure material referred to in this specification and/or listed in any Application Data Sheet is incorporated by reference herein, to the extent that the incorporated material is not inconsistent herewith. As such, and to the extent necessary, the disclosure as explicitly set forth herein supersedes any conflicting material incorporated herein by reference. Any material, or portion thereof, that is said to be incorporated by reference herein, but which conflicts with existing definitions, statements, or other disclosure material set forth herein will only be incorporated to the extent that no conflict arises between that incorporated material and the existing disclosure material.
In summary, numerous benefits have been described which result from employing the concepts described herein. The foregoing description of the one or more forms has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise forms described. Modifications or variations are possible in light of the above teachings. The one or more forms were chosen and described in order to illustrate principles and practical application, thereby enabling one of ordinary skill in the art to utilize the various forms, with various modifications, as suited to the particular use contemplated. It is intended that the claims submitted herewith define the overall scope.
This application claims priority to U.S. patent application Ser. No. 16/209,490, titled METHOD FOR FACILITY DATA COLLECTION AND INTERPRETATION, filed Dec. 4, 2018, now U.S. Patent Application Publication No. 2019/0206564, the disclosure of which is herein incorporated by reference in its entirety. U.S. patent application Ser. No. 16/209,490 claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/773,778, titled METHOD FOR ADAPTIVE CONTROL SCHEMES FOR SURGICAL NETWORK CONTROL AND INTERACTION, filed Nov. 30, 2018, to U.S. Provisional Patent Application No. 62/773,728, titled METHOD FOR SITUATIONAL AWARENESS FOR SURGICAL NETWORK OR SURGICAL NETWORK CONNECTED DEVICE CAPABLE OF ADJUSTING FUNCTION BASED ON A SENSED SITUATION OR USAGE, filed Nov. 30, 2018, to U.S. Provisional Patent Application No. 62/773,741, titled METHOD FOR FACILITY DATA COLLECTION AND INTERPRETATION, filed Nov. 30, 2018, and to U.S. Provisional Patent Application No. 62/773,742, titled METHOD FOR CIRCULAR STAPLER CONTROL ALGORITHM ADJUSTMENT BASED ON SITUATIONAL AWARENESS, filed Nov. 30, 2018, the disclosure of each of which is herein incorporated by reference in its entirety. U.S. patent application Ser. No. 16/209,490 claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/750,529, titled METHOD FOR OPERATING A POWERED ARTICULATING MULTI-CLIP APPLIER, filed Oct. 25, 2018, to U.S. Provisional Patent Application No. 62/750,539, titled SURGICAL CLIP APPLIER, filed Oct. 25, 2018, and to U.S. Provisional Patent Application No. 62/750,555, titled SURGICAL CLIP APPLIER, filed Oct. 25, 2018, the disclosure of each of which is herein incorporated by reference in its entirety. U.S. patent application Ser. No. 16/209,490 also claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/729,183, titled CONTROL FOR A SURGICAL NETWORK OR SURGICAL NETWORK CONNECTED DEVICE THAT ADJUSTS ITS FUNCTION BASED ON A SENSED SITUATION OR USAGE, filed Sep. 10, 2018, to U.S. Provisional Patent Application No. 62/729,177, titled AUTOMATED DATA SCALING, ALIGNMENT, AND ORGANIZING BASED ON PREDEFINED PARAMETERS WITHIN A SURGICAL NETWORK BEFORE TRANSMISSION, filed Sep. 10, 2018, to U.S. Provisional Patent Application No. 62/729,176, titled INDIRECT COMMAND AND CONTROL OF A FIRST OPERATING ROOM SYSTEM THROUGH THE USE OF A SECOND OPERATING ROOM SYSTEM WITHIN A STERILE FIELD WHERE THE SECOND OPERATING ROOM SYSTEM HAS PRIMARY AND SECONDARY OPERATING MODES, filed Sep. 10, 2018, to U.S. Provisional Patent Application No. 62/729,185, titled POWERED STAPLING DEVICE THAT IS CAPABLE OF ADJUSTING FORCE, ADVANCEMENT SPEED, AND OVERALL STROKE OF CUTTING MEMBER OF THE DEVICE BASED ON SENSED PARAMETER OF FIRING OR CLAMPING, filed Sep. 10, 2018, to U.S. Provisional Patent Application No. 62/729,184, titled POWERED SURGICAL TOOL WITH A PREDEFINED ADJUSTABLE CONTROL ALGORITHM FOR CONTROLLING AT LEAST ONE END EFFECTOR PARAMETER AND A MEANS FOR LIMITING THE ADJUSTMENT, filed Sep. 10, 2018, to U.S. Provisional Patent Application No. 62/729,182, titled SENSING THE PATIENT POSITION AND CONTACT UTILIZING THE MONO-POLAR RETURN PAD ELECTRODE TO PROVIDE SITUATIONAL AWARENESS TO THE HUB, filed Sep. 10, 2018, to U.S. Provisional Patent Application No. 62/729,191, titled SURGICAL NETWORK RECOMMENDATIONS FROM REAL TIME ANALYSIS OF PROCEDURE VARIABLES AGAINST A BASELINE HIGHLIGHTING DIFFERENCES FROM THE OPTIMAL SOLUTION, filed Sep. 10, 2018, to U.S. Provisional Patent Application No. 
62/729,195, titled ULTRASONIC ENERGY DEVICE WHICH VARIES PRESSURE APPLIED BY CLAMP ARM TO PROVIDE THRESHOLD CONTROL PRESSURE AT A CUT PROGRESSION LOCATION, filed Sep. 10, 2018, and to U.S. Provisional Patent Application No. 62/729,186, titled WIRELESS PAIRING OF A SURGICAL DEVICE WITH ANOTHER DEVICE WITHIN A STERILE SURGICAL FIELD BASED ON THE USAGE AND SITUATIONAL AWARENESS OF DEVICES, filed Sep. 10, 2018, the disclosure of each of which is herein incorporated by reference in its entirety. U.S. patent application Ser. No. 16/209,490 also claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/721,995, titled CONTROLLING AN ULTRASONIC SURGICAL INSTRUMENT ACCORDING TO TISSUE LOCATION, filed Aug. 23, 2018, to U.S. Provisional Patent Application No. 62/721,998, titled SITUATIONAL AWARENESS OF ELECTROSURGICAL SYSTEMS, filed Aug. 23, 2018, to U.S. Provisional Patent Application No. 62/721,999, titled INTERRUPTION OF ENERGY DUE TO INADVERTENT CAPACITIVE COUPLING, filed Aug. 23, 2018, to U.S. Provisional Patent Application No. 62/721,994, titled BIPOLAR COMBINATION DEVICE THAT AUTOMATICALLY ADJUSTS PRESSURE BASED ON ENERGY MODALITY, filed Aug. 23, 2018, and to U.S. Provisional Patent Application No. 62/721,996, titled RADIO FREQUENCY ENERGY DEVICE FOR DELIVERING COMBINED ELECTRICAL SIGNALS, filed Aug. 23, 2018, the disclosure of each of which is herein incorporated by reference in its entirety. U.S. patent application Ser. No. 16/209,490 also claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/692,747, titled SMART ACTIVATION OF AN ENERGY DEVICE BY ANOTHER DEVICE, filed on Jun. 30, 2018, to U.S. Provisional Patent Application No. 62/692,748, titled SMART ENERGY ARCHITECTURE, filed on Jun. 30, 2018, and to U.S. Provisional Patent Application No. 62/692,768, titled SMART ENERGY DEVICES, filed on Jun. 30, 2018, the disclosure of each of which is herein incorporated by reference in its entirety. U.S. patent application Ser. No. 16/209,490 also claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/691,228, titled METHOD OF USING REINFORCED FLEX CIRCUITS WITH MULTIPLE SENSORS WITH ELECTROSURGICAL DEVICES, filed Jun. 28, 2018, to U.S. Provisional Patent Application No. 62/691,227, titled CONTROLLING A SURGICAL INSTRUMENT ACCORDING TO SENSED CLOSURE PARAMETERS, filed Jun. 28, 2018, to U.S. Provisional Patent Application No. 62/691,230, titled SURGICAL INSTRUMENT HAVING A FLEXIBLE ELECTRODE, filed Jun. 28, 2018, to U.S. Provisional Patent Application No. 62/691,219, titled SURGICAL EVACUATION SENSING AND MOTOR CONTROL, filed Jun. 28, 2018, to U.S. Provisional Patent Application No. 62/691,257, titled COMMUNICATION OF SMOKE EVACUATION SYSTEM PARAMETERS TO HUB OR CLOUD IN SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM, filed Jun. 28, 2018, to U.S. Provisional Patent Application No. 62/691,262, titled SURGICAL EVACUATION SYSTEM WITH A COMMUNICATION CIRCUIT FOR COMMUNICATION BETWEEN A FILTER AND A SMOKE EVACUATION DEVICE, filed Jun. 28, 2018, and to U.S. Provisional Patent Application No. 62/691,251, titled DUAL IN-SERIES LARGE AND SMALL DROPLET FILTERS, filed Jun. 28, 2018, the disclosure of each of which is herein incorporated by reference in its entirety. U.S. patent application Ser. No. 16/209,490 claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/665,129, titled SURGICAL SUTURING SYSTEMS, filed May 1, 2018, to U.S. Provisional Patent Application No. 
62/665,139, titled SURGICAL INSTRUMENTS COMPRISING CONTROL SYSTEMS, filed May 1, 2018, to U.S. Provisional Patent Application No. 62/665,177, titled SURGICAL INSTRUMENTS COMPRISING HANDLE ARRANGEMENTS, filed May 1, 2018, to U.S. Provisional Patent Application No. 62/665,128, titled MODULAR SURGICAL INSTRUMENTS, filed May 1, 2018, to U.S. Provisional Patent Application No. 62/665,192, titled SURGICAL DISSECTORS, filed May 1, 2018, and to U.S. Provisional Patent Application No. 62/665,134, titled SURGICAL CLIP APPLIER, filed May 1, 2018, the disclosure of each of which is herein incorporated by reference in its entirety. U.S. patent application Ser. No. 16/209,490 also claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/659,900, titled METHOD OF HUB COMMUNICATION, filed on Apr. 19, 2018, the disclosure of which is herein incorporated by reference in its entirety. U.S. patent application Ser. No. 16/209,490 also claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/650,898, filed on Mar. 30, 2018, titled CAPACITIVE COUPLED RETURN PATH PAD WITH SEPARABLE ARRAY ELEMENTS, to U.S. Provisional Patent Application No. 62/650,887, titled SURGICAL SYSTEMS WITH OPTIMIZED SENSING CAPABILITIES, filed Mar. 30, 2018, to U.S. Provisional Patent Application No. 62/650,882, titled SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM, filed Mar. 30, 2018, and to U.S. Provisional Patent Application No. 62/650,877, titled SURGICAL SMOKE EVACUATION SENSING AND CONTROLS, filed Mar. 30, 2018, the disclosure of each of which is herein incorporated by reference in its entirety. U.S. patent application Ser. No. 16/209,490 also claims the benefit of priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/649,302, titled INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATION CAPABILITIES, filed Mar. 28, 2018, to U.S. Provisional Patent Application No. 62/649,294, titled DATA STRIPPING METHOD TO INTERROGATE PATIENT RECORDS AND CREATE ANONYMIZED RECORD, filed Mar. 28, 2018, to U.S. Provisional Patent Application No. 62/649,300, titled SURGICAL HUB SITUATIONAL AWARENESS, filed Mar. 28, 2018, to U.S. Provisional Patent Application No. 62/649,309, titled SURGICAL HUB SPATIAL AWARENESS TO DETERMINE DEVICES IN OPERATING THEATER, filed Mar. 28, 2018, to U.S. Provisional Patent Application No. 62/649,310, titled COMPUTER IMPLEMENTED INTERACTIVE SURGICAL SYSTEMS, filed Mar. 28, 2018, to U.S. Provisional Patent Application No. 62/649,291, titled USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT, filed Mar. 28, 2018, to U.S. Provisional Patent Application No. 62/649,296, titled ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL DEVICES, filed Mar. 28, 2018, to U.S. Provisional Patent Application No. 62/649,333, titled CLOUD-BASED MEDICAL ANALYTICS FOR CUSTOMIZATION AND RECOMMENDATIONS TO A USER, filed Mar. 28, 2018, to U.S. Provisional Patent Application No. 62/649,327, titled CLOUD-BASED MEDICAL ANALYTICS FOR SECURITY AND AUTHENTICATION TRENDS AND REACTIVE MEASURES, filed Mar. 28, 2018, to U.S. Provisional Patent Application No. 62/649,315, titled DATA HANDLING AND PRIORITIZATION IN A CLOUD ANALYTICS NETWORK, filed Mar. 28, 2018, to U.S. Provisional Patent Application No. 62/649,313, titled CLOUD INTERFACE FOR COUPLED SURGICAL DEVICES, filed Mar. 28, 2018, to U.S. Provisional Patent Application No. 62/649,320, titled DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS, filed Mar. 
28, 2018, to U.S. Provisional Patent Application No. 62/649,307, titled AUTOMATIC TOOL ADJUSTMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS, filed Mar. 28, 2018, and to U.S. Provisional Patent Application No. 62/649,323, titled SENSING ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS, filed Mar. 28, 2018, the disclosure of each of which is herein incorporated by reference in its entirety. U.S. patent application Ser. No. 16/209,490 also claims the benefit of priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, to U.S. Provisional Patent Application No. 62/611,340, titled CLOUD-BASED MEDICAL ANALYTICS, filed Dec. 28, 2017, and to U.S. Provisional Patent Application No. 62/611,339, titled ROBOT ASSISTED SURGICAL PLATFORM, filed Dec. 28, 2017, the disclosure of each of which is herein incorporated by reference in its entirety.
Number | Date | Country |
---|---|---|
62773778 | Nov 2018 | US | |
62773728 | Nov 2018 | US | |
62773741 | Nov 2018 | US | |
62773742 | Nov 2018 | US | |
62750529 | Oct 2018 | US | |
62750539 | Oct 2018 | US | |
62750555 | Oct 2018 | US | |
62729183 | Sep 2018 | US | |
62729177 | Sep 2018 | US | |
62729176 | Sep 2018 | US | |
62729185 | Sep 2018 | US | |
62729184 | Sep 2018 | US | |
62729182 | Sep 2018 | US | |
62729191 | Sep 2018 | US | |
62729195 | Sep 2018 | US | |
62729186 | Sep 2018 | US | |
62721995 | Aug 2018 | US | |
62721998 | Aug 2018 | US | |
62721999 | Aug 2018 | US | |
62721994 | Aug 2018 | US | |
62721996 | Aug 2018 | US | |
62692747 | Jun 2018 | US | |
62692748 | Jun 2018 | US | |
62692768 | Jun 2018 | US | |
62691228 | Jun 2018 | US | |
62691227 | Jun 2018 | US | |
62691230 | Jun 2018 | US | |
62691219 | Jun 2018 | US | |
62691257 | Jun 2018 | US | |
62691262 | Jun 2018 | US | |
62691251 | Jun 2018 | US | |
62665129 | May 2018 | US | |
62665139 | May 2018 | US | |
62665177 | May 2018 | US | |
62665128 | May 2018 | US | |
62665192 | May 2018 | US | |
62665134 | May 2018 | US | |
62659900 | Apr 2018 | US | |
62650898 | Mar 2018 | US | |
62650887 | Mar 2018 | US | |
62650882 | Mar 2018 | US | |
62650877 | Mar 2018 | US | |
62649302 | Mar 2018 | US | |
62649294 | Mar 2018 | US | |
62649300 | Mar 2018 | US | |
62649309 | Mar 2018 | US | |
62649310 | Mar 2018 | US | |
62649291 | Mar 2018 | US | |
62649296 | Mar 2018 | US | |
62649333 | Mar 2018 | US | |
62649327 | Mar 2018 | US | |
62649315 | Mar 2018 | US | |
62649313 | Mar 2018 | US | |
62649320 | Mar 2018 | US | |
62649307 | Mar 2018 | US | |
62649323 | Mar 2018 | US | |
62611341 | Dec 2017 | US | |
62611340 | Dec 2017 | US | |
62611339 | Dec 2017 | US |
Relation | Number | Date | Country
---|---|---|---
Parent | 16209490 | Dec 2018 | US |
Child | 17215933 | US |