BROADCAST AND PEER-TO-PEER COMMUNICATION FOR SURGICAL SYSTEMS

Information

  • Patent Application
  • Publication Number: 20250175439
  • Date Filed: November 26, 2024
  • Date Published: May 29, 2025
Abstract
Surgical systems and related computer-implemented methods are provided, including, during performance of a surgical procedure on a patient, receiving, at a first surgical system, a first dataflow from a second surgical system, the first dataflow including first data regarding a measured patient parameter that the first surgical system is configured to use in performing a function during the performance of the surgical procedure. The computer-implemented methods also include determining that a trigger event occurred during the performance of the surgical procedure such that a sum of a first bandwidth of the first dataflow and a second bandwidth of a second dataflow exceeds an available bandwidth, and, in response to determining that the trigger event occurred, and during the performance of the surgical procedure, adjusting at least one of the first and second dataflows such that the sum of the first and second bandwidths does not exceed the available bandwidth.
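By way of illustration only, the following Python sketch shows one way the bandwidth check described in the abstract could be realized. The `Dataflow` fields, the priority ordering, and the throttling rule are assumptions for the example; the disclosure does not prescribe a specific adjustment strategy.

```python
# Hedged sketch of the trigger-event bandwidth adjustment; all names
# (Dataflow, available_bandwidth, min_bandwidth) are hypothetical.
from dataclasses import dataclass


@dataclass
class Dataflow:
    name: str
    bandwidth: float       # current bandwidth, e.g., in Mbit/s
    priority: int          # lower number = higher priority
    min_bandwidth: float   # floor below which the flow is not useful


def adjust_dataflows(flows: list[Dataflow], available_bandwidth: float) -> None:
    """If the summed bandwidth exceeds what is available (the trigger
    event), throttle the lowest-priority flows first until the sum fits."""
    if sum(f.bandwidth for f in flows) <= available_bandwidth:
        return  # no trigger event
    for f in sorted(flows, key=lambda f: f.priority, reverse=True):
        excess = sum(x.bandwidth for x in flows) - available_bandwidth
        if excess <= 0:
            break
        f.bandwidth -= min(excess, f.bandwidth - f.min_bandwidth)


flows = [Dataflow("patient_vitals", 2.0, 0, 1.0),
         Dataflow("video_feed", 8.0, 1, 3.0)]
adjust_dataflows(flows, available_bandwidth=9.0)
print([(f.name, f.bandwidth) for f in flows])  # video_feed throttled to 7.0
```

In this sketch the high-priority vitals flow is left untouched and only the video feed is throttled, which is one plausible policy among many the disclosure would cover.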
Description
FIELD

The present disclosure relates generally to smart surgical devices, systems, and methods.


BACKGROUND

Surgical operations and environments have benefited from advances in technology. These advances include upgraded equipment, therapeutics, techniques, and more, which have resulted in more favorable outcomes for both patients and healthcare personnel. Further benefits can be realized through the continued advancement of technology and the continued integration of such advancements into surgical operations and environments.


Computers are increasingly ubiquitous in everyday life, and as the power of computers and computing systems increases, larger quantities of data can be processed in ways that render meaningful results and information for end users. This type of big data processing offers immense benefits to surgical operations and environments as well, as more information can be distilled into meaningful assistance for a user, such as a surgeon, to use and rely on during surgical operations. Ultimately, this additional information for the user can result in even more favorable outcomes for both patients and healthcare personnel.


SUMMARY

In general, smart surgical devices, systems, and methods are provided.


In one embodiment, a surgical data management system is provided that includes a plurality of surgical systems including a first surgical system, a second surgical system, and at least one additional surgical system. The plurality of surgical systems are configured to all be in use during performance of a surgical procedure on a patient. The first surgical system is configured to, during the performance of the surgical procedure, change between an individualized listening state, in which the first surgical system is a destination of dataflows from a first selected subset of the plurality of surgical systems, and a globalized listening state, in which the first surgical system is a destination of dataflows from all of the plurality of surgical systems. The second surgical system is configured to, during the performance of the surgical procedure, change between an individualized broadcast state, in which the second surgical system is a source of dataflows to a second selected subset of the plurality of surgical systems, and a globalized broadcast state, in which the second surgical system is a source of dataflows to all of the plurality of surgical systems. The dataflows to the first surgical system and the dataflows from the second surgical system include data collected in relation to and in real time with the performance of the surgical procedure and including at least one of patient data, surgical procedure data, and surgical instrument data.
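As a minimal Python sketch of the two state pairs recited above (and of the anomalous-event variation described in the following paragraph), the class and method names here are illustrative assumptions, not part of the disclosure:

```python
# Hedged sketch of the listening/broadcast state pairs from the summary.
from enum import Enum, auto


class ListeningState(Enum):
    INDIVIDUALIZED = auto()  # destination of dataflows from a selected subset
    GLOBALIZED = auto()      # destination of dataflows from all systems in use


class BroadcastState(Enum):
    INDIVIDUALIZED = auto()  # source of dataflows to a selected subset
    GLOBALIZED = auto()      # source of dataflows to all systems in use


class SurgicalSystem:
    def __init__(self, name: str) -> None:
        self.name = name
        self.listening = ListeningState.INDIVIDUALIZED
        self.broadcast = BroadcastState.INDIVIDUALIZED
        self.listen_subset: set[str] = set()  # selected sources to accept

    def accepts(self, source: str) -> bool:
        """A dataflow is consumed when listening globally or when its
        source is in the selected subset."""
        return (self.listening is ListeningState.GLOBALIZED
                or source in self.listen_subset)

    def on_anomalous_event(self) -> None:
        # Widen listening to all systems while an anomaly is active.
        self.listening = ListeningState.GLOBALIZED

    def on_anomaly_resolved(self) -> None:
        # Return to the selected-subset mode once the anomaly is resolved.
        self.listening = ListeningState.INDIVIDUALIZED
```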


The surgical data management system can have any number of variations. For example, the first surgical system can be configured to change from the individualized listening state to the globalized listening state in response to the first surgical system detecting occurrence of an anomalous event during the performance of the surgical procedure. Further, the first surgical system can be configured to change from the globalized listening state to the individualized listening state in response to resolution of the anomalous event.


For another example, the first surgical system can be configured to transmit information to all of the plurality of surgical systems indicative of dataflows to be broadcast from the first surgical system, and the second surgical system can be configured to determine, based on the information received from the first surgical system, whether to include the first surgical system in the second selected subset of the plurality of surgical systems.


For yet another example, the second surgical system can be configured to change from the individualized broadcast state to the globalized broadcast state in response to the second surgical system having an emergency broadcast to transmit to all of the plurality of surgical systems, and, in the globalized broadcast state, the second surgical system can be configured to transmit the emergency broadcast to all of the plurality of surgical systems. Further, the second surgical system can be configured to change from the globalized broadcast state to the individualized broadcast state after the transmission of the emergency broadcast; receipt of the emergency broadcast at the plurality of surgical systems can be configured to cause each of the plurality of surgical systems that includes a display to show an alert on the display; the emergency broadcast can be configured to inform the plurality of surgical systems of a step reached in the surgical procedure being performed, and the emergency broadcast can be configured to inform at least one of the plurality of surgical systems of a setting needed at the at least one of the plurality of surgical systems during the performance of the step of the surgical procedure; and/or the first surgical system can be configured to determine, based on the emergency broadcast, whether to include the second surgical system in the first selected subset of the plurality of surgical systems.
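The emergency-broadcast sequence described above (widen to all systems, transmit, revert, alert any displays) might be sketched in Python as follows; the in-memory fan-out, the `System` class, and the example message are assumptions for illustration:

```python
# Hedged sketch of the emergency-broadcast sequence; not the disclosed transport.
class System:
    def __init__(self, name: str, has_display: bool = False) -> None:
        self.name = name
        self.has_display = has_display
        self.broadcast_mode = "individualized"

    def receive_emergency(self, message: str) -> None:
        if self.has_display:
            print(f"[{self.name}] ALERT: {message}")  # alert on any display


def send_emergency_broadcast(sender: System, all_systems: list[System],
                             message: str) -> None:
    sender.broadcast_mode = "globalized"      # widen from the subset to all
    for system in all_systems:
        system.receive_emergency(message)     # every system gets the broadcast
    sender.broadcast_mode = "individualized"  # revert after transmission


systems = [System("display_1", has_display=True), System("stapler")]
send_emergency_broadcast(System("energy_generator"), systems,
                         "step reached: vessel sealing; set generator power")
```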


For still another example, in the individualized broadcast state, the second surgical system can use discrete addressing, and, in the globalized broadcast state, the second surgical system can use multiple distribution addressing.
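One plausible reading of this distinction, sketched below in Python, maps discrete addressing onto per-destination UDP unicast and multiple distribution addressing onto IP multicast. The addresses, port, and function names are placeholders, and the disclosure is not limited to UDP:

```python
# Hedged sketch: discrete addressing vs. multiple distribution addressing.
import socket

SUBSET = [("10.0.0.11", 5005), ("10.0.0.12", 5005)]  # individualized targets
MCAST_GROUP, MCAST_PORT = "239.1.1.1", 5005          # globalized group address


def send_individualized(payload: bytes) -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        for addr in SUBSET:           # one discrete datagram per destination
            s.sendto(payload, addr)


def send_globalized(payload: bytes) -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
        s.sendto(payload, (MCAST_GROUP, MCAST_PORT))  # one send, many receivers
```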


For another example, the second surgical system can be configured to transmit information to all of the plurality of surgical systems indicative of dataflows to be broadcast from the second surgical system, and the first surgical system can be configured to determine, based on the information received from the second surgical system, whether to include the second surgical system in the first selected subset of the plurality of surgical systems.


For yet another example, the plurality of surgical systems can include a first display configured to display surgical information during the performance of the surgical procedure on the patient, the first display can be included in the second selected subset of the plurality of surgical systems, and, in response to a second display being added to the plurality of surgical systems configured to all be in use during the performance of the surgical procedure on the patient, the second surgical system can be configured to add the second display to the second selected subset of the plurality of surgical systems.


For another example, the second surgical system can be configured to broadcast to all of the plurality of surgical systems a notice indicative of an upcoming change of the second surgical system from the individualized broadcast state to the globalized broadcast state.


For yet another example, the surgical data management system can also include a surgical hub configured to be communicatively coupled with all of the plurality of surgical systems during the performance of the surgical procedure on the patient; and, in response to the surgical hub being communicatively coupled with a one of the plurality of surgical systems, the surgical hub can be configured to at least one of: add the one of the plurality of surgical systems to the first selected subset of the plurality of surgical systems, and add the one of the plurality of surgical systems to the second selected subset of the plurality of surgical systems. Further, the surgical hub can be configured to add the one of the plurality of surgical systems to the first selected subset of the plurality of surgical systems in response to detection that the one of the plurality of surgical systems is pre-configured for use with the first surgical system, and the surgical hub can be configured to add the one of the plurality of surgical systems to the second selected subset of the plurality of surgical systems in response to detection that the one of the plurality of surgical systems is pre-configured for use with the second surgical system.
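A minimal sketch of the hub-managed subset updates described above follows; the pairing table, system names, and class structure are assumptions used only to show the pre-configuration check on coupling:

```python
# Hedged sketch of a hub adding newly coupled systems to selected subsets.
class SurgicalHub:
    def __init__(self) -> None:
        # Assumed pairing data: which peers each device is pre-configured for.
        self.preconfigured_for_first: set[str] = {"stapler", "scope"}
        self.preconfigured_for_second: set[str] = {"energy_generator"}
        self.first_selected_subset: set[str] = set()   # sources for system 1
        self.second_selected_subset: set[str] = set()  # destinations of system 2

    def on_coupled(self, system: str) -> None:
        """Called when a system becomes communicatively coupled to the hub."""
        if system in self.preconfigured_for_first:
            self.first_selected_subset.add(system)
        if system in self.preconfigured_for_second:
            self.second_selected_subset.add(system)


hub = SurgicalHub()
hub.on_coupled("stapler")
print(hub.first_selected_subset)  # {'stapler'}
```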


For another example, the surgical data management system can also include a surgical hub configured to be communicatively coupled with the plurality of surgical systems during the performance of the surgical procedure, configured to cause the change for the first surgical system, and configured to cause the change for the second surgical system.


For yet another example, the first surgical system can be configured to cause the change for the first surgical system, and the second surgical system can be configured to cause the change for the second surgical system.


For still another example, each of the plurality of surgical systems can be one of a hospital network, a database, a surgical instrument, or a surgical cart, e.g., the first and second surgical systems both being surgical instruments, one being a surgical instrument and the other being a database, etc.


In another embodiment, a surgical data management system is provided that includes a processor and a memory storing instructions that, when executed by the processor, cause the processor to perform operations including: changing, during performance of a surgical procedure on a patient, a first surgical system between an individualized listening state, in which the first surgical system is a destination of dataflows from a first selected subset of a plurality of surgical systems, and a globalized listening state, in which the first surgical system is a destination of dataflows from all of the plurality of surgical systems. The plurality of surgical systems includes the first surgical system, a second surgical system, and at least one additional surgical system, and the plurality of surgical systems are configured to all be in use during the performance of the surgical procedure. The operations also include changing, during the performance of the surgical procedure, the second surgical system between an individualized broadcast state, in which the second surgical system is a source of dataflows to a second selected subset of the plurality of surgical systems, and a globalized broadcast state, in which the second surgical system is a source of dataflows to all of the plurality of surgical systems. The dataflows to the first surgical system and the dataflows from the second surgical system include data collected in relation to and in real time with the performance of the surgical procedure and include at least one of patient data, surgical procedure data, and surgical instrument data.


The surgical data management system can have any number of variations. For example, a surgical hub configured to be communicatively coupled with the plurality of surgical systems during the performance of the surgical procedure can include the processor such that the surgical hub is configured to cause the change for the first surgical system and to cause the change for the second surgical system. Further, in response to the surgical hub being communicatively coupled with a one of the plurality of surgical systems, the surgical hub can be configured to at least one of: add the one of the plurality of surgical systems to the first selected subset of the plurality of surgical systems, and add the one of the plurality of surgical systems to the second selected subset of the plurality of surgical systems. Further, the surgical hub can be configured to add the one of the plurality of surgical systems to the first selected subset of the plurality of surgical systems in response to detection that the one of the plurality of surgical systems is pre-configured for use with the first surgical system, and the surgical hub can be configured to add the one of the plurality of surgical systems to the second selected subset of the plurality of surgical systems in response to detection that the one of the plurality of surgical systems is pre-configured for use with the second surgical system.


For another example, the processor can include a first processor of the first surgical system and a second processor of the second surgical system such that the first surgical system is configured to cause the change for the first surgical system and the second surgical system is configured to cause the change for the second surgical system.


For example, the first surgical system can be configured to change from the individualized listening state to the globalized listening state in response to the first surgical system detecting occurrence of an anomalous event during the performance of the surgical procedure. Further, the first surgical system can be configured to change from the globalized listening state to the individualized listening state in response to resolution of the anomalous event.


For another example, the first surgical system can be configured to transmit information to all of the plurality of surgical systems indicative of dataflows to be broadcast from the first surgical system, and the second surgical system can be configured to determine, based on the information received from the first surgical system, whether to include the first surgical system in the second selected subset of the plurality of surgical systems.


For yet another example, the second surgical system can be configured to change from the individualized broadcast state to the globalized broadcast state in response to the second surgical system having an emergency broadcast to transmit to all of the plurality of surgical systems, and, in the globalized broadcast state, the second surgical system can be configured to transmit the emergency broadcast to all of the plurality of surgical systems. Further, the second surgical system can be configured to change from the globalized broadcast state to the individualized broadcast state after the transmission of the emergency broadcast; receipt of the emergency broadcast at the plurality of surgical systems can be configured to cause each of the plurality of surgical systems that includes a display to show an alert on the display; the emergency broadcast can be configured to inform the plurality of surgical systems of a step reached in the surgical procedure being performed, and the emergency broadcast can be configured to inform at least one of the plurality of surgical systems of a setting needed at the at least one of the plurality of surgical systems during the performance of the step of the surgical procedure; and/or the first surgical system can be configured to determine, based on the emergency broadcast, whether to include the second surgical system in the first selected subset of the plurality of surgical systems.


For still another example, in the individualized broadcast state, the second surgical system can use discrete addressing, and, in the globalized broadcast state, the second surgical system can use multiple distribution addressing.


For another example, the second surgical system can be configured to transmit information to all of the plurality of surgical systems indicative of dataflows to be broadcast from the second surgical system, and the first surgical system can be configured to determine, based on the information received from the second surgical system, whether to include the second surgical system in the first selected subset of the plurality of surgical systems.


For yet another example, the plurality of surgical systems can include a first display configured to display surgical information during the performance of the surgical procedure on the patient, the first display can be included in the second selected subset of the plurality of surgical systems, and, in response to a second display being added to the plurality of surgical systems configured to all be in use during the performance of the surgical procedure on the patient, the second surgical system can be configured to add the second display to the second selected subset of the plurality of surgical systems.


For another example, the second surgical system can be configured to broadcast to all of the plurality of surgical systems a notice indicative of an upcoming change of the second surgical system from the individualized broadcast state to the globalized broadcast state.


For yet another example, the first surgical system can be configured to cause the change for the first surgical system, and the second surgical system can be configured to cause the change for the second surgical system.


For still another example, each of the plurality of surgical systems can be one of a hospital network, a database, a surgical instrument, or a surgical cart, e.g., the first and second surgical systems both being surgical instruments, one being a surgical instrument and the other being a database, etc.


In another embodiment, a computer-implemented method is provided that includes changing, during performance of a surgical procedure on a patient, a first surgical system between an individualized listening state, in which the first surgical system is a destination of dataflows from a first selected subset of a plurality of surgical systems, and a globalized listening state, in which the first surgical system is a destination of dataflows from all of the plurality of surgical systems. The plurality of surgical systems includes the first surgical system, a second surgical system, and at least one additional surgical system. The plurality of surgical systems are all in use during the performance of the surgical procedure. The method also includes changing, during the performance of the surgical procedure, the second surgical system between an individualized broadcast state, in which the second surgical system is a source of dataflows to a second selected subset of the plurality of surgical systems, and a globalized broadcast state, in which the second surgical system is a source of dataflows to all of the plurality of surgical systems. The dataflows to the first surgical system and the dataflows from the second surgical system include data collected in relation to and in real time with the performance of the surgical procedure and include at least one of patient data, surgical procedure data, and surgical instrument data.


The method can vary in any number of ways. For example, a surgical hub communicatively coupled with the plurality of surgical systems during the performance of the surgical procedure can include a processor such that the surgical hub causes the change for the first surgical system and causes the change for the second surgical system. Further, in response to the surgical hub being communicatively coupled with a one of the plurality of surgical systems, the surgical hub can be configured to at least one of: add the one of the plurality of surgical systems to the first selected subset of the plurality of surgical systems, and add the one of the plurality of surgical systems to the second selected subset of the plurality of surgical systems. Further, the surgical hub can be configured to add the one of the plurality of surgical systems to the first selected subset of the plurality of surgical systems in response to detection that the one of the plurality of surgical systems is pre-configured for use with the first surgical system, and the surgical hub can be configured to add the one of the plurality of surgical systems to the second selected subset of the plurality of surgical systems in response to detection that the one of the plurality of surgical systems is pre-configured for use with the second surgical system.


For another example, the first surgical system can cause the change for the first surgical system, and the second surgical system can cause the change for the second surgical system.


For example, the first surgical system can be configured to change from the individualized listening state to the globalized listening state in response to the first surgical system detecting occurrence of an anomalous event during the performance of the surgical procedure. Further, the first surgical system can be configured to change from the globalized listening state to the individualized listening state in response to resolution of the anomalous event.


For another example, the first surgical system can be configured to transmit information to all of the plurality of surgical systems indicative of dataflows to be broadcast from the first surgical system, and the second surgical system can be configured to determine, based on the information received from the first surgical system, whether to include the first surgical system in the second selected subset of the plurality of surgical systems.


For yet another example, the second surgical system can be configured to change from the individualized broadcast state to the globalized broadcast state in response to the second surgical system having an emergency broadcast to transmit to all of the plurality of surgical systems, and, in the globalized broadcast state, the second surgical system can be configured to transmit the emergency broadcast to all of the plurality of surgical systems. Further, the second surgical system can be configured to change from the globalized broadcast state to the individualized broadcast state after the transmission of the emergency broadcast; receipt of the emergency broadcast at the plurality of surgical systems can be configured to cause each of the plurality of surgical systems that includes a display to show an alert on the display; the emergency broadcast can be configured to inform the plurality of surgical systems of a step reached in the surgical procedure being performed, and the emergency broadcast can be configured to inform at least one of the plurality of surgical systems of a setting needed at the at least one of the plurality of surgical systems during the performance of the step of the surgical procedure; and/or the first surgical system can be configured to determine, based on the emergency broadcast, whether to include the second surgical system in the first selected subset of the plurality of surgical systems.


For still another example, in the individualized broadcast state, the second surgical system can use discrete addressing, and, in the globalized broadcast state, the second surgical system can use multiple distribution addressing.


For another example, the second surgical system can be configured to transmit information to all of the plurality of surgical systems indicative of dataflows to be broadcast from the second surgical system, and the first surgical system can be configured to determine, based on the information received from the second surgical system, whether to include the second surgical system in the first selected subset of the plurality of surgical systems.


For yet another example, the plurality of surgical systems can include a first display configured to display surgical information during the performance of the surgical procedure on the patient, the first display can be included in the second selected subset of the plurality of surgical systems, and, in response to a second display being added to the plurality of surgical systems configured to all be in use during the performance of the surgical procedure on the patient, the second surgical system can be configured to add the second display to the second selected subset of the plurality of surgical systems.


For another example, the second surgical system can be configured to broadcast to all of the plurality of surgical systems a notice indicative of an upcoming change of the second surgical system from the individualized broadcast state to the globalized broadcast state.


For yet another example, the first surgical system can be configured to cause the change for the first surgical system, and the second surgical system can be configured to cause the change for the second surgical system.


For still another example, each of the plurality of surgical systems can be one of a hospital network, a database, a surgical instrument, or a surgical cart, e.g., the first and second surgical systems both being surgical instruments, one being a surgical instrument and the other being a database, etc.





BRIEF DESCRIPTION OF DRAWINGS

The present invention is described by way of reference to the accompanying figures which are as follows:



FIG. 1 is a schematic view of one embodiment of a computer-implemented surgical system;



FIG. 2 is a perspective view of one embodiment of a surgical system in one embodiment of a surgical operating room;



FIG. 3 is a schematic view of one embodiment of a surgical hub paired with various systems;



FIG. 4 is a schematic view of one embodiment of a situationally aware surgical system;



FIG. 5 is a perspective view of one embodiment of a surgical instrument and one embodiment of a surgical system that includes the surgical instrument;



FIG. 6A is a schematic view of a data pipeline architecture;



FIG. 6B is an expanded schematic view of the data pipeline architecture of FIG. 6A;



FIG. 7A is a schematic view of one embodiment of a data pipe configuration;



FIG. 7B is a schematic view of another embodiment of a data pipe configuration;



FIG. 7C is a schematic view of one embodiment of transformations for the data pipe configuration of FIG. 7A;



FIG. 7D is a schematic view of one embodiment of transformations for the data pipe configuration of FIG. 7B;



FIG. 7E is a schematic view of another embodiment of transformations for the data pipe configuration of FIG. 7B;



FIG. 8A is a schematic view of one embodiment of single star networks;



FIG. 8B is a schematic view of one embodiment of merged star networks with broadcasting;



FIG. 9A is a schematic view of one embodiment of allocation of partial messages to multiple streams within a broadcast;



FIG. 9B is a schematic view of another embodiment of allocation of partial messages to multiple streams within a broadcast; and



FIG. 10 is a flowchart of one embodiment of a method of managing data pipelines with broadcast versus peer-to-peer communication.





It is noted that the drawings are not necessarily to scale. The drawings are intended to depict only typical aspects of the subject matter disclosed herein, and therefore should not be considered as limiting the scope of the disclosure.


DETAILED DESCRIPTION

Certain exemplary embodiments will now be described to provide an overall understanding of the principles of the structure, function, manufacture, and use of the devices, systems, and methods disclosed herein. One or more examples of these embodiments are illustrated in the accompanying drawings. A person skilled in the art will understand that the devices, systems, and methods specifically described herein and illustrated in the accompanying drawings are non-limiting exemplary embodiments and that the scope of the present invention is defined solely by the claims. The features illustrated or described in connection with one exemplary embodiment may be combined with the features of other embodiments. Such modifications and variations are intended to be included within the scope of the present invention.


Further, in the present disclosure, like-named components of the embodiments generally have similar features, and thus within a particular embodiment each feature of each like-named component is not necessarily fully elaborated upon. Additionally, to the extent that linear or circular dimensions are used in the description of the disclosed systems, devices, and methods, such dimensions are not intended to limit the types of shapes that can be used in conjunction with such systems, devices, and methods. A person skilled in the art will recognize that an equivalent to such linear and circular dimensions can easily be determined for any geometric shape. A person skilled in the art will appreciate that a dimension may not be a precise value but nevertheless be considered to be at about that value due to any number of factors such as manufacturing tolerances and sensitivity of measurement equipment. Sizes and shapes of the systems and devices, and the components thereof, can depend at least on the size and shape of components with which the systems and devices will be used.


In general, a health data management system may include an interactive smart system that includes data origination facets, movement, architecture and management, and transformation and lifecycle to determine mechanisms by which smart systems talk to each other. The health data management system may include a data stack that defines handling of data from beginning to end. A data stack may include data sources, data pipelines, data transformation/modeling systems, and data storage systems that define end-to-end handling of data. In one embodiment, the health data management system may include a plurality of smart medical systems that are configured to perform one or more medical operations. The health data management system may utilize the data stack to control and manage data flow to the different smart device systems. In one embodiment, the health data management system may control and manage the data flow for managing a patient or performing a medical procedure, for example, providing surgical assistance during performance of a surgical procedure by one or more smart medical systems (also referred to herein as “surgical systems”).
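The end-to-end data stack described above (data sources, data pipelines, transformation/modeling, storage) might be sketched in Python as a simple staged flow; the stage signatures and the trivial example stages are assumptions, not the disclosed architecture:

```python
# Hedged sketch of a data stack: sources -> pipeline -> transform -> storage.
from typing import Callable, Iterable


def run_data_stack(
    sources: Iterable[Callable[[], dict]],   # data origination
    pipeline: Callable[[dict], dict],        # movement/management
    transform: Callable[[dict], dict],       # transformation/modeling
    store: Callable[[dict], None],           # storage
) -> None:
    for source in sources:
        record = source()            # originate a record
        record = pipeline(record)    # route/validate in the pipeline
        record = transform(record)   # model or reshape for consumers
        store(record)                # persist for downstream smart systems


# Example wiring with trivial stages:
run_data_stack(
    sources=[lambda: {"bp": 120}],
    pipeline=lambda r: {**r, "validated": True},
    transform=lambda r: {**r, "bp_mmHg": r["bp"]},
    store=lambda r: print("stored:", r),
)
```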


Surgical Systems


FIG. 1 shows one embodiment of a computer-implemented surgical system 100. The surgical system 100 may include one or more surgical systems (e.g., surgical sub-systems) 102, 103, 104. As in this illustrated embodiment, the surgical system 100 may include first, second, and third surgical systems 102, 103, 104, but may instead include another number, e.g., one, two, four, etc.


The first surgical system 102 is discussed herein as a general representative of the surgical systems 102, 103, 104. For example, the surgical system 102 may include a computer-implemented interactive surgical system. For example, the surgical system 102 may include a surgical hub 106 and/or a computing device 116 in communication with a cloud computing system 108, for example, as described in FIG. 2.


The cloud computing system 108 may include at least one remote cloud server 109 and at least one remote cloud storage unit 110. Embodiments of surgical systems 102, 103, 104 may include one or more wearable sensing systems 111, one or more environmental sensing systems 115, one or more robotic systems (also referred to herein as “robotic surgical systems”) 113, one or more intelligent instruments 114 (e.g., smart surgical instruments), one or more human interface systems 112, etc. A “human interface system” is also referred to herein as a “human interface device.” The wearable sensing system(s) 111 may include one or more HCP (“health care professional” or “health care personnel”) sensing systems and/or one or more patient sensing systems. The environmental sensing system(s) 115 may include one or more devices used for measuring one or more environmental attributes, for example, as further described in FIG. 2. The robotic system(s) 113 may include a plurality of devices used for performing a surgical procedure, for example, as further described in FIG. 2.


Examples of robotic surgical systems include the Ottava™ robotic-assisted surgery system (Johnson & Johnson of New Brunswick, NJ), da Vinci® surgical systems (Intuitive Surgical, Inc. of Sunnyvale, CA), the Hugo™ robotic-assisted surgery system (Medtronic PLC of Minneapolis, MN), the Versius® surgical robotic system (CMR Surgical Ltd of Cambridge, UK), and the Monarch® platform (Auris Health, Inc. of Redwood City, CA). Embodiments of various robotic surgical systems and using robotic surgical systems are further described in, for example, U.S. Pat. App. Pub. No. 2018/0177556 entitled “Flexible Instrument Insertion Using An Adaptive Force Threshold” filed Dec. 28, 2016, U.S. Pat. App. Pub. No. 2020/0000530 entitled “Systems And Techniques For Providing Multiple Perspectives During Medical Procedures” filed Apr. 16, 2019, U.S. Pat. App. Pub. No. 2020/0170720 entitled “Image-Based Branch Detection And Mapping For Navigation” filed Feb. 7, 2020, U.S. Pat. App. Pub. No. 2020/0188043 entitled “Surgical Robotics System” filed Dec. 9, 2019, U.S. Pat. App. Pub. No. 2020/0085516 entitled “Systems And Methods For Concomitant Medical Procedures” filed Sep. 3, 2019, U.S. Pat. No. 8,831,782 entitled “Patient-Side Surgeon Interface For A Teleoperated Surgical Instrument” filed Jul. 15, 2013, and Intl. Pat. Pub. No. WO 2014151621 entitled “Hyperdexterous Surgical System” filed Mar. 13, 2014, which are hereby incorporated by reference in their entireties.


The surgical system 102 may be in communication with the one or more remote servers 109 that may be part of the cloud computing system 108. In an example embodiment, the surgical system 102 may be in communication with the one or more remote servers 109 via an internet service provider's cable/FIOS networking node. In an example embodiment, a patient sensing system may be in direct communication with the one or more remote servers 109. The surgical system 102 (and/or various sub-systems, smart surgical instruments, robots, sensing systems, and other computerized devices described herein) may collect data in real-time and transfer the data to the cloud computing system 108 for data processing and manipulation, e.g., by the one or more remote servers 109. It will be appreciated that cloud computing may rely on sharing computing resources rather than having local servers or personal devices to handle software applications.


The surgical system 102 and/or a component therein may communicate with the one or more remote servers 109 via a cellular transmission/reception point (TRP) or a base station using one or more of the following cellular protocols: GSM/GPRS/EDGE (2G), UMTS/HSPA (3G), long term evolution (LTE) or 4G, LTE-Advanced (LTE-A), new radio (NR) or 5G, and/or other wired or wireless communication protocols. Various embodiments of cloud-based analytics that may be performed by the cloud computing system 108 are described further in, for example, U.S. Pat. App. Pub. No. 2019/0206569 entitled “Method Of Cloud Based Data Analytics For Use With The Hub” published Jul. 4, 2019, which is hereby incorporated by reference in its entirety.


The surgical hub 106 may have cooperative interactions with one or more means of displaying an image, e.g., a display configured to display an image from a laparoscopic scope, etc., and information from one or more other smart devices and/or one or more sensing systems. The surgical hub 106 may interact with the one or more sensing systems, the one or more smart devices, and the one or more means of displaying an image. The surgical hub 106 may be configured to gather measurement data from the sensing system(s) and send notifications or control messages to the sensing system(s). The surgical hub 106 may send and/or receive information, including notification information, to and/or from the one or more human interface systems 112. The one or more human interface systems 112 may include one or more human interface devices (HIDs). The surgical hub 106 may send audio, display, and/or control information to various devices that are in communication with the surgical hub 106.


In an exemplary embodiment, the one or more sensing systems may include the one or more wearable sensing systems 111 (which may include one or more HCP sensing systems and/or one or more patient sensing systems) and/or the environmental sensing system(s) 115 shown in FIG. 1. The sensing system(s) may measure data relating to various biomarkers.


In an exemplary embodiment, the sensing system(s) may measure the biomarkers using one or more sensors, for example, photosensors (e.g., photodiodes, photoresistors), mechanical sensors (e.g., motion sensors), acoustic sensors, electrical sensors, electrochemical sensors, thermoelectric sensors, infrared sensors, etc. The sensor(s) may measure the biomarkers using one or more of the following sensing technologies: photoplethysmography, electrocardiography, electroencephalography, colorimetry, impedimetry, potentiometry, amperometry, etc.


The biomarkers measured by the sensing system(s) may include, but are not limited to, sleep, core body temperature, maximal oxygen consumption, physical activity, alcohol consumption, respiration rate, oxygen saturation, blood pressure, blood sugar, heart rate variability, blood potential of hydrogen, hydration state, heart rate, skin conductance, peripheral temperature, tissue perfusion pressure, coughing and sneezing, gastrointestinal motility, gastrointestinal tract imaging, respiratory tract bacteria, edema, mental aspects, sweat, circulating tumor cells, autonomic tone, circadian rhythm, and/or menstrual cycle.


The biomarkers may relate to physiologic systems, which may include, but are not limited to, behavior and psychology, cardiovascular system, renal system, skin system, nervous system, gastrointestinal system, respiratory system, endocrine system, immune system, tumor, musculoskeletal system, and/or reproductive system. Information from the biomarkers may be determined and/or used by the computer-implemented surgical system 100, for example, to improve the system and/or to improve patient outcomes.


The sensing system(s) may send data to the surgical hub 106. The sensing system(s) may use one or more of the following radiofrequency (RF) protocols for communicating with the surgical hub 106: Bluetooth, Bluetooth Low-Energy (BLE), Bluetooth Smart, Zigbee, Z-wave, IPv6 Low-power wireless Personal Area Network (6LoWPAN), Wi-Fi, etc.


Various embodiments of sensing systems, biomarkers, and physiological systems are described further in, for example, U.S. Pat. App. Pub. No. 2022/0233119 entitled “Method Of Adjusting A Surgical Parameter Based On Biomarker Measurements” published Jul. 28, 2022, which is hereby incorporated by reference in its entirety.


The sensing systems described herein may be employed to assess physiological conditions of a surgeon operating on a patient, a patient being prepared for a surgical procedure, or a patient recovering after a surgical procedure. The cloud-based computing system 108 may be used to monitor biomarkers associated with an HCP (a surgeon, a nurse, etc.) or a patient in real time, to generate surgical plans based at least on measurement data gathered prior to a surgical procedure, to provide control signals to one or more surgical instruments during a surgical procedure, and to notify a patient of a complication during a post-surgical period.


The cloud-based computing system 108 may be used to analyze surgical data. Surgical data may be obtained via the intelligent instrument(s) 114, the wearable sensing system(s) 111, the environmental sensing system(s) 115, the robotic system(s) 113, and/or the like in the surgical system 102. Surgical data may include tissue states used to assess leaks or perfusion of sealed tissue after a tissue sealing and cutting procedure, pathology data including images of samples of body tissue, anatomical structures of the body captured using a variety of sensors integrated with imaging devices and techniques (such as overlaying images captured by multiple imaging devices), image data, and/or the like. The surgical data may be analyzed to improve surgical procedure outcomes, for example by determining whether further treatment, such as endoscopic intervention, emerging technologies, targeted radiation, targeted intervention, or precise robotics applied to tissue-specific sites and conditions, is warranted. Such data analysis may employ outcome analytics processing and, using standardized approaches, may provide beneficial feedback to either confirm surgical treatments and the behavior of the surgeon or suggest modifications to surgical treatments and the behavior of the surgeon.



FIG. 2 shows one embodiment of the surgical system 102 in one embodiment of a surgical operating room 135. As illustrated in FIG. 2, a patient is being operated on by one or more HCPs. The HCP(s) are being monitored by one or more HCP sensing systems 120 worn by the HCP(s). The HCP(s) and the environment surrounding the HCP(s) may also be monitored by one or more environmental sensing systems including, for example, one or more cameras 121, one or more microphones 122, and other sensors that may be deployed in the operating room. The one or more HCP sensing systems 120 and the environmental sensing systems may be in communication with the surgical hub 106, which in turn may be in communication with the one or more cloud servers 109 of the cloud computing system 108, as shown in FIG. 1. The one or more environmental sensing systems may be used for measuring one or more environmental attributes, for example, HCP position in the surgical theater, HCP movements, ambient noise in the surgical theater, temperature/humidity in the surgical theater, etc.


As illustrated in FIG. 2, a primary display 123 and one or more audio output devices (e.g., speakers 119) are positioned in a sterile field of the surgical system 102 to be visible to an operator at an operating table 124. In addition, a visualization/notification tower 126 is positioned outside the sterile field. The visualization/notification tower 126 may include a first non-sterile human interactive device (HID) 127 and a second non-sterile HID 129, which may be displays and may face away from each other. The display 123 and the HIDs 127, 129 may include a touch screen allowing a human to interface directly with the HID 127, 129. A human interface system, guided by the surgical hub 106, may be configured to utilize the display 123 and the HIDs 127, 129 to coordinate information flow to operators inside and outside the sterile field. In an exemplary embodiment, the surgical hub 106 may cause an HID (e.g., the primary display 123) to display a notification and/or information about the patient and/or a surgical procedure step. In an exemplary embodiment, the surgical hub 106 may prompt for and/or receive input from personnel in the sterile field or in the non-sterile area. In an exemplary embodiment, the surgical hub 106 may cause one or more non-sterile HIDs 127, 129 to display a snapshot of a surgical site, as recorded by an imaging device 130, while maintaining a live feed of the surgical site on one or more sterile HIDs, e.g., the primary HID 123. The snapshot on the non-sterile HID(s) 127, 129 can permit a non-sterile operator to perform a diagnostic step relevant to the surgical procedure, for example.


The surgical hub 106 may be configured to route a diagnostic input or feedback entered by a non-sterile operator at the visualization tower 126 to the primary display 123 within the sterile field, where it can be viewed by a sterile operator at the operating table. In an exemplary embodiment, the input can be in the form of a modification to the snapshot displayed on the non-sterile HID(s) 127, 129, which can be routed to the one or more sterile HIDs, e.g., the primary display 123, by the surgical hub 106.


Various embodiments of surgical hubs are further described in, for example, U.S. Pat. App. Pub. No. 2024/0221937 entitled “Method For Advanced Algorithm Support” published Jul. 4, 2024, U.S. Pat. App. Pub. No. 2024/0112768 entitled “Method For Health Data And Consent Management” published Apr. 4, 2024, U.S. Pat. App. Pub. No. 2024/0220763 entitled “Data Volume Determination For Surgical Machine Learning Applications” published Jul. 2, 2024, U.S. Pat. App. Pub. No. 2019/0206564 entitled “Method For Facility Data Collection And Interpretation” published Jul. 4, 2019, U.S. Pat. App. Pub. No. 2023/0026634 entitled “Surgical Data System And Classification” published Jan. 26, 2023, U.S. Pat. App. Pub. No. 2019/0201115 entitled “Aggregation And Reporting Of Surgical Hub Data” published Jul. 4, 2019, U.S. Pat. App. Pub. No. 2023/0372030 entitled “Automatic Compilation, Annotation, And Dissemination Of Surgical Data To Systems To Anticipate Related Automated Operations” published Nov. 23, 2023, U.S. Pat. App. Pub. No. 2022/0104896 entitled “Interactive Information Overlay On Multiple Surgical Displays” published Apr. 7, 2022, U.S. Pat. No. 11,304,699 entitled “Method For Adaptive Control Schemes For Surgical Network Control And Interaction” issued Apr. 19, 2022, U.S. Pat. No. 10,849,697 entitled “Cloud Interface For Coupled Surgical Devices” issued Dec. 1, 2020, U.S. Pat. App. Pub. No. 2022/0239577 entitled “Ad Hoc Synchronization Of Data From Multiple Link Coordinated Sensing Systems” published Jul. 28, 2022, U.S. Pat. App. Pub. No. 2023/0025061 entitled “Surgical Data System And Management” published Jan. 26, 2023, U.S. Pat. App. Pub. No. 2023/0023083 entitled “Method Of Surgical System Power Management, Communication, Processing, Storage And Display” published Jan. 26, 2023, U.S. Pat. App. Pub. No. 2019/0206556 entitled “Real-Time Analysis Of Comprehensive Cost Of All Instrumentation Used In Surgery Utilizing Data Fluidity To Track Instruments Through Stocking And In-House Processes” published Jul. 4, 2019, U.S. Pat. App. Pub. No. 2023/0096268 entitled “Methods for Controlling Cooperative Surgical Instruments” filed Oct. 5, 2021, U.S. Pat. App. Pub. No. 2019/0200844 entitled “Method Of Hub Communication, Processing, Storage And Display” filed Dec. 4, 2018, U.S. Pat. App. Pub. No. 2019/0200981 entitled “Method Of Compressing Tissue Within A Stapling Device And Simultaneously Displaying The Location Of The Tissue Within The Jaws” filed Dec. 4, 2018, U.S. Pat. App. Pub. No. 2019/0201046 entitled “Method For Controlling Smart Energy Devices” filed Dec. 4, 2018, U.S. Pat. App. Pub. No. 2019/0201114 entitled “Adaptive Control Program Updates For Surgical Hubs” filed Mar. 29, 2018, U.S. Pat. App. Pub. No. 2019/0201140 entitled “Surgical Hub Situational Awareness” filed Mar. 29, 2018, U.S. Pat. App. Pub. No. 2019/0206004 entitled “Interactive Surgical Systems With Condition Handling Of Devices And Data Capabilities” filed Mar. 29, 2018, U.S. Pat. App. Pub. No. 2019/0206555 entitled “Cloud-based Medical Analytics For Customization And Recommendations To A User” filed Mar. 29, 2018, U.S. Pat. No. 11,678,881 entitled “Spatial Awareness Of Surgical Hubs In Operating Rooms” filed Mar. 29, 2018, and U.S. Pat. App. Pub. No. 2019/0207857 entitled “Surgical Network Determination Of Prioritization Of Communication, Interaction, Or Processing Based On System Or Device Needs” filed Nov. 6, 2018, which are hereby incorporated by reference in their entireties.


As in the illustrated embodiment of FIG. 2, one or more surgical instruments 131 may be in use in the surgical procedure as part of the surgical system 102. The surgical hub 106 may be configured to coordinate information flow to at least one display showing the surgical instrument(s) 131. A diagnostic input or feedback entered by a non-sterile operator at the visualization tower 126 can be routed by the surgical hub 106 to the at least one display, e.g., the primary display 123, within the sterile field, where it can be viewed by the operator of the surgical instrument(s) 131.


Various embodiments of coordinating information flow and display and various embodiments of surgical instruments are described further in, for example, U.S. Pat. No. 11,937,769 entitled “Method Of Hub Communication, Processing, Storage And Display” issued Mar. 26, 2024, which is hereby incorporated by reference in its entirety.


Examples of surgical instruments include a surgical dissector, a surgical stapler, a surgical grasper, a surgical scope (e.g., an endoscope, a laparoscope, etc.), a surgical energy device (e.g., a monopolar probe, a bi-polar probe, an ablation probe, an ultrasound device, an ultrasonic end effector, etc.), a surgical clip applier, etc.


Various embodiments of surgical instruments are described further in, for example, U.S. Pat. No. 11,723,642 entitled “Cooperative Access Hybrid Procedures” issued Aug. 14, 2023, U.S. Pat. App. Pub. No. 2013/0256377 entitled “Layer Comprising Deployable Attachment Members” filed Feb. 8, 2013, U.S. Pat. No. 8,393,514 entitled “Selectively Orientable Implantable Fastener Cartridge” filed Sep. 30, 2010, U.S. Pat. No. 8,317,070 entitled “Surgical Stapling Devices That Produce Formed Staples Having Different Lengths” filed Feb. 28, 2007, U.S. Pat. No. 7,143,925 entitled “Surgical Instrument Incorporating EAP Blocking Lockout Mechanism” filed Jun. 21, 2005, U.S. Pat. App. Pub. No. 2015/0134077 entitled “Sealing Materials for Use in Surgical Stapling” filed Nov. 8, 2013, U.S. Pat. App. Pub. No. 2015/0134076 entitled “Hybrid Adjunct Materials for Use in Surgical Stapling” filed Nov. 8, 2013, U.S. Pat. App. Pub. No. 2015/0133996 entitled “Positively Charged Implantable Materials and Method of Forming the Same” filed Nov. 8, 2013, U.S. Pat. App. Pub. No. 2015/0129634 entitled “Tissue Ingrowth Materials and Method of Using the Same” filed Nov. 8, 2013, U.S. Pat. App. Pub. No. 2015/0133995 entitled “Hybrid Adjunct Materials for Use in Surgical Stapling” filed Nov. 8, 2013, U.S. Pat. No. 9,913,642 entitled “Surgical Instrument Comprising a Sensor System” filed Mar. 26, 2014, U.S. Pat. No. 10,172,611 entitled “Adjunct Materials and Methods of Using Same in Surgical Methods for Tissue Sealing” filed Jun. 10, 2014, U.S. Pat. No. 8,989,903 entitled “Methods And Systems For Indicating A Clamping Prediction” filed Jan. 13, 2012, U.S. Pat. No. 9,072,535 entitled “Surgical Stapling Instruments With Rotatable Staple Deployment Arrangements” filed May 27, 2011, U.S. Pat. No. 9,072,536 entitled “Differential Locking Arrangements For Rotary Powered Surgical Instruments” filed Jun. 28, 2012, U.S. Pat. No. 10,531,929 entitled “Control Of Robotic Arm Motion Based On Sensed Load On Cutting Tool” filed Aug. 16, 2016, U.S. Pat. No. 10,709,516 entitled “Curved Cannula Surgical System Control” filed Apr. 2, 2018, U.S. Pat. No. 11,076,926 entitled “Manual Release For Medical Device Drive System” filed Mar. 21, 2018, U.S. Pat. No. 9,839,487 entitled “Method For Engaging Surgical Instrument With Teleoperated Actuator” filed Mar. 17, 2015, U.S. Pat. No. 10,543,051 entitled “Method For Engaging Surgical Instrument With Teleoperated Actuator” issued Jan. 28, 2020, U.S. Pat. No. 9,804,618 entitled “Systems And Methods For Controlling A Segmented Circuit” filed Mar. 25, 2014, U.S. Pat. No. 11,607,239 entitled “Systems And Methods For Controlling A Surgical Stapling And Cutting Instrument” filed Apr. 15, 2016, U.S. Pat. No. 10,052,044 entitled “Time Dependent Evaluation Of Sensor Data To Determine Stability, Creep, And Viscoelastic Elements Of Measures” filed Mar. 6, 2015, U.S. Pat. No. 9,439,649 entitled “Surgical Instrument Having Force Feedback Capabilities” filed Dec. 12, 2012, U.S. Pat. No. 10,751,117 entitled “Electrosurgical Instrument With Fluid Diverter” filed Sep. 23, 2016, U.S. Pat. No. 11,160,602 entitled “Control Of Surgical Field Irrigation” filed Aug. 29, 2017, U.S. Pat. No. 9,877,783 entitled “Energy Delivery Systems And Uses Thereof” filed Dec. 30, 2016, U.S. Pat. No. 11,266,458 entitled “Cryosurgical System With Pressure Regulation” filed Apr. 19, 2019, U.S. Pat. No. 10,314,649 entitled “Flexible Expandable Electrode And Method Of Intraluminal Delivery Of Pulsed Power” filed Aug. 2, 2012, U.S. Pat. App. Pub. No. 
2023/0116781 entitled “Surgical Devices, Systems, And Methods Using Multi-Source Imaging” filed Oct. 5, 2021, U.S. Pat. App. Pub. No. 2023/0102358 entitled “Surgical Devices, Systems, And Methods Using Fiducial Identification And Tracking” filed Oct. 5, 2021, U.S. Pat. No. 10,413,373 entitled “Robotic Visualization And Collision Avoidance” filed Aug. 16, 2016, U.S. Pat. App. Pub. No. 2023/0077141 entitled “Robotically Controlled Uterine Manipulator” filed Sep. 21, 2021, and U.S. Pat. App. Pub. No. 2022/0273309 entitled “Stapler Reload Detection And Identification” filed May 16, 2022, which are hereby incorporated by reference herein in their entireties.


As shown in FIG. 2, the surgical system 102 can be used to perform a surgical procedure on the patient who is lying down on the operating table 124 in the surgical operating room 135. A robotic system 134 may be used in the surgical procedure as a part of the surgical system 102. The robotic system 134 may include a surgeon's console 136, a patient side cart 132 (surgical robot), and a surgical robotic hub 133. The patient side cart 132 can manipulate at least one removably coupled surgical instrument 137 through a minimally invasive incision in the body of the patient while the surgeon views the surgical site via the surgeon's console 136. An image of the surgical site can be obtained by the imaging device 130, which can be manipulated by the patient side cart 132 to orient the imaging device 130. The surgical robotic hub 133 can be used to process the images of the surgical site for subsequent display to the surgeon via the surgeon's console 136.


Various embodiments of robotic systems and various embodiments of surgical instruments are described further in, for example, U.S. Pat. No. 11,559,307 entitled “Method Of Robotic Hub Communication, Detection, And Control” issued Jan. 24, 2023, which is hereby incorporated by reference in its entirety.


The imaging device 130 may include at least one image sensor and one or more optical components. Suitable image sensors may include, but are not limited to, Charge-Coupled Device (CCD) sensors and Complementary Metal-Oxide Semiconductor (CMOS) sensors.


The optical components of the imaging device 130 may include one or more illumination sources and/or one or more lenses. The one or more illumination sources may be directed to illuminate portions of the surgical field. The one or more image sensors may receive light reflected or refracted from the surgical field, including light reflected or refracted from tissue and/or surgical instruments. The illumination source(s) may be configured to radiate electromagnetic energy in the visible spectrum as well as the invisible spectrum. The visible spectrum, sometimes referred to as the “optical spectrum” or the “luminous spectrum,” is the portion of the electromagnetic spectrum that is visible to (e.g., can be detected by) the human eye and may be referred to as “visible light” or simply “light.” A typical human eye will respond to wavelengths in air that range from about 380 nm to about 750 nm.


The invisible spectrum (e.g., the non-luminous spectrum) is the portion of the electromagnetic spectrum that lies below and above the visible spectrum (i.e., wavelengths below about 380 nm and above about 750 nm). The invisible spectrum is not detectable by the human eye. Wavelengths greater than about 750 nm are longer than the red visible spectrum, and they become invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths less than about 380 nm are shorter than the violet spectrum, and they become invisible ultraviolet, x-ray, and gamma ray electromagnetic radiation.


In various aspects, the imaging device 130 is configured for use in a minimally invasive procedure. Examples of imaging devices include an arthroscope, angioscope, bronchoscope, choledochoscope, colonoscope, cystoscope, duodenoscope, enteroscope, esophagogastro-duodenoscope (gastroscope), endoscope, laryngoscope, nasopharyngoscope, nephroscope, sigmoidoscope, thoracoscope, and ureteroscope.


The imaging device 130 may employ multi-spectrum monitoring to discriminate topography and underlying structures. A multi-spectral image is one that captures image data within specific wavelength ranges across the electromagnetic spectrum. The wavelengths may be separated by filters or by the use of instruments that are sensitive to particular wavelengths, including light from frequencies beyond the visible light range, e.g., infrared (IR) and ultraviolet (UV). Spectral imaging can allow extraction of additional information that the human eye fails to capture with its receptors for red, green, and blue. Multi-spectrum monitoring can be a useful tool in relocating a surgical field after a surgical task is completed to perform one or more of the previously described tests on the treated tissue.

It is axiomatic that strict sterilization of the operating room and surgical equipment is required during any surgery. The strict hygiene and sterilization conditions required in a “surgical theater,” e.g., an operating or treatment room, necessitate the highest possible sterility of all medical devices and equipment. Part of that sterilization process is the need to sterilize anything that comes in contact with the patient or penetrates the sterile field, including the imaging device 130 and its attachments and components. It will be appreciated that the sterile field may be considered a specified area, such as within a tray or on a sterile towel, that is considered free of microorganisms, or the sterile field may be considered an area immediately around a patient who has been prepared for a surgical procedure. The sterile field may include the scrubbed team members, who are properly attired, and all furniture and fixtures in the area.


Various embodiments of multi-spectral imaging are described further in, for example, U.S. Pat. No. 11,937,769 entitled “Method Of Hub Communication, Processing, Storage And Display” filed Dec. 4, 2018, which is hereby incorporated by reference in its entirety.


The wearable sensing system(s) 111 illustrated in FIG. 1 may include the one or more HCP sensing systems 120 as shown in FIG. 2. The one or more HCP sensing systems 120 may include sensing system(s) to monitor and detect a set of physical states and/or a set of physiological states of an HCP. An HCP may be a surgeon or one or more healthcare personnel assisting the surgeon or other healthcare service providers in general. The HCP sensing system 120 may send the measurement data associated with a set of biomarkers and data associated with a physical state of the surgeon to the surgical hub 106 for further processing. In an exemplary embodiment, an HCP sensing system 120 may measure a set of biomarkers to monitor the heart rate of an HCP. In an exemplary embodiment, an HCP sensing system 120 worn on a surgeon's wrist (e.g., a watch or a wristband) may use an accelerometer to detect hand motion and/or shakes and determine the magnitude and frequency of tremors.


The environmental sensing system(s) 115 shown in FIG. 1 may send environmental information to the surgical hub 106. In an exemplary embodiment, the environmental sensing system(s) 115 may include a camera 121 for detecting hand/body position of an HCP. The environmental sensing system(s) 115 may include one or more microphones 122 for measuring ambient noise in the surgical theater. Other environmental sensing system(s) 115 may include one or more devices, for example, a thermometer to measure temperature, a hygrometer to measure humidity of the surroundings in the surgical theater, etc. The surgeon biomarkers may include one or more of the following: stress, heart rate, etc. The environmental measurements from the surgical theater may include ambient noise level associated with the surgeon or the patient, surgeon and/or staff movements, surgeon and/or staff attention level, etc. The surgical hub 106, alone or in communication with the cloud computing system 108, may use the surgeon biomarker measurement data and/or environmental sensing information to modify control algorithms of hand-held instruments or the averaging delay of a robotic interface, for example, to minimize tremors.


The surgical hub 106 may use the surgeon biomarker measurement data associated with an HCP to adaptively control the one or more surgical instruments 131. For example, the surgical hub 106 may send a control program to one of the one or more surgical instruments 131 to control the surgical instrument's actuators to limit or compensate for fatigue and use of fine motor skills. The surgical hub 106 may send the control program based on situational awareness and/or the context on importance or criticality of a task. The control program may instruct the instrument to alter operation to provide more control when control is needed.
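

The averaging-delay concept can be illustrated with a short sketch. The following Python code is illustrative only and is not part of any described embodiment; the class name, scaling factor, and window limits are assumptions chosen for the example. It shows one plausible way a moving-average window could grow with a measured tremor magnitude to smooth a clinician's motion input.

```python
from collections import deque

class TremorCompensator:
    """Illustrative sketch: smooth motion input with a moving average
    whose window (the "averaging delay") grows with tremor magnitude."""

    def __init__(self, base_window=2, max_window=16):
        self.base_window = base_window
        self.max_window = max_window
        self.samples = deque(maxlen=max_window)
        self.window = base_window

    def set_tremor_magnitude(self, magnitude_mm):
        # Larger measured tremor -> longer averaging delay (more smoothing).
        # The scaling factor of 4 is an assumption for this sketch.
        scaled = self.base_window + int(magnitude_mm * 4)
        self.window = min(max(scaled, self.base_window), self.max_window)

    def filter(self, raw_position):
        self.samples.append(raw_position)
        recent = list(self.samples)[-self.window:]
        return sum(recent) / len(recent)

# Usage: a wrist-worn sensing system reports tremor magnitude; the hub
# adjusts the averaging delay applied to the instrument interface.
compensator = TremorCompensator()
compensator.set_tremor_magnitude(1.5)   # e.g., millimeters of wrist tremor
smoothed = compensator.filter(10.2)     # smoothed hand-motion sample
```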



FIG. 3 shows an embodiment of the surgical system 102 including the surgical hub 106. The surgical hub 106 may be paired with, via a modular control, the one or more wearable sensing systems 111, the one or more environmental sensing systems 115, the one or more human interface systems 112, the one or more robotic systems 113, and the one or more intelligent instruments 114. As in this illustrated embodiment, the surgical hub 106 may include a display 148, an imaging module 149, a generator module 150 (e.g., an energy generator), a communication module 156, a processor module 157, a storage array 158, and an operating-room mapping module 159. In certain aspects, as illustrated in FIG. 3, the surgical hub 106 further includes a smoke evacuation module 154 and/or a suction/irrigation module 155. The various modules and systems may be connected to the modular control either directly via a router or via the communication module 156. The operating theater devices may be coupled to cloud computing resources and data storage, e.g., to the cloud computing system 108, via the modular control. The human interface system(s) 112 may include a display sub-system and a notification sub-system.


The modular control may be coupled to a non-contact sensor module. The non-contact sensor module may measure the dimensions of the operating theater and generate a map of the surgical theater using ultrasonic, laser-type, and/or similar non-contact measurement devices. Other distance sensors can be employed to determine the bounds of an operating room. The sensor module may be configured to determine the size of the operating theater and to adjust Bluetooth-pairing distance limits. For example, a laser-based non-contact sensor module may scan the operating theater by transmitting laser light pulses, receiving laser light pulses that bounce off the perimeter walls of the operating theater, and comparing the phase of the transmitted pulse to the received pulse to determine the size of the operating theater and to adjust the Bluetooth-pairing distance limits accordingly.
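

As a concrete illustration of the phase-comparison ranging described above, the following Python sketch computes wall distances from measured phase shifts and derives a pairing-distance limit. The formula d = c * phase / (4 * pi * f) follows from the round-trip phase delay of a signal modulated at frequency f; the margin rule, function names, and all numeric values are assumptions for the example.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_phase(phase_shift_rad, mod_freq_hz):
    # The round trip adds a phase delay of 4*pi*f*d/c, so
    # d = c * phase / (4 * pi * f), unambiguous within one modulation cycle.
    return C * phase_shift_rad / (4 * math.pi * mod_freq_hz)

def pairing_limit(wall_distances_m, margin=1.1):
    # Assumed rule: cap the pairing range just beyond the farthest wall so
    # devices outside the operating theater are not paired.
    return max(wall_distances_m) * margin

# Example: 10 MHz modulation; measured phase shifts toward four walls.
walls = [distance_from_phase(p, 10e6) for p in (1.2, 0.9, 2.1, 1.7)]
bluetooth_limit_m = pairing_limit(walls)
```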


An ultrasound-based non-contact sensor module may scan the operating theater by transmitting a burst of ultrasound and receiving the echo when it bounces off the perimeter walls of an operating theater as described further in, for example, U.S. Pat. No. 11,857,152 entitled “Surgical Hub Spatial Awareness To Determine Devices In Operating Theater” issued Jan. 2, 2024, U.S. Pat. No. 11,278,281 entitled “Interactive Surgical Platform” issued Mar. 22, 2022, and U.S. Prov. Pat. App. No. 62/611,341 entitled “Interactive Surgical Platform” filed Dec. 28, 2017, which are hereby incorporated by reference herein in their entireties.


During a surgical procedure, energy application to tissue, for sealing and/or cutting, may be associated with smoke evacuation, suction of excess fluid, and/or irrigation of the tissue. Fluid, power, and/or data lines from different sources may be entangled during the surgical procedure. Valuable time can be lost addressing this issue during a surgical procedure. Detangling the lines may necessitate disconnecting the lines from their respective modules, which may require resetting the modules. A hub modular enclosure 160 of the surgical hub 106 may offer a unified environment for managing the power, data, and fluid lines, which reduces the frequency of entanglement between such lines.


Energy may be applied to tissue at a surgical site. The surgical hub 106 may include the hub modular enclosure 160 and a combo generator module slidably receivable in a docking station of the hub modular enclosure 160. The docking station may include data and power contacts. The combo generator module may include two or more of: an ultrasonic energy generator component, a bipolar radiofrequency (RF) energy generator component, or a monopolar RF energy generator component that are housed in a single unit. The combo generator module may include at least one energy delivery cable for connecting the combo generator module to a surgical instrument, at least one smoke evacuation component configured to evacuate smoke, fluid, and/or particulates generated by the application of therapeutic energy to the tissue, and a fluid line extending from the remote surgical site to the smoke evacuation component. The fluid line may be a first fluid line, and a second fluid line may extend from the remote surgical site to a suction and irrigation module 155 slidably received in the hub modular enclosure 160. The hub modular enclosure 160 may include a fluid interface.


The combo generator module may generate multiple energy types for application to the tissue. One energy type may be more beneficial for cutting the tissue, while another different energy type may be more beneficial for sealing the tissue. For example, a bipolar generator can be used to seal the tissue while an ultrasonic generator can be used to cut the sealed tissue. In an exemplary embodiment, the hub modular enclosure 160 may be configured to accommodate different generators and facilitate an interactive communication therebetween. The hub modular enclosure 160 may enable the quick removal and/or replacement of various modules.


The hub modular enclosure 160 may include a first energy-generator module, configured to generate a first energy for application to the tissue, and a first docking station comprising a first docking port that includes first data and power contacts, wherein the first energy-generator module is slidably movable into an electrical engagement with the first power and data contacts and wherein the first energy-generator module is slidably movable out of the electrical engagement with the first power and data contacts. The hub modular enclosure 160 may include a second energy-generator module configured to generate a second energy, different than the first energy, for application to the tissue, and a second docking station comprising a second docking port that includes second data and power contacts, wherein the second energy-generator module is slidably movable into an electrical engagement with the second power and data contacts, and wherein the second energy-generator module is slidably movable out of the electrical engagement with the second power and data contacts. In addition, the hub modular enclosure 160 may include a communication bus between the first docking port and the second docking port, configured to facilitate communication between the first energy-generator module and the second energy-generator module.


As shown in FIG. 3, the hub modular enclosure 160 may allow the modular integration of the generator module 150, the smoke evacuation module 154, and the suction/irrigation module 155. The hub modular enclosure 160 may facilitate interactive communication between the operating-room mapping, smoke evacuation, and suction/irrigation modules 159, 154, 155. The generator module 150 can be configured with integrated monopolar, bipolar, and ultrasonic components supported in a single housing unit that is slidably insertable into the hub modular enclosure 160. The generator module 150 may connect to a monopolar device 151, a bipolar device 152, and an ultrasonic device 153. The generator module 150 may include a series of monopolar, bipolar, and/or ultrasonic generator modules that interact through the hub modular enclosure 160. The hub modular enclosure 160 may facilitate the insertion of multiple generators and interactive communication between the generators docked into the hub modular enclosure 160 so that the generators act as a single generator.


A surgical data network having a set of communication hubs may connect the sensing system(s) and the modular devices, which may be located in one or more operating theaters of a healthcare facility, a patient recovery room, or a room in a healthcare facility specially equipped for surgical operations, to the cloud computing system 108.



FIG. 4 illustrates one embodiment of a situationally aware surgical system 200. Data sources 202 of the situationally aware surgical system 200 may include, for example, modular devices 204, databases 206 (e.g., an electronic medical records (EMR) database, such as of a hospital or other medical facility, containing patient records, etc.), patient monitoring devices 208 (e.g., a blood pressure (BP) monitor, an electrocardiography (EKG) monitor, one or more wearable sensing systems 111, etc.), HCP monitoring devices 210 (e.g., one or more wearable sensing systems 111, etc.), and/or environment monitoring devices 212 (e.g., one or more environmental sensing systems 115, etc.).


The modular devices 204 may include sensors configured to detect parameters associated with a patient, HCPs, the environment, and/or the modular device 204 itself. The modular devices 204 may include the one or more intelligent instrument(s) 114.


The data sources 202 may be in communication (e.g., wirelessly or wired) with a surgical hub 214, such as the surgical hub 106. The surgical hub 214 may derive contextual information pertaining to a surgical procedure from data based upon, for example, the particular combination(s) of data received from the data sources 202 or the particular order in which the data is received from the data sources 202. The contextual information inferred from the received data can include, for example, the type of surgical procedure being performed, the particular step of the surgical procedure that a surgeon (and/or other HCP) is performing, the type of tissue being operated on, or a body cavity that is the subject of the surgical procedure. This ability by some aspects of the surgical hub 214 to derive or infer information related to the surgical procedure from received data can be referred to as "situational awareness." For example, the surgical hub 214 can incorporate a situational awareness system, which may be the hardware and/or programming associated with the surgical hub 214 that derives contextual information pertaining to the surgical procedure from the received data and/or surgical plan information received from the edge computing system 216 or an enterprise cloud server 218, such as the cloud computing system 108. The contextual information derived from the data sources 202 may include, for example, what step of the surgical procedure is being performed, whether and how a particular modular device 204 is being used, and the patient's condition.


The surgical hub 214 may be connected to the databases 206 of the data sources 202 to retrieve therefrom data regarding the surgical procedure that is being performed or is to be performed. The data that may be received by the situational awareness system of the surgical hub 214 from the databases 206 may include, for example, start (or setup) time or operational information regarding the procedure (e.g., a segmentectomy in the upper right portion of the thoracic cavity). The surgical hub 214 may derive contextual information regarding the surgical procedure from this data alone or from the combination of this data and other data from the data sources 202.


The surgical hub 214 may be connected to (e.g., paired with) the patient monitoring devices 208 of the data sources 202. Examples of the patient monitoring devices 208 that can be paired with the surgical hub 214 may include a pulse oximeter (SpO2 monitor), a blood pressure (BP) monitor, and an electrocardiogram (EKG) monitor. Perioperative data that is received by the situational awareness system of the surgical hub 214 from the patient monitoring devices 208 may include, for example, the patient's oxygen saturation, blood pressure, heart rate, and/or other physiological parameters. The contextual information that may be derived by the surgical hub 214 from the perioperative data transmitted by the patient monitoring devices 208 may include, for example, whether the patient is located in the operating theater or under anesthesia. The surgical hub 214 may derive these inferences from data from the patient monitoring devices 208 alone or in combination with data from other data from the data sources 202, such as a ventilator and/or other data source.


The surgical hub 214 may be connected to (e.g., paired with) the modular devices 204. Examples of the modular devices 204 that are paired with the surgical hub 214 may include a smoke evacuator, a medical imaging device such as the imaging device 130 shown in FIG. 2, an insufflator, a combined energy generator (for powering an ultrasonic surgical instrument and/or an RF electrosurgical instrument), and a ventilator.


The perioperative data received by the surgical hub 214 may be from a medical imaging device and/or other device(s). The perioperative data received by the surgical hub 214 from the medical imaging device may include, for example, whether the medical imaging device is activated and image data. The contextual information that is derived by the surgical hub 214 from the perioperative data sent by the medical imaging device may include, for example, whether the procedure is a video-assisted thoracic surgery (VATS) procedure (based on whether the medical imaging device is activated or paired to the surgical hub 214 at the beginning or during the course of the procedure). The image data (e.g., still image and/or video image) from the medical imaging device (or the data stream representing the video for a digital medical imaging device) may be processed by a pattern recognition system or a machine learning system to recognize features (e.g., organs or tissue types) in the field of view (FOV) of the medical imaging device, for example. The contextual information that is derived by the surgical hub 214 from the recognized features may include, for example, what type of surgical procedure (or step thereof) is being performed, what organ is being operated on, or what body cavity is being operated in.


The situational awareness system of the surgical hub 214 may derive the contextual information from the data received from the data sources 202 in a variety of different ways. For example, the situational awareness system can include a pattern recognition system, or a machine learning system (e.g., an artificial neural network), that has been trained on training data to correlate various inputs (e.g., data from the databases 206, the patient monitoring devices 208, the modular devices 204, the HCP monitoring devices 210, and/or the environment monitoring devices 212) to corresponding contextual information regarding a surgical procedure. For example, a machine learning system may accurately derive contextual information regarding a surgical procedure from the provided inputs. In examples, the situational awareness system can include a lookup table storing pre-characterized contextual information regarding a surgical procedure in association with one or more inputs (or ranges of inputs) corresponding to the contextual information. In response to a query with one or more inputs, the lookup table can return the corresponding contextual information for the situational awareness system for controlling one or more of the modular devices 204. In examples, the contextual information received by the situational awareness system of the surgical hub 214 can be associated with a particular control adjustment or set of control adjustments for one or more of the modular devices 204. In examples, the situational awareness system can include a machine learning system, lookup table, or other such system, which may generate or retrieve one or more control adjustments for one or more of the modular devices 204 when provided the contextual information as input.
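

The lookup-table variant described above can be sketched compactly. The following Python example is illustrative only; the input patterns, context strings, and control adjustments are assumed values for the example, not characterized data from any actual system.

```python
# Pre-characterized contextual information keyed by input combinations,
# each associated with a control adjustment for a modular device.
LOOKUP = {
    ("imaging_device_active", "insufflator_active"): {
        "context": "VATS procedure, insufflated cavity",
        "adjustment": {"smoke_evacuator_flow": "high"},
    },
    ("imaging_device_active", "insufflator_inactive"): {
        "context": "non-insufflated endoscopic procedure",
        "adjustment": {"smoke_evacuator_flow": "low"},
    },
}

def infer_context(inputs):
    # Returns pre-characterized contextual information and a control
    # adjustment, or None if the input combination is uncharacterized.
    return LOOKUP.get(tuple(inputs))

result = infer_context(["imaging_device_active", "insufflator_active"])
if result is not None:
    print(result["context"], "->", result["adjustment"])
```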


For example, based on data from the data sources 202, the surgical hub 214 may determine what type of tissue was being operated on. The surgical hub 214 can infer whether a surgical procedure being performed is a thoracic or an abdominal procedure, allowing the surgical hub 214 to determine whether the tissue clamped by an end effector of a surgical stapling and cutting instrument is lung tissue (for a thoracic procedure) or stomach tissue (for an abdominal procedure). The surgical hub 214 may determine whether the surgical site is under pressure (by determining that the surgical procedure is utilizing insufflation) and determine the procedure type so as to provide a consistent amount of smoke evacuation for both thoracic and abdominal procedures. Based on data from the data sources 202, the surgical hub 214 may determine what step of the surgical procedure is being performed or will subsequently be performed.


The surgical hub 214 may determine what type of surgical procedure is being performed and customize an energy level according to an expected tissue profile for the surgical procedure. The situationally aware surgical hub 214 may adjust the energy level for an ultrasonic surgical instrument or RF electrosurgical instrument throughout the course of a surgical procedure, rather than just on a procedure-by-procedure basis.


In examples, data can be drawn from one or more data sources 202 to improve the conclusions that the surgical hub 214 draws from another one of the data sources 202. The surgical hub 214 may augment data that it receives from the modular devices 204 with contextual information that it has built up regarding the surgical procedure from the other data sources 202.


The situational awareness system of the surgical hub 214 can consider the physiological measurement data to provide additional context in analyzing the visualization data. The additional context can be useful when the visualization data may be inconclusive or incomplete on its own.


The surgical hub 214 may determine whether a surgeon (and/or other HCP(s)) was making an error or otherwise deviating from the expected course of action during the course of a surgical procedure. For example, the surgical hub 214 may determine a type of surgical procedure being performed, retrieve a corresponding list of steps or order of equipment usage (e.g., from a memory of the surgical hub 214 or other computer system), and compare the steps being performed or the equipment being used during the course of the surgical procedure to the expected steps or equipment for the type of surgical procedure that the surgical hub 214 determined is being performed. The surgical hub 214 can provide an alert indicating that an unexpected action is being performed or an unexpected device is being utilized at the particular step in the surgical procedure.
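

One minimal form of such deviation checking is sketched below in Python. The procedure name and step list are hypothetical placeholders; as described above, a real system would retrieve the expected steps from a memory of the surgical hub 214 or another computer system.

```python
# Expected step lists keyed by inferred procedure type (names assumed).
EXPECTED_STEPS = {
    "segmentectomy": ["access", "mobilize", "staple_vessels",
                      "resect", "close"],
}

def check_step(procedure, step_index, observed_action):
    """Return None if the observed action matches the expected course,
    otherwise an alert string describing the deviation."""
    expected = EXPECTED_STEPS.get(procedure, [])
    if step_index < len(expected) and observed_action == expected[step_index]:
        return None
    return (f"Alert: unexpected action '{observed_action}' at step "
            f"{step_index + 1} of {procedure}")

alert = check_step("segmentectomy", 2, "energy_activation")
if alert:
    print(alert)  # surface via a display or notification sub-system
```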


The surgical instruments (and other modular devices 204) may be adjusted for the particular context of each surgical procedure (such as adjusting to different tissue types), and actions may be validated during a surgical procedure. Next steps, data, and display adjustments may be provided to surgical instruments (and other modular devices 204) in the surgical theater according to the specific context of the surgical procedure.


Embodiments of situational awareness systems and using situational awareness systems during performance of a surgical procedure are described further in, for example, U.S. patent application Ser. No. 16/729,772 entitled “Analyzing Surgical Trends By A Surgical System” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,747 entitled “Dynamic Surgical Visualization Systems” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,744 entitled “Visualization Systems Using Structured Light” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,778 entitled “System And Method For Determining, Adjusting, And Managing Resection Margin About A Subject Tissue” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,729 entitled “Surgical Systems For Proposing And Corroborating Organ Portion Removals” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,778 entitled “Surgical System For Overlaying Surgical Instrument Data Onto A Virtual Three Dimensional Construct Of An Organ” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,751 entitled “Surgical Systems For Generating Three Dimensional Constructs Of Anatomical Organs And Coupling Identified Anatomical Structures Thereto” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,740 entitled “Surgical Systems Correlating Visualization Data And Powered Surgical Instrument Data” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,737 entitled “Adaptive Surgical System Control According To Surgical Smoke Cloud Characteristics” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,796 entitled “Adaptive Surgical System Control According To Surgical Smoke Particulate Characteristics” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,803 entitled “Adaptive Visualization By A Surgical System” filed Dec. 30, 2019, and U.S. patent application Ser. No. 16/729,807 entitled “Method Of Using Imaging Devices In Surgery” filed Dec. 30, 2019, which are hereby incorporated by reference in their entireties.



FIG. 5 illustrates one embodiment of a surgical system 300 that may include a surgical instrument 302, such as the surgical instrument 114 of FIG. 1 or the surgical instrument 131 of FIG. 2. The surgical instrument 302 can be in communication with a console 304 and/or a portable device 306 through a local area network (LAN) 308 and/or a cloud network 310, such as the cloud computing system 108 of FIG. 1, via a wired and/or wireless connection. The console 304 and the portable device 306 may each be any suitable computing device.


The surgical instrument 302 may include a handle 312, an adapter 314, and a loading unit 316. The adapter 314 releasably couples to the handle 312 and the loading unit 316 releasably couples to the adapter 314 such that the adapter 314 transmits a force from one or more drive shafts to the loading unit 316. The adapter 314 or the loading unit 316 may include a force gauge (not explicitly shown in FIG. 5) disposed therein to measure a force exerted on the loading unit 316. In some embodiments, the adapter 314 is non-releasably attached to the handle 312. In some embodiments, the adapter 314 and the loading unit 316 are integral and may be releasably attachable to the handle 312 or non-releasably attached to the handle 312.


The loading unit 316 may include an end effector 318 having a first jaw 320 and a second jaw 322. The loading unit 316 may be an in-situ loaded or multi-firing loading unit (MFLU) that allows a clinician to fire a plurality of fasteners (e.g., staples, clips, etc.) multiple times without requiring the loading unit 316 to be removed from a surgical site to reload the loading unit 316. The first and second jaws 320, 322 may be configured to clamp tissue therebetween, fire fasteners through the clamped tissue, and sever the clamped tissue. The first jaw 320 may be configured to fire at least one fastener a plurality of times or may be configured to include a replaceable multi-fire fastener cartridge including a plurality of fasteners that may be fired more than one time prior to being replaced. The second jaw 322 may include an anvil that deforms or otherwise secures the fasteners, as the fasteners are ejected from the multi-fire fastener cartridge.


The surgical instrument 302 may include a motor, such as at the handle 312, that is coupled to the one or more drive shafts to effect rotation of the one or more drive shafts. The surgical instrument 302 may include a control interface, such as at the handle 312, to selectively activate the motor. The control interface may include buttons, switches, levers, sliders, touchscreens, and/or any other suitable input mechanisms or user interfaces, which can be engaged by a clinician to activate the motor.


The control interface of the surgical instrument 302 may be in communication with a controller 324 (e.g., a microprocessor or other controller) of the surgical instrument 302, shown in the embodiment of FIG. 5 disposed in the handle 312, to selectively activate the motor to effect rotation of the one or more drive shafts. The controller 324 may be configured to receive input from the control interface, adapter data from the adapter 314, and loading unit data from the loading unit 316. The controller 324 may analyze the input from the control interface and the data received from the adapter 314 and/or the data received from the loading unit 316 to selectively activate the motor. The surgical instrument 302 may also include a display, such as at the handle 312, that is viewable by a clinician during use of the surgical instrument 302. The display may be configured to display portions of the adapter data and/or loading unit data before, during, or after firing of the surgical instrument 302.


The adapter 314 may include an adapter identification device 326 disposed therein, and the loading unit 316 may include a loading unit identification device 328 disposed therein. The adapter identification device 326 may be in communication with the controller 324, and the loading unit identification device 328 may be in communication with the controller 324. It will be appreciated that the loading unit identification device 328 may be in communication with the adapter identification device 326, which relays or passes communication from the loading unit identification device 328 to the controller 324. In embodiments in which the adapter 314 and the loading unit 316 are integral, one of the adapter identification device 326 and the loading unit identification device 328 may be omitted.


The adapter 314 may also include one or more sensors 330 disposed thereabout to detect various conditions of the adapter 314 or of the environment (e.g., if the adapter 314 is connected to a loading unit, if the adapter 314 is connected to a handle, if the one or more drive shafts are rotating, a torque of the one or more drive shafts, a strain of the one or more drive shafts, a temperature within the adapter 314, a number of firings of the adapter 314, a peak force of the adapter 314 during firing, a total amount of force applied to the adapter 314, a peak retraction force of the adapter 314, a number of pauses of the adapter 314 during firing, etc.). The one or more sensors 330 may provide an input to the adapter identification device 326 (or to the loading unit identification device 328 if the adapter identification device 326 is omitted) in the form of data signals. The data signals of the one or more sensors 330 may be stored within or be used to update the adapter data stored within the adapter identification device 326 (or the loading unit identification device 328 if the adapter identification device 326 is omitted). The data signals of the one or more sensors 330 may be analog or digital. The one or more sensors 330 may include, for example, a force gauge to measure a force exerted on the loading unit 316 during firing.


The handle 312 and the adapter 314 can be configured to interconnect the adapter identification device 326 and the loading unit identification device 328 with the controller 324 via an electrical interface. The electrical interface may be a direct electrical interface (e.g., include electrical contacts that engage one another to transmit energy and signals therebetween). Additionally, or alternatively, the electrical interface may be a non-contact electrical interface to wirelessly transmit energy and signals therebetween (e.g., inductively transfer). It is also contemplated that the adapter identification device 326 and the controller 324 may be in wireless communication with one another via a wireless connection separate from the electrical interface.


The handle 312 may include a transceiver 332 that is configured to transmit instrument data from the controller 324 to one or more other components of the surgical system 300 (e.g., the LAN 308, the cloud 310, the console 304, and/or the portable device 306). The controller 324 may also transmit instrument data and/or measurement data associated with the one or more sensors 330 to a surgical hub, such as the surgical hub 106 of FIGS. 1-3 or the surgical hub 214 of FIG. 4. The transceiver 332 may receive data (e.g., cartridge data, loading unit data, adapter data, and/or other notifications) from the surgical hub. The transceiver 332 may receive data (e.g., cartridge data, loading unit data, or adapter data) from the other components of the surgical system 300. For example, the controller 324 may transmit surgical instrument data including a serial number of an attached adapter (e.g., the adapter 314) attached to the handle 312, a serial number of a loading unit (e.g., the loading unit 316) attached to the adapter 314, and a serial number of a multi-fire fastener cartridge loaded into the loading unit 316, e.g., into one of the jaws 320, 322 at the end effector 318, to the console 304. Thereafter, the console 304 may transmit data (e.g., cartridge data, loading unit data, and/or adapter data) associated with the attached cartridge, the loading unit 316, and the adapter 314, respectively, back to the controller 324. The controller 324 can display messages on the local instrument display or transmit the message, via the transceiver 332, to the console 304 or the portable device 306 to display the message on a display 334 or device screen of the portable device 306, respectively.


Various exemplary embodiments of aspects of smart surgical systems, for example how smart surgical systems choose to interact with each other, are described further in, for example, U.S. patent application Ser. No. 18/810,323 entitled “Method For Multi-System Interaction” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,036 entitled “Adaptive Interaction Between Smart Healthcare Systems” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,082 entitled “Control Redirection And Image Porting Between Surgical Systems” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/809,890 entitled “Synchronized Motion Of Independent Surgical Devices” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,133 entitled “Synchronization Of The Operational Envelopes Of Independent Surgical Devices” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,170 entitled “Synchronized Motion Of Independent Surgical Devices To Maintain Relational Field Of Views” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,208 entitled “Alignment And Distortion Compensation Of Reference Planes Used By Surgical Devices” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,230 entitled “Shared Set Of Object Registrations For Surgical Devices Using Independent Reference Planes” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,266 entitled “Coordinated Control Of Therapeutic Treatment Effects” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,283 entitled “Functional Restriction Of A System Based On Information From Another Independent System” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/809,960 entitled “Inter-Connectivity Of Data Flows Between Independent Smart Systems” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,041 entitled “Inter-Connectivity Of Data Flows Between Independent Smart Systems” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,119 entitled “Processing And Display Of Tissue Tension” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,175 entitled “Situational Control Of Smart Surgical Devices” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,222 entitled “Method For Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,274 entitled “Visual Data-Based Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,346 entitled “Electrical Data-Based Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,355 entitled “Mechanical Data-Based Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,361 entitled “Multi-Sourced Data-Based Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,407 entitled “Conflict Resolution For Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, and U.S. patent application Ser. No. 18/810,419 entitled “Controlling Patient Monitoring Devices” filed Aug. 20, 2024, which are each hereby incorporated by reference in their entireties.


Operating Intelligent Surgical Instruments

An intelligent surgical instrument, such as the surgical instrument 114 of FIG. 1, the surgical instrument 131 of FIG. 2, or the surgical instrument 302 of FIG. 5, can have an algorithm stored thereon, e.g., in a memory thereof, configured to be executable on board the intelligent surgical instrument, e.g., by a processor thereof, to control operation of the intelligent surgical instrument. In some embodiments, instead of or in addition to being stored on the intelligent surgical instrument, the algorithm can be stored on a surgical hub, e.g., in a memory thereof, that is configured to communicate with the intelligent surgical instrument.


The algorithm may be stored in the form of one or more sets of pluralities of data points defining and/or representing instructions, notifications, signals, etc. to control functions of the intelligent surgical instrument. In some embodiments, data gathered by the intelligent surgical instrument can be used by the intelligent surgical instrument, e.g., by a processor of the intelligent surgical instrument, to change at least one variable parameter of the algorithm. As discussed above, a surgical hub can be in communication with an intelligent surgical instrument, so data gathered by the intelligent surgical instrument can be communicated to the surgical hub and/or data gathered by another device in communication with the surgical hub can be communicated to the surgical hub, and data can be communicated from the surgical hub to the intelligent surgical instrument. Thus, instead of or in addition to the intelligent surgical instrument being configured to change a stored variable parameter, the surgical hub can be configured to communicate the changed at least one variable, alone or as part of the algorithm, to the intelligent surgical instrument and/or the surgical hub can communicate an instruction to the intelligent surgical instrument to change the at least one variable as determined by the surgical hub.


The at least one variable parameter may be among the algorithm's data points, e.g., may be included in instructions for operating the intelligent surgical instrument, and is thus able to be changed by changing one or more of the stored pluralities of data points of the algorithm. After the at least one variable parameter has been changed, subsequent execution of the algorithm can be according to the changed algorithm. As such, operation of the intelligent surgical instrument over time can be managed for a patient to increase the beneficial results of use of the intelligent surgical instrument by taking into consideration actual situations of the patient and actual conditions and/or results of the surgical procedure in which the intelligent surgical instrument is being used. Changing the at least one variable parameter may be automated to improve patient outcomes. Thus, the intelligent surgical instrument can be configured to provide personalized medicine based on the patient and the patient's surrounding conditions to provide a smart system. In a surgical setting in which the intelligent surgical instrument is being used during performance of a surgical procedure, automated changing of the at least one variable parameter may allow for the intelligent surgical instrument to be controlled based on data gathered during the performance of the surgical procedure, which may help ensure that the intelligent surgical instrument is used efficiently and correctly and/or may help reduce chances of patient harm, such as harm to a critical anatomical structure.


The at least one variable parameter can be any of a variety of different operational parameters. Examples of variable parameters include motor speed, motor torque, energy level, energy application duration, tissue compression rate, jaw closure rate, cutting element speed, load threshold, etc.
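

By way of illustration, the variable parameters can be modeled as a small immutable record that is replaced when the instrument or the surgical hub determines a change is warranted. The following Python sketch uses assumed parameter names and default values; it is not a description of any particular instrument's control program.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class ControlAlgorithmParams:
    """Illustrative variable parameters of an instrument control algorithm.
    Names, units, and defaults are assumptions for this sketch."""
    motor_speed_rpm: float = 1200.0
    energy_level_w: float = 35.0
    jaw_closure_rate_mm_s: float = 2.0
    load_threshold_n: float = 80.0

def apply_update(current, update):
    # `replace` returns a new parameter set; the prior set can be retained
    # for audit, and subsequent execution uses the changed algorithm.
    return replace(current, **update)

params = ControlAlgorithmParams()
# e.g., thicker tissue inferred intraoperatively: slow the jaws, raise energy
params = apply_update(params, {"jaw_closure_rate_mm_s": 1.2,
                               "energy_level_w": 45.0})
```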


Various embodiments of operating surgical instruments are described further in, for example, U.S. Pat. App. Pub. No. 2023/0096268 entitled “Methods for Controlling Cooperative Surgical Instruments” filed Oct. 5, 2021, U.S. Pat. App. Pub. No. 2023/0097906 entitled “Surgical Methods Using Multi-Source Imaging” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0095002 entitled “Surgical Methods Using Fiducial Identification And Tracking” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0101750 entitled “Surgical Methods For Control Of One Visualization With Another” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0100698 entitled “Methods For Controlling Cooperative Surgical Instruments” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0103005 entitled “Methods for Controlling Cooperative Surgical Instruments” published Mar. 30, 2023, and U.S. Pat. App. Pub. No. 2023/0098538 entitled “Cooperative Access Hybrid Procedures” published Mar. 30, 2023, which are hereby incorporated by reference in their entireties.


Data Pipelines

As discussed herein, data may be transmitted from one point to another point, such as during a performance of a surgical procedure on a patient. The data may be transmitted from a source system to a destination system using a data pipeline.


As shown in FIG. 6A, a data pipeline 400 may move data from a source 402 to a destination 404, either of which may be physical or virtual (transient). In some data pipelines, the destination 404 may be called a "sink" or a "target." Any time data is processed between point A and point B (or between multiple points such as points B, C, and D), there is a data pipeline 400 between those points. In general, the data pipeline 400 can include a set of tools and processes, which may be referred to as "steps" or "processing steps," used to automate the movement and transformation of data between the source 402 and the destination 404.


In some embodiments, the source 402 and the destination 404 are two different elements, such as a first element of a surgical system and a second element of a surgical system. The data from the source 402 may or may not be modified by the data pipeline 400 before being received at the destination 404. For example, the source 402 may be one of the wearable sensing system(s) 111, the environmental sensing system(s) 115, the robotic system(s) 113, the intelligent instrument(s) 114, or the human interface system(s) 112 of the surgical system 102 of FIGS. 1 and 3 and the destination 404 may be the surgical hub 106 of the surgical system 102 of FIGS. 1 and 3. For another example, the source 402 may be one of the wearable sensing system(s) 111, the environmental sensing system(s) 115, the intelligent instrument(s) 114, or the human interface system(s) 112 of one of the surgical systems 102, 103, 104 of FIG. 1 and the destination 404 may be the surgical hub 106 of another one of the surgical systems 102, 103, 104 of FIG. 1. For yet another example, the source 402 may be one of the wearable sensing system(s) 111, the environmental sensing system(s) 115, the robotic system(s) 113, the intelligent instrument(s) 114, or the human interface system(s) 112 of the surgical system 102 of FIGS. 1 and 3 and the destination 404 may be another one of the wearable sensing system(s) 111, the environmental sensing system(s) 115, the robotic system(s) 113, the intelligent instrument(s) 114, or the human interface system(s) 112 of the surgical system 102 of FIGS. 1 and 3. For still another example, the source 402 may be one of the surgical systems 102, 103, 104 of FIG. 1 and the destination 404 may be another one of the surgical systems 102, 103, 104 of FIG. 1.


In some embodiments, the source 402 and the destination 404 are the same element. The data pipeline 400 may thus be purely about modifying the data set between the source 402 and the destination 404.


As shown in FIG. 6B, the data pipeline 400 may include one or more data connectors 406 that extract data from the source 402 and load the extracted data into the destination 404. A plural “N” number of data connectors 406 are shown in FIG. 6B. In some embodiments, such as embodiments in which extract, transform, and load (ETL) processing of data is performed, as opposed to extract, load, and transform (ELT) processing of data, data may be transformed within the data pipeline 400 before the data is received by the destination 404. In other embodiments, such as embodiments in which ELT processing of data is performed, as opposed to ETL processing of data, the one or more data connectors 406 may simply load raw data to the destination 404. In some instances, light transformations may be applied to the data, such as normalizing and cleaning data or orchestrating transformations into models for analysts, before the destination 404 receives the data.
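

The ETL/ELT distinction reduces to where the transformation step runs relative to the load step. The following Python sketch is illustrative only; the record contents and the `normalize` transformation are assumptions for the example.

```python
# Minimal sketch contrasting ETL and ELT orderings for a data connector.
def extract(source):
    return list(source)                          # pull raw records

def normalize(records):
    return [r.strip().lower() for r in records]  # a light transformation

def load(destination, records):
    destination.extend(records)                  # deliver to the sink

raw = ["  Temp=37.1 ", "FORCE=12.4"]
etl_dest, elt_dest = [], []

# ETL: transform inside the pipeline, before the destination receives data.
load(etl_dest, normalize(extract(raw)))

# ELT: load raw data first; the destination transforms on its own schedule.
load(elt_dest, extract(raw))
elt_dest = normalize(elt_dest)
```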


The data pipeline 400 can include physical elements, such as one or more wires, or can include digital elements, such as one or more packets, network traffic, or internal processor paths/connections. Flexible data pipelines are portions of the overall system where redundant paths can be utilized. Data, e.g., a data stream, can be sent down one path or parsed between multiple parallel paths (to increase capacity), and these multiple paths can be flexibly adjusted by the system to accommodate changes in the volume and details of the data streams.
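

A simple version of such parallel-path parsing is sketched below in Python. The bandwidth units, the ceiling-division sizing rule, and the round-robin assignment are all assumptions chosen for the illustration.

```python
def split_stream(packets, path_bandwidth, offered_rate):
    """Parse one stream across enough parallel paths to carry the offered
    rate; units are arbitrary and consistent between the two arguments."""
    n_paths = max(1, -(-offered_rate // path_bandwidth))  # ceiling division
    paths = [[] for _ in range(n_paths)]
    for i, packet in enumerate(packets):
        paths[i % n_paths].append(packet)  # round-robin across the paths
    return paths

packets = [f"pkt{i}" for i in range(10)]
# An offered rate of 100 over paths of capacity 40 -> 3 parallel paths.
paths = split_stream(packets, path_bandwidth=40, offered_rate=100)
```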


The data pipeline 400 can have a small code base that serves a very specific purpose. These types of applications are called microservices.


The data pipeline 400 can be a big data pipeline. There are five characteristics of big data: volume, variety, velocity, veracity, and value. Big data pipelines are data pipelines built to accommodate more than one of the five characteristics of big data. The velocity of big data makes it appealing to build streaming data pipelines for big data, so that data can be captured and processed in real time and some action can then occur. The volume of big data requires that data pipelines be scalable, as the volume can be variable over time. In practice, there are likely to be many big data events that occur simultaneously or very close together, so the big data pipeline must be able to scale to process significant volumes of data concurrently. The variety of big data requires that big data pipelines be able to recognize and process data in many different formats: structured, unstructured, and semi-structured.


In general, an architecture design of a data pipeline, e.g., the data pipeline 400 of FIGS. 6A and 6B, can include interconnectivity between a first smart device and a second smart device, e.g., the source 402 and the destination 404 of FIGS. 6A and 6B. Data generated in one source system (e.g., the first smart device or the second smart device) may feed multiple data pipelines, which may have multiple other data pipelines dependent on their outputs.


The interconnectivity between the first smart device and the second smart device may be on a common/shared network, e.g., LAN, Wi-Fi, powerline networking, MoCA networking, cellular (e.g., 4G, 5G, etc.), low power wide area network (LPWAN), Zigbee, Z-wave, etc.


The interconnectivity between the first smart device and the second smart device may be on a structured network. Traditionally, structured peer-to-peer (P2P) networks implement a distributed hash table (DHT). In order to route traffic efficiently through the network, nodes in a structured overlay must maintain lists of neighbors that satisfy specific criteria. This makes them less robust in networks with a high rate of churn (e.g., with large numbers of nodes frequently joining and leaving the network). DHT-based solutions may have a high cost of advertising/discovering resources and may have static and dynamic load imbalance.
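

The key-placement idea behind DHT-based overlays can be illustrated with a consistent-hashing ring, sketched below in Python. This is a simplified illustration rather than a full DHT (no neighbor lists, replication, or churn handling); the node and key names are assumed.

```python
import hashlib
from bisect import bisect_right

def ring_hash(value):
    # Map a string onto a fixed 32-bit ring position.
    return int(hashlib.sha256(value.encode()).hexdigest(), 16) % 2**32

class HashRing:
    """Consistent-hashing sketch: each node owns the keys whose hashes
    fall between its predecessor and itself on the ring."""

    def __init__(self, nodes):
        self.ring = sorted((ring_hash(n), n) for n in nodes)

    def lookup(self, key):
        # Route the key to the first node clockwise of its hash,
        # wrapping past the end of the ring.
        points = [point for point, _ in self.ring]
        index = bisect_right(points, ring_hash(key)) % len(self.ring)
        return self.ring[index][1]

ring = HashRing(["hub", "generator", "stapler"])
owner = ring.lookup("tissue_properties_stream")
# Under churn, only adjacent key ranges move when a node joins or leaves,
# but each node must keep its neighbor list current for routing to work.
```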


The interconnectivity between the first smart device and the second smart device may be via cooperative networking. Cooperative networking utilizes a system that is a hybrid of a P2P network and a server-client network architecture, offloading serving to peers who have recently established direct interchanges of content.


The interconnectivity between the first smart device and the second smart device may be exclusive. For example, the interconnectivity may be exclusive via Bluetooth. For another example, the interconnectivity may be exclusive via network isolation, such as by using path isolation, a virtual private network (VPN), or a secure access service edge (SASE). The path isolation may include a software-defined wide area network (SD-WAN). SD-WANs rely on software and a centralized control function that can steer traffic across a WAN in a smarter way by handling traffic based on priority, security, and quality of service requirements. The VPN may involve creation of an independent secure network using common/shared open networks. Another network (a carrier network) is used to carry data, which is encrypted. The carrier network will see packets of the data, which it routes. To users of the VPN, it will look like the systems are directly connected to each other.


For example, with interconnectivity between the first smart device and the second smart device being exclusive in a surgical context, an operating room (OR) may have a surgical hub and an established network from a first vendor. In order to secure against hacking or data leakage, the network may be an encrypted common network for which the first vendor supplies keys. A surgical stapler in the OR may be from a second vendor that is different from the first vendor and that does not have the keys from the first vendor. The surgical stapler may want to link to other device(s) it relies on for functionality but does not want data leakage. An advanced energy generator from the second vendor with an accompanying smoke evacuator may also be in the OR and may form their own private network, such as by piggybacking on the first vendor network to create a second encrypted VPN routing through the first vendor network as a primary network or by forming an independent wireless network for bi-directional communication between the advanced energy generator and the smoke evacuator. The surgical stapler may want to communicate with the advanced energy generator, e.g., so the surgical stapler may retrieve updated software from the advanced energy generator, receive tissue properties information from the advanced energy generator, log data for exportation, and receive energy from the advanced energy generator and apply the energy to tissue, but not want to communicate with the smoke evacuator, e.g., because the surgical stapler performs no smoke evacuation. The surgical stapler and a communication backplane of the advanced energy generator may therefore form an isolated network with only the surgical stapler (first smart device) and the advanced energy generator (second smart device) able to communicate via the isolated network and with the surgical hub able to manage the data pipeline between the surgical stapler and the advanced energy generator.


In general, one or more steps may be performed along a data pipeline, e.g., the data pipeline 400 of FIGS. 6A and 6B. The steps in the data pipeline may include data transformation, data augmentation, data enrichment, data filtering, data grouping, data aggregating, and running algorithms against the data.
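

These steps can be thought of as composable functions applied in order between the source and the destination. The following Python sketch is illustrative only; the record fields, units, and step implementations are assumptions for the example.

```python
from functools import reduce

# Illustrative pipeline steps: filtering, enrichment, and aggregation.
def filter_valid(records):
    return [r for r in records if r.get("value") is not None]

def enrich_units(records):
    return [{**r, "unit": "mmHg"} for r in records]

def aggregate_mean(records):
    values = [r["value"] for r in records]
    if not values:
        return {}
    return {"mean": sum(values) / len(values), "n": len(values)}

def run_pipeline(records, steps):
    # Apply each step to the output of the previous one, in order.
    return reduce(lambda data, step: step(data), steps, records)

result = run_pipeline(
    [{"value": 118}, {"value": None}, {"value": 124}],
    [filter_valid, enrich_units, aggregate_mean],
)  # -> {'mean': 121.0, 'n': 2}
```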


The data aggregation may include segmentation of data into buckets (e.g., decomposition of a procedure into sub-steps), data fusion and interfacing, and mixing real-time data streams with archived data streams. Various embodiments of data aggregation are described further in, for example, U.S. Pat. App. Pub. No. 2022/0104896 entitled “Interactive Information Overlay On Multiple Surgical Displays” published Apr. 7, 2022, U.S. Pat. App. Pub. No. 2019/0206564 entitled “Method For Facility Data Collection And Interpretation” published Jul. 4, 2019, and U.S. Pat. App. Pub. No. 2024/0221937 entitled “Method For Advanced Algorithm Support” published Jul. 4, 2024, which are hereby incorporated by reference in their entireties.


In one embodiment, mixing real-time data streams with archived data streams may include, in a surgical context, pre-operative data/imaging evaluation. The evaluation may include displaying of static preoperative scan(s), overlaying of video with aligned 3D model, and registering a virtual view to a camera view.


In one embodiment, the display of a static pre-operative scan may include alignment based on surgeon (or other HCP) position, for example, where the surgeon (or other HCP) is standing.


In one embodiment, registering the virtual view to the camera view may include identifying organs in a video and triangulating with camera location and/or getting a camera location in reference to a coordinate system. For example, during performance of a surgical procedure, a camera location may be acquired with respect to a trocar by 3D tracking of the trocar, camera insertion in the trocar (e.g., insertion depth and/or insertion angle), and/or determination of what trocar is being used for the camera. An example of insertion depth is a marking on a shaft of the trocar, such as on a graphical scale or a color gradient. Examples of insertion angle in a trocar are 3D trocar orientation and 3D angle of attack.
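

For illustration, the following Python sketch estimates a camera tip position from a tracked trocar entry point, an insertion depth, and an insertion angle. The coordinate frame, the yaw/pitch parameterization, and all numeric values are assumptions for the example, not a registration method prescribed by this disclosure.

```python
import math

def camera_tip(trocar_xyz, yaw_deg, pitch_deg, insertion_depth_mm):
    """Estimate the scope tip position by extending a unit direction
    vector from the trocar entry point by the insertion depth."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    # Unit direction of the scope axis in an assumed room coordinate frame.
    direction = (
        math.cos(pitch) * math.cos(yaw),
        math.cos(pitch) * math.sin(yaw),
        -math.sin(pitch),  # pitched downward into the body cavity
    )
    return tuple(t + insertion_depth_mm * d
                 for t, d in zip(trocar_xyz, direction))

# Depth might be read from shaft markings; angles from 3D trocar tracking.
tip = camera_tip(trocar_xyz=(120.0, 85.0, 40.0),
                 yaw_deg=30.0, pitch_deg=45.0, insertion_depth_mm=90.0)
```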


In one embodiment, the pre-operative data/imaging evaluation may include using a machine learning (ML) algorithm to review preoperative scans of a patient to identify any abnormalities. A cloud-based source may be used for augmented reality (AR) using cloud-based data for surgical procedure planning.


In one embodiment, an ML algorithm may be used in an initial planning stage, e.g., initial planning for a surgical procedure to be performed on a patient. Preoperative scans may be used to facilitate surgical path planning. If the initial scans detect anomalies or diseased tissues, as analyzed by the ML algorithm, the anomalies or diseased tissues may be relayed to the surgeon for the upcoming surgical procedure and a new surgical task order may be suggested based on how previous surgeons handled these problems. The relayed information to the surgeon may also include a recommended inventory list to have on hand based on this initial improved surgical task order.


For example, during preoperative scans for a sleeve gastrectomy, a small hernia may be discovered. This hernia may be highlighted during the surgical planning step, and the surgeon may be asked if the surgeon wants to include a hernial repair in the initial sleeve gastrectomy plan. If the surgeon answers affirmatively, the hernial repair will be added into the surgical task order, and the overall inventory for this case will be updated to include relevant items for the hernial repair added into the surgical task order. During performance of the sleeve gastrectomy, an ML algorithm may be used to detect diseased tissue or surgical anomalies. If a diseased tissue is discovered, the diseased tissue may be highlighted on a screen, e.g., on a HID, and a cutting path/angle may be recommended to avoid the tissue or make the tissue state more manageable. These recommendations may be based on how surgeons previously and successfully handled these situations. If a surgical anomaly is discovered, the system may either automatically update the task order or require the surgeon to give a verbal command (or other command) to update the task order and highlight the required additional inventory on the circulator's screen. For foreign bodies (such as bougies) that may be discovered, the foreign body may be highlighted on the screen and a cutting path may be included to provide an ample margin around the foreign body, assuming the foreign body is anticipated. If the foreign body is not anticipated, the foreign body may be highlighted to draw the surgeon's (and/or other HCP's) attention to it.


In one embodiment, the pre-operative data/imaging evaluation may include a cloud comparison of scans periodically taken through time for anatomic changes of that time to indicate possible operative complications. A cloud-based source may be used for augmented reality (AR) using preoperative scans to enhance return surgeries.


Looking at a difference between current and previous surgical scans may help inform the surgeon and/or other HCP and improve patient outcomes. This information can be used in various ways, for example for disease detection, informing surgical task planning, and/or informing previous surgical success and healing.


With respect to disease detection, current and historical scans can be used to determine if various disease states or abnormalities have evolved between surgeries. One case where this could be particularly useful is cancer detection. If a scan initially picks up an abnormal growth for a patient and the patient's HCP decides that it is benign but flags it for caution, a follow-up scan may confirm whether or not the abnormality is benign. The scan may also automatically highlight areas of concern (tissue growth) that were not flagged by the HCP initially but could be areas of concern.


With respect to informing surgical task planning, information about previous surgeries (e.g., potential areas of scar tissue, previously seen difficult tissue, etc.) can help facilitate surgical step and path planning. This information can also be used during the surgery to display areas of scarring, changes of tissue from previous surgeries that might need to be examined, foreign bodies, and/or new adhesions.


With respect to informing previous surgical success and healing, data from various scans over time can be used to determine how successfully patients were recovering or had recovered from previous surgeries. This information may be used by surgeons (and/or other HCPs) to help plan future procedures, assess previous work, and/or facilitate quicker patient recovery.


Data development may be performed as a step in a data pipeline and may include one or more of data modeling, database layout and configuration, implementation, data mapping, and correction.


Various data may be communicated using a data pipeline, e.g., the data pipeline 400 of FIGS. 6A and 6B, such as data from a local data source, data from a remote data source, and synthetically generated data.


Data from a local data source may include data collected by, used by, or resulting from the operation of aspects of the local data source, e.g., data gathered using a sensor (e.g., temperature data gathered using a temperature sensor, force data gathered using a force sensor, pressure measured using a pressure sensor, etc.), still and/or video image data (e.g., data gathered by a camera, etc.), operational parameters of a surgical instrument (e.g., energy level, energy type, motor current, cutting element speed, etc.), surgical instrument identification (e.g., instrument type, instrument serial number, etc.), etc.


Data from a local data source may have metadata, which may reflect aspects of a data stream, a device configuration, and/or system behavior that define information about the data. For example, metadata may include an auxiliary data location that is shared by two interconnected systems, e.g., first and second robotic systems, etc., to create a single “brain” instead of two distinct ones. Each of the interconnected systems may create a copy of its memory system and introduce it to the combined system or “collective.” Both of the interconnected systems may now use this new area for data exchange, for uni-directional communication, and to directly command control systems. The new combined system may become primary and the individual robotic system's memory areas may become secondary memory areas until the systems are “unpaired,” e.g., are no longer interconnected.
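The following is an illustrative sketch, under the assumption of hypothetical class names (RoboticSystem, Collective), of the combined “brain” described above, in which each interconnected system introduces a copy of its memory to a shared primary area that is relinquished on unpairing:

```python
# Hypothetical sketch of the shared "collective" memory area described above.
from copy import deepcopy

class RoboticSystem:
    def __init__(self, name):
        self.name = name
        self.memory = {f"{name}_state": "idle"}    # individual memory area

class Collective:
    """Combined primary memory; members' own areas become secondary."""
    def __init__(self, *systems):
        self.systems = systems
        self.shared = {}
        for s in systems:                          # each member introduces a copy
            self.shared.update(deepcopy(s.memory))

    def exchange(self, key, value):                # data exchange / command path
        self.shared[key] = value

    def unpair(self):
        # on unpairing, each member's own area becomes primary again
        for s in self.systems:
            for k in s.memory:
                s.memory[k] = self.shared.get(k, s.memory[k])
        self.shared.clear()

a, b = RoboticSystem("robotA"), RoboticSystem("robotB")
brain = Collective(a, b)                           # single "brain"
brain.exchange("robotA_state", "cutting")
brain.unpair()
print(a.memory)                                    # {'robotA_state': 'cutting'}
```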


Data from a local data source may include a data stream that is monitored by at least one other system. In this way, data collected by, used by, or resulting from the operation of aspects of one system may be sourced to another system (the monitoring system).


In one embodiment, application programming interfaces (APIs) may be used to communicate data from a local source.


In one embodiment, data may be communicated from a local source in response to occurrence of a trigger event. In one embodiment, the trigger event is a digital trigger event. For example, in a surgical context, the trigger event may be a surgical instrument changing orientation after being in a predetermined static position, such as when the surgical instrument “wakes up.” For another example, in a surgical context, the trigger event may be a system's power or signal interruption, e.g., communicating data after the interruption has been resolved. For yet another example, in a surgical context, the trigger event may be a change in system status, capacity, or connectivity. For still another example, in a surgical context, the trigger event may be a change in quality, calibrations, or conversion factors of a surgical instrument.
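A minimal sketch of trigger-driven communication from a local source follows; the trigger names mirror the examples above, while the LocalSource class and its API are assumptions for illustration only:

```python
# Hypothetical sketch of trigger-driven communication from a local source.
TRIGGERS = {"wake_up", "interruption_resolved", "status_change", "calibration_change"}

class LocalSource:
    def __init__(self, send):
        self.send = send          # callable that pushes data onto the pipeline
        self.pending = []         # data buffered during a power/signal interruption

    def on_event(self, trigger: str, payload: dict):
        if trigger not in TRIGGERS:
            return
        if trigger == "interruption_resolved":
            for item in self.pending:       # flush data after the interruption
                self.send(item)
            self.pending.clear()
        self.send({"trigger": trigger, **payload})

src = LocalSource(send=print)
src.on_event("wake_up", {"orientation": "changed from static position"})
```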


Data from a remote data source may include data collected by, used by, or resulting from the operation of aspects of the remote data source. One example of a remote data source includes a database, such as a relational database or a NoSQL database. Examples of data in a relational database relevant in a surgical context can include inventory, sterile services process status, billing code, patient records (PHI), and previous procedure data.


One example of data from a remote data source, in a surgical context, includes a procedure plan, e.g., a plan for a surgical procedure to be performed on a patient. The procedure plan data can include, for example, instrument selection, port placement, adjuncts needed for devices, OR timing and local imaging needs, procedural steps, staff number and skill composition, and patient positioning.


Another example of data from a remote data source, in a surgical context, includes pre-operative imaging, such as a CT full body scan, external ultrasound, MRI, etc.


Another example of data from a remote data source includes software parameter updates, such as software parameter updates streaming from a cloud computing system. The software parameter updates can include, for example, original equipment manufacturer (OEM) updates to a device's operational aspects, e.g., updated basic input/output system (BIOS) controls, calibrations, updates on capabilities (e.g., recalls, limits/expansion of use, indications, contra-indications, etc.), etc.


Another example of data from a remote data source includes gold standard of care or outcomes improvement data, such as gold standard of care or outcomes improvement data from a cloud computing system. Gold standard of care or outcomes improvement data can include, for example, improved techniques of device use and/or device combinations.


In one embodiment, Apache® Hadoop®, which is an open source software framework, may be used for distributed processing of data across computer systems.


Examples of types of synthetically generated data may include synthetic text, media (e.g., video, image, sound, etc.), tabular data, and a calculated continuous stream of data. The calculated continuous stream of data may be randomly generated (bracketed by extreme thresholds) or may be based off of another stream or a real continuous data stream that is modified to fit the stream limits of the expected synthetic stream. Reasons for using synthetically generated data can include for training data streams; because of missing data from an expected system that would otherwise draw a device error but is not relevant to the operation of the device or other dependent device; for data streams designed to verify the operation of the transforms or mathematical algorithms; for data streams intended to either verify security or prevent fraud/inauthenticity; for consecutive timing data for redaction of real-time from the relational data of the systems; for creation of trending data for replacement of legal-compliance-regulated data streams (e.g., producing datasets from partially synthetic data, where only a selection of the dataset is replaced with synthetic data); and/or for a sudden but anticipatable/explainable change in a data source's feed which is being used as a closed loop control for a destination.
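The two options for a calculated continuous stream noted above (randomly generated within extreme thresholds, or a real stream rescaled to fit expected limits) might be sketched as follows; the function names and parameters are illustrative assumptions:

```python
# Hypothetical sketch of the two synthetic-stream options described above.
import random

def random_stream(n, lo, hi):
    """Randomly generated stream, bracketed by extreme thresholds."""
    return [random.uniform(lo, hi) for _ in range(n)]

def fitted_stream(real, lo, hi):
    """Real stream modified to fit the limits of the expected synthetic stream."""
    r_lo, r_hi = min(real), max(real)
    span = (r_hi - r_lo) or 1.0
    return [lo + (x - r_lo) * (hi - lo) / span for x in real]

print(random_stream(5, 0.0, 1.0))
print(fitted_stream([96, 97, 99, 98], lo=0.0, hi=1.0))
```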


As an example of the use of synthetically generated data in a surgical context in which a surgical procedure is being performed on a patient, a PO2 sensor (data source) on the patient's finger may be used as a means for controlling a closed loop feed of O2 through a ventilator (data destination). The ventilator also has an internal closed loop on CO2 outlet concentration, but since O2 blood saturation is the desired fixed relationship to O2 supplementation level, the ventilator is using the PO2 sensor from the patient monitoring system. There may be an abrupt change in the O2 level as measured by the PO2 sensor. The ventilator has two choices: either switch to the trailing indicator of CO2, which has not had an abrupt change, or examine other data sources to try to explain the O2 shift. When compared to the patient's core body temperature measure, it may be discovered that the patient's temperature has dropped across a 1.5° C. below normal threshold that usually induces vasoconstriction limiting blood flow to the body's extremities. The PO2 measure is known by the ventilator, from its metadata, to be a finger monitor and therefore on an extremity. Further comparison over time may show the O2 measure fairly constant before the shift and then fairly constant after the shift as well, reinforcing the idea that vasoconstriction induced the shift. The ventilator may then create a synthetic data stream, based on the shift data pattern and behavior, that compensates for the vasoconstriction shift so the ventilator can continue on the primary linked feeds but using a modified synthetic or “calculated” data stream based off of a real stream. For example, current body temperature control systems, such as a Bair Hugger™ device, are open-loop, user-settable, heat-gradient-controlled systems but are affected by local temperature and environment.
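A simplified sketch of the compensation logic in this PO2 example follows. The stability tolerance and sample values are assumptions chosen for illustration; the check that the stream is fairly constant before and after the shift mirrors the reasoning above:

```python
# Hypothetical sketch: offset-compensate an abrupt, explainable shift in a
# PO2 stream so the primary linked feed can continue to be used.
from statistics import pstdev, mean

def compensated_stream(raw, shift_idx, stable_tol=0.6):
    before, after = raw[:shift_idx], raw[shift_idx:]
    stable = pstdev(before) < stable_tol and pstdev(after) < stable_tol
    if not stable:
        return raw                       # shift unexplained; do not synthesize
    offset = mean(before) - mean(after)  # size of the vasoconstriction shift
    return before + [x + offset for x in after]

po2 = [97, 97, 96, 97, 91, 90, 91, 90]  # abrupt shift at index 4
print(compensated_stream(po2, shift_idx=4))
```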


Broadcast Versus Peer-to-Peer Communication

In various aspects, the present disclosure provides methods, devices, and systems for managing data pipelines with broadcast versus peer-to-peer (P2P) communication. In a surgical context, managing data pipelines with broadcast versus P2P communication may allow for a data pipeline, such as the data pipeline 400 of FIGS. 6A and 6B, to be managed during performance of a surgical procedure on a patient so that sending and receiving systems, e.g., source and destination systems, can both differentiate between broadcast and specific-address exchange of data.


Managing data pipelines with broadcast versus P2P communication may include network management and interaction. In one embodiment, the network management and interaction may include the following. A recipient of data via a network, and any other system with the capacity of listening to the network, may have a conditional capability of individualized or globalized data discovery. A sender of data via the network may likewise have a capacity of sending data to a single location on the network or of broadcasting the data to all locations on the network. The differentiation between these two capabilities of speaking or listening may be dependent on situational implications to one system or the patient. The listener (listening agent) may shift from isolated listening on the network to broadcast listening on the network when its utilization results in unanticipated events or data, in order to identify why the shift has occurred. The talking system (talker) may shift to a broadcast header, implying that all systems attached to the network should listen, if the talking system determines a condition that poses a high risk of affecting the other systems or the welfare of the patient, allowing all systems on the network to display a warning to their user (which may be the same user or different users) and to take appropriate local action on their control loop systems to minimize the situation the broadcast is issuing about.
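The listener and talker behavior described above might be sketched as the following state machine; the state names and message format are assumptions:

```python
# Hypothetical sketch of the listener/talker state switching described above.
INDIVIDUALIZED, GLOBALIZED = "individualized", "globalized"

class Listener:
    def __init__(self, sources):
        self.state, self.sources = INDIVIDUALIZED, set(sources)

    def accepts(self, msg):
        if self.state == GLOBALIZED or msg.get("header") == "broadcast":
            return True
        return msg["src"] in self.sources

    def on_unanticipated_event(self):
        self.state = GLOBALIZED       # listen to all traffic to find the cause

    def on_resolution(self):
        self.state = INDIVIDUALIZED

class Talker:
    def header_for(self, high_risk: bool) -> str:
        # broadcast header implies all systems attached to the network listen
        return "broadcast" if high_risk else "directed"

listener = Listener(sources={"energy_generator"})
listener.on_unanticipated_event()
print(listener.accepts({"src": "pulse_oximeter"}))   # True while globalized
```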


For example, if a local system cannot be affected, the local system may not care to listen. If a source system is about to create a potentially globally affecting event, e.g., monopolar activation by an energy device in a surgical context, the source system may broadcast that it is about to create such an event. If the local system detects an unusual event, it can listen to all data from all sources in trying to resolve the cause.


In one embodiment, managing data pipelines with broadcast versus P2P communication may include managing how distributed processing and data pipe paths interact. The managing may include making real-time changes in hardware to redistribute processing with changes in data pipelines. For example, real-time FPGA reconfiguring may be performed to change both data pipeline architecture and processing location.


In one embodiment of making real-time changes in hardware, a configurable hardware solution (such as an FPGA) may allow data pipelines to be re-routed. These data pipelines may be routed permanently based on a specific configuration or re-routed dynamically based on the data that is being utilized at a point in time. This can allow a processor to then be allocated to the required functionality as required.


For example, in routing data pipelines permanently based on a specific configuration, the same configurable hardware solution may be manufactured and built into every tower or rack being assembled. Based on the units connected to the system, the hardware (e.g., FPGA) detects that system and then routes its data pipelines most appropriately to optimize that specific configuration. Alternatively, the same hardware solution may be loaded with firmware for that specific configuration as well at the point of manufacture.
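A software model of this configuration-based routing might look as follows, with the FPGA's behavior approximated by a lookup from detected units to processors; the routing table entries are hypothetical:

```python
# Hypothetical sketch of configuration-based routing: detect connected units
# and route data pipelines to optimize that specific configuration.
ROUTING_TABLE = {
    "visualization": "hub_processor_1",
    "or_orchestration": "hub_processor_2",
    "energy": "hub_processor_3",
}

def route(connected_units):
    """Return a fixed pipeline layout for this specific configuration."""
    return {u: ROUTING_TABLE[u] for u in connected_units if u in ROUTING_TABLE}

print(route(["visualization", "energy"]))
# {'visualization': 'hub_processor_1', 'energy': 'hub_processor_3'}
```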



FIG. 7A illustrates one embodiment of data pipelines routed permanently based on a specific configuration. In this illustrated embodiment, which is in a surgical context, an FPGA 900 has managed data pipelines by routing a first data pipeline 902 between a visualization system 904 and a first surgical hub processor 906, a second data pipeline 908 between a hub operating room (OR) orchestration system 910 and a second surgical hub processor 912, and a third data pipeline 914 between an energy system 916 and a third surgical hub processor 918.


For example, in re-routing data pipelines dynamically, processing and corresponding data pipelines may be re-allocated and re-routed based on the needs at the point in time. In this case, during use of a highly processing-intensive system, such as an advanced visualization system, a large number of mathematical calculations must all be performed in parallel. As a result, the system may temporarily suspend other processes and reallocate the processing to that one specific system (e.g., the advanced visualization system). This allows the system to more efficiently use resources for these specific tasks and then resume the suspended processes later on.
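A minimal sketch of this temporary re-allocation follows; the ProcessorPool class and the processor identifiers are assumptions for illustration:

```python
# Hypothetical sketch of dynamic re-allocation: suspend other processes and
# temporarily assign all processors to one processing-intensive system.
class ProcessorPool:
    def __init__(self, processors):
        self.assignment = {p: "idle" for p in processors}
        self.suspended = {}

    def reallocate_all(self, task):
        self.suspended = dict(self.assignment)      # remember prior work
        for p in self.assignment:
            self.assignment[p] = task               # all processors in parallel

    def resume(self):
        self.assignment, self.suspended = self.suspended, {}

pool = ProcessorPool(["p1006", "p1008", "p1010"])
pool.reallocate_all("advanced_visualization")
print(pool.assignment)
pool.resume()                                       # resume suspended processes
```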



FIG. 7B illustrates one embodiment of data pipelines re-routed dynamically. In this illustrated embodiment, which is in a surgical context, an FPGA 1000 may manage data pipelines by dynamically re-routing a data pipeline 1002 between a visualization system 1004 and first, second, and third surgical hub processors 1006, 1008, 1010.


In one embodiment of making real-time changes in hardware, an element controls how the data pipelines and processing interact. For example, the element may be an algorithm of the hardware, e.g., an FPGA. The algorithm, when executed by the hardware, may determine a data type coming into the hardware, the data pipeline required, and the processing firmware needed in order to consume that data appropriately. The algorithm may also determine whether a processor is already available to do so and, if not, update an available processor to accomplish it.
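One possible sketch of such a controlling algorithm is below; the mapping of data types to pipelines and firmware is hypothetical:

```python
# Hypothetical sketch: determine the data type, the pipeline it requires,
# and the firmware needed, then find or update an available processor.
REQUIREMENTS = {                    # data type -> (pipeline, firmware)
    "video": ("high_bandwidth", "image_fw"),
    "telemetry": ("low_latency", "signal_fw"),
}

def dispatch(data_type, processors):
    pipeline, firmware = REQUIREMENTS[data_type]
    for proc in processors:                       # processor already suitable?
        if proc["firmware"] == firmware and proc["free"]:
            return pipeline, proc["id"]
    for proc in processors:                       # else update an available one
        if proc["free"]:
            proc["firmware"] = firmware
            return pipeline, proc["id"]
    raise RuntimeError("no processor available")

procs = [{"id": "p1", "firmware": "signal_fw", "free": True}]
print(dispatch("video", procs))    # ('high_bandwidth', 'p1') after fw update
```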


For another example, the element may be a central system of the hardware, e.g., one unit within the entire system that controls how data pipelines and management are performed.


For another example, the element may be distributed control in which any system can interrupt the system and allow for control to be used for its own purposes. The interrupt may be dependent on priority or criticality.


In one embodiment of making real-time changes in hardware, dynamic implementation of transformations may be made within real time allocations of the data pipelines and processing, which may allow for efficient allocation of hardware resources. Certain transformations may not make sense to perform within a processing unit. As a result, transformations may make more sense to perform within the hardware distribution layer.


The dynamic implementation of transformations may include localized transformations in which each processor may have transformations allocated specifically to it. This may serve as an extension of parallel processing capabilities, but for mathematical processes that are more efficiently done in hardware rather than done in software by the processor itself.



FIGS. 7C and 7D illustrate embodiments of localized transformations for the systems of FIGS. 7A and 7B, respectively. As shown in FIG. 7C, transformations occur locally at the FPGA 900 along each of the first, second, and third data pipelines 902, 908, 914. As shown in FIG. 7D, transformations along the data pipeline 1002 occur locally at the FPGA 1000 for each of the three processors 1006, 1008, 1010.


The dynamic implementation of transformations may include single stage to multiple processors in which the re-configurable hardware layer, e.g., an FPGA, transforms data that is intended for multiple processors in one single location rather than the transformation taking place in multiple different processing locations.
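The contrast between localized transformations (FIGS. 7C and 7D) and a single-stage transform fanned out to multiple processors (FIG. 7E) might be sketched as follows, with a stand-in transform function:

```python
# Hypothetical sketch contrasting per-processor (localized) transformation
# with a single-stage transform applied once and fanned out.
def transform(x):
    return x * 2.0        # stand-in for a hardware-efficient math transform

def localized(data, processors):
    # each processor gets its own transformation along its pipeline
    return {p: [transform(x) for x in data] for p in processors}

def single_stage(data, processors):
    # transform once at the FPGA, then distribute the result to all processors
    once = [transform(x) for x in data]
    return {p: once for p in processors}

print(single_stage([1.0, 2.0], ["p1006", "p1008", "p1010"]))
```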



FIG. 7E illustrates one embodiment of dynamic implementation of transformations for the system of FIG. 7B. As shown in FIG. 7E, transformations along the data pipeline 1002 occur in one single location at the FPGA 1000.


In making real-time changes in hardware, analysis performed on all available datasets on a single computer may not be fast enough or may be impossible due to operational infrastructure requirements (such as storage space or memory). By distributing the processing over multiple computers, the hardware requirements per computer can be decreased and total processing time can also be lowered as each computer operates in parallel. While taking a mainframe approach, e.g., a single high-performance computer, may be possible, it is often not as cost-effective as a distributed approach. Such distributed processing and analysis are commonly done through distributed processing pipelines.
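As an illustration of the distributed approach, the following sketch partitions a dataset across worker processes that operate in parallel, using Python's multiprocessing as a stand-in for separate computers (Hadoop-style clusters would follow the same pattern at larger scale):

```python
# Hypothetical sketch: distribute analysis over parallel workers so the
# hardware requirements per "computer" decrease and total time is lowered.
from multiprocessing import Pool

def analyze(chunk):
    return sum(chunk) / len(chunk)     # placeholder per-partition analysis

if __name__ == "__main__":
    dataset = list(range(1_000))
    chunks = [dataset[i::4] for i in range(4)]   # 4 equal partitions
    with Pool(4) as pool:
        partials = pool.map(analyze, chunks)     # each worker in parallel
    print(sum(partials) / len(partials))
```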


For example, due to changes in the environment, external services may become unavailable or a new data source may provide data in a different format. The need may thus arise for a transformed data stream to be switched to an un-transformed data stream, and the main system (e.g., the system performing the managing and including an FPGA) may have insufficient resources to handle the data and transform it downstream. The switch may be done in real-time during operation in a procedure.


For another example, due to changes in business model and goals, different statistics may be calculated based on the same data as a result of business industrial demands.


For another example, due to upgrading of processing models or parameters, there may be a need to fix errors in the data pipeline code to provide new functionality, to introduce more accurate algorithms, or to tweak and tune algorithm parameters for better results.


A first approach that may be used to update distributed processing pipelines requires stopping a running data pipeline and then starting a new updated version. This approach is not always appropriate or possible, such as in the case of permanent monitoring and controlling systems that need to be operational 24/7 or for batch processing pipelines that are in the middle of a long-term computation. In these cases, the resulting downtime or loss of progress may not be affordable, or may simply be undesirable.


A second approach that may be used to update distributed processing pipelines includes executing a new updated version in parallel with the old version and taking over processing when the new data pipeline is ready. If the processing resources required for a data pipeline are significant, running a new data pipeline in parallel is not always an option because of the limited infrastructure available or excessive extra costs required for it.


Flexibility may be added to distributed pipelines by updating variables of a data pipeline as discussed further, for example, in Albers et al., “Adaptive On-the-Fly Changes in Distributed Processing Pipelines,” Front Big Data, 2021 Nov. 26, 4:666174.
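The variable-update approach might be sketched as follows: a running pipeline reads shared parameters under a lock, so a change takes effect on the fly without stopping the pipeline or running a duplicate. All names here are illustrative:

```python
# Hypothetical sketch of on-the-fly updates to a running pipeline's variables.
import threading, queue, time

params = {"scale": 1.0}                 # variables the running pipeline reads
outbox, lock = queue.Queue(), threading.Lock()

def pipeline():
    for _ in range(3):
        with lock:
            scale = params["scale"]     # pick up updates on the fly
        outbox.put(10 * scale)
        time.sleep(0.01)

t = threading.Thread(target=pipeline)
t.start()
with lock:
    params["scale"] = 2.0               # on-the-fly change, no restart
t.join()
print([outbox.get() for _ in range(3)])
```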


In one embodiment, managing data pipelines with broadcast versus P2P communication may include managing when a single peer-to-peer connection would need to change to a multiple broadcast to two peers simultaneously. In other words, managing active switching between P2P and broadcast network topologies in which sending and receiving systems choose between broadcast and localized communication.


Broadcasting requires more local resources of all the receiving systems. While broadcasting is the most efficient way for a single smart system to identify what is available to utilize, it is very inefficient to transmit entire streams this way unless all systems, or at least multiple systems, can utilize the streams. With limited bandwidth and resources, a new system may initially broadcast its capabilities or even its data stream but then be told by a system that can use the data, or even a surgical hub or other system managing the data pipeline, to switch from broadcast to a peer-to-peer or directed packet feed method to limit the other systems on the network from having to interact with or deal with the excessive data presentation.
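A sketch of this capability handshake follows; the message formats and the switch_to_p2p command are assumptions, not a defined protocol:

```python
# Hypothetical sketch: a new system advertises by broadcast, then is told to
# switch to a directed peer-to-peer feed by a system that can use the data.
class NewSystem:
    def __init__(self):
        self.mode, self.peer = "broadcast", None

    def advertise(self):
        return {"header": "broadcast", "capabilities": ["pressure_stream"]}

    def on_command(self, cmd):
        if cmd["type"] == "switch_to_p2p":      # limit load on other systems
            self.mode, self.peer = "p2p", cmd["peer"]

dev = NewSystem()
print(dev.advertise())
dev.on_command({"type": "switch_to_p2p", "peer": "surgical_hub"})
print(dev.mode, dev.peer)     # p2p surgical_hub
```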


In one embodiment, changing to multiple broadcast to two peers simultaneously may include dynamic integration of a plurality of smart systems. An existing smart system may have formed a singular connection between two points, as that was all that was previously required. However, as the smart system expands, or is integrated with a second smart system, the efficiency and required latency of the smart systems requires the same information to be simultaneously sent to multiple places. As a result, assuming that broadcasting means saying the same thing to multiple recipients, the system may switch from a peer-to-peer relationship to a broadcast network relationship.


For example, in a surgical context, during performance of a surgical procedure, a first monitor, e.g., a HID, may be in a room (e.g., an OR) for one HCP's visualization of a surgical site by displaying images from a visualization system, and thereafter a second monitor, e.g., a HID, may be brought into the room to improve visualization of the surgical site for another HCP. Broadcasting by the visualization system may thus change from P2P broadcasting to the first monitor to multiple broadcast to the first and second monitors.


One example of a trigger for dynamic integration of a plurality of smart systems may be recovery to re-integrate a lost smart system. For example, a peer may have dropped from an existing P2P connection. Once the P2P connection has been lost, the system attempts to preserve functionality by shifting to a broadcast network.


Another example of a trigger for dynamic integration of a plurality of smart systems may be cooperating between existing smart systems. For example, in a surgical context, a surgeon and/or other HCP may want to pair a device a single time. When the surgeon and/or other HCP pairs a smart peripheral to a smart central system, a P2P network is established. However, the smart central system has been pre-configured to work with another smart system and thus informs the peripheral to switch to broadcast to send data to the other established systems.


For example, FIG. 8A illustrates one embodiment of first and second single star networks 1100, 1102. The first network 1100 includes a first central system 1104 paired with each of first and second peripheral devices 1106, 1108. The second network 1102 includes a second central system 1110 paired with each of third and fourth peripheral devices 1112, 1114. FIG. 8B illustrates the first and second networks 1100, 1102 reconfigured with broadcasting in which the first central system 1104 is now also in communication with the third and fourth peripheral devices 1112, 1114 and the second central system 1110 is now also in communication with the first and second peripheral devices 1106, 1108.


In one embodiment, changing to multiple broadcast to two peers simultaneously may include predefined switching between message relationships. A global broadcast emergency message may be sent, with subsequent peer-to-peer messages based on message type.


For example, in a surgical context, a surgical hub may change to broadcasting to every smart device in the system during performance of a surgical procedure when the surgical procedure reaches step #15 of the surgical procedure. In addition, because step #15 of the surgical procedure has been reached, the surgical hub may tell an energy generator (one of the smart devices) its electrical current limits, may broadcast to an endocutter (another one of the smart devices) that 45 mm is what is required, and may inform a pulse oximeter (another one of the smart devices) that its alarm settings need to be configured based on expected body temperature and sedation levels.
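This step #15 example might be sketched as follows, with one global broadcast followed by peer-to-peer messages by device type; the setting values are placeholders, not recommended parameters:

```python
# Hypothetical sketch of predefined switching: a global broadcast that a
# procedure step was reached, then peer-to-peer messages by device type.
def on_step_reached(step, send_broadcast, send_p2p):
    send_broadcast({"step": step})               # every smart device listens
    if step == 15:
        send_p2p("energy_generator", {"current_limit": "per step 15 profile"})
        send_p2p("endocutter", {"cartridge_mm": 45})
        send_p2p("pulse_oximeter", {"alarms": "body temp + sedation adjusted"})

on_step_reached(15,
                send_broadcast=lambda m: print("ALL <-", m),
                send_p2p=lambda dev, m: print(dev, "<-", m))
```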


In one embodiment, changing to multiple broadcast to two peers simultaneously may include switching of data dependent upon data types. For example, if a smart system determines information is protected, the smart system, e.g., an FPGA of the smart system, may allocate the information to a more protected internal data pipeline to allow it to get to its destination.


In one embodiment, changing to multiple broadcast to two peers simultaneously may include allocation of partial messages to multiple streams within a broadcast in which the same data is being sent to multiple locations. FIGS. 9A and 9B illustrate embodiments of allocation of partial messages to multiple streams within a broadcast.


In one embodiment, changing to multiple broadcast to two peers simultaneously may include simultaneous relationships of broadcasting and peer-to-peer in which there may be simultaneous transfer and/or broadcast of data to different devices for security. Multiple bands may be used, such as sending a key over Bluetooth with data transferred over Wi-Fi, or all keys may be sent P2P, with data then broadcast to everyone and the key required to decrypt the data.
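A sketch of this multi-band scheme follows, using the Python cryptography package's Fernet primitive as a stand-in for the encryption; the transport (Bluetooth/Wi-Fi) is simulated by function calls:

```python
# Hypothetical sketch: keys sent peer-to-peer, data then broadcast to all,
# with the key required to decrypt. Requires: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # sent P2P (e.g., over Bluetooth)
authorized = {"monitor_1": key}      # only paired peers hold the key

token = Fernet(key).encrypt(b"SpO2=97")   # broadcast to all (e.g., over Wi-Fi)

def receive(name, token):
    k = authorized.get(name)
    return Fernet(k).decrypt(token) if k else None   # others cannot decrypt

print(receive("monitor_1", token))   # b'SpO2=97'
print(receive("intruder", token))    # None
```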


In one embodiment, changing to multiple broadcast to two peers simultaneously may include either a source of a data stream or a destination of the data stream controlling between transmitting to all systems or listening to all systems. For example, the control may be data-source-based decisions of when to shift from private/direct/local communications to broadcast communications. In this mode, a source of a data feed may detect some emergency or irregularity or potential upcoming perturbance and notify all stations of the potential for interfering with their use of the data, the viability of the data, or an emergency aspect of the data. For another example, recipient-based conditional listening may be used to find situational context. A receiving system may switch from listening to data directed to a smart system to listening to all traffic available to the system. If the smart system is experiencing issues with its control loop, or the data it is receiving appears suspect, it could switch to start listening to all traffic, or more traffic being exchanged from the same source and other systems or between all systems, in order to add context to its data feed in an effort to identify the cause of its error or resolve the instability.



FIG. 10 illustrates one embodiment of a method 1200 of managing data pipelines with broadcast versus P2P communication. The method 1200 may include changing 1202, during performance of a surgical procedure on a patient, a first surgical system between an individualized listening state, in which the first surgical system is a destination of dataflows from a first selected subset of a plurality of surgical systems, and a globalized listening state, in which the first surgical system is a destination of dataflows from all of the plurality of surgical systems. The plurality of surgical systems may include the first surgical system, a second surgical system, and at least one additional surgical system. The plurality of surgical systems may be configured to all be in use during the performance of the surgical procedure.


The first surgical system may change 1202 from the individualized listening state to the globalized listening state in response to the first surgical system detecting occurrence of an anomalous event during the performance of the surgical procedure. Further, the first surgical system may change from the globalized listening state to the individualized listening state in response to resolution of the anomalous event.


The method 1200 may also include changing 1204, during the performance of the surgical procedure, a second surgical system between an individualized broadcast state, in which the second surgical system is a source of dataflows to a second selected subset of the plurality of surgical systems, and a globalized broadcast state, in which the second surgical system is a source of dataflows to all of the plurality of surgical systems. The dataflows to the first surgical system and the dataflows from the second surgical system may include data collected in relation to and in real time with the performance of the surgical procedure and may include at least one of patient data, surgical procedure data, and surgical instrument data.


The second surgical system may change 1204 from the individualized broadcast state to the globalized broadcast state in response to the second surgical system having an emergency broadcast to transmit to all of the plurality of surgical systems, and, in the globalized broadcast state, the second surgical system may transmit the emergency broadcast to all of the plurality of surgical systems. Further, the second surgical system may change from the globalized broadcast state to the individualized broadcast state after the transmission of the emergency broadcast, receipt of the emergency broadcast at the plurality of surgical systems may cause each of the plurality of surgical systems that includes a display to show an alert on the display, the emergency broadcast may inform the plurality of surgical systems of a step reached in the surgical procedure being performed and may inform at least one of the plurality of surgical systems of a setting needed at the at least one of the plurality of surgical systems during the performance of the step of the surgical procedure, and/or the first surgical system may determine, based on the emergency broadcast, whether to include the second surgical system in the first selected subset of the plurality of surgical systems.


Computer Systems

A computer system may be suitable for use in implementing computerized components described herein. In broad overview of an exemplary embodiment, the computer system may include a processor configured to perform actions in accordance with instructions, and memory devices configured to store instructions and data. The processor may be in communication, via a bus, with the memory (and/or incorporates the memory) and with at least one network interface controller with a network interface for connecting to external devices, e.g., a computer system (such as a mobile phone, a tablet, a laptop, a server, etc.). The processor may also be configured to be in communication, via the bus, with any other processor(s) of the computer system and with any I/O devices at an I/O interface. Generally, a processor will execute instructions received from the memory. In some embodiments, the computer system can be configured within a cloud computing environment, a virtual or containerized computing environment, and/or a web-based microservices environment.


In more detail, the processor can be any logic circuitry that processes instructions, e.g., instructions fetched from the memory. In many embodiments, the processor may be an embedded processor, a microprocessor unit (MPU), microcontroller unit (MCU), field-programmable gate array (FPGA), or special purpose processor. The computer system can be based on any processor, e.g., suitable digital signal processor (DSP), or set of processors, capable of operating as described herein. In some embodiments, the processor can be a single core or multi-core processor. In some embodiments, the processor can be composed of multiple processors.


The memory can be any device suitable for storing computer readable data. The memory can be a device with fixed storage or a device for reading removable storage media. Examples include all forms of non-volatile memory, media and memory devices, semiconductor memory devices (e.g., EPROM, EEPROM, SDRAM, flash memory devices, and all types of solid state memory), magnetic disks, and magneto optical disks. A computer system can have any number of memory devices.


The memory also can include a cache memory, which is generally a form of high-speed computer memory placed in close proximity to the processor for fast read/write times. In some embodiments, the cache memory is part of, or on the same chip as, the processor.


The network interface controller may be configured to manage data exchanges via the network interface. The network interface controller may handle the physical, media access control, and data link layers of the Open Systems Interconnect (OSI) model for network communication. In some embodiments, some of the network interface controller's tasks may be handled by the processor. In some embodiments, the network interface controller may be part of the processor. In some embodiments, a computer system may have multiple network interface controllers. In some implementations, the network interface may be a connection point for a physical network link, e.g., an RJ-45 connector. In some embodiments, the network interface controller may support wireless network connections and an interface port may be a wireless Bluetooth transceiver. Generally, a computer system can be configured to exchange data with other network devices via physical or wireless links to a network interface. In some embodiments, the network interface controller may implement a network protocol such as LTE, TCP/IP, Ethernet, IEEE 802.11, IEEE 802.16, Bluetooth, or the like.


In some uses, the I/O interface may support an input device and/or an output device. In some uses, the input device and the output device may be integrated into the same hardware, e.g., as in a touch screen. In some uses, such as in a server context, there may be no I/O interface or the I/O interface may not be used. In some uses, additional other components may be in communication with the computer system, e.g., external devices connected via a universal serial bus (USB). In some embodiments, an I/O device may be incorporated into the computer system, e.g., a touch screen on a tablet device.


In some implementations, a computer device may include an additional device such as a co-processor, e.g., a math co-processor configured to assist the processor with high precision or complex calculations.


CONCLUSION

Certain illustrative implementations have been described to provide an overall understanding of the principles of the structure, function, manufacture, and use of the systems, devices, and methods disclosed herein. One or more examples of these implementations have been illustrated in the accompanying drawings. Those skilled in the art will understand that the systems, devices, and methods specifically described herein and illustrated in the accompanying drawings are non-limiting illustrative implementations and that the scope of the present invention is defined solely by the claims. The features illustrated or described in connection with one illustrative implementation may be combined with the features of other implementations. Such modifications and variations are intended to be included within the scope of the present invention. Further, in the present disclosure, like-named components of the implementations generally have similar features, and thus within a particular implementation each feature of each like-named component is not necessarily fully elaborated upon.


Approximating language, as used herein throughout the specification and claims, may be applied to modify any quantitative representation that can permissibly vary without resulting in a change in the basic function to which it is related. Accordingly, a value modified by a term or terms, such as “about,” “approximately,” and “substantially,” is not to be limited to the precise value specified. In at least some instances, the approximating language may correspond to the precision of an instrument for measuring the value. Here and throughout the specification and claims, range limitations may be combined and/or interchanged; such ranges are identified and include all the sub-ranges contained therein unless context or language indicates otherwise.


One skilled in the art will appreciate further features and advantages of the invention based on the above-described implementations. Accordingly, the present application is not to be limited by what has been particularly shown and described, except as indicated by the appended claims. All publications and references cited herein are expressly incorporated herein by reference in their entirety for all purposes.

Claims
  • 1. A surgical data management system, comprising: a plurality of surgical systems including a first surgical system, a second surgical system, and at least one additional surgical system; wherein: the plurality of surgical systems are configured to all be in use during performance of a surgical procedure on a patient; the first surgical system is configured to, during the performance of the surgical procedure, change between an individualized listening state, in which the first surgical system is a destination of dataflows from a first selected subset of the plurality of surgical systems, and a globalized listening state, in which the first surgical system is a destination of dataflows from all of the plurality of surgical systems; the second surgical system is configured to, during the performance of the surgical procedure, change between an individualized broadcast state, in which the second surgical system is a source of dataflows to a second selected subset of the plurality of surgical systems, and a globalized broadcast state, in which the second surgical system is a source of dataflows to all of the plurality of surgical systems; and the dataflows to the first surgical system and the dataflows from the second surgical system include data collected in relation to and in real time with the performance of the surgical procedure and including at least one of patient data, surgical procedure data, and surgical instrument data.
  • 2. The surgical data management system of claim 1, wherein the first surgical system is configured to change from the individualized listening state to the globalized listening state in response to the first surgical system detecting occurrence of an anomalous event during the performance of the surgical procedure.
  • 3. The surgical data management system of claim 2, wherein the first surgical system is configured to change from the globalized listening state to the individualized listening state in response to resolution of the anomalous event.
  • 4. The surgical data management system of claim 1, wherein the first surgical system is configured to transmit information to all of the plurality of surgical systems indicative of dataflows to be broadcast from the first surgical system; and the second surgical system is configured to determine, based on the information received from the first surgical system, whether to include the first surgical system in the second selected subset of the plurality of surgical systems.
  • 5. The surgical data management system of claim 1, wherein the second surgical system is configured to change from the individualized broadcast state to the globalized broadcast state in response to the second surgical system having an emergency broadcast to transmit to all of the plurality of surgical systems; and in the globalized broadcast state, the second surgical system is configured to transmit the emergency broadcast to all of the plurality of surgical systems.
  • 6. The surgical data management system of claim 5, wherein the second surgical system is configured to change from the globalized broadcast state to the individualized broadcast state after the transmission of the emergency broadcast.
  • 7. The surgical data management system of claim 5, wherein receipt of the emergency broadcast at the plurality of surgical systems is configured to cause each of the plurality of surgical systems that includes a display to show an alert on the display.
  • 8. The surgical data management system of claim 5, wherein the emergency broadcast is configured to inform the plurality of surgical systems of a step reached in the surgical procedure being performed; and the emergency broadcast is configured to inform at least one of the plurality of surgical systems of a setting needed at the at least one of the plurality of surgical systems during the performance of the step of the surgical procedure.
  • 9. The surgical data management system of claim 5, wherein the first surgical system is configured to determine, based on the emergency broadcast, whether to include the second surgical system in the first selected subset of the plurality of surgical systems.
  • 10. The surgical data management system of claim 1, wherein, in the individualized broadcast state, the second surgical system uses discrete addressing; and in the globalized broadcast state, the second surgical system uses multiple distribution addressing.
  • 11. The surgical data management system of claim 1, wherein the second surgical system is configured to transmit information to all of the plurality of surgical systems indicative of dataflows to be broadcast from the second surgical system; and the first surgical system is configured to determine, based on the information received from the second surgical system, whether to include the second surgical system in the first selected subset of the plurality of surgical systems.
  • 12. The surgical data management system of claim 1, wherein the plurality of surgical systems includes a first display configured to display surgical information during the performance of the surgical procedure on the patient; the first display is included in the second selected subset of the plurality of surgical systems; and in response to a second display being added to the plurality of surgical systems configured to all be in use during the performance of the surgical procedure on the patient, the second surgical system is configured to add the second display to the second selected subset of the plurality of surgical systems.
  • 13. The surgical data management system of claim 1, wherein the second surgical system is configured to broadcast to all of the plurality of surgical systems a notice indicative of an upcoming change of the second surgical system from the individualized broadcast state to the globalized broadcast state.
  • 14. The surgical data management system of claim 1, further comprising a surgical hub configured to be communicatively coupled with all of the plurality of surgical systems during the performance of the surgical procedure on the patient; and in response to the surgical hub being communicatively coupled with a one of the plurality of surgical systems, the surgical hub is configured to at least one of: add the one of the plurality of surgical systems to the first selected subset of the plurality of surgical systems, and add the one of the plurality of surgical systems to the second selected subset of the plurality of surgical systems.
  • 15. The surgical data management system of claim 14, wherein the surgical hub is configured to add the one of the plurality of surgical systems to the first selected subset of the plurality of surgical systems in response to detection that the one of the plurality of surgical systems is pre-configured for use with the first surgical system, and the surgical hub is configured to add the one of the plurality of surgical systems to the second selected subset of the plurality of surgical systems in response to detection that the one of the plurality of surgical systems is pre-configured for use with the second surgical system.
  • 16. The surgical data management system of claim 1, further comprising a surgical hub configured to be communicatively coupled with the plurality of surgical systems during the performance of the surgical procedure, configured to cause the change for the first surgical system, and configured to cause the change for the second surgical system.
  • 17. The surgical data management system of claim 1, wherein the first surgical system is configured to cause the change for the first surgical system, and the second surgical system is configured to cause the change for the second surgical system.
  • 18. The surgical data management system of claim 1, wherein each of the plurality of surgical systems is one of a hospital network, a database, a surgical instrument, or a surgical cart.
  • 19. A surgical data management system, comprising: a processor; and a memory storing instructions that, when executed by the processor, cause the processor to perform operations comprising: change, during performance of a surgical procedure on a patient, a first surgical system between an individualized listening state, in which the first surgical system is a destination of dataflows from a first selected subset of a plurality of surgical systems, and a globalized listening state, in which the first surgical system is a destination of dataflows from all of the plurality of surgical systems, wherein the plurality of surgical systems includes the first surgical system, a second surgical system, and at least one additional surgical system, and the plurality of surgical systems are configured to all be in use during the performance of the surgical procedure, and change, during the performance of the surgical procedure, a second surgical system between an individualized broadcast state, in which the second surgical system is a source of dataflows to a second selected subset of the plurality of surgical systems, and a globalized broadcast state, in which the second surgical system is a source of dataflows to all of the plurality of surgical systems; wherein the dataflows to the first surgical system and the dataflows from the second surgical system include data collected in relation to and in real time with the performance of the surgical procedure and include at least one of patient data, surgical procedure data, and surgical instrument data.
  • 20. A computer-implemented method, comprising: changing, during performance of a surgical procedure on a patient, a first surgical system between an individualized listening state, in which the first surgical system is a destination of dataflows from a first selected subset of a plurality of surgical systems, and a globalized listening state, in which the first surgical system is a destination of dataflows from all of the plurality of surgical systems, wherein the plurality of surgical systems includes the first surgical system, a second surgical system, and at least one additional surgical system, and the plurality of surgical systems are all in use during the performance of the surgical procedure; and changing, during the performance of the surgical procedure, a second surgical system between an individualized broadcast state, in which the second surgical system is a source of dataflows to a second selected subset of the plurality of surgical systems, and a globalized broadcast state, in which the second surgical system is a source of dataflows to all of the plurality of surgical systems; wherein the dataflows to the first surgical system and the dataflows from the second surgical system include data collected in relation to and in real time with the performance of the surgical procedure and including at least one of patient data, surgical procedure data, and surgical instrument data.
CROSS REFERENCE TO RELATED APPLICATIONS

The present application claims priority to U.S. Provisional Patent Application No. 63/603,031 entitled “Smart Surgical Systems” filed Nov. 27, 2023, which is hereby incorporated by reference in its entirety. The subject matter of the present application is related to the following patent applications filed on Nov. 26, 2024, which are hereby incorporated by reference in their entireties: U.S. application Ser. No. 18/960,006 entitled “Methods For Smart Surgical Systems,” U.S. application Ser. No. 18/960,032 entitled “Data Flow Management Between Surgical Systems,” U.S. application Ser. No. 18/960,047 entitled “Mapping Data Pipelines For Surgical Systems,” U.S. application Ser. No. 18/960,070 entitled “Data Lifecycle Management For Surgical Systems,” U.S. application Ser. No. 18/960,081 entitled “Data Transformation For Surgical Systems,” U.S. application Ser. No. 18/960,094 entitled “Geofencing For Surgical Systems,” U.S. application Ser. No. 18/960,107 entitled “Information Discrimination For Surgical Instruments,” and U.S. application Ser. No. 18/960,117 entitled “Adaptation Of Data Pipelines For Surgical Systems.”

Provisional Applications (1): Application No. 63/603,031, filed November 2023 (US)