DATA LIFECYCLE MANAGEMENT FOR SURGICAL SYSTEMS

Information

  • Patent Application
    20250174315
  • Publication Number
    20250174315
  • Date Filed
    November 26, 2024
  • Date Published
    May 29, 2025
  • CPC
    • G16H10/00
  • International Classifications
    • G16H10/00
Abstract
Surgical systems and related computer-implemented methods are provided, including, during performance of a surgical procedure on a patient, receiving, at a first surgical system, a first dataflow from a second surgical system, the first dataflow including first data regarding a measured patient parameter that the first surgical system is configured to use in performing a function during the performance of the surgical procedure. The computer-implemented methods also include determining that a trigger event occurred during the performance of the surgical procedure such that a sum of a first bandwidth of the first dataflow and a second bandwidth of a second dataflow exceeds an available bandwidth, and, in response to determining that the trigger event occurred, and during the performance of the surgical procedure, adjusting at least one of the first and second dataflows such that the sum of the first and second bandwidths does not exceed the available bandwidth.
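The bandwidth-adjustment behavior summarized in the abstract can be sketched roughly as follows. This is an illustrative sketch only; the names (Dataflow, priority, throttle_dataflows) and the priority-based reduction strategy are assumptions, not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Dataflow:
    name: str
    bandwidth: float  # current bandwidth, e.g., in Mbps
    priority: int     # lower value = higher priority

def throttle_dataflows(flows, available_bw):
    """On a trigger event (the summed bandwidth of the flows exceeding the
    available bandwidth), reduce the lowest-priority flow(s) until the sum
    no longer exceeds what is available."""
    excess = sum(f.bandwidth for f in flows) - available_bw
    for f in sorted(flows, key=lambda f: f.priority, reverse=True):
        if excess <= 0:
            break
        cut = min(f.bandwidth, excess)
        f.bandwidth -= cut
        excess -= cut
    return flows
```

In this sketch a flow carrying a measured patient parameter would be given a high priority (low number) so that, when bandwidth runs short mid-procedure, less critical flows are reduced first.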
Description
FIELD

The present disclosure relates generally to smart surgical devices, systems, and methods.


BACKGROUND

Surgical operations and environments have benefited from advances in technology. These advances include upgraded equipment, therapeutics, techniques, and more, which have resulted in more favorable outcomes for both patients and healthcare personnel. Further benefits can be realized through the continued advancement of technology and the continued integration of such advancements into surgical operations and environments.


Computers are more and more ubiquitous in everyday life, and as the power of computers and computing systems increases, larger quantities of data can be processed in ways that render meaningful results and information for end users. This type of big data processing has immense benefits for surgical operations and environments as well, as more information can be distilled into meaningful assistance to a user, such as a surgeon, for use and reliance during surgical operations. Ultimately, this additional information for the user can result in even more favorable outcomes for both patients and healthcare personnel.


SUMMARY

In general, smart surgical devices, systems, and methods are provided.


In one embodiment, a computer-implemented method is provided that includes receiving, at a destination surgical system from a source surgical system, and during a performance of a surgical procedure on a patient, a first data stream including first data collected during the performance of the surgical procedure. The first data stream has associated first metadata. The first metadata includes an indication of a retention period for the first data stream. The first data includes information regarding at least two of patient data, surgical procedure data, and surgical instrument data. The method also includes storing the received first data stream at the destination surgical system. The method also includes, after the storing of the received first data stream, modifying the retention period for the first data stream based on: an aspect of the first data, a usefulness of the first data, a score of the first data, or a situational change of the source surgical system.
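The retention-period mechanism of this embodiment can be sketched as follows: a stored stream carries its retention period in metadata, and that period is later modified based on usefulness, a score, or a situational change of the source system. All names (DataStream, modify_retention) and the specific adjustment rules below are assumptions for illustration, not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class DataStream:
    data: dict           # e.g., patient data plus surgical-instrument data
    retention_days: int  # retention period carried in the stream's metadata

def modify_retention(stream, *, usefulness=None, score=None,
                     source_recalled=False):
    """Modify the retention period after the stream has been stored, based
    on the data's usefulness, a score of the data, or a situational change
    of the source surgical system."""
    if source_recalled:
        # A recall of the source system: retain the data much longer.
        stream.retention_days = max(stream.retention_days, 365 * 7)
    elif usefulness is not None and usefulness < 0.1:
        # Rarely used data: shorten the retention period.
        stream.retention_days = min(stream.retention_days, 30)
    elif score is not None:
        # Scale retention with the data's score.
        stream.retention_days = int(stream.retention_days * (1 + score))
    return stream
```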


The method can have any number of variations. For example, the retention period can be modified based on the aspect of the first data, and the aspect of the first data can include one of a current value of the first data, a predicted future value of the data, a taxonomy of the first data, an age of the first data, and a change in a relationship of the patient with the patient's health care provider.


For another example, the retention period can be modified based on the aspect of the first data, and the aspect of the first data can include a current value of the first data and a predicted future value of the data.


For yet another example, the retention period can be modified based on the aspect of the first data, the aspect of the first data can include presence of patient-identifying data in the first data, the first data can be pruned of the patient-identifying data after expiration of the modified retention period, and a remainder of the first data can remain stored until after expiration of the retention period.
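The pruning variation above, in which patient-identifying data is removed after the modified retention period while the remainder stays stored, might look like the following sketch. The set of fields treated as patient-identifying is an assumption for illustration.

```python
# Assumed set of patient-identifying fields for this sketch.
PATIENT_ID_FIELDS = {"patient_name", "mrn", "date_of_birth"}

def prune_patient_identifiers(record):
    """After the modified retention period expires, return a copy of the
    record with patient-identifying fields removed, keeping the remainder
    of the data available for continued storage."""
    return {k: v for k, v in record.items() if k not in PATIENT_ID_FIELDS}
```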


For another example, the retention period can be modified based on the aspect of the first data, and the aspect of the first data can include a taxonomy of the first data.


For yet another example, the retention period can be modified based on the usefulness of the first data, and the usefulness of the first data can include one of: a frequency of use of the first data by the destination surgical system, and relevance of the first data to patient outcome following the performance of the surgical procedure.


For still another example, the retention period can be modified based on the situational change of the source surgical system, and the situational change can include one of: a recall of the source surgical system, a health hazard evaluation of the source surgical system, a failure rate of the source surgical system, and a failure severity of the source surgical system.


For yet another example, the retention period can be modified based on a score of the first data, and the method can also include determining the score of the first data based on at least one of a quality of the first data, a protection level of the first data, a type of the first data, and a taxonomy of the first data.
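The score determination above could be sketched as a simple combination of the named factors; the equal weighting and the normalization of each factor to [0, 1] are assumptions for illustration.

```python
def score_data(quality, protection_level, type_weight, taxonomy_weight):
    """Determine a score for the data from its quality, protection level,
    type, and taxonomy, each assumed to be normalized to [0, 1]."""
    factors = (quality, protection_level, type_weight, taxonomy_weight)
    return sum(factors) / len(factors)
```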


For still another example, the method can also include creating the first data stream at the source surgical system, and the first metadata, including the indication of the retention period for the first data stream, can be generated when the first data stream is created.


For another example, the first data stream can remain stored at the destination surgical system until after expiration of the modified retention period.


For still another example, the method can also include, after the modifying of the retention period, modifying the retention period again based on: an aspect of the first data, a usefulness of the first data, a score of the first data, or a situational change of the destination surgical system.


For another example, the destination surgical system can include a surgical hub, and the source surgical system can be one of a hospital network, a database, a surgical instrument, or a surgical cart. Further, the method can also include receiving, at the surgical hub from a second source surgical system, and during the performance of the surgical procedure on the patient, a second data stream including second data collected during the performance of the surgical procedure; the second data stream can have associated second metadata, the second metadata can include an indication of a retention period for the second data stream, the second data can include information regarding at least two of patient data, surgical procedure data, and surgical instrument data; the method can also include storing the received second data stream at the surgical hub; and the method can also include, after the storing of the received second data stream, modifying the retention period for the second data stream based on: an aspect of the second data, a usefulness of the second data, a score of the second data, or a situational change of the second source surgical system.
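The hub-centric variation above, where a surgical hub receives streams from multiple source systems during one procedure and tracks a retention period per stream, can be sketched as follows; the class and field names are hypothetical.

```python
class SurgicalHub:
    """Destination system that stores per-source data streams, each with
    its own retention period taken from the stream's metadata."""

    def __init__(self):
        self.streams = {}  # source name -> stored stream record

    def receive(self, source, data, retention_days):
        # Store the received stream along with its metadata-derived
        # retention period.
        self.streams[source] = {"data": data,
                                "retention_days": retention_days}

    def modify_retention(self, source, new_days):
        # Modify the retention period of one stream after storage,
        # independently of the other streams.
        self.streams[source]["retention_days"] = new_days
```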


For another example, each of the source and destination surgical systems can be one of a hospital network, a database, a surgical instrument, or a surgical cart, e.g., both being surgical instruments, one being a surgical instrument and the other being a database, etc.


In another embodiment, a surgical data management system is provided that includes a processor and a memory storing instructions that, when executed by the processor, cause the processor to perform operations including: receiving, at a destination surgical system from a source surgical system, and during a performance of a surgical procedure on a patient, a first data stream including first data collected during the performance of the surgical procedure. The first data stream has associated first metadata. The first metadata includes an indication of a retention period for the first data stream. The first data includes information regarding at least two of patient data, surgical procedure data, and surgical instrument data. The operations also include storing of the received first data stream at the destination surgical system, and, after the storing of the received first data stream, modifying of the retention period for the first data stream based on: an aspect of the first data, a usefulness of the first data, a score of the first data, or a situational change of the source surgical system.


The surgical data management system can vary in any number of ways. For example, the destination surgical system can include a surgical hub, and the source surgical system can be one of a hospital network, a database, a surgical instrument, or a surgical cart. Further, the operations can also include receiving, at the surgical hub from a second source surgical system, and during the performance of the surgical procedure on the patient, a second data stream including second data collected during the performance of the surgical procedure; the second data stream can have associated second metadata, the second metadata can include an indication of a retention period for the second data stream, the second data can include information regarding at least two of patient data, surgical procedure data, and surgical instrument data; the operations can also include storing the received second data stream at the surgical hub; and the operations can also include, after the storing of the received second data stream, modifying the retention period for the second data stream based on: an aspect of the second data, a usefulness of the second data, a score of the second data, or a situational change of the second source surgical system.


For another example, the retention period can be modified based on the aspect of the first data, and the aspect of the first data can include one of a current value of the first data, a predicted future value of the data, a taxonomy of the first data, an age of the first data, and a change in a relationship of the patient with the patient's health care provider.


For another example, the retention period can be modified based on the aspect of the first data, and the aspect of the first data can include a current value of the first data and a predicted future value of the data.


For yet another example, the retention period can be modified based on the aspect of the first data, the aspect of the first data can include presence of patient-identifying data in the first data, the first data can be pruned of the patient-identifying data after expiration of the modified retention period, and a remainder of the first data can remain stored until after expiration of the retention period.


For another example, the retention period can be modified based on the aspect of the first data, and the aspect of the first data can include a taxonomy of the first data.


For yet another example, the retention period can be modified based on the usefulness of the first data, and the usefulness of the first data can include one of: a frequency of use of the first data by the destination surgical system, and relevance of the first data to patient outcome following the performance of the surgical procedure.


For still another example, the retention period can be modified based on the situational change of the source surgical system, and the situational change can include one of: a recall of the source surgical system, a health hazard evaluation of the source surgical system, a failure rate of the source surgical system, and a failure severity of the source surgical system.


For yet another example, the retention period can be modified based on a score of the first data, and the operations can also include determining the score of the first data based on at least one of a quality of the first data, a protection level of the first data, a type of the first data, and a taxonomy of the first data.


For still another example, the operations can also include creating the first data stream at the source surgical system, and the first metadata, including the indication of the retention period for the first data stream, can be generated when the first data stream is created.


For another example, the first data stream can remain stored at the destination surgical system until after expiration of the modified retention period.


For still another example, the operations can also include, after the modifying of the retention period, modifying the retention period again based on: an aspect of the first data, a usefulness of the first data, a score of the first data, or a situational change of the destination surgical system.


For another example, each of the source and destination surgical systems can be one of a hospital network, a database, a surgical instrument, or a surgical cart, e.g., both being surgical instruments, one being a surgical instrument and the other being a database, etc.


In another embodiment, a computer-implemented method is provided that includes receiving, at a destination surgical system from a source surgical system, and during a performance of a surgical procedure on a patient, a first data stream including first data collected during the performance of the surgical procedure. The first data stream has associated first metadata. The first metadata includes an indication of a retention period for the first data stream. The first data includes information regarding at least two of patient data, surgical procedure data, and surgical instrument data. The method also includes storing of the received first data stream at the destination surgical system. The method also includes, after the storing of the received first data stream, modifying of the retention period for the first data stream based on: an aspect of the first data, a usefulness of the first data, a score of the first data, or a situational change of the source surgical system.


The method can vary in any number of ways. For example, the destination surgical system can include a surgical hub, and the source surgical system can be one of a hospital network, a database, a surgical instrument, or a surgical cart. Further, the method can also include receiving, at the surgical hub from a second source surgical system, and during the performance of the surgical procedure on the patient, a second data stream including second data collected during the performance of the surgical procedure; the second data stream can have associated second metadata, the second metadata can include an indication of a retention period for the second data stream, the second data can include information regarding at least two of patient data, surgical procedure data, and surgical instrument data; the method can also include storing the received second data stream at the surgical hub; and the method can also include, after the storing of the received second data stream, modifying the retention period for the second data stream based on: an aspect of the second data, a usefulness of the second data, a score of the second data, or a situational change of the second source surgical system.


For another example, the retention period can be modified based on the aspect of the first data, and the aspect of the first data can include one of a current value of the first data, a predicted future value of the data, a taxonomy of the first data, an age of the first data, and a change in a relationship of the patient with the patient's health care provider.


For yet another example, the retention period can be modified based on the aspect of the first data, and the aspect of the first data can include a current value of the first data and a predicted future value of the data.


For still another example, the retention period can be modified based on the aspect of the first data, the aspect of the first data can include presence of patient-identifying data in the first data, the first data can be pruned of the patient-identifying data after expiration of the modified retention period, and a remainder of the first data can remain stored until after expiration of the retention period.


For another example, the retention period can be modified based on the aspect of the first data, and the aspect of the first data can include a taxonomy of the first data.


For yet another example, the retention period can be modified based on the usefulness of the first data, and the usefulness of the first data can include one of: a frequency of use of the first data by the destination surgical system, and relevance of the first data to patient outcome following the performance of the surgical procedure.


For still another example, the retention period can be modified based on the situational change of the source surgical system, and the situational change can include one of: a recall of the source surgical system, a health hazard evaluation of the source surgical system, a failure rate of the source surgical system, and a failure severity of the source surgical system.


For yet another example, the retention period can be modified based on a score of the first data, and the method can also include determining the score of the first data based on at least one of a quality of the first data, a protection level of the first data, a type of the first data, and a taxonomy of the first data.


For still another example, the method can also include creating the first data stream at the source surgical system, and the first metadata, including the indication of the retention period for the first data stream, can be generated when the first data stream is created.


For another example, the first data stream can remain stored at the destination surgical system until after expiration of the modified retention period.


For still another example, the method can also include, after the modifying of the retention period, modifying the retention period again based on: an aspect of the first data, a usefulness of the first data, a score of the first data, or a situational change of the destination surgical system.


For another example, each of the source and destination surgical systems can be one of a hospital network, a database, a surgical instrument, or a surgical cart, e.g., both being surgical instruments, one being a surgical instrument and the other being a database, etc.





BRIEF DESCRIPTION OF DRAWINGS

The present invention is described by way of reference to the accompanying figures, which are as follows:



FIG. 1 is a schematic view of one embodiment of a computer-implemented surgical system;



FIG. 2 is a perspective view of one embodiment of a surgical system in one embodiment of a surgical operating room;



FIG. 3 is a schematic view of one embodiment of a surgical hub paired with various systems;



FIG. 4 is a schematic view of one embodiment of a situationally aware surgical system;



FIG. 5 is a perspective view of one embodiment of a surgical instrument and one embodiment of a surgical system that includes the surgical instrument;



FIG. 6A is a schematic view of a data pipeline architecture;



FIG. 6B is an expanded schematic view of the data pipeline architecture of FIG. 6A;



FIG. 7 is a graph depicting one embodiment of tags for particular data by year; and



FIG. 8 is a flowchart of one embodiment of a method of data lifecycle management.





It is noted that the drawings are not necessarily to scale. The drawings are intended to depict only typical aspects of the subject matter disclosed herein, and therefore should not be considered as limiting the scope of the disclosure.


DETAILED DESCRIPTION

Certain exemplary embodiments will now be described to provide an overall understanding of the principles of the structure, function, manufacture, and use of the devices, systems, and methods disclosed herein. One or more examples of these embodiments are illustrated in the accompanying drawings. A person skilled in the art will understand that the devices, systems, and methods specifically described herein and illustrated in the accompanying drawings are non-limiting exemplary embodiments and that the scope of the present invention is defined solely by the claims. The features illustrated or described in connection with one exemplary embodiment may be combined with the features of other embodiments. Such modifications and variations are intended to be included within the scope of the present invention.


Further, in the present disclosure, like-named components of the embodiments generally have similar features, and thus within a particular embodiment each feature of each like-named component is not necessarily fully elaborated upon. Additionally, to the extent that linear or circular dimensions are used in the description of the disclosed systems, devices, and methods, such dimensions are not intended to limit the types of shapes that can be used in conjunction with such systems, devices, and methods. A person skilled in the art will recognize that an equivalent to such linear and circular dimensions can easily be determined for any geometric shape. A person skilled in the art will appreciate that a dimension may not be a precise value but nevertheless be considered to be at about that value due to any number of factors such as manufacturing tolerances and sensitivity of measurement equipment. Sizes and shapes of the systems and devices, and the components thereof, can depend at least on the size and shape of components with which the systems and devices will be used.


In general, a health data management system may include an interactive smart system that includes data origination facets, movement, architecture and management, and transformation and lifecycle to determine mechanisms by which smart systems talk to each other. The health data management system may include a data stack that defines handling of data from beginning to end. A data stack may include data sources, data pipelines, data transformation/modeling systems, and data storage systems that define end-to-end handling of data. In one embodiment, the health data management system may include a plurality of smart medical systems that are configured to perform one or more medical operations. The health data management system may utilize the data stack to control and manage data flow to the different smart device systems. In one embodiment, the health data management system may control and manage the data flow for managing a patient or performing a medical procedure, for example, providing surgical assistance during performance of a surgical procedure by one or more smart medical systems (also referred to herein as “surgical systems”).
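The end-to-end data stack described above (data sources, data pipelines, transformation/modeling, and storage) can be sketched as a minimal pipeline. Every name, field, and the normalization scale below are illustrative assumptions, not from the disclosure.

```python
def source():
    """Data source: emit raw records from a smart medical system."""
    yield {"biomarker": "heart_rate", "value": 72}

def transform(records):
    """Data transformation/modeling: enrich each record in flight."""
    for r in records:
        r["value_normalized"] = r["value"] / 200.0  # assumed max scale
        yield r

def store(records, storage):
    """Data storage: persist the transformed records."""
    for r in records:
        storage.append(r)
    return storage

# End-to-end handling of data: source -> pipeline -> transform -> store.
storage = store(transform(source()), [])
```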


Surgical Systems


FIG. 1 shows one embodiment of a computer-implemented surgical system 100. The surgical system 100 may include one or more surgical systems (e.g., surgical sub-systems) 102, 103, 104. As in this illustrated embodiment, the surgical system 100 may include first, second, and third surgical systems 102, 103, 104, but may instead include another number, e.g., one, two, four, etc.


The first surgical system 102 is discussed herein as a general representative of the surgical systems 102, 103, 104. For example, the surgical system 102 may include a computer-implemented interactive surgical system. For example, the surgical system 102 may include a surgical hub 106 and/or a computing device 116 in communication with a cloud computing system 108, for example, as described in FIG. 2.


The cloud computing system 108 may include at least one remote cloud server 109 and at least one remote cloud storage unit 110. Embodiments of surgical systems 102, 103, 104 may include one or more wearable sensing systems 111, one or more environmental sensing systems 115, one or more robotic systems (also referred to herein as “robotic surgical systems”) 113, one or more intelligent instruments 114 (e.g., smart surgical instruments), one or more human interface systems 112, etc. A “human interface system” is also referred to herein as a “human interface device.” The wearable sensing system(s) 111 may include one or more HCP (“health care professional” or “health care personnel”) sensing systems and/or one or more patient sensing systems. The environmental sensing system(s) 115 may include one or more devices used for measuring one or more environmental attributes, for example, as further described in FIG. 2. The robotic system(s) 113 may include a plurality of devices used for performing a surgical procedure, for example, as further described in FIG. 2.


Examples of robotic surgical systems include the Ottava™ robotic-assisted surgery system (Johnson & Johnson of New Brunswick, NJ), da Vinci® surgical systems (Intuitive Surgical, Inc. of Sunnyvale, CA), the Hugo™ robotic-assisted surgery system (Medtronic PLC of Minneapolis, MN), the Versius® surgical robotic system (CMR Surgical Ltd of Cambridge, UK), and the Monarch® platform (Auris Health, Inc. of Redwood City, CA). Embodiments of various robotic surgical systems and using robotic surgical systems are further described in, for example, U.S. Pat. App. Pub. No. 2018/0177556 entitled “Flexible Instrument Insertion Using An Adaptive Force Threshold” filed Dec. 28, 2016, U.S. Pat. App. Pub. No. 2020/0000530 entitled “Systems And Techniques For Providing Multiple Perspectives During Medical Procedures” filed Apr. 16, 2019, U.S. Pat. App. Pub. No. 2020/0170720 entitled “Image-Based Branch Detection And Mapping For Navigation” filed Feb. 7, 2020, U.S. Pat. App. Pub. No. 2020/0188043 entitled “Surgical Robotics System” filed Dec. 9, 2019, U.S. Pat. App. Pub. No. 2020/0085516 entitled “Systems And Methods For Concomitant Medical Procedures” filed Sep. 3, 2019, U.S. Pat. No. 8,831,782 entitled “Patient-Side Surgeon Interface For A Teleoperated Surgical Instrument” filed Jul. 15, 2013, and Intl. Pat. Pub. No. WO 2014151621 entitled “Hyperdexterous Surgical System” filed Mar. 13, 2014, which are hereby incorporated by reference in their entireties.


The surgical system 102 may be in communication with the one or more remote servers 109 that may be part of the cloud computing system 108. In an example embodiment, the surgical system 102 may be in communication with the one or more remote servers 109 via an internet service provider's cable/FIOS networking node. In an example embodiment, a patient sensing system may be in direct communication with the one or more remote servers 109. The surgical system 102 (and/or various sub-systems, smart surgical instruments, robots, sensing systems, and other computerized devices described herein) may collect data in real-time and transfer the data to the cloud computing system 108 for data processing and manipulation, e.g., by the one or more remote servers 109. It will be appreciated that cloud computing may rely on sharing computing resources rather than having local servers or personal devices to handle software applications.


The surgical system 102 and/or a component therein may communicate with the one or more remote servers 109 via a cellular transmission/reception point (TRP) or a base station using one or more of the following cellular protocols: GSM/GPRS/EDGE (2G), UMTS/HSPA (3G), long term evolution (LTE) or 4G, LTE-Advanced (LTE-A), new radio (NR) or 5G, and/or other wired or wireless communication protocols. Various embodiments of cloud-based analytics that may be performed by the cloud computing system 108 are described further in, for example, U.S. Pat. App Pub. No. 2019/0206569 entitled “Method Of Cloud Based Data Analytics For Use With The Hub” published Jul. 4, 2019, which is hereby incorporated by reference in its entirety.


The surgical hub 106 may have cooperative interactions with one or more means of displaying an image, e.g., a display configured to display an image from a laparoscopic scope, etc., and information from one or more other smart devices and/or one or more sensing systems. The surgical hub 106 may interact with the one or more sensing systems, the one or more smart devices, and the one or more means of displaying an image. The surgical hub 106 may be configured to gather measurement data from the sensing system(s) and send notifications or control messages to the sensing system(s). The surgical hub 106 may send and/or receive information, including notification information, to and/or from the one or more human interface systems 112. The one or more human interface systems 112 may include one or more human interface devices (HIDs). The surgical hub 106 may send and/or receive notification or control information to and/or from audio devices, display devices, and/or various other devices that are in communication with the surgical hub 106.


In an exemplary embodiment, the one or more sensing systems may include the one or more wearable sensing systems 111 (which may include one or more HCP sensing systems and/or one or more patient sensing systems) and/or the environmental sensing system(s) 115 shown in FIG. 1. The sensing system(s) may measure data relating to various biomarkers.


In an exemplary embodiment, the sensing system(s) may measure the biomarkers using one or more sensors, for example, photosensors (e.g., photodiodes, photoresistors), mechanical sensors (e.g., motion sensors), acoustic sensors, electrical sensors, electrochemical sensors, thermoelectric sensors, infrared sensors, etc. The sensor(s) may measure the biomarkers using one or more of the following sensing technologies: photoplethysmography, electrocardiography, electroencephalography, colorimetry, impedimetry, potentiometry, amperometry, etc.


The biomarkers measured by the sensing system(s) may include, but are not limited to, sleep, core body temperature, maximal oxygen consumption, physical activity, alcohol consumption, respiration rate, oxygen saturation, blood pressure, blood sugar, heart rate variability, blood potential of hydrogen, hydration state, heart rate, skin conductance, peripheral temperature, tissue perfusion pressure, coughing and sneezing, gastrointestinal motility, gastrointestinal tract imaging, respiratory tract bacteria, edema, mental aspects, sweat, circulating tumor cells, autonomic tone, circadian rhythm, and/or menstrual cycle.


The biomarkers may relate to physiologic systems, which may include, but are not limited to, behavior and psychology, cardiovascular system, renal system, skin system, nervous system, gastrointestinal system, respiratory system, endocrine system, immune system, tumor, musculoskeletal system, and/or reproductive system. Information from the biomarkers may be determined and/or used by the computer-implemented surgical system 100, for example, to improve said systems and/or to improve patient outcomes.


The sensing system(s) may send data to the surgical hub 106. The sensing system(s) may use one or more of the following radiofrequency (RF) protocols for communicating with the surgical hub 106: Bluetooth, Bluetooth Low-Energy (BLE), Bluetooth Smart, Zigbee, Z-wave, IPv6 Low-power wireless Personal Area Network (6LoWPAN), Wi-Fi, etc.


Various embodiments of sensing systems, biomarkers, and physiological systems are described further in, for example, U.S. Pat. App Pub. No. 2022/0233119 entitled “Method Of Adjusting A Surgical Parameter Based On Biomarker Measurements” published Jul. 28, 2022, which is hereby incorporated by reference in its entirety.


The sensing systems described herein may be employed to assess physiological conditions of a surgeon operating on a patient, a patient being prepared for a surgical procedure, or a patient recovering after a surgical procedure. The cloud-based computing system 108 may be used to monitor biomarkers associated with an HCP (a surgeon, a nurse, etc.) or a patient in real-time, to generate surgical plans based at least on measurement data gathered prior to a surgical procedure, to provide control signals to one or more surgical instruments during a surgical procedure, and to notify a patient of a complication during a post-surgical period.


The cloud-based computing system 108 may be used to analyze surgical data. Surgical data may be obtained via the intelligent instrument(s) 114, the wearable sensing system(s) 111, the environmental sensing system(s) 115, the robotic system(s) 113, and/or the like in the surgical system 102. Surgical data may include tissue states used to assess leaks or perfusion of sealed tissue after a tissue sealing and cutting procedure; pathology data, including images of samples of body tissue; anatomical structures of the body captured using a variety of sensors integrated with imaging devices and techniques, such as overlaying images captured by multiple imaging devices; image data; and/or the like. The surgical data may be analyzed to improve surgical procedure outcomes by determining whether further treatment is warranted, such as the application of endoscopic intervention, emerging technologies, targeted radiation, targeted intervention, and precise robotics to tissue-specific sites and conditions. Such data analysis may employ outcome analytics processing and, using standardized approaches, may provide beneficial feedback to either confirm surgical treatments and the behavior of the surgeon or suggest modifications to surgical treatments and the behavior of the surgeon.



FIG. 2 shows one embodiment of the surgical system 102 in one embodiment of a surgical operating room 135. As illustrated in FIG. 2, a patient is being operated on by one or more HCPs. The HCP(s) are being monitored by one or more HCP sensing systems 120 worn by the HCP(s). The HCP(s) and the environment surrounding the HCP(s) may also be monitored by one or more environmental sensing systems including, for example, one or more cameras 121, one or more microphones 122, and other sensors that may be deployed in the operating room. The one or more HCP sensing systems 120 and the environmental sensing systems may be in communication with the surgical hub 106, which in turn may be in communication with the one or more cloud servers 109 of the cloud computing system 108, as shown in FIG. 1. The one or more environmental sensing systems may be used for measuring one or more environmental attributes, for example, HCP position in the surgical theater, HCP movements, ambient noise in the surgical theater, temperature/humidity in the surgical theater, etc.


As illustrated in FIG. 2, a primary display 123 and one or more audio output devices (e.g., speakers 119) are positioned in a sterile field of the surgical system 102 to be visible to an operator at an operating table 124. In addition, a visualization/notification tower 126 is positioned outside the sterile field. The visualization/notification tower 126 may include a first non-sterile human interactive device (HID) 127 and a second non-sterile HID 129, which may be displays and may face away from each other. The display 123 and the HIDs 127, 129 may include a touch screen allowing a human to interface directly with the HID 127, 129. A human interface system, guided by the surgical hub 106, may be configured to utilize the display 123 and the HIDs 127, 129 to coordinate information flow to operators inside and outside the sterile field. In an exemplary embodiment, the surgical hub 106 may cause an HID (e.g., the primary display 123) to display a notification and/or information about the patient and/or a surgical procedure step. In an exemplary embodiment, the surgical hub 106 may prompt for and/or receive input from personnel in the sterile field or in the non-sterile area. In an exemplary embodiment, the surgical hub 106 may cause one or more non-sterile HIDs 127, 129 to display a snapshot of a surgical site, as recorded by an imaging device 130, while maintaining a live feed of the surgical site on one or more sterile HIDs, e.g., the primary HID 123. The snapshot on the non-sterile HID(s) 127, 129 can permit a non-sterile operator to perform a diagnostic step relevant to the surgical procedure, for example.


The surgical hub 106 may be configured to route a diagnostic input or feedback entered by a non-sterile operator at the visualization tower 126 to the primary display 123 within the sterile field, where it can be viewed by a sterile operator at the operating table. In an exemplary embodiment, the input can be in the form of a modification to the snapshot displayed on the non-sterile HID(s) 127, 129, which can be routed to the one or more sterile HIDs, e.g., the primary display 123, by the surgical hub 106.
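
The routing behavior described above — a live feed on the sterile primary display, snapshots pushed to the non-sterile HIDs, and annotated snapshots routed back into the sterile field — can be sketched as a small state machine. The class, method names, and screen keys below are illustrative assumptions, not the hub's actual interface.

```python
class HubDisplayRouter:
    """Toy model of the surgical hub's display routing: the live feed
    stays on the sterile primary display 123, a snapshot goes to the
    non-sterile HIDs 127 and 129, and a non-sterile operator's
    annotated snapshot is routed back to the sterile display."""

    def __init__(self):
        # screen keys are invented labels for the displays in FIG. 2
        self.screens = {"primary_123": None, "hid_127": None, "hid_129": None}

    def show_live_feed(self, frame):
        self.screens["primary_123"] = ("live", frame)

    def push_snapshot(self, frame):
        for hid in ("hid_127", "hid_129"):
            self.screens[hid] = ("snapshot", frame)

    def route_annotation(self, annotated_frame):
        # in this simplified model the routed annotation simply
        # replaces the primary display's content
        self.screens["primary_123"] = ("annotated", annotated_frame)
```

A real hub would composite the annotation over the live feed rather than replace it; the point here is only the direction of information flow between sterile and non-sterile displays.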


Various embodiments of surgical hubs are further described in, for example, U.S. Pat. App. Pub. No. 2024/0221937 entitled “Method For Advanced Algorithm Support” published Jul. 4, 2024, U.S. Pat. App. Pub. No. 2024/0112768 entitled “Method For Health Data And Consent Management” published Apr. 4, 2024, U.S. Pat. App. Pub. No. 2024/0220763 entitled “Data Volume Determination For Surgical Machine Learning Applications” published Jul. 2, 2024, U.S. Pat. App. Pub. No. 2019/0206564 entitled “Method For Facility Data Collection And Interpretation” published Jul. 4, 2019, U.S. Pat. App. Pub. No. 2023/0026634 entitled “Surgical Data System And Classification” published Jan. 26, 2023, U.S. Pat. App. Pub. No. 2019/0201115 entitled “Aggregation And Reporting Of Surgical Hub Data” published Jul. 4, 2019, U.S. Pat. App. Pub. No. 2023/0372030 entitled “Automatic Compilation, Annotation, And Dissemination Of Surgical Data To Systems To Anticipate Related Automated Operations” published Nov. 23, 2023, U.S. Pat. App. Pub. No. 2022/0104896 entitled “Interactive Information Overlay On Multiple Surgical Displays” published Apr. 7, 2022, U.S. Pat. No. 11,304,699 entitled “Method For Adaptive Control Schemes For Surgical Network Control And Interaction” issued Apr. 19, 2022, U.S. Pat. No. 10,849,697 entitled “Cloud Interface For Coupled Surgical Devices” issued Dec. 1, 2020, U.S. Pat. App. Pub. No. 2022/0239577 entitled “Ad Hoc Synchronization Of Data From Multiple Link Coordinated Sensing Systems” published Jul. 28, 2022, U.S. Pat. App. Pub. No. 2023/0025061 entitled “Surgical Data System And Management” published Jan. 26, 2023, U.S. Pat. App. Pub. No. 2023/0023083 entitled “Method Of Surgical System Power Management, Communication, Processing, Storage And Display” published Jan. 26, 2023, U.S. Pat. App. Pub. No. 
2019/0206556 entitled “Real-Time Analysis Of Comprehensive Cost Of All Instrumentation Used In Surgery Utilizing Data Fluidity To Track Instruments Through Stocking And In-House Processes” published Jul. 4, 2019, U.S. Pat. App. Pub. No. 2023/0096268 entitled “Methods for Controlling Cooperative Surgical Instruments” filed Oct. 5, 2021, U.S. Pat. App. Pub. No. 2019/0200844 entitled “Method Of Hub Communication, Processing, Storage And Display” filed Dec. 4, 2018, U.S. Pat. App. Pub. No. 2019/0200981 entitled “Method Of Compressing Tissue Within A Stapling Device And Simultaneously Displaying The Location Of The Tissue Within The Jaws” filed Dec. 4, 2018, U.S. Pat. App. Pub. No. 2019/0201046 entitled “Method For Controlling Smart Energy Devices” filed Dec. 4, 2018, U.S. Pat. App. Pub. No. 2019/0201114 entitled “Adaptive Control Program Updates For Surgical Hubs” filed Mar. 29, 2018, U.S. Pat. App. Pub. No. 2019/0201140 entitled “Surgical Hub Situational Awareness” filed Mar. 29, 2018, U.S. Pat. App. Pub. No. 2019/0206004 entitled “Interactive Surgical Systems With Condition Handling Of Devices And Data Capabilities” filed Mar. 29, 2018, U.S. Pat. App. Pub. No. 2019/0206555 entitled “Cloud-based Medical Analytics For Customization And Recommendations To A User” filed Mar. 29, 2018, U.S. Pat. No. 11,678,881 entitled “Spatial Awareness Of Surgical Hubs In Operating Rooms” filed Mar. 29, 2018, and U.S. Pat. App. Pub. No. 2019/0207857 entitled “Surgical Network Determination Of Prioritization Of Communication, Interaction, Or Processing Based On System Or Device Needs” filed Nov. 6, 2018, which are hereby incorporated by reference in their entireties.


As in the illustrated embodiment of FIG. 2, one or more surgical instruments 131 may be in use in the surgical procedure as part of the surgical system 102. The surgical hub 106 may be configured to coordinate information flow to at least one display showing the surgical instrument(s) 131. A diagnostic input or feedback entered by a non-sterile operator at the visualization tower 126 can be routed by the surgical hub 106 to the at least one display, e.g., the primary display 123, within the sterile field, where it can be viewed by the operator of the surgical instrument(s) 131.


Various embodiments of coordinating information flow and display and various embodiments of surgical instruments are described further in, for example, U.S. Pat. No. 11,937,769 entitled “Method Of Hub Communication, Processing, Storage And Display” issued Mar. 26, 2024, which is hereby incorporated by reference in its entirety.


Examples of surgical instruments include a surgical dissector, a surgical stapler, a surgical grasper, a surgical scope (e.g., an endoscope, a laparoscope, etc.), a surgical energy device (e.g., a mono-polar probe, a bi-polar probe, an ablation probe, an ultrasound device, an ultrasonic end effector, etc.), a surgical clip applier, etc.


Various embodiments of surgical instruments are described further in, for example, U.S. Pat. No. 11,723,642 entitled “Cooperative Access Hybrid Procedures” issued Aug. 14, 2023, U.S. Pat. App. Pub. No. 2013/0256377 entitled “Layer Comprising Deployable Attachment Members” filed Feb. 8, 2013, U.S. Pat. No. 8,393,514 entitled “Selectively Orientable Implantable Fastener Cartridge” filed Sep. 30, 2010, U.S. Pat. No. 8,317,070 entitled “Surgical Stapling Devices That Produce Formed Staples Having Different Lengths” filed Feb. 28, 2007, U.S. Pat. No. 7,143,925 entitled “Surgical Instrument Incorporating EAP Blocking Lockout Mechanism” filed Jun. 21, 2005, U.S. Pat. App. Pub. No. 2015/0134077 entitled “Sealing Materials for Use in Surgical Stapling” filed Nov. 8, 2013, U.S. Pat. App. Pub. No. 2015/0134076 entitled “Hybrid Adjunct Materials for Use in Surgical Stapling” filed Nov. 8, 2013, U.S. Pat. App. Pub. No. 2015/0133996 entitled “Positively Charged Implantable Materials and Method of Forming the Same” filed Nov. 8, 2013, U.S. Pat. App. Pub. No. 2015/0129634 entitled “Tissue Ingrowth Materials and Method of Using the Same” filed Nov. 8, 2013, U.S. Pat. App. Pub. No. 2015/0133995 entitled “Hybrid Adjunct Materials for Use in Surgical Stapling” filed Nov. 8, 2013, U.S. Pat. No. 9,913,642 entitled “Surgical Instrument Comprising a Sensor System” filed Mar. 26, 2014, U.S. Pat. No. 10,172,611 entitled “Adjunct Materials and Methods of Using Same in Surgical Methods for Tissue Sealing” filed Jun. 10, 2014, U.S. Pat. No. 8,989,903 entitled “Methods And Systems For Indicating A Clamping Prediction” filed Jan. 13, 2012, U.S. Pat. No. 9,072,535 entitled “Surgical Stapling Instruments With Rotatable Staple Deployment Arrangements” filed May 27, 2011, U.S. Pat. No. 9,072,536 entitled “Differential Locking Arrangements For Rotary Powered Surgical Instruments” filed Jun. 28, 2012, U.S. Pat. No. 
10,531,929 entitled “Control Of Robotic Arm Motion Based On Sensed Load On Cutting Tool” filed Aug. 16, 2016, U.S. Pat. No. 10,709,516 entitled “Curved Cannula Surgical System Control” filed Apr. 2, 2018, U.S. Pat. No. 11,076,926 entitled “Manual Release For Medical Device Drive System” filed Mar. 21, 2018, U.S. Pat. No. 9,839,487 entitled “Method For Engaging Surgical Instrument With Teleoperated Actuator” filed Mar. 17, 2015, U.S. Pat. No. 10,543,051 entitled “Method For Engaging Surgical Instrument With Teleoperated Actuator” issued Jan. 28, 2020, U.S. Pat. No. 9,804,618 entitled “Systems And Methods For Controlling A Segmented Circuit” filed Mar. 25, 2014, U.S. Pat. No. 11,607,239 entitled “Systems And Methods For Controlling A Surgical Stapling And Cutting Instrument” filed Apr. 15, 2016, U.S. Pat. No. 10,052,044 entitled “Time Dependent Evaluation Of Sensor Data To Determine Stability, Creep, And Viscoelastic Elements Of Measures” filed Mar. 6, 2015, U.S. Pat. No. 9,439,649 entitled “Surgical Instrument Having Force Feedback Capabilities” filed Dec. 12, 2012, U.S. Pat. No. 10,751,117 entitled “Electrosurgical Instrument With Fluid Diverter” filed Sep. 23, 2016, U.S. Pat. No. 11,160,602 entitled “Control Of Surgical Field Irrigation” filed Aug. 29, 2017, U.S. Pat. No. 9,877,783 entitled “Energy Delivery Systems And Uses Thereof” filed Dec. 30, 2016, U.S. Pat. No. 11,266,458 entitled “Cryosurgical System With Pressure Regulation” filed Apr. 19, 2019, U.S. Pat. No. 10,314,649 entitled “Flexible Expandable Electrode And Method Of Intraluminal Delivery Of Pulsed Power” filed Aug. 2, 2012, U.S. Pat. App. Pub. No. 2023/0116781 entitled “Surgical Devices, Systems, And Methods Using Multi-Source Imaging” filed Oct. 5, 2021, U.S. Pat. App. Pub. No. 2023/0102358 entitled “Surgical Devices, Systems, And Methods Using Fiducial Identification And Tracking” filed Oct. 5, 2021, U.S. Pat. No. 10,413,373 entitled “Robotic Visualization And Collision Avoidance” filed Aug. 
16, 2016, U.S. Pat. App. Pub. No. 2023/0077141 entitled “Robotically Controlled Uterine Manipulator” filed Sep. 21, 2021, and U.S. Pat. App. Pub. No. 2022/0273309 entitled “Stapler Reload Detection And Identification” filed May 16, 2022, which are hereby incorporated by reference herein in their entireties.


As shown in FIG. 2, the surgical system 102 can be used to perform a surgical procedure on the patient who is lying down on the operating table 124 in the surgical operating room 135. A robotic system 134 may be used in the surgical procedure as a part of the surgical system 102. The robotic system 134 may include a surgeon's console 136, a patient side cart 132 (surgical robot), and a surgical robotic hub 133. The patient side cart 132 can manipulate at least one removably coupled surgical instrument 137 through a minimally invasive incision in the body of the patient while the surgeon views the surgical site via the surgeon's console 136. An image of the surgical site can be obtained by the imaging device 130, which can be manipulated by the patient side cart 132 to orient the imaging device 130. The surgical robotic hub 133 can be used to process the images of the surgical site for subsequent display to the surgeon via the surgeon's console 136.


Various embodiments of robotic systems and various embodiments of surgical instruments are described further in, for example, U.S. Pat. No. 11,559,307 entitled “Method Of Robotic Hub Communication, Detection, And Control” issued Jan. 24, 2023, which is hereby incorporated by reference in its entirety.


The imaging device 130 may include at least one image sensor and one or more optical components. Suitable image sensors may include, but are not limited to, Charge-Coupled Device (CCD) sensors and Complementary Metal-Oxide Semiconductor (CMOS) sensors.


The optical components of the imaging device 130 may include one or more illumination sources and/or one or more lenses. The one or more illumination sources may be directed to illuminate portions of the surgical field. The one or more image sensors may receive light reflected or refracted from the surgical field, including light reflected or refracted from tissue and/or surgical instruments. The illumination source(s) may be configured to radiate electromagnetic energy in the visible spectrum as well as the invisible spectrum. The visible spectrum, sometimes referred to as the “optical spectrum” or the “luminous spectrum,” is the portion of the electromagnetic spectrum that is visible to (e.g., can be detected by) the human eye and may be referred to as “visible light” or simply “light.” A typical human eye will respond to wavelengths in air that range from about 380 nm to about 750 nm.


The invisible spectrum (e.g., the non-luminous spectrum) is the portion of the electromagnetic spectrum that lies below and above the visible spectrum (i.e., wavelengths below about 380 nm and above about 750 nm). The invisible spectrum is not detectable by the human eye. Wavelengths greater than about 750 nm are longer than the red visible spectrum, and they become invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths less than about 380 nm are shorter than the violet spectrum, and they become invisible ultraviolet, x-ray, and gamma ray electromagnetic radiation.
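
The spectral boundaries given above (visible light roughly 380 nm to 750 nm, ultraviolet and shorter below, infrared and longer above) reduce to a simple classification, sketched here with the text's approximate thresholds.

```python
def classify_wavelength_nm(wavelength_nm):
    """Classify a wavelength against the approximate visible band
    (about 380-750 nm) described in the text. The boundaries are the
    text's approximations, not exact physical constants."""
    if wavelength_nm < 380:
        return "invisible: ultraviolet / x-ray / gamma"
    if wavelength_nm > 750:
        return "invisible: infrared / microwave / radio"
    return "visible"
```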


In various aspects, the imaging device 130 is configured for use in a minimally invasive procedure. Examples of imaging devices include an arthroscope, angioscope, bronchoscope, choledochoscope, colonoscope, cystoscope, duodenoscope, enteroscope, esophagogastro-duodenoscope (gastroscope), endoscope, laryngoscope, nasopharyngo-nephroscope, sigmoidoscope, thoracoscope, and ureteroscope.


The imaging device 130 may employ multi-spectrum monitoring to discriminate topography and underlying structures. A multi-spectral image is one that captures image data within specific wavelength ranges across the electromagnetic spectrum. The wavelengths may be separated by filters or by the use of instruments that are sensitive to particular wavelengths, including light from frequencies beyond the visible light range, e.g., infrared (IR) and ultraviolet (UV). Spectral imaging can allow extraction of additional information that the human eye fails to capture with its receptors for red, green, and blue. Multi-spectrum monitoring can be a useful tool in relocating a surgical field after a surgical task is completed to perform one or more of the previously described tests on the treated tissue.


It is axiomatic that strict sterilization of the operating room and surgical equipment is required during any surgery. The strict hygiene and sterilization conditions required in a “surgical theater,” e.g., an operating or treatment room, necessitate the highest possible sterility of all medical devices and equipment. Part of that sterilization process is the need to sterilize anything that comes in contact with the patient or penetrates the sterile field, including the imaging device 130 and its attachments and components. It will be appreciated that the sterile field may be considered a specified area, such as within a tray or on a sterile towel, that is considered free of microorganisms, or the sterile field may be considered an area, immediately around a patient, who has been prepared for a surgical procedure. The sterile field may include the scrubbed team members, who are properly attired, and all furniture and fixtures in the area.
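
A multi-spectral image as described above is image data indexed by wavelength band. As a minimal sketch (the cube layout, a dict of wavelength to 2D pixel list, is an assumption for illustration), selecting the band nearest a wavelength of interest might look like:

```python
def band_nearest(cube, target_nm):
    """Pick the spectral band closest to a target wavelength from a
    multi-spectral 'cube' stored as {wavelength_nm: 2D pixel list}.
    This storage layout is an illustrative assumption; real devices
    would expose band data through their own imaging pipeline."""
    wl = min(cube, key=lambda w: abs(w - target_nm))
    return wl, cube[wl]
```

For example, with bands at 450, 550, and 850 nm, a request near the IR end selects the 850 nm band, the kind of beyond-visible channel the text notes the human eye cannot capture.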


Various embodiments of multi-spectral imaging are described further in, for example, U.S. Pat. No. 11,937,769 entitled “Method Of Hub Communication, Processing, Storage And Display” filed Dec. 4, 2018, which is hereby incorporated by reference in its entirety.


The wearable sensing system(s) 111 illustrated in FIG. 1 may include the one or more HCP sensing systems 120 as shown in FIG. 2. The one or more HCP sensing systems 120 may include sensing system(s) to monitor and detect a set of physical states and/or a set of physiological states of an HCP. An HCP may be a surgeon or one or more healthcare personnel assisting the surgeon or other healthcare service providers in general. The HCP sensing system 120 may send the measurement data associated with a set of biomarkers and data associated with a physical state of the surgeon to the surgical hub 106 for further processing. In an exemplary embodiment, an HCP sensing system 120 may measure a set of biomarkers to monitor the heart rate of an HCP. In an exemplary embodiment, an HCP sensing system 120 worn on a surgeon's wrist (e.g., a watch or a wristband) may use an accelerometer to detect hand motion and/or shakes and determine the magnitude and frequency of tremors.


The environmental sensing system(s) 115 shown in FIG. 1 may send environmental information to the surgical hub 106. In an exemplary embodiment, the environmental sensing system(s) 115 may include a camera 121 for detecting hand/body position of an HCP. The environmental sensing system(s) 115 may include one or more microphones 122 for measuring ambient noise in the surgical theater. Other environmental sensing system(s) 115 may include one or more devices, for example, a thermometer to measure temperature, a hygrometer to measure humidity of the surroundings in the surgical theater, etc. The surgeon biomarkers may include one or more of the following: stress, heart rate, etc. The environmental measurements from the surgical theater may include ambient noise level associated with the surgeon or the patient, surgeon and/or staff movements, surgeon and/or staff attention level, etc. The surgical hub 106, alone or in communication with the cloud computing system 108, may use the surgeon biomarker measurement data and/or environmental sensing information to modify control algorithms of hand-held instruments or the averaging delay of a robotic interface, for example, to minimize tremors.


The surgical hub 106 may use the surgeon biomarker measurement data associated with an HCP to adaptively control the one or more surgical instruments 131. For example, the surgical hub 106 may send a control program to one of the one or more surgical instruments 131 to control the surgical instrument's actuators to limit or compensate for fatigue and use of fine motor skills. The surgical hub 106 may send the control program based on situational awareness and/or the context on importance or criticality of a task. The control program may instruct the instrument to alter operation to provide more control when control is needed.
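
The adaptive control described above — limiting actuator behavior based on fatigue and on the criticality of the current task — can be sketched as a scaling rule. The scaling law, the 0-to-1 tremor metric, and the criticality labels below are all assumptions; the disclosure says only that the hub sends a control program that limits or compensates for fatigue.

```python
def actuator_speed_limit(base_limit, tremor_level, task_criticality):
    """Illustrative control adjustment: reduce the allowed actuator
    speed as measured tremor rises, more aggressively for critical
    task steps. tremor_level is a normalized 0..1 metric (assumed)."""
    tremor_level = min(max(tremor_level, 0.0), 1.0)
    criticality_gain = {"routine": 0.5, "critical": 1.0}[task_criticality]
    return base_limit * (1.0 - 0.6 * tremor_level * criticality_gain)
```

So a base limit of 10 units is untouched with no tremor, drops to 7 for maximal tremor during a routine step, and to 4 during a critical step — the "more control when control is needed" behavior the text describes, under these invented numbers.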



FIG. 3 shows an embodiment of the surgical system 102 including the surgical hub 106. The surgical hub 106 may be paired with, via a modular control, the one or more wearable sensing systems 111, the one or more environmental sensing systems 115, the one or more human interface systems 112, the one or more robotic systems 113, and the one or more intelligent instruments 114. As in this illustrated embodiment, the surgical hub 106 may include a display 148, an imaging module 149, a generator module 150 (e.g., an energy generator), a communication module 156, a processor module 157, a storage array 158, and an operating-room mapping module 159. In certain aspects, as illustrated in FIG. 3, the surgical hub 106 further includes a smoke evacuation module 154 and/or a suction/irrigation module 155. The various modules and systems may be connected to the modular control either directly via a router or via the communication module 156. The operating theater devices may be coupled to cloud computing resources and data storage, e.g., to the cloud computing system 108, via the modular control. The human interface system(s) 112 may include a display sub-system and a notification sub-system.


The modular control may be coupled to a non-contact sensor module. The non-contact sensor module may measure the dimensions of the operating theater and generate a map of the surgical theater using ultrasonic, laser, and/or similar non-contact measurement devices. Other distance sensors can be employed to determine the bounds of an operating room. The sensor module may be configured to determine the size of the operating theater and to adjust Bluetooth pairing distance limits. A laser-based non-contact sensor module may scan the operating theater by transmitting laser light pulses, receiving laser light pulses that bounce off the perimeter walls of the operating theater, and comparing the phase of the transmitted pulse to the received pulse to determine the size of the operating theater and to adjust Bluetooth pairing distance limits, for example.


An ultrasound-based non-contact sensor module may scan the operating theater by transmitting a burst of ultrasound and receiving the echo when it bounces off the perimeter walls of an operating theater as described further in, for example, U.S. Pat. No. 11,857,152 entitled “Surgical Hub Spatial Awareness To Determine Devices In Operating Theater” issued Jan. 2, 2024, U.S. Pat. No. 11,278,281 entitled “Interactive Surgical Platform” issued Mar. 22, 2022, and U.S. Prov. Pat. App. No. 62/611,341 entitled “Interactive Surgical Platform” filed Dec. 28, 2017, which are hereby incorporated by reference herein in their entireties.
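
Both ranging schemes above (laser and ultrasound) reduce to round-trip time of flight: the pulse travels to the perimeter wall and back, so the wall distance is half the path at the relevant propagation speed. The pairing-limit policy below is a hypothetical example of how a measured room extent might set the Bluetooth pairing distance; it is not specified in the text.

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0
SPEED_OF_SOUND_M_S = 343.0  # dry air at roughly 20 degrees C

def one_way_distance_m(round_trip_s, propagation_speed_m_s):
    """Wall distance from a round-trip echo time: the pulse covers
    twice the distance, so divide the path length by two."""
    return propagation_speed_m_s * round_trip_s / 2.0

def bluetooth_pairing_limit_m(room_extent_m, margin=1.1):
    # hypothetical policy: allow pairing slightly beyond the measured
    # room extent so devices near the walls still pair
    return room_extent_m * margin
```

A 40 ns laser round trip implies a wall about 6 m away; a 20 ms ultrasonic echo implies about 3.4 m. The same arithmetic underlies the phase-comparison variant, since phase shift maps to time delay at a known modulation frequency.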


During a surgical procedure, energy application to tissue, for sealing and/or cutting, may be associated with smoke evacuation, suction of excess fluid, and/or irrigation of the tissue. Fluid, power, and/or data lines from different sources may be entangled during the surgical procedure. Valuable time can be lost addressing this issue during a surgical procedure. Detangling the lines may necessitate disconnecting the lines from their respective modules, which may require resetting the modules. A hub modular enclosure 160 of the surgical hub 106 may offer a unified environment for managing the power, data, and fluid lines, which reduces the frequency of entanglement between such lines.


Energy may be applied to tissue at a surgical site. The surgical hub 106 may include the hub modular enclosure 160 and a combo generator module slidably receivable in a docking station of the hub modular enclosure 160. The docking station may include data and power contacts. The combo generator module may include two or more of: an ultrasonic energy generator component, a bipolar radiofrequency (RF) energy generator component, or a monopolar RF energy generator component that are housed in a single unit. The combo generator module may also include at least one energy delivery cable for connecting the combo generator module to a surgical instrument, at least one smoke evacuation component configured to evacuate smoke, fluid, and/or particulates generated by the application of therapeutic energy to the tissue, and a fluid line extending from the remote surgical site to the smoke evacuation component. The fluid line may be a first fluid line, and a second fluid line may extend from the remote surgical site to a suction and irrigation module 155 slidably received in the hub modular enclosure 160. The hub modular enclosure 160 may include a fluid interface.


The combo generator module may generate multiple energy types for application to the tissue. One energy type may be more beneficial for cutting the tissue, while another different energy type may be more beneficial for sealing the tissue. For example, a bipolar generator can be used to seal the tissue while an ultrasonic generator can be used to cut the sealed tissue. In an exemplary embodiment, the hub modular enclosure 160 may be configured to accommodate different generators and facilitate an interactive communication therebetween. The hub modular enclosure 160 may enable the quick removal and/or replacement of various modules.


The hub modular enclosure 160 may include a first energy-generator module, configured to generate a first energy for application to the tissue, and a first docking station comprising a first docking port that includes first data and power contacts, wherein the first energy-generator module is slidably movable into an electrical engagement with the first power and data contacts and wherein the first energy-generator module is slidably movable out of the electrical engagement with the first power and data contacts. The hub modular enclosure 160 may include a second energy-generator module configured to generate a second energy, different than the first energy, for application to the tissue, and a second docking station comprising a second docking port that includes second data and power contacts, wherein the second energy-generator module is slidably movable into an electrical engagement with the second power and data contacts, and wherein the second energy-generator module is slidably movable out of the electrical engagement with the second power and data contacts. In addition, the hub modular enclosure 160 may include a communication bus between the first docking port and the second docking port, configured to facilitate communication between the first energy-generator module and the second energy-generator module.
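
The docking-port-plus-bus arrangement above can be modeled as a tiny publish/deliver bus that docked modules use to coordinate. The class and method names are illustrative assumptions; the disclosure describes the bus only at the level of "facilitating communication" between docked generator modules.

```python
class CommunicationBus:
    """Minimal model of the bus between the first and second docking
    ports: docked modules can message each other by name, so paired
    energy generators can coordinate (e.g., seal then cut)."""

    def __init__(self):
        self.modules = {}

    def dock(self, name, module):
        # docking a module electrically engages it and joins the bus
        self.modules[name] = module
        module.bus = self

    def send(self, sender, recipient, message):
        self.modules[recipient].receive(sender, message)


class EnergyGeneratorModule:
    def __init__(self, energy_type):
        self.energy_type = energy_type
        self.bus = None
        self.inbox = []

    def receive(self, sender, message):
        self.inbox.append((sender, message))
```

Under this sketch, a bipolar module finishing a seal could notify an ultrasonic module that cutting may begin, which is the kind of coordination that lets multiple docked generators "act as a single generator."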


As shown in FIG. 3, the hub modular enclosure 160 may allow the modular integration of the generator module 150, the smoke evacuation module 154, and the suction/irrigation module 155. The hub modular enclosure 160 may facilitate interactive communication between the operating-room mapping, smoke evacuation, and suction/irrigation modules 159, 154, 155. The generator module 150 can include integrated monopolar, bipolar, and ultrasonic components supported in a single housing unit slidably insertable into the hub modular enclosure 160. The generator module 150 may connect to a monopolar device 151, a bipolar device 152, and an ultrasonic device 153. The generator module 150 may include a series of monopolar, bipolar, and/or ultrasonic generator modules that interact through the hub modular enclosure 160. The hub modular enclosure 160 may facilitate the insertion of multiple generators and interactive communication between the generators docked into the hub modular enclosure 160 so that the generators would act as a single generator.


A surgical data network having a set of communication hubs may connect the sensing system(s), the modular devices located in one or more operating theaters of a healthcare facility, a patient recovery room, or a room in a healthcare facility specially equipped for surgical operations, to the cloud computing system 108.



FIG. 4 illustrates one embodiment of a situationally aware surgical system 200. Data sources 202 of the situationally aware surgical system 200 may include, for example, modular devices 204, databases 206 (e.g., an electronic medical records (EMR) database, such as of a hospital or other medical facility, containing patient records, etc.), patient monitoring devices 208 (e.g., a blood pressure (BP) monitor, an electrocardiogram (EKG) monitor, one or more wearable sensing systems 111, etc.), HCP monitoring devices 210 (e.g., one or more wearable sensing systems 111, etc.), and/or environment monitoring devices 212 (e.g., one or more environmental sensing systems 115, etc.).


The modular devices 204 may include sensors configured to detect parameters associated with a patient, HCPs, the environment, and/or the modular device 204 itself. The modular devices 204 may include the one or more intelligent instruments 114.


The data sources 202 may be in communication (e.g., wirelessly or wired) with a surgical hub 214, such as the surgical hub 106. The surgical hub 214 may derive contextual information pertaining to a surgical procedure from data based upon, for example, the particular combination(s) of data received from the data sources 202 or the particular order in which the data is received from the data sources 202. The contextual information inferred from the received data can include, for example, the type of surgical procedure being performed, the particular step of the surgical procedure that a surgeon (and/or other HCP) is performing, the type of tissue being operated on, or a body cavity that is the subject of the surgical procedure. This ability by some aspects of the surgical hub 214 to derive or infer information related to the surgical procedure from received data can be referred to as “situational awareness.” For example, the surgical hub 214 can incorporate a situational awareness system, which may be the hardware and/or programming associated with the surgical hub 214 that derives contextual information pertaining to the surgical procedure from the received data and/or surgical plan information received from the edge computing system 216 or an enterprise cloud server 218, such as the cloud computing system 108. The contextual information derived from the data sources 202 may include, for example, what step of the surgical procedure is being performed, whether and how a particular modular device 204 is being used, and the patient's condition.
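
Situational awareness as described above depends on both the combination and the order of data-source events. A toy rule illustrating that order sensitivity is sketched below; the event names and the inference rule are invented for illustration, since the hub's actual inference logic is not specified at this level.

```python
def infer_context(events):
    """Infer a procedure phase from an ordered list of
    (source_event, payload) tuples. Both the combination of events
    and their order matter, mirroring the situational-awareness
    behavior described for the surgical hub 214."""
    seen = [name for name, _ in events]
    if "insufflator_on" in seen and "scope_video_start" in seen:
        # same two events, different order -> different inference
        if seen.index("insufflator_on") < seen.index("scope_video_start"):
            return "laparoscopic access established"
        return "imaging before insufflation"
    return "unknown"
```

A production system would draw on many more sources (EMR data, instrument telemetry, patient monitors), but the structure — rules over combinations and orderings of events — is the same.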


The surgical hub 214 may be connected to the databases 206 of the data sources 202 to retrieve therefrom data regarding the surgical procedure that is being performed or is to be performed. The data that may be received by the situational awareness system of the surgical hub 214 from the databases 206 may include, for example, start (or setup) time or operational information regarding the procedure (e.g., a segmentectomy in the upper right portion of the thoracic cavity). The surgical hub 214 may derive contextual information regarding the surgical procedure from this data alone or from the combination of this data and other data from the data sources 202.


The surgical hub 214 may be connected to (e.g., paired with) the patient monitoring devices 208 of the data sources 202. Examples of the patient monitoring devices 208 that can be paired with the surgical hub 214 may include a pulse oximeter (SpO2 monitor), a blood pressure (BP) monitor, and an electrocardiogram (EKG) monitor. Perioperative data that is received by the situational awareness system of the surgical hub 214 from the patient monitoring devices 208 may include, for example, the patient's oxygen saturation, blood pressure, heart rate, and/or other physiological parameters. The contextual information that may be derived by the surgical hub 214 from the perioperative data transmitted by the patient monitoring devices 208 may include, for example, whether the patient is located in the operating theater or under anesthesia. The surgical hub 214 may derive these inferences from data from the patient monitoring devices 208 alone or in combination with data from other data from the data sources 202, such as a ventilator and/or other data source.


The surgical hub 214 may be connected to (e.g., paired with) the modular devices 204. Examples of the modular devices 204 that are paired with the surgical hub 214 may include a smoke evacuator, a medical imaging device such as the imaging device 130 shown in FIG. 2, an insufflator, a combined energy generator (for powering an ultrasonic surgical instrument and/or an RF electrosurgical instrument), and a ventilator.


The perioperative data received by the surgical hub 214 may be from a medical imaging device and/or other device(s). The perioperative data received by the surgical hub 214 from the medical imaging device may include, for example, whether the medical imaging device is activated and the image data itself. The contextual information that is derived by the surgical hub 214 from the perioperative data sent by the medical imaging device may include, for example, whether the procedure is a video-assisted thoracic surgery (VATS) procedure (based on whether the medical imaging device is activated or paired to the surgical hub 214 at the beginning or during the course of the procedure). The image data (e.g., still image and/or video image) from the medical imaging device (or the data stream representing the video for a digital medical imaging device) may be processed by a pattern recognition system or a machine learning system to recognize features (e.g., organs or tissue types) in the field of view (FOV) of the medical imaging device, for example. The contextual information that is derived by the surgical hub 214 from the recognized features may include, for example, what type of surgical procedure (or step thereof) is being performed, what organ is being operated on, or what body cavity is being operated in.


The situational awareness system of the surgical hub 214 may derive the contextual information from the data received from the data sources 202 in a variety of different ways. For example, the situational awareness system can include a pattern recognition system, or a machine learning system (e.g., an artificial neural network), that has been trained on training data to correlate various inputs (e.g., data from the databases 206, the patient monitoring devices 208, the modular devices 204, the HCP monitoring devices 210, and/or the environment monitoring devices 212) to corresponding contextual information regarding a surgical procedure. For example, a machine learning system may accurately derive contextual information regarding a surgical procedure from the provided inputs. In examples, the situational awareness system can include a lookup table storing pre-characterized contextual information regarding a surgical procedure in association with one or more inputs (or ranges of inputs) corresponding to the contextual information. In response to a query with one or more inputs, the lookup table can return the corresponding contextual information for the situational awareness system for controlling one or more of the modular devices 204. In examples, the contextual information received by the situational awareness system of the surgical hub 214 can be associated with a particular control adjustment or set of control adjustments for one or more of the modular devices 204. In examples, the situational awareness system can include a machine learning system, lookup table, or other such system, which may generate or retrieve one or more control adjustments for one or more of the modular devices 204 when provided the contextual information as input.
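As an illustrative, non-limiting sketch, the lookup-table approach described above may be modeled as a table associating input ranges with pre-characterized contextual information and a corresponding control adjustment for a modular device. The input names, ranges, contextual labels, and adjustments below are hypothetical and for illustration only:

```python
# Hypothetical sketch of a lookup-table-based situational awareness step.
# Each row: (input name, (low, high) range, contextual information, control adjustment).
LOOKUP_TABLE = [
    ("insufflation_pressure_mmHg", (10, 16), "abdominal procedure", {"smoke_evac_rate": "high"}),
    ("airway_pressure_cmH2O", (20, 40), "thoracic procedure", {"smoke_evac_rate": "medium"}),
]

def derive_context(inputs):
    """Return (contextual information, control adjustment) pairs matching the inputs."""
    matches = []
    for name, (low, high), context, adjustment in LOOKUP_TABLE:
        value = inputs.get(name)
        if value is not None and low <= value <= high:
            matches.append((context, adjustment))
    return matches

# Example query: an insufflation pressure reading suggests an abdominal procedure.
print(derive_context({"insufflation_pressure_mmHg": 12}))
```

In this sketch, the returned control adjustment could then be forwarded to the relevant modular device 204, consistent with the table-driven control described above.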


For example, based on data from the data sources 202, the surgical hub 214 may determine what type of tissue was being operated on. The surgical hub 214 can infer whether a surgical procedure being performed is a thoracic or an abdominal procedure, allowing the surgical hub 214 to determine whether the tissue clamped by an end effector of a surgical stapling and cutting instrument is lung tissue (for a thoracic procedure) or stomach tissue (for an abdominal procedure). The surgical hub 214 may determine whether the surgical site is under pressure (by determining that the surgical procedure is utilizing insufflation) and determine the procedure type, allowing a consistent amount of smoke evacuation for both thoracic and abdominal procedures. Based on data from the data sources 202, the surgical hub 214 may determine what step of the surgical procedure is being performed or will subsequently be performed.


The surgical hub 214 may determine what type of surgical procedure is being performed and customize an energy level according to an expected tissue profile for the surgical procedure. The situationally aware surgical hub 214 may adjust the energy level for an ultrasonic surgical instrument or RF electrosurgical instrument throughout the course of a surgical procedure, rather than just on a procedure-by-procedure basis.


In examples, data can be drawn from one or more data sources 202 to improve the conclusions that the surgical hub 214 draws from another one of the data sources 202. The surgical hub 214 may augment data that it receives from the modular devices 204 with contextual information that it has built up regarding the surgical procedure from the other data sources 202.


The situational awareness system of the surgical hub 214 can consider the physiological measurement data to provide additional context in analyzing the visualization data. The additional context can be useful when the visualization data may be inconclusive or incomplete on its own.


The surgical hub 214 may determine whether a surgeon (and/or other HCP(s)) was making an error or otherwise deviating from the expected course of action during the course of a surgical procedure. For example, the surgical hub 214 may determine a type of surgical procedure being performed, retrieve a corresponding list of steps or order of equipment usage (e.g., from a memory of the surgical hub 214 or other computer system), and compare the steps being performed or the equipment being used during the course of the surgical procedure to the expected steps or equipment for the type of surgical procedure that the surgical hub 214 determined is being performed. The surgical hub 214 can provide an alert indicating that an unexpected action is being performed or an unexpected device is being utilized at the particular step in the surgical procedure.
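The comparison of observed actions against an expected list of steps may be sketched as follows; the procedure name, step names, and alert format are hypothetical and for illustration only:

```python
# Illustrative sketch: comparing an observed procedure step against the
# expected ordered step list for the inferred procedure type.
EXPECTED_STEPS = {
    "VATS segmentectomy": ["access", "dissection", "vessel ligation", "stapling", "specimen removal"],
}

def check_step(procedure_type, step_index, observed_step):
    """Return None if the observed step matches expectations, else an alert string."""
    expected = EXPECTED_STEPS.get(procedure_type, [])
    if step_index < len(expected) and observed_step != expected[step_index]:
        return f"Alert: expected '{expected[step_index]}' but observed '{observed_step}'"
    return None

# Observing 'stapling' where 'dissection' is expected raises an alert.
print(check_step("VATS segmentectomy", 1, "stapling"))
```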


The surgical instruments (and other modular devices 204) may be adjusted for the particular context of each surgical procedure (such as adjusting to different tissue types), and actions may be validated during a surgical procedure. Next steps, data, and display adjustments may be provided to surgical instruments (and other modular devices 204) in the surgical theater according to the specific context of the surgical procedure.


Embodiments of situational awareness systems and using situational awareness systems during performance of a surgical procedure are described further in, for example, U.S. patent application Ser. No. 16/729,772 entitled “Analyzing Surgical Trends By A Surgical System” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,747 entitled “Dynamic Surgical Visualization Systems” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,744 entitled “Visualization Systems Using Structured Light” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,778 entitled “System And Method For Determining, Adjusting, And Managing Resection Margin About A Subject Tissue” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,729 entitled “Surgical Systems For Proposing And Corroborating Organ Portion Removals” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,778 entitled “Surgical System For Overlaying Surgical Instrument Data Onto A Virtual Three Dimensional Construct Of An Organ” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,751 entitled “Surgical Systems For Generating Three Dimensional Constructs Of Anatomical Organs And Coupling Identified Anatomical Structures Thereto” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,740 entitled “Surgical Systems Correlating Visualization Data And Powered Surgical Instrument Data” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,737 entitled “Adaptive Surgical System Control According To Surgical Smoke Cloud Characteristics” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,796 entitled “Adaptive Surgical System Control According To Surgical Smoke Particulate Characteristics” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,803 entitled “Adaptive Visualization By A Surgical System” filed Dec. 30, 2019, and U.S. patent application Ser. No. 16/729,807 entitled “Method Of Using Imaging Devices In Surgery” filed Dec. 
30, 2019, which are hereby incorporated by reference in their entireties.



FIG. 5 illustrates one embodiment of a surgical system 300 that may include a surgical instrument 302, such as the surgical instrument 114 of FIG. 1 or the surgical instrument 131 of FIG. 2. The surgical instrument 302 can be in communication with a console 304 and/or a portable device 306 through a local area network (LAN) 308 and/or a cloud network 310, such as the cloud computing system 108 of FIG. 1, via a wired and/or wireless connection. The console 304 and the portable device 306 may be any suitable computing device.


The surgical instrument 302 may include a handle 312, an adapter 314, and a loading unit 316. The adapter 314 releasably couples to the handle 312 and the loading unit 316 releasably couples to the adapter 314 such that the adapter 314 transmits a force from one or more drive shafts to the loading unit 316. The adapter 314 or the loading unit 316 may include a force gauge (not explicitly shown in FIG. 5) disposed therein to measure a force exerted on the loading unit 316. In some embodiments, the adapter 314 is non-releasably attached to the handle 312. In some embodiments, the adapter 314 and the loading unit 316 are integral and may be releasably attachable to the handle 312 or non-releasably attached to the handle 312.


The loading unit 316 may include an end effector 318 having a first jaw 320 and a second jaw 322. The loading unit 316 may be an in-situ loaded or multi-firing loading unit (MFLU) that allows a clinician to fire a plurality of fasteners (e.g., staples, clips, etc.) multiple times without requiring the loading unit 316 to be removed from a surgical site to reload the loading unit 316. The first and second jaws 320, 322 may be configured to clamp tissue therebetween, fire fasteners through the clamped tissue, and sever the clamped tissue. The first jaw 320 may be configured to fire at least one fastener a plurality of times or may be configured to include a replaceable multi-fire fastener cartridge including a plurality of fasteners that may be fired more than one time prior to being replaced. The second jaw 322 may include an anvil that deforms or otherwise secures the fasteners, as the fasteners are ejected from the multi-fire fastener cartridge.


The surgical instrument 302 may include a motor, such as at the handle 312, that is coupled to the one or more drive shafts to effect rotation of the one or more drive shafts. The surgical instrument 302 may include a control interface, such as at the handle 312, to selectively activate the motor. The control interface may include buttons, switches, levers, sliders, touchscreens, and/or any other suitable input mechanisms or user interfaces, which can be engaged by a clinician to activate the motor.


The control interface of the surgical instrument 302 may be in communication with a controller 324 (e.g., a microprocessor or other controller) of the surgical instrument 302, shown in the embodiment of FIG. 5 disposed in the handle 312, to selectively activate the motor to effect rotation of the one or more drive shafts. The controller 324 may be configured to receive input from the control interface, adapter data from the adapter 314, and loading unit data from the loading unit 316. The controller 324 may analyze the input from the control interface and the data received from the adapter 314 and/or the data received from the loading unit 316 to selectively activate the motor. The surgical instrument 302 may also include a display, such as at the handle 312, that is viewable by a clinician during use of the surgical instrument 302. The display may be configured to display portions of the adapter data and/or loading unit data before, during, or after firing of the surgical instrument 302.


The adapter 314 may include an adapter identification device 326 disposed therein, and the loading unit 316 may include a loading unit identification device 328 disposed therein. The adapter identification device 326 may be in communication with the controller 324, and the loading unit identification device 328 may be in communication with the controller 324. It will be appreciated that the loading unit identification device 328 may be in communication with the adapter identification device 326, which relays or passes communication from the loading unit identification device 328 to the controller 324. In embodiments in which the adapter 314 and the loading unit 316 are integral, one of the adapter identification device 326 and the loading unit identification device 328 may be omitted.


The adapter 314 may also include one or more sensors 330 disposed thereabout to detect various conditions of the adapter 314 or of the environment (e.g., if the adapter 314 is connected to a loading unit, if the adapter 314 is connected to a handle, if the one or more drive shafts are rotating, a torque of the one or more drive shafts, a strain of the one or more drive shafts, a temperature within the adapter 314, a number of firings of the adapter 314, a peak force of the adapter 314 during firing, a total amount of force applied to the adapter 314, a peak retraction force of the adapter 314, a number of pauses of the adapter 314 during firing, etc.). The one or more sensors 330 may provide an input to the adapter identification device 326 (or to the loading unit identification device 328 if the adapter identification device 326 is omitted) in the form of data signals. The data signals of the one or more sensors 330 may be stored within or be used to update the adapter data stored within the adapter identification device 326 (or the loading unit identification device 328 if the adapter identification device 326 is omitted). The data signals of the one or more sensors 330 may be analog or digital. The one or more sensors 330 may include, for example, a force gauge to measure a force exerted on the loading unit 316 during firing.


The handle 312 and the adapter 314 can be configured to interconnect the adapter identification device 326 and the loading unit identification device 328 with the controller 324 via an electrical interface. The electrical interface may be a direct electrical interface (e.g., include electrical contacts that engage one another to transmit energy and signals therebetween). Additionally, or alternatively, the electrical interface may be a non-contact electrical interface to wirelessly transmit energy and signals therebetween (e.g., inductive transfer). It is also contemplated that the adapter identification device 326 and the controller 324 may be in wireless communication with one another via a wireless connection separate from the electrical interface.


The handle 312 may include a transceiver 332 that is configured to transmit instrument data from the controller 324 to one or more other components of the surgical system 300 (e.g., the LAN 308, the cloud 310, the console 304, and/or the portable device 306). The controller 324 may also transmit instrument data and/or measurement data associated with the one or more sensors 330 to a surgical hub, such as the surgical hub 106 of FIGS. 1-3 or the surgical hub 214 of FIG. 4. The transceiver 332 may receive data (e.g., cartridge data, loading unit data, adapter data, and/or other notifications) from the surgical hub. The transceiver 332 may receive data (e.g., cartridge data, loading unit data, or adapter data) from the other components of the surgical system 300. For example, the controller 324 may transmit surgical instrument data including a serial number of an attached adapter (e.g., the adapter 314) attached to the handle 312, a serial number of a loading unit (e.g., the loading unit 316) attached to the adapter 314, and a serial number of a multi-fire fastener cartridge loaded into the loading unit 316, e.g., into one of the jaws 320, 322 at the end effector 318, to the console 304. Thereafter, the console 304 may transmit data (e.g., cartridge data, loading unit data, and/or adapter data) associated with the attached cartridge, the loading unit 316, and the adapter 314, respectively, back to the controller 324. The controller 324 can display messages on the local instrument display or transmit the messages, via the transceiver 332, to the console 304 or the portable device 306 to display the messages on a display 334 or device screen of the portable device 306, respectively.


Various exemplary embodiments of aspects of smart surgical systems, for example how smart surgical systems choose to interact with each other, are described further in, for example, U.S. patent application Ser. No. 18/810,323 entitled “Method For Multi-System Interaction” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,036 entitled “Adaptive Interaction Between Smart Healthcare Systems” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,082 entitled “Control Redirection And Image Porting Between Surgical Systems” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/809,890 entitled “Synchronized Motion Of Independent Surgical Devices” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,133 entitled “Synchronization Of The Operational Envelopes Of Independent Surgical Devices” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,170 entitled “Synchronized Motion Of Independent Surgical Devices To Maintain Relational Field Of Views” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,208 entitled “Alignment And Distortion Compensation Of Reference Planes Used By Surgical Devices” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,230 entitled “Shared Set Of Object Registrations For Surgical Devices Using Independent Reference Planes” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,266 entitled “Coordinated Control Of Therapeutic Treatment Effects” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,283 entitled “Functional Restriction Of A System Based On Information From Another Independent System” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/809,960 entitled “Inter-Connectivity Of Data Flows Between Independent Smart Systems” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,041 entitled “Inter-Connectivity Of Data Flows Between Independent Smart Systems” filed Aug. 20, 2024, U.S. patent application Ser. No. 
18/810,119 entitled “Processing And Display Of Tissue Tension” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,175 entitled “Situational Control Of Smart Surgical Devices” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,222 entitled “Method For Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,274 entitled “Visual Data-Based Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,346 entitled “Electrical Data-Based Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,355 entitled “Mechanical Data-Based Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,361 entitled “Multi-Sourced Data-Based Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,407 entitled “Conflict Resolution For Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, and U.S. patent application Ser. No. 18/810,419 entitled “Controlling Patient Monitoring Devices” filed Aug. 20, 2024, which are each hereby incorporated by reference in their entireties.


Operating Intelligent Surgical Instruments

An intelligent surgical instrument, such as the surgical instrument 114 of FIG. 1, the surgical instrument 131 of FIG. 2, or the surgical instrument 302 of FIG. 5, can have an algorithm stored thereon, e.g., in a memory thereof, configured to be executable on board the intelligent surgical instrument, e.g., by a processor thereof, to control operation of the intelligent surgical instrument. In some embodiments, instead of or in addition to being stored on the intelligent surgical instrument, the algorithm can be stored on a surgical hub, e.g., in a memory thereof, that is configured to communicate with the intelligent surgical instrument.


The algorithm may be stored in the form of one or more sets of pluralities of data points defining and/or representing instructions, notifications, signals, etc. to control functions of the intelligent surgical instrument. In some embodiments, data gathered by the intelligent surgical instrument can be used by the intelligent surgical instrument, e.g., by a processor of the intelligent surgical instrument, to change at least one variable parameter of the algorithm. As discussed above, a surgical hub can be in communication with an intelligent surgical instrument, so data gathered by the intelligent surgical instrument can be communicated to the surgical hub and/or data gathered by another device in communication with the surgical hub can be communicated to the surgical hub, and data can be communicated from the surgical hub to the intelligent surgical instrument. Thus, instead of or in addition to the intelligent surgical instrument being configured to change a stored variable parameter, the surgical hub can be configured to communicate the changed at least one variable, alone or as part of the algorithm, to the intelligent surgical instrument and/or the surgical hub can communicate an instruction to the intelligent surgical instrument to change the at least one variable as determined by the surgical hub.


The at least one variable parameter may be among the algorithm's data points, e.g., may be included in instructions for operating the intelligent surgical instrument, and is thus able to be changed by changing one or more of the stored pluralities of data points of the algorithm. After the at least one variable parameter has been changed, subsequent execution of the algorithm can be according to the changed algorithm. As such, operation of the intelligent surgical instrument over time can be managed for a patient to increase the beneficial results of use of the intelligent surgical instrument by taking into consideration actual situations of the patient and actual conditions and/or results of the surgical procedure in which the intelligent surgical instrument is being used. Changing the at least one variable parameter may be automated to improve patient outcomes. Thus, the intelligent surgical instrument can be configured to provide personalized medicine based on the patient and the patient's surrounding conditions to provide a smart system. In a surgical setting in which the intelligent surgical instrument is being used during performance of a surgical procedure, automated changing of the at least one variable parameter may allow for the intelligent surgical instrument to be controlled based on data gathered during the performance of the surgical procedure, which may help ensure that the intelligent surgical instrument is used efficiently and correctly and/or may help reduce chances of patient harm, e.g., harm to a critical anatomical structure.


The at least one variable parameter can be any of a variety of different operational parameters. Examples of variable parameters include motor speed, motor torque, energy level, energy application duration, tissue compression rate, jaw closure rate, cutting element speed, load threshold, etc.
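A minimal sketch of changing a stored variable parameter, assuming a simple key-value representation of the algorithm's data points, is shown below; the parameter names, units, and values are illustrative only, and the update could equally be pushed by a surgical hub:

```python
# Hedged sketch: variable parameters held as named data points that the
# instrument (or a hub) may overwrite mid-procedure; subsequent executions
# of the algorithm then use the changed values.
class InstrumentAlgorithm:
    def __init__(self):
        # Illustrative variable parameters stored among the algorithm's data points.
        self.params = {"motor_speed_rpm": 1200, "load_threshold_n": 50}

    def apply_update(self, update):
        """Change one or more known variable parameters in place."""
        for key, value in update.items():
            if key not in self.params:
                raise KeyError(f"unknown parameter: {key}")
            self.params[key] = value

algo = InstrumentAlgorithm()
# E.g., gathered data suggests thicker tissue, so motor speed is reduced.
algo.apply_update({"motor_speed_rpm": 900})
print(algo.params["motor_speed_rpm"])
```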


Various embodiments of operating surgical instruments are described further in, for example, U.S. Pat. App. Pub. No. 2023/0096268 entitled “Methods for Controlling Cooperative Surgical Instruments” filed Oct. 5, 2021, U.S. Pat. App. Pub. No. 2023/0097906 entitled “Surgical Methods Using Multi-Source Imaging” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0095002 entitled “Surgical Methods Using Fiducial Identification And Tracking” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0101750 entitled “Surgical Methods For Control Of One Visualization With Another” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0100698 entitled “Methods For Controlling Cooperative Surgical Instruments” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0103005 entitled “Methods for Controlling Cooperative Surgical Instruments” published Mar. 30, 2023, and U.S. Pat. App. Pub. No. 2023/0098538 entitled “Cooperative Access Hybrid Procedures” published Mar. 30, 2023, which are hereby incorporated by reference in their entireties.


Data Pipelines

As discussed herein, data may be transmitted from one point to another point, such as during a performance of a surgical procedure on a patient. The data may be transmitted from a source system to a destination system using a data pipeline.


As shown in FIG. 6A, a data pipeline 400 may move data from a source 402 to a destination 404, each of which may be physical or virtual (transient). In some data pipelines, the destination 404 may be called a “sink” or a “target.” Any time data is processed between point A and point B (or between multiple points such as points B, C, and D), there is a data pipeline 400 between those points. In general, the data pipeline 400 can include a set of tools and processes, which may be referred to as “steps” or “processing steps,” used to automate the movement and transformation of data between the source 402 and the destination 404.
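The notion of a pipeline as an ordered set of processing steps between a source and a destination can be sketched minimally as follows; the step functions and field names shown are hypothetical:

```python
# Minimal illustration of a data pipeline as ordered processing steps
# applied to each record moving from a source toward a destination.
def make_pipeline(*steps):
    """Compose processing steps into a single source-to-destination function."""
    def run(record):
        for step in steps:
            record = step(record)
        return record
    return run

# Hypothetical steps: normalize a blood-pressure reading, then tag its source.
normalize = lambda r: {**r, "bp_mmHg": round(r["bp_mmHg"])}
tag = lambda r: {**r, "source": "bp_monitor"}

pipeline = make_pipeline(normalize, tag)
print(pipeline({"bp_mmHg": 119.6}))
```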


In some embodiments, the source 402 and the destination 404 are two different elements, such as a first element of a surgical system and a second element of a surgical system. The data from the source 402 may or may not be modified by the data pipeline 400 before being received at the destination 404. For example, the source 402 may be one of the wearable sensing system(s) 111, the environmental sensing system(s) 115, the robotic system(s) 113, the intelligent instrument(s) 114, or the human interface system(s) 112 of the surgical system 102 of FIGS. 1 and 3 and the destination 404 may be the surgical hub 106 of the surgical system 102 of FIGS. 1 and 3. For another example, the source 402 may be one of the wearable sensing system(s) 111, the environmental sensing system(s) 115, the intelligent instrument(s) 114, or the human interface system(s) 112 of one of the surgical systems 102, 103, 104 of FIG. 1 and the destination 404 may be the surgical hub 106 of another one of the surgical systems 102, 103, 104 of FIG. 1. For yet another example, the source 402 may be one of the wearable sensing system(s) 111, the environmental sensing system(s) 115, the robotic system(s) 113, the intelligent instrument(s) 114, or the human interface system(s) 112 of the surgical system 102 of FIGS. 1 and 3 and the destination 404 may be another one of the wearable sensing system(s) 111, the environmental sensing system(s) 115, the robotic system(s) 113, the intelligent instrument(s) 114, or the human interface system(s) 112 of the surgical system 102 of FIGS. 1 and 3. For still another example, the source 402 may be one of the surgical systems 102, 103, 104 of FIG. 1 and the destination 404 may be another one of the surgical systems 102, 103, 104 of FIG. 1.


In some embodiments, the source 402 and the destination 404 are the same element. The data pipeline 400 may thus be purely about modifying the data set between the source 402 and the destination 404.


As shown in FIG. 6B, the data pipeline 400 may include one or more data connectors 406 that extract data from the source 402 and load the extracted data into the destination 404. A plural “N” number of data connectors 406 are shown in FIG. 6B. In some embodiments, such as embodiments in which extract, transform, and load (ETL) processing of data is performed, as opposed to extract, load, and transform (ELT) processing of data, data may be transformed within the data pipeline 400 before the data is received by the destination 404. In other embodiments, such as embodiments in which ELT processing of data is performed, as opposed to ETL processing of data, the one or more data connectors 406 may simply load raw data to the destination 404. In some instances, light transformations may be applied to the data, such as normalizing and cleaning data or orchestrating transformations into models for analysts, before the destination 404 receives the data.


The data pipeline 400 can include physical elements like one or more wires or can include digital elements like one or more packets, network traffic, or internal processor paths/connections. Flexible data pipelines are portions of the overall system where redundant paths can be utilized; data, e.g., a data stream, can be sent down one path or parsed between multiple parallel paths (to increase capacity), and these multiple paths can be flexibly adjusted by the system as necessary to accommodate changes in the volume and details of the data streams.
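Parsing a data stream between multiple parallel paths to increase capacity may be sketched as follows; the bandwidth figures and units are illustrative only:

```python
# Hedged sketch: splitting one stream's bandwidth across redundant parallel
# paths when it exceeds a single path's capacity.
def assign_paths(stream_bw, path_capacity):
    """Parse one stream into per-path allocations no larger than path_capacity."""
    allocations = []
    remaining = stream_bw
    while remaining > 0:
        chunk = min(remaining, path_capacity)
        allocations.append(chunk)
        remaining -= chunk
    return allocations

# An illustrative 25 Mb/s stream over 10 Mb/s paths uses three parallel paths.
print(assign_paths(25, 10))
```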


The data pipeline 400 can have a small code base that serves a very specific purpose. These types of applications are called microservices.


The data pipeline 400 can be a big data pipeline. There are five characteristics of big data: volume, variety, velocity, veracity, and value. Big data pipelines are data pipelines built to accommodate more than one of the five characteristics of big data. The velocity of big data makes it appealing to build streaming data pipelines for big data so that data can be captured and processed in real time and some action can then occur. The volume of big data requires that data pipelines be scalable, as the volume can be variable over time. In practice, there are likely to be many big data events that occur simultaneously or very close together, so the big data pipeline must be able to scale to process significant volumes of data concurrently. The variety of big data requires that big data pipelines be able to recognize and process data in many different formats: structured, unstructured, and semi-structured.


In general, an architecture design of a data pipeline, e.g., the data pipeline 400 of FIGS. 6A and 6B, can include interconnectivity between a first smart device and a second smart device, e.g., the source 402 and the destination 404 of FIGS. 6A and 6B. Data generated in one source system (e.g., the first smart device or the second smart device) may feed multiple data pipelines, which may have multiple other data pipelines dependent on their outputs.


The interconnectivity between the first smart device and the second smart device may be on a common/shared network, e.g., LAN, Wi-Fi, powerline networking, MoCA networking, cellular (e.g., 4G, 5G, etc.), low power wide area network (LPWAN), Zigbee, Z-wave, etc.


The interconnectivity between the first smart device and the second smart device may be on a structured network. Traditionally, structured peer-to-peer (P2P) networks implement a distributed hash table (DHT). In order to route traffic efficiently through the network, nodes in a structured overlay must maintain lists of neighbors that satisfy specific criteria. This makes them less robust in networks with a high rate of churn (e.g., with large numbers of nodes frequently joining and leaving the network). DHT-based solutions may have a high cost of advertising/discovering resources and may have static and dynamic load imbalance.


The interconnectivity between the first smart device and the second smart device may be via cooperative networking. Cooperative networking utilizes a system that is a hybrid of a P2P network and a server-client network architecture, offloading serving to peers who have recently established direct interchanges of content.


The interconnectivity between the first smart device and the second smart device may be exclusive. For example, the interconnectivity may be exclusive via Bluetooth. For another example, the interconnectivity may be exclusive via network isolation, such as by using path isolation, a virtual private network (VPN), or a secure access service edge (SASE). The path isolation may include a software-defined WAN (SD-WAN). SD-WANs rely on software and a centralized control function that can steer traffic across a WAN in a smarter way by handling traffic based on priority, security, and quality of service requirements. The VPN may involve creation of an independent secure network using common/shared open networks. Another network (a carrier network) is used to carry the data, which is encrypted. The carrier network will see packets of the data, which it routes. To users of the VPN, it will look like the systems are directly connected to each other.


For example, with interconnectivity between the first smart device and the second smart device being exclusive in a surgical context, an operating room (OR) may have a surgical hub and an established network from a first vendor. In order to secure against hacking or data leakage, the network may be an encrypted common network for which the first vendor supplies keys. A surgical stapler in the OR may be from a second vendor that is different from the first vendor and that does not have the keys from the first vendor. The surgical stapler may want to link to other device(s) it relies on for functionality but does not want data leakage. An advanced energy generator from the second vendor with an accompanying smoke evacuator may also be in the OR and may form their own private network, such as by piggybacking on the first vendor network to create a second encrypted VPN routing through the first vendor network as a primary network or by forming an independent wireless network for bi-directional communication between the advanced energy generator and the smoke evacuator. The surgical stapler may want to communicate with the advanced energy generator, e.g., so the surgical stapler may retrieve updated software from the advanced energy generator, receive tissue properties information from the advanced energy generator, log data for exportation, and receive energy from the advanced energy generator and apply the energy to tissue, but not want to communicate with the smoke evacuator, e.g., because the surgical stapler performs no smoke evacuation. The surgical stapler and a communication backplane of the advanced energy generator may therefore form an isolated network with only the surgical stapler (first smart device) and the advanced energy generator (second smart device) able to communicate via the isolated network and with the surgical hub able to manage the data pipeline between the surgical stapler and the advanced energy generator.


In general, one or more steps may be performed along a data pipeline, e.g., the data pipeline 400 of FIGS. 6A and 6B. The steps in the data pipeline may include data transformation, data augmentation, data enrichment, data filtering, data grouping, data aggregating, and running algorithms against the data.
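

The steps above (e.g., data filtering, data transformation, data grouping, and data aggregating) can be sketched as a short composed pipeline. The record fields, step order, and aggregation choice are illustrative assumptions.

```python
# Illustrative composition of data pipeline steps named in the text.
from collections import defaultdict

def run_pipeline(records):
    # Data filtering: drop readings with no value.
    filtered = [r for r in records if r.get("value") is not None]
    # Data transformation: coerce values to a common numeric type.
    transformed = [{**r, "value": float(r["value"])} for r in filtered]
    # Data grouping and aggregating: average per sensor.
    groups = defaultdict(list)
    for r in transformed:
        groups[r["sensor"]].append(r["value"])
    return {sensor: sum(values) / len(values) for sensor, values in groups.items()}

result = run_pipeline([
    {"sensor": "pulse", "value": 70},
    {"sensor": "pulse", "value": 74},
    {"sensor": "temp", "value": None},   # filtered out
])
# result == {"pulse": 72.0}
```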


The data aggregation may include segmentation of data into buckets (e.g., decomposition of a procedure into sub-steps), data fusion and interfacing, and mixing real-time data streams with archived data streams. Various embodiments of data aggregation are described further in, for example, U.S. Pat. App. Pub. No. 2022/0104896 entitled “Interactive Information Overlay On Multiple Surgical Displays” published Apr. 7, 2022, U.S. Pat. App. Pub. No. 2019/0206564 entitled “Method For Facility Data Collection And Interpretation” published Jul. 4, 2019, and U.S. Pat. App. Pub. No. 2024/0221937 entitled “Method For Advanced Algorithm Support” published Jul. 4, 2024, which are hereby incorporated by reference in their entireties.


In one embodiment, mixing real-time data streams with archived data streams may include, in a surgical context, pre-operative data/imaging evaluation. The evaluation may include displaying of static preoperative scan(s), overlaying of video with aligned 3D model, and registering a virtual view to a camera view.


In one embodiment, the display of static pre-operative scan may include alignment based on surgeon (or other HCP) position, for example, where the surgeon (or other HCP) is standing.


In one embodiment, registering the virtual view to the camera view may include identifying organs in a video and triangulating with camera location and/or getting a camera location in reference to a coordinate system. For example, during performance of a surgical procedure, a camera location may be acquired with respect to a trocar by 3D tracking of the trocar, camera insertion in the trocar (e.g., insertion depth and/or insertion angle), and/or determination of what trocar is being used for the camera. An example of insertion depth is a marking on a shaft of the trocar, such as on a graphical scale or a color gradient. Examples of insertion angle in a trocar are 3D trocar orientation and 3D angle of attack.


In one embodiment, the pre-operative data/imaging evaluation may include using a machine learning (ML) algorithm to review preoperative scans of a patient to identify any abnormalities. A cloud-based source may be used for augmented reality (AR) using cloud-based data for surgical procedure planning.


In one embodiment, an ML algorithm may be used in an initial planning stage, e.g., initial planning for a surgical procedure to be performed on a patient. Preoperative scans may be used to facilitate surgical path planning. If the initial scans detect anomalies or diseased tissues, as analyzed by the ML algorithm, the anomalies or diseased tissues may be relayed to the surgeon for the upcoming surgical procedure and a new surgical task order may be suggested based on how previous surgeons handled these problems. The relayed information to the surgeon may also include a recommended inventory list to have on hand based on this initial improved surgical task order.


For example, during preoperative scans for a sleeve gastrectomy, a small hernia may be discovered. This hernia may be highlighted during the surgical planning step, and the surgeon may be asked if the surgeon wants to include a hernial repair in the initial sleeve gastrectomy plan. Based on the surgeon's answer, the hernial repair will be added into the surgical task order for an affirmative surgeon answer, and the overall inventory for this case will be updated to include relevant items for the hernial repair added into the surgical task order. During performance of the sleeve gastrectomy, an ML algorithm may be used to detect diseased tissue or surgical anomalies. If a diseased tissue is discovered, the diseased tissue may be highlighted on a screen, e.g., on a HID, and a cutting path/angle may be recommended to avoid the tissue or make the tissue state more manageable. These recommendations may be based on how surgeons previously successfully handled these situations. If a surgical anomaly is discovered, the system may either automatically update the task order or require the surgeon to give a verbal command (or other command) to update the task order and highlight the required additional inventory on the circulator's screen. For foreign bodies (such as bougies) that may be discovered, the foreign body may be highlighted on the screen and a cutting path may be included to provide an ample margin around the foreign body, assuming the foreign body is anticipated. If the foreign body is not anticipated, the foreign body may be highlighted to draw the surgeon's (and/or other HCP's) attention to it.


In one embodiment, the pre-operative data/imaging evaluation may include a cloud comparison of scans periodically taken through time for anatomic changes over that time to indicate possible operative complications. A cloud-based source may be used for augmented reality (AR) using preoperative scans to enhance return surgeries.


Looking at a difference between current and previous surgical scans may help inform the surgeon and/or other HCP and improve patient outcomes. This information can be used in various ways, for example for disease detection, informing surgical task planning, and/or informing previous surgical success and healing.


With respect to disease detection, current and historical scans can be used to determine if various disease states or abnormalities have evolved between surgeries. One case where this could be particularly useful is cancer detection. If a scan initially picks up an abnormal growth for a patient and the patient's HCP decides that it is benign but flags it for caution, a follow-up scan may confirm whether or not the abnormality is benign. The scan may also automatically highlight areas of concern (tissue growth) that were not flagged by the HCP initially but could be areas of concern.


With respect to informing surgical task planning, information about previous surgeries (e.g., potential areas of scar tissue, previously seen difficult tissue, etc.) can help facilitate surgical step and path planning. This information can also be used during the surgery to display areas of scarring, changes of tissue from previous surgeries that might need to be examined, foreign bodies, and/or new adhesions.


With respect to informing previous surgical success and healing, data from various scans over time can be used to determine how successfully patients were recovering or had recovered from previous surgeries. This information may be used by surgeons (and/or other HCPs) to help plan future procedures, assess previous work, and/or facilitate quicker patient recovery.


Data development may be performed as a step in a data pipeline and may include one or more of data modeling, database layout and configuration, implementation, data mapping, and correction.


Various data may be communicated using a data pipeline, e.g., the data pipeline 400 of FIGS. 6A and 6B, such as data from a local data source, data from a remote data source, and synthetically generated data.


Data from a local data source may include data collected by, used by, or resulting from the operation of aspects of the local data source, e.g., data gathered using a sensor (e.g., temperature data gathered using a temperature sensor, force data gathered using a force sensor, pressure measured using a pressure sensor, etc.), still and/or video image data (e.g., data gathered by a camera, etc.), operational parameters of a surgical instrument (e.g., energy level, energy type, motor current, cutting element speed, etc.), surgical instrument identification (e.g., instrument type, instrument serial number, etc.), etc.


Data from a local data source may have metadata, which may reflect aspects of a data stream, a device configuration, and/or system behavior that define information about the data. For example, metadata may include an auxiliary data location that is shared by two interconnected systems, e.g., first and second robotic systems, etc., to create a single "brain" instead of two distinct ones. Each of the interconnected systems may create a copy of its memory system and introduce it to the combined system or "collective." Both of the interconnected systems may now use this new area for data exchange, for uni-directional communication, and to directly command control systems. The new combined system may become primary, and the individual robotic systems' memory areas may become secondary memory areas until the systems are "unpaired," e.g., are no longer interconnected.


Data from a local data source may include a data stream that is monitored by at least one other system. In this way, data collected by, used by, or resulting from the operation of aspects of one system may be sourced to another system (the monitoring system).


In one embodiment, application programming interfaces (APIs) may be used to communicate data from a local source.


In one embodiment, data may be communicated from a local source in response to occurrence of a trigger event. In one embodiment, the trigger event is a digital trigger event. For example, in a surgical context, the trigger event may be a surgical instrument changing orientation after being in a predetermined static position, such as when the surgical instrument “wakes up.” For another example, in a surgical context, the trigger event may be a system's power or signal interruption, e.g., communicating data after the interruption has been resolved. For yet another example, in a surgical context, the trigger event may be a change in system status, capacity, or connectivity. For still another example, in a surgical context, the trigger event may be quality, calibrations, or conversion factors of a surgical instrument.
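

Trigger-event-driven communication of this kind can be sketched as follows. The device fields and the "wake up" condition are hypothetical simplifications of the examples above, not the disclosed logic.

```python
# Hypothetical sketch: transmit buffered data only when a trigger event
# occurs, e.g., an instrument changing orientation after being static.

def maybe_transmit(device, send):
    woke_up = device["was_static"] and device["orientation"] != device["last_orientation"]
    if woke_up:
        send(device["buffered_data"])
    return woke_up

sent = []
fired = maybe_transmit(
    {"orientation": (1, 0, 0), "last_orientation": (0, 0, 1),
     "was_static": True, "buffered_data": [42]},
    sent.extend,
)
# fired is True and the buffered data has been sent
```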


Data from a remote data source may include data collected by, used by, or resulting from the operation of aspects of the remote data source. One example of a remote data source includes a database, such as a relational database or a NoSQL database. Examples of data in a relational database relevant in a surgical context can include inventory, sterile services process status, billing code, patient records (PHI), and previous procedure data.


One example of data from a remote data source, in a surgical context, includes a procedure plan, e.g., a plan for a surgical procedure to be performed on a patient. The procedure plan data can include, for example, instrument selection, port placement, adjuncts needed for devices, OR timing and local imaging needs, procedural steps, staff number and skill composition, and patient positioning.


Another example of data from a remote data source, in a surgical context, includes pre-operative imaging, such as a CT full body scan, external ultrasound, MRI, etc.


Another example of data from a remote data source includes software parameter updates, such as software parameter updates streaming from a cloud computing system. The software parameter updates can include, for example, original equipment manufacturer (OEM) updates to a device's operational aspects, e.g., updated basic input/output system (BIOS) controls, calibrations, updates on capabilities (e.g., recalls, limits/expansion of use, indications, contra-indications, etc.), etc.


Another example of data from a remote data source includes gold standard of care or outcomes improvement data, such as gold standard of care or outcomes improvement data from a cloud computing system. Gold standard of care or outcomes improvement data can include, for example, improved techniques of device use and/or device combinations.


In one embodiment, Apache® Hadoop®, which is an open source software framework, may be used for distributed processing of data across computer systems.


Examples of types of synthetically generated data may include synthetic text, media (e.g., video, image, sound, etc.), tabular data, and a calculated continuous stream of data. The calculated continuous stream of data may be randomly generated (bracketed by extreme thresholds) or may be based off another real continuous data stream that is modified to fit the stream limits of the expected synthetic stream. Reasons for using synthetically generated data can include: for training data streams; because of missing data from an expected system that would otherwise draw a device error but is not relevant to the operation of the device or other dependent device; for data streams designed to verify the operation of the transforms or mathematic algorithms; for data streams intended to either verify security or prevent fraud/inauthenticity; for consecutive timing data for redaction of real-time from the relational data of the systems; for creation of trending data for replacement of legal compliance regulated data streams (e.g., producing datasets from partially synthetic data, where only a selection of the dataset is replaced with synthetic data); and/or for a sudden but anticipatable/explainable change in a data source's feed which is being used as a closed loop control for a destination.
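

The first two options for a calculated continuous stream, i.e., random generation bracketed by extreme thresholds and modification of a real stream to fit expected limits, can be sketched as follows. The ranges and sample values are illustrative assumptions.

```python
import random

def synthetic_stream(n, low, high, seed=0):
    """Randomly generated synthetic stream, bracketed by extreme thresholds."""
    rng = random.Random(seed)
    return [rng.uniform(low, high) for _ in range(n)]

def fit_to_limits(real_stream, low, high):
    """Modify a real stream so every sample fits the expected synthetic limits."""
    return [min(max(x, low), high) for x in real_stream]

stream = synthetic_stream(100, 90.0, 100.0)          # stays within the brackets
clipped = fit_to_limits([85.0, 95.0, 105.0], 90.0, 100.0)
# clipped == [90.0, 95.0, 100.0]
```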


As an example of use of synthetically generated data in a surgical context in which a surgical procedure is being performed on a patient, a PO2 sensor (data source) on the patient's finger may be used as a means for controlling a closed loop feed of O2 through a ventilator (data destination). The ventilator also has an internal closed loop on CO2 outlet concentration, but since O2 blood saturation is the desired fixed relationship to O2 supplementation level, the ventilator is using the PO2 sensor from the patient monitoring system. There may be an abrupt change in the O2 level as measured by the PO2 sensor. The ventilator has two choices: either switch to the trailing indicator of CO2, which has not had an abrupt change, or examine other data sources to try to explain the O2 shift. When compared to the patient's core body temperature measure, it may be discovered that the patient's temperature has dropped across a threshold 1.5° C. below normal that usually induces vasoconstriction limiting blood flow to the body's extremities. The PO2 measure, by its metadata, is known by the ventilator to be a finger monitor and therefore on an extremity. Further comparison over time may show the O2 measure fairly constant before the shift and then fairly constant after the shift as well, reinforcing the idea that it was the vasoconstriction that induced the shift. The ventilator may then create a synthetic data stream based on the shift data pattern and behavior that compensates for the vasoconstriction shift so the ventilator can continue on the primary linked feeds but using a modified synthetic or "calculated" data stream based off a real stream. For example, current body temperature control systems, such as a Bair Hugger™ device, are open-loop user settable heat gradient controlled systems but are affected by local temperature and environment.


Data Lifecycle Management

In various aspects, the present disclosure provides methods, devices, and systems for data lifecycle management. The data lifecycle management may allow for data to be organized based on value of the data to an end user, e.g., value to a destination (such as the destination 404 of FIGS. 6A and 6B) of the data or value to a human user viewing the data. The organizing may be based on aspects of the data without implication to the end user.


Data lifecycle management may include the lifecycle of a data stream being established and altered based on at least one of a type of the data, aspects of the data, and usefulness of the data. The lifecycle timing or handling can be altered based on situational implications, and effects on lifecycle may change at least one of triggering date, triggering action, retention periods, and destination locations. Examples of the aspects of the data include details of the data itself relative to an end user, such as value, taxonomy, age, and patient relationship with one or more HCPs. The usefulness of the data may be based on the analyses it could be used to improve, which may include, in a surgical context, the helpfulness of the data for improving healing response. The situational implications that can change lifecycle may be at least one of failure severity, failure rate, expansion of claim interest, and, in a surgical context, efficiency of the system or doctors to effectively treat patients.


A score of the data may be used to alter the handling of the data. The score may be based on one or more of quality, protection level, and type or taxonomy of the data or a lifecycle impacts aspect.


In general, quality may help in resolving aspects of data to create a measurable usefulness of data. The quality of data may be a score allowing the recovering or transferring system to evaluate the aspects of the data. There may be a minimum threshold score that is required for one or more of the systems to accept or form a closed loop control on the data, but there does not have to be. The score may merely be stored and used as a means for conflict resolution or root cause analysis if an issue results.


In one embodiment, quality as an aspect of data that may alter handling of the data may include one or more of a quality score or trustworthiness score that can be based on risk of the controlled system and, in a surgical context, co-morbidity risk of the specified patient to the aspects of the measure, and risk or priority of a surgical procedure or surgical procedure steps. These scores may impact the limits of the closed loop system operable range based on the score of the data.


Quality of data may be quantified on one or more of accuracy, reliability, integrity, and conformance. Accuracy may include the correctness of the data to the sensing patient or system measure and/or precision of the measure to discern between two closely related measures on the same scale.


Reliability may include error handling, repeatability, and/or reproducibility. The error handling may include ease of detecting data errors, frequency of errors to handle, and/or determination of the magnitude of the error effect such as determination of the impact of the error on the receiving system operation. The repeatability may be a measure of the same perturbance or situation producing the same signal or output every time. Reproducibility may include consistency of the data feed and/or stability of the measure. A signal may be consistent but still be a noisy signal. A signal cannot be stable and noisy. Stability of the measure may include minimization of drop outs, minimization of lost packets, and/or tightness of the signal (e.g., minimization of noise within the signal). A measure of the variation may be used as a means for scoring stability.
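As stated above, a measure of the variation may be used as a means for scoring stability. One way such a score might be computed is sketched below; the exact formula, the dropout penalty, and the 0-to-1 scale are assumptions for illustration, not part of the disclosure.

```python
from statistics import mean, pstdev

def stability_score(samples, dropouts=0):
    """Score stability in [0, 1]: a tighter signal and fewer dropouts score higher.

    Hypothetical formula: 1 minus the coefficient of variation (a measure of
    signal tightness/noise), scaled by the fraction of packets delivered.
    """
    noise = pstdev(samples) / (abs(mean(samples)) or 1.0)  # coefficient of variation
    delivery = len(samples) / (len(samples) + dropouts)    # dropout/lost-packet penalty
    return max(0.0, 1.0 - noise) * delivery

tight = stability_score([36.5, 36.6, 36.5, 36.6])   # tight temperature feed
noisy = stability_score([30.0, 42.0, 31.0, 43.0])   # noisy feed around the same mean
# tight > noisy
```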


Integrity may include one or more of authenticity, verification, and uniqueness. The authenticity may include certification that the data source and the data points are from where they are supposed to be from, which may help prevent man-in-the-middle attacks. The authenticity may include the data originating from an authentic source, such as through recognition of non-authentic source devices (e.g., knock-off devices) or recognition of synthetically generated data not originating from any authentic source. Uniqueness may include no duplicates and/or uniqueness scoring that may define the need for data cleaning or deduplication.


Conformance may reflect that data should be collected according to defined business rules and parameters and should conform to the right format and fall within the right range; that data is timely (e.g., is available when it is required) and is in the expected form, frequency (e.g., a sample rate of the sensing system or a delivery rate of the data packets to the receiving smart system), and delivery ability (e.g., header/packet details); and/or that data is complete (e.g., metadata of the data may provide context and breadth of the measure of completeness).
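

A conformance check covering right format, right range, and timeliness could be sketched as follows. The field names, range limits, and age threshold are illustrative assumptions.

```python
# Hypothetical conformance check for one field of a data packet.

def conforms(packet, field, low, high, max_age_s):
    """Check right format, right range, and timeliness for one field."""
    value = packet.get(field)
    right_format = isinstance(value, (int, float))     # expected form
    in_range = right_format and low <= value <= high   # right range
    timely = packet.get("age_s", float("inf")) <= max_age_s  # available when required
    return right_format and in_range and timely

ok = conforms({"spo2": 97, "age_s": 0.5}, "spo2", 70, 100, max_age_s=2.0)
stale = conforms({"spo2": 97, "age_s": 10.0}, "spo2", 70, 100, max_age_s=2.0)
# ok is True; stale is False (data was not timely)
```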


In one embodiment, quality as an aspect of data that may alter handling of the data may include a specification. The specification may be standardization and/or may be that the receiving and transferring system has a predefined set of criteria that it is expecting from any data streams provided to it to create a closed loop control on. This specification of quality and data may be provided to the source system to improve conformance. If the data is "acquired" from another system, the quality score is likely affected. The acquisition of data may impact latency and/or accuracy.


In one embodiment, quality as an aspect of data that may alter handling of the data may include determining what to do with data of low quality. Data may require corroboration before being used for a control stream to a closed loop system. If the system is a high risk system, data of low quality may be disregarded, causing the system to seek a secondary related feed or operate in an open loop rather than closed loop manner. Additionally or alternatively, the data may be cleaned before use to remove variability or noise from the signal, which might result in an averaging of data or a lower data rate in order to get a higher data accuracy or integrity.
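

These options for handling low-quality data, i.e., disregarding it on a high risk system versus cleaning it by averaging at a lower data rate, can be sketched as follows. The quality threshold and averaging window are assumptions for illustration.

```python
# Hypothetical handling of a low-quality data feed per the options above.

def handle_feed(samples, quality, min_quality=0.6, high_risk=True, window=4):
    """Disregard low-quality data on a high-risk system (fall back to open
    loop), or clean by averaging, trading data rate for accuracy."""
    if quality < min_quality:
        if high_risk:
            return ("open_loop", [])  # seek a secondary feed or run open loop
        # Clean by averaging: fewer, more accurate points (lower data rate).
        averaged = [sum(samples[i:i + window]) / len(samples[i:i + window])
                    for i in range(0, len(samples), window)]
        return ("closed_loop_cleaned", averaged)
    return ("closed_loop", samples)

mode, data = handle_feed([98, 97, 99, 98], quality=0.4, high_risk=False)
# mode == "closed_loop_cleaned"; four samples averaged into one point
```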


For example, an atrial fibrillation (AFib), atrial surface remodeling procedure being performed on a patient may include ablation along a continuous path of an ablation device's electrode with selective activation that is correlated with the patient's local heart interior image and EKG. The patient's blood pressure, local heart temperature, and pulse rate are also relevant parameters to the surgeon (and/or other HCP). Depending on where the temperature is taken, e.g., using a temperature probe, the accuracy of the data stream can be +/−2° C. If the sensing system is proactively fixated to the patient, the repeatability and reproducibility may be improved over merely placing the temperature probe in the correct location. The variation of the location locally will also impact accuracy. Since all of these systems are an electronic probe that may be attached to a monitoring system (e.g., a CARESCAPE™ ONE GE system or other monitoring system), the data's metadata may provide the details of the monitoring system and the setting at which the system is monitoring the temperature probe and may include other information such as temperature probe calibration. The monitoring system may be actively cooperating with a surgical hub and a patient heating system, so it may be conforming to the data packet addressing and frequency prescribed by the receiving or transferring system. The quality score may be affected by all of these aspects, and the higher the quality of the data, the more trusting the receiving system is of the data and therefore the wider the closed loop operational window can be based on the trustworthiness of the data. If some aspect of the quality decreases during the surgical procedure or in subsequent later surgical procedures, the receiving system may reduce the upper and lower range limits of the adjustments of patient thermal load to prevent low quality data from causing instabilities in the closed loop operation of the system. When the quality drops below a minimum level, the system may go to an open loop operation rather than unstably heat/cool the patient. Both the thermal load range and the rate of patient heating can adversely impact the physiology, anesthesia, and/or other blood gases systems and even result in cardiac arrest or death if unstably applied. If the patient or surgical procedure is low risk, the allowable quality window could be larger, but high co-morbidity patients, risky surgical procedures (AFib procedure vs. cholecystectomy procedure), and/or the surgical procedure step risk may increase or decrease the tolerable quality score variation.


As mentioned above, an aspect of the data to control handling and usability of the data streams may include protection level of the data. In one embodiment, protection level as an aspect of data that may alter handling of the data includes a confidentiality level, which may reflect a privacy status (e.g., data masking, subset masking, and/or redaction) and/or a restricted status.


As mentioned above, an aspect of the data to control handling and usability of the data streams may include taxonomy (e.g., classification). In one embodiment, taxonomy as an aspect of data that may alter handling of the data includes data categorization, structured data/semi-structured data/unstructured data, open data/closed data (HIPAA), streaming data/event-driven data/time, and/or catalog of data. The catalog of data may include data curation, data mapping, data migration, and/or data integration. Data curation (e.g., management of data through its lifecycle) may minimize the manifesting of data swamps. Data mapping (e.g., matching fields from one database to another) may facilitate data migration, data integration, and other data management tasks. Data migration moves data from one system to another as a one-time event. Data integration is an ongoing process of regularly moving data from one system to another, with data stored and maintained at both the source and destination, and can be scheduled, such as quarterly or monthly, or can be triggered by an event. Like data migration, data maps for integrations may match source fields with destination fields.
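

Data mapping, i.e., matching fields from one database to another to facilitate migration or integration, can be sketched as follows. The field names and mapping are illustrative assumptions.

```python
# Hypothetical data map matching source fields with destination fields.

def map_record(record, field_map):
    """Project a source record onto a destination schema per a field map."""
    return {dest: record[src] for src, dest in field_map.items() if src in record}

mapping = {"pt_id": "patient_id", "temp": "temperature_c"}
out = map_record({"pt_id": "A12", "temp": 36.6, "extra": 1}, mapping)
# Unmapped source fields (e.g., "extra") are not carried to the destination.
```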


A lifecycle of data may be defined based on the data's source and adapted based on situational implications. In general, a data stream lifecycle controls where, when, and how the data is handled and may be adapted based on different triggering events and may result in differing lifecycle reactions.


A lifecycle of data may include collecting the data, analyzing/using the data, publishing the data (e.g., sharing the data), preserving the data (e.g., archiving the data and/or reformatting the data), reusing the data (e.g., citation of the data, data mining of the data, and/or applications to real-time transformations), retention of data (e.g., expiring data), pruning of data, and purging of data.


Various embodiments of data retention are discussed further in, for example, U.S. Pat. App. Pub. No. 2023/0026634 entitled “Surgical Data System And Classification” published Jan. 26, 2023, which is hereby incorporated by reference in its entirety.


Reusing the data may include baselining of data to adjust system performance, which may include real-time adjustments to baseline or trending of the baseline. For example, in a surgical context, a patient may be brought into a surgical procedure. Prior to any sedation or other pharmaceuticals being applied, the patient may have their baseline pulse and other metrics measured. This baseline may then be incorporated into data streams, such as pulse measurements, etc., in real time to understand the patient's pulse and other metrics as a percentage of their baseline, in addition to their absolute metrics. After sedation is applied, the system may apply a second baseline to better understand the patient's biomarkers as they exist while sedated, and for tracking changes shortly after sedation has been applied.
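

Baselining a data stream, including applying a second baseline after sedation, can be sketched as follows. The class, metric names, and sample values are hypothetical.

```python
# Hypothetical baselined metric stream, per the pre- and post-sedation
# baselining described above.

class BaselinedStream:
    """Track a metric both absolutely and as a percentage of a baseline."""

    def __init__(self, baseline):
        self.baseline = baseline

    def rebaseline(self, new_baseline):
        """Apply a new baseline, e.g., a second baseline after sedation."""
        self.baseline = new_baseline

    def reading(self, value):
        return {"absolute": value,
                "pct_of_baseline": 100.0 * value / self.baseline}

pulse = BaselinedStream(baseline=72.0)   # pre-sedation baseline pulse
pre = pulse.reading(90.0)                # 125% of the awake baseline
pulse.rebaseline(60.0)                   # second baseline after sedation
post = pulse.reading(45.0)               # 75% of the sedated baseline
```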


Reusing the data, in a surgical context, may include data mining of configurations and data during performance of a surgical procedure to influence system performance, which may allow co-orchestration of the OR symphony based upon data learnings from the procedures and/or changes to an energy delivery algorithm based on data streams. With respect to changes to an energy delivery algorithm, not all energy deliveries create equal amounts of mechanical or electrical stress on the device that is delivering the energy. Longer activations with higher current flow may result in more degradation of the device than numerous small activations of the device. As a result, the device may accommodate this by updating coefficients within its algorithm for energy delivery based on real-time acquisition of numerous data streams, including data from an energy generator, such as activation length, impedance, etc., as well as other relevant data streams within the operating room. The device may additionally inherit new coefficients for its algorithm from smart connections to other systems in the OR. Instead of proactively fixing the algorithm for changes in a new pad design, or for variations in manufacturing, the algorithm may be able to correct for the differences itself. For example, the system can detect the changes within five uses and then continue to correct over time, including, for example, if it over-corrects.
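
The coefficient-updating behavior described above can be sketched as follows. This is an illustrative sketch only: the stress model, decay rate, thresholds, and function name are all hypothetical and do not represent the actual energy delivery algorithm.

```python
# Hypothetical sketch: updating a power coefficient from streamed generator
# data (activation length, impedance), so that long, high-current activations
# reduce the allowed power to compensate for accumulated device degradation.

def update_power_coefficient(coeff, activation_seconds, impedance_ohms,
                             stress_threshold=5.0, decay=0.01):
    """Lower the power coefficient slightly after a high-stress activation."""
    # Lower impedance implies higher current flow for a given output setting.
    stress = activation_seconds * (1.0 / max(impedance_ohms, 1.0))
    if stress > stress_threshold:
        coeff *= (1.0 - decay)  # self-correct rather than a pre-fixed algorithm
    return coeff

coeff = 1.0
for length_s, z in [(2.0, 100.0), (12.0, 1.5), (11.0, 1.2)]:  # streamed activations
    coeff = update_power_coefficient(coeff, length_s, z)
# coeff has decayed after the two long, low-impedance (high-current) activations
```

A real system would also fold in data streams from other OR systems and could inherit coefficients over smart connections, as described above.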


Reusing the data may include, in a surgical context, using activations of an energy device where the performance of the device causes alarms to be triggered. Based on numerous triggers of a similar nature, the system may learn to minimize the alarms.


Pruning of the data may include pruning of patient identifying data. The pruning may be performed at a predefined timing. A remainder of the data, e.g., the non-pruned data, may be maintained until (and if) a full purging of the data is initiated. The pruning of the data may allow for maintaining statistically relevant data but not personal health information (PHI) data. The pruning may include compression of data and/or dimensional reduction of the data in phases (e.g., raw, row reduced, and deprecated).
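
The predefined-timing prune of patient-identifying data can be sketched as follows. This is an illustrative sketch only; the PHI field list, the one-year window, and the `prune_phi` helper are hypothetical.

```python
# Hypothetical sketch: after a predefined retention window, drop
# patient-identifying fields and keep the statistically relevant remainder.

from datetime import date, timedelta

PHI_FIELDS = {"name", "mrn", "surgery_date", "surgeon"}  # hypothetical PHI keys

def prune_phi(record, collected_on, today, retention=timedelta(days=365)):
    """After the retention window, remove PHI keys; keep the remainder."""
    if today - collected_on >= retention:
        return {k: v for k, v in record.items() if k not in PHI_FIELDS}
    return dict(record)  # still within the window: keep everything

record = {"name": "J. Doe", "mrn": "12345", "pulse": 72, "blood_loss_ml": 40}
pruned = prune_phi(record, collected_on=date(2023, 1, 1), today=date(2024, 6, 1))
# pruned == {"pulse": 72, "blood_loss_ml": 40}
```

The non-pruned remainder would then be maintained until (and if) a full purge is initiated, as described above.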


For example, PHI typically allows identification of a specific small group or one patient out of the overall set. It is unlikely that merely one physiological parameter, e.g., high blood pressure, etc., would be an identifying aspect, but co-morbidities (e.g., blood sugar level, medications, etc.) or surgery date, procedure type, surgeon, hospital, etc. could be used in combination to make a non-identifying data stream become PHI. Specific metadata on devices in use, such as serial numbers, versions of software, or combinations of interacting devices, could also drive non-identifying data to PHI. To purge PHI data, linkages may be broken, or all or part of the PHI may be replaced with averages from within the procedure or between multiple patients. This would convert the PHI to population data, which does not require a purging aspect or retention period.


The purging of the data may include removal of all of the PHI data, or of all of the patient/surgical data entirely, from the databases it is attached to, accessed by, stored on, or integrated into. The prune may merely be removal of that data stream or data catalog, or the removal of the data may be coupled with an exchange of synthetic data or adaptation of the algorithms that have used the data to develop or learn, so that the operative algorithm still operates but the underlying data is no longer supporting it. In the latter case, the pruning may be the removal of the data while replacing it with an average or adjustment factor to prevent shifting of the algorithms developed on it. Additionally or alternatively, the learned control algorithms may be compiled or condensed into a static, non-adapting control system that no longer requires the underlying data, or may be adapted to periodically remove older data and only adapt from the current condensed control loop to new adapted loops based on newer or forward-looking data as the older data is removed.
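
The average-substitution idea above can be sketched in miniature. This is an illustrative sketch only; the `purge_with_average` helper and the values are hypothetical.

```python
# Hypothetical sketch: purge one patient's value from a dataset while
# substituting the mean of the remaining population, so downstream statistics
# developed on the data do not shift to reflect any single retained patient.

def purge_with_average(values, index):
    """Remove values[index], replacing it with the mean of the remaining values."""
    kept = values[:index] + values[index + 1:]
    replacement = sum(kept) / len(kept)
    return kept + [replacement]

pulses = [70.0, 80.0, 90.0]
purged = purge_with_average(pulses, 0)  # patient 0's value is purged
# purged == [80.0, 90.0, 85.0]; no single-patient value for patient 0 remains
```

After substitution, the slot holds population-level data rather than PHI, echoing the conversion described earlier.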


Data lifecycle management may include determination of lifecycle triggers for a specific data type. The determination may be based on one or more of a retention period age (e.g., based on a jurisdiction's requirements for retaining protected information, such as U.S. HIPAA and state law requirements), overall age of the data, magnitude and frequency of device failure, provenance of the data, presence of unanticipated systems or data streams, predefined outcomes evidence for changing indications or clinical studies, value variability or discrepancy of duration-of-use for cataloging to improve procedure flow, and/or curation (e.g., an indexing of the data stream based on the usefulness of the data to the end user).
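
A trigger determination combining a few of the factors above can be sketched as follows. This is an illustrative sketch only; the rule order, thresholds, and action names are hypothetical.

```python
# Hypothetical sketch: pick the next lifecycle step for a data type from
# device-failure status, data age vs. retention period, and PHI presence.

def lifecycle_action(age_years, retention_years, recall_active, phi_present):
    """Return the next lifecycle step suggested by the triggers."""
    if recall_active:
        return "extend_retention"  # keep more data around a recalled device
    if age_years >= retention_years:
        # prune PHI first if present; otherwise the data can be purged outright
        return "prune" if phi_present else "purge"
    return "retain"

print(lifecycle_action(12, 10, False, True))   # prune: past retention, PHI present
print(lifecycle_action(12, 10, True, True))    # extend_retention: recall in effect
print(lifecycle_action(3, 10, False, False))   # retain: still within retention
```

In practice each factor would itself be derived from metadata on the data stream, as discussed below for retention period age and provenance.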


With respect to a retention period age, this trigger may initiate a prune operation rather than merely a purge operation to maintain portions of the non-PHI information within the database for continued ML training or expansion of the population summary of the findings. To accomplish this, each data stream would have to contain metadata differentiating between PHI and merely redacted population data, or it would require an instruction set and database organizational scheme that enables it to remove certain taxonomies of data wholesale while retaining others.


With respect to a retention period age based on U.S. HIPAA legal requirements, the HIPAA Breach Notification Rule (45 CFR §§ 164.400-414) further places a burden on entities that have PHI to notify patients of any breach of security during the retention period, which requires the entities to maintain some mechanism to track where the patients move to and prove they have been informed of the breach.


With respect to overall age of the data, the older the data is, the less relevant it is to new computational use. For example, in a surgical context, while it is likely true that portions of how surgical procedures are accomplished and their outcomes have a long or potentially infinite durability, it is also true that medicine in general is constantly updating the gold standard of care for each disease and treatment option. This will place a limit on the durability of a data set to enable a system to learn or identify trends within the data that are relevant to present-day issues, complications, and outcomes. This would likely result in a series of data-age-related purging triggers depending on the type of data that is being recorded; e.g., if the data is anatomic in nature, the retention period might be considerably longer (e.g., 20-30 years) than for data relating to the outcomes of a particular treatment option (e.g., chemo drug, infection control method, sterilization practice, etc.), which could be as short as 5 years, but is more likely 10-15 years.


For another example, a last known sale of a device may make data become less relevant as the data ages. By way of example, if device X was last sold in 1999, with a 3 year sterilization lifespan, then 2002 would have been the last date of use of device X. Records related to device X are thus of no value at 10 years beyond 2002, and are discarded in 2012.
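
The arithmetic in the device X example can be captured directly. This is an illustrative sketch only; the `discard_year` helper and its parameter names are hypothetical.

```python
# Hypothetical sketch: compute a discard year from a device's last sale year,
# its sterilization lifespan, and a record-retention window.

def discard_year(last_sale_year, lifespan_years, retention_years):
    last_use = last_sale_year + lifespan_years  # last possible use of the device
    return last_use + retention_years           # records lose value after this year

print(discard_year(1999, 3, 10))  # 2012, matching the device X example above
```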


For another example, changes in market offering may make data less relevant as the data ages. By way of example, if a device has been removed from the market and re-introduced, all data related to the earlier iterations of the device has no further value, as it has been superseded by other data.


With respect to magnitude and frequency of device failure, such a metric may be reflected by voluntary recalls (e.g., recall of a product by a manufacturer or distributor to protect public health and wellbeing), a health hazard evaluation (e.g., as Class I, Class II, or Class III), and/or severity of the data. For example, depending on the frequency of a device failure or recall, or the magnitude of Class I failures, the retention of the data collected relating to that type of device could be affected; e.g., the manufacturer, health treatment facility, or regulatory body might extend the retention timing for the portion of data around a specific product or use of the product, which would allow for a larger sample size to determine trends or track longer-term outcomes that are related to the data collected by the device.


With respect to provenance of the data, the provenance may be indicated by a source of the data, a reliability of the data, data being replaced with data of greater provenance, tagging of the data itself with metadata or other metrics from that data, and/or tagging of the data source associated with the reliability or other metrics from the data (e.g., if the unreliable data is constantly erased, it is not known how often unreliable data is being received).


With respect to value variability or discrepancy of duration-of-use for cataloging to improve procedure flow, value may be derived from what the data can be used for, e.g., improved surgical procedure time or flow, which translates into lower cost.


For example, in a surgical procedure in which a robotic surgical system is being used by a doctor to control an ablation probe, the positioning of the robotic surgical system for ideal ablation probe orientation may take some doctors 30 minutes and others more than 70 minutes. This variance in procedure step length may indicate differing levels of precision (and thus better outcomes) or merely a lack of skill (and thus merely longer procedure time). The data may be pooled with assorted outcomes and related steps to improve both outcomes and techniques.


Value of data may be indicated by one or more of usefulness of the data (e.g., relevance of the data, frequency of use of the data, and/or frequency of contribution of the data), cost/benefit analysis of data retention (e.g., data being assessed, risk balancing of data retention with acknowledgement that loss of useful data is itself a risk, and/or prioritization of the data), absolute current value (e.g., whether the data is currently useful in any way), and predicted future value of the data (e.g., whether the data is likely to be useful in the future).


With respect to the absolute current value of the data, data may be constantly collected for numerous purposes. In some instances, this data may be collected for legal purposes, but in other instances data may be collected for potential research applications. While data may sometimes be collected under potential future use scenarios, data may sometimes be determined to have no current value.


Data may lack current value because of incomplete data collection. Data may be collected to potentially support an application. However, certain data may be determined to be missing from the data set, and there is no way to supplement it. As a result, the data may be determined to not currently have any value.


Data may lack current value because of an incorrect scenario/hypothesis. Data may be collected under a hypothesis about how it may be useful. That hypothesis may later be shown to be false, and as a result, the data no longer has any value.


Data may lack current value because of incorrect data. Data may be collected, but a problem may later be identified in how the data was collected. As a result, the data may be inaccurate and no longer have any value.


A combination of no current value and no predicted future value may be the likeliest trigger for a data discard.


With respect to the predicted future value of the data, not all data will have expiration determined by local laws or regulations. Some data may be retained by a company internally only, and as such may be used as long as it is useful to the company. As storing and maintaining data comes with a cost, one question becomes how to determine how "useful" data is going forward. One metric for such a determination may be frequency of reference, e.g., how often data is utilized or referenced by other applications or areas. As data is referenced by projects, systems, or other data streams, the original data may be constantly "tagged" to indicate that it has been used. The more frequently that data is tagged, the more useful it may be considered to be. This rate of tagging of data may be monitored to understand when the value of data is decreasing going into the future. FIG. 7 illustrates one embodiment of a graph of tags for particular data by year. As shown in the graph, a trend of the particular data being tagged less begins in 2032.
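
The tag-rate monitoring above can be sketched as a simple trend check. This is an illustrative sketch only; the yearly counts and the "two consecutive drops" rule are hypothetical, chosen to mirror the FIG. 7 trend beginning in 2032.

```python
# Hypothetical sketch: detect the year in which a sustained decline in tag
# counts (references to the data) begins, signaling decreasing future value.

def first_decline_year(tags_by_year):
    """Return the first year starting a run of two consecutive drops, or None."""
    years = sorted(tags_by_year)
    for a, b, c in zip(years, years[1:], years[2:]):
        if tags_by_year[b] < tags_by_year[a] and tags_by_year[c] < tags_by_year[b]:
            return b
    return None

tags = {2029: 40, 2030: 44, 2031: 47, 2032: 41, 2033: 33, 2034: 25}
print(first_decline_year(tags))  # 2032: the year the downward trend begins
```

A declining tag rate could then feed the trigger determination discussed earlier as one input to predicted future value.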


With respect to curation, the curation may be based on taxonomy and/or reusability capacity. Taxonomy generally refers to the inherent importance of the data itself being of differing value. For example, in a surgical context, in a gallbladder procedure, the needed adaptation of pre-op data to make it applicable to the post-op data (aligning registrations) and the confirmation of a local ultrasound to subjective full body CT in assessing the presence of stones would be more valuable for later use in ML than the pulse rate, blood pressure, or common biomarkers of the patient. The more valuable data may be pruned earlier of patient-identifying data so that its purging date could be later, allowing for a large database of curated data for assisted interpretation of full body CT by ML.


For another example, in a surgical context, an atrial fibrillation (AFib), atrial surface remodeling procedure being performed on a patient may include ablation along a continuous path of an ablation device's electrode with selective activation that is correlated with the patient's local heart interior image and EKG. The patient's blood pressure, local heart temperature, and pulse rate are also relevant parameters to the surgeon (and/or other HCP). However, the EKG may be more important than blood pressure, local heart temperature, and pulse rate because the EKG allows for predictive future location of the moving heart wall in between beats, which is when the ablation probe can be fired. By contrast, in the case of a heart attack and not AFib, local heart temperature (which relates to ischemia) and blood flow are of a higher value than the EKG.


Reusability capacity generally refers to durability for prolonged use defining a data's value. Data may have relevancy to two entirely separate groups of people, learnings, or decisions. For example, in a surgical context, the ability of a local imager (e.g., via a flexible endoscopy ultrasound) to produce and adjust a pre-op CT to an operational state may be valuable for gallbladder procedure planning and for surgeon navigation. The reuse of the same comparison data may also improve the determination of stones via the CT scan alone, by using ML to establish characteristics important to the radiologist in differentiating a stone composition from a naturally occurring anatomic element.


Data lifecycle management may include adjusting the dates that would trigger the next step in the lifecycle for a data stream. Triggers for the next step may include one or more of changes in clinical care (e.g., the standard of care or other methods causes data to immediately lose value), technological shift (e.g., internally kept data that is no longer valuable due to program abandonment or technological shifts that render it irrelevant going forward), and/or living documents (e.g., data that will never reach a certain lifecycle state, such as human factors reports that are revised over time but are living documents and which never expire).


For example, in a surgical context, the adjustment may be applied to related data streams, with interrelated data feeds either being identified to a surgical hub or being self-determined based on review of the data. Interrelated data may be flagged with similar lifecycle changes based on the incident that caused the adaptation.



FIG. 8 illustrates one embodiment of a method 1300 of data lifecycle management. The method 1300 may include receiving 1302, at a destination surgical system from a source surgical system, and during a performance of a surgical procedure on a patient, a first data stream including first data collected during the performance of the surgical procedure. The first data stream may have associated first metadata. The first metadata may include an indication of a retention period for the first data stream, and the first data may include information regarding at least two of patient data, surgical procedure data, and surgical instrument data.


The method 1300 may also include storing 1304 the received first data stream at the destination surgical system. The method 1300 may also include, after the storing 1304 of the received first data stream, modifying 1306 the retention period for the first data stream based on: an aspect of the first data, a usefulness of the first data, a score of the first data, or a situational change of the source surgical system.


The method 1300 may also include, after the modifying of the retention period, modifying the retention period again based on: an aspect of the first data, a usefulness of the first data, a score of the first data, or a situational change of the destination surgical system.


The method 1300 may also include creating 1310 the first data stream at the source surgical system.
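
The shape of method 1300 (receive 1302, store 1304, modify retention 1306) can be sketched as a destination-side handler. This is an illustrative sketch only; the class, the stream/metadata layout, and the scaling `factor` standing in for an aspect, usefulness, score, or situational change are all hypothetical.

```python
# Hypothetical sketch of method 1300: receive and store a data stream with
# retention-period metadata, then modify that retention period afterward.

from datetime import timedelta

class DestinationSurgicalSystem:
    def __init__(self):
        self.store = []

    def receive(self, stream):                   # step 1302: receive the stream
        self.store.append(stream)                # step 1304: store it

    def modify_retention(self, stream, factor):  # step 1306: modify retention
        # `factor` stands in for an aspect, usefulness, score, or situational
        # change of the source system; here it simply scales the period.
        stream["metadata"]["retention"] *= factor

dest = DestinationSurgicalSystem()
stream = {"data": {"pulse": 72, "instrument": "stapler"},   # >= two data types
          "metadata": {"retention": timedelta(days=3650)}}
dest.receive(stream)
dest.modify_retention(stream, factor=2)  # e.g., extended after a device recall
# stream["metadata"]["retention"] == timedelta(days=7300)
```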


Computer Systems

A computer system may be suitable for use in implementing computerized components described herein. In broad overview of an exemplary embodiment, the computer system may include a processor configured to perform actions in accordance with instructions, and memory devices configured to store instructions and data. The processor may be in communication, via a bus, with the memory (and/or incorporates the memory) and with at least one network interface controller with a network interface for connecting to external devices, e.g., a computer system (such as a mobile phone, a tablet, a laptop, a server, etc.). The processor may also be configured to be in communication, via the bus, with any other processor(s) of the computer system and with any I/O devices at an I/O interface. Generally, a processor will execute instructions received from the memory. In some embodiments, the computer system can be configured within a cloud computing environment, a virtual or containerized computing environment, and/or a web-based microservices environment.


In more detail, the processor can be any logic circuitry that processes instructions, e.g., instructions fetched from the memory. In many embodiments, the processor may be an embedded processor, a microprocessor unit (MPU), microcontroller unit (MCU), field-programmable gate array (FPGA), or special purpose processor. The computer system can be based on any processor, e.g., a suitable digital signal processor (DSP), or set of processors, capable of operating as described herein. In some embodiments, the processor can be a single core or multi-core processor. In some embodiments, the processor can be composed of multiple processors.


The memory can be any device suitable for storing computer readable data. The memory can be a device with fixed storage or a device for reading removable storage media. Examples include all forms of non-volatile memory, media and memory devices, semiconductor memory devices (e.g., EPROM, EEPROM, SDRAM, flash memory devices, and all types of solid state memory), magnetic disks, and magneto optical disks. A computer system can have any number of memory devices.


The memory also can include a cache memory, which is generally a form of high-speed computer memory placed in close proximity to the processor for fast read/write times. In some embodiments, the cache memory is part of, or on the same chip as, the processor.


The network interface controller may be configured to manage data exchanges via the network interface. The network interface controller may handle the physical, media access control, and data link layers of the Open Systems Interconnect (OSI) model for network communication. In some embodiments, some of the network interface controller's tasks may be handled by the processor. In some embodiments, the network interface controller may be part of the processor. In some embodiments, a computer system may have multiple network interface controllers. In some implementations, the network interface may be a connection point for a physical network link, e.g., an RJ45 connector. In some embodiments, the network interface controller may support wireless network connections and an interface port may be a wireless Bluetooth transceiver. Generally, a computer system can be configured to exchange data with other network devices via physical or wireless links to a network interface. In some embodiments, the network interface controller may implement a network protocol such as LTE, Ethernet (TCP/IP), IEEE 802.11, IEEE 802.16, Bluetooth, or the like.


In some uses, the I/O interface may support an input device and/or an output device. In some uses, the input device and the output device may be integrated into the same hardware, e.g., as in a touch screen. In some uses, such as in a server context, there may be no I/O interface or the I/O interface may not be used. In some uses, additional other components may be in communication with the computer system, e.g., external devices connected via a universal serial bus (USB). In some embodiments, an I/O device may be incorporated into the computer system, e.g., a touch screen on a tablet device.


In some implementations, a computer device may include an additional device such as a co-processor, e.g., a math co-processor configured to assist the processor with high precision or complex calculations.


CONCLUSION

Certain illustrative implementations have been described to provide an overall understanding of the principles of the structure, function, manufacture, and use of the systems, devices, and methods disclosed herein. One or more examples of these implementations have been illustrated in the accompanying drawings. Those skilled in the art will understand that the systems, devices, and methods specifically described herein and illustrated in the accompanying drawings are non-limiting illustrative implementations and that the scope of the present invention is defined solely by the claims. The features illustrated or described in connection with one illustrative implementation may be combined with the features of other implementations. Such modifications and variations are intended to be included within the scope of the present invention. Further, in the present disclosure, like-named components of the implementations generally have similar features, and thus within a particular implementation each feature of each like-named component is not necessarily fully elaborated upon.


Approximating language, as used herein throughout the specification and claims, may be applied to modify any quantitative representation that can permissibly vary without resulting in a change in the basic function to which it is related. Accordingly, a value modified by a term or terms, such as "about," "approximately," and "substantially," is not to be limited to the precise value specified. In at least some instances, the approximating language may correspond to the precision of an instrument for measuring the value. Here and throughout the specification and claims, range limitations may be combined and/or interchanged; such ranges are identified and include all the sub-ranges contained therein unless context or language indicates otherwise.


One skilled in the art will appreciate further features and advantages of the invention based on the above-described implementations. Accordingly, the present application is not to be limited by what has been particularly shown and described, except as indicated by the appended claims. All publications and references cited herein are expressly incorporated herein by reference in their entirety for all purposes.

Claims
  • 1. A computer-implemented method, comprising: receiving, at a destination surgical system from a source surgical system, and during a performance of a surgical procedure on a patient, a first data stream including first data collected during the performance of the surgical procedure, wherein the first data stream has associated first metadata, the first metadata includes an indication of a retention period for the first data stream, and the first data includes information regarding at least two of patient data, surgical procedure data, and surgical instrument data; storing the received first data stream at the destination surgical system; and after the storing of the received first data stream, modifying the retention period for the first data stream based on: an aspect of the first data, a usefulness of the first data, a score of the first data, or a situational change of the source surgical system.
  • 2. The method of claim 1, wherein the retention period is modified based on the aspect of the first data; and the aspect of the first data includes one of a current value of the first data, a predicted future value of the data, a taxonomy of the first data, an age of the first data, and a change in a relationship of the patient with the patient's health care provider.
  • 3. The method of claim 1, wherein the retention period is modified based on the aspect of the first data; and the aspect of the first data includes a current value of the first data and a predicted future value of the data.
  • 4. The method of claim 1, wherein the retention period is modified based on the aspect of the first data; the aspect of the first data includes presence of patient-identifying data in the first data; and the first data is pruned of the patient-identifying data after expiration of the modified retention period, and a remainder of the first data remains stored until after expiration of the retention period.
  • 5. The method of claim 1, wherein the retention period is modified based on the aspect of the first data; and the aspect of the first data includes a taxonomy of the first data.
  • 6. The method of claim 1, wherein the retention period is modified based on the usefulness of the first data; and the usefulness of the first data includes one of: a frequency of use of the first data by the destination surgical system, and relevance of the first data to patient outcome following the performance of the surgical procedure.
  • 7. The method of claim 1, wherein the retention period is modified based on the situational change of the source surgical system; and the situational change includes one of: a recall of the source surgical system, a health hazard evaluation of the source surgical system, a failure rate of the source surgical system, and a failure severity of the source surgical system.
  • 8. The method of claim 1, wherein the retention period is modified based on the score of the first data; and the method further comprises determining the score of the first data based on at least one of a quality of the first data, a protection level of the first data, a type of the first data, and a taxonomy of the first data.
  • 9. The method of claim 1, further comprising creating the first data stream at the source surgical system, wherein the first metadata including the indication of the retention period for the first data stream is generated when the first data stream is created.
  • 10. The method of claim 1, wherein the first data stream remains stored at the destination surgical system until after expiration of the modified retention period.
  • 11. The method of claim 1, further comprising, after the modifying of the retention period, modifying the retention period again based on: an aspect of the first data, a usefulness of the first data, a score of the first data, or a situational change of the destination surgical system.
  • 12. The method of claim 1, wherein the destination surgical system includes a surgical hub; and the source surgical system is one of a hospital network, a database, a surgical instrument, or a surgical cart.
  • 13. The method of claim 12, further comprising receiving, at the surgical hub from a second source surgical system, and during the performance of the surgical procedure on the patient, a second data stream including second data collected during the performance of the surgical procedure, wherein the second data stream has associated second metadata, the second metadata includes an indication of a retention period for the second data stream, and the second data includes information regarding at least two of patient data, surgical procedure data, and surgical instrument data; storing the received second data stream at the surgical hub; and after the storing of the received second data stream, modifying the retention period for the second data stream based on: an aspect of the second data, a usefulness of the second data, a score of the second data, or a situational change of the second source surgical system.
  • 14. The method of claim 1, wherein each of the source and destination surgical systems is one of a hospital network, a database, a surgical instrument, or a surgical cart.
  • 15. A surgical data management system, comprising: a processor; and a memory storing instructions that, when executed by the processor, cause the processor to perform operations comprising: receiving, at a destination surgical system from a source surgical system, and during a performance of a surgical procedure on a patient, a first data stream including first data collected during the performance of the surgical procedure, wherein the first data stream has associated first metadata, the first metadata includes an indication of a retention period for the first data stream, and the first data includes information regarding at least two of patient data, surgical procedure data, and surgical instrument data, storing of the received first data stream at the destination surgical system, and after the storing of the received first data stream, modifying of the retention period for the first data stream based on: an aspect of the first data, a usefulness of the first data, a score of the first data, or a situational change of the source surgical system.
  • 16. The surgical data management system of claim 15, wherein the destination surgical system includes a surgical hub; and the source surgical system is one of a hospital network, a database, a surgical instrument, or a surgical cart.
  • 17. The surgical data management system of claim 15, wherein each of the source and destination surgical systems is one of a hospital network, a database, a surgical instrument, or a surgical cart.
  • 18. A computer-implemented method, comprising: receiving, at a destination surgical system from a source surgical system, and during a performance of a surgical procedure on a patient, a first data stream including first data collected during the performance of the surgical procedure, wherein the first data stream has associated first metadata, the first metadata includes an indication of a retention period for the first data stream, and the first data includes information regarding at least two of patient data, surgical procedure data, and surgical instrument data; storing of the received first data stream at the destination surgical system; and after the storing of the received first data stream, modifying of the retention period for the first data stream based on: an aspect of the first data, a usefulness of the first data, a score of the first data, or a situational change of the source surgical system.
  • 19. The method of claim 18, wherein the destination surgical system includes a surgical hub; and the source surgical system is one of a hospital network, a database, a surgical instrument, or a surgical cart.
  • 20. The method of claim 18, wherein each of the source and destination surgical systems is one of a hospital network, a database, a surgical instrument, or a surgical cart.
CROSS REFERENCE TO RELATED APPLICATIONS

The present application claims priority to U.S. Provisional Patent Application No. 63/603,031 entitled “Smart Surgical Systems” filed Nov. 27, 2023, which is hereby incorporated by reference in its entirety. The subject matter of the present application is related to the following patent applications filed on Nov. 26, 2024, which are hereby incorporated by reference in their entireties: U.S. application Ser. No. 18/960,006 entitled “Methods For Smart Surgical Systems,” U.S. application Ser. No. 18/960,032 entitled “Data Flow Management Between Surgical Systems,” U.S. application Ser. No. 18/960,047 entitled “Mapping Data Pipelines For Surgical Systems,” U.S. application Ser. No. 18/960,059 entitled “Broadcast And Peer-To-Peer Communication For Surgical Systems,” U.S. application Ser. No. 18/960,081 entitled “Data Transformation For Surgical Systems,” U.S. application Ser. No. 18/960,094 entitled “Geofencing For Surgical Systems,” U.S. application Ser. No. 18/960,107 entitled “Information Discrimination For Surgical Instruments,” and U.S. application Ser. No. 18/960,117 entitled “Adaptation Of Data Pipelines For Surgical Systems.”

Provisional Applications (1)
Number Date Country
63603031 Nov 2023 US