INFORMATION DISCRIMINATION FOR SURGICAL INSTRUMENTS

Abstract
Surgical systems and related computer-implemented methods are provided, including, during performance of a surgical procedure on a patient, receiving, at a first surgical system, a first dataflow from a second surgical system, the first dataflow including first data regarding a measured patient parameter that the first surgical system is configured to use in performing a function during the performance of the surgical procedure. The computer-implemented methods also include determining that a trigger event occurred during the performance of the surgical procedure such that a sum of a first bandwidth of the first dataflow and a second bandwidth of a second dataflow exceeds an available bandwidth, and, in response to determining that the trigger event occurred, and during the performance of the surgical procedure, adjusting at least one of the first and second dataflows such that the sum of the first and second bandwidths does not exceed the available bandwidth.
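
By way of non-limiting illustration only, the bandwidth-adjustment logic summarized above might be sketched as follows. This is a minimal sketch, not the claimed implementation; the DataFlow fields, the priority ordering, and the throttling strategy are assumptions introduced solely for this example.

```python
from dataclasses import dataclass

@dataclass
class DataFlow:
    name: str
    bandwidth: float  # current bandwidth demand, e.g., in Mbps (assumed unit)
    priority: int     # higher number = less critical to the function (assumed)

def adjust_dataflows(flows, available_bandwidth):
    """Detect the trigger event (the sum of dataflow bandwidths exceeds
    the available bandwidth) and, if it occurred, throttle the least
    critical dataflows until the sum no longer exceeds the budget."""
    excess = sum(f.bandwidth for f in flows) - available_bandwidth
    if excess <= 0:
        return flows  # no trigger event; dataflows are left unchanged
    # Throttle the least critical flows first (highest priority number).
    for flow in sorted(flows, key=lambda f: f.priority, reverse=True):
        reduction = min(excess, flow.bandwidth * 0.9)  # keep a minimal trickle
        flow.bandwidth -= reduction
        excess -= reduction
        if excess <= 0:
            break
    return flows
```
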
Description
FIELD

The present disclosure relates generally to smart surgical devices, systems, and methods.


BACKGROUND

Surgical operations and environments have benefited from advances in technology. These advances include upgraded equipment, therapeutics, techniques, and more, which have resulted in more favorable outcomes for both patients and healthcare personnel. Further benefits can be realized through the continued advancement of technology and the continued integration of such advancements into surgical operations and environments.


Computers are increasingly ubiquitous in everyday life, and as the power of computers and computing systems increases, larger quantities of data can be processed in ways that render meaningful results and information for end users. This type of big data processing has immense benefits for surgical operations and environments as well, as more information can be distilled into meaningful assistance for a user, such as a surgeon, to use and rely on during surgical operations. Ultimately, this additional information for the user can result in even more favorable outcomes for both patients and healthcare personnel.


SUMMARY

In general, smart surgical devices, systems, and methods are provided.


In one embodiment, a surgical data discrimination system is provided that includes a surgical instrument configured to, during performance of a surgical procedure on a patient, receive at least two data streams available to the surgical instrument from an external source. Each of the at least two data streams includes data collected in real time with the performance of the surgical procedure and regarding a same measured parameter. A controller of the surgical instrument is configured to control performance of a function of the surgical instrument in an open loop configuration or in a closed loop configuration. In the open loop configuration, the controller is configured to control the performance of the function without using the data from any of the at least two data streams. In the closed loop configuration, the controller is configured to control the performance of the function using the data of a one of the at least two data streams selected by the controller.
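
Purely as a hypothetical sketch of the two configurations described above (the instrument interface, the stream interface, and the selection callback are illustrative assumptions, not part of this disclosure):

```python
def control_step(instrument, streams, closed_loop, select=None):
    """One control step of the instrument's function.

    Open loop: actuate with preset parameters; none of the available
    data streams is consulted. Closed loop: actuate using feedback from
    the single stream the controller selects from the at least two
    available streams.
    """
    if not closed_loop:
        # Open loop configuration: external data is ignored.
        return instrument.actuate(instrument.preset_parameters)
    # Closed loop configuration: use the controller's selected stream
    # (at least two streams are assumed to be available here).
    chosen = select(streams) if select else streams[0]
    measurement = chosen.latest_sample()  # real-time measured parameter
    return instrument.actuate(instrument.adjust_parameters(measurement))
```
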


The surgical data discrimination system can vary in any number of ways. For example, the external source can include at least two different surgical system sources being used in relation to the performance of the surgical procedure. Further, each of the at least two different surgical system sources can be one of a hospital network, a database, a surgical instrument, or a surgical cart, e.g., each being surgical instruments, one being a surgical instrument and the other(s) being a database, etc.


For another example, the parameter can regard at least one of the patient and the surgical procedure.


For yet another example, the selection by the controller of the one of the at least two data streams can be based on which of the at least two data streams improves operational behavior of the surgical instrument in either stability or outcome.


For still another example, the controller can be configured to base the selection by the controller of the one of the at least two data streams on at least one of: a hierarchy of the at least two data streams, a trustworthiness of each of the at least two data streams, and a time-dependability of each of the at least two data streams. Further, the controller can be configured to base the selection on the hierarchy of the at least two data streams, the controller can be configured to base the selection on the trustworthiness of each of the at least two data streams, and/or the controller can be configured to base the selection on the time-dependability of each of the at least two data streams.
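
One minimal, hypothetical way to combine these three selection factors is a weighted score; the weights and the normalized stream attributes below are assumptions for illustration only, not terms from this disclosure.

```python
def select_stream(streams, w_hierarchy=0.4, w_trust=0.4, w_time=0.2):
    """Select the data stream with the highest combined score.

    Each stream is assumed to expose three attributes normalized to
    [0, 1]: hierarchy_rank (standing in a predefined source hierarchy),
    trustworthiness (e.g., source reliability), and time_dependability
    (e.g., consistency of sample freshness and latency).
    """
    def score(stream):
        return (w_hierarchy * stream.hierarchy_rank
                + w_trust * stream.trustworthiness
                + w_time * stream.time_dependability)
    return max(streams, key=score)
```
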


For another example, the controller can be configured to identify data streams available to the surgical instrument to identify the at least two data streams. Further, the controller can be configured to identify the available data streams based on the external source being communicatively coupled by a user with the surgical instrument; the external source can include at least two different surgical systems, and the controller can be configured to identify each of the available data streams based on either the surgical system broadcasting data stream identification to the surgical instrument or the controller causing the surgical instrument to interrogate the surgical system for available data stream information, and/or the controller can be configured to identify the available data streams based on an introduction of a surgical system to a same space as the surgical instrument. Further, the space can be a physical operating room space, or the space can be an interconnected network.
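
The two identification mechanisms described above (broadcast versus interrogation) might be sketched, hypothetically, as follows; the attribute and method names are illustrative assumptions.

```python
def identify_available_streams(instrument, surgical_systems):
    """Enumerate data streams available to the surgical instrument.

    A surgical system may broadcast its data stream identification
    unprompted; otherwise the instrument interrogates the system for
    available data stream information.
    """
    available = []
    for system in surgical_systems:
        if system.is_broadcasting:
            available.extend(system.broadcast_stream_ids())
        else:
            available.extend(instrument.interrogate(system))
    return available
```
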


For yet another example, the function of the surgical instrument can be one of stapling tissue of the patient, applying energy to the tissue of the patient, applying a clip to the tissue of the patient, imaging the tissue of the patient, cutting the tissue of the patient, evacuating smoke from the patient, suctioning fluid from the patient, delivering irrigation fluid to the patient, and grasping the tissue of the patient.


For still another example, the surgical instrument can be one of a surgical stapler, a surgical energy device, an imaging device, a surgical clip applier, a smoke evacuator, a surgical dissector, and a surgical grasper, and the external source can be one of a surgical stapler, a surgical energy device, an imaging device, a surgical clip applier, a smoke evacuator, a surgical dissector, and a surgical grasper.


In another embodiment, a surgical data discrimination system is provided that includes a processor and a memory storing instructions that, when executed by the processor, cause the processor to perform operations including: controlling performance of a function of a surgical instrument in an open loop configuration or in a closed loop configuration. In the open loop configuration, the performance of the function is controlled without using data from any of at least two data streams available to the surgical instrument from an external source. Each of the at least two data streams includes data collected in real time with the performance of the surgical procedure and regarding a same measured parameter. In the closed loop configuration, the performance of the function is controlled using the data of a selected one of the at least two data streams.


The surgical data discrimination system can have any number of variations. For example, a surgical instrument can include the processor and the memory. Further, the surgical instrument can be one of a surgical stapler, a surgical energy device, an imaging device, a surgical clip applier, a smoke evacuator, a surgical dissector, and a surgical grasper; and/or the external source can be one of a surgical stapler, a surgical energy device, an imaging device, a surgical clip applier, a smoke evacuator, a surgical dissector, and a surgical grasper.


For another example, a surgical hub can include the processor and the memory, and the surgical hub can be configured to be communicatively coupled with each of the surgical instrument and the external source.


For yet another example, a cloud-based server can include the processor and the memory, and the cloud-based server can be configured to be communicatively coupled with each of the surgical instrument and the external source.


For still another example, the external source can be one of a surgical stapler, a surgical energy device, an imaging device, a surgical clip applier, a smoke evacuator, a surgical dissector, and a surgical grasper.


For example, the external source can include at least two different surgical system sources being used in relation to the performance of the surgical procedure. Further, each of the at least two different surgical system sources can be one of a hospital network, a database, a surgical instrument, or a surgical cart, e.g., each being surgical instruments, one being a surgical instrument and the other(s) being a database, etc.


For another example, the parameter can regard at least one of the patient and the surgical procedure.


For yet another example, the selection of the one of the at least two data streams can be based on which of the at least two data streams improves operational behavior of the surgical instrument in either stability or outcome.


For still another example, the selection of the one of the at least two data streams can be based on at least one of: a hierarchy of the at least two data streams, a trustworthiness of each of the at least two data streams, and a time-dependability of each of the at least two data streams. Further, the selection of the one of the at least two data streams can be based on the hierarchy of the at least two data streams, the selection of the one of the at least two data streams can be based on the trustworthiness of each of the at least two data streams, and/or the selection of the one of the at least two data streams can be based on the time-dependability of each of the at least two data streams.


For another example, the operations can also include identifying data streams available to the surgical instrument to identify the at least two data streams. Further, the identification of the available data streams can be based on the external source being communicatively coupled by a user with the surgical instrument; the external source can include at least two different surgical systems, and each of the available data streams can be identified based on either the surgical system broadcasting data stream identification to the surgical instrument or the surgical instrument being caused to interrogate the surgical system for available data stream information, and/or the identification of the available data streams can be based on an introduction of a surgical system to a same space as the surgical instrument. Further, the space can be a physical operating room space, or the space can be an interconnected network.


For yet another example, the function of the surgical instrument can be one of stapling tissue of the patient, applying energy to the tissue of the patient, applying a clip to the tissue of the patient, imaging the tissue of the patient, cutting the tissue of the patient, evacuating smoke from the patient, suctioning fluid from the patient, delivering irrigation fluid to the patient, and grasping the tissue of the patient.


In another embodiment, a computer-implemented method is provided that includes controlling performance of a function of a surgical instrument in an open loop configuration or in a closed loop configuration. In the open loop configuration, the performance of the function is controlled without using data from any of at least two data streams available to the surgical instrument from an external source. Each of the at least two data streams includes data collected in real time with the performance of the surgical procedure and regarding a same measured parameter. In the closed loop configuration, the performance of the function is controlled using the data of a selected one of the at least two data streams.


The computer-implemented method can vary in any number of ways. For example, the surgical instrument can perform the controlling. Further, the surgical instrument can be one of a surgical stapler, a surgical energy device, an imaging device, a surgical clip applier, a smoke evacuator, a surgical dissector, and a surgical grasper; and/or the external source can be one of a surgical stapler, a surgical energy device, an imaging device, a surgical clip applier, a smoke evacuator, a surgical dissector, and a surgical grasper.


For another example, a surgical hub can perform the controlling, and the surgical hub can be configured to be communicatively coupled with each of the surgical instrument and the external source.


For yet another example, a cloud-based server can perform the controlling, and the cloud-based server can be configured to be communicatively coupled with each of the surgical instrument and the external source.


For still another example, the external source can be one of a surgical stapler, a surgical energy device, an imaging device, a surgical clip applier, a smoke evacuator, a surgical dissector, and a surgical grasper.


For example, the external source can include at least two different surgical system sources being used in relation to the performance of the surgical procedure. Further, each of the at least two different surgical system sources can be one of a hospital network, a database, a surgical instrument, or a surgical cart, e.g., each being surgical instruments, one being a surgical instrument and the other(s) being a database, etc.


For another example, the parameter can regard at least one of the patient and the surgical procedure.


For yet another example, the selection of the one of the at least two data streams can be based on which of the at least two data streams improves operational behavior of the surgical instrument in either stability or outcome.


For still another example, the selection of the one of the at least two data streams can be based on at least one of: a hierarchy of the at least two data streams, a trustworthiness of each of the at least two data streams, and a time-dependability of each of the at least two data streams. Further, the selection of the one of the at least two data streams can be based on the hierarchy of the at least two data streams, the selection of the one of the at least two data streams can be based on the trustworthiness of each of the at least two data streams, and/or the selection of the one of the at least two data streams can be based on the time-dependability of each of the at least two data streams.


For another example, the method can also include identifying data streams available to the surgical instrument to identify the at least two data streams. Further, the identification of the available data streams can be based on the external source being communicatively coupled by a user with the surgical instrument; the external source can include at least two different surgical systems, and the identification of each of the available data streams can be based on either the surgical system broadcasting data stream identification to the surgical instrument or the surgical instrument being caused to interrogate the surgical system for available data stream information, and/or the identification of the available data streams can be based on an introduction of a surgical system to a same space as the surgical instrument. Further, the space can be a physical operating room space, or the space can be an interconnected network.


For yet another example, the function of the surgical instrument can be one of stapling tissue of the patient, applying energy to the tissue of the patient, applying a clip to the tissue of the patient, imaging the tissue of the patient, cutting the tissue of the patient, evacuating smoke from the patient, suctioning fluid from the patient, delivering irrigation fluid to the patient, and grasping the tissue of the patient.





BRIEF DESCRIPTION OF DRAWINGS

The present invention is described by way of reference to the accompanying figures which are as follows:



FIG. 1 is a schematic view of one embodiment of a computer-implemented surgical system;



FIG. 2 is a perspective view of one embodiment of a surgical system in one embodiment of a surgical operating room;



FIG. 3 is a schematic view of one embodiment of a surgical hub paired with various systems;



FIG. 4 is a schematic view of one embodiment of a situationally aware surgical system;



FIG. 5 is a perspective view of one embodiment of a surgical instrument and one embodiment of a surgical system that includes the surgical instrument;



FIG. 6A is a schematic view of a data pipeline architecture;



FIG. 6B is an expanded schematic view of the data pipeline architecture of FIG. 6A; and



FIG. 7 is a flowchart of one embodiment of a method of information discrimination.





It is noted that the drawings are not necessarily to scale. The drawings are intended to depict only typical aspects of the subject matter disclosed herein, and therefore should not be considered as limiting the scope of the disclosure.


DETAILED DESCRIPTION

Certain exemplary embodiments will now be described to provide an overall understanding of the principles of the structure, function, manufacture, and use of the devices, systems, and methods disclosed herein. One or more examples of these embodiments are illustrated in the accompanying drawings. A person skilled in the art will understand that the devices, systems, and methods specifically described herein and illustrated in the accompanying drawings are non-limiting exemplary embodiments and that the scope of the present invention is defined solely by the claims. The features illustrated or described in connection with one exemplary embodiment may be combined with the features of other embodiments. Such modifications and variations are intended to be included within the scope of the present invention.


Further, in the present disclosure, like-named components of the embodiments generally have similar features, and thus within a particular embodiment each feature of each like-named component is not necessarily fully elaborated upon. Additionally, to the extent that linear or circular dimensions are used in the description of the disclosed systems, devices, and methods, such dimensions are not intended to limit the types of shapes that can be used in conjunction with such systems, devices, and methods. A person skilled in the art will recognize that an equivalent to such linear and circular dimensions can easily be determined for any geometric shape. A person skilled in the art will appreciate that a dimension may not be a precise value but nevertheless be considered to be at about that value due to any number of factors such as manufacturing tolerances and sensitivity of measurement equipment. Sizes and shapes of the systems and devices, and the components thereof, can depend at least on the size and shape of components with which the systems and devices will be used.


In general, a health data management system may include an interactive smart system that includes data origination facets, movement, architecture and management, and transformation and lifecycle to determine mechanisms by which smart systems talk to each other. The health data management system may include a data stack that defines handling of data from beginning to end. A data stack may include data sources, data pipelines, data transformation/modeling systems, and data storage systems that define end-to-end handling of data. In one embodiment, the health data management system may include a plurality of smart medical systems that are configured to perform one or more medical operations. The health data management system may utilize the data stack to control and manage data flow to the different smart device systems. In one embodiment, the health data management system may control and manage the data flow for managing a patient or performing a medical procedure, for example, providing surgical assistance during performance of a surgical procedure by one or more smart medical systems (also referred to herein as “surgical systems”).
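
To make the notion of a data stack concrete, the four layers named above might be modeled, purely illustratively, as follows; the class name, fields, and methods are assumptions rather than terms from this disclosure.

```python
from dataclasses import dataclass
from typing import Any, Callable, List

@dataclass
class DataStack:
    """End-to-end data handling: sources -> pipelines ->
    transformation/modeling -> storage."""
    sources: List[Any]         # e.g., smart medical systems, sensing systems
    pipelines: List[Callable]  # movement/routing stages for raw records
    transforms: List[Callable] # transformation and modeling steps
    storage: Any               # e.g., hub storage array or cloud storage

    def ingest(self):
        """Pull one record from each source, pass it through every
        pipeline stage and transform, then persist it."""
        for source in self.sources:
            record = source.read()
            for stage in self.pipelines + self.transforms:
                record = stage(record)
            self.storage.write(record)
```
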


Surgical Systems


FIG. 1 shows one embodiment of a computer-implemented surgical system 100. The surgical system 100 may include one or more surgical systems (e.g., surgical sub-systems) 102, 103, 104. As in this illustrated embodiment, the surgical system 100 may include first, second, and third surgical systems 102, 103, 104, but may instead include another number, e.g., one, two, four, etc.


The first surgical system 102 is discussed herein as a general representative of the surgical systems 102, 103, 104. For example, the surgical system 102 may include a computer-implemented interactive surgical system. For example, the surgical system 102 may include a surgical hub 106 and/or a computing device 116 in communication with a cloud computing system 108, for example, as described in FIG. 2.


The cloud computing system 108 may include at least one remote cloud server 109 and at least one remote cloud storage unit 110. Embodiments of the surgical systems 102, 103, 104 may include one or more wearable sensing systems 111, one or more environmental sensing systems 115, one or more robotic systems (also referred to herein as “robotic surgical systems”) 113, one or more intelligent instruments 114 (e.g., smart surgical instruments), one or more human interface systems 112, etc. A “human interface system” is also referred to herein as a “human interface device.” The wearable sensing system(s) 111 may include one or more HCP (“health care professional” or “health care personnel”) sensing systems and/or one or more patient sensing systems. The environmental sensing system(s) 115 may include one or more devices used for measuring one or more environmental attributes, for example, as further described in FIG. 2. The robotic system(s) 113 may include a plurality of devices used for performing a surgical procedure, for example, as further described in FIG. 2.


Examples of robotic surgical systems include the Ottava™ robotic-assisted surgery system (Johnson & Johnson of New Brunswick, NJ), da Vinci® surgical systems (Intuitive Surgical, Inc. of Sunnyvale, CA), the Hugo™ robotic-assisted surgery system (Medtronic PLC of Minneapolis, MN), the Versius® surgical robotic system (CMR Surgical Ltd of Cambridge, UK), and the Monarch® platform (Auris Health, Inc. of Redwood City, CA). Embodiments of various robotic surgical systems and using robotic surgical systems are further described in, for example, U.S. Pat. App. Pub. No. 2018/0177556 entitled “Flexible Instrument Insertion Using An Adaptive Force Threshold” filed Dec. 28, 2016, U.S. Pat. App. Pub. No. 2020/0000530 entitled “Systems And Techniques For Providing Multiple Perspectives During Medical Procedures” filed Apr. 16, 2019, U.S. Pat. App. Pub. No. 2020/0170720 entitled “Image-Based Branch Detection And Mapping For Navigation” filed Feb. 7, 2020, U.S. Pat. App. Pub. No. 2020/0188043 entitled “Surgical Robotics System” filed Dec. 9, 2019, U.S. Pat. App. Pub. No. 2020/0085516 entitled “Systems And Methods For Concomitant Medical Procedures” filed Sep. 3, 2019, U.S. Pat. No. 8,831,782 entitled “Patient-Side Surgeon Interface For A Teleoperated Surgical Instrument” filed Jul. 15, 2013, and Intl. Pat. Pub. No. WO 2014151621 entitled “Hyperdexterous Surgical System” filed Mar. 13, 2014, which are hereby incorporated by reference in their entireties.


The surgical system 102 may be in communication with the one or more remote servers 109 that may be part of the cloud computing system 108. In an example embodiment, the surgical system 102 may be in communication with the one or more remote servers 109 via an internet service provider's cable/FIOS networking node. In an example embodiment, a patient sensing system may be in direct communication with the one or more remote servers 109. The surgical system 102 (and/or various sub-systems, smart surgical instruments, robots, sensing systems, and other computerized devices described herein) may collect data in real-time and transfer the data to the cloud computing system 108 for data processing and manipulation, e.g., by the one or more remote servers 109. It will be appreciated that cloud computing may rely on sharing computing resources rather than having local servers or personal devices to handle software applications.


The surgical system 102 and/or a component therein may communicate with the one or more remote servers 109 via a cellular transmission/reception point (TRP) or a base station using one or more of the following cellular protocols: GSM/GPRS/EDGE (2G), UMTS/HSPA (3G), long term evolution (LTE) or 4G, LTE-Advanced (LTE-A), new radio (NR) or 5G, and/or other wired or wireless communication protocols. Various embodiments of cloud-based analytics that may be performed by the cloud computing system 108 are described further in, for example, U.S. Pat. App. Pub. No. 2019/0206569 entitled “Method Of Cloud Based Data Analytics For Use With The Hub” published Jul. 4, 2019, which is hereby incorporated by reference in its entirety.


The surgical hub 106 may have cooperative interactions with one or more means of displaying an image, e.g., a display configured to display an image from a laparoscopic scope, etc., and information from one or more other smart devices and/or one or more sensing systems. The surgical hub 106 may interact with the one or more sensing systems, the one or more smart devices, and the one or more means of displaying an image. The surgical hub 106 may be configured to gather measurement data from the sensing system(s) and send notifications or control messages to the sensing system(s). The surgical hub 106 may send and/or receive information, including notification information, to and/or from the one or more human interface systems 112. The one or more human interface systems 112 may include one or more human interface devices (HIDs). The surgical hub 106 may send notification information and/or control information to audio devices, displays, and/or various other devices that are in communication with the surgical hub 106.


In an exemplary embodiment, the one or more sensing systems may include the one or more wearable sensing systems 111 (which may include one or more HCP sensing systems and/or one or more patient sensing systems) and/or the environmental sensing system(s) 115 shown in FIG. 1. The sensing system(s) may measure data relating to various biomarkers.


In an exemplary embodiment, the sensing system(s) may measure the biomarkers using one or more sensors, for example, photosensors (e.g., photodiodes, photoresistors), mechanical sensors (e.g., motion sensors), acoustic sensors, electrical sensors, electrochemical sensors, thermoelectric sensors, infrared sensors, etc. The sensor(s) may measure the biomarkers using one or more of the following sensing technologies: photoplethysmography, electrocardiography, electroencephalography, colorimetry, impedimetry, potentiometry, amperometry, etc.


The biomarkers measured by the sensing system(s) may include, but are not limited to, sleep, core body temperature, maximal oxygen consumption, physical activity, alcohol consumption, respiration rate, oxygen saturation, blood pressure, blood sugar, heart rate variability, blood potential of hydrogen, hydration state, heart rate, skin conductance, peripheral temperature, tissue perfusion pressure, coughing and sneezing, gastrointestinal motility, gastrointestinal tract imaging, respiratory tract bacteria, edema, mental aspects, sweat, circulating tumor cells, autonomic tone, circadian rhythm, and/or menstrual cycle.


The biomarkers may relate to physiologic systems, which may include, but are not limited to, behavior and psychology, cardiovascular system, renal system, skin system, nervous system, gastrointestinal system, respiratory system, endocrine system, immune system, tumor, musculoskeletal system, and/or reproductive system. Information from the biomarkers may be determined and/or used by the computer-implemented surgical system 100, for example. The information from the biomarkers may be determined and/or used by the computer-implemented patient and the computer-implemented surgical system 100 to improve said systems and/or to improve patient outcomes, for example.


The sensing system(s) may send data to the surgical hub 106. The sensing system(s) may use one or more of the following radiofrequency (RF) protocols for communicating with the surgical hub 106: Bluetooth, Bluetooth Low-Energy (BLE), Bluetooth Smart, Zigbee, Z-wave, IPv6 Low-power wireless Personal Area Network (6LoWPAN), Wi-Fi, etc.


Various embodiments of sensing systems, biomarkers, and physiological systems are described further in, for example, U.S. Pat. App. Pub. No. 2022/0233119 entitled “Method Of Adjusting A Surgical Parameter Based On Biomarker Measurements” published Jul. 28, 2022, which is hereby incorporated by reference in its entirety.


The sensing systems described herein may be employed to assess physiological conditions of a surgeon operating on a patient, a patient being prepared for a surgical procedure, or a patient recovering after a surgical procedure. The cloud-based computing system 108 may be used to monitor biomarkers associated with an HCP (a surgeon, a nurse, etc.) or a patient in real time, to generate surgical plans based at least on measurement data gathered prior to a surgical procedure, to provide control signals to one or more surgical instruments during a surgical procedure, and to notify a patient of a complication during a post-surgical period.


The cloud-based computing system 108 may be used to analyze surgical data. Surgical data may be obtained via the intelligent instrument(s) 114, the wearable sensing system(s) 111, the environmental sensing system(s) 115, the robotic system(s) 113, and/or the like in the surgical system 102. Surgical data may include tissue states used to assess leaks or perfusion of sealed tissue after a tissue sealing and cutting procedure, pathology data including images of samples of body tissue, anatomical structures of the body captured using a variety of sensors integrated with imaging devices and techniques (such as overlaying images captured by multiple imaging devices), image data, and/or the like. The surgical data may be analyzed to improve surgical procedure outcomes by determining whether further treatment, such as the application of endoscopic intervention, emerging technologies, targeted radiation, targeted intervention, or precise robotics, to tissue-specific sites and conditions is warranted. Such data analysis may employ outcome analytics processing and, using standardized approaches, may provide beneficial feedback to either confirm surgical treatments and the behavior of the surgeon or suggest modifications to surgical treatments and the behavior of the surgeon.



FIG. 2 shows one embodiment of the surgical system 102 in one embodiment of a surgical operating room 135. As illustrated in FIG. 2, a patient is being operated on by one or more HCPs. The HCP(s) are being monitored by one or more HCP sensing systems 120 worn by the HCP(s). The HCP(s) and the environment surrounding the HCP(s) may also be monitored by one or more environmental sensing systems including, for example, one or more cameras 121, one or more microphones 122, and other sensors that may be deployed in the operating room. The one or more HCP sensing systems 120 and the environmental sensing systems may be in communication with the surgical hub 106, which in turn may be in communication with the one or more cloud servers 109 of the cloud computing system 108, as shown in FIG. 1. The one or more environmental sensing systems may be used for measuring one or more environmental attributes, for example, HCP position in the surgical theater, HCP movements, ambient noise in the surgical theater, temperature/humidity in the surgical theater, etc.


As illustrated in FIG. 2, a primary display 123 and one or more audio output devices (e.g., speakers 119) are positioned in a sterile field of the surgical system 102 to be visible to an operator at an operating table 124. In addition, a visualization/notification tower 126 is positioned outside the sterile field. The visualization/notification tower 126 may include a first non-sterile human interactive device (HID) 127 and a second non-sterile HID 129, which may be displays and may face away from each other. The display 123 and the HIDs 127, 129 may include a touch screen allowing a human to interface directly with the HIDs 127, 129. A human interface system, guided by the surgical hub 106, may be configured to utilize the display 123 and the HIDs 127, 129 to coordinate information flow to operators inside and outside the sterile field. In an exemplary embodiment, the surgical hub 106 may cause an HID (e.g., the primary display 123) to display a notification and/or information about the patient and/or a surgical procedure step. In an exemplary embodiment, the surgical hub 106 may prompt for and/or receive input from personnel in the sterile field or in the non-sterile area. In an exemplary embodiment, the surgical hub 106 may cause one or more non-sterile HIDs 127, 129 to display a snapshot of a surgical site, as recorded by an imaging device 130, while maintaining a live feed of the surgical site on one or more sterile HIDs, e.g., the primary HID 123. The snapshot on the non-sterile HID(s) 127, 129 can permit a non-sterile operator to perform a diagnostic step relevant to the surgical procedure, for example.


The surgical hub 106 may be configured to route a diagnostic input or feedback entered by a non-sterile operator at the visualization tower 126 to the primary display 123 within the sterile field, where it can be viewed by a sterile operator at the operating table. In an exemplary embodiment, the input can be in the form of a modification to the snapshot displayed on the non-sterile HID(s) 127, 129, which can be routed to the one or more sterile HIDs, e.g., the primary display 123, by the surgical hub 106.


Various embodiments of surgical hubs are further described in, for example, U.S. Pat. App. Pub. No. 2024/0221937 entitled “Method For Advanced Algorithm Support” published Jul. 4, 2024, U.S. Pat. App. Pub. No. 2024/0112768 entitled “Method For Health Data And Consent Management” published Apr. 4, 2024, U.S. Pat. App. Pub. No. 2024/0220763 entitled “Data Volume Determination For Surgical Machine Learning Applications” published Jul. 2, 2024, U.S. Pat. App. Pub. No. 2019/0206564 entitled “Method For Facility Data Collection And Interpretation” published Jul. 4, 2019, U.S. Pat. App. Pub. No. 2023/0026634 entitled “Surgical Data System And Classification” published Jan. 26, 2023, U.S. Pat. App. Pub. No. 2019/0201115 entitled “Aggregation And Reporting Of Surgical Hub Data” published Jul. 4, 2019, U.S. Pat. App. Pub. No. 2023/0372030 entitled “Automatic Compilation, Annotation, And Dissemination Of Surgical Data To Systems To Anticipate Related Automated Operations” published Nov. 23, 2023, U.S. Pat. App. Pub. No. 2022/0104896 entitled “Interactive Information Overlay On Multiple Surgical Displays” published Apr. 7, 2022, U.S. Pat. No. 11,304,699 entitled “Method For Adaptive Control Schemes For Surgical Network Control And Interaction” issued Apr. 19, 2022, U.S. Pat. No. 10,849,697 entitled “Cloud Interface For Coupled Surgical Devices” issued Dec. 1, 2020, U.S. Pat. App. Pub. No. 2022/0239577 entitled “Ad Hoc Synchronization Of Data From Multiple Link Coordinated Sensing Systems” published Jul. 28, 2022, U.S. Pat. App. Pub. No. 2023/0025061 entitled “Surgical Data System And Management” published Jan. 26, 2023, U.S. Pat. App. Pub. No. 2023/0023083 entitled “Method Of Surgical System Power Management, Communication, Processing, Storage And Display” published Jan. 26, 2023, U.S. Pat. App. Pub. No. 2019/0206556 entitled “Real-Time Analysis Of Comprehensive Cost Of All Instrumentation Used In Surgery Utilizing Data Fluidity To Track Instruments Through Stocking And In-House Processes” published Jul. 4, 2019, U.S. Pat. App. Pub. No. 2023/0096268 entitled “Methods for Controlling Cooperative Surgical Instruments” filed Oct. 5, 2021, U.S. Pat. App. Pub. No. 2019/0200844 entitled “Method Of Hub Communication, Processing, Storage And Display” filed Dec. 4, 2018, U.S. Pat. App. Pub. No. 2019/0200981 entitled “Method Of Compressing Tissue Within A Stapling Device And Simultaneously Displaying The Location Of The Tissue Within The Jaws” filed Dec. 4, 2018, U.S. Pat. App. Pub. No. 2019/0201046 entitled “Method For Controlling Smart Energy Devices” filed Dec. 4, 2018, U.S. Pat. App. Pub. No. 2019/0201114 entitled “Adaptive Control Program Updates For Surgical Hubs” filed Mar. 29, 2018, U.S. Pat. App. Pub. No. 2019/0201140 entitled “Surgical Hub Situational Awareness” filed Mar. 29, 2018, U.S. Pat. App. Pub. No. 2019/0206004 entitled “Interactive Surgical Systems With Condition Handling Of Devices And Data Capabilities” filed Mar. 29, 2018, U.S. Pat. App. Pub. No. 2019/0206555 entitled “Cloud-based Medical Analytics For Customization And Recommendations To A User” filed Mar. 29, 2018, U.S. Pat. No. 11,678,881 entitled “Spatial Awareness Of Surgical Hubs In Operating Rooms” filed Mar. 29, 2018, and U.S. Pat. App. Pub. No. 2019/0207857 entitled “Surgical Network Determination Of Prioritization Of Communication, Interaction, Or Processing Based On System Or Device Needs” filed Nov. 6, 2018, which are hereby incorporated by reference in their entireties.


As in the illustrated embodiment of FIG. 2, one or more surgical instruments 131 may be used in the surgical procedure as part of the surgical system 102. The surgical hub 106 may be configured to coordinate information flow to at least one display showing the surgical instrument(s) 131. A diagnostic input or feedback entered by a non-sterile operator at the visualization tower 126 can be routed by the surgical hub 106 to the at least one display, e.g., the primary display 123, within the sterile field, where it can be viewed by the operator of the surgical instrument(s) 131.


Various embodiments of coordinating information flow and display and various embodiments of surgical instruments are described further in, for example, U.S. Pat. No. 11,937,769 entitled “Method Of Hub Communication, Processing, Storage And Display” issued Mar. 26, 2024, which is hereby incorporated by reference in its entirety.


Examples of surgical instruments include a surgical dissector, a surgical stapler, a surgical grasper, a surgical scope (e.g., an endoscope, a laparoscope, etc.), a surgical energy device (e.g., a mono-polar probe, a bi-polar probe, an ablation probe, an ultrasound device, an ultrasonic end effector, etc.), a surgical clip applier, etc.


Various embodiments of surgical instruments are described further in, for example, U.S. Pat. No. 11,723,642 entitled “Cooperative Access Hybrid Procedures” issued Aug. 14, 2023, U.S. Pat. App. Pub. No. 2013/0256377 entitled “Layer Comprising Deployable Attachment Members” filed Feb. 8, 2013, U.S. Pat. No. 8,393,514 entitled “Selectively Orientable Implantable Fastener Cartridge” filed Sep. 30, 2010, U.S. Pat. No. 8,317,070 entitled “Surgical Stapling Devices That Produce Formed Staples Having Different Lengths” filed Feb. 28, 2007, U.S. Pat. No. 7,143,925 entitled “Surgical Instrument Incorporating EAP Blocking Lockout Mechanism” filed Jun. 21, 2005, U.S. Pat. App. Pub. No. 2015/0134077 entitled “Sealing Materials for Use in Surgical Stapling” filed Nov. 8, 2013, U.S. Pat. App. Pub. No. 2015/0134076 entitled “Hybrid Adjunct Materials for Use in Surgical Stapling” filed Nov. 8, 2013, U.S. Pat. App. Pub. No. 2015/0133996 entitled “Positively Charged Implantable Materials and Method of Forming the Same” filed Nov. 8, 2013, U.S. Pat. App. Pub. No. 2015/0129634 entitled “Tissue Ingrowth Materials and Method of Using the Same” filed Nov. 8, 2013, U.S. Pat. App. Pub. No. 2015/0133995 entitled “Hybrid Adjunct Materials for Use in Surgical Stapling” filed Nov. 8, 2013, U.S. Pat. No. 9,913,642 entitled “Surgical Instrument Comprising a Sensor System” filed Mar. 26, 2014, U.S. Pat. No. 10,172,611 entitled “Adjunct Materials and Methods of Using Same in Surgical Methods for Tissue Sealing” filed Jun. 10, 2014, U.S. Pat. No. 8,989,903 entitled “Methods And Systems For Indicating A Clamping Prediction” filed Jan. 13, 2012, U.S. Pat. No. 9,072,535 entitled “Surgical Stapling Instruments With Rotatable Staple Deployment Arrangements” filed May 27, 2011, U.S. Pat. No. 9,072,536 entitled “Differential Locking Arrangements For Rotary Powered Surgical Instruments” filed Jun. 28, 2012, U.S. Pat. No. 10,531,929 entitled “Control Of Robotic Arm Motion Based On Sensed Load On Cutting Tool” filed Aug. 16, 2016, U.S. Pat. No. 10,709,516 entitled “Curved Cannula Surgical System Control” filed Apr. 2, 2018, U.S. Pat. No. 11,076,926 entitled “Manual Release For Medical Device Drive System” filed Mar. 21, 2018, U.S. Pat. No. 9,839,487 entitled “Method For Engaging Surgical Instrument With Teleoperated Actuator” filed Mar. 17, 2015, U.S. Pat. No. 10,543,051 entitled “Method For Engaging Surgical Instrument With Teleoperated Actuator” issued Jan. 28, 2020, U.S. Pat. No. 9,804,618 entitled “Systems And Methods For Controlling A Segmented Circuit” filed Mar. 25, 2014, U.S. Pat. No. 11,607,239 entitled “Systems And Methods For Controlling A Surgical Stapling And Cutting Instrument” filed Apr. 15, 2016, U.S. Pat. No. 10,052,044 entitled “Time Dependent Evaluation Of Sensor Data To Determine Stability, Creep, And Viscoelastic Elements Of Measures” filed Mar. 6, 2015, U.S. Pat. No. 9,439,649 entitled “Surgical Instrument Having Force Feedback Capabilities” filed Dec. 12, 2012, U.S. Pat. No. 10,751,117 entitled “Electrosurgical Instrument With Fluid Diverter” filed Sep. 23, 2016, U.S. Pat. No. 11,160,602 entitled “Control Of Surgical Field Irrigation” filed Aug. 29, 2017, U.S. Pat. No. 9,877,783 entitled “Energy Delivery Systems And Uses Thereof” filed Dec. 30, 2016, U.S. Pat. No. 11,266,458 entitled “Cryosurgical System With Pressure Regulation” filed Apr. 19, 2019, U.S. Pat. No. 10,314,649 entitled “Flexible Expandable Electrode And Method Of Intraluminal Delivery Of Pulsed Power” filed Aug. 2, 2012, U.S. Pat. App. Pub. No. 2023/0116781 entitled “Surgical Devices, Systems, And Methods Using Multi-Source Imaging” filed Oct. 5, 2021, U.S. Pat. App. Pub. No. 2023/0102358 entitled “Surgical Devices, Systems, And Methods Using Fiducial Identification And Tracking” filed Oct. 5, 2021, U.S. Pat. No. 10,413,373 entitled “Robotic Visualization And Collision Avoidance” filed Aug. 16, 2016, U.S. Pat. App. Pub. No. 2023/0077141 entitled “Robotically Controlled Uterine Manipulator” filed Sep. 21, 2021, and U.S. Pat. App. Pub. No. 2022/0273309 entitled “Stapler Reload Detection And Identification” filed May 16, 2022, which are hereby incorporated by reference herein in their entireties.


As shown in FIG. 2, the surgical system 102 can be used to perform a surgical procedure on the patient who is lying down on the operating table 124 in the surgical operating room 135. A robotic system 134 may be used in the surgical procedure as a part of the surgical system 102. The robotic system 134 may include a surgeon's console 136, a patient side cart 132 (surgical robot), and a surgical robotic hub 133. The patient side cart 132 can manipulate at least one removably coupled surgical instrument 137 through a minimally invasive incision in the body of the patient while the surgeon views the surgical site via the surgeon's console 136. An image of the surgical site can be obtained by the imaging device 130, which can be manipulated by the patient side cart 132 to orient the imaging device 130. The surgical robotic hub 133 can be used to process the images of the surgical site for subsequent display to the surgeon via the surgeon's console 136.


Various embodiments of robotic systems and various embodiments of surgical instruments are described further in, for example, U.S. Pat. No. 11,559,307 entitled “Method Of Robotic Hub Communication, Detection, And Control” issued Jan. 24, 2023, which is hereby incorporated by reference in its entirety.


The imaging device 130 may include at least one image sensor and one or more optical components. Suitable image sensors may include, but are not limited to, Charge-Coupled Device (CCD) sensors and Complementary Metal-Oxide Semiconductor (CMOS) sensors.


The optical components of the imaging device 130 may include one or more illumination sources and/or one or more lenses. The one or more illumination sources may be directed to illuminate portions of the surgical field. The one or more image sensors may receive light reflected or refracted from the surgical field, including light reflected or refracted from tissue and/or surgical instruments. The illumination source(s) may be configured to radiate electromagnetic energy in the visible spectrum as well as the invisible spectrum. The visible spectrum, sometimes referred to as the “optical spectrum” or the “luminous spectrum,” is the portion of the electromagnetic spectrum that is visible to (e.g., can be detected by) the human eye and may be referred to as “visible light” or simply “light.” A typical human eye will respond to wavelengths in air that range from about 380 nm to about 750 nm.


The invisible spectrum (e.g., the non-luminous spectrum) is the portion of the electromagnetic spectrum that lies below and above the visible spectrum (i.e., wavelengths below about 380 nm and above about 750 nm). The invisible spectrum is not detectable by the human eye. Wavelengths greater than about 750 nm are longer than the red visible spectrum, and they become invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths less than about 380 nm are shorter than the violet spectrum, and they become invisible ultraviolet, x-ray, and gamma ray electromagnetic radiation.
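
The visible/invisible boundaries given above lend themselves to a trivial classification; the following sketch simply restates those numeric ranges in code for illustration.

```python
def classify_wavelength(nm: float) -> str:
    """Classify a wavelength in air (nanometers) per the ranges above."""
    if nm < 380:
        return "invisible (ultraviolet, x-ray, gamma ray)"
    if nm <= 750:
        return "visible light"
    return "invisible (infrared, microwave, radio)"
```
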


In various aspects, the imaging device 130 is configured for use in a minimally invasive procedure. Examples of imaging devices include an arthroscope, angioscope, bronchoscope, choledochoscope, colonoscope, cystoscope, duodenoscope, enteroscope, esophagogastro-duodenoscope (gastroscope), endoscope, laryngoscope, nasopharyngo-nephroscope, sigmoidoscope, thoracoscope, and ureteroscope.


The imaging device 130 may employ multi-spectrum monitoring to discriminate topography and underlying structures. A multi-spectral image is one that captures image data within specific wavelength ranges across the electromagnetic spectrum. The wavelengths may be separated by filters or by the use of instruments that are sensitive to particular wavelengths, including light from frequencies beyond the visible light range, e.g., infrared (IR) and ultraviolet (UV). Spectral imaging can allow extraction of additional information that the human eye fails to capture with its receptors for red, green, and blue. Multi-spectrum monitoring can be a useful tool in relocating a surgical field after a surgical task is completed to perform one or more of the previously described tests on the treated tissue. It is axiomatic that strict sterilization of the operating room and surgical equipment is required during any surgery. The strict hygiene and sterilization conditions required in a “surgical theater,” e.g., an operating or treatment room, necessitate the highest possible sterility of all medical devices and equipment. Part of that sterilization process is the need to sterilize anything that comes in contact with the patient or penetrates the sterile field, including the imaging device 130 and its attachments and components. It will be appreciated that the sterile field may be considered a specified area, such as within a tray or on a sterile towel, that is considered free of microorganisms, or the sterile field may be considered an area immediately around a patient who has been prepared for a surgical procedure. The sterile field may include the scrubbed team members, who are properly attired, and all furniture and fixtures in the area.


Various embodiments of multi-spectral imaging are described further in, for example, U.S. Pat. No. 11,937,769 entitled “Method Of Hub Communication, Processing, Storage And Display” filed Dec. 4, 2018, which is hereby incorporated by reference in its entirety.


The wearable sensing system(s) 111 illustrated in FIG. 1 may include the one or more HCP sensing systems 120 as shown in FIG. 2. The one or more HCP sensing systems 120 may include sensing system(s) to monitor and detect a set of physical states and/or a set of physiological states of an HCP. An HCP may be a surgeon or one or more healthcare personnel assisting the surgeon or other healthcare service providers in general. The HCP sensing system 120 may send the measurement data associated with a set of biomarkers and data associated with a physical state of the surgeon to the surgical hub 106 for further processing. In an exemplary embodiment, an HCP sensing system 120 may measure a set of biomarkers to monitor the heart rate of an HCP. In an exemplary embodiment, an HCP sensing system 120 worn on a surgeon's wrist (e.g., a watch or a wristband) may use an accelerometer to detect hand motion and/or shakes and determine the magnitude and frequency of tremors.


The environmental sensing system(s) 115 shown in FIG. 1 may send environmental information to the surgical hub 106. In an exemplary embodiment, the environmental sensing system(s) 115 may include a camera 121 for detecting hand/body position of an HCP. The environmental sensing system(s) 115 may include one or more microphones 122 for measuring ambient noise in the surgical theater. Other environmental sensing system(s) 115 may include one or more devices, for example, a thermometer to measure temperature, a hygrometer to measure humidity of the surroundings in the surgical theater, etc. The surgeon biomarkers may include one or more of the following: stress, heart rate, etc. The environmental measurements from the surgical theater may include ambient noise level associated with the surgeon or the patient, surgeon and/or staff movements, surgeon and/or staff attention level, etc. The surgical hub 106, alone or in communication with the cloud computing system 108, may use the surgeon biomarker measurement data and/or environmental sensing information to modify control algorithms of hand-held instruments or the averaging delay of a robotic interface, for example, to minimize tremors.


The surgical hub 106 may use the surgeon biomarker measurement data associated with an HCP to adaptively control the one or more surgical instruments 131. For example, the surgical hub 106 may send a control program to one of the one or more surgical instruments 131 to control the surgical instrument's actuators to limit or compensate for fatigue and use of fine motor skills. The surgical hub 106 may send the control program based on situational awareness and/or the context on importance or criticality of a task. The control program may instruct the instrument to alter operation to provide more control when control is needed.



FIG. 3 shows an embodiment of the surgical system 102 including the surgical hub 106. The surgical hub 106 may be paired with, via a modular control, the one or more wearable sensing systems 111, the one or more environmental sensing systems 115, the one or more human interface systems 112, the one or more robotic systems 113, and the one or more intelligent instruments 114. As in this illustrated embodiment, the surgical hub 106 may include a display 148, an imaging module 149, a generator module 150 (e.g., an energy generator), a communication module 156, a processor module 157, a storage array 158, and an operating-room mapping module 159. In certain aspects, as illustrated in FIG. 3, the surgical hub 106 further includes a smoke evacuation module 154 and/or a suction/irrigation module 155. The various modules and systems may be connected to the modular control either directly via a router or via the communication module 156. The operating theater devices may be coupled to cloud computing resources and data storage, e.g., to the cloud computing system 108, via the modular control. The human interface system(s) 112 may include a display sub-system and a notification sub-system.


The modular control may be coupled to a non-contact sensor module. The non-contact sensor module may measure the dimensions of the operating theater and generate a map of the surgical theater using ultrasonic, laser-type, and/or similar non-contact measurement devices. Other distance sensors can be employed to determine the bounds of an operating room. The sensor module may be configured to determine the size of the operating theater and to adjust Bluetooth-pairing distance limits. A laser-based non-contact sensor module may scan the operating theater by transmitting laser light pulses, receiving laser light pulses that bounce off the perimeter walls of the operating theater, and comparing the phase of the transmitted pulse to the received pulse to determine the size of the operating theater and to adjust Bluetooth pairing distance limits, for example.


An ultrasound-based non-contact sensor module may scan the operating theater by transmitting a burst of ultrasound and receiving the echo when it bounces off the perimeter walls of an operating theater, as described further in, for example, U.S. Pat. No. 11,857,152 entitled “Surgical Hub Spatial Awareness To Determine Devices In Operating Theater” issued Jan. 2, 2024, U.S. Pat. No. 11,278,281 entitled “Interactive Surgical Platform” issued Mar. 22, 2022, and U.S. Prov. Pat. App. No. 62/611,341 entitled “Interactive Surgical Platform” filed Dec. 28, 2017, which are hereby incorporated by reference herein in their entireties.
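
Both sensor types rely on round-trip measurement of an emitted signal. As a hypothetical sketch of the underlying arithmetic only (the constants, the phase-shift rangefinding formula for an amplitude-modulated laser, and the pairing-margin rule are assumptions for illustration, not part of this disclosure):

```python
import math

SPEED_OF_LIGHT_M_S = 299_792_458.0
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees C

def distance_from_echo(round_trip_s, speed_m_s=SPEED_OF_SOUND_M_S):
    """One-way distance to a wall from a round-trip echo time."""
    return speed_m_s * round_trip_s / 2.0

def distance_from_phase(phase_shift_rad, modulation_freq_hz):
    """One-way distance from the phase difference between transmitted
    and received pulses of an amplitude-modulated laser signal:
    d = c * delta_phi / (4 * pi * f)."""
    return SPEED_OF_LIGHT_M_S * phase_shift_rad / (4 * math.pi * modulation_freq_hz)

def bluetooth_pairing_limit(wall_distances_m, margin=1.1):
    """Bound the Bluetooth pairing distance by the largest measured
    room dimension plus a small margin."""
    return max(wall_distances_m) * margin
```
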


During a surgical procedure, energy application to tissue, for sealing and/or cutting, may be associated with smoke evacuation, suction of excess fluid, and/or irrigation of the tissue. Fluid, power, and/or data lines from different sources may be entangled during the surgical procedure. Valuable time can be lost addressing this issue during a surgical procedure. Detangling the lines may necessitate disconnecting the lines from their respective modules, which may require resetting the modules. A hub modular enclosure 160 of the surgical hub 106 may offer a unified environment for managing the power, data, and fluid lines, which reduces the frequency of entanglement between such lines.


Energy may be applied to tissue at a surgical site. The surgical hub 106 may include the hub modular enclosure 160 and a combo generator module slidably receivable in a docking station of the hub modular enclosure 160. The docking station may include data and power contacts. The combo generator module may include two or more of: an ultrasonic energy generator component, a bipolar radiofrequency (RF) energy generator component, or a monopolar RF energy generator component that are housed in a single unit. The combo generator module may include a smoke evacuation component, at least one energy delivery cable for connecting the combo generator module to a surgical instrument, at least one smoke evacuation component configured to evacuate smoke, fluid, and/or particulates generated by the application of therapeutic energy to the tissue, and a fluid line extending from the remote surgical site to the smoke evacuation component. The fluid line may be a first fluid line, and a second fluid line may extend from the remote surgical site to a suction and irrigation module 155 slidably received in the hub modular enclosure 160. The hub modular enclosure 160 may include a fluid interface.


The combo generator module may generate multiple energy types for application to the tissue. One energy type may be more beneficial for cutting the tissue, while another different energy type may be more beneficial for sealing the tissue. For example, a bipolar generator can be used to seal the tissue while an ultrasonic generator can be used to cut the sealed tissue. In an exemplary embodiment, the hub modular enclosure 160 may be configured to accommodate different generators and facilitate an interactive communication therebetween. The hub modular enclosure 160 may enable the quick removal and/or replacement of various modules.


The hub modular enclosure 160 may include a first energy-generator module, configured to generate a first energy for application to the tissue, and a first docking station comprising a first docking port that includes first data and power contacts, wherein the first energy-generator module is slidably movable into an electrical engagement with the first data and power contacts and wherein the first energy-generator module is slidably movable out of the electrical engagement with the first data and power contacts. The modular surgical enclosure may include a second energy-generator module configured to generate a second energy, different than the first energy, for application to the tissue, and a second docking station comprising a second docking port that includes second data and power contacts, wherein the second energy-generator module is slidably movable into an electrical engagement with the second data and power contacts, and wherein the second energy-generator module is slidably movable out of the electrical engagement with the second data and power contacts. In addition, the modular surgical enclosure may include a communication bus between the first docking port and the second docking port, configured to facilitate communication between the first energy-generator module and the second energy-generator module.


As shown in FIG. 3, the hub modular enclosure 160 may allow the modular integration of the generator module 150, the smoke evacuation module 154, and the suction/irrigation module 155. The hub modular enclosure 160 may facilitate interactive communication between the operating-room mapping, smoke evacuation, and suction/irrigation modules 159, 154, 155. The generator module 150 can be a generator with integrated monopolar, bipolar, and ultrasonic components supported in a single housing unit slidably insertable into the hub modular enclosure 160. The generator module 150 may connect to a monopolar device 151, a bipolar device 152, and an ultrasonic device 153. The generator module 150 may include a series of monopolar, bipolar, and/or ultrasonic generator modules that interact through the hub modular enclosure 160. The hub modular enclosure 160 may facilitate the insertion of multiple generators and interactive communication between the generators docked into the hub modular enclosure 160 so that the generators act as a single generator.


A surgical data network having a set of communication hubs may connect the sensing system(s), the modular devices located in one or more operating theaters of a healthcare facility, a patient recovery room, or a room in a healthcare facility specially equipped for surgical operations, to the cloud computing system 108.



FIG. 4 illustrates one embodiment of a situationally aware surgical system 200. Data sources 202 of the situationally aware surgical system 200 may include, for example, modular devices 204, databases 206 (e.g., an electronic medical records (EMR) database, such as of a hospital or other medical facility, containing patient records, etc.), patient monitoring devices 208 (e.g., a blood pressure (BP) monitor, an electrocardiography (EKG) monitor, one or more wearable sensing systems 111, etc.), HCP monitoring devices 210 (e.g., one or more wearable sensing systems 111, etc.), and/or environment monitoring devices 212 (e.g., one or more environmental sensing systems 115, etc.).


The modular devices 204 may include sensors configured to detect parameters associated with a patient, HCPs, and the environment, and/or with the modular device 204 itself. The modular devices 204 may include the one or more intelligent instrument(s) 114.


The data sources 202 may be in communication (e.g., wirelessly or wired) with a surgical hub 214, such as the surgical hub 106. The surgical hub 214 may derive contextual information pertaining to a surgical procedure from data based upon, for example, the particular combination(s) of data received from the data sources 202 or the particular order in which the data is received from the data sources 202. The contextual information inferred from the received data can include, for example, the type of surgical procedure being performed, the particular step of the surgical procedure that a surgeon (and/or other HCP) is performing, the type of tissue being operated on, or a body cavity that is the subject of the surgical procedure. This ability of the surgical hub 214 to derive or infer information related to the surgical procedure from received data can be referred to as “situational awareness.” For example, the surgical hub 214 can incorporate a situational awareness system, which may be the hardware and/or programming associated with the surgical hub 214 that derives contextual information pertaining to the surgical procedure from the received data and/or from surgical plan information received from the edge computing system 216 or an enterprise cloud server 218, such as the cloud computing system 108. The contextual information derived from the data sources 202 may include, for example, what step of the surgical procedure is being performed, whether and how a particular modular device 204 is being used, and the patient's condition.


The surgical hub 214 may be connected to the databases 206 of the data sources 202 to retrieve therefrom data regarding the surgical procedure that is being performed or is to be performed. The data that may be received by the situational awareness system of the surgical hub 214 from the databases 206 may include, for example, start (or setup) time or operational information regarding the procedure (e.g., a segmentectomy in the upper right portion of the thoracic cavity). The surgical hub 214 may derive contextual information regarding the surgical procedure from this data alone or from the combination of this data and other data from the data sources 202.


The surgical hub 214 may be connected to (e.g., paired with) the patient monitoring devices 208 of the data sources 202. Examples of the patient monitoring devices 208 that can be paired with the surgical hub 214 may include a pulse oximeter (SpO2 monitor), a blood pressure (BP) monitor, and an electrocardiogram (EKG) monitor. Perioperative data that is received by the situational awareness system of the surgical hub 214 from the patient monitoring devices 208 may include, for example, the patient's oxygen saturation, blood pressure, heart rate, and/or other physiological parameters. The contextual information that may be derived by the surgical hub 214 from the perioperative data transmitted by the patient monitoring devices 208 may include, for example, whether the patient is located in the operating theater or under anesthesia. The surgical hub 214 may derive these inferences from data from the patient monitoring devices 208 alone or in combination with other data from the data sources 202, such as data from a ventilator and/or other data source.


The surgical hub 214 may be connected to (e.g., paired with) the modular devices 204. Examples of the modular devices 204 that are paired with the surgical hub 214 may include a smoke evacuator, a medical imaging device such as the imaging device 130 shown in FIG. 2, an insufflator, a combined energy generator (for powering an ultrasonic surgical instrument and/or an RF electrosurgical instrument), and a ventilator.


The perioperative data received by the surgical hub 214 may be from a medical imaging device and/or other device(s). The perioperative data received by the surgical hub 214 from the medical imaging device may include, for example, whether the medical imaging device is activated, as well as image data. The contextual information that is derived by the surgical hub 214 from the perioperative data sent by the medical imaging device may include, for example, whether the procedure is a video-assisted thoracic surgery (VATS) procedure (based on whether the medical imaging device is activated or paired to the surgical hub 214 at the beginning or during the course of the procedure). The image data (e.g., still image and/or video image) from the medical imaging device (or the data stream representing the video for a digital medical imaging device) may be processed by a pattern recognition system or a machine learning system to recognize features (e.g., organs or tissue types) in the field of view (FOV) of the medical imaging device, for example. The contextual information that is derived by the surgical hub 214 from the recognized features may include, for example, what type of surgical procedure (or step thereof) is being performed, what organ is being operated on, or what body cavity is being operated in.


The situational awareness system of the surgical hub 214 may derive the contextual information from the data received from the data sources 202 in a variety of different ways. For example, the situational awareness system can include a pattern recognition system, or a machine learning system (e.g., an artificial neural network), that has been trained on training data to correlate various inputs (e.g., data from the databases 206, the patient monitoring devices 208, the modular devices 204, the HCP monitoring devices 210, and/or the environment monitoring devices 212) to corresponding contextual information regarding a surgical procedure. For example, a machine learning system may accurately derive contextual information regarding a surgical procedure from the provided inputs. In examples, the situational awareness system can include a lookup table storing pre-characterized contextual information regarding a surgical procedure in association with one or more inputs (or ranges of inputs) corresponding to the contextual information. In response to a query with one or more inputs, the lookup table can return the corresponding contextual information for the situational awareness system for controlling one or more of the modular devices 204. In examples, the contextual information received by the situational awareness system of the surgical hub 214 can be associated with a particular control adjustment or set of control adjustments for one or more of the modular devices 204. In examples, the situational awareness system can include a machine learning system, lookup table, or other such system, which may generate or retrieve one or more control adjustments for one or more of the modular devices 204 when provided the contextual information as input.
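As one illustration of the lookup-table variant just described, the following minimal Python sketch maps a combination of inputs to pre-characterized contextual information and an associated control adjustment. The table keys, input names, and adjustments are hypothetical examples, not the disclosed mappings.

```python
# Hypothetical pre-characterized table: (insufflator_active, imaging_paired)
# -> (contextual information, control adjustments for modular devices 204).
LOOKUP = {
    (True, False): ("abdominal/laparoscopic", {"smoke_evac_rate": "high"}),
    (False, True): ("thoracic/VATS", {"smoke_evac_rate": "low"}),
}

def derive_context(insufflator_active: bool, imaging_paired: bool):
    """Query the table with one or more inputs; return the corresponding
    contextual information and adjustments, or None if unmatched."""
    return LOOKUP.get((insufflator_active, imaging_paired))

result = derive_context(True, False)
if result is not None:
    procedure_context, adjustments = result
    # The adjustments would then be applied to one or more modular devices.
```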


For example, based on data from the data sources 202, the surgical hub 214 may determine what type of tissue was being operated on. The surgical hub 214 can infer whether a surgical procedure being performed is a thoracic or an abdominal procedure, allowing the surgical hub 214 to determine whether the tissue clamped by an end effector of a surgical stapling and cutting instrument is lung tissue (for a thoracic procedure) or stomach tissue (for an abdominal procedure). The surgical hub 214 may determine whether the surgical site is under pressure (by determining that the surgical procedure is utilizing insufflation) and determine the procedure type, allowing a consistent amount of smoke evacuation for both thoracic and abdominal procedures. Based on data from the data sources 202, the surgical hub 214 may determine what step of the surgical procedure is being performed or will subsequently be performed.


The surgical hub 214 may determine what type of surgical procedure is being performed and customize an energy level according to an expected tissue profile for the surgical procedure. The situationally aware surgical hub 214 may adjust the energy level for an ultrasonic surgical instrument or RF electrosurgical instrument throughout the course of a surgical procedure, rather than just on a procedure-by-procedure basis.


In examples, data can be drawn from one or more data sources 202 to improve the conclusions that the surgical hub 214 draws from another one of the data sources 202. The surgical hub 214 may augment data that it receives from the modular devices 204 with contextual information that it has built up regarding the surgical procedure from the other data sources 202.


The situational awareness system of the surgical hub 214 can consider the physiological measurement data to provide additional context in analyzing the visualization data. The additional context can be useful when the visualization data may be inconclusive or incomplete on its own.


The surgical hub 214 may determine whether a surgeon (and/or other HCP(s)) was making an error or otherwise deviating from the expected course of action during the course of a surgical procedure. For example, the surgical hub 214 may determine a type of surgical procedure being performed, retrieve a corresponding list of steps or order of equipment usage (e.g., from a memory of the surgical hub 214 or other computer system), and compare the steps being performed or the equipment being used during the course of the surgical procedure to the expected steps or equipment for the type of surgical procedure that the surgical hub 214 determined is being performed. The surgical hub 214 can provide an alert indicating that an unexpected action is being performed or an unexpected device is being utilized at the particular step in the surgical procedure.


The surgical instruments (and other modular devices 204) may be adjusted for the particular context of each surgical procedure (such as adjusting to different tissue types), and actions may be validated during a surgical procedure. Next steps, data, and display adjustments may be provided to surgical instruments (and other modular devices 204) in the surgical theater according to the specific context of the surgical procedure.


Embodiments of situational awareness systems and using situational awareness systems during performance of a surgical procedure are described further in, for example, U.S. patent application Ser. No. 16/729,772 entitled “Analyzing Surgical Trends By A Surgical System” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,747 entitled “Dynamic Surgical Visualization Systems” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,744 entitled “Visualization Systems Using Structured Light” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,778 entitled “System And Method For Determining, Adjusting, And Managing Resection Margin About A Subject Tissue” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,729 entitled “Surgical Systems For Proposing And Corroborating Organ Portion Removals” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,778 entitled “Surgical System For Overlaying Surgical Instrument Data Onto A Virtual Three Dimensional Construct Of An Organ” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,751 entitled “Surgical Systems For Generating Three Dimensional Constructs Of Anatomical Organs And Coupling Identified Anatomical Structures Thereto” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,740 entitled “Surgical Systems Correlating Visualization Data And Powered Surgical Instrument Data” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,737 entitled “Adaptive Surgical System Control According To Surgical Smoke Cloud Characteristics” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,796 entitled “Adaptive Surgical System Control According To Surgical Smoke Particulate Characteristics” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,803 entitled “Adaptive Visualization By A Surgical System” filed Dec. 30, 2019, and U.S. patent application Ser. No. 16/729,807 entitled “Method Of Using Imaging Devices In Surgery” filed Dec. 30, 2019, which are hereby incorporated by reference in their entireties.



FIG. 5 illustrates one embodiment of a surgical system 300 that may include a surgical instrument 302, such as the surgical instrument 114 of FIG. 1 or the surgical instrument 131 of FIG. 2. The surgical instrument 302 can be in communication with a console 304 and/or a portable device 306 through a local area network (LAN) 308 and/or a cloud network 310, such as the cloud computing system 108 of FIG. 1, via a wired and/or wireless connection. The console 304 and the portable device 306 may be any suitable computing device.


The surgical instrument 302 may include a handle 312, an adapter 314, and a loading unit 316. The adapter 314 releasably couples to the handle 312 and the loading unit 316 releasably couples to the adapter 314 such that the adapter 314 transmits a force from one or more drive shafts to the loading unit 316. The adapter 314 or the loading unit 316 may include a force gauge (not explicitly shown in FIG. 5) disposed therein to measure a force exerted on the loading unit 316. In some embodiments, the adapter 314 is non-releasably attached to the handle 312. In some embodiments, the adapter 314 and the loading unit 316 are integral and may be releasably attachable to the handle 312 or non-releasably attached to the handle 312.


The loading unit 316 may include an end effector 318 having a first jaw 320 and a second jaw 322. The loading unit 316 may be an in-situ loaded or multi-firing loading unit (MFLU) that allows a clinician to fire a plurality of fasteners (e.g., staples, clips, etc.) multiple times without requiring the loading unit 316 to be removed from a surgical site to reload the loading unit 316. The first and second jaws 320, 322 may be configured to clamp tissue therebetween, fire fasteners through the clamped tissue, and sever the clamped tissue. The first jaw 320 may be configured to fire at least one fastener a plurality of times or may be configured to include a replaceable multi-fire fastener cartridge including a plurality of fasteners that may be fired more than one time prior to being replaced. The second jaw 322 may include an anvil that deforms or otherwise secures the fasteners, as the fasteners are ejected from the multi-fire fastener cartridge.


The surgical instrument 302 may include a motor, such as at the handle 312, that is coupled to the one or more drive shafts to effect rotation of the one or more drive shafts. The surgical instrument 302 may include a control interface, such as at the handle 312, to selectively activate the motor. The control interface may include buttons, switches, levers, sliders, touchscreens, and/or any other suitable input mechanisms or user interfaces, which can be engaged by a clinician to activate the motor.


The control interface of the surgical instrument 302 may be in communication with a controller 324 (e.g., a microprocessor or other controller) of the surgical instrument 302, shown in the embodiment of FIG. 5 disposed in the handle 312, to selectively activate the motor to effect rotation of the one or more drive shafts. The controller 324 may be configured to receive input from the control interface, adapter data from the adapter 314, and loading unit data from the loading unit 316. The controller 324 may analyze the input from the control interface and the data received from the adapter 314 and/or the loading unit 316 to selectively activate the motor. The surgical instrument 302 may also include a display, such as at the handle 312, that is viewable by a clinician during use of the surgical instrument 302. The display may be configured to display portions of the adapter data and/or loading unit data before, during, or after firing of the surgical instrument 302.


The adapter 314 may include an adapter identification device 326 disposed therein, and the loading unit 316 may include a loading unit identification device 328 disposed therein. The adapter identification device 326 may be in communication with the controller 324, and the loading unit identification device 328 may be in communication with the controller 324. It will be appreciated that the loading unit identification device 328 may be in communication with the adapter identification device 326, which relays or passes communication from the loading unit identification device 328 to the controller 324. In embodiments in which the adapter 314 and the loading unit 316 are integral, one of the adapter identification device 326 and the loading unit identification device 328 may be omitted.


The adapter 314 may also include one or more sensors 330 disposed thereabout to detect various conditions of the adapter 314 or of the environment (e.g., if the adapter 314 is connected to a loading unit, if the adapter 314 is connected to a handle, if the one or more drive shafts are rotating, a torque of the one or more drive shafts, a strain of the one or more drive shafts, a temperature within the adapter 314, a number of firings of the adapter 314, a peak force of the adapter 314 during firing, a total amount of force applied to the adapter 314, a peak retraction force of the adapter 314, a number of pauses of the adapter 314 during firing, etc.). The one or more sensors 330 may provide an input to the adapter identification device 326 (or to the loading unit identification device 328 if the adapter identification device 326 is omitted) in the form of data signals. The data signals of the one or more sensors 330 may be stored within or be used to update the adapter data stored within the adapter identification device 326 (or the loading unit identification device 328 if the adapter identification device 326 is omitted). The data signals of the one or more sensors 330 may be analog or digital. The one or more sensors 330 may include, for example, a force gauge to measure a force exerted on the loading unit 316 during firing.


The handle 312 and the adapter 314 can be configured to interconnect the adapter identification device 326 and the loading unit identification device 328 with the controller 324 via an electrical interface. The electrical interface may be a direct electrical interface (e.g., include electrical contacts that engage one another to transmit energy and signals therebetween). Additionally, or alternatively, the electrical interface may be a non-contact electrical interface to wirelessly transmit energy and signals therebetween (e.g., inductive transfer). It is also contemplated that the adapter identification device 326 and the controller 324 may be in wireless communication with one another via a wireless connection separate from the electrical interface.


The handle 312 may include a transceiver 332 that is configured to transmit instrument data from the controller 324 to one or more other components of the surgical system 300 (e.g., the LAN 308, the cloud 310, the console 304, and/or the portable device 306). The controller 324 may also transmit instrument data and/or measurement data associated with the one or more sensors 330 to a surgical hub, such as the surgical hub 106 of FIGS. 1-3 or the surgical hub 214 of FIG. 4. The transceiver 332 may receive data (e.g., cartridge data, loading unit data, adapter data, and/or other notifications) from the surgical hub. The transceiver 332 may receive data (e.g., cartridge data, loading unit data, or adapter data) from the other components of the surgical system 300. For example, the controller 324 may transmit surgical instrument data including a serial number of an attached adapter (e.g., the adapter 314) attached to the handle 312, a serial number of a loading unit (e.g., the loading unit 316) attached to the adapter 314, and a serial number of a multi-fire fastener cartridge loaded into the loading unit 316, e.g., into one of the jaws 320, 322 at the end effector 318, to the console 304. Thereafter, the console 304 may transmit data (e.g., cartridge data, loading unit data, and/or adapter data) associated with the attached cartridge, the loading unit 316, and the adapter 314, respectively, back to the controller 324. The controller 324 can display messages on the local instrument display or transmit the message, via the transceiver 332, to the console 304 or the portable device 306 to display the message on a display 334 or device screen of the portable device 306, respectively.


Various exemplary embodiments of aspects of smart surgical systems, for example how smart surgical systems choose to interact with each other, are described further in, for example, U.S. patent application Ser. No. 18/810,323 entitled “Method For Multi-System Interaction” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,036 entitled “Adaptive Interaction Between Smart Healthcare Systems” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,082 entitled “Control Redirection And Image Porting Between Surgical Systems” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/809,890 entitled “Synchronized Motion Of Independent Surgical Devices” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,133 entitled “Synchronization Of The Operational Envelopes Of Independent Surgical Devices” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,170 entitled “Synchronized Motion Of Independent Surgical Devices To Maintain Relational Field Of Views” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,208 entitled “Alignment And Distortion Compensation Of Reference Planes Used By Surgical Devices” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,230 entitled “Shared Set Of Object Registrations For Surgical Devices Using Independent Reference Planes” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,266 entitled “Coordinated Control Of Therapeutic Treatment Effects” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,283 entitled “Functional Restriction Of A System Based On Information From Another Independent System” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/809,960 entitled “Inter-Connectivity Of Data Flows Between Independent Smart Systems” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,041 entitled “Inter-Connectivity Of Data Flows Between Independent Smart Systems” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,119 entitled “Processing And Display Of Tissue Tension” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,175 entitled “Situational Control Of Smart Surgical Devices” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,222 entitled “Method For Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,274 entitled “Visual Data-Based Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,346 entitled “Electrical Data-Based Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,355 entitled “Mechanical Data-Based Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,361 entitled “Multi-Sourced Data-Based Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,407 entitled “Conflict Resolution For Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, and U.S. patent application Ser. No. 18/810,419 entitled “Controlling Patient Monitoring Devices” filed Aug. 20, 2024, which are each hereby incorporated by reference in their entireties.


Operating Intelligent Surgical Instruments

An intelligent surgical instrument, such as the surgical instrument 114 of FIG. 1, the surgical instrument 131 of FIG. 2, or the surgical instrument 302 of FIG. 5, can have an algorithm stored thereon, e.g., in a memory thereof, configured to be executable on board the intelligent surgical instrument, e.g., by a processor thereof, to control operation of the intelligent surgical instrument. In some embodiments, instead of or in addition to being stored on the intelligent surgical instrument, the algorithm can be stored on a surgical hub, e.g., in a memory thereof, that is configured to communicate with the intelligent surgical instrument.


The algorithm may be stored in the form of one or more sets of pluralities of data points defining and/or representing instructions, notifications, signals, etc. to control functions of the intelligent surgical instrument. In some embodiments, data gathered by the intelligent surgical instrument can be used by the intelligent surgical instrument, e.g., by a processor of the intelligent surgical instrument, to change at least one variable parameter of the algorithm. As discussed above, a surgical hub can be in communication with an intelligent surgical instrument, so data gathered by the intelligent surgical instrument can be communicated to the surgical hub and/or data gathered by another device in communication with the surgical hub can be communicated to the surgical hub, and data can be communicated from the surgical hub to the intelligent surgical instrument. Thus, instead of or in addition to the intelligent surgical instrument being configured to change a stored variable parameter, the surgical hub can be configured to communicate the changed at least one variable, alone or as part of the algorithm, to the intelligent surgical instrument and/or the surgical hub can communicate an instruction to the intelligent surgical instrument to change the at least one variable as determined by the surgical hub.


The at least one variable parameter may be among the algorithm's data points, e.g., may be included in instructions for operating the intelligent surgical instrument, and is thus able to be changed by changing one or more of the stored pluralities of data points of the algorithm. After the at least one variable parameter has been changed, subsequent execution of the algorithm can be according to the changed algorithm. As such, operation of the intelligent surgical instrument over time can be managed for a patient to increase the beneficial results of use of the intelligent surgical instrument by taking into consideration actual situations of the patient and actual conditions and/or results of the surgical procedure in which the intelligent surgical instrument is being used. Changing the at least one variable parameter is automated to improve patient outcomes. Thus, the intelligent surgical instrument can be configured to provide personalized medicine based on the patient and the patient's surrounding conditions to provide a smart system. In a surgical setting in which the intelligent surgical instrument is being used during performance of a surgical procedure, automated changing of the at least one variable parameter may allow for the intelligent surgical instrument to be controlled based on data gathered during the performance of the surgical procedure, which may help ensure that the intelligent surgical instrument is used efficiently and correctly and/or may help reduce chances of patient harm, e.g., harm caused by damaging a critical anatomical structure.


The at least one variable parameter can be any of a variety of different operational parameters. Examples of variable parameters include motor speed, motor torque, energy level, energy application duration, tissue compression rate, jaw closure rate, cutting element speed, load threshold, etc.
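A minimal sketch of how an algorithm with a variable parameter could behave is shown below. The parameter names, values, and the speed-halving rule are hypothetical illustrations of a parameter being changed (whether determined on board the instrument or communicated by a surgical hub) so that subsequent execution follows the changed algorithm.

```python
class InstrumentAlgorithm:
    """Stored algorithm whose variable parameters can be changed mid-procedure."""

    def __init__(self) -> None:
        # Hypothetical variable parameters among the algorithm's data points.
        self.params = {"motor_speed_rpm": 1200.0, "load_threshold_n": 40.0}

    def change_parameter(self, name: str, value: float) -> None:
        """Change a variable parameter, whether determined on board the
        instrument or received as an instruction from a surgical hub."""
        self.params[name] = value

    def commanded_speed(self, measured_load_n: float) -> float:
        """Subsequent execution follows the changed algorithm."""
        if measured_load_n > self.params["load_threshold_n"]:
            return self.params["motor_speed_rpm"] * 0.5  # slow under high load
        return self.params["motor_speed_rpm"]

algo = InstrumentAlgorithm()
algo.change_parameter("load_threshold_n", 35.0)  # e.g., per a hub instruction
speed = algo.commanded_speed(measured_load_n=38.2)  # -> 600.0
```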


Various embodiments of operating surgical instruments are described further in, for example, U.S. Pat. App. Pub. No. 2023/0096268 entitled “Methods for Controlling Cooperative Surgical Instruments” filed Oct. 5, 2021, U.S. Pat. App. Pub. No. 2023/0097906 entitled “Surgical Methods Using Multi-Source Imaging” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0095002 entitled “Surgical Methods Using Fiducial Identification And Tracking” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0101750 entitled “Surgical Methods For Control Of One Visualization With Another” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0100698 entitled “Methods For Controlling Cooperative Surgical Instruments” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0103005 entitled “Methods for Controlling Cooperative Surgical Instruments” published Mar. 30, 2023, and U.S. Pat. App. Pub. No. 2023/0098538 entitled “Cooperative Access Hybrid Procedures” published Mar. 30, 2023, which are hereby incorporated by reference in their entireties.


Data Pipelines

As discussed herein, data may be transmitted from one point to another point, such as during a performance of a surgical procedure on a patient. The data may be transmitted from a source system to a destination system using a data pipeline.


As shown in FIG. 6A, a data pipeline 400 may move data from a source 402 to a destination 404, each of which may be physical or virtual (transient). In some data pipelines, the destination 404 may be called a “sink” or a “target.” Any time data is processed between point A and point B (or between multiple points such as points B, C, and D), there is a data pipeline 400 between those points. In general, the data pipeline 400 can include a set of tools and processes, which may be referred to as “steps” or “processing steps,” used to automate the movement and transformation of data between the source 402 and the destination 404.


In some embodiments, the source 402 and the destination 404 are two different elements, such as a first element of a surgical system and a second element of a surgical system. The data from the source 402 may or may not be modified by the data pipeline 400 before being received at the destination 404. For example, the source 402 may be one of the wearable sensing system(s) 111, the environmental sensing system(s) 115, the robotic system(s) 113, the intelligent instrument(s) 114, or the human interface system(s) 112 of the surgical system 102 of FIGS. 1 and 3 and the destination 404 may be the surgical hub 106 of the surgical system 102 of FIGS. 1 and 3. For another example, the source 402 may be one of the wearable sensing system(s) 111, the environmental sensing system(s) 115, the intelligent instrument(s) 114, or the human interface system(s) 112 of one of the surgical systems 102, 103, 104 of FIG. 1 and the destination 404 may be the surgical hub 106 of another one of the surgical systems 102, 103, 104 of FIG. 1. For yet another example, the source 402 may be one of the wearable sensing system(s) 111, the environmental sensing system(s) 115, the robotic system(s) 113, the intelligent instrument(s) 114, or the human interface system(s) 112 of the surgical system 102 of FIGS. 1 and 3 and the destination 404 may be another one of the wearable sensing system(s) 111, the environmental sensing system(s) 115, the robotic system(s) 113, the intelligent instrument(s) 114, or the human interface system(s) 112 of the surgical system 102 of FIGS. 1 and 3. For still another example, the source 402 may be one of the surgical systems 102, 103, 104 of FIG. 1 and the destination 404 may be another one of the surgical systems 102, 103, 104 of FIG. 1.


In some embodiments, the source 402 and the destination 404 are the same element. The data pipeline 400 may thus be purely about modifying the data set between the source 402 and the destination 404.


As shown in FIG. 6B, the data pipeline 400 may include one or more data connectors 406 that extract data from the source 402 and load the extracted data into the destination 404. A plural “N” number of data connectors 406 are shown in FIG. 6B. In some embodiments, such as embodiments in which extract, transform, and load (ETL) processing of data is performed, as opposed to extract, load, and transform (ELT) processing of data, data may be transformed within the data pipeline 400 before the data is received by the destination 404. In other embodiments, such as embodiments in which ELT processing of data is performed, as opposed to ETL processing of data, the one or more data connectors 406 may simply load raw data to the destination 404. In some instances, light transformations may be applied to the data, such as normalizing and cleaning data or orchestrating transformations into models for analysts, before the destination 404 receives the data.
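The ETL/ELT distinction above can be sketched in a few lines of Python. The source/destination interfaces (read_records/write_records) are hypothetical stand-ins for whatever connectors a given pipeline uses.

```python
def extract(source):
    return source.read_records()

def transform(records):
    # A light transformation: normalize keys and drop empty values.
    return [{k.lower(): v for k, v in r.items() if v is not None} for r in records]

def load(destination, records):
    destination.write_records(records)

def run_etl(source, destination):
    load(destination, transform(extract(source)))  # transformed in the pipeline

def run_elt(source, destination):
    load(destination, extract(source))  # raw load; destination transforms later
```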


The data pipeline 400 can include physical elements like one or more wires or can include digital elements like one or more packets, network traffic, or internal processor paths/connections. Flexible data pipelines are portions of the overall system where redundant paths can be utilized; data, e.g., a data stream, can be sent down one path or parsed between multiple parallel paths (to increase capacity), and these multiple paths can be flexibly adjusted by the system as necessary to accommodate changes in the volume and details of the data streams.
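A minimal sketch of the parallel-path idea follows, assuming in-memory queues stand in for redundant transport paths; the round-robin split and the queue types are illustrative only.

```python
from itertools import cycle
from queue import Queue

def fan_out(stream, paths: list[Queue]) -> None:
    """Parse one data stream across multiple parallel paths to increase
    capacity; the list of paths can be grown or shrunk between calls as the
    volume of the data stream changes."""
    rr = cycle(paths)
    for packet in stream:
        next(rr).put(packet)

paths = [Queue(), Queue(), Queue()]
fan_out(range(9), paths)  # packets 0,3,6 -> path 0; 1,4,7 -> path 1; etc.
```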


The data pipeline 400 can have a small code base that serves a very specific purpose. These types of applications are called microservices.


The data pipeline 400 can be a big data pipeline. There are five characteristics of big data: volume, variety, velocity, veracity, and value. Big data pipelines are data pipelines built to accommodate more than one of the five characteristics of big data. The velocity of big data makes it appealing to build streaming data pipelines for big data so that data can be captured and processed in real time and some action can then occur. The volume of big data requires that data pipelines be scalable, as the volume can be variable over time. In practice, there are likely to be many big data events that occur simultaneously or very close together, so the big data pipeline must be able to scale to process significant volumes of data concurrently. The variety of big data requires that big data pipelines be able to recognize and process data in many different formats: structured, unstructured, and semi-structured.


In general, an architecture design of a data pipeline, e.g., the data pipeline 400 of FIGS. 6A and 6B, can include interconnectivity between a first smart device and a second smart device, e.g., the source 402 and the destination 404 of FIGS. 6A and 6B. Data generated in one source system (e.g., the first smart device or the second smart device) may feed multiple data pipelines, which may have multiple other data pipelines dependent on their outputs.


The interconnectivity between the first smart device and the second smart device may be on a common/shared network, e.g., LAN, Wi-Fi, powerline networking, MoCA networking, cellular (e.g., 4G, 5G, etc.), low power wide area network (LPWAN), Zigbee, Z-wave, etc.


The interconnectivity between the first smart device and the second smart device may be on a structured network. Traditionally, structured peer-to-peer (P2P) networks implement a distributed hash table (DHT). In order to route traffic efficiently through the network, nodes in a structured overlay must maintain lists of neighbors that satisfy specific criteria. This makes them less robust in networks with a high rate of churn (e.g., with large numbers of nodes frequently joining and leaving the network). DHT-based solutions may have a high cost of advertising/discovering resources and may have static and dynamic load imbalance.


The interconnectivity between the first smart device and the second smart device may be via cooperative networking. Cooperative networking utilizes a system that is a hybrid of a P2P network and a server-client network architecture, offloading serving to peers who have recently established direct interchanges of content.


The interconnectivity between the first smart device and the second smart device may be exclusive. For example, the interconnectivity may be exclusive via Bluetooth. For another example, the interconnectivity may be exclusive via network isolation, such as by using path isolation, a virtual private network (VPN), or a secure access service edge (SASE). The path isolation may include a software-defined wide area network (SD-WAN). SD-WANs rely on software and a centralized control function that can steer traffic across a WAN in a smarter way by handling traffic based on priority, security, and quality of service requirements. The VPN may involve creation of an independent secure network using common/shared open networks. Another network (a carrier network) is used to carry data, which is encrypted. The carrier network will see packets of the data, which it routes. To users of the VPN, it will look like the systems are directly connected to each other.


For example, with interconnectivity between the first smart device and the second smart device being exclusive in a surgical context, an operating room (OR) may have a surgical hub and an established network from a first vendor. In order to secure against hacking or data leakage, the network may be an encrypted common network for which the first vendor supplies keys. A surgical stapler in the OR may be from a second vendor that is different from the first vendor and that does not have the keys from the first vendor. The surgical stapler may want to link to other device(s) it relies on for functionality but does not want data leakage. An advanced energy generator from the second vendor with an accompanying smoke evacuator may also be in the OR, and the two may form their own private network, such as by piggybacking on the first vendor network to create a second encrypted VPN routing through the first vendor network as a primary network or by forming an independent wireless network for bi-directional communication between the advanced energy generator and the smoke evacuator. The surgical stapler may want to communicate with the advanced energy generator, e.g., so the surgical stapler may retrieve updated software from the advanced energy generator, receive tissue properties information from the advanced energy generator, log data for exportation, and receive energy from the advanced energy generator and apply the energy to tissue, but not want to communicate with the smoke evacuator, e.g., because the surgical stapler performs no smoke evacuation. The surgical stapler and a communication backplane of the advanced energy generator may therefore form an isolated network with only the surgical stapler (first smart device) and the advanced energy generator (second smart device) able to communicate via the isolated network and with the surgical hub able to manage the data pipeline between the surgical stapler and the advanced energy generator.


In general, one or more steps may be performed along a data pipeline, e.g., the data pipeline 400 of FIGS. 6A and 6B. The steps in the data pipeline may include data transformation, data augmentation, data enrichment, data filtering, data grouping, data aggregating, and running algorithms against the data.
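The filtering, grouping, and aggregating steps named above could be composed as in the following sketch; the record fields (a procedure sub-step and a force reading) are hypothetical.

```python
from collections import defaultdict

def filter_step(records, predicate):
    return [r for r in records if predicate(r)]

def group_step(records, key):
    groups = defaultdict(list)
    for r in records:
        groups[key(r)].append(r)
    return dict(groups)

def aggregate_step(groups, reducer):
    return {k: reducer(v) for k, v in groups.items()}

readings = [{"step": "clamp", "force_n": 12.0},
            {"step": "fire", "force_n": 31.5},
            {"step": "fire", "force_n": 29.1}]
valid = filter_step(readings, lambda r: r["force_n"] > 0)
by_step = group_step(valid, lambda r: r["step"])
mean_force = aggregate_step(by_step, lambda rs: sum(r["force_n"] for r in rs) / len(rs))
# {'clamp': 12.0, 'fire': 30.3}
```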


The data aggregation may include segmentation of data into buckets (e.g., decomposition of a procedure into sub-steps), data fusion and interfacing, and mixing real-time data streams with archived data streams. Various embodiments of data aggregation are described further in, for example, U.S. Pat. App. Pub. No. 2022/0104896 entitled “Interactive Information Overlay On Multiple Surgical Displays” published Apr. 7, 2022, U.S. Pat. App. Pub. No. 2019/0206564 entitled “Method For Facility Data Collection And Interpretation” published Jul. 4, 2019, and U.S. Pat. App. Pub. No. 2024/0221937 entitled “Method For Advanced Algorithm Support” published Jul. 4, 2024, which are hereby incorporated by reference in their entireties.


In one embodiment, mixing real-time data streams with archived data streams may include, in a surgical context, pre-operative data/imaging evaluation. The evaluation may include displaying of static preoperative scan(s), overlaying of video with aligned 3D model, and registering a virtual view to a camera view.


In one embodiment, the display of static pre-operative scan(s) may include alignment based on surgeon (or other HCP) position, for example, where the surgeon (or other HCP) is standing.


In one embodiment, registering the virtual view to the camera view may include identifying organs in a video and triangulating with camera location and/or getting a camera location in reference to a coordinate system. For example, during performance of a surgical procedure, a camera location may be acquired with respect to a trocar by 3D tracking of the trocar, by camera insertion in the trocar (e.g., insertion depth and/or insertion angle), and/or by determination of what trocar is being used for the camera. An example of insertion depth is a marking on a shaft of the trocar, such as on a graphical scale or a color gradient. Examples of insertion angle in a trocar are 3D trocar orientation and 3D angle of attack.


In one embodiment, the pre-operative data/imaging evaluation may include using a machine learning (ML) algorithm to review preoperative scans of a patient to identify any abnormalities. A cloud-based source may be used for augmented reality (AR) using cloud-based data for surgical procedure planning.


In one embodiment, an ML algorithm may be used in an initial planning stage, e.g., initial planning for a surgical procedure to be performed on a patient. Preoperative scans may be used to facilitate surgical path planning. If the initial scans detect anomalies or diseased tissues, as analyzed by the ML algorithm, the anomalies or diseased tissues may be relayed to the surgeon for the upcoming surgical procedure and a new surgical task order may be suggested based on how previous surgeons handled these problems. The relayed information to the surgeon may also include a recommended inventory list to have on hand based on this initial improved surgical task order.


For example, during preoperative scans for a sleeve gastrectomy, a small hernia may be discovered. This hernia may be highlighted during the surgical planning step, and the surgeon may be asked if the surgeon wants to include a hernia repair in the initial sleeve gastrectomy plan. Based on the surgeon's answer, the hernia repair will be added into the surgical task order for an affirmative surgeon answer, and the overall inventory for this case will be updated to include relevant items for the hernia repair added into the surgical task order. During performance of the sleeve gastrectomy, an ML algorithm may be used to detect diseased tissue or surgical anomalies. If a diseased tissue is discovered, the diseased tissue may be highlighted on a screen, e.g., on a HID, and a cutting path/angle may be recommended to avoid the tissue or make the tissue state more manageable. These recommendations may be based on how surgeons previously, successfully, handled these situations. If a surgical anomaly is discovered, the system may either automatically update the task order or require the surgeon to give a verbal command (or other command) to update the task order and highlight the required additional inventory on the circulator's screen. For foreign bodies (such as bougies) that may be discovered, the foreign body may be highlighted on the screen and a cutting path may be included to provide an ample margin around the foreign body, assuming the foreign body is anticipated. If the foreign body is not anticipated, the foreign body may be highlighted to draw the surgeon's (and/or other HCP's) attention to it.


In one embodiment, the pre-operative data/imaging evaluation may include a cloud comparison of scans periodically taken through time for anatomic changes over that time to indicate possible operative complications. A cloud-based source may be used for augmented reality (AR) using preoperative scans to enhance return surgeries.


Looking at a difference between current and previous surgical scans may help inform the surgeon and/or other HCP and improve patient outcomes. This information can be used in various ways, for example for disease detection, informing surgical task planning, and/or informing previous surgical success and healing.


With respect to disease detection, current and historical scans can be used to determine if various disease states or abnormalities have evolved between surgeries. One case where this could be particularly useful is cancer detection. If a scan initially picks up an abnormal growth for a patient, and the patient's HCP deems it benign but flags it for caution, a follow-up scan may confirm whether or not the abnormality is benign. The scan may also automatically highlight areas of concern (tissue growth) that were not flagged by the HCP initially but could be areas of concern.


With respect to informing surgical task planning, information about previous surgeries (e.g., potential areas of scar tissue, previously seen difficult tissue, etc.) can help facilitate surgical step and path planning. This information can also be used during the surgery to display areas of scarring, changes of tissue from previous surgeries that might need to be examined, foreign bodies, and/or new adhesions.


With respect to informing previous surgical success and healing, data from various scans over time can be used to determine how successfully patients were recovering or had recovered from previous surgeries. This information may be used by surgeons (and/or other HCPs) to help plan future procedures, assess previous work, and/or facilitate quicker patient recovery.


Data development may be performed as a step in a data pipeline and may include one or more of data modeling, database layout and configuration, implementation, data mapping, and correction.


Various data may be communicated using a data pipeline, e.g., the data pipeline 400 of FIGS. 6A and 6B, such as data from a local data source, data from a remote data source, and synthetically generated data.


Data from a local data source may include data collected by, used by, or resulting from the operation of aspects of the local data source, e.g., data gathered using a sensor (e.g., temperature data gathered using a temperature sensor, force data gathered using a force sensor, pressure measured using a pressure sensor, etc.), still and/or video image data (e.g., data gathered by a camera, etc.), operational parameters of a surgical instrument (e.g., energy level, energy type, motor current, cutting element speed, etc.), surgical instrument identification (e.g., instrument type, instrument serial number, etc.), etc.


Data from a local data source may have metadata, which may reflect aspects of a data stream, a device configuration, and/or system behavior that define information about the data. For example, metadata may include an auxiliary data location that is shared by two interconnected systems, e.g., first and second robotic systems, etc., to create a single “brain” instead of two distinct ones. Each of the interconnected systems may create a copy of its memory system and introduce it to the combined system or “collective.” Both of the interconnected systems may then use this new area for data exchange, for uni-directional communication, and to directly command control systems. The new combined system may become primary, and the individual robotic systems' memory areas may become secondary memory areas until the systems are “unpaired,” e.g., are no longer interconnected.


Data from a local data source may include a data stream that is monitored by at least one other system. In this way, data collected by, used by, or resulting from the operation of aspects of one system may be sourced to another system (the monitoring system).


In one embodiment, application programming interfaces (APIs) may be used to communicate data from a local source.


In one embodiment, data may be communicated from a local source in response to occurrence of a trigger event. In one embodiment, the trigger event is a digital trigger event. For example, in a surgical context, the trigger event may be a surgical instrument changing orientation after being in a predetermined static position, such as when the surgical instrument “wakes up.” For another example, in a surgical context, the trigger event may be a system's power or signal interruption, e.g., communicating data after the interruption has been resolved. For yet another example, in a surgical context, the trigger event may be a change in system status, capacity, or connectivity. For still another example, in a surgical context, the trigger event may be quality, calibrations, or conversion factors of a surgical instrument.
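The first trigger example above (an instrument changing orientation after a predetermined static position, i.e., “waking up”) could be detected as in this sketch; the sample count and threshold are hypothetical.

```python
class WakeUpTrigger:
    """Fire when motion follows a sufficiently long static period."""

    def __init__(self, static_samples: int = 200, threshold_rad: float = 0.05):
        self.static_samples = static_samples
        self.threshold_rad = threshold_rad
        self.still_count = 0
        self.last = None

    def update(self, orientation_rad: float) -> bool:
        """Feed one orientation sample; return True on the wake-up event."""
        fired = False
        if self.last is not None:
            if abs(orientation_rad - self.last) <= self.threshold_rad:
                self.still_count += 1  # still within the static position
            else:
                fired = self.still_count >= self.static_samples
                self.still_count = 0
        self.last = orientation_rad
        return fired  # a True result would prompt the data communication
```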


Data from a remote data source may include data collected by, used by, or resulting from the operation of aspects of the remote data source. One example of a remote data source includes a database, such as a relational database or a NoSQL database. Examples of data in a relational database relevant in a surgical context can include inventory, sterile services process status, billing codes, patient records (PHI), and previous procedure data.


One example of data from a remote data source, in a surgical context, includes a procedure plan, e.g., a plan for a surgical procedure to be performed on a patient. The procedure plan data can include, for example, instrument selection, port placement, adjuncts needed for devices, OR timing and local imaging needs, procedural steps, staff number and skill composition, and patient positioning.


Another example of data from a remote data source, in a surgical context, includes pre-operative imaging, such as a CT full body scan, external ultrasound, MRI, etc.


Another example of data from a remote data source includes software parameter updates, such as software parameter updates streaming from a cloud computing system. The software parameter updates can include, for example, original equipment manufacturer (OEM) updates to a device's operational aspects, e.g., updated basic input/output system (BIOS) controls, calibrations, updates on capabilities (e.g., recalls, limits/expansion of use, indications, contra-indications, etc.), etc.


Another example of data from a remote data source includes gold standard of care or outcomes improvement data, such as gold standard of care or outcomes improvement data from a cloud computing system. Gold standard of care or outcomes improvement data can include, for example, improved techniques of device use and/or device combinations.


In one embodiment, Apache® Hadoop®, which is an open source software framework, may be used for distributed processing of data across computer systems.


Examples of types of synthetically generated data may include synthetic text, media (e.g., video, image, sound, etc.), tabular data, and a calculated continuous stream of data. The calculated continuous stream of data may be randomly generated (bracketed by extreme thresholds) or may be based off another stream of real continuous data that is modified to fit the stream limits expected of the synthetic stream. Reasons for using synthetically generated data can include training data streams; replacing missing data from an expected system that would otherwise raise a device error but is not relevant to the operation of the device or another dependent device; data streams designed to verify the operation of transforms or mathematical algorithms; data streams intended to either verify security or prevent fraud/inauthenticity; consecutive timing data for redaction of real time from the relational data of the systems; creation of trending data as a replacement for legally regulated data streams (e.g., producing datasets from partially synthetic data, where only a selection of the dataset is replaced with synthetic data); and/or compensating for a sudden but anticipatable/explainable change in a data source's feed which is being used as a closed loop control for a destination.
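Two of the calculated-continuous-stream variants above, one randomly generated within bracketing thresholds and one derived from a real stream and clamped to the expected limits, could look like the following sketch (names hypothetical).

```python
import random

def random_synthetic_stream(lo: float, hi: float, n: int):
    """Randomly generated stream bracketed by extreme thresholds."""
    for _ in range(n):
        yield random.uniform(lo, hi)

def fitted_synthetic_stream(real_stream, lo: float, hi: float):
    """Stream based off a real continuous data stream, modified (here simply
    clamped) to fit the limits expected of the synthetic stream."""
    for x in real_stream:
        yield min(max(x, lo), hi)

training = list(random_synthetic_stream(0.0, 1.0, 5))
shaped = list(fitted_synthetic_stream([0.2, 1.7, -0.3], 0.0, 1.0))  # [0.2, 1.0, 0.0]
```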


As an example of the use of synthetically generated data in a surgical context in which a surgical procedure is being performed on a patient, a PO2 sensor (data source) on the patient's finger may be being used as a means for controlling a closed loop feed of O2 through a ventilator (data destination). The ventilator also has an internal closed loop on CO2 outlet concentration, but since O2 blood saturation has the desired fixed relationship to O2 supplementation level, the ventilator is using the PO2 sensor from the patient monitoring system. There may be an abrupt change in the O2 level as measured by the PO2 sensor. The ventilator has two choices: either switch to the trailing indicator of CO2, which has not had an abrupt change, or examine other data sources to try to explain the O2 shift. When compared to the patient's core body temperature measure, it may be discovered that the patient's temperature has dropped across a 1.5° C. below-normal threshold that usually induces vasoconstriction limiting blood flow to the body's extremities. The PO2 measure is known by the ventilator, from its metadata, to come from a finger monitor and therefore from an extremity. Further comparison over time may show the O2 measure fairly constant before the shift and then fairly constant after the shift as well, reinforcing the idea that the vasoconstriction induced the shift. The ventilator may then create a synthetic data stream, based on the shift data pattern and behavior, that compensates for the vasoconstriction shift so the ventilator can continue on the primary linked feeds but using a modified synthetic or “calculated” data stream based off a real stream. For example, current body temperature control systems, such as a Bair Hugger™ device, are open-loop user-settable heat gradient controlled systems but are affected by local temperature and environment.
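A sketch of the compensation logic in that scenario follows; the threshold, the window-mean shift estimate, and the additive correction are hypothetical simplifications of the “calculated” stream the ventilator might construct.

```python
VASOCONSTRICTION_DROP_C = 1.5  # threshold below normal core temperature

def estimate_shift(pre_window: list[float], post_window: list[float]) -> float:
    """Offset between the fairly constant readings before and after the shift."""
    return sum(pre_window) / len(pre_window) - sum(post_window) / len(post_window)

def calculated_spo2(raw_spo2: float, core_temp_c: float, normal_temp_c: float,
                    shift: float) -> float:
    """If the temperature drop explains the shift (vasoconstriction at the
    finger), add the estimated offset back so the closed loop can stay on the
    primary feed; otherwise pass the raw reading through."""
    if normal_temp_c - core_temp_c >= VASOCONSTRICTION_DROP_C:
        return raw_spo2 + shift
    return raw_spo2

shift = estimate_shift(pre_window=[97.0, 96.8, 97.1], post_window=[93.0, 92.9, 93.2])
value = calculated_spo2(raw_spo2=93.1, core_temp_c=35.3, normal_temp_c=37.0, shift=shift)
```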


Information Discrimination

In various aspects, the present disclosure provides methods, devices, and systems for information discrimination. The information discrimination may help discriminate between sources of data that a system could use, e.g., discriminate between one or more sources (e.g., the source 402 of FIGS. 6A and 6B) for a destination (e.g., the destination 404 of FIGS. 6A and 6B). Discrimination between externally supplied data sources/streams can be important to provide a smart system with the best option of good, accurate, and timely streamed data to improve one (or more) of its controlled operations.


In some aspects, information discrimination may include discrimination between competing sources of data for a smart surgical instrument. A smart surgical instrument may operate at least a portion of its controlled system in open loop or limited closed loop control, may have access to more than one data stream from differing smart system origins, each of which is relevant to the limited closed loop control of its controlled system, and may have an ability to discriminate between the multiple data streams to determine which of the multiple data streams the smart surgical instrument is going to monitor and use for full closed loop control of its system. Operational behavior of the smart surgical instrument may be improved, either in stability or outcomes, through the use of one data stream over the other data stream(s).


In one embodiment, information discrimination may include a smart system identifying available data sources it currently has access to. In other words, the smart system may identify one or more data streams the smart system can possibly utilize.


In one embodiment, available data sources may include user connected devices. For example, the user connected devices may be devices that are manually connected by a user of the device or another person to a smart system, e.g., a control system such as a surgical hub, a cloud-based server, etc. The smart system may log the user connected devices into a list within the smart system or may identify them specifically to each system that could need access to data generated by the user connected devices. For another example, the user connected devices may be devices that are identified in a surgical procedure plan. The smart system may identify measured parameters required for the surgical procedure (temperatures, pressures, etc.) and identify required equipment based on the identified parameters, and/or a surgeon may identify data feeds they care about for the surgical procedure. As time elapses in the surgical procedure, some data streams may become less useful.
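One minimal way to picture this logging and cross-checking, assuming hypothetical data shapes (a plan listing required parameters, and devices advertising the parameters they measure):

```python
def identify_equipment(procedure_plan, connected_devices):
    """Log which user connected devices cover each measured parameter
    required by the surgical procedure plan, and flag parameters with
    no covering device. Field names are illustrative assumptions."""
    required = set(procedure_plan["parameters"])  # e.g., {"temperature", "pressure"}
    coverage = {}
    for device in connected_devices:
        for parameter in device["parameters"]:
            if parameter in required:
                coverage.setdefault(parameter, []).append(device["name"])
    missing = required - set(coverage)
    return coverage, missing
```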


In one embodiment, identifying available data sources may include using a broadcast network for the identification of system data streams that could be used by other systems. Using the broadcast network may include self-identification of data that can be provided to other systems as needed and/or interrogation of systems that have been identified to be in operation or providing data through the network.


With respect to self-identification of data that can be provided to other systems as needed, in a surgical context, as systems come online into a room such as an OR (or into another geofenced area), the systems may announce to a smart system (e.g., central system) that they are available. For example, a wireless pulse oximeter in an OR may be plugged in and start receiving power. As the wireless pulse oximeter boots up, it, as a possible source of data to one or more systems, may broadcast wirelessly a publicly available message to all wirelessly enabled systems in the OR that it is available to be paired with. In this case, the wireless pulse oximeter could broadcast this data over advertising channels of its wireless network to allow the wirelessly enabled systems in the OR to identify that the wireless pulse oximeter exists, even if the wirelessly enabled systems in the OR cannot exchange meaningful data with the wireless pulse oximeter at this point in time.


With respect to interrogation of systems that have been identified to be in operation or providing data through the network, in a surgical context, a device may, for example, be monitoring a room, e.g., OR, (or another geofenced area) in which the device is located for networks, wireless signal traffic, and/or unanticipated data packets operating on connected networks, which can trigger an analysis of the fingerprint of the "data" moving that could be used to identify new sources of data to interface with or monitor. For another example, a device may send a request for systems that could provide data, such as by a wireless scanning of a room, e.g., OR, (or another geofenced area) in which the device is located and/or by the room (e.g., a central system in the room) scanning in-use products or products billed to this surgical procedure to determine viable data exchange options for the device. With respect to wireless scanning of a room (or another geofenced area), a smart system, as it turns on, or periodically, may wirelessly scan a room, e.g., an OR, (or another geofenced area) in which the smart system is located to identify other smart systems that are within the smart system's local area. For example, a wirelessly enabled surgical hub may be being set up for a surgical procedure. As part of its setup, the surgical hub may have a step where it identifies available systems it can connect with. During this step, the surgical hub may send out a message (e.g., a ping) to other systems in its vicinity to determine if they are there, and then wait for a response.
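A rough sketch of the ping-and-wait step, using a plain UDP broadcast as a stand-in for whatever transport a real hub would use; the port and message format are assumptions, not a disclosed protocol:

```python
import socket

DISCOVERY_PORT = 50000  # assumed port, for illustration only

def ping_for_systems(timeout=2.0):
    """Broadcast a discovery ping and collect responses from systems
    in the vicinity, as a surgical hub might during setup."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.settimeout(timeout)
    sock.sendto(b"DISCOVER", ("<broadcast>", DISCOVERY_PORT))
    responders = []
    try:
        while True:
            data, addr = sock.recvfrom(1024)
            responders.append((addr[0], data.decode(errors="replace")))
    except socket.timeout:
        pass  # Collection window closed; keep whoever answered.
    finally:
        sock.close()
    return responders
```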


In one embodiment, identifying available data sources may include utilization of one or more cameras in a room, e.g., an OR, (or another geofenced area) to identify new capital equipment being introduced or activated. Additionally or alternatively, monitoring of displays or monitors having one or more data feeds clearly being streamed to them that are not part of the known dataset may trigger analysis and assimilation of the new one or more data feeds.


In one embodiment, available data sources may be identified by an introduction, such as an unexpected introduction of a smart system into a pre-existing interconnected space or an introduction of a new unexpected system into a pre-existing space. Types of data that may become available can include raw data or transformed data. Raw data may be more time consuming to transform but may be less accurate depending on the source. A smart system may have an ability to interrogate the newly available data. The smart system may need to notify a sensor to switch the data type being sent. A data pipeline (e.g., the data pipeline 400 of FIGS. 6A and 6B) may also need to adjust for the new data to allow for new data transmission between source and destination.


In one embodiment, identifying available data sources by an introduction may include a transfer or inheritance of available data sources between smart systems. For example, when a new smart device is introduced into a room, e.g., an OR, (or another geofenced area) and is connected to a smart device within a local network, that smart device may then transfer its list of associated data sources to the new device. For example, a first surgical hub in an OR may be connected to a plurality of other smart devices within the OR, such as a connected stapler or connected vision system. During performance of a surgical procedure in the OR, a problem may be found to be occurring with the first surgical hub and, thus, a second surgical hub may be brought into the OR and connected to the first surgical hub. With the first and second surgical hubs connected, the first surgical hub may transfer the information of the existing smart devices and data sources it is connected to to the second surgical hub. As a result, the second surgical hub "inherits" all the smart data sources from the first surgical hub.
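The inheritance itself can be as simple as copying a registry, as in this hedged sketch (the class and its fields are hypothetical stand-ins for a hub's internal bookkeeping):

```python
class SurgicalHub:
    """Minimal stand-in for a surgical hub's data source registry."""

    def __init__(self, name):
        self.name = name
        self.data_sources = {}  # source name -> connection info

    def register(self, source_name, connection_info):
        self.data_sources[source_name] = connection_info

    def transfer_sources_to(self, other_hub):
        """Transfer this hub's list of associated data sources so the
        other hub 'inherits' them, as when a second hub replaces a
        failing first hub mid-procedure."""
        other_hub.data_sources.update(self.data_sources)
```

For example, after `hub_one.transfer_sources_to(hub_two)`, the second hub holds every data source the first hub had registered.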


In one embodiment, identifying available data sources by an introduction may include identifying devices in a room, e.g., an OR, (or another geofenced area). For example, devices in a room (or another geofenced area) may broadcast new/existing data sources, a scanner may collect all devices that are in a room (or another geofenced area) and compare the scanned devices with a procedure plan, devices may be adaptively detected based on data transmitted via sound or RF, etc., and/or a geofence may be used to identify appropriate data pipelines within a room (or another geofenced area) in close proximity to each other.


In one embodiment, information discrimination may include a smart system choosing which data sources it wants/needs to obtain for improved operation of its system. In an exemplary embodiment, the smart system may choose between two or more data feeds based on one or more parameters of the data feeds and/or one or more needs of the smart system.


A smart system's operation of its owned aspects may improve with an acquisition of outside data sources. As one example of an improvement, the smart system's operation may improve by allowing for adaptation of open loop, reactive, or proxy-related internally measured control to closed loop operation on a more primary related parameter. For example, in a surgical context in which a surgical procedure is being performed on a patient, a smoke evacuator operating by either energy activation (a secondary proxy for visibility impairment) or particle count measured at an inlet of the smoke evacuator (reactive, as it only adjusts after smoke at the inlet is detected and is therefore already affecting the operation that it is supposed to overcome) may be replaced by a feed from a scope that is actually directly monitoring the obstruction of visibility. Alternatively, a delay between energy activation and subsequent activations on a regular basis relative to non-energy uses can be indicative of a user (e.g., one or more HCPs) being unable to clearly visualize the surgical site, which could be used as a more direct measure of the need for escalated smoke evacuation. These more direct measures of the presence of smoke may be used to control activation, duration, and intensity of smoke evacuation over the more secondary measures traditionally used.


As another example of an improvement, the smart system's operation may improve by better correlating accuracy of operation to the magnitude of the perturbance, with the measure used being more directly tied to the issue the smart system is trying to minimize. For example, in a surgical context in which a surgical procedure is being performed on a patient, in the case of the patient's core temperature impacting transcutaneous O2 blood gas control of a ventilator, the patient's core body temperature measure may be used as a quasi trigger for when vasoconstriction impacts the O2 measure. Further, a better measure may be laser Doppler measurement of the blood flow in the extremity of the patient that is being monitored. This may be done, for example, with the patient monitoring system as a separate laser Doppler sensor on the patient's arm, or the sensor may even be part of the O2 finger sensor, providing blood perfusion in addition to the O2 measure to more directly monitor for vasoconstriction and therefore providing a more accurate measure than merely core body temperature.


In an exemplary embodiment, a smart system may choose between two or more data feeds based on one or more of hierarchies of data, trustworthiness, and time dependability.


In one embodiment, a hierarchy of data can be used for a smart system to choose between two or more data feeds. Examples of the hierarchy include hierarchy via proximity, hierarchy via source/type, and hierarchy via prior performance. The hierarchy may rank a measure primarily related to the controlled system above a secondary or "related" measure to the directly controlled system.
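One way such a hierarchy could be applied, sketched under assumed field names (relation to the controlled system, a proximity rank, and a prior performance score):

```python
# Assumed ranking: lower number = more directly related measure.
HIERARCHY = {"primary": 0, "secondary": 1, "related": 2}

def choose_by_hierarchy(feeds):
    """Pick the feed most directly related to the controlled system,
    breaking ties by proximity (lower is closer) and then by prior
    performance (higher is better). Illustrative assumptions only."""
    return min(
        feeds,
        key=lambda f: (HIERARCHY[f["relation"]],
                       f["proximity"],
                       -f["prior_performance"]),
    )
```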


In one embodiment, trustworthiness can be used for a smart system to choose between two or more data feeds. Examples of trustworthiness include confidence bounds/levels from data sources and using an ability of the smart system to identify available data sources, e.g., data sources available within an OR, from non-collaborative devices.


With respect to confidence bounds/levels from data sources, confidence bounds/levels may include one or more of averaging or trends for this surgical procedure, identifying confidence to a user (e.g., showing when confidence shifts too far from nominal and/or adjusting when to show/hide), and surgical procedure templates (e.g., setup configuration). Confidence bounds may be adjusted by, for example, one or more of changes in environment that may cause a reduction of confidence in data streams, an HCP or other person, facility (e.g., conservative confidence if at a new facility), tools (e.g., conservative if new tools), and sensor decay (e.g., deviation from nominal, which may be procedure nominal or sensing medium nominal). For example, in a surgical context in which a surgical procedure is being performed on a patient, a smart table may measure patient parameters with a default confidence level. There may be scenarios in which it is desirable to adjust that confidence.
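A hedged sketch of how such confidence adjustments might be composed; the scaling factors are invented for illustration and are not disclosed values:

```python
def adjusted_confidence(base_confidence, *, new_facility=False,
                        new_tools=False, sensor_deviation=0.0):
    """Scale a data source's default confidence for the conditions
    described above: a new facility, new tools, and sensor decay
    (deviation from nominal). Factors are assumptions."""
    confidence = base_confidence
    if new_facility:
        confidence *= 0.8  # conservative confidence at a new facility
    if new_tools:
        confidence *= 0.9  # conservative confidence with new tools
    # Penalize sensor decay in proportion to deviation from nominal.
    confidence *= max(0.0, 1.0 - sensor_deviation)
    return confidence
```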


For example, with respect to the smart system having an ability to identify available data sources from non-collaborative devices, examples of the ability include a camera seeing digital read-outs in a room, e.g., an OR, (or another geofenced area), with confidence increasing when the camera is reading data directly and reduced when it is "stealing" data, a camera reading x-ray or other imaging data, and pattern recognition of devices in a room, e.g., an OR, (or another geofenced area). Examples of the pattern recognition include asking to move a system to provide a visible view for a camera to steal data, asking to connect a wire to a smart system (e.g., showing an IFU/setup manual for devices, etc.), suggesting a new room (e.g., OR) setup/configuration, and a camera seeing an unused/inactive monitor and suggesting options (other data streams) to display or connect to that available monitor.


In one embodiment, time-dependability can be used for a smart system choosing between two or more data feeds. Multiple data sources may be available but may not be direct sources for the data required by a smart system. A smart system may have a capability to detect appropriate data pipelines for data sources. Competitive systems or non-connectable data may be required to close the loop. Some data may need to be transformed or transmitted through multiple data pipelines before reaching the consumer of the data, e.g., the destination. This may take more time than is allowed for this data. The discernment of this timeliness of the data may dictate if the data is useful.
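The timeliness discernment can be pictured as a simple latency budget over the pipeline stages, as in this sketch (the stage latencies are assumed estimates):

```python
def is_timely(pipeline_stages, deadline_ms):
    """Report whether data can traverse every transform/transmission
    stage of a pipeline and still reach its consumer (the destination)
    within the time allowed for that data."""
    total_latency_ms = sum(stage["latency_ms"] for stage in pipeline_stages)
    return total_latency_ms <= deadline_ms

# e.g., two transforms plus a network hop against a 100 ms budget:
# is_timely([{"latency_ms": 40}, {"latency_ms": 35}, {"latency_ms": 20}], 100)
```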


For example, in a surgical context in which a surgical procedure is being performed on a patient, a ventilator may have open-loop control of set tidal volume and O2 supplementation levels. A finger monitor may serve as a temperature gauge and may provide skin O2 level as affected by blood flow. CO2 off-gassing is not a direct measure for O2 in blood. A surgeon and/or other HCP may be shown how devices are closed-loop controlled, such as by a smart system using scanned devices to identify a way, through external sensors, to calculate a measured O2 level based on a lookup table of transfer functions of data sources. The smart system may discern a pathway to collect the required data to close the loop on the ventilator through this approach.


In an exemplary embodiment, a smart system may choose between two or more data feeds, and the impacts of a poor choice may be limited, such as by one or more of retroactive analysis of data and its trending over time, reactions to a discrepancy in data, and having conflicting data between sources with a level of error between the sources impacting the response.


In one embodiment, retroactive analysis of data and its trending over time may include utilization of baseline data to help determine the impact of time-based trends, e.g., how long various things take to happen and how a certain point was arrived at. The baseline may be established using patient specific baselines (e.g., historical patient information, such as information from prior surgeries) and/or surgery/operation specific baselines. The baseline may be used for direct comparison.


In one embodiment, reactions to a discrepancy in data may include failing safe (e.g., determining how the system may default in the direction of safety when there is a disagreement in data sources), defaulting to an expert assumption or analysis, prompting surgical staff (e.g., a central system prompting surgical staff, such as via a HID, to make a decision on how to proceed), and/or doing nothing. The system may do nothing if it cannot handle or make an analysis and, as a result, take no further direct action other than to inform staff, e.g., HCPs, that it is receiving conflicting information. With respect to defaulting to an expert assumption or analysis, just as a human surgeon may receive information that is incomplete or conflicting, an expert's judgement is often relied upon to inform a decision, knowing that the decision may be time critical and that the absence of a decision may be detrimental itself. The system may default to a design that would meet the criteria to be consistent with an expert opinion for the given scenario.


In one embodiment, conflicting data between sources may have different levels of error impacting the response, e.g., minor disagreements (lowest level), major disagreements (intermediate level), and critical disagreements (highest level). For example, in a surgical context in which a surgical procedure is being performed on a patient, two patient monitoring systems may report patient temperature, respectively, as 98.1° F. and 98.2° F. These two temperatures are a conflict in data, but the level of conflict is small enough that it may not warrant a response, e.g., may be considered a minor disagreement.
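A minimal sketch of this tiering and the reactions described above; the thresholds (in the measure's units) are illustrative assumptions:

```python
def classify_disagreement(value_a, value_b, minor=0.5, major=2.0):
    """Classify the level of conflict between two sources reporting
    the same parameter, e.g., 98.1 vs. 98.2 deg F is minor."""
    error = abs(value_a - value_b)
    if error < minor:
        return "minor"
    if error < major:
        return "major"
    return "critical"

def react(level):
    """Map a disagreement level to a reaction: do nothing, prompt
    staff, or fail safe, per the embodiments described above."""
    return {
        "minor": "do nothing",
        "major": "prompt surgical staff via HID",
        "critical": "fail safe and notify HCPs",
    }[level]
```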


In a surgical context in which a surgical procedure is being performed on a patient, using O2 and tidal volume closed through the patient control of a ventilator is one example of a closed loop using other data streams and choices based on those for adaptation. For example, an atrial fibrillation (AFib) procedure traditionally requires a mild hypothermia to minimize damage due to ischemia during the procedure. A ventilator may identify two measures that it can close the loop on through the patient, such as PO2 from a patient monitoring system through a finger sensor and CO2 off-gassing through an exhalation sensor on the ventilator. The O2 may be a primary measure and the CO2 may be a proxy secondary measure from a hierarchy point of view. Pressure during inhalation and exhalation may be monitored by the ventilator as a means for monitoring depth of breath. The choice at the beginning, with no additional information, may be to close the loop on the primary for O2 control, with the tidal volume closed on the pressure, which is interrelated as a primary for both O2 and tidal volume. At the start, the two data feeds (PO2 data feed and CO2 data feed) may be proportionally correlated. However, after a certain amount of time, e.g., 30 minutes into the procedure, local tissue cooling of the heart takes place for a mild hypothermia, and the correlation between the two data feeds may become inconsistent. Looking at other systems, the patient's core body temperature may have exceeded the traditional 1.5° C. hypothermic limit, implying vasoconstriction, which may be what is affecting the PO2 measure relative to the core body temperature. If the O2 continues to be closed loop on the PO2 sensor, the ventilator is going to over-oxygenate the patient's blood, because what is actually happening is a restriction of blood flow volume to the patient's extremities while they continue their normal metabolic consumption, making the O2 measure look artificially low. The PO2 sensor deviation is likely to be a stepwise shift in correlation to the CO2 outgas. The central system may use this physiologic relationship and the patient's core temperature together to accommodate the shift while still using the PO2 measure as the closed loop input, may shift to the CO2 secondary measure, or may do some combination of the two.


Factors that can lead to the decision, in the above example and others, include consideration of one or more of a time-based comparison of the correlation, a utilization of one or more other data feeds that may confirm or refute known possible causes, consistency of the signals, and risk of injury to the patient. With respect to time-based comparison of the correlation, the comparison may include, for example, determining if the divergence is continually increasing, which implies drift of the signal or a physiologic adaptation. This is likely to require some manner of threshold over which the measure is no longer used. For another example, the comparison may include determining if the correlation is shifting and then remaining consistent, which may be an acute physiologic adjustment like vasoconstriction. In this case the measure may be perfectly useable, but the stepwise shift must be accounted for before the measure can continue to be used as a closed loop control. For another example, the comparison may include determining if the correlation is changing exponentially. This is likely to be some manner of cascade failure and may require notifying someone to handle it.
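These three divergence patterns (drift, stepwise shift, exponential change) can be told apart with a crude heuristic over the successive differences of the divergence, as in this assumed sketch:

```python
def classify_divergence(divergence_series):
    """Classify how the correlation divergence evolves over time:
    'stepwise shift' (settled after an acute change such as
    vasoconstriction), 'exponential' (possible cascade failure
    requiring notification), or 'drift' (continually increasing,
    requiring a cutoff threshold). Heuristics are assumptions."""
    deltas = [b - a for a, b in zip(divergence_series, divergence_series[1:])]
    recent = deltas[len(deltas) // 2:]
    if recent and all(abs(d) < 1e-3 for d in recent):
        return "stepwise shift"
    if len(deltas) > 1 and all(d2 > d1 for d1, d2 in zip(deltas, deltas[1:])):
        return "exponential"
    return "drift"
```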


With respect to utilization of one or more other data feeds that may confirm or refute known possible causes, the one or more data feeds may include indicators monitoring a room (or another geofenced area) for sensor lead integrity and/or monitoring temperature and comparing against physiologic expectations.


With respect to consistency of the signals, inconsistency may be indicated by noise erratically affecting signals, and the signals being stable and consistent may indicate consistency.


With respect to risk of injury to the patient, the presence of a risk of injury may indicate not to use a measure as the first line of adjustment or monitoring. In the above example, the tidal volume may be increased to a limit if the O2 or CO2 exchange needs to be adjusted, but collateral damage can be done to the pleura by over-inflation, so there is a threshold limit.


Choosing an original monitor feed and an adaptive feed can, in the above example and others, include a decision based on a proximity of the measured variable to the physically controlled smart device variable. The original choice may be adapted based on threshold deviations in correlation, or expected values may trigger a re-evaluation and change of choice. The choice may involve one or more considerations, such as balancing risk versus reward, primary versus secondary (leading versus trailing measures), and risk of an incorrect choice or adjustment doing unexpected or collateral damage. A comparison may be made, e.g., by a central system, to other non-controlled measures and their consistency or deviation, which may result in no choice and a warning to the user or may reinforce the validity of one choice over another because of the non-controlled system either following or not following the control data stream's trend.


In a surgical context in which a surgical procedure is being performed on a patient, robotic endoscopy re-calibration to electromagnetic (EM) sensor drift due to metal objects in the OR is another example of a closed loop using other data streams and choices based on those for adaptation. Such robotic endoscopy re-calibration typically uses cone beam CT as a primary measure for orientation position but may use local optical visualization of visible lung branch structures as another means to close the loop through the patient in re-calibrating.


In a surgical context in which a surgical procedure is being performed on a patient, generator control of an ultrasonic transducer is another example of a closed loop using other data streams and choices based on those for adaptation. Generator control of an ultrasonic transducer typically uses transducer impedance as a control for algorithm power level. Low impedance tissue and fluid immersion both affect transducer impedance. A camera may be used to determine if the device's ultrasonic blade is immersed. However, particulate sensing of aerosols in place of smoke may be used, e.g., by a central system, as a secondary measure of whether the device is immersed. The particles in the visual field may be occluding the primary feed of the scope providing visualization. The choice may be to use the secondary feed or a combination of what the scope can see and confirmation or not of the presence of aerosol.



FIG. 7 illustrates one embodiment of a method 2200 of information discrimination. The method 2200 may include receiving 2202, at a surgical instrument and during performance of a surgical procedure on a patient, at least two data streams available to the surgical instrument from an external source. Each of the at least two data streams may include data collected in real time with the performance of the surgical procedure and regarding a same measured parameter.


The method 2200 may also include controlling 2204, using a controller of the surgical instrument, performance of a function of the surgical instrument in an open loop configuration or in a closed loop configuration. In the open loop configuration, the control of the performance of the function may be without using the data from any of the at least two data streams. In the closed loop configuration, the control of the performance of the function may be using the data of a one of the at least two data streams selected by the controller.


The selection by the controller of the one of the at least two data streams may be based on which of the at least two data streams improves operational behavior of the surgical instrument in either stability or outcome. The selection by the controller of the one of the at least two data streams may be based on at least one of: a hierarchy of the at least two data streams, a trustworthiness of each of the at least two data streams, and a time-dependability of each of the at least two data streams.
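As a hedged illustration of how a controller might combine these criteria, the following sketch scores each candidate stream on hierarchy, trustworthiness, and time-dependability and picks the best; the field names and weights are assumptions, not the disclosed selection algorithm:

```python
def select_stream(streams, weights=(0.5, 0.3, 0.2)):
    """Pick the data stream with the best weighted score across
    hierarchy, trustworthiness, and time-dependability."""
    w_hier, w_trust, w_time = weights

    def score(stream):
        return (w_hier * stream["hierarchy_score"]
                + w_trust * stream["trust_score"]
                + w_time * stream["timeliness_score"])

    return max(streams, key=score)
```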


Computer Systems

A computer system may be suitable for use in implementing computerized components described herein. In broad overview of an exemplary embodiment, the computer system may include a processor configured to perform actions in accordance with instructions, and memory devices configured to store instructions and data. The processor may be in communication, via a bus, with the memory (and/or incorporate the memory) and with at least one network interface controller with a network interface for connecting to external devices, e.g., a computer system (such as a mobile phone, a tablet, a laptop, a server, etc.). The processor may also be configured to be in communication, via the bus, with any other processor(s) of the computer system and with any I/O devices at an I/O interface. Generally, a processor will execute instructions received from the memory. In some embodiments, the computer system can be configured within a cloud computing environment, a virtual or containerized computing environment, and/or a web-based microservices environment.


In more detail, the processor can be any logic circuitry that processes instructions, e.g., instructions fetched from the memory. In many embodiments, the processor may be an embedded processor, a microprocessor unit (MPU), microcontroller unit (MCU), field-programmable gate array (FPGA), or special purpose processor. The computer system can be based on any processor, e.g., a suitable digital signal processor (DSP), or set of processors, capable of operating as described herein. In some embodiments, the processor can be a single core or multi-core processor. In some embodiments, the processor can be composed of multiple processors.


The memory can be any device suitable for storing computer readable data. The memory can be a device with fixed storage or a device for reading removable storage media. Examples include all forms of non-volatile memory, media and memory devices, semiconductor memory devices (e.g., EPROM, EEPROM, SDRAM, flash memory devices, and all types of solid state memory), magnetic disks, and magneto optical disks. A computer system can have any number of memory devices.


The memory also can include a cache memory, which is generally a form of high-speed computer memory placed in close proximity to the processor for fast read/write times. In some embodiments, the cache memory is part of, or on the same chip as, the processor.


The network interface controller may be configured to manage data exchanges via the network interface. The network interface controller may handle the physical, media access control, and data link layers of the Open Systems Interconnect (OSI) model for network communication. In some embodiments, some of the network interface controller's tasks may be handled by the processor. In some embodiments, the network interface controller may be part of the processor. In some embodiments, a computer system may have multiple network interface controllers. In some implementations, the network interface may be a connection point for a physical network link, e.g., an RJ-45 connector. In some embodiments, the network interface controller may support wireless network connections and an interface port may be a wireless Bluetooth transceiver. Generally, a computer system can be configured to exchange data with other network devices via physical or wireless links to a network interface. In some embodiments, the network interface controller may implement a network protocol such as LTE, TCP/IP Ethernet, IEEE 802.11, IEEE 802.16, Bluetooth, or the like.


In some uses, the I/O interface may support an input device and/or an output device. In some uses, the input device and the output device may be integrated into the same hardware, e.g., as in a touch screen. In some uses, such as in a server context, there may be no I/O interface or the I/O interface may not be used. In some uses, additional other components may be in communication with the computer system, e.g., external devices connected via a universal serial bus (USB). In some embodiments, an I/O device may be incorporated into the computer system, e.g., a touch screen on a tablet device.


In some implementations, a computer device may include an additional device such as a co-processor, e.g., a math co-processor configured to assist the processor with high precision or complex calculations.


CONCLUSION

Certain illustrative implementations have been described to provide an overall understanding of the principles of the structure, function, manufacture, and use of the systems, devices, and methods disclosed herein. One or more examples of these implementations have been illustrated in the accompanying drawings. Those skilled in the art will understand that the systems, devices, and methods specifically described herein and illustrated in the accompanying drawings are non-limiting illustrative implementations and that the scope of the present invention is defined solely by the claims. The features illustrated or described in connection with one illustrative implementation may be combined with the features of other implementations. Such modifications and variations are intended to be included within the scope of the present invention. Further, in the present disclosure, like-named components of the implementations generally have similar features, and thus within a particular implementation each feature of each like-named component is not necessarily fully elaborated upon.


Approximating language, as used herein throughout the specification and claims, may be applied to modify any quantitative representation that can permissibly vary without resulting in a change in the basic function to which it is related. Accordingly, a value modified by a term or terms, such as "about," "approximately," and "substantially," is not to be limited to the precise value specified. In at least some instances, the approximating language may correspond to the precision of an instrument for measuring the value. Here and throughout the specification and claims, range limitations may be combined and/or interchanged; such ranges are identified and include all the sub-ranges contained therein unless context or language indicates otherwise.


One skilled in the art will appreciate further features and advantages of the invention based on the above-described implementations. Accordingly, the present application is not to be limited by what has been particularly shown and described, except as indicated by the appended claims. All publications and references cited herein are expressly incorporated herein by reference in their entirety for all purposes.

Claims
  • 1. A surgical data discrimination system, comprising: a surgical instrument configured to, during performance of a surgical procedure on a patient, receive at least two data streams available to the surgical instrument from an external source, each of the at least two data streams including data collected in real time with the performance of the surgical procedure and regarding a same measured parameter;wherein a controller of the surgical instrument is configured to control performance of a function of the surgical instrument in an open loop configuration or in a closed loop configuration;in the open loop configuration, the controller is configured to control the performance of the function without using the data from any of the at least two data streams; andin the closed loop configuration, the controller is configured to control the performance of the function using the data of a one of the at least two data streams selected by the controller.
  • 2. The surgical data discrimination system of claim 1, wherein the external source includes at least two different surgical system sources being used in relation to the performance of the surgical procedure.
  • 3. The surgical data discrimination system of claim 2, wherein each of the at least two different surgical system sources is one of a hospital network, a database, a surgical instrument, or a surgical cart.
  • 4. The surgical data discrimination system of claim 1, wherein the parameter regards at least one of the patient and the surgical procedure.
  • 5. The surgical data discrimination system of claim 1, wherein the selection by the controller of the one of the at least two data streams is based on which of the at least two data streams improves operational behavior of the surgical instrument in either stability or outcome.
  • 6. The surgical data discrimination system of claim 1, wherein the controller is configured to base the selection by the controller of the one of the at least two data streams based on at least one of: a hierarchy of the at least two data streams,a trustworthiness of each of the at least two data streams, anda time-dependability of each of the at least two data streams.
  • 7. The surgical data discrimination system of claim 6, wherein the controller is configured to base the selection by the controller of the one of the at least two data streams based on the hierarchy of the at least two data streams.
  • 8. The surgical data discrimination system of claim 6, wherein the controller is configured to base the selection by the controller of the one of the at least two data streams based on the trustworthiness of each of the at least two data streams.
  • 9. The surgical data discrimination system of claim 6, wherein the controller is configured to base the selection by the controller of the one of the at least two data streams based on the time-dependability of each of the at least two data streams.
  • 10. The surgical data discrimination system of claim 1, wherein the controller is configured to identify data streams available to the surgical instrument to identify the at least two data streams.
  • 11. The surgical data discrimination system of claim 10, wherein the controller is configured to identify the available data streams based on the external source being communicatively coupled by a user with the surgical instrument.
  • 12. The surgical data discrimination system of claim 10, wherein the external source includes at least two different surgical systems; and the controller is configured to identify each of the available data streams based on either the surgical system broadcasting data stream identification to the surgical instrument or the controller causing the surgical instrument to interrogate the surgical system for available data stream information.
  • 13. The surgical data discrimination system of claim 10, wherein the controller is configured to identify the available data streams based on an introduction of a surgical system to a same space as the surgical instrument.
  • 14. The surgical data discrimination system of claim 13, wherein the space is a physical operating room space, or the space is an interconnected network.
  • 15. The surgical data discrimination system of claim 1, wherein the function of the surgical instrument is one of stapling tissue of the patient, applying energy to the tissue of the patient, applying a clip to the tissue of the patient, imaging the tissue of the patient, cutting the tissue of the patient, evacuating smoke from the patient, suctioning fluid from the patient, delivering irrigation fluid to the patient, and grasping the tissue of the patient.
  • 16. The surgical data discrimination system of claim 1, wherein the surgical instrument is one of a surgical stapler, a surgical energy device, an imaging device, a surgical clip applier, a smoke evacuator, a surgical dissector, and a surgical grasper; and the external source is one of a surgical stapler, a surgical energy device, an imaging device, a surgical clip applier, a smoke evacuator, a surgical dissector, and a surgical grasper.
  • 17. A surgical data discrimination system, comprising: a processor; anda memory storing instructions that, when executed by the processor, cause the processor to perform operations comprising: controlling performance of a function of a surgical instrument in an open loop configuration or in a closed loop configuration;wherein in the open loop configuration, the performance of the function is controlled without using data from any of at least two data streams available to the surgical instrument from an external source;each of the at least two data streams includes data collected in real time with the performance of the surgical procedure and regarding a same measured parameter; andin the closed loop configuration, the performance of the function is controlled using the data of a selected one of the at least two data streams.
  • 18. The surgical data discrimination system of claim 17, wherein a surgical instrument includes the processor and the memory; and the surgical instrument is one of a surgical stapler, a surgical energy device, an imaging device, a surgical clip applier, a smoke evacuator, a surgical dissector, and a surgical grasper.
  • 19. The surgical data discrimination system of claim 17, wherein a surgical hub includes the processor and the memory, the surgical hub being configured to be communicatively coupled with each of the surgical instrument and the external source.
  • 20. A computer-implemented method, comprising: controlling performance of a function of a surgical instrument in an open loop configuration or in a closed loop configuration;wherein in the open loop configuration, the performance of the function is controlled without using data from any of at least two data streams available to the surgical instrument from an external source;each of the at least two data streams includes data collected in real time with the performance of the surgical procedure and regarding a same measured parameter; andin the closed loop configuration, the performance of the function is controlled using the data of a selected one of the at least two data streams.
CROSS REFERENCE TO RELATED APPLICATIONS

The present application claims priority to U.S. Provisional Patent Application No. 63/603,031 entitled “Smart Surgical Systems” filed Nov. 27, 2023, which is hereby incorporated by reference in its entirety. The subject matter of the present application is related to the following patent applications filed on Nov. 26, 2024, which are hereby incorporated by reference in their entireties: U.S. application Ser. No. 18/960,006 entitled “Methods For Smart Surgical Systems,” U.S. application Ser. No. 18/960,032 entitled “Data Flow Management Between Surgical Systems,” U.S. application Ser. No. 18/960,047 entitled “Mapping Data Pipelines For Surgical Systems,” U.S. application Ser. No. 18/960,059 entitled “Broadcast And Peer-To-Peer Communication For Surgical Systems,” U.S. application Ser. No. 18/960,070 entitled “Data Lifecycle Management For Surgical Systems,” U.S. application Ser. No. 18/960,081 entitled “Data Transformation For Surgical Systems,” U.S. application Ser. No. 18/960,094 entitled “Geofencing For Surgical Systems,” and U.S. application Ser. No. 18/960,117 entitled “Adaptation Of Data Pipelines For Surgical Systems.”

Provisional Applications (1)
Number Date Country
63603031 Nov 2023 US