DATA FLOW MANAGEMENT BETWEEN SURGICAL SYSTEMS

Information

  • Publication Number
    20250174334
  • Date Filed
    November 26, 2024
  • Date Published
    May 29, 2025
Abstract
Surgical systems and related computer-implemented methods are provided, including, during performance of a surgical procedure on a patient, receiving, at a first surgical system, a first dataflow from a second surgical system, the first dataflow including first data regarding a measured patient parameter that the first surgical system is configured to use in performing a function during the performance of the surgical procedure. The computer-implemented methods also include determining that a trigger event occurred during the performance of the surgical procedure such that a sum of a first bandwidth of the first dataflow and a second bandwidth of a second dataflow exceeds an available bandwidth, and, in response to determining that the trigger event occurred, and during the performance of the surgical procedure, adjusting at least one of the first and second dataflows such that the sum of the first and second bandwidths does not exceed the available bandwidth.
Description
FIELD

The present disclosure relates generally to smart surgical devices, systems, and methods.


BACKGROUND

Surgical operations and environments have benefited from advances in technology. These advances include upgraded equipment, therapeutics, techniques, and more, which have resulted in more favorable outcomes for both patients and healthcare personnel. Further benefits can be realized through the continued advancement of technology and the continued integration of such advancements into those operations and environments.


Computers are more and more ubiquitous in everyday life, and as the power of computers and computing systems increases, larger quantities of data can be processed in ways that render meaningful results and information for end users. This type of big data processing has immense benefits for surgical operations and environments as well, as more information can be distilled into meaningful assistance for a user, such as a surgeon, to use and rely on during surgical operations. Ultimately, this additional information for the user can result in even more favorable outcomes for both patients and healthcare personnel.


SUMMARY

In general, smart surgical devices, systems, and methods are provided.


In one embodiment, a computer-implemented method is provided that includes receiving, at a first surgical system and in real time with performance of a surgical procedure on a patient, a first dataflow of patient data from a second surgical system, the first surgical system using the received first dataflow of patient data to perform a first function in real time with the performance of the surgical procedure, the first surgical system using second data received by the first surgical system in a second dataflow in real time with the performance of the surgical procedure to perform a second function in real time with the performance of the surgical procedure, and adjusting, based on occurrence of a trigger event in real time with the performance of the surgical procedure that results in the first and second dataflows exceeding a total bandwidth limit, at least one of the first or second dataflows so that the first and second dataflows do not exceed the total bandwidth limit.


The method can vary in any number of ways. For example, the second dataflow can be one of: flow from the first surgical system to the second surgical system, and flow within the first surgical system.


For another example, the adjusting can include at least one of: changing a prioritization of the first and second dataflows, changing time-related access to the at least one of the first and second dataflows, and temporarily storing data of one of the first dataflow and the second dataflow. The adjusting can be performed using a processor of the first surgical system, using a processor of a surgical hub communicatively coupled with the first and second surgical systems, or using a processor of a cloud-based remote server communicatively coupled with the first and second surgical systems.


For yet another example, the trigger event can be one of: congestion of traffic using the available bandwidth, performance of a predetermined step of the surgical procedure by a surgeon performing the surgical procedure, loss of data in the first dataflow, and a predetermined patient biomarker variation.


For still another example, the method can also include, after the adjusting, transmitting data on the adjusted at least one of the first and second dataflows.
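
By way of non-limiting illustration, the trigger-and-adjust sequence summarized above (detect that the summed bandwidth of the dataflows exceeds the available bandwidth, then prioritize, throttle, and/or temporarily store data before transmitting on the adjusted dataflows) might be sketched as follows. The Dataflow structure, the priority scheme, and all numeric values are hypothetical assumptions introduced only for clarity; they are not the disclosed implementation.

    # Minimal sketch of the trigger-and-adjust method described above.
    # The Dataflow structure, priorities, and numbers are hypothetical.
    from dataclasses import dataclass, field
    from collections import deque

    @dataclass
    class Dataflow:
        name: str
        bandwidth: float                              # current use, Mbit/s
        priority: int                                 # lower = more critical
        buffer: deque = field(default_factory=deque)  # temporary storage

    def trigger_event(flows, available):
        # Trigger: summed bandwidth of the dataflows exceeds what is available.
        return sum(f.bandwidth for f in flows) > available

    def adjust(flows, available):
        # Throttle and temporarily store data of lower-priority flows first.
        for flow in sorted(flows, key=lambda f: f.priority, reverse=True):
            excess = sum(f.bandwidth for f in flows) - available
            if excess <= 0:
                break
            reduction = min(excess, flow.bandwidth)
            flow.buffer.append(("deferred_data", reduction))
            flow.bandwidth -= reduction

    flows = [Dataflow("measured_patient_parameter", 40.0, priority=0),
             Dataflow("video_overlay", 80.0, priority=1)]
    if trigger_event(flows, available=100.0):  # trigger event occurred
        adjust(flows, available=100.0)         # adjust during the procedure
        # ...data is then transmitted on the adjusted dataflow(s)

In this sketch the lowest-priority dataflow is reduced first, which corresponds to the prioritization and temporary-storage options noted in the examples above.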


For yet another example, the first surgical system and the second surgical system can each be one of a hospital network, a database, a surgical instrument, or a surgical cart, e.g., both being surgical instruments, one being a surgical instrument and the other being a database, etc.


In another embodiment, a surgical system is provided that can include the first and second surgical systems of the computer-implemented method.





BRIEF DESCRIPTION OF DRAWINGS

The present invention is described by way of reference to the accompanying figures, which are as follows:



FIG. 1 is a schematic view of one embodiment of a computer-implemented surgical system;



FIG. 2 is a perspective view of one embodiment of a surgical system in one embodiment of a surgical operating room;



FIG. 3 is a schematic view of one embodiment of a surgical hub paired with various systems;



FIG. 4 is a schematic view of one embodiment of a situationally aware surgical system;



FIG. 5 is a perspective view of one embodiment of a surgical instrument and one embodiment of a surgical system that includes the surgical instrument;



FIG. 6A is a schematic view of a data pipeline architecture;



FIG. 6B is an expanded schematic view of the data pipeline architecture of FIG. 6A;



FIG. 7A is a schematic view of one embodiment of data being sorted and handled in a data pipes network; and



FIG. 7B is a flowchart of one embodiment of a method of managing a plurality of data feeds to prevent bandwidth overflow.





It is noted that the drawings are not necessarily to scale. The drawings are intended to depict only typical aspects of the subject matter disclosed herein, and therefore should not be considered as limiting the scope of the disclosure.


DETAILED DESCRIPTION

Certain exemplary embodiments will now be described to provide an overall understanding of the principles of the structure, function, manufacture, and use of the devices, systems, and methods disclosed herein. One or more examples of these embodiments are illustrated in the accompanying drawings. A person skilled in the art will understand that the devices, systems, and methods specifically described herein and illustrated in the accompanying drawings are non-limiting exemplary embodiments and that the scope of the present invention is defined solely by the claims. The features illustrated or described in connection with one exemplary embodiment may be combined with the features of other embodiments. Such modifications and variations are intended to be included within the scope of the present invention.


Further, in the present disclosure, like-named components of the embodiments generally have similar features, and thus within a particular embodiment each feature of each like-named component is not necessarily fully elaborated upon. Additionally, to the extent that linear or circular dimensions are used in the description of the disclosed systems, devices, and methods, such dimensions are not intended to limit the types of shapes that can be used in conjunction with such systems, devices, and methods. A person skilled in the art will recognize that an equivalent to such linear and circular dimensions can easily be determined for any geometric shape. A person skilled in the art will appreciate that a dimension may not be a precise value but nevertheless be considered to be at about that value due to any number of factors such as manufacturing tolerances and sensitivity of measurement equipment. Sizes and shapes of the systems and devices, and the components thereof, can depend at least on the size and shape of components with which the systems and devices will be used.


In general, a health data management system may include an interactive smart system that includes data origination facets, movement, architecture and management, and transformation and lifecycle to determine the mechanisms by which smart systems communicate with each other. The health data management system may include a data stack that defines the handling of data from beginning to end. A data stack may include data sources, data pipelines, data transformation/modeling systems, and data storage systems that define end-to-end handling of data. In one embodiment, the health data management system may include a plurality of smart medical systems that are configured to perform one or more medical operations. The health data management system may utilize the data stack to control and manage data flow to the different smart device systems. In one embodiment, the health data management system may control and manage the data flow for managing a patient or performing a medical procedure, for example, providing surgical assistance during performance of a surgical procedure by one or more smart medical systems (also referred to herein as “surgical systems”).
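
As a non-limiting illustration of the end-to-end data stack just described (data sources, data pipelines, transformation/modeling, and storage), the following sketch wires those stages together; the class name, the single source, and the in-memory storage are assumptions made solely for clarity.

    # Illustrative data stack: sources -> pipeline -> transforms -> storage.
    # All names here are hypothetical, not the disclosed implementation.
    class DataStack:
        def __init__(self, sources, transforms, store):
            self.sources = list(sources)        # data origination
            self.transforms = list(transforms)  # transformation/modeling
            self.store = store                  # data storage

        def run(self):
            # The pipeline moves each record from its source to storage.
            for source in self.sources:
                record = source()
                for transform in self.transforms:
                    record = transform(record)
                self.store(record)

    storage = []
    stack = DataStack(
        sources=[lambda: {"heart_rate": 72}],                      # a source
        transforms=[lambda r: {**r, "hr_pct": r["heart_rate"] / 200}],
        store=storage.append)                                      # storage
    stack.run()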


Surgical Systems


FIG. 1 shows one embodiment of a computer-implemented surgical system 100. The surgical system 100 may include one or more surgical systems (e.g., surgical sub-systems) 102, 103, 104. As in this illustrated embodiment, the surgical system 100 may include first, second, and third surgical systems 102, 103, 104, but may instead include another number, e.g., one, two, four, etc.


The first surgical system 102 is discussed herein as a general representative of the surgical systems 102, 103, 104. For example, the surgical system 102 may include a computer-implemented interactive surgical system. For example, the surgical system 102 may include a surgical hub 106 and/or a computing device 116 in communication with a cloud computing system 108, for example, as described in FIG. 2.


The cloud computing system 108 may include at least one remote cloud server 109 and at least one remote cloud storage unit 110. Embodiments of surgical systems 102, 103, 104 may include one or more wearable sensing systems 111, one or more environmental sensing systems 115, one or more robotic systems (also referred to herein as “robotic surgical systems”) 113, one or more intelligent instruments 114 (e.g., smart surgical instruments), one or more human interface systems 112, etc. A “human interface system” is also referred herein as a “human interface device.” The wearable sensing system(s) 111 may include one or more HCPs (“health care professional” or “health care personnel”) sensing systems and/or one or more patient sensing systems. The environmental sensing system(s) 115 may include one or more devices used for measuring one or more environmental attributes, for example, as further described in FIG. 2. The robotic system(s) 113 may include a plurality of devices used for performing a surgical procedure, for example, as further described in FIG. 2.


Examples of robotic surgical systems include the Ottava™ robotic-assisted surgery system (Johnson & Johnson of New Brunswick, NJ), da Vinci® surgical systems (Intuitive Surgical, Inc. of Sunnyvale, CA), the Hugo™ robotic-assisted surgery system (Medtronic PLC of Minneapolis, MN), the Versius® surgical robotic system (CMR Surgical Ltd of Cambridge, UK), and the Monarch® platform (Auris Health, Inc. of Redwood City, CA). Embodiments of various robotic surgical systems and using robotic surgical systems are further described in, for example, U.S. Pat. App. Pub. No. 2018/0177556 entitled “Flexible Instrument Insertion Using An Adaptive Force Threshold” filed Dec. 28, 2016, U.S. Pat. App. Pub. No. 2020/0000530 entitled “Systems And Techniques For Providing Multiple Perspectives During Medical Procedures” filed Apr. 16, 2019, U.S. Pat. App. Pub. No. 2020/0170720 entitled “Image-Based Branch Detection And Mapping For Navigation” filed Feb. 7, 2020, U.S. Pat. App. Pub. No. 2020/0188043 entitled “Surgical Robotics System” filed Dec. 9, 2019, U.S. Pat. App. Pub. No. 2020/0085516 entitled “Systems And Methods For Concomitant Medical Procedures” filed Sep. 3, 2019, U.S. Pat. No. 8,831,782 entitled “Patient-Side Surgeon Interface For A Teleoperated Surgical Instrument” filed Jul. 15, 2013, and Intl. Pat. Pub. No. WO 2014151621 entitled “Hyperdexterous Surgical System” filed Mar. 13, 2014, which are hereby incorporated by reference in their entireties.


The surgical system 102 may be in communication with the one or more remote servers 109 that may be part of the cloud computing system 108. In an example embodiment, the surgical system 102 may be in communication with the one or more remote servers 109 via an internet service provider's cable/FIOS networking node. In an example embodiment, a patient sensing system may be in direct communication with the one or more remote servers 109. The surgical system 102 (and/or various sub-systems, smart surgical instruments, robots, sensing systems, and other computerized devices described herein) may collect data in real-time and transfer the data to the cloud computing system 108 for data processing and manipulation, e.g., by the one or more remote servers 109. It will be appreciated that cloud computing may rely on sharing computing resources rather than having local servers or personal devices to handle software applications.


The surgical system 102 and/or a component therein may communicate with the one or more remote servers 109 via a cellular transmission/reception point (TRP) or a base station using one or more of the following cellular protocols: GSM/GPRS/EDGE (2G), UMTS/HSPA (3G), long term evolution (LTE) or 4G, LTE-Advanced (LTE-A), new radio (NR) or 5G, and/or other wired or wireless communication protocols. Various embodiments of cloud-based analytics that may be performed by the cloud computing system 108 are described further in, for example, U.S. Pat. App Pub. No. 2019/0206569 entitled “Method Of Cloud Based Data Analytics For Use With The Hub” published Jul. 4, 2019, which is hereby incorporated by reference in its entirety.


The surgical hub 106 may have cooperative interactions with one or more means of displaying an image, e.g., a display configured to display an image from a laparoscopic scope, etc., and information from one or more other smart devices and/or one or more sensing systems. The surgical hub 106 may interact with the one or more sensing systems, the one or more smart devices, and the one or more means of displaying an image. The surgical hub 106 may be configured to gather measurement data from the sensing system(s) and send notifications or control messages to the sensing system(s). The surgical hub 106 may send and/or receive information including notification information to and/or from the one or more human interface systems 112. The one or more human interface systems 112 may include one or more human interface devices (HIDs). The surgical hub 106 may send notification information and/or control information to audio devices, display devices, and/or other devices that are in communication with the surgical hub 106.


In an exemplary embodiment, the one or more sensing systems may include the one or more wearable sensing systems 111 (which may include one or more HCP sensing systems and/or one or more patient sensing systems) and/or the environmental sensing system(s) 115 shown in FIG. 1. The sensing system(s) may measure data relating to various biomarkers.


In an exemplary embodiment, the sensing system(s) may measure the biomarkers using one or more sensors, for example, photosensors (e.g., photodiodes, photoresistors), mechanical sensors (e.g., motion sensors), acoustic sensors, electrical sensors, electrochemical sensors, thermoelectric sensors, infrared sensors, etc. The sensor(s) may measure the biomarkers using one or more of the following sensing technologies: photoplethysmography, electrocardiography, electroencephalography, colorimetry, impedimetry, potentiometry, amperometry, etc.


The biomarkers measured by the sensing system(s) may include, but are not limited to, sleep, core body temperature, maximal oxygen consumption, physical activity, alcohol consumption, respiration rate, oxygen saturation, blood pressure, blood sugar, heart rate variability, blood potential of hydrogen, hydration state, heart rate, skin conductance, peripheral temperature, tissue perfusion pressure, coughing and sneezing, gastrointestinal motility, gastrointestinal tract imaging, respiratory tract bacteria, edema, mental aspects, sweat, circulating tumor cells, autonomic tone, circadian rhythm, and/or menstrual cycle.


The biomarkers may relate to physiologic systems, which may include, but are not limited to, behavior and psychology, the cardiovascular system, renal system, skin system, nervous system, gastrointestinal system, respiratory system, endocrine system, immune system, tumor, musculoskeletal system, and/or reproductive system. Information from the biomarkers may be determined and/or used by the computer-implemented surgical system 100, for example, to improve said systems and/or to improve patient outcomes.


The sensing system(s) may send data to the surgical hub 106. The sensing system(s) may use one or more of the following radiofrequency (RF) protocols for communicating with the surgical hub 106: Bluetooth, Bluetooth Low-Energy (BLE), Bluetooth Smart, Zigbee, Z-wave, IPv6 Low-power wireless Personal Area Network (6LoWPAN), Wi-Fi, etc.


Various embodiments of sensing systems, biomarkers, and physiological systems are described further in, for example, U.S. Pat. App. Pub. No. 2022/0233119 entitled “Method Of Adjusting A Surgical Parameter Based On Biomarker Measurements” published Jul. 28, 2022, which is hereby incorporated by reference in its entirety.


The sensing systems described herein may be employed to assess physiological conditions of a surgeon operating on a patient, a patient being prepared for a surgical procedure, or a patient recovering after a surgical procedure. The cloud-based computing system 108 may be used to monitor biomarkers associated with an HCP (a surgeon, a nurse, etc.) or a patient in real-time, to generate surgical plans based at least on measurement data gathered prior to a surgical procedure, to provide control signals to one or more surgical instruments during a surgical procedure, and to notify a patient of a complication during a post-surgical period.


The cloud-based computing system 108 may be used to analyze surgical data. Surgical data may be obtained via the intelligent instrument(s) 114, the wearable sensing system(s) 111, the environmental sensing system(s) 115, the robotic system(s) 113, and/or the like in the surgical system 102. Surgical data may include tissue states used to assess leaks or perfusion of sealed tissue after a tissue sealing and cutting procedure, pathology data (including images of samples of body tissue), anatomical structures of the body captured using a variety of sensors integrated with imaging devices and techniques (such as overlaying images captured by multiple imaging devices), image data, and/or the like. The surgical data may be analyzed to improve surgical procedure outcomes by determining whether further treatment, such as the application of endoscopic intervention, emerging technologies, targeted radiation, targeted intervention, or precise robotics to tissue-specific sites and conditions, is warranted. Such data analysis may employ outcome analytics processing and, using standardized approaches, may provide beneficial feedback to either confirm surgical treatments and the behavior of the surgeon or suggest modifications to surgical treatments and the behavior of the surgeon.



FIG. 2 shows one embodiment of the surgical system 102 in one embodiment of a surgical operating room 135. As illustrated in FIG. 2, a patient is being operated on by one or more HCPs. The HCP(s) are being monitored by one or more HCP sensing systems 120 worn by the HCP(s). The HCP(s) and the environment surrounding the HCP(s) may also be monitored by one or more environmental sensing systems including, for example, one or more cameras 121, one or more microphones 122, and other sensors that may be deployed in the operating room. The one or more HCP sensing systems 120 and the environmental sensing systems may be in communication with the surgical hub 106, which in turn may be in communication with the one or more cloud servers 109 of the cloud computing system 108, as shown in FIG. 1. The one or more environmental sensing systems may be used for measuring one or more environmental attributes, for example, HCP position in the surgical theater, HCP movements, ambient noise in the surgical theater, temperature/humidity in the surgical theater, etc.


As illustrated in FIG. 2, a primary display 123 and one or more audio output devices (e.g., speakers 119) are positioned in a sterile field of the surgical system 102 to be visible to an operator at an operating table 124. In addition, a visualization/notification tower 126 is positioned outside the sterile field. The visualization/notification tower 126 may include a first non-sterile human interactive device (HID) 127 and a second non-sterile HID 129, which may be displays and may face away from each other. The display 123 and the HIDs 127, 129 may include a touch screen allowing a human to interface directly with the HID 127, 129. A human interface system, guided by the surgical hub 106, may be configured to utilize the display 123 and the HIDs 127, 129 to coordinate information flow to operators inside and outside the sterile field. In an exemplary embodiment, the surgical hub 106 may cause an HID (e.g., the primary display 123) to display a notification and/or information about the patient and/or a surgical procedure step. In an exemplary embodiment, the surgical hub 106 may prompt for and/or receive input from personnel in the sterile field or in the non-sterile area. In an exemplary embodiment, the surgical hub 106 may cause one or more non-sterile HIDs 127, 129 to display a snapshot of a surgical site, as recorded by an imaging device 130, while maintaining a live feed of the surgical site on one or more sterile HIDs, e.g., the primary HID 123. The snapshot on the non-sterile HID(s) 127, 129 can permit a non-sterile operator to perform a diagnostic step relevant to the surgical procedure, for example.


The surgical hub 106 may be configured to route a diagnostic input or feedback entered by a non-sterile operator at the visualization tower 126 to the primary display 123 within the sterile field, where it can be viewed by a sterile operator at the operating table. In an exemplary embodiment, the input can be in the form of a modification to the snapshot displayed on the non-sterile HID(s) 127, 129, which can be routed to the one or more sterile HIDs, e.g., the primary display 123, by the surgical hub 106.


Various embodiments of surgical hubs are further described in, for example, U.S. Pat. App. Pub. No. 2024/0221937 entitled “Method For Advanced Algorithm Support” published Jul. 4, 2024, U.S. Pat. App. Pub. No. 2024/0112768 entitled “Method For Health Data And Consent Management” published Apr. 4, 2024, U.S. Pat. App. Pub. No. 2024/0220763 entitled “Data Volume Determination For Surgical Machine Learning Applications” published Jul. 2, 2024, U.S. Pat. App. Pub. No. 2019/0206564 entitled “Method For Facility Data Collection And Interpretation” published Jul. 4, 2019, U.S. Pat. App. Pub. No. 2023/0026634 entitled “Surgical Data System And Classification” published Jan. 26, 2023, U.S. Pat. App. Pub. No. 2019/0201115 entitled “Aggregation And Reporting Of Surgical Hub Data” published Jul. 4, 2019, U.S. Pat. App. Pub. No. 2023/0372030 entitled “Automatic Compilation, Annotation, And Dissemination Of Surgical Data To Systems To Anticipate Related Automated Operations” published Nov. 23, 2023, U.S. Pat. App. Pub. No. 2022/0104896 entitled “Interactive Information Overlay On Multiple Surgical Displays” published Apr. 7, 2022, U.S. Pat. No. 11,304,699 entitled “Method For Adaptive Control Schemes For Surgical Network Control And Interaction” issued Apr. 19, 2022, U.S. Pat. No. 10,849,697 entitled “Cloud Interface For Coupled Surgical Devices” issued Dec. 1, 2020, U.S. Pat. App. Pub. No. 2022/0239577 entitled “Ad Hoc Synchronization Of Data From Multiple Link Coordinated Sensing Systems” published Jul. 28, 2022, U.S. Pat. App. Pub. No. 2023/0025061 entitled “Surgical Data System And Management” published Jan. 26, 2023, U.S. Pat. App. Pub. No. 2023/0023083 entitled “Method Of Surgical System Power Management, Communication, Processing, Storage And Display” published Jan. 26, 2023, U.S. Pat. App. Pub. No. 2019/0206556 entitled “Real-Time Analysis Of Comprehensive Cost Of All Instrumentation Used In Surgery Utilizing Data Fluidity To Track Instruments Through Stocking And In-House Processes” published Jul. 4, 2019, U.S. Pat. App. Pub. No. 2023/0096268 entitled “Methods for Controlling Cooperative Surgical Instruments” filed Oct. 5, 2021, U.S. Pat. App. Pub. No. 2019/0200844 entitled “Method Of Hub Communication, Processing, Storage And Display” filed Dec. 4, 2018, U.S. Pat. App. Pub. No. 2019/0200981 entitled “Method Of Compressing Tissue Within A Stapling Device And Simultaneously Displaying The Location Of The Tissue Within The Jaws” filed Dec. 4, 2018, U.S. Pat. App. Pub. No. 2019/0201046 entitled “Method For Controlling Smart Energy Devices” filed Dec. 4, 2018, U.S. Pat. App. Pub. No. 2019/0201114 entitled “Adaptive Control Program Updates For Surgical Hubs” filed Mar. 29, 2018, U.S. Pat. App. Pub. No. 2019/0201140 entitled “Surgical Hub Situational Awareness” filed Mar. 29, 2018, U.S. Pat. App. Pub. No. 2019/0206004 entitled “Interactive Surgical Systems With Condition Handling Of Devices And Data Capabilities” filed Mar. 29, 2018, U.S. Pat. App. Pub. No. 2019/0206555 entitled “Cloud-based Medical Analytics For Customization And Recommendations To A User” filed Mar. 29, 2018, U.S. Pat. No. 11,678,881 entitled “Spatial Awareness Of Surgical Hubs In Operating Rooms” filed Mar. 29, 2018, and U.S. Pat. App. Pub. No. 2019/0207857 entitled “Surgical Network Determination Of Prioritization Of Communication, Interaction, Or Processing Based On System Or Device Needs” filed Nov. 6, 2018, which are hereby incorporated by reference in their entireties.


As in the illustrated embodiment of FIG. 2, one or more surgical instruments 131 may be used in the surgical procedure as part of the surgical system 102. The surgical hub 106 may be configured to coordinate information flow to at least one display showing the surgical instrument(s) 131. A diagnostic input or feedback entered by a non-sterile operator at the visualization tower 126 can be routed by the surgical hub 106 to the at least one display, e.g., the primary display 123, within the sterile field, where it can be viewed by the operator of the surgical instrument(s) 131.


Various embodiments of coordinating information flow and display and various embodiments of surgical instruments are described further in, for example, U.S. Pat. No. 11,937,769 entitled “Method Of Hub Communication, Processing, Storage And Display” issued Mar. 26, 2024, which is hereby incorporated by reference in its entirety.


Examples of surgical instruments include a surgical dissector, a surgical stapler, a surgical grasper, a surgical scope (e.g., an endoscope, a laparoscope, etc.), a surgical energy device (e.g., a mono-polar probe, a bi-polar probe, an ablation probe, an ultrasound device, an ultrasonic end effector, etc.), a surgical clip applier, etc.


Various embodiments of surgical instruments are described further in, for example, U.S. Pat. No. 11,723,642 entitled “Cooperative Access Hybrid Procedures” issued Aug. 14, 2023, U.S. Pat. App. Pub. No. 2013/0256377 entitled “Layer Comprising Deployable Attachment Members” filed Feb. 8, 2013, U.S. Pat. No. 8,393,514 entitled “Selectively Orientable Implantable Fastener Cartridge” filed Sep. 30, 2010, U.S. Pat. No. 8,317,070 entitled “Surgical Stapling Devices That Produce Formed Staples Having Different Lengths” filed Feb. 28, 2007, U.S. Pat. No. 7,143,925 entitled “Surgical Instrument Incorporating EAP Blocking Lockout Mechanism” filed Jun. 21, 2005, U.S. Pat. App. Pub. No. 2015/0134077 entitled “Sealing Materials for Use in Surgical Stapling” filed Nov. 8, 2013, U.S. Pat. App. Pub. No. 2015/0134076 entitled “Hybrid Adjunct Materials for Use in Surgical Stapling” filed Nov. 8, 2013, U.S. Pat. App. Pub. No. 2015/0133996 entitled “Positively Charged Implantable Materials and Method of Forming the Same” filed Nov. 8, 2013, U.S. Pat. App. Pub. No. 2015/0129634 entitled “Tissue Ingrowth Materials and Method of Using the Same” filed Nov. 8, 2013, U.S. Pat. App. Pub. No. 2015/0133995 entitled “Hybrid Adjunct Materials for Use in Surgical Stapling” filed Nov. 8, 2013, U.S. Pat. No. 9,913,642 entitled “Surgical Instrument Comprising a Sensor System” filed Mar. 26, 2014, U.S. Pat. No. 10,172,611 entitled “Adjunct Materials and Methods of Using Same in Surgical Methods for Tissue Sealing” filed Jun. 10, 2014, U.S. Pat. No. 8,989,903 entitled “Methods And Systems For Indicating A Clamping Prediction” filed Jan. 13, 2012, U.S. Pat. No. 9,072,535 entitled “Surgical Stapling Instruments With Rotatable Staple Deployment Arrangements” filed May 27, 2011, U.S. Pat. No. 9,072,536 entitled “Differential Locking Arrangements For Rotary Powered Surgical Instruments” filed Jun. 28, 2012, U.S. Pat. No. 10,531,929 entitled “Control Of Robotic Arm Motion Based On Sensed Load On Cutting Tool” filed Aug. 16, 2016, U.S. Pat. No. 10,709,516 entitled “Curved Cannula Surgical System Control” filed Apr. 2, 2018, U.S. Pat. No. 11,076,926 entitled “Manual Release For Medical Device Drive System” filed Mar. 21, 2018, U.S. Pat. No. 9,839,487 entitled “Method For Engaging Surgical Instrument With Teleoperated Actuator” filed Mar. 17, 2015, U.S. Pat. No. 10,543,051 entitled “Method For Engaging Surgical Instrument With Teleoperated Actuator” issued Jan. 28, 2020, U.S. Pat. No. 9,804,618 entitled “Systems And Methods For Controlling A Segmented Circuit” filed Mar. 25, 2014, U.S. Pat. No. 11,607,239 entitled “Systems And Methods For Controlling A Surgical Stapling And Cutting Instrument” filed Apr. 15, 2016, U.S. Pat. No. 10,052,044 entitled “Time Dependent Evaluation Of Sensor Data To Determine Stability, Creep, And Viscoelastic Elements Of Measures” filed Mar. 6, 2015, U.S. Pat. No. 9,439,649 entitled “Surgical Instrument Having Force Feedback Capabilities” filed Dec. 12, 2012, U.S. Pat. No. 10,751,117 entitled “Electrosurgical Instrument With Fluid Diverter” filed Sep. 23, 2016, U.S. Pat. No. 11,160,602 entitled “Control Of Surgical Field Irrigation” filed Aug. 29, 2017, U.S. Pat. No. 9,877,783 entitled “Energy Delivery Systems And Uses Thereof” filed Dec. 30, 2016, U.S. Pat. No. 11,266,458 entitled “Cryosurgical System With Pressure Regulation” filed Apr. 19, 2019, U.S. Pat. No. 10,314,649 entitled “Flexible Expandable Electrode And Method Of Intraluminal Delivery Of Pulsed Power” filed Aug. 2, 2012, U.S. Pat. App. Pub. No. 
2023/0116781 entitled “Surgical Devices, Systems, And Methods Using Multi-Source Imaging” filed Oct. 5, 2021, U.S. Pat. App. Pub. No. 2023/0102358 entitled “Surgical Devices, Systems, And Methods Using Fiducial Identification And Tracking” filed Oct. 5, 2021, U.S. Pat. No. 10,413,373 entitled “Robotic Visualization And Collision Avoidance” filed Aug. 16, 2016, U.S. Pat. App. Pub. No. 2023/0077141 entitled “Robotically Controlled Uterine Manipulator” filed Sep. 21, 2021, and U.S. Pat. App. Pub. No. 2022/0273309 entitled “Stapler Reload Detection And Identification” filed May 16, 2022, which are hereby incorporated by reference herein in their entireties.


As shown in FIG. 2, the surgical system 102 can be used to perform a surgical procedure on the patient who is lying down on the operating table 124 in the surgical operating room 135. A robotic system 134 may be used in the surgical procedure as a part of the surgical system 102. The robotic system 134 may include a surgeon's console 136, a patient side cart 132 (surgical robot), and a surgical robotic hub 133. The patient side cart 132 can manipulate at least one removably coupled surgical instrument 137 through a minimally invasive incision in the body of the patient while the surgeon views the surgical site via the surgeon's console 136. An image of the surgical site can be obtained by the imaging device 130, which can be manipulated by the patient side cart 132 to orient the imaging device 130. The surgical robotic hub 133 can be used to process the images of the surgical site for subsequent display to the surgeon via the surgeon's console 136.


Various embodiments of robotic systems and various embodiments of surgical instruments are described further in, for example, U.S. Pat. No. 11,559,307 entitled “Method Of Robotic Hub Communication, Detection, And Control” issued Jan. 24, 2023, which is hereby incorporated by reference in its entirety.


The imaging device 130 may include at least one image sensor and one or more optical components. Suitable image sensors may include, but are not limited to, Charge-Coupled Device (CCD) sensors and Complementary Metal-Oxide Semiconductor (CMOS) sensors.


The optical components of the imaging device 130 may include one or more illumination sources and/or one or more lenses. The one or more illumination sources may be directed to illuminate portions of the surgical field. The one or more image sensors may receive light reflected or refracted from the surgical field, including light reflected or refracted from tissue and/or surgical instruments. The illumination source(s) may be configured to radiate electromagnetic energy in the visible spectrum as well as the invisible spectrum. The visible spectrum, sometimes referred to as the “optical spectrum” or the “luminous spectrum,” is the portion of the electromagnetic spectrum that is visible to (e.g., can be detected by) the human eye and may be referred to as “visible light” or simply “light.” A typical human eye will respond to wavelengths in air that range from about 380 nm to about 750 nm.


The invisible spectrum (e.g., the non-luminous spectrum) is the portion of the electromagnetic spectrum that lies below and above the visible spectrum (i.e., wavelengths below about 380 nm and above about 750 nm). The invisible spectrum is not detectable by the human eye. Wavelengths greater than about 750 nm are longer than the red visible spectrum, and they become invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths less than about 380 nm are shorter than the violet spectrum, and they become invisible ultraviolet, x-ray, and gamma ray electromagnetic radiation.


In various aspects, the imaging device 130 is configured for use in a minimally invasive procedure. Examples of imaging devices include an arthroscope, angioscope, bronchoscope, choledochoscope, colonoscope, cystoscope, duodenoscope, enteroscope, esophagogastro-duodenoscope (gastroscope), endoscope, laryngoscope, nasopharyngoscope, nephroscope, sigmoidoscope, thoracoscope, and ureteroscope.


The imaging device 130 may employ multi-spectrum monitoring to discriminate topography and underlying structures. A multi-spectral image is one that captures image data within specific wavelength ranges across the electromagnetic spectrum. The wavelengths may be separated by filters or by the use of instruments that are sensitive to particular wavelengths, including light from frequencies beyond the visible light range, e.g., infrared (IR) and ultraviolet (UV). Spectral imaging can allow extraction of additional information that the human eye fails to capture with its receptors for red, green, and blue. Multi-spectrum monitoring can be a useful tool in relocating a surgical field after a surgical task is completed to perform one or more of the previously described tests on the treated tissue.


It is axiomatic that strict sterilization of the operating room and surgical equipment is required during any surgery. The strict hygiene and sterilization conditions required in a “surgical theater,” e.g., an operating or treatment room, necessitate the highest possible sterility of all medical devices and equipment. Part of that sterilization process is the need to sterilize anything that comes in contact with the patient or penetrates the sterile field, including the imaging device 130 and its attachments and components. It will be appreciated that the sterile field may be considered a specified area, such as within a tray or on a sterile towel, that is considered free of microorganisms, or the sterile field may be considered an area immediately around a patient who has been prepared for a surgical procedure. The sterile field may include the scrubbed team members, who are properly attired, and all furniture and fixtures in the area.


Various embodiments of multi-spectral imaging are described further in, for example, U.S. Pat. No. 11,937,769 entitled “Method Of Hub Communication, Processing, Storage And Display” filed Dec. 4, 2018, which is hereby incorporated by reference in its entirety.


The wearable sensing system(s) 111 illustrated in FIG. 1 may include the one or more HCP sensing systems 120 as shown in FIG. 2. The one or more HCP sensing systems 120 may include sensing system(s) to monitor and detect a set of physical states and/or a set of physiological states of an HCP. An HCP may be a surgeon or one or more healthcare personnel assisting the surgeon or other healthcare service providers in general. The HCP sensing system 120 may send the measurement data associated with a set of biomarkers and data associated with a physical state of the surgeon to the surgical hub 106 for further processing. In an exemplary embodiment, an HCP sensing system 120 may measure a set of biomarkers to monitor the heart rate of an HCP. In an exemplary embodiment, an HCP sensing system 120 worn on a surgeon's wrist (e.g., a watch or a wristband) may use an accelerometer to detect hand motion and/or shakes and determine the magnitude and frequency of tremors.
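
For illustration only, the tremor measurement mentioned above (determining the magnitude and frequency of hand tremors from a wrist-worn accelerometer) might be approximated as in the following sketch; the 100 Hz sampling rate and the synthetic 8 Hz signal are assumptions, not parameters taken from the disclosure.

    # Sketch: estimate tremor frequency and magnitude from accelerometer
    # samples. Sampling rate and signal values are illustrative assumptions.
    import numpy as np

    def tremor_metrics(accel, fs=100.0):
        accel = accel - accel.mean()          # remove gravity/DC offset
        spectrum = np.abs(np.fft.rfft(accel))
        freqs = np.fft.rfftfreq(accel.size, d=1.0 / fs)
        peak = spectrum[1:].argmax() + 1      # skip the DC bin
        return freqs[peak], 2.0 * spectrum[peak] / accel.size

    t = np.arange(0, 2.0, 1 / 100.0)                      # two seconds
    samples = 9.81 + 0.05 * np.sin(2 * np.pi * 8.0 * t)   # synthetic tremor
    freq_hz, magnitude = tremor_metrics(samples)          # ~8.0 Hz, ~0.05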


The environmental sensing system(s) 115 shown in FIG. 1 may send environmental information to the surgical hub 106. In an exemplary embodiment, the environmental sensing system(s) 115 may include a camera 121 for detecting hand/body position of an HCP. The environmental sensing system(s) 115 may include one or more microphones 122 for measuring ambient noise in the surgical theater. Other environmental sensing system(s) 115 may include one or more devices, for example, a thermometer to measure temperature, a hygrometer to measure humidity of the surroundings in the surgical theater, etc. The surgeon biomarkers may include one or more of the following: stress, heart rate, etc. The environmental measurements from the surgical theater may include ambient noise level associated with the surgeon or the patient, surgeon and/or staff movements, surgeon and/or staff attention level, etc. The surgical hub 106, alone or in communication with the cloud computing system 108, may use the surgeon biomarker measurement data and/or environmental sensing information to modify control algorithms of hand-held instruments or the averaging delay of a robotic interface, for example, to minimize tremors.


The surgical hub 106 may use the surgeon biomarker measurement data associated with an HCP to adaptively control the one or more surgical instruments 131. For example, the surgical hub 106 may send a control program to one of the one or more surgical instruments 131 to control the surgical instrument's actuators to limit or compensate for fatigue and use of fine motor skills. The surgical hub 106 may send the control program based on situational awareness and/or the context on importance or criticality of a task. The control program may instruct the instrument to alter operation to provide more control when control is needed.



FIG. 3 shows an embodiment of the surgical system 102 including the surgical hub 106. The surgical hub 106 may be paired with, via a modular control, the one or more wearable sensing systems 111, the one or more environmental sensing systems 115, the one or more human interface systems 112, the one or more robotic systems 113, and the one or more intelligent instruments 114. As in this illustrated embodiment, the surgical hub 106 may include a display 148, an imaging module 149, a generator module 150 (e.g., an energy generator), a communication module 156, a processor module 157, a storage array 158, and an operating-room mapping module 159. In certain aspects, as illustrated in FIG. 3, the surgical hub 106 further includes a smoke evacuation module 154 and/or a suction/irrigation module 155. The various modules and systems may be connected to the modular control either directly via a router or via the communication module 156. The operating theater devices may be coupled to cloud computing resources and data storage, e.g., to the cloud computing system 108, via the modular control. The human interface system(s) 112 may include a display sub-system and a notification sub-system.


The modular control may be coupled to a non-contact sensor module. The non-contact sensor module may measure the dimensions of the operating theater and generate a map of the surgical theater using ultrasonic, laser-type, and/or similar non-contact measurement devices. Other distance sensors can be employed to determine the bounds of an operating room. The sensor module may be configured to determine the size of the operating theater and to adjust Bluetooth-pairing distance limits. A laser-based non-contact sensor module may scan the operating theater by transmitting laser light pulses, receiving laser light pulses that bounce off the perimeter walls of the operating theater, and comparing the phase of the transmitted pulse to the received pulse to determine the size of the operating theater and to adjust Bluetooth-pairing distance limits, for example.
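
A toy calculation of the laser-based ranging just described follows: the round-trip delay of a reflected pulse yields the distance to a perimeter wall, from which a pairing-distance limit could be derived. The 40 ns delay and the 10 m cap are illustrative assumptions only.

    # Toy time-of-flight ranging: distance = c * t / 2 (out and back).
    # The 40 ns round trip and the 10 m cap are illustrative assumptions.
    SPEED_OF_LIGHT = 299_792_458.0  # m/s

    def wall_distance(round_trip_seconds):
        return SPEED_OF_LIGHT * round_trip_seconds / 2.0

    def bluetooth_pairing_limit(room_extent_m, cap_m=10.0):
        # Pair only with devices plausibly inside the operating theater.
        return min(room_extent_m, cap_m)

    distance = wall_distance(40e-9)              # ~6.0 m to the wall
    limit = bluetooth_pairing_limit(distance)    # pairing limit ~6.0 m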


An ultrasound-based non-contact sensor module may scan the operating theater by transmitting a burst of ultrasound and receiving the echo when it bounces off the perimeter walls of an operating theater as described further in, for example, U.S. Pat. No. 11,857,152 entitled “Surgical Hub Spatial Awareness To Determine Devices In Operating Theater” issued Jan. 2, 2024, U.S. Pat. No. 11,278,281 entitled “Interactive Surgical Platform” issued Mar. 22, 2022, and U.S. Prov. Pat. App. No. 62/611,341 entitled “Interactive Surgical Platform” filed Dec. 28, 2017, which are hereby incorporated by reference herein in their entireties.


During a surgical procedure, energy application to tissue, for sealing and/or cutting, may be associated with smoke evacuation, suction of excess fluid, and/or irrigation of the tissue. Fluid, power, and/or data lines from different sources may be entangled during the surgical procedure. Valuable time can be lost addressing this issue during a surgical procedure. Detangling the lines may necessitate disconnecting the lines from their respective modules, which may require resetting the modules. A hub modular enclosure 160 of the surgical hub 106 may offer a unified environment for managing the power, data, and fluid lines, which reduces the frequency of entanglement between such lines.


Energy may be applied to tissue at a surgical site. The surgical hub 106 may include the hub modular enclosure 160 and a combo generator module slidably receivable in a docking station of the hub modular enclosure 160. The docking station may include data and power contacts. The combo generator module may include two or more of: an ultrasonic energy generator component, a bipolar radiofrequency (RF) energy generator component, or a monopolar RF energy generator component that are housed in a single unit. The combo generator module may also include at least one energy delivery cable for connecting the combo generator module to a surgical instrument, at least one smoke evacuation component configured to evacuate smoke, fluid, and/or particulates generated by the application of therapeutic energy to the tissue, and a fluid line extending from the remote surgical site to the smoke evacuation component. The fluid line may be a first fluid line, and a second fluid line may extend from the remote surgical site to a suction and irrigation module 155 slidably received in the hub modular enclosure 160. The hub modular enclosure 160 may include a fluid interface.


The combo generator module may generate multiple energy types for application to the tissue. One energy type may be more beneficial for cutting the tissue, while another different energy type may be more beneficial for sealing the tissue. For example, a bipolar generator can be used to seal the tissue while an ultrasonic generator can be used to cut the sealed tissue. In an exemplary embodiment, the hub modular enclosure 160 may be configured to accommodate different generators and facilitate an interactive communication therebetween. The hub modular enclosure 160 may enable the quick removal and/or replacement of various modules.


The hub modular enclosure 160 may include a first energy-generator module, configured to generate a first energy for application to the tissue, and a first docking station comprising a first docking port that includes first data and power contacts, wherein the first energy-generator module is slidably movable into an electrical engagement with the first power and data contacts and wherein the first energy-generator module is slidably movable out of the electrical engagement with the first power and data contacts. The modular surgical enclosure may include a second energy-generator module configured to generate a second energy, different than the first energy, for application to the tissue, and a second docking station comprising a second docking port that includes second data and power contacts, wherein the second energy-generator module is slidably movable into an electrical engagement with the second power and data contacts, and wherein the second energy-generator module is slidably movable out of the electrical engagement with the second power and data contacts. In addition, the modular surgical enclosure may also include a communication bus between the first docking port and the second docking port, configured to facilitate communication between the first energy-generator module and the second energy-generator module.


As shown in FIG. 3, the hub modular enclosure 160 may allow the modular integration of the generator module 150, the smoke evacuation module 154, and the suction/irrigation module 155. The hub modular enclosure 160 may facilitate interactive communication between the operating-room mapping, smoke evacuation, and suction/irrigation modules 159, 154, 155. The generator module 150 can be provided with integrated monopolar, bipolar, and ultrasonic components supported in a single housing unit slidably insertable into the hub modular enclosure 160. The generator module 150 may connect to a monopolar device 151, a bipolar device 152, and an ultrasonic device 153. The generator module 150 may include a series of monopolar, bipolar, and/or ultrasonic generator modules that interact through the hub modular enclosure 160. The hub modular enclosure 160 may facilitate the insertion of multiple generators and interactive communication between the generators docked into the hub modular enclosure 160 so that the generators would act as a single generator.


A surgical data network having a set of communication hubs may connect the sensing system(s), the modular devices located in one or more operating theaters of a healthcare facility, a patient recovery room, or a room in a healthcare facility specially equipped for surgical operations, to the cloud computing system 108.



FIG. 4 illustrates one embodiment of a situationally aware surgical system 200. Data sources 202 of the situationally aware surgical system 200 may include, for example, modular devices 204, databases 206 (e.g., an electronic medical records (EMR) database, such as of a hospital or other medical facility, containing patient records, etc.), patient monitoring devices 208 (e.g., a blood pressure (BP) monitor, an electrocardiography (EKG) monitor, one or more wearable sensing systems 111, etc.), HCP monitoring devices 210 (e.g., one or more wearable sensing systems 111, etc.), and/or environment monitoring devices 212 (e.g., one or more environmental sensing systems 115, etc.).


The modular devices 204 may include sensors configured to detect parameters associated with a patient, HCPs, the environment, and/or the modular device 204 itself. The modular devices 204 may include the one or more intelligent instruments 114.


The data sources 202 may be in communication (e.g., wirelessly or wired) with a surgical hub 214, such as the surgical hub 106. The surgical hub 214 may derive contextual information pertaining to a surgical procedure from data based upon, for example, the particular combination(s) of data received from the data sources 202 or the particular order in which the data is received from the data sources 202. The contextual information inferred from the received data can include, for example, the type of surgical procedure being performed, the particular step of the surgical procedure that a surgeon (and/or other HCP) is performing, the type of tissue being operated on, or a body cavity that is the subject of the surgical procedure. This ability by some aspects of the surgical hub 214 to derive or infer information related to the surgical procedure from received data can be referred to as “situational awareness.” For example, the surgical hub 214 can incorporate a situational awareness system, which may be the hardware and/or programming associated with the surgical hub 214 that derives contextual information pertaining to the surgical procedure from the received data and/or surgical plan information received from an edge computing system 216 or an enterprise cloud server 218, such as the cloud computing system 108. The contextual information derived from the data sources 202 may include, for example, what step of the surgical procedure is being performed, whether and how a particular modular device 204 is being used, and the patient's condition.


The surgical hub 214 may be connected to the databases 206 of the data sources 202 to retrieve therefrom data regarding the surgical procedure that is being performed or is to be performed. The data that may be received by the situational awareness system of the surgical hub 214 from the databases 206 may include, for example, start (or setup) time or operational information regarding the procedure (e.g., a segmentectomy in the upper right portion of the thoracic cavity). The surgical hub 214 may derive contextual information regarding the surgical procedure from this data alone or from the combination of this data and other data from the data sources 202.


The surgical hub 214 may be connected to (e.g., paired with) the patient monitoring devices 208 of the data sources 202. Examples of the patient monitoring devices 208 that can be paired with the surgical hub 214 may include a pulse oximeter (SpO2 monitor), a blood pressure (BP) monitor, and an electrocardiogram (EKG) monitor. Perioperative data that is received by the situational awareness system of the surgical hub 214 from the patient monitoring devices 208 may include, for example, the patient's oxygen saturation, blood pressure, heart rate, and/or other physiological parameters. The contextual information that may be derived by the surgical hub 214 from the perioperative data transmitted by the patient monitoring devices 208 may include, for example, whether the patient is located in the operating theater or under anesthesia. The surgical hub 214 may derive these inferences from data from the patient monitoring devices 208 alone or in combination with data from other data from the data sources 202, such as a ventilator and/or other data source.
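
As a purely illustrative sketch of one inference named above, the following shows how a determination of whether the patient is likely under anesthesia might be made from perioperative monitor data; the function name and the thresholds are assumptions made for illustration, not clinical guidance or the disclosed logic.

    # Rule-of-thumb sketch of a single situational inference. Thresholds
    # are illustrative assumptions, not clinical guidance.
    def likely_under_anesthesia(heart_rate_bpm, respiration_rate_bpm,
                                ventilator_active):
        # Mechanical ventilation plus suppressed, steady vitals is a hint
        # that the patient has been anesthetized.
        return (ventilator_active
                and heart_rate_bpm < 90
                and respiration_rate_bpm < 14)

    likely_under_anesthesia(62, 10, ventilator_active=True)  # -> True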


The surgical hub 214 may be connected to (e.g., paired with) the modular devices 204. Examples of the modular devices 204 that are paired with the surgical hub 214 may include a smoke evacuator, a medical imaging device such as the imaging device 130 shown in FIG. 2, an insufflator, a combined energy generator (for powering an ultrasonic surgical instrument and/or an RF electrosurgical instrument), and a ventilator.


The perioperative data received by the surgical hub 214 may be from a medical imaging device and/or other device(s). The perioperative data received by the surgical hub 214 from the medical imaging device may include, for example, whether the medical imaging device is activated and image data. The contextual information that is derived by the surgical hub 214 from the perioperative data sent by the medical imaging device may include, for example, whether the procedure is a video-assisted thoracic surgery (VATS) procedure (based on whether the medical imaging device is activated or paired to the surgical hub 214 at the beginning or during the course of the procedure). The image data (e.g., still image and/or video image) from the medical imaging device (or the data stream representing the video for a digital medical imaging device) may be processed by a pattern recognition system or a machine learning system to recognize features (e.g., organs or tissue types) in the field of view (FOV) of the medical imaging device, for example. The contextual information that is derived by the surgical hub 214 from the recognized features may include, for example, what type of surgical procedure (or step thereof) is being performed, what organ is being operated on, or what body cavity is being operated in.


The situational awareness system of the surgical hub 214 may derive the contextual information from the data received from the data sources 202 in a variety of different ways. For example, the situational awareness system can include a pattern recognition system, or a machine learning system (e.g., an artificial neural network), that has been trained on training data to correlate various inputs (e.g., data from the databases 206, the patient monitoring devices 208, the modular devices 204, the HCP monitoring devices 210, and/or the environment monitoring devices 212) to corresponding contextual information regarding a surgical procedure. For example, a machine learning system may accurately derive contextual information regarding a surgical procedure from the provided inputs. In examples, the situational awareness system can include a lookup table storing pre-characterized contextual information regarding a surgical procedure in association with one or more inputs (or ranges of inputs) corresponding to the contextual information. In response to a query with one or more inputs, the lookup table can return the corresponding contextual information for the situational awareness system for controlling one or more of the modular devices 204. In examples, the contextual information received by the situational awareness system of the surgical hub 214 can be associated with a particular control adjustment or set of control adjustments for one or more of the modular devices 204. In examples, the situational awareness system can include a machine learning system, lookup table, or other such system, which may generate or retrieve one or more control adjustments for one or more of the modular devices 204 when provided the contextual information as input.
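
The lookup-table variant described above can be sketched as follows; the input keys, the contexts, and the control adjustments are hypothetical examples used only to show the mapping from one or more inputs to pre-characterized contextual information and a corresponding control adjustment.

    # Sketch of a lookup-table situational awareness system. All keys,
    # contexts, and adjustments below are hypothetical examples.
    LOOKUP = {
        ("insufflator_on", "thoracic_scope_paired"): (
            "VATS procedure",                         # contextual information
            {"smoke_evacuator": "rate_high"},         # control adjustment
        ),
        ("stapler_active", "abdominal_scope_paired"): (
            "abdominal stapling step",
            {"energy_generator": "stomach_tissue_profile"},
        ),
    }

    def situational_awareness(inputs):
        # Return (context, control adjustments) for known inputs, else None.
        return LOOKUP.get(inputs)

    context = situational_awareness(("insufflator_on",
                                     "thoracic_scope_paired"))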


For example, based on data from the data sources 202, the surgical hub 214 may determine what type of tissue is being operated on. The surgical hub 214 can infer whether a surgical procedure being performed is a thoracic or an abdominal procedure, allowing the surgical hub 214 to determine whether the tissue clamped by an end effector of a surgical stapling and cutting instrument is lung tissue (for a thoracic procedure) or stomach tissue (for an abdominal procedure). The surgical hub 214 may determine whether the surgical site is under pressure (by determining that the surgical procedure is utilizing insufflation) and determine the procedure type, providing a consistent amount of smoke evacuation for both thoracic and abdominal procedures. Based on data from the data sources 202, the surgical hub 214 may determine what step of the surgical procedure is being performed or will subsequently be performed.


The surgical hub 214 may determine what type of surgical procedure is being performed and customize an energy level according to an expected tissue profile for the surgical procedure. The situationally aware surgical hub 214 may adjust the energy level for an ultrasonic surgical instrument or RF electrosurgical instrument throughout the course of a surgical procedure, rather than just on a procedure-by-procedure basis.


In examples, data can be drawn from one or more data sources 202 to improve the conclusions that the surgical hub 214 draws from another one of the data sources 202. The surgical hub 214 may augment data that it receives from the modular devices 204 with contextual information that it has built up regarding the surgical procedure from the other data sources 202.


The situational awareness system of the surgical hub 214 can consider the physiological measurement data to provide additional context in analyzing the visualization data. The additional context can be useful when the visualization data may be inconclusive or incomplete on its own.


The surgical hub 214 may determine whether a surgeon (and/or other HCP(s)) was making an error or otherwise deviating from the expected course of action during the course of a surgical procedure. For example, the surgical hub 214 may determine a type of surgical procedure being performed, retrieve a corresponding list of steps or order of equipment usage (e.g., from a memory of the surgical hub 214 or other computer system), and compare the steps being performed or the equipment being used during the course of the surgical procedure to the expected steps or equipment for the type of surgical procedure that the surgical hub 214 determined is being performed. The surgical hub 214 can provide an alert indicating that an unexpected action is being performed or an unexpected device is being utilized at the particular step in the surgical procedure.
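

A minimal sketch of such deviation detection, assuming a hypothetical procedure plan and hypothetical step and device names, might look like the following:

```python
# Sketch of deviation detection against an expected procedure plan.
# The procedure name, step names, and device names are hypothetical.

EXPECTED_STEPS = {
    "sleeve_gastrectomy": ["access", "mobilize_stomach", "staple_sleeve",
                           "leak_test", "close"],
}

def check_step(procedure: str, step_index: int, observed_step: str,
               observed_device: str, expected_device: str) -> list[str]:
    """Compare the observed step/device against the expected plan and
    return any alert messages, e.g., for display on a HID."""
    alerts = []
    expected = EXPECTED_STEPS[procedure]
    if step_index >= len(expected) or observed_step != expected[step_index]:
        alerts.append(f"Unexpected action '{observed_step}' at step {step_index}")
    if observed_device != expected_device:
        alerts.append(f"Unexpected device '{observed_device}' in use")
    return alerts

print(check_step("sleeve_gastrectomy", 1, "staple_sleeve",
                 "monopolar_hook", "harmonic_shears"))
```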


The surgical instruments (and other modular devices 204) may be adjusted for the particular context of each surgical procedure (such as adjusting to different tissue types) and validating actions during a surgical procedure. Next steps, data, and display adjustments may be provided to surgical instruments (and other modular devices 204) in the surgical theater according to the specific context of the surgical procedure.


Embodiments of situational awareness systems and using situational awareness systems during performance of a surgical procedure are described further in, for example, U.S. patent application Ser. No. 16/729,772 entitled “Analyzing Surgical Trends By A Surgical System” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,747 entitled “Dynamic Surgical Visualization Systems” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,744 entitled “Visualization Systems Using Structured Light” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,778 entitled “System And Method For Determining, Adjusting, And Managing Resection Margin About A Subject Tissue” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,729 entitled “Surgical Systems For Proposing And Corroborating Organ Portion Removals” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,778 entitled “Surgical System For Overlaying Surgical Instrument Data Onto A Virtual Three Dimensional Construct Of An Organ” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,751 entitled “Surgical Systems For Generating Three Dimensional Constructs Of Anatomical Organs And Coupling Identified Anatomical Structures Thereto” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,740 entitled “Surgical Systems Correlating Visualization Data And Powered Surgical Instrument Data” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,737 entitled “Adaptive Surgical System Control According To Surgical Smoke Cloud Characteristics” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,796 entitled “Adaptive Surgical System Control According To Surgical Smoke Particulate Characteristics” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,803 entitled “Adaptive Visualization By A Surgical System” filed Dec. 30, 2019, and U.S. patent application Ser. No. 16/729,807 entitled “Method Of Using Imaging Devices In Surgery” filed Dec. 30, 2019, which are hereby incorporated by reference in their entireties.



FIG. 5 illustrates one embodiment of a surgical system 300 that may include a surgical instrument 302, such as the surgical instrument 114 of FIG. 1 or the surgical instrument 131 of FIG. 2. The surgical instrument 302 can be in communication with a console 304 and/or a portable device 306 through a local area network (LAN) 308 and/or a cloud network 310, such as the cloud computing system 108 of FIG. 1, via a wired and/or wireless connection. The console 304 and the portable device 306 may be any suitable computing device.


The surgical instrument 302 may include a handle 312, an adapter 314, and a loading unit 316. The adapter 314 releasably couples to the handle 312 and the loading unit 316 releasably couples to the adapter 314 such that the adapter 314 transmits a force from one or more drive shafts to the loading unit 316. The adapter 314 or the loading unit 316 may include a force gauge (not explicitly shown in FIG. 5) disposed therein to measure a force exerted on the loading unit 316. In some embodiments, the adapter 314 is non-releasably attached to the handle 312. In some embodiments, the adapter 314 and the loading unit 316 are integral and may be releasably attachable to the handle 312 or non-releasably attached to the handle 312.


The loading unit 316 may include an end effector 318 having a first jaw 320 and a second jaw 322. The loading unit 316 may be an in-situ loaded or multi-firing loading unit (MFLU) that allows a clinician to fire a plurality of fasteners (e.g., staples, clips, etc.) multiple times without requiring the loading unit 316 to be removed from a surgical site to reload the loading unit 316. The first and second jaws 320, 322 may be configured to clamp tissue therebetween, fire fasteners through the clamped tissue, and sever the clamped tissue. The first jaw 320 may be configured to fire at least one fastener a plurality of times or may be configured to include a replaceable multi-fire fastener cartridge including a plurality of fasteners that may be fired more than one time prior to being replaced. The second jaw 322 may include an anvil that deforms or otherwise secures the fasteners, as the fasteners are ejected from the multi-fire fastener cartridge.


The surgical instrument 302 may include a motor, such as at the handle 312, that is coupled to the one or more drive shafts to effect rotation of the one or more drive shafts. The surgical instrument 302 may include a control interface, such as at the handle 312, to selectively activate the motor. The control interface may include buttons, switches, levers, sliders, touchscreens, and/or any other suitable input mechanisms or user interfaces, which can be engaged by a clinician to activate the motor.


The control interface of the surgical instrument 302 may be in communication with a controller 324 (e.g., a microprocessor or other controller) of the surgical instrument 302, shown in the embodiment of FIG. 5 disposed in the handle 312, to selectively activate the motor to effect rotation of the one or more drive shafts. The controller 324 may be configured to receive input from the control interface, adapter data from the adapter 314, and loading unit data from the loading unit 316. The controller 324 may analyze the input from the control interface and the data received from the adapter 314 and/or the data received from the loading unit 316 to selectively activate the motor. The surgical instrument 302 may also include a display, such as at the handle 312, that is viewable by a clinician during use of the surgical instrument 302. The display may be configured to display portions of the adapter data and/or loading unit data before, during, or after firing of the surgical instrument 302.


The adapter 314 may include an adapter identification device 326 disposed therein, and the loading unit 316 may include a loading unit identification device 328 disposed therein. The adapter identification device 326 may be in communication with the controller 324, and the loading unit identification device 328 may be in communication with the controller 324. It will be appreciated that the loading unit identification device 328 may be in communication with the adapter identification device 326, which relays or passes communication from the loading unit identification device 328 to the controller 324. In embodiments in which the adapter 314 and the loading unit 316 are integral, one of the adapter identification device 326 and the loading unit identification device 328 may be omitted.


The adapter 314 may also include one or more sensors 330 disposed thereabout to detect various conditions of the adapter 314 or of the environment (e.g., if the adapter 314 is connected to a loading unit, if the adapter 314 is connected to a handle, if the one or more drive shafts are rotating, a torque of the one or more drive shafts, a strain of the one or more drive shafts, a temperature within the adapter 314, a number of firings of the adapter 314, a peak force of the adapter 314 during firing, a total amount of force applied to the adapter 314, a peak retraction force of the adapter 314, a number of pauses of the adapter 314 during firing, etc.). The one or more sensors 330 may provide an input to the adapter identification device 326 (or to the loading unit identification device 328 if the adapter identification device 326 is omitted) in the form of data signals. The data signals of the one or more sensors 330 may be stored within or be used to update the adapter data stored within the adapter identification device 326 (or the loading unit identification device 328 if the adapter identification device 326 is omitted). The data signals of the one or more sensors 330 may be analog or digital. The one or more sensors 330 may include, for example, a force gauge to measure a force exerted on the loading unit 316 during firing.


The handle 312 and the adapter 314 can be configured to interconnect the adapter identification device 326 and the loading unit identification device 328 with the controller 324 via an electrical interface. The electrical interface may be a direct electrical interface (e.g., include electrical contacts that engage one another to transmit energy and signals therebetween). Additionally, or alternatively, the electrical interface may be a non-contact electrical interface to wirelessly transmit energy and signals therebetween (e.g., inductive transfer). It is also contemplated that the adapter identification device 326 and the controller 324 may be in wireless communication with one another via a wireless connection separate from the electrical interface.


The handle 312 may include a transceiver 332 that is configured to transmit instrument data from the controller 324 to one or more other components of the surgical system 300 (e.g., the LAN 308, the cloud 310, the console 304, and/or the portable device 306). The controller 324 may also transmit instrument data and/or measurement data associated with the one or more sensors 330 to a surgical hub, such as the surgical hub 106 of FIGS. 1-3 or the surgical hub 214 of FIG. 4. The transceiver 332 may receive data (e.g., cartridge data, loading unit data, adapter data, and/or other notifications) from the surgical hub. The transceiver 332 may receive data (e.g., cartridge data, loading unit data, or adapter data) from the other components of the surgical system 300. For example, the controller 324 may transmit surgical instrument data including a serial number of an adapter (e.g., the adapter 314) attached to the handle 312, a serial number of a loading unit (e.g., the loading unit 316) attached to the adapter 314, and a serial number of a multi-fire fastener cartridge loaded into the loading unit 316, e.g., into one of the jaws 320, 322 at the end effector 318, to the console 304. Thereafter, the console 304 may transmit data (e.g., cartridge data, loading unit data, and/or adapter data) associated with the attached cartridge, the loading unit 316, and the adapter 314, respectively, back to the controller 324. The controller 324 can display messages on the local instrument display or transmit the message, via the transceiver 332, to the console 304 or the portable device 306 to display the message on a display 334 or a device screen of the portable device 306, respectively.


Various exemplary embodiments of aspects of smart surgical systems, for example how smart surgical systems choose to interact with each other, are described further in, for example, U.S. patent application Ser. No. 18/810,323 entitled “Method For Multi-System Interaction” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,036 entitled “Adaptive Interaction Between Smart Healthcare Systems” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,082 entitled “Control Redirection And Image Porting Between Surgical Systems” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/809,890 entitled “Synchronized Motion Of Independent Surgical Devices” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,133 entitled “Synchronization Of The Operational Envelopes Of Independent Surgical Devices” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,170 entitled “Synchronized Motion Of Independent Surgical Devices To Maintain Relational Field Of Views” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,208 entitled “Alignment And Distortion Compensation Of Reference Planes Used By Surgical Devices” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,230 entitled “Shared Set Of Object Registrations For Surgical Devices Using Independent Reference Planes” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,266 entitled “Coordinated Control Of Therapeutic Treatment Effects” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,283 entitled “Functional Restriction Of A System Based On Information From Another Independent System” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/809,960 entitled “Inter-Connectivity Of Data Flows Between Independent Smart Systems” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,041 entitled “Inter-Connectivity Of Data Flows Between Independent Smart Systems” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,119 entitled “Processing And Display Of Tissue Tension” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,175 entitled “Situational Control Of Smart Surgical Devices” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,222 entitled “Method For Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,274 entitled “Visual Data-Based Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,346 entitled “Electrical Data-Based Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,355 entitled “Mechanical Data-Based Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,361 entitled “Multi-Sourced Data-Based Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,407 entitled “Conflict Resolution For Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, and U.S. patent application Ser. No. 18/810,419 entitled “Controlling Patient Monitoring Devices” filed Aug. 20, 2024, which are each hereby incorporated by reference in their entireties.


Operating Intelligent Surgical Instruments

An intelligent surgical instrument, such as the surgical instrument 114 of FIG. 1, the surgical instrument 131 of FIG. 2, or the surgical instrument 302 of FIG. 5, can have an algorithm stored thereon, e.g., in a memory thereof, configured to be executable on board the intelligent surgical instrument, e.g., by a processor thereof, to control operation of the intelligent surgical instrument. In some embodiments, instead of or in addition to being stored on the intelligent surgical instrument, the algorithm can be stored on a surgical hub, e.g., in a memory thereof, that is configured to communicate with the intelligent surgical instrument.


The algorithm may be stored in the form of one or more sets of pluralities of data points defining and/or representing instructions, notifications, signals, etc. to control functions of the intelligent surgical instrument. In some embodiments, data gathered by the intelligent surgical instrument can be used by the intelligent surgical instrument, e.g., by a processor of the intelligent surgical instrument, to change at least one variable parameter of the algorithm. As discussed above, a surgical hub can be in communication with an intelligent surgical instrument, so data gathered by the intelligent surgical instrument can be communicated to the surgical hub and/or data gathered by another device in communication with the surgical hub can be communicated to the surgical hub, and data can be communicated from the surgical hub to the intelligent surgical instrument. Thus, instead of or in addition to the intelligent surgical instrument being configured to change a stored variable parameter, the surgical hub can be configured to communicate the changed at least one variable, alone or as part of the algorithm, to the intelligent surgical instrument and/or the surgical hub can communicate an instruction to the intelligent surgical instrument to change the at least one variable as determined by the surgical hub.


The at least one variable parameter may be among the algorithm's data points, e.g., may be included in instructions for operating the intelligent surgical instrument, and is thus able to be changed by changing one or more of the stored pluralities of data points of the algorithm. After the at least one variable parameter has been changed, subsequent execution of the algorithm can be according to the changed algorithm. As such, operation of the intelligent surgical instrument over time can be managed for a patient to increase the beneficial results of use of the intelligent surgical instrument by taking into consideration actual situations of the patient and actual conditions and/or results of the surgical procedure in which the intelligent surgical instrument is being used. Changing the at least one variable parameter is automated to improve patient outcomes. Thus, the intelligent surgical instrument can be configured to provide personalized medicine based on the patient and the patient's surrounding conditions to provide a smart system. In a surgical setting in which the intelligent surgical instrument is being used during performance of a surgical procedure, automated changing of the at least one variable parameter may allow for the intelligent surgical instrument to be controlled based on data gathered during the performance of the surgical procedure, which may help ensure that the intelligent surgical instrument is used efficiently and correctly and/or may help reduce chances of patient harm, such as harm to a critical anatomical structure.


The at least one variable parameter can be any of a variety of different operational parameters. Examples of variable parameters include motor speed, motor torque, energy level, energy application duration, tissue compression rate, jaw closure rate, cutting element speed, load threshold, etc.
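

As a hedged illustration of how such a variable parameter might be changed, consider the following Python sketch. The parameter names, units, and the tissue-thickness rule are hypothetical stand-ins for whatever rule a given instrument or hub applies:

```python
# Sketch of changing a variable parameter of an instrument algorithm
# based on data gathered during the procedure. Parameter names, units,
# and the thickness rule below are hypothetical.

algorithm = {
    "motor_speed_mm_s": 5.0,    # variable parameter
    "energy_level_w": 30.0,     # variable parameter
    "load_threshold_n": 120.0,  # variable parameter
}

def update_parameters(params: dict, measured_tissue_thickness_mm: float) -> dict:
    """Return an updated copy of the algorithm's variable parameters.

    Thicker tissue slows the firing speed and raises the load threshold,
    an illustrative stand-in for an embodiment-specific rule.
    """
    updated = dict(params)
    if measured_tissue_thickness_mm > 2.0:
        updated["motor_speed_mm_s"] = 3.0
        updated["load_threshold_n"] = 150.0
    return updated

algorithm = update_parameters(algorithm, measured_tissue_thickness_mm=2.6)
print(algorithm)  # subsequent execution uses the changed parameters
```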


Various embodiments of operating surgical instruments are described further in, for example, U.S. Pat. App. Pub. No. 2023/0096268 entitled “Methods for Controlling Cooperative Surgical Instruments” filed Oct. 5, 2021, U.S. Pat. App. Pub. No. 2023/0097906 entitled “Surgical Methods Using Multi-Source Imaging” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0095002 entitled “Surgical Methods Using Fiducial Identification And Tracking” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0101750 entitled “Surgical Methods For Control Of One Visualization With Another” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0100698 entitled “Methods For Controlling Cooperative Surgical Instruments” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0103005 entitled “Methods for Controlling Cooperative Surgical Instruments” published Mar. 30, 2023, and U.S. Pat. App. Pub. No. 2023/0098538 entitled “Cooperative Access Hybrid Procedures” published Mar. 30, 2023, which are hereby incorporated by reference in their entireties.


Data Pipelines

As discussed herein, data may be transmitted from one point to another point, such as during a performance of a surgical procedure on a patient. The data may be transmitted from a source system to a destination system using a data pipeline.


As shown in FIG. 6A, a data pipeline 400 may move data from a source 402 to a destination 404, each of which may be physical or virtual (transient). In some data pipelines, the destination 404 may be called a "sink" or a "target." Any time data is processed between point A and point B (or between multiple points such as points B, C, and D), there is a data pipeline 400 between those points. In general, the data pipeline 400 can include a set of tools and processes, which may be referred to as "steps" or "processing steps," used to automate the movement and transformation of data between the source 402 and the destination 404.
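

A minimal sketch of such a pipeline, with the processing steps modeled as composable functions, might look like the following in Python (the step functions and record fields are illustrative assumptions):

```python
# Minimal sketch of a data pipeline: an ordered set of processing steps
# applied to each record moving from a source to a destination. The step
# functions and record fields are illustrative assumptions.

from typing import Callable, Iterable

class DataPipeline:
    def __init__(self, steps: Iterable[Callable]):
        self.steps = list(steps)  # the "steps" or "processing steps"

    def run(self, record: dict) -> dict:
        """Move one record from point A to point B, applying each step."""
        for step in self.steps:
            record = step(record)
        return record

def to_celsius(record: dict) -> dict:
    """Example transformation step: normalize a temperature reading."""
    record = dict(record)
    record["temp_c"] = round((record.pop("temp_f") - 32) / 1.8, 2)
    return record

def strip_raw(record: dict) -> dict:
    """Example filtering step: drop fields the destination does not use."""
    return {k: v for k, v in record.items() if not k.startswith("raw_")}

pipeline = DataPipeline([to_celsius, strip_raw])
print(pipeline.run({"temp_f": 98.6, "raw_adc": 1023, "sensor": "skin"}))
```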


In some embodiments, the source 402 and the destination 404 are two different elements, such as a first element of a surgical system and a second element of a surgical system. The data from the source 402 may or may not be modified by the data pipeline 400 before being received at the destination 404. For example, the source 402 may be one of the wearable sensing system(s) 111, the environmental sensing system(s) 115, the robotic system(s) 113, the intelligent instrument(s) 114, or the human interface system(s) 112 of the surgical system 102 of FIGS. 1 and 3 and the destination 404 may be the surgical hub 106 of the surgical system 102 of FIGS. 1 and 3. For another example, the source 402 may be one of the wearable sensing system(s) 111, the environmental sensing system(s) 115, the intelligent instrument(s) 114, or the human interface system(s) 112 of one of the surgical systems 102, 103, 104 of FIG. 1 and the destination 404 may be the surgical hub 106 of another one of the surgical systems 102, 103, 104 of FIG. 1. For yet another example, the source 402 may be one of the wearable sensing system(s) 111, the environmental sensing system(s) 115, the robotic system(s) 113, the intelligent instrument(s) 114, or the human interface system(s) 112 of the surgical system 102 of FIGS. 1 and 3 and the destination 404 may be another one of the wearable sensing system(s) 111, the environmental sensing system(s) 115, the robotic system(s) 113, the intelligent instrument(s) 114, or the human interface system(s) 112 of the surgical system 102 of FIGS. 1 and 3. For still another example, the source 402 may be one of the surgical systems 102, 103, 104 of FIG. 1 and the destination 404 may be another one of the surgical systems 102, 103, 104 of FIG. 1.


In some embodiments, the source 402 and the destination 404 are the same element. The data pipeline 400 may thus be purely about modifying the data set between the source 402 and the destination 404.


As shown in FIG. 6B, the data pipeline 400 may include one or more data connectors 406 that extract data from the source 402 and load the extracted data into the destination 404. A plural “N” number of data connectors 406 are shown in FIG. 6B. In some embodiments, such as embodiments in which extract, transform, and load (ETL) processing of data is performed, as opposed to extract, load, and transform (ELT) processing of data, data may be transformed within the data pipeline 400 before the data is received by the destination 404. In other embodiments, such as embodiments in which ELT processing of data is performed, as opposed to ETL processing of data, the one or more data connectors 406 may simply load raw data to the destination 404. In some instances, light transformations may be applied to the data, such as normalizing and cleaning data or orchestrating transformations into models for analysts, before the destination 404 receives the data.
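

The ETL/ELT distinction might be sketched as follows; the light transformation (key and whitespace normalization) and the in-memory source and destination are illustrative stand-ins:

```python
# Sketch contrasting ETL and ELT handling by a data connector. The light
# transformation and the in-memory source/destination are stand-ins.

def normalize(record: dict) -> dict:
    """Light transformation: clean keys and string values."""
    return {k.strip().lower(): v.strip() if isinstance(v, str) else v
            for k, v in record.items()}

def etl_connector(source, destination: list):
    # ETL: transform within the pipeline, before the destination sees data.
    for record in source:
        destination.append(normalize(record))

def elt_connector(source, destination: list):
    # ELT: load raw data; any transformation happens later at the destination.
    destination.extend(source)

raw = [{"Device ": "stapler", "STATE": " armed "}]
etl_out, elt_out = [], []
etl_connector(raw, etl_out)
elt_connector(raw, elt_out)
print(etl_out)  # [{'device': 'stapler', 'state': 'armed'}]
print(elt_out)  # [{'Device ': 'stapler', 'STATE': ' armed '}]
```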


The data pipeline 400 can include physical elements like one or more wires or can include digital elements like one or more packets, network traffic, or internal processor paths/connections. Flexible data pipelines are portions of the overall system where redundant paths can be utilized. Data, e.g., a data stream, can therefore be sent down one path or split between multiple parallel paths (to increase capacity), and these multiple paths can be flexibly adjusted by the system as necessary to accommodate changes in the volume and details of the data streams.


The data pipeline 400 can have a small code base that serves a very specific purpose. These types of applications are called microservices.


The data pipeline 400 can be a big data pipeline. There are five characteristics of big data: volume, variety, velocity, veracity, and value. Big data pipelines are data pipelines built to accommodate more than one of the five characteristics of big data. The velocity of big data makes it appealing to build streaming data pipelines for big data so that data can be captured and processed in real time and some action can then occur. The volume of big data requires that data pipelines be scalable, as the volume can be variable over time. In practice, there are likely to be many big data events that occur simultaneously or very close together, so the big data pipeline must be able to scale to process significant volumes of data concurrently. The variety of big data requires that big data pipelines be able to recognize and process data in many different formats: structured, unstructured, and semi-structured.


In general, an architecture design of a data pipeline, e.g., the data pipeline 400 of FIGS. 6A and 6B, can include interconnectivity between a first smart device and a second smart device, e.g., the source 402 and the destination 404 of FIGS. 6A and 6B. Data generated in one source system (e.g., the first smart device or the second smart device) may feed multiple data pipelines, which may have multiple other data pipelines dependent on their outputs.


The interconnectivity between the first smart device and the second smart device may be on a common/shared network, e.g., LAN, Wi-Fi, powerline networking, MoCA networking, cellular (e.g., 4G, 5G, etc.), low power wide area network (LPWAN), Zigbee, Z-wave, etc.


The interconnectivity between the first smart device and the second smart device may be on a structured network. Traditionally, structured peer-to-peer (P2P) networks implement a distributed hash table (DHT). In order to route traffic efficiently through the network, nodes in a structured overlay must maintain lists of neighbors that satisfy specific criteria. This makes them less robust in networks with a high rate of churn (e.g., with large numbers of nodes frequently joining and leaving the network). DHT-based solutions may have a high cost of advertising/discovering resources and may have static and dynamic load imbalance.


The interconnectivity between the first smart device and the second smart device may be via cooperative networking. Cooperative networking utilizes a system that is a hybrid of a P2P network and a server-client network architecture, offloading serving to peers who have recently established direct interchanges of content.


The interconnectivity between the first smart device and the second smart device may be exclusive. For example, the interconnectivity may be exclusive via Bluetooth. For another example, the interconnectivity may be exclusive via network isolation, such as by using path isolation, a virtual private network (VPN), or a secure access service edge (SASE). The path isolation may include a software-defined wide area network (SD-WAN). SD-WANs rely on software and a centralized control function that can steer traffic across a WAN in a smarter way by handling traffic based on priority, security, and quality of service requirements. The VPN may involve creation of an independent secure network using common/shared open networks. Another network (a carrier network) is used to carry data, which is encrypted. The carrier network will see packets of the data, which it routes. To users of the VPN, it will look like the systems are directly connected to each other.


For example, with interconnectivity between the first smart device and the second smart device being exclusive in a surgical context, an operating room (OR) may have a surgical hub and an established network from a first vendor. In order to secure against hacking or data leakage, the network may be an encrypted common network for which the first vendor supplies keys. A surgical stapler in the OR may be from a second vendor that is different from the first vendor and that does not have the keys from the first vendor. The surgical stapler may want to link to other device(s) it relies on for functionality but does not want data leakage. An advanced energy generator from the second vendor with an accompanying smoke evacuator may also be in the OR, and the two may form their own private network, such as by piggybacking on the first vendor network to create a second encrypted VPN routing through the first vendor network as a primary network or by forming an independent wireless network for bi-directional communication between the advanced energy generator and the smoke evacuator. The surgical stapler may want to communicate with the advanced energy generator, e.g., so the surgical stapler may retrieve updated software from the advanced energy generator, receive tissue properties information from the advanced energy generator, log data for exportation, and receive energy from the advanced energy generator and apply the energy to tissue, but not want to communicate with the smoke evacuator, e.g., because the surgical stapler performs no smoke evacuation. The surgical stapler and a communication backplane of the advanced energy generator may therefore form an isolated network with only the surgical stapler (first smart device) and the advanced energy generator (second smart device) able to communicate via the isolated network and with the surgical hub able to manage the data pipeline between the surgical stapler and the advanced energy generator.


In general, one or more steps may be performed along a data pipeline, e.g., the data pipeline 400 of FIGS. 6A and 6B. The steps in the data pipeline may include data transformation, data augmentation, data enrichment, data filtering, data grouping, data aggregating, and running algorithms against the data.


The data aggregation may include segmentation of data into buckets (e.g., decomposition of a procedure into sub-steps), data fusion and interfacing, and mixing real-time data streams with archived data streams. Various embodiments of data aggregation are described further in, for example, U.S. Pat. App. Pub. No. 2022/0104896 entitled “Interactive Information Overlay On Multiple Surgical Displays” published Apr. 7, 2022, U.S. Pat. App. Pub. No. 2019/0206564 entitled “Method For Facility Data Collection And Interpretation” published Jul. 4, 2019, and U.S. Pat. App. Pub. No. 2024/0221937 entitled “Method For Advanced Algorithm Support” published Jul. 4, 2024, which are hereby incorporated by reference in their entireties.


In one embodiment, mixing real-time data streams with archived data streams may include, in a surgical context, pre-operative data/imaging evaluation. The evaluation may include displaying of static preoperative scan(s), overlaying of video with aligned 3D model, and registering a virtual view to a camera view.


In one embodiment, the display of a static pre-operative scan may include alignment based on surgeon (or other HCP) position, for example, where the surgeon (or other HCP) is standing.


In one embodiment, registering the virtual view to the camera view may include identifying organs in a video and triangulating with camera location and/or getting a camera location in reference to a coordinate system. For example, during performance of a surgical procedure, a camera location may be acquired with respect to a trocar by 3D tracking of the trocar, by camera insertion in the trocar (e.g., insertion depth and/or insertion angle), and/or by determination of what trocar is being used for the camera. An example of an insertion depth indication is a marking on a shaft of the trocar, such as on a graphical scale or a color gradient. Examples of insertion angle in a trocar are 3D trocar orientation and 3D angle of attack.


In one embodiment, the pre-operative data/imaging evaluation may include using a machine learning (ML) algorithm to review preoperative scans of a patient to identify any abnormalities. A cloud-based source may be used for augmented reality (AR) using cloud-based data for surgical procedure planning.


In one embodiment, an ML algorithm may be used in an initial planning stage, e.g., initial planning for a surgical procedure to be performed on a patient. Preoperative scans may be used to facilitate surgical path planning. If the initial scans detect anomalies or diseased tissues, as analyzed by the ML algorithm, the anomalies or diseased tissues may be relayed to the surgeon for the upcoming surgical procedure and a new surgical task order may be suggested based on how previous surgeons handled these problems. The information relayed to the surgeon may also include a recommended inventory list to have on hand based on this initial improved surgical task order.


For example, during preoperative scans for a sleeve gastrectomy, a small hernia may be discovered. This hernia may be highlighted during the surgical planning step, and the surgeon may be asked if the surgeon wants to include a hernial repair in the initial sleeve gastrectomy plan. If the surgeon answers affirmatively, the hernial repair will be added into the surgical task order, and the overall inventory for this case will be updated to include relevant items for the hernial repair added into the surgical task order. During performance of the sleeve gastrectomy, an ML algorithm may be used to detect diseased tissue or surgical anomalies. If a diseased tissue is discovered, the diseased tissue may be highlighted on a screen, e.g., on a HID, and a cutting path/angle may be recommended to avoid the tissue or make the tissue state more manageable. These recommendations may be based on how surgeons previously, successfully, handled these situations. If a surgical anomaly is discovered, the system may either automatically update the task order or require the surgeon to give a verbal command (or other command) to update the task order and highlight the required additional inventory on the circulator's screen. For foreign bodies (such as bougies) that may be discovered, the foreign body may be highlighted on the screen and a cutting path may be included to provide an ample margin around the foreign body, assuming the foreign body is anticipated. If the foreign body is not anticipated, the foreign body may be highlighted to draw the surgeon's (and/or other HCP's) attention to it.


In one embodiment, the pre-operative data/imaging evaluation may include a cloud comparison of scans periodically taken through time for anatomic changes over that time to indicate possible operative complications. A cloud-based source may be used for augmented reality (AR) using preoperative scans to enhance return surgeries.


Looking at a difference between current and previous surgical scans may help inform the surgeon and/or other HCP and improve patient outcomes. This information can be used in various ways, for example for disease detection, informing surgical task planning, and/or informing previous surgical success and healing.


With respect to disease detection, current and historical scans can be used to determine if various disease states or abnormalities have evolved between surgeries. One case where this could be particularly useful is cancer detection. If a scan initially picks up an abnormal growth for a patient and the patient's HCP decides that it is benign but flags it for caution, a follow-up scan may confirm whether or not the abnormality is benign. The scan may also automatically highlight areas of concern (tissue growth) that were not flagged by the HCP initially but could be areas of concern.


With respect to informing surgical task planning, information about previous surgeries (e.g., potential areas of scar tissue, previously seen difficult tissue, etc.) can help facilitate surgical step and path planning. This information can also be used during the surgery to display areas of scarring, changes of tissue from previous surgeries that might need to be examined, foreign bodies, and/or new adhesions.


With respect to informing previous surgical success and healing, data from various scans over time can be used to determine how successfully patients were recovering or had recovered from previous surgeries. This information may be used by surgeons (and/or other HCPs) to help plan future procedures, assess previous work, and/or facilitate quicker patient recovery.


Data development may be performed as a step in a data pipeline and may include one or more of data modeling, database layout and configuration, implementation, data mapping, and correction.


Various data may be communicated using a data pipeline, e.g., the data pipeline 400 of FIGS. 6A and 6B, such as data from a local data source, data from a remote data source, and synthetically generated data.


Data from a local data source may include data collected by, used by, or resulting from the operation of aspects of the local data source, e.g., data gathered using a sensor (e.g., temperature data gathered using a temperature sensor, force data gathered using a force sensor, pressure measured using a pressure sensor, etc.), still and/or video image data (e.g., data gathered by a camera, etc.), operational parameters of a surgical instrument (e.g., energy level, energy type, motor current, cutting element speed, etc.), surgical instrument identification (e.g., instrument type, instrument serial number, etc.), etc.


Data from a local data source may have metadata, which may reflect aspects of a data stream, a device configuration, and/or system behavior that define information about the data. For example, metadata may include an auxiliary data location that is shared by two interconnected systems, e.g., first and second robotic systems, etc., to create a single "brain" instead of two distinct ones. Each of the interconnected systems may create a copy of its memory system and introduce it to the combined system or "collective." Both of the interconnected systems may now use this new area for data exchange, for uni-directional communication, and to directly command control systems. The new combined system may become primary, and the individual robotic systems' memory areas may become secondary memory areas until the systems are "unpaired," e.g., are no longer interconnected.


Data from a local data source may include a data stream that is monitored by at least one other system. In this way, data collected by, used by, or resulting from the operation of aspects of one system may be sourced to another system (the monitoring system).


In one embodiment, application programming interfaces (APIs) may be used to communicate data from a local source.


In one embodiment, data may be communicated from a local source in response to occurrence of a trigger event. In one embodiment, the trigger event is a digital trigger event. For example, in a surgical context, the trigger event may be a surgical instrument changing orientation after being in a predetermined static position, such as when the surgical instrument "wakes up." For another example, in a surgical context, the trigger event may be a system's power or signal interruption, e.g., communicating data after the interruption has been resolved. For yet another example, in a surgical context, the trigger event may be a change in system status, capacity, or connectivity. For still another example, in a surgical context, the trigger event may be a change in quality, calibrations, or conversion factors of a surgical instrument.
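

A minimal sketch of such trigger-driven communication, using hypothetical event names and an in-memory buffer, might look like the following:

```python
# Sketch of trigger-driven communication from a local source: data is
# buffered locally and flushed when a trigger event occurs (e.g., the
# instrument "wakes up" or connectivity is restored). The event names
# are hypothetical.

TRIGGER_EVENTS = {"orientation_changed", "power_restored",
                  "connectivity_restored", "status_changed"}

class LocalSource:
    def __init__(self, send):
        self.send = send   # callable that transmits one record
        self.buffer = []

    def record(self, sample: dict):
        self.buffer.append(sample)

    def on_event(self, event: str):
        """Flush buffered data only in response to a recognized trigger."""
        if event in TRIGGER_EVENTS:
            while self.buffer:
                self.send(self.buffer.pop(0))

src = LocalSource(send=print)
src.record({"force_n": 42.0})
src.on_event("connectivity_restored")  # transmits the buffered sample
```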


Data from a remote data source may include data collected by, used by, or resulting from the operation of aspects of the remote data source. One example of a remote data source includes a database, such as a relational database or a NoSQL database. Examples of data in a relational database relevant in a surgical context can include inventory, sterile services process status, billing code, patient records (PHI), and previous procedure data.


One example of data from a remote data source, in a surgical context, includes a procedure plan, e.g., a plan for a surgical procedure to be performed on a patient. The procedure plan data can include, for example, instrument selection, port placement, adjuncts needed for devices, OR timing and local imaging needs, procedural steps, staff number and skill composition, and patient positioning.


Another example of data from a remote data source, in a surgical context, includes pre-operative imaging, such as a CT full body scan, external ultrasound, MRI, etc.


Another example of data from a remote data source includes software parameter updates, such as software parameter updates streaming from a cloud computing system. The software parameter updates can include, for example, original equipment manufacturer (OEM) updates to a device's operational aspects, e.g., updated basic input/output system (BIOS) controls, calibrations, updates on capabilities (e.g., recalls, limits/expansion of use, indications, contra-indications, etc.), etc.


Another example of data from a remote data source includes gold standard of care or outcomes improvement data, such as gold standard of care or outcomes improvement data from a cloud computing system. Gold standard of care or outcomes improvement data can include, for example, improved techniques of device use and/or device combinations.


In one embodiment, Apache® Hadoop®, which is an open source software framework, may be used for distributed processing of data across computer systems.


Examples of types of synthetically generated data may include synthetic text, media (e.g., video, image, sound, etc.), tabular data, and a calculated continuous stream of data. The calculated continuous stream of data may be randomly generated (bracketed by extreme thresholds) or may be based on another stream or a real continuous data stream that is modified to fit the stream limits of the expected synthetic stream. Reasons for using synthetically generated data can include for training data streams, because of missing data from an expected system that would otherwise draw a device error but is not relevant to the operation of the device or other dependent device, for data streams designed to verify the operation of the transforms or mathematic algorithms, for data streams intended to either verify security or prevent fraud/inauthenticity, for consecutive timing data for redaction of real-time data from the relational data of the systems, for creation of trending data for replacement of legal compliance regulated data streams (e.g., producing datasets from partially synthetic data, where only a selection of the dataset is replaced with synthetic data), and/or for a sudden but anticipatable/explainable change in a data source's feed which is being used as a closed loop control for a destination.


As an example of use of synthetically generated data in a surgical context in which a surgical procedure is being performed on a patient, a PO2 sensor (data source) on the patient's finger may be being used as a means for controlling a closed loop feed of O2 through a ventilator (data destination). The ventilator also has an internal closed loop on CO2 outlet concentration, but since O2 blood saturation is the desired fixed relationship to O2 supplementation level, the ventilator is using the PO2 sensor from the patient monitoring system. There may be an abrupt change in the O2 level as measured by the PO2 sensor. The ventilator has two choices: either switch to the trailing indicator of CO2, which has not had an abrupt change, or examine other data sources to try to explain the O2 shift. When compared to the patient's core body temperature measure, it may be discovered that the patient's temperature has dropped across a threshold 1.5° C. below normal that usually induces vasoconstriction limiting blood flow to the body's extremities. The PO2 measure by its metadata is known by the ventilator to be a finger monitor and therefore on an extremity. Further comparison over time may show the O2 measure fairly constant before the shift and then fairly constant after the shift as well, reinforcing the idea that it was the vasoconstriction that induced the shift. The ventilator may then create a synthetic data stream, based on the shift data pattern and behavior, that compensates for the vasoconstriction shift so the ventilator can continue on the primary linked feeds but using a modified synthetic or "calculated" data stream based on a real stream. For context, current body temperature control systems, such as a Bair Hugger™ device, are open-loop user settable heat gradient controlled systems but are affected by local temperature and environment.
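

A hedged sketch of the "calculated" stream idea in this example might look like the following in Python; the SpO2 values, the shift-detection threshold, and the assumption that the shift has already been attributed to vasoconstriction are all illustrative:

```python
# Sketch of the "calculated" data stream idea: once an abrupt PO2 shift
# has been attributed to vasoconstriction (corroborated elsewhere, e.g.,
# by the core body temperature drop), the ventilator keeps using the
# finger-sensor feed but removes the shift. Values and thresholds are
# hypothetical.

def detect_shift(samples: list[float], threshold: float = 5.0) -> float:
    """Return the step between consecutive samples if it exceeds the
    threshold, else 0.0 (no abrupt shift detected)."""
    for prev, cur in zip(samples, samples[1:]):
        if abs(cur - prev) > threshold:
            return cur - prev
    return 0.0

def synthetic_stream(samples, shift: float, threshold: float = 5.0):
    """Yield a modified ("calculated") stream with the attributed shift
    compensated from the point at which it occurs onward."""
    compensating, prev = False, None
    for s in samples:
        if prev is not None and abs(s - prev) > threshold:
            compensating = True
        prev = s
        yield round(s - shift, 1) if compensating else s

po2 = [97.0, 96.8, 96.9, 89.1, 89.0, 88.7]  # abrupt drop at index 3
shift = detect_shift(po2)                    # about -7.8
print(list(synthetic_stream(po2, shift)))    # drop removed from index 3 on
```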


Flow Management

In various aspects, the present disclosure provides methods, devices, and systems for managing data flow between a source and a destination of a data pipeline such as the source 402 and the destination 404 of FIGS. 6A and 6B. In a surgical context, the flow management may help ensure that a surgeon and/or other HCP receives interrelated real time information during performance of a surgical procedure on a patient so decisions can be made to maximize intended outcomes of intraoperative patient safety and post-operative recovery and clinical effect.


Data flow management may include managing rate, configuration, destination, transformation, and limits. One or more of the rate, configuration, destination, transformation, and limits may be managed to manage the data flow, e.g., manage the flow of data from a source to a destination such as the source 402 and the destination 404 of FIGS. 6A and 6B. Flow management for a data pipeline in a surgical context as described herein differs from standard congestion control or flow management since several of the data streams are synchronized in some manner such that one data stream without its corresponding data stream is meaningless for the control or interpretation of the data. In general, data pipelines may be considered a road, and flow management may be considered traffic on the road.


In one embodiment, limits on the exchange of data may be based on the system's need, the situation, timing of the needs, and the type or configuration of data.


In one embodiment, flow management may include situational adaptation of data flow management within a preexisting data pipeline. Two separate data flows, first and second data flows, may be provided with the first data flow impacting characteristics of the second data flow. A trigger initiating the impact can be based on a step of a surgical procedure being performed, device behavior, or other irregular events. The adaptation may result in a change in prioritization, time-related access of the data, or adjusting the magnitude of the transformation. The impact on the flow may be based on a hierarchical priority of the data. Time-related access may include buffering the data for later use or delaying of the flow by a predetermined amount. Prioritization may be triggered by congestion, current system available capacity, a need for ensuring no interruption of data, or systems becoming corrupted.
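

One possible sketch of such adaptation, in which a congestion trigger causes a lower-priority flow to be buffered for time-shifted access while a higher-priority flow proceeds, is below; the priorities and packet names are hypothetical:

```python
# Sketch of situational adaptation within a preexisting pipeline: when a
# trigger fires (here, congestion), lower-priority packets are buffered
# for later, time-related access while higher-priority packets proceed.
# Priorities and packet names are hypothetical.

import heapq

class FlowScheduler:
    def __init__(self):
        self.queue = []     # (priority, sequence, packet); 1 = highest
        self.seq = 0
        self.deferred = []  # buffered lower-priority packets

    def submit(self, packet, priority: int, congested: bool):
        if congested and priority > 1:
            self.deferred.append(packet)  # buffer for time-shifted access
        else:
            heapq.heappush(self.queue, (priority, self.seq, packet))
            self.seq += 1

    def drain(self):
        """Yield queued packets in hierarchical priority order."""
        while self.queue:
            yield heapq.heappop(self.queue)[2]

sched = FlowScheduler()
sched.submit("fluorescence frame", priority=1, congested=True)
sched.submit("telemetry batch", priority=3, congested=True)  # deferred
print(list(sched.drain()), "| deferred:", sched.deferred)
```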


In an exemplary embodiment, flow management during a performance of a surgical procedure may include managing a plurality of image data feeds. Without all of the plurality of image data feeds being available in real time to a surgeon and/or other HCP, such as via a HID, the surgeon and/or other HCP does not have complete information and thus risks making a decision that causes one or more problems that can be very serious, such as causing injury to the patient (e.g., unintentionally cutting tissue, unintentionally breaking out of a body cavity through a stomach tissue wall, a lung tissue wall, or other tissue wall, etc.), damaging a surgical instrument in use inside a patient (e.g., two surgical instruments colliding and damaging one or both of the surgical instruments, damaging a lens, etc.), and/or other problem. The flow management may ensure that all of the plurality of image data feeds, which may be from one or more sources, are all transmitted to the destination in real time to allow real time display of all of the plurality of image data feeds.


One example of flow management during a performance of a surgical procedure including managing a plurality of image data feeds may involve an advanced visualization system that has both fluorescing and infrared (IR) multi-spectral imaging. An overlay needed by a surgeon in dissection may be to fluoresce a patient's ureter and denote the patient's iliac arteries to prevent inadvertent damage to either critical structure while skeletonizing and mobilizing the structures. A visual feed without both IR and fluorescent feeds inhibits the identification (ID) of critical structures. Therefore, three data feeds of standard visualization, fluorescent visualization (e.g., fluorescing the ureter), and IR visualization (e.g., IR imaging detecting arteries) need to be managed in the data pipeline by being coupled and handled as a single set of streams relative to any other streams through the system. In this way, a destination, such as a HID, receiving a visual feed from the advanced visualization system as a source can receive, via a data pipeline, all three data feeds of standard visualization, fluorescent visualization, and IR visualization to provide the surgeon and/or other HCP, such as by displaying the information on the HID, with the needed visual information to improve patient safety. In this example, a trigger for managing the data flow may include at least one of the three visual data feeds being transmitted to the destination, whether or not in reply to a request from the destination.


Another example of flow management during a performance of a surgical procedure including managing a plurality of image data feeds may involve a robotic surgical system's (such as the Monarch® platform or other robotic surgical system) visual light visualization and electromagnetic (EM) navigation plus computed tomography (CT) imaging for re-association of scope control to the actual physical bronchus architecture, which needs to be handled as a single stream from three sources (two internal and one external to the robotic surgical system). As the robotic scope is introduced deeper into a patient's bronchial structure under control of the robotic surgical system, errors accumulate in the EM navigation system due to external metal systems. At some point in-situ, the EM system using an initial full body CT map instructs a control direction that the EM system thinks is in an opening but is in reality straight into a bronchial wall. At this point, the EM system needs re-calibration. The cone beam CT (CBCT) data feed plus the visual light visualization data feed need to confirm the mis-alignment of the EM navigation from the real-life architecture of the bronchus. At that point the EM navigation, the cone beam CT, and the visual light imaging need to be managed in the data pipeline as a single set of data relative to other streams, as none of the three can be allowed to be out of sync or absent without rendering the others meaningless. In this way, a destination, such as a HID, receiving data from the robotic surgical system as a source can receive, via a data pipeline, all three data feeds of visual light visualization, EM navigation, and CBCT to provide the surgeon and/or other HCP with the needed information, such as by displaying the information on the HID, to improve user input to the robotic surgical system for scope navigation and/or to improve patient safety. In this example, a trigger for managing the data flow may include at least one of the three data feeds being transmitted to the destination, whether or not in reply to a request from the destination.
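

A minimal sketch of coupling feeds into a single set of streams, releasing a frame set to the destination only when every member feed is present for a given tick, might look like the following (the feed names and tick-based synchronization are illustrative assumptions):

```python
# Sketch of handling coupled feeds as a single set of streams: a frame
# set is released to the destination only when every member feed has a
# sample for that tick, since any one feed alone is meaningless here.
# Feed names and tick-based synchronization are illustrative.

from collections import defaultdict

class CoupledStreams:
    def __init__(self, members: set[str]):
        self.members = members
        self.pending = defaultdict(dict)  # tick -> {feed: frame}

    def push(self, tick: int, feed: str, frame):
        """Buffer a frame; return the complete set once all feeds arrive."""
        self.pending[tick][feed] = frame
        if set(self.pending[tick]) == self.members:
            return self.pending.pop(tick)  # forward as one unit
        return None                        # hold: set not yet complete

streams = CoupledStreams({"visible", "fluorescence", "ir"})
streams.push(0, "visible", "frame-v0")
streams.push(0, "ir", "frame-i0")
print(streams.push(0, "fluorescence", "frame-f0"))  # full set released
```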


In an exemplary embodiment, flow management during a performance of a surgical procedure may include managing a plurality of data feeds to prevent bandwidth overflow. In general, bandwidth overflow occurs when an amount of bandwidth needed to transmit data exceeds an available amount of bandwidth. If bandwidth overflow occurs, at least some data will not reach its destination at all (e.g., data is dropped) or will not reach its destination in a timely manner during performance of the surgical procedure (e.g., data is queued for later transmission), which may cause one or more problems that can be very serious, such as a surgical instrument not functioning properly, visualization of a surgical site being at least temporarily unavailable, etc.
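

A hedged sketch of such bandwidth management, in which lower-priority dataflows are scaled down first until the summed bandwidth fits within the available bandwidth, might look like the following; the flow names, bitrates, and priorities are hypothetical:

```python
# Sketch of preventing bandwidth overflow: if the summed bandwidth of
# the active dataflows would exceed what is available, lower-priority
# flows are scaled down first (e.g., reduced resolution or frame rate)
# until the total fits. Names, bitrates, and priorities are hypothetical.

def fit_flows(flows: list[dict], available_mbps: float) -> list[dict]:
    """flows: [{"name", "mbps", "priority", "min_mbps"}, ...];
    a lower priority value means more important. Mutates and returns."""
    total = sum(f["mbps"] for f in flows)
    for f in sorted(flows, key=lambda f: f["priority"], reverse=True):
        if total <= available_mbps:
            break
        reducible = f["mbps"] - f["min_mbps"]
        cut = min(reducible, total - available_mbps)
        f["mbps"] -= cut
        total -= cut
    return flows

flows = [
    {"name": "endoscope video", "mbps": 60, "priority": 1, "min_mbps": 25},
    {"name": "instrument telemetry", "mbps": 5, "priority": 2, "min_mbps": 1},
    {"name": "archival upload", "mbps": 40, "priority": 3, "min_mbps": 0},
]
print(fit_flows(flows, available_mbps=70))  # archival upload cut to 5 Mbps
```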


Flow management may include establishment of a flow management control scheme. The establishment of the flow management control scheme may include contributions to more flow management oversight, reactions to flow management issues, document and content management, and learned or adaptive flow management methods based on monitoring of users and uses.


The contributions to more flow management oversight may include bottleneck control and minimization of data corruption. The bottleneck control may include latency control and congestion control.


One example of latency control may include determining ways for minimizing latency in all or portions of an image, e.g., an image of a surgical site gathered by an imaging device during performance of a surgical procedure, and the image's one or more overlays. The latency in all or portions of the image and the image's one or more overlays can be minimized by sending a portion of the image which is more important than the rest of the image from a source, e.g., the source 402 of FIGS. 6A and 6B, to a destination, e.g., the destination 404 of FIGS. 6A and 6B, at higher priority. For example, a High Definition (HD) portion or a multi-spectral portion of an image may be more important than the rest of the image and thus be transmitted as high priority along a data pipeline or through a more dedicated data pipeline in order to minimize a portion of the time lag and thus allow for the higher priority portion of the image to be displayed faster, e.g., provided on a display screen of a HID to a surgeon and/or other HCP. The remaining portion of the image can update less often, e.g., be updated less often on the display screen, with the more important portion (higher priority portion or key portion) of the image being updated faster to minimize loss of detail or movement. In one embodiment, the latency in all or portions of the image and the image's one or more overlays can be minimized by allowing the system providing flow management, e.g., as executed by a controller of a surgical hub or other controller, to flag data packets of the key portion of the image, or all of the image, as higher priority over other data feeds. Similarly, the system may provide more dedicated data pipelines for the higher priority data feeds to reduce latency.
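

A minimal sketch of such prioritized partial-image updating, assuming a hypothetical tile layout in which a single key region is sent every frame and background tiles refresh every Nth frame, might look like the following:

```python
# Sketch of the latency-control idea above: the key portion of the image
# (e.g., an HD or multi-spectral region of interest) is flagged high
# priority and sent every frame, while remaining tiles refresh less
# often. The tile layout and refresh rates are illustrative.

def schedule_tiles(frame_no: int, tiles: list[dict],
                   background_interval: int = 4) -> list[dict]:
    """Return the tiles to transmit this frame, key tiles first.

    Key tiles go every frame; background tiles only every Nth frame.
    """
    outgoing = [t for t in tiles if t["key"]]
    if frame_no % background_interval == 0:
        outgoing += [t for t in tiles if not t["key"]]
    for t in outgoing:
        t["priority_flag"] = "high" if t["key"] else "normal"
    return outgoing

tiles = [{"id": (r, c), "key": (r, c) == (1, 1)} for r in range(3)
         for c in range(3)]
print([t["id"] for t in schedule_tiles(frame_no=1, tiles=tiles)])  # key only
print(len(schedule_tiles(frame_no=4, tiles=tiles)))                # all 9
```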


One example of congestion control may be to use conversion to a different level of data compression as soon as congestion is experienced or predicted. For example, the data flow congestion control may adapt levels of transformation pre-transmission. The conversion to a differing level of data compression may be based on variation in signal reduction or compression dependent upon the data size to maintain flow.


Data flow congestion control can adapt levels of transformation pre-transmission using a variation in signal reduction or compression dependent upon data size to maintain flow. Different compression modalities offer different benefits and drawbacks. Some compression methods are inherently more lossy in the data they transform, but the same lossiness in data provides the benefit of greater compression and reduction of data to be transferred. In this scenario, different compression or signal reduction modalities may be employed to maintain a constant or acceptable data flow rate across the system. These different compression modalities may be employed depending on the type of source of the data, e.g., the source 402 of FIGS. 6A and 6B, or type of destination of the data, e.g., the destination 404 of FIGS. 6A and 6B, as well as the potential application of the data. One example is signal reduction or downconversion. A signal may be transformed directly, in a linear manner. An example is a 4K signal being downconverted to a 1080p HD signal. In this way, the signal is reduced in size in a linear fashion, but data may be permanently lost as part of the process. Another example is signal compression. A signal may be transformed in an algorithmic way, such as with traditional video compression methods. In this way, the bandwidth of the signal may be reduced. Portions of the signal may be recovered by decompressing the file. For example, a surgical video, e.g., video of a surgical site gathered by an imaging device, being transmitted over a wireless or internet protocol (IP) solution may eventually encounter bandwidth constraints, as video image data tends to consume more bandwidth than still image data and many other types of data. The surgical video can, however, be modified, e.g., along the data pipeline such as the data pipeline 400 of FIGS. 6A and 6B, to keep the bandwidth utilization constant even in different scenarios. In a video recording scenario, in which 4K video is being used, a high degree of video compression may be utilized. While this may be more lossy, it would be acceptable in recording and greatly reduce the utilized bandwidth across the IP or wireless modality. In the same scenario but with a 1080p solution, lossless or no compression may be acceptable as well due to the lower inherent bandwidth utilized by the video stream.
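

One possible sketch of selecting a transformation level pre-transmission, with hypothetical bitrates standing in for whatever a given link and stream actually present, is below:

```python
# Sketch of adapting the compression modality pre-transmission: pick a
# transform based on the stream's native bandwidth and what the link
# currently offers. The bitrates and ratios are hypothetical placeholders.

def choose_transform(native_mbps: float, link_mbps: float) -> str:
    """Return an illustrative modality keeping the stream within the link."""
    if native_mbps <= link_mbps:
        return "lossless"        # e.g., 1080p on a link with headroom
    if native_mbps <= 2 * link_mbps:
        return "downconvert"     # e.g., linear 4K -> 1080p reduction
    return "lossy_compress"      # e.g., heavy codec for 4K recording

# Recording 4K over a constrained wireless/IP link vs. a 1080p live view:
print(choose_transform(native_mbps=120, link_mbps=40))  # lossy_compress
print(choose_transform(native_mbps=30, link_mbps=40))   # lossless
```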


As mentioned above, the establishment of the flow management control scheme may include reactions to flow management issues. The reactions to flow management issues may be performed by limiting data sharing. Selective portions of data from a source, e.g., the source 402 of FIGS. 6A and 6B, are sent to some systems, in real time or from archives. This selectivity can be different for different systems, and some or none of the systems could receive the full dataset. For example, this selectivity may include comprehensive sharing only to some systems, or an edge network may act as a clearing house for all data while still remaining within United States Health Insurance Portability and Accountability Act (HIPAA) protected network controls. Further, cloud access may have redacted or partial data with a different filter than the selective real-time sharing of systems within the network. Important aspects to target may include directionality of flow and redundant information.
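
The per-destination selectivity described above may be sketched as a simple field-level sharing policy; the field names, destination classes, and policy contents below are hypothetical.

```python
# Minimal sketch: each destination gets a filtered view of the full dataset,
# with cloud access receiving a more heavily redacted subset than systems
# inside the HIPAA-protected network.
FULL_RECORD = {
    "patient_id": "12345",          # protected identifier
    "heart_rate": 72,
    "blood_pressure": "120/80",
    "surgical_video_ref": "vid-007",
}

# Hypothetical policy: which fields each class of destination may receive.
SHARING_POLICY = {
    "edge_network": {"patient_id", "heart_rate", "blood_pressure", "surgical_video_ref"},
    "or_display":   {"heart_rate", "blood_pressure"},
    "cloud":        {"heart_rate"},  # redacted/partial data, stricter filter
}

def filtered_view(record: dict, destination: str) -> dict:
    allowed = SHARING_POLICY.get(destination, set())
    return {k: v for k, v in record.items() if k in allowed}

print(filtered_view(FULL_RECORD, "cloud"))  # -> {'heart_rate': 72}
```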


As mentioned above, the establishment of the flow management control scheme may include document and content management. The document and content management may generally involve a system or process being used to capture, track, and store electronic documents, such as PDFs, word processing files, digital images, patient imaging, etc., or discrete or compartmentalized data containers and documentation locations.


The document and content management may provide one or more benefits. The document and content management may significantly improve all communications with customers and third parties by generating and distributing documents online and through a large variety of distribution channels. Another benefit is that the document and content management unifies documents (contracts, delivery notes, etc.), which may significantly reduce the total number of documents used, since each designed template allows different documents to be automatically generated for each department/user/surgeon/patient. Another benefit is that the document and content management may offer multi-platform solutions, such as Windows, Unix, Linux, System i, zSeries, etc. Another benefit of the document and content management may be that it involves distribution control, limiting the consumption of system resources based on the profiles of the users. Another benefit of the document and content management may be that it offers simple and intuitive document design and a user-friendly environment. Another benefit of the document and content management may be that it allows one to easily design dynamic and static forms and create one-dimensional (1D) and two-dimensional (2D) bar codes and/or 2D and three-dimensional (3D) graphics, which may significantly reduce costs for a company, as employees save time since processes are automated, tasks related to document management are processed easily, and documents are located much faster than ever before. The use of physical paper almost disappears, which may also be a large cost saving allowed for by the document and content management, and the status of bank accounts can be consulted rapidly since invoices can be stored in the system immediately, etc.


As mentioned above, the establishment of the flow management control scheme may include learned or adaptive flow management methods based on monitoring of users and uses. The learned or adaptive flow management methods based on monitoring of users and uses may be performed based on a better understanding of customer data through a single view of the customer.


In addition to or in alternative to establishing a flow management control scheme, flow management may include adaptation (adjustment) of predefined flow management elements due to a changing operational environment. The adaptations can have limits or ordering integrated into them to maintain importance while enabling adjustment to changing conditions. The adaptation of predefined flow management may be based on changing flow management due to balancing capacity with data compaction.


One example of a trigger for adjusting data flow management includes prioritization. Prioritization can include procedural or task timing being used to determine a type, configuration (e.g., raw data versus transformed data), frequency, or precision of the streamed datasets.
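
A minimal sketch of such timing-driven prioritization follows, in which the current procedural step prescribes the type, configuration, frequency, and precision of a streamed dataset; the step names and settings are illustrative assumptions.

```python
# Minimal sketch: procedural/task timing determines the type, configuration
# (raw vs. transformed), frequency, and precision of streamed datasets.
STREAM_CONFIG_BY_STEP = {
    # step:            (data type,      form,          rate Hz, precision bits)
    "dissection":      ("force_sensor", "transformed", 50,      12),
    "vessel_sealing":  ("generator",    "raw",         1000,    16),
    "stapler_firing":  ("stapler",      "raw",         500,     16),
    "closing":         ("vitals",       "transformed", 1,       8),
}

def configure_stream(current_step: str):
    """Return the streaming configuration prescribed for the current step."""
    return STREAM_CONFIG_BY_STEP.get(current_step, ("vitals", "transformed", 1, 8))

print(configure_stream("stapler_firing"))  # -> ('stapler', 'raw', 500, 16)
```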


In addition to or instead of procedural or task timing, prioritization can include identified incidents being used to change the priority of a data stream, which can be device-specific or device-internal incidences or can be global patient incidences. In one example of a device-specific or device-internal incidence, a surgical stapler, such as the ECHELON™ 3000 Stapler (Johnson & Johnson of New Brunswick, NJ) or other surgical stapler, may have a stall incident due to encountering tissue that is too thick, triggering a smart system pause or a change in the rate of speed of the stapler's firing member. This in turn may trigger a change in a collection rate and therefore in the data flow prioritization and volume. For another example of a device-specific or device-internal incidence, ultrasonic blade immersion in blood or another fluid may cause what appears to be a transducer impedance termination event but is in fact an increased power event, and the data rate should be increased while the system is attempting to determine which condition it is in (transducer impedance termination event or increased power event).


With respect to device-specific or device-internal incidences, unanticipated interference or loss of data (a single datum or a packet) may cause a short term rebalancing of flow rate management to make up some, but not necessarily all, of the lost data. For example, activation of an unrelated and/or unmonitored monopolar device during performance of a surgical procedure may induce a local interference and wireless packet loss incident which has to be made up for with redundant re-transmission of data while also transmitting real-time data. This could change the flow rate needed to be dedicated to this data stream and may require an adverse reduction of other data feeds to allow for the make-up. This is made more relevant when multiple data feeds have been interfered with and the system does not have the capacity for all the data feeds to be made up, and therefore drops one or more data feeds to keep one or more other data feeds and make them up as well, for a limited time period.


In one example of global patient incidences, an uncontrolled bleeding event may be used to trigger data flow priorities to visualization or other smart device streams that can help identify and mitigate the incident (uncontrolled bleeding event). In another example of global patient incidences, an uncontrolled bleeding event may, instead of or in addition to changing prioritization as discussed above, broaden the data flows and request more raw data to record a better picture of the entire field of view for archiving or later review. The destination, e.g., the destination 404 of FIGS. 6A and 6B, of the stream could also be affected.


Global patient incidences can include patient biomarker variations or issues, e.g., rapid core temperature changes, an EKG flat line, multiple related biomarkers becoming unsynchronized, etc. For example, a patient's EKG pulse rate and blood pressure may appear during performance of a surgical procedure to be moving in opposite directions due to vasoconstriction occurring (due to core temperature loss) while an elevated heart rate event also occurs.


In addition to or instead of identified incidents and/or procedural or task timing, prioritization can include system capacities relative to full capacity being used to affect the priorities of which systems have which portions of the flow rate.


An adaptation of flow management can include one or more responses. For example, the data exchange adaptation response can include one or more of frequency, requested versus prescribed, real time versus delayed, recorded versus time sensitive, compensation for limitations of data handling, and use of an aspect of the data as a control for movement of the data. Regulatory compliance typically requires fast access to trusted data. The compensation for limitations of data handling can include transferring transformed versus untransformed data depending on the need to access aspects of the underlying data, or on only having an interest in small, highly mobile transformed data. The use of an aspect of the data as a control for movement of the data can include prioritization and/or aspects to control data flow such as one or more of purpose, function, usefulness, integrity, and data directionality.


A forced ranking of importance levels of data streams may be provided in determining the adaptations. For example, in a surgical theater, local imaging of a surgical site must be maintained for the surgeon and/or other HCP to maintain visualization of the tools and tissue in play, especially in VATS and laparoscopic surgery. A primary feed of the local imaging needs to be prioritized over all other feeds to maintain control and visualization for safety and patient well-being. Due to this, other secondary feeds may have to be managed in order to maintain the primary feed display to a surgeon and/or other HCPs. The primary feed can be merely a portion of a primary scope data stream. For instance, a visual light portion allowing the surgical site to be seen can be the primary portion, and multi-spectral aspects can be classified as secondary and therefore do not have to be prioritized over other feeds. Examples of primary data feeds include visual light scope imaging, navigation sensing (e.g., EM on the Monarch® platform) on a smart flexible endoscopy scope, patient vital sign monitoring, and anesthesia/ventilator parameters. All data streams may at some level be in an established hierarchy based on their prioritization, risk level, or other importance characterization to the maintenance of the patient or the safety of the surgery. Some parameters can be escalated or returned to their previous importance levels based on the surgical step being performed or the surgical situation. The primary feeds for visualization in medical applications have to have a parallel data pipeline for any of the transformed feeds to maintain a risk mitigation aspect of redundancy in case portions of the pipeline network drop out or are impacted, either due to resetting or interference.
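
One way such a forced ranking could be represented is sketched below, with a primary visual light feed that is never displaced and secondary feeds that can be escalated or returned to their previous levels; the feed names and rank values are assumptions for illustration.

```python
# Minimal sketch: forced ranking of data streams in which the primary
# visual-light feed always outranks secondary feeds, and feeds can be
# escalated based on the surgical step being performed.
FEED_RANK = {
    "visual_light_scope": 0,   # primary: must always be maintained
    "navigation_sensing": 1,
    "patient_vitals":     1,
    "anesthesia_params":  1,
    "multi_spectral":     2,   # secondary portion of the scope stream
    "telemetry_logs":     3,
}

def escalate(feed: str, levels: int = 1) -> None:
    """Temporarily raise a feed's importance for the current surgical step."""
    FEED_RANK[feed] = max(0, FEED_RANK[feed] - levels)

def ordered_feeds():
    """Feeds in transmission order, highest importance first."""
    return sorted(FEED_RANK, key=FEED_RANK.get)

escalate("multi_spectral")  # e.g., during critical structure identification
print(ordered_feeds()[0])   # -> "visual_light_scope" (primary is never displaced)
```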


Various embodiments of surgical hub communicating, processing, storing, and displaying data that may be used in adapting flow management elements are described further in, for example, U.S. Pat. App. Pub. No. 2019/0200844 entitled “Method Of Hub Communication, Processing, Storage And Display” published Jul. 4, 2019, which is hereby incorporated by reference in its entirety.


In addition to or in alternative to establishing a flow management control scheme and/or adaptation (adjustment) of predefined flow management elements due to a changing operational environment, flow management may include managing data destination or intermediate temporary storage. The management of data destination or intermediate temporary storage may include managing one or more of local storage of data, buffering of data, migration of data, destinations for data, cataloging of data, and extended reality rendering and streaming from remote servers.


The local storage of data for the management of data destination or intermediate temporary storage may be used to provide access to a surgical procedure plan or a surgical procedure simulation, or to serve as a source of previous video streams.


The buffering of data for the management of data destination or intermediate temporary storage can include establishment of a local, parallel, non-contiguous buffer for buffering non-real-time use of untransformed data from one system, e.g., the source 402 of FIGS. 6A and 6B, to another system, e.g., the destination 404 of FIGS. 6A and 6B, while still providing the transformed data from the one system to the other system.


The buffering of data for the management of data destination or intermediate temporary storage may include determining how to place a mid-process or pipe buffer that would allow some data streams to flow through the data pipeline, e.g., the data pipeline 400 of FIGS. 6A and 6B, unobstructed, but if the data becomes too big to process in real time, e.g., by a controller of a surgical hub or other controller, the system can segregate the data that is not needed immediately and store it temporarily in storage, from which it may be transmitted later when the system has more throughput available. Buffering methodologies may be used to adapt system resources or to handle data out of sequence of real-time to free up moment-in-time resources. The mid-communication buffering may thus allow for handling less-real-time-necessary data out of real time.


Buffering to handle data out of sequence of real-time may include dynamically allocating parallel buffers for prioritizing data within a data pipeline, e.g., the data pipeline 400 of FIGS. 6A and 6B. Data that is being transmitted along a data pipeline, e.g., the data pipeline 400 of FIGS. 6A and 6B, may have numerous different priorities. That data may be buffered from its appropriate data pipelines to a set of buffers which may be dependent upon the total number of input data pipelines, categories of priorities of data, potential output data pipelines, and processing capabilities. This allows data of separate priorities to be independently buffered from one another based on the available processing of the overall system.



FIG. 7A illustrates one embodiment of a single incoming data pipeline 500 containing data, which is all data related to performance of a surgical procedure (e.g., a surgical procedure currently being performed or a surgical procedure that was previously performed), from at least one source and having four levels of different data priority: highest priority data 502, second highest priority data 504, third highest priority data 506, and fourth highest priority data 508. The data in the incoming data pipeline 500 is unsorted, e.g., is not sorted by priority.


The system performing flow management, e.g., a surgical hub, a cloud computing system, etc., sorts 510 the incoming data 502, 504, 506, 508 by priority and buffers the incoming data 502, 504, 506, 508 in a buffer 512. FIG. 7A shows the data 502, 504, 506, 508 in the buffer sorted 510 by priority, e.g., with the highest priority data 502 at a top of the buffer 512, the second highest priority data 504 below the highest priority data 502 in the buffer 512, the third highest priority data 506 below the second highest priority data 504 in the buffer 512, and the fourth highest priority data 508 below the third highest priority data 506 in the buffer 512.


The system serializes and transmits 514 the buffered data 502, 504, 506, 508 in accordance with the priority ranking of the data, with the highest priority data 502 being transmitted first in an outgoing data pipeline 516 leading to a destination of the data. For example, all of the buffered highest priority data 502 is transmitted 514, followed by all of the second highest priority data 504 being transmitted 514, followed by all of the third highest priority data 506 being transmitted 514, followed by all of the fourth highest priority data 508 being transmitted 514. If, during the transmission 514 of a certain priority of data, data of a higher priority is received, the system may transmit the higher priority, later-received data before resuming with transmission 514 of buffered data. In some embodiments, a quasi buffered methodology (discussed further below) is used and the highest priority data 502 is not stored in the buffer 512 before being transmitted 514, which may allow the highest priority data 502 to be transmitted 514 faster and consequently received sooner by the destination.
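
The sort 510, buffer 512, and serialize-and-transmit 514 steps may be sketched as follows; the priority values and sample data are illustrative assumptions.

```python
# Minimal sketch of the FIG. 7A flow: unsorted incoming data from the
# pipeline 500 is sorted (510) by priority into per-priority buffers
# (512a-512d of the buffer 512) and then serialized and transmitted (514)
# highest priority first onto the outgoing pipeline 516.
from collections import deque

NUM_PRIORITIES = 4  # four levels: 502 (highest) through 508 (lowest)
buffers = [deque() for _ in range(NUM_PRIORITIES)]

def sort_into_buffers(incoming):
    """Sort 510: place each (priority, item) pair into its buffer."""
    for priority, item in incoming:
        buffers[priority].append(item)

def serialize_and_transmit():
    """Transmit 514: drain buffers in priority order."""
    outgoing = []
    for level in range(NUM_PRIORITIES):
        while buffers[level]:
            outgoing.append(buffers[level].popleft())
    return outgoing

sort_into_buffers([(2, "c1"), (0, "a1"), (3, "d1"), (1, "b1"), (0, "a2")])
print(serialize_and_transmit())  # -> ['a1', 'a2', 'b1', 'c1', 'd1']
```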


The data stream in the illustrated embodiment of FIG. 7A is buffered in multiple buffers 512a, 512b, 512c, 512d of the buffer 512 for the single data stream from the incoming data pipeline 500. A single data stream is thus allocated and cached within multiple different buffers 512a, 512b, 512c, 512d. As in this illustrated embodiment, a number of the buffers (four in the FIG. 7A embodiment) may be allocated dependent upon a number of distinct data priorities (four in the FIG. 7A embodiment) that exist for the data being transmitted along the data stream. Instead of using multiple buffers 512a, 512b, 512c, 512d for the incoming data 502, 504, 506, 508, a singular buffer can be used.


The system in the embodiment of FIG. 7A is shown buffering a single data pipeline 500. However, the system may buffer multiple data pipelines within a shared buffering control. The shared buffering control may include multiple buffers for the multiple data streams, e.g., one buffer per data stream (which may include a single buffer or multiple buffers, as discussed above), or may include a single buffer for all of the data (which may include a single buffer or multiple buffers, as discussed above).


Priority of data may be established in the system based on what is deemed most surgically relevant and time sensitive. Priority may also be determined contextually such as by what a user or operator is doing at that time, or in some cases, not doing. For example, after a case (e.g., after performance of a surgical procedure), a surgeon (e.g., a surgeon who performed the surgical procedure) may be reviewing a 4K video segment from a portion of the case to better understand the surgeon's notes taken during or shortly after the performance of the surgical procedure. As the case was recorded in 4K video, it has a relatively large size and will take the system some time to continue uploading the data. However, since the surgeon is actively trying to view the video now, the system may further prioritize that data since a surgery is not in progress, and the surgeon is actively requesting it.


Priority of data may be flagged pre-transmission or post-transmission. A source or transmitter of data may flag a priority of the data prior to transmission of that data, such as over a controller area network (CAN) bus. Alternatively, data may be received and analyzed by a receiver or recipient of the data to establish priority of the data, such as based on context, type, or other factors.


Buffering to handle data out of sequence of real-time may include quasi buffered systems to optimize system latency or responsiveness. In this methodology, highest priority data is never buffered and all lower priority data is always buffered as part of the system handling for information. Quasi buffered methodology may be most effective in distributed systems in which high criticality tasks are handled directly in hardware or a field-programmable gate array (FPGA) equivalent and in which lower criticality tasks are handled with a software processor.
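
A minimal sketch of this quasi buffered handling, under an assumed transmit() stand-in, follows: the highest priority data bypasses the buffer entirely, while all lower priority data is always buffered.

```python
# Minimal sketch: quasi buffered methodology in which the highest priority
# data is never buffered and all lower priority data is always buffered.
from collections import deque

low_priority_buffer: deque = deque()

def transmit(data: bytes) -> None:
    print("sent:", data)  # stand-in for the real link-layer send

def handle(data: bytes, highest_priority: bool) -> None:
    if highest_priority:
        transmit(data)                    # e.g., current robot positioning: send now
    else:
        low_priority_buffer.append(data)  # e.g., system health status: defer

def drain_when_idle() -> None:
    while low_priority_buffer:
        transmit(low_priority_buffer.popleft())

handle(b"current-positioning", highest_priority=True)
handle(b"system-health-status", highest_priority=False)
drain_when_idle()
```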


In one example of quasi buffering, a pair of robots being used in a surgical procedure, operating together with no-fly zones and in active communication with each other, may frequently communicate on a wide variety of informational topics, e.g., overall robotic system health status, selected tools and capabilities, current positioning, etc. A large degree of the information being communicated between the two robots is not time critical and has no consequences if it is slightly delayed in its transmission and/or receipt. The lowest priority data as a result may always be buffered prior to transmission. However, some of the information, such as current positioning, can become highly time critical and will be immediately transmitted without going through the buffering process.


In another example of quasi buffering, a robotic system may provide its surgical user a video stream of a current laparoscopic video feed as part of its surgical console. The laparoscopic video feed has information from the rest of the system, related to the status of the robotic system, overlaid onto it. In this example, the laparoscopic feed is never buffered, as the video data and its associated latency are seen as having a critical level of importance. However, the overlaid information is of a lower level of importance. As a result, the system may handle other software computing processes before finalizing and returning to the display of that information.


Buffering to handle data out of sequence of real-time may include intentionally lossy buffering of data. In intentionally lossy buffering of data, only data of sufficient prioritization may be buffered. Data that does not meet a sufficient priority, based on the congestion of the system, may be immediately deprecated or discarded. The buffer may be used as part of the discarding process to maintain a small amount of the previous data that could be recovered in the event of critical need. In this way, the data can be discarded from the primary data stream, but a small short-term snapshot prevents complete loss.
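
The discard-with-snapshot behavior may be sketched as below; the priority threshold and snapshot size are illustrative assumptions.

```python
# Minimal sketch: intentionally lossy buffering in which data below the
# congestion-dependent priority threshold is discarded from the primary
# stream, while a small ring buffer retains a short recoverable snapshot.
from collections import deque

SNAPSHOT_SIZE = 16                      # short-term snapshot of discarded data
snapshot = deque(maxlen=SNAPSHOT_SIZE)  # oldest entries fall out automatically

def admit(item, priority: int, congestion_threshold: int):
    """Return the item for buffering if it meets the threshold, else discard."""
    if priority <= congestion_threshold:   # lower number = higher priority
        return item
    snapshot.append(item)                  # discarded, but briefly recoverable
    return None

kept = admit("frame-001", priority=0, congestion_threshold=1)      # buffered
lost = admit("telemetry-17", priority=3, congestion_threshold=1)   # snapshotted
```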


The migration of data for the management of data destination or intermediate temporary storage can include long-term storage migration, migration for machine learning (ML) set databases, and migration from remote storage for real-time utilization.


The destinations for data for the management of data destination or intermediate temporary storage can include central data repositories and/or multiple locations. The multiple locations may be dispersed, redundant, an active use vault, and/or a secure vault. The central data repositories may be either data lakes or data warehouses that permanently store large amounts of data. Data warehouses follow a relational structure, while data lakes can accommodate mass storage of files. The central data repositories may allow for backup, warehousing, and perpetual maintenance and revision. With respect to perpetual maintenance and revision, since the data pipeline both extracts and transforms data, the moment upstream schemas or downstream data models change, the data pipeline breaks and the code base must be revised. The central data repositories thus provide both backup and warehousing.


The cataloging of data for the management of data destination or intermediate temporary storage can include organization of the data by its usefulness to the user, such as recording and documentation of a combined data aggregation of the surgical procedure and/or cross-annotation of event logs.


The extended reality rendering and streaming from remote servers for the management of data destination or intermediate temporary storage can include cloud based access. This can apply to body mass index (BMI).


The cloud based access may access a cloud database of relevant information coupled with additional information for identified augmented reality (AR) features. The additional information for identified AR features may be coupled with GPS/location information to allow content specific to a region to be delivered, with GPS/location information to allow additional information on a specific hospital (or other medical care facility) product to be delivered (e.g., information on what is available under contract in the hospital/facility, e.g., powered circular vs. manual circular, particular endocutter shaft lengths, etc.), and/or with GPS/location information to allow content with a specific language to be delivered.


The cloud based access may access a cloud database of local information for identified AR features. Targeting an AR feature on a device may provide overlay of real time inventory, quantity, and expiration of related product for central supply.


The cloud based access may access a cloud database of relevant information for identified AR features. Instead of utilizing stored data, a secure cloud database may be accessed to provide up-to-date information on a product delivered by interacting with the feature. Content can include instructions for use (IFU), videos, relevant publications, white papers, etc.


Cloud-based AR/VR (augmented reality/virtual reality) may provide the ability to process data and render videos faster, e.g., 100 times faster. Faster compute means faster iteration, which means more efficient engineering, quicker testing of solutions, faster troubleshooting and resolution of issues, and accelerated delivery of more high-performing services.


Cloud-based access may allow for guidance based on matching to the most similar anatomy, case, and/or surgical procedure step, which may allow for identifying what, at a given point in the procedure, could lead to a good result or a bad result. For example, a processed image can be utilized for determination of a preferred device for use. The image may be processed to identify a surgical instrument in the image and, based on the processed image (e.g., based on the identification), a hint may be provided to a surgeon and/or other HCP, e.g., via a display and/or other device, identifying another surgical instrument that may be better to use.


In one embodiment, a method of using a processed image for determining a preferred device can include, in the regular surgical procedure and when the surgeon is ready to perform the next step, introducing a special device to the surgeon and/or other HCP based on the image processing and the best knowledge from the database, e.g., based on other surgeons' experience and outcomes, and/or presenting a mechanism/theory to the surgeon and/or other HCP as to why the additional device can be helpful to the surgical procedure. One example is a tissue clamp being used in a sleeve procedure being performed on a patient, or a target tube size and/or shape for the patient in the sleeve procedure. Also in the method, a simulated video can be shown as an option on how the special device would be used. The simulated video may show the benefit of the special device.


Cloud based access may allow for detection of adverse events. The detection of an adverse event may trigger an alert to be provided, e.g., via a display, a speaker, etc. One example of the alert is a message such as "Never seen this situation before in cloud sources (disease or organ structure, organ position), proceed with caution/consult other sources."


Cloud based access may allow for a surgeon and/or other HCP who has never seen the current situation before to request active guidance and for the most similar cases and anatomies to be found to generate video guidance and/or other guidance.


In addition to or in alternative to establishing a flow management control scheme, adaptation (adjustment) of predefined flow management elements due to a changing operational environment, and/or managing data destination or intermediate temporary storage, flow management may include data flow monitored triggers for adaptations. The data flow monitoring may operate as a safety confirmation system.


The data flow monitoring may be done with analytics tools. These include off-the-shelf business intelligence platforms for reporting and dashboards, as well as analytics and data science packages for common programming languages. The analytics tools may be used to produce visualizations, summaries, reports, and dashboards.


The data flow monitoring may determine a flow of data and its relation to a capacity of a resource network that the data is flowing through. The resultant monitor analysis can be used to affect and tune the flow through the data pipelines and/or a display of the flow to outside users to communicate the limits of the system or the active throttling/flow restrictions in place. These restrictions may be re-balanced by a user if they are determined to be inappropriate for the needs of the user in the current situation. For example, a flow on a multi-spectral imaging data stream through available data pipelines may be restricting resolution or frame rate to a point where the loss of detail or the time lag is inhibiting the surgeon's (and/or other HCP's) ability to see critical structures of a patient they are working around in performing a surgical procedure on the patient. A user may input the need for improved resolution to the system, and the system may respond with alternative balanced flow management options given the resource limits of the system and the data pipelines. This may include slowing or buffering other non-primary feeds or may result in only the portion of the image focused on the surgical site having increased resolution.
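
A minimal sketch of such monitoring and re-balancing follows, in which secondary feeds are throttled when the total flow exceeds network capacity and the applied restrictions are reported for display to users; the feed names, rates, and throttling rule are assumptions for illustration.

```python
# Minimal sketch: monitor flow relative to network capacity, throttle only
# non-primary feeds under overflow, and report active restrictions.
CAPACITY_MBPS = 100.0

feeds = {  # feed: (current rate in Mbps, is_primary)
    "visual_light_scope": (60.0, True),
    "multi_spectral":     (30.0, False),
    "telemetry":          (20.0, False),
}

def rebalance():
    total = sum(rate for rate, _ in feeds.values())
    restrictions = {}
    if total <= CAPACITY_MBPS:
        return restrictions
    overflow = total - CAPACITY_MBPS
    for name, (rate, primary) in feeds.items():
        if primary:
            continue  # never throttle the primary feed
        cut = min(rate * 0.5, overflow)  # throttle up to half of a secondary feed
        feeds[name] = (rate - cut, primary)
        restrictions[name] = cut         # reported to users as active throttling
        overflow -= cut
        if overflow <= 0:
            break
    return restrictions

print(rebalance())  # -> {'multi_spectral': 10.0} given the rates above
```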


As mentioned above, flow management during a performance of a surgical procedure may include managing a plurality of data feeds to prevent bandwidth overflow. FIG. 7B illustrates one embodiment of a method 600 of managing a plurality of data feeds to prevent bandwidth overflow during performance of a surgical procedure on a patient. The method 600 may include receiving 602, at a first surgical system, a first dataflow from a second surgical system. The first data flow may be transmitted along a data pipeline. The first dataflow may include first data regarding a measured patient parameter that the first surgical system is configured to use in performing a function during the performance of the surgical procedure; and/or the first surgical system may be a first type of surgical system, and the second surgical system may be a second, different type of surgical system. The first type of surgical system and the second type of surgical system may each be one of a hospital network, a database, a surgical instrument, or a surgical cart.


The method 600 may also include determining 604 that a trigger event occurred during the performance of the surgical procedure such that a sum of a first bandwidth of the first dataflow and a second bandwidth of a second dataflow exceeds an available bandwidth. The second dataflow may be one of flow from the first surgical system to the second surgical system, and flow within the first surgical system; the first bandwidth may be one of a bandwidth between the first and second surgical systems, and a bandwidth of the first surgical system; the first surgical system may be configured to use data in the second data flow in performing a second function during the performance of the surgical procedure; and/or the trigger event may be one of congestion of traffic using the available bandwidth, performance of a predetermined step of the surgical procedure by a surgeon performing the surgical procedure, loss of data in the first dataflow, and a predetermined patient biomarker variation.


The method 600 may also include, in response to determining 604 that the trigger event occurred, and during the performance of the surgical procedure, adjusting 606 at least one of the first and second dataflows such that the sum of the first and second bandwidths does not exceed the available bandwidth. The adjusting may include changing a prioritization of the first and second dataflows, changing time-related access to the at least one of the first and second dataflows, and/or temporarily storing, e.g., in a buffer, one of the first data of the first dataflow or the second data of the second dataflow.


The method 600 may also include, after the adjusting 606, transmitting 608 data on the adjusted at least one of the first and second dataflows.


The determining 604 and the adjusting 606 may be performed using a processor of the first surgical system, the determining 604 and the adjusting 606 may be performed using a processor of a surgical hub communicatively coupled with the first and second surgical systems, or the determining 604 and the adjusting 606 may be performed using a processor of a cloud-based remote server communicatively coupled with the first and second surgical systems.
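
A minimal sketch of the determining 604 and adjusting 606 logic follows; the bandwidth figures and the particular adjustment rule (scaling the second dataflow) are illustrative assumptions, as the adjusting could equally change prioritization or temporarily buffer data.

```python
# Minimal sketch of method 600: determine (604) that a trigger event has made
# the sum of the first and second dataflow bandwidths exceed the available
# bandwidth, and adjust (606) at least one dataflow so the sum fits before
# transmitting (608).
def trigger_event_occurred(bw_first: float, bw_second: float, available: float) -> bool:
    """Determining 604: the combined demand exceeds the available bandwidth."""
    return bw_first + bw_second > available

def adjust_dataflows(bw_first: float, bw_second: float, available: float):
    """Adjusting 606: here, by scaling back the second dataflow."""
    if not trigger_event_occurred(bw_first, bw_second, available):
        return bw_first, bw_second
    bw_second = max(0.0, available - bw_first)  # e.g., buffer or compress the rest
    return bw_first, bw_second

first, second = adjust_dataflows(bw_first=8.0, bw_second=6.0, available=10.0)
assert first + second <= 10.0  # transmitting 608 proceeds within the limit
```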


Computer Systems

A computer system may be suitable for use in implementing the computerized components described herein. In broad overview of an exemplary embodiment, the computer system may include a processor configured to perform actions in accordance with instructions, and memory devices configured to store instructions and data. The processor may be in communication, via a bus, with the memory (and/or incorporate the memory) and with at least one network interface controller with a network interface for connecting to external devices, e.g., a computer system (such as a mobile phone, a tablet, a laptop, a server, etc.). The processor may also be configured to be in communication, via the bus, with any other processor(s) of the computer system and with any I/O devices at an I/O interface. Generally, a processor will execute instructions received from the memory. In some embodiments, the computer system can be configured within a cloud computing environment, a virtual or containerized computing environment, and/or a web-based microservices environment.


In more detail, the processor can be any logic circuitry that processes instructions, e.g., instructions fetched from the memory. In many embodiments, the processor may be an embedded processor, a microprocessor unit (MPU), a microcontroller unit (MCU), a field-programmable gate array (FPGA), or a special purpose processor. The computer system can be based on any processor, e.g., a suitable digital signal processor (DSP), or set of processors, capable of operating as described herein. In some embodiments, the processor can be a single core or multi-core processor. In some embodiments, the processor can be composed of multiple processors.


The memory can be any device suitable for storing computer readable data. The memory can be a device with fixed storage or a device for reading removable storage media. Examples include all forms of non-volatile memory, media and memory devices, semiconductor memory devices (e.g., EPROM, EEPROM, SDRAM, flash memory devices, and all types of solid state memory), magnetic disks, and magneto optical disks. A computer system can have any number of memory devices.


The memory also can include a cache memory, which is generally a form of high-speed computer memory placed in close proximity to the processor for fast read/write times. In some embodiments, the cache memory is part of, or on the same chip as, the processor.


The network interface controller may be configured to manage data exchanges via the network interface. The network interface controller may handle the physical, media access control, and data link layers of the Open Systems Interconnect (OSI) model for network communication. In some embodiments, some of the network interface controller's tasks may be handled by the processor. In some embodiments, the network interface controller may be part of the processor. In some embodiments, a computer system may have multiple network interface controllers. In some implementations, the network interface may be a connection point for a physical network link, e.g., an RJ-45 connector. In some embodiments, the network interface controller may support wireless network connections and an interface port may be a wireless Bluetooth transceiver. Generally, a computer system can be configured to exchange data with other network devices via physical or wireless links to a network interface. In some embodiments, the network interface controller may implement a network protocol such as LTE, TCP/IP, Ethernet, IEEE 802.11, IEEE 802.16, Bluetooth, or the like.


In some uses, the I/O interface may support an input device and/or an output device. In some uses, the input device and the output device may be integrated into the same hardware, e.g., as in a touch screen. In some uses, such as in a server context, there may be no I/O interface or the I/O interface may not be used. In some uses, additional other components may be in communication with the computer system, e.g., external devices connected via a universal serial bus (USB). In some embodiments, an I/O device may be incorporated into the computer system, e.g., a touch screen on a tablet device.


In some implementations, a computer device may include an additional device such as a co-processor, e.g., a math co-processor configured to assist the processor with high precision or complex calculations.


CONCLUSION

Certain illustrative implementations have been described to provide an overall understanding of the principles of the structure, function, manufacture, and use of the systems, devices, and methods disclosed herein. One or more examples of these implementations have been illustrated in the accompanying drawings. Those skilled in the art will understand that the systems, devices, and methods specifically described herein and illustrated in the accompanying drawings are non-limiting illustrative implementations and that the scope of the present invention is defined solely by the claims. The features illustrated or described in connection with one illustrative implementation may be combined with the features of other implementations. Such modifications and variations are intended to be included within the scope of the present invention. Further, in the present disclosure, like-named components of the implementations generally have similar features, and thus within a particular implementation each feature of each like-named component is not necessarily fully elaborated upon.


Approximating language, as used herein throughout the specification and claims, may be applied to modify any quantitative representation that can permissibly vary without resulting in a change in the basic function to which it is related. Accordingly, a value modified by a term or terms, such as "about," "approximately," and "substantially," is not to be limited to the precise value specified. In at least some instances, the approximating language may correspond to the precision of an instrument for measuring the value. Here and throughout the specification and claims, range limitations may be combined and/or interchanged; such ranges are identified and include all the sub-ranges contained therein unless context or language indicates otherwise.


One skilled in the art will appreciate further features and advantages of the invention based on the above-described implementations. Accordingly, the present application is not to be limited by what has been particularly shown and described, except as indicated by the appended claims. All publications and references cited herein are expressly incorporated herein by reference in their entirety for all purposes.

Claims
  • 1. A computer-implemented method, comprising: during performance of a surgical procedure on a patient, receiving, at a first surgical system, a first dataflow from a second surgical system, the first dataflow including first data regarding a measured patient parameter that the first surgical system is configured to use in performing a function during the performance of the surgical procedure; determining that a trigger event occurred during the performance of the surgical procedure such that a sum of a first bandwidth of the first dataflow and a second bandwidth of a second dataflow exceeds an available bandwidth; and in response to determining that the trigger event occurred, and during the performance of the surgical procedure, adjusting at least one of the first and second dataflows such that the sum of the first and second bandwidths does not exceed the available bandwidth.
  • 2. The method of claim 1, wherein the second dataflow is one of: flow from the first surgical system to the second surgical system; and flow within the first surgical system.
  • 3. The method of claim 1, wherein the first bandwidth is one of: a bandwidth between the first and second surgical systems; and a bandwidth of the first surgical system.
  • 4. The method of claim 1, wherein the adjusting comprises changing a prioritization of the first and second dataflows.
  • 5. The method of claim 1, wherein the adjusting comprises changing time-related access to the at least one of the first and second dataflows.
  • 6. The method of claim 1, wherein the adjusting comprises temporarily storing one of the first data of the first dataflow or the second data of the second dataflow.
  • 7. The method of claim 1, wherein the first surgical system is configured to use data in the second data flow in performing a second function during the performance of the surgical procedure.
  • 8. The method of claim 1, wherein the trigger event is one of: congestion of traffic using the available bandwidth; performance of a predetermined step of the surgical procedure by a surgeon performing the surgical procedure; loss of data in the first dataflow; and a predetermined patient biomarker variation.
  • 9. The method of claim 1, further comprising, after the adjusting, transmitting data on the adjusted at least one of the first and second dataflows.
  • 10. The method of claim 1, wherein the determining and the adjusting is performed using a processor of the first surgical system, the determining and the adjusting is performed using a processor of a surgical hub communicatively coupled with the first and second surgical systems, or the determining and the adjusting is performed using a processor of a cloud-based remote server communicatively coupled with the first and second surgical systems.
  • 11. The method of claim 1, wherein the first surgical system is a first type of surgical system; and the second surgical system is a second, different type of surgical system.
  • 12. The method of claim 11, wherein the first type of surgical system and the second type of surgical system are each one of a hospital network, a database, a surgical instrument, or a surgical cart.
  • 13. A surgical system, comprising: the first surgical system of claim 1; and the second surgical system of claim 1.
  • 14. A computer-implemented method, comprising: receiving, at a first surgical system and in real time with performance of a surgical procedure on a patient, a first dataflow of patient data from a second surgical system; the first surgical system using the received first dataflow of patient data to perform a first function in real time with the performance of the surgical procedure; the first surgical system using second data received by the first surgical system in a second dataflow in real time with the performance of the surgical procedure to perform a second function in real time with the performance of the surgical procedure; and adjusting, based on occurrence of a trigger event in real time with the performance of the surgical procedure that results in the first and second dataflows exceeding a total bandwidth limit, at least one of the first or second dataflows so that the first and second dataflows do not exceed the total bandwidth limit.
  • 15. The method of claim 14, wherein the second dataflow is one of: flow from the first surgical system to the second surgical system; and flow within the first surgical system.
  • 16. The method of claim 14, wherein: the adjusting comprises at least one of: changing a prioritization of the first and second dataflows, changing time-related access to the at least one of the first and second dataflows, and temporarily storing data of one of the first dataflow and the second dataflow; and the adjusting is performed using a processor of the first surgical system, the adjusting is performed using a processor of a surgical hub communicatively coupled with the first and second surgical systems, or the adjusting is performed using a processor of a cloud-based remote server communicatively coupled with the first and second surgical systems.
  • 17. The method of claim 14, wherein the trigger event is one of: congestion of traffic using the available bandwidth; performance of a predetermined step of the surgical procedure by a surgeon performing the surgical procedure; loss of data in the first dataflow; and a predetermined patient biomarker variation.
  • 18. The method of claim 14, further comprising, after the adjusting, transmitting data on the adjusted at least one of the first and second dataflows.
  • 19. The method of claim 14, wherein the first type of surgical system and the second type of surgical system are each one of a hospital network, a database, a surgical instrument, or a surgical cart.
  • 20. A surgical system, comprising: the first surgical system of claim 14; and the second surgical system of claim 14.
CROSS REFERENCE TO RELATED APPLICATIONS

The present application claims priority to U.S. Provisional Patent Application No. 63/603,031 entitled “Smart Surgical Systems” filed Nov. 27, 2023, which is hereby incorporated by reference in its entirety. The subject matter of the present application is related to the following patent applications filed on Nov. 26, 2024, which are hereby incorporated by reference in their entireties: U.S. application Ser. No. 18/960,006 entitled “Methods For Smart Surgical Systems,” U.S. application Ser. No. 18/960,047 entitled “Mapping Data Pipelines For Surgical Systems,” U.S. application Ser. No. 18/960,059 entitled “Broadcast And Peer-To-Peer Communication For Surgical Systems,” U.S. application Ser. No. 18/960,070 entitled “Data Lifecycle Management For Surgical Systems,” U.S. application Ser. No. 18/960,081 entitled “Data Transformation For Surgical Systems,” U.S. application Ser. No. 18/960,094 entitled “Geofencing For Surgical Systems,” U.S. application Ser. No. 18/960,107 entitled “Information Discrimination For Surgical Instruments,” and U.S. application Ser. No. 18/960,117 entitled “Adaptation Of Data Pipelines For Surgical Systems.”

Provisional Applications (1): No. 63/603,031, filed Nov. 27, 2023 (US).