GEOFENCING FOR SURGICAL SYSTEMS

Abstract
Surgical systems and related computer-implemented methods are provided, including, during performance of a surgical procedure on a patient, receiving, at a first surgical system, a first dataflow from a second surgical system, the first dataflow including first data regarding a measured patient parameter that the first surgical system is configured to use in performing a function during the performance of the surgical procedure. The computer-implemented methods also include determining that a trigger event occurred during the performance of the surgical procedure such that a sum of a first bandwidth of the first dataflow and a second bandwidth of a second dataflow exceeds an available bandwidth, and, in response to determining that the trigger event occurred, and during the performance of the surgical procedure, adjusting at least one of the first and second dataflows such that the sum of the first and second bandwidths does not exceed the available bandwidth.
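

As a rough, non-authoritative illustration of the bandwidth-trigger logic summarized in the abstract, the following Python sketch rebalances two dataflows when their combined bandwidth exceeds what is available; the Dataflow class, the priority field, and the rebalance() function are assumptions introduced here, not elements of the disclosure.

```python
# A minimal sketch of the bandwidth-trigger logic summarized above.
# Dataflow, priority, and rebalance() are illustrative names, not from
# the disclosure.
from dataclasses import dataclass

@dataclass
class Dataflow:
    name: str
    bandwidth: float  # Mbit/s currently consumed by this dataflow
    priority: int     # lower value = more critical to the procedure

def rebalance(flows: list, available: float) -> None:
    """If summed bandwidth exceeds what is available (the trigger event),
    throttle the lowest-priority flows until the total fits."""
    excess = sum(f.bandwidth for f in flows) - available
    if excess <= 0:
        return  # no trigger event occurred
    for f in sorted(flows, key=lambda f: f.priority, reverse=True):
        cut = min(excess, f.bandwidth)
        f.bandwidth -= cut  # adjust this dataflow downward
        excess -= cut
        if excess <= 0:
            break

flows = [Dataflow("patient-vitals", 4.0, 0), Dataflow("room-video", 20.0, 2)]
rebalance(flows, available=18.0)
print([(f.name, f.bandwidth) for f in flows])  # room-video throttled to 14.0
```

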
Description
FIELD

The present disclosure relates generally to smart surgical devices, systems, and methods.


BACKGROUND

Surgical operations and environments have benefited from advances in technology. These advances include upgraded equipment, therapeutics, techniques, and more, which have resulted in more favorable outcomes for both patients and healthcare personnel. Further benefits can be realized through the continued advancement of technology and the continued integration of such advancements into these operations and environments.


Computers are more and more ubiquitous in everyday life, and as the power of computers and computing systems increases, larger quantities of data can be processed in ways that render meaningful results and information for end users. This type of big data processing offers immense benefits to surgical operations and environments as well, as more information can be distilled into meaningful assistance for a user, such as a surgeon, to use and rely on during surgical operations. Ultimately, this additional information for the user can result in even more favorable outcomes for both patients and healthcare personnel.


SUMMARY

In general, smart surgical devices, systems, and methods are provided.


In one embodiment, a surgical data management system is provided and includes a surgical hub including a processor and a memory storing instructions that, when executed by the processor, cause the processor to perform operations including: during a performance of a surgical procedure on a patient, adjust processing of a dataflow associated with a surgical system, which is located within a first digital fence surrounding an aspect of the surgical procedure, in response to the surgical system moving into a second digital fence that is nested within the first digital fence. The dataflow includes data regarding a measured patient parameter that at least one of the surgical hub and the surgical system is configured to use in performing a function during the performance of the surgical procedure.
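

As a minimal sketch of how nested digital fences might govern dataflow processing, the following Python fragment unions the rule set of a nested fence with that of its enclosing fence as a surgical system moves between them; the Fence class and the rule names are hypothetical, not drawn from the disclosure.

```python
# Illustrative sketch only: Fence and active_rules() are assumed names,
# not taken from the disclosure. Shows one way a hub might stack the rule
# sets of nested digital fences as a surgical system moves between them.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Fence:
    name: str
    rules: set                        # data-processing rules tied to this fence
    parent: Optional["Fence"] = None  # a nested fence points at its parent

def active_rules(fence: Optional[Fence]) -> set:
    """Union of the rules of the fence and of every enclosing fence."""
    rules = set()
    while fence is not None:
        rules |= fence.rules
        fence = fence.parent
    return rules

operating_room = Fence("operating-room", {"log-all", "encrypt"})
surgical_field = Fence("surgical-field", {"low-latency"}, parent=operating_room)

# Inside only the outer fence, the outer rules apply; moving into the
# nested fence layers that fence's rules on top of the outer ones.
print(active_rules(operating_room))  # {'log-all', 'encrypt'}
print(active_rules(surgical_field))  # {'log-all', 'encrypt', 'low-latency'}
```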


The surgical data management system can have any number of variations. For example, the first digital fence can represent a fence surrounding a physical space. Further, the physical space can be an operating room in which the surgical procedure is to be performed, and the second digital fence can represent a fence surrounding a partial portion of the operating room. Further, the second digital fence can represent a fence surrounding a surgical field in the operating room.


For another example, the first digital fence can represent a fence surrounding a temporal space. Further, the temporal space can be a total amount of time in which the surgical procedure is to be performed, and the second digital fence can represent a fence surrounding a portion of the total amount of time. Further, the instructions, when executed by the processor, can also cause the processor to perform operations including: adjust processing of the dataflow associated with the surgical system, which is located within the first digital fence, in response to the surgical system moving into a third digital fence that is nested within the first digital fence; and the third digital fence can represent a fence surrounding a second, different portion of the total amount of time.
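

A temporal fence can be pictured as a time window nested inside the procedure's total time. The sketch below, whose phase names and timestamps are invented for illustration, returns every temporal fence containing the current moment, so the rules of all containing fences can be applied.

```python
# A hedged sketch of temporal fencing; the window boundaries and phase
# names below are invented for illustration.
from datetime import datetime, timedelta

def fences_containing(now, fences):
    """Return the name of every temporal fence whose window contains now."""
    return [name for name, (start, end) in fences.items() if start <= now < end]

start = datetime(2025, 1, 1, 8, 0)
fences = {
    "procedure": (start, start + timedelta(hours=3)),    # first (outer) fence
    "dissection-phase": (start + timedelta(minutes=30),  # nested second fence
                         start + timedelta(minutes=90)),
}
print(fences_containing(start + timedelta(minutes=45), fences))
# ['procedure', 'dissection-phase'] -> both fences' rules apply in this phase
```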


For yet another example, adjusting the processing can include at least one of: adjusting a data flow rate of the dataflow, adjusting a bandwidth capacity of the dataflow, adjusting a latency of the dataflow, matching the processing of the dataflow with a processing of a second dataflow of a second surgical system located in the second digital fence, and preventing transmission of the dataflow.
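

These enumerated adjustments can be modeled as operations on a small dataflow state object, as in the hypothetical sketch below; DataflowState, adjust(), and match() are illustrative names only.

```python
# Sketch of the enumerated adjustments; DataflowState, adjust(), and
# match() are hypothetical names, not from the disclosure.
from dataclasses import dataclass

@dataclass
class DataflowState:
    rate_hz: float         # data flow rate, samples per second
    bandwidth_mbps: float  # allotted bandwidth capacity
    latency_ms: float      # target end-to-end latency
    enabled: bool = True   # False prevents transmission entirely

def adjust(flow: DataflowState, action: str, value: float = 0.0) -> None:
    if action == "rate":
        flow.rate_hz = value
    elif action == "bandwidth":
        flow.bandwidth_mbps = value
    elif action == "latency":
        flow.latency_ms = value
    elif action == "block":  # prevent transmission of the dataflow
        flow.enabled = False
    else:
        raise ValueError(f"unknown adjustment: {action}")

def match(flow: DataflowState, other: DataflowState) -> None:
    """Match this dataflow's processing to a second system's dataflow."""
    flow.rate_hz, flow.latency_ms = other.rate_hz, other.latency_ms

flow = DataflowState(rate_hz=100.0, bandwidth_mbps=8.0, latency_ms=50.0)
adjust(flow, "latency", 10.0)  # tighten latency inside the nested fence
adjust(flow, "block")          # or prevent transmission outright
```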


For still another example, the instructions, when executed by the processor, can also cause the processor to perform operations including: adjust processing of the dataflow associated with the surgical system, which is located within the first digital fence, in response to the surgical system moving into a third digital fence that is nested within the first digital fence and is different from the second digital fence. Further, the first digital fence can represent a fence surrounding a physical space, the physical space can be an operating room in which the surgical procedure is to be performed, the second digital fence can represent a fence surrounding a first portion of a surgical field in the operating room, and the third digital fence can represent a fence surrounding a second, different portion of the surgical field in the operating room.


For another example, the instructions, when executed by the processor, can also cause the processor to perform operations including: adjust processing of the dataflow associated with the surgical system, which is located within the second digital fence, in response to the surgical system moving out of the second digital fence and remaining in the first digital fence. Further, a predefined first set of rules for processing data can be associated with the first digital fence, a predefined second set of rules for processing data can be associated with the second digital fence, the adjusting in response to the surgical system moving into the second digital fence can include the dataflow being processed in accordance with the predefined first set of rules and the predefined second set of rules, and the adjusting in response to the surgical system moving out of the second digital fence can include the dataflow being processed in accordance with the predefined first set of rules and not in accordance with the predefined second set of rules.


For yet another example, a predefined first set of rules for processing data can be associated with the first digital fence, a predefined second set of rules for processing data can be associated with the second digital fence, and the adjusting can include processing the dataflow in accordance with the predefined second set of rules in addition to the predefined first set of rules. Further, the instructions, when executed by the processor, can also cause the processor to perform operations including: adjust processing of the dataflow associated with the surgical system, which is located within the second digital fence, in response to the surgical system moving out of the second digital fence such that the dataflow is processed in accordance with the predefined first set of rules and is no longer processed in accordance with the predefined second set of rules.


For still another example, a predefined first set of rules for processing data can be associated with the first digital fence, a predefined second set of rules for processing data can be associated with the second digital fence, and the adjusting can include suspending at least one of the rules in the predefined first set of rules such that the dataflow is configured to be processed in accordance with the predefined second set of rules and with a subset of the predefined first set of rules. Further, the instructions, when executed by the processor, can also cause the processor to perform operations including: adjust processing of the dataflow associated with the surgical system, which is located within the second digital fence, in response to the surgical system moving out of the second digital fence such that the dataflow is configured to be processed in accordance with the predefined first set of rules instead of in accordance with the subset of the predefined first set of rules.


For another example, a predefined first set of rules for processing data can be associated with the first digital fence, a predefined second set of rules for processing data can be associated with the second digital fence, and the adjusting can include adding to at least one of the rules in the predefined first set of rules such that the dataflow is configured to be processed in accordance with the predefined second set of rules and with the added-to predefined first set of rules. Further, the instructions, when executed by the processor, can also cause the processor to perform operations including: adjust processing of the dataflow associated with the surgical system, which is located within the second digital fence, in response to the surgical system moving out of the second digital fence such that the dataflow is configured to be processed in accordance with the predefined first set of rules instead of in accordance with the added-to predefined first set of rules.
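

The entry and exit behaviors described in this and the preceding examples, i.e., layering the second rule set on the first, suspending or adding to parts of the first, and reverting on exit, can be sketched as simple set operations; the enter_fence() helper and the rule names below are invented for illustration.

```python
# Hedged sketch of the enter/exit behavior described in the preceding
# examples; enter_fence() and the rule names are invented for illustration.

def enter_fence(outer, inner, suspend=frozenset(), add=frozenset()):
    """On entry, apply the inner rules plus the outer rules, minus any the
    inner fence suspends, plus any additions to the outer set."""
    return (set(outer) - set(suspend)) | set(add) | set(inner)

outer_rules = {"log-all", "full-resolution", "encrypt"}
inner_rules = {"low-latency"}

# Entering the nested fence suspends 'full-resolution' (e.g., to conserve
# bandwidth) while layering the inner fence's rules on top.
active = enter_fence(outer_rules, inner_rules, suspend={"full-resolution"})
print(active)  # {'log-all', 'encrypt', 'low-latency'}

# Exiting the nested fence simply reverts to the unmodified outer rule set.
active = set(outer_rules)
print(active)  # {'log-all', 'full-resolution', 'encrypt'}
```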


For yet another example, the instructions, when executed by the processor, can also cause the processor to perform operations including: receive data characterizing the surgical procedure, and establish, prior to a start of the performance of the surgical procedure on the patient and using the received data, the first digital fence and the second digital fence; and the received data can include at least one of information regarding a plan for the surgical procedure, a total amount of time for the surgical procedure, a location where the surgical procedure is to be performed, a layout of a location where the surgical procedure is to be performed, and at least one surgical tool to be used in the surgical procedure on the patient.


For still another example, boundaries of the first and second digital fences can be defined using a global positioning system (GPS) or a radio frequency identification (RFID) system, and the surgical system can be one of a hospital network, a database, a surgical instrument, or a surgical cart.
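

For a GPS-defined boundary, fence membership can be approximated with a distance test, as in the sketch below; the haversine formula is standard, but the circular fence model and the coordinates are illustrative assumptions.

```python
# Minimal sketch assuming a circular GPS-defined fence; the haversine
# distance formula is standard, but the fence model and the coordinates
# below are illustrative assumptions.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6_371_000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_fence(fix, center, radius_m):
    return haversine_m(fix[0], fix[1], center[0], center[1]) <= radius_m

or_center = (41.504, -81.606)  # hypothetical operating-room coordinates
print(inside_fence((41.50401, -81.60602), or_center, radius_m=10.0))  # True
```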


For another example, boundaries of the first and second digital fences can be defined using a GPS or an RFID system; the surgical system can be one of a hospital network, a database, a surgical instrument, or a surgical cart; and/or the surgical hub can be configured to be operatively coupled to a robotic surgical system. Further, the surgical data management system can also include the surgical system.


In another embodiment, a surgical data management system is provided that includes a cloud-based server including: a processor and a memory storing instructions that, when executed by the processor, cause the processor to perform operations including: during a performance of a surgical procedure on a patient, adjust processing of a dataflow associated with a surgical system, which is located within a first digital fence surrounding an aspect of the surgical procedure, in response to the surgical system moving into a second digital fence that is nested within the first digital fence. The dataflow includes data regarding a measured patient parameter that at least one of the cloud-based server and the surgical system is configured to use in performing a function during the performance of the surgical procedure.


The surgical data management system can have any number of variations. For example, the first digital fence can represent a fence surrounding a physical space. Further, the physical space can be an operating room in which the surgical procedure is to be performed, and the second digital fence can represent a fence surrounding a partial portion of the operating room. Further, the second digital fence can represent a fence surrounding a surgical field in the operating room.


For another example, the first digital fence can represent a fence surrounding a temporal space. Further, the temporal space can be a total amount of time in which the surgical procedure is to be performed, and the second digital fence can represent a fence surrounding a portion of the total amount of time. Further, the instructions, when executed by the processor, can also cause the processor to perform operations including: adjust processing of the dataflow associated with the surgical system, which is located within the first digital fence, in response to the surgical system moving into a third digital fence that is nested within the first digital fence; and the third digital fence can represent a fence surrounding a second, different portion of the total amount of time.


For yet another example, adjusting the processing can include at least one of: adjusting a data flow rate of the dataflow, adjusting a bandwidth capacity of the dataflow, adjusting a latency of the dataflow, matching the processing of the dataflow with a processing of a second dataflow of a second surgical system located in the second digital fence, and preventing transmission of the dataflow.


For still another example, the instructions, when executed by the processor, can also cause the processor to perform operations including: adjust processing of the dataflow associated with the surgical system, which is located within the first digital fence, in response to the surgical system moving into a third digital fence that is nested within the first digital fence and is different from the second digital fence. Further, the first digital fence can represent a fence surrounding a physical space, the physical space can be an operating room in which the surgical procedure is to be performed, the second digital fence can represent a fence surrounding a first portion of a surgical field in the operating room, and the third digital fence can represent a fence surrounding a second, different portion of the surgical field in the operating room.


For another example, the instructions, when executed by the processor, can also cause the processor to perform operations including: adjust processing of the dataflow associated with the surgical system, which is located within the second digital fence, in response to the surgical system moving out of the second digital fence and remaining in the first digital fence. Further, a predefined first set of rules for processing data can be associated with the first digital fence, a predefined second set of rules for processing data can be associated with the second digital fence, the adjusting in response to the surgical system moving into the second digital fence can include the dataflow being processed in accordance with the predefined first set of rules and the predefined second set of rules, and the adjusting in response to the surgical system moving out of the second digital fence can include the dataflow being processed in accordance with the predefined first set of rules and not in accordance with the predefined second set of rules.


For yet another example, a predefined first set of rules for processing data can be associated with the first digital fence, a predefined second set of rules for processing data can be associated with the second digital fence, and the adjusting can include processing the dataflow in accordance with the predefined second set of rules in addition to the predefined first set of rules. Further, the instructions, when executed by the processor, can also cause the processor to perform operations including: adjust processing of the dataflow associated with the surgical system, which is located within the second digital fence, in response to the surgical system moving out of the second digital fence such that the dataflow is processed in accordance with the predefined first set of rules and is no longer processed in accordance with the predefined second set of rules.


For still another example, a predefined first set of rules for processing data can be associated with the first digital fence, a predefined second set of rules for processing data can be associated with the second digital fence, and the adjusting can include suspending at least one of the rules in the predefined first set of rules such that the dataflow is configured to be processed in accordance with the predefined second set of rules and with a subset of the predefined first set of rules. Further, the instructions, when executed by the processor, can also cause the processor to perform operations including: adjust processing of the dataflow associated with the surgical system, which is located within the second digital fence, in response to the surgical system moving out of the second digital fence such that the dataflow is configured to be processed in accordance with the predefined first set of rules instead of in accordance with the subset of the predefined first set of rules.


For another example, a predefined first set of rules for processing data can be associated with the first digital fence, a predefined second set of rules for processing data can be associated with the second digital fence, and the adjusting can include adding to at least one of the rules in the predefined first set of rules such that the dataflow is configured to be processed in accordance with the predefined second set of rules and with the added-to predefined first set of rules. Further, the instructions, when executed by the processor, can also cause the processor to perform operations including: adjust processing of the dataflow associated with the surgical system, which is located within the second digital fence, in response to the surgical system moving out of the second digital fence such that the dataflow is configured to be processed in accordance with the predefined first set of rules instead of in accordance with the added-to predefined first set of rules.


For yet another example, the instructions, when executed by the processor, can also cause the processor to perform operations including: receive data characterizing the surgical procedure, and establish, prior to a start of the performance of the surgical procedure on the patient and using the received data, the first digital fence and the second digital fence; and the received data can include at least one of information regarding a plan for the surgical procedure, a total amount of time for the surgical procedure, a location where the surgical procedure is to be performed, a layout of a location where the surgical procedure is to be performed, and at least one surgical tool to be used in the surgical procedure on the patient.


For still another example, boundaries of the first and second digital fences can be defined using a global positioning system (GPS) or a radio frequency identification (RFID) system, and the surgical system can be one of a hospital network, a database, a surgical instrument, or a surgical cart.


For another example, boundaries of the first and second digital fences can be defined using a GPS or an RFID system; the surgical system can be one of a hospital network, a database, a surgical instrument, or a surgical cart; and/or the cloud-based server can be configured to be operatively coupled to a robotic surgical system. Further, the surgical data management system can also include the surgical system.





BRIEF DESCRIPTION OF DRAWINGS

The present invention is described by way of reference to the accompanying figures, which are as follows:



FIG. 1 is a schematic view of one embodiment of a computer-implemented surgical system;



FIG. 2 is a perspective view of one embodiment of a surgical system in one embodiment of a surgical operating room;



FIG. 3 is a schematic view of one embodiment of a surgical hub paired with various systems;



FIG. 4 is a schematic view of one embodiment of a situationally aware surgical system;



FIG. 5 is a perspective view of one embodiment of a surgical instrument and one embodiment of a surgical system that includes the surgical instrument;



FIG. 6A is a schematic view of a data pipeline architecture;



FIG. 6B is an expanded schematic view of the data pipeline architecture of FIG. 6A;



FIG. 7 is a top view of one embodiment of geofences for an operating room;



FIG. 8 is another top view of the geofences and the operating room of FIG. 7;



FIG. 9 is another top view of the geofences and the operating room of FIG. 7;



FIG. 10 is a schematic view of one embodiment of mutual triangulation between cooperating surgical systems;



FIG. 11 is a schematic view of one embodiment of fixed sensors with known geometries usable to calculate a relative orientation and position of a device compared to another device;



FIG. 12 is a schematic view of one embodiment of a process for using stereoscopic imaging;



FIG. 13 is a schematic view of one embodiment of a process for identification of a known entity;



FIG. 14 is a schematic view of one embodiment of an image for identification of unique elements within an object;



FIG. 15 is a schematic view of one embodiment of another image for identification of unique elements within the object of FIG. 14; and



FIG. 16 is a flowchart of one embodiment of a method of geofencing.





It is noted that the drawings are not necessarily to scale. The drawings are intended to depict only typical aspects of the subject matter disclosed herein, and therefore should not be considered as limiting the scope of the disclosure.


DETAILED DESCRIPTION

Certain exemplary embodiments will now be described to provide an overall understanding of the principles of the structure, function, manufacture, and use of the devices, systems, and methods disclosed herein. One or more examples of these embodiments are illustrated in the accompanying drawings. A person skilled in the art will understand that the devices, systems, and methods specifically described herein and illustrated in the accompanying drawings are non-limiting exemplary embodiments and that the scope of the present invention is defined solely by the claims. The features illustrated or described in connection with one exemplary embodiment may be combined with the features of other embodiments. Such modifications and variations are intended to be included within the scope of the present invention.


Further, in the present disclosure, like-named components of the embodiments generally have similar features, and thus within a particular embodiment each feature of each like-named component is not necessarily fully elaborated upon. Additionally, to the extent that linear or circular dimensions are used in the description of the disclosed systems, devices, and methods, such dimensions are not intended to limit the types of shapes that can be used in conjunction with such systems, devices, and methods. A person skilled in the art will recognize that an equivalent to such linear and circular dimensions can easily be determined for any geometric shape. A person skilled in the art will appreciate that a dimension may not be a precise value but nevertheless be considered to be at about that value due to any number of factors such as manufacturing tolerances and sensitivity of measurement equipment. Sizes and shapes of the systems and devices, and the components thereof, can depend at least on the size and shape of components with which the systems and devices will be used.


In general, a health data management system may include an interactive smart system that includes data origination facets, movement, architecture and management, and transformation and lifecycle to determine mechanisms by which smart systems talk to each other. The health data management system may include a data stack that defines handling of data from beginning to end. A data stack may include data sources, data pipelines, data transformation/modeling systems, and data storage systems that define end-to-end handling of data. In one embodiment, the health data management system may include a plurality of smart medical systems that are configured to perform one or more medical operations. The health data management system may utilize the data stack to control and manage data flow to the different smart device systems. In one embodiment, the health data management system may control and manage the data flow for managing a patient or performing a medical procedure, for example, providing surgical assistance during performance of a surgical procedure by one or more smart medical systems (also referred to herein as “surgical systems”).
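

As a toy, non-authoritative illustration of such a data stack, the following sketch moves records through the four layers named above, from source through pipeline and transformation/modeling into storage; every name in it is hypothetical.

```python
# A toy illustration of the four data stack layers named above; every name
# here is hypothetical and not drawn from the disclosure.

def run_stack(source, pipeline, transform, store):
    """Pull each record from the data source, move it through the pipeline
    and transformation/modeling stages, and persist it in storage."""
    for record in source:
        store.append(transform(pipeline(record)))

readings = [{"hr": 62}, {"hr": 110}]                       # data source
pipeline = lambda r: {**r, "received": True}               # data pipeline
model = lambda r: {**r, "hr_zone": "high" if r["hr"] >= 100 else "normal"}
storage = []                                               # data storage
run_stack(readings, pipeline, model, storage)
print(storage)  # both records, annotated end to end
```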


Surgical Systems


FIG. 1 shows one embodiment of a computer-implemented surgical system 100. The surgical system 100 may include one or more surgical systems (e.g., surgical sub-systems) 102, 103, 104. As in this illustrated embodiment, the surgical system 100 may include first, second, and third surgical systems 102, 103, 104, but may instead include another number, e.g., one, two, four, etc.


The first surgical system 102 is discussed herein as a general representative of the surgical systems 102, 103, 104. For example, the surgical system 102 may include a computer-implemented interactive surgical system. For example, the surgical system 102 may include a surgical hub 106 and/or a computing device 116 in communication with a cloud computing system 108, for example, as described in FIG. 2.


The cloud computing system 108 may include at least one remote cloud server 109 and at least one remote cloud storage unit 110. Embodiments of surgical systems 102, 103, 104 may include one or more wearable sensing systems 111, one or more environmental sensing systems 115, one or more robotic systems (also referred to herein as "robotic surgical systems") 113, one or more intelligent instruments 114 (e.g., smart surgical instruments), one or more human interface systems 112, etc. A "human interface system" is also referred to herein as a "human interface device." The wearable sensing system(s) 111 may include one or more HCP ("health care professional" or "health care personnel") sensing systems and/or one or more patient sensing systems. The environmental sensing system(s) 115 may include one or more devices used for measuring one or more environmental attributes, for example, as further described in FIG. 2. The robotic system(s) 113 may include a plurality of devices used for performing a surgical procedure, for example, as further described in FIG. 2.


Examples of robotic surgical systems include the Ottava™ robotic-assisted surgery system (Johnson & Johnson of New Brunswick, NJ), da Vinci® surgical systems (Intuitive Surgical, Inc. of Sunnyvale, CA), the Hugo™ robotic-assisted surgery system (Medtronic PLC of Minneapolis, MN), the Versius® surgical robotic system (CMR Surgical Ltd of Cambridge, UK), and the Monarch® platform (Auris Health, Inc. of Redwood City, CA). Embodiments of various robotic surgical systems and using robotic surgical systems are further described in, for example, U.S. Pat. App. Pub. No. 2018/0177556 entitled “Flexible Instrument Insertion Using An Adaptive Force Threshold” filed Dec. 28, 2016, U.S. Pat. App. Pub. No. 2020/0000530 entitled “Systems And Techniques For Providing Multiple Perspectives During Medical Procedures” filed Apr. 16, 2019, U.S. Pat. App. Pub. No. 2020/0170720 entitled “Image-Based Branch Detection And Mapping For Navigation” filed Feb. 7, 2020, U.S. Pat. App. Pub. No. 2020/0188043 entitled “Surgical Robotics System” filed Dec. 9, 2019, U.S. Pat. App. Pub. No. 2020/0085516 entitled “Systems And Methods For Concomitant Medical Procedures” filed Sep. 3, 2019, U.S. Pat. No. 8,831,782 entitled “Patient-Side Surgeon Interface For A Teleoperated Surgical Instrument” filed Jul. 15, 2013, and Intl. Pat. Pub. No. WO 2014151621 entitled “Hyperdexterous Surgical System” filed Mar. 13, 2014, which are hereby incorporated by reference in their entireties.


The surgical system 102 may be in communication with the one or more remote servers 109 that may be part of the cloud computing system 108. In an example embodiment, the surgical system 102 may be in communication with the one or more remote servers 109 via an internet service provider's cable/FIOS networking node. In an example embodiment, a patient sensing system may be in direct communication with the one or more remote servers 109. The surgical system 102 (and/or various sub-systems, smart surgical instruments, robots, sensing systems, and other computerized devices described herein) may collect data in real-time and transfer the data to the cloud computing system 108 for data processing and manipulation, e.g., by the one or more remote servers 109. It will be appreciated that cloud computing may rely on sharing computing resources rather than having local servers or personal devices to handle software applications.


The surgical system 102 and/or a component therein may communicate with the one or more remote servers 109 via a cellular transmission/reception point (TRP) or a base station using one or more of the following cellular protocols: GSM/GPRS/EDGE (2G), UMTS/HSPA (3G), long term evolution (LTE) or 4G, LTE-Advanced (LTE-A), new radio (NR) or 5G, and/or other wired or wireless communication protocols. Various embodiments of cloud-based analytics that may be performed by the cloud computing system 108 are described further in, for example, U.S. Pat. App. Pub. No. 2019/0206569 entitled "Method Of Cloud Based Data Analytics For Use With The Hub" published Jul. 4, 2019, which is hereby incorporated by reference in its entirety.


The surgical hub 106 may have cooperative interactions with one or more means of displaying an image (e.g., a display configured to display an image from a laparoscopic scope) and information from one or more other smart devices and/or one or more sensing systems. The surgical hub 106 may interact with the one or more sensing systems, the one or more smart devices, and the one or more means of displaying an image. The surgical hub 106 may be configured to gather measurement data from the sensing system(s) and send notifications or control messages to the sensing system(s). The surgical hub 106 may send and/or receive information, including notification information, to and/or from the one or more human interface systems 112. The one or more human interface systems 112 may include one or more human interface devices (HIDs). The surgical hub 106 may send notification information and/or control information, including audio and display information, to various devices that are in communication with the surgical hub 106.


In an exemplary embodiment, the one or more sensing systems may include the one or more wearable sensing systems 111 (which may include one or more HCP sensing systems and/or one or more patient sensing systems) and/or the environmental sensing system(s) 115 shown in FIG. 1. The sensing system(s) may measure data relating to various biomarkers.


In an exemplary embodiment, the sensing system(s) may measure the biomarkers using one or more sensors, for example, photosensors (e.g., photodiodes, photoresistors), mechanical sensors (e.g., motion sensors), acoustic sensors, electrical sensors, electrochemical sensors, thermoelectric sensors, infrared sensors, etc. The sensor(s) may measure the biomarkers using one or more of the following sensing technologies: photoplethysmography, electrocardiography, electroencephalography, colorimetry, impedimetry, potentiometry, amperometry, etc.


The biomarkers measured by the sensing system(s) may include, but are not limited to, sleep, core body temperature, maximal oxygen consumption, physical activity, alcohol consumption, respiration rate, oxygen saturation, blood pressure, blood sugar, heart rate variability, blood potential of hydrogen, hydration state, heart rate, skin conductance, peripheral temperature, tissue perfusion pressure, coughing and sneezing, gastrointestinal motility, gastrointestinal tract imaging, respiratory tract bacteria, edema, mental aspects, sweat, circulating tumor cells, autonomic tone, circadian rhythm, and/or menstrual cycle.


The biomarkers may relate to physiologic systems, which may include, but are not limited to, behavior and psychology, cardiovascular system, renal system, skin system, nervous system, gastrointestinal system, respiratory system, endocrine system, immune system, tumor, musculoskeletal system, and/or reproductive system. Information from the biomarkers may be determined and/or used by the computer-implemented surgical system 100, for example, to improve the system and/or to improve patient outcomes.


The sensing system(s) may send data to the surgical hub 106. The sensing system(s) may use one or more of the following radiofrequency (RF) protocols for communicating with the surgical hub 106: Bluetooth, Bluetooth Low-Energy (BLE), Bluetooth Smart, Zigbee, Z-wave, IPv6 Low-power wireless Personal Area Network (6LoWPAN), Wi-Fi, etc.


Various embodiments of sensing systems, biomarkers, and physiological systems are described further in, for example, U.S. Pat. App. Pub. No. 2022/0233119 entitled "Method Of Adjusting A Surgical Parameter Based On Biomarker Measurements" published Jul. 28, 2022, which is hereby incorporated by reference in its entirety.


The sensing systems described herein may be employed to assess physiological conditions of a surgeon operating on a patient, of a patient being prepared for a surgical procedure, or of a patient recovering after a surgical procedure. The cloud-based computing system 108 may be used to monitor biomarkers associated with an HCP (a surgeon, a nurse, etc.) or a patient in real-time, to generate surgical plans based at least on measurement data gathered prior to a surgical procedure, to provide control signals to one or more surgical instruments during a surgical procedure, and to notify a patient of a complication during a post-surgical period.


The cloud-based computing system 108 may be used to analyze surgical data. Surgical data may be obtained via the intelligent instrument(s) 114, the wearable sensing system(s) 111, the environmental sensing system(s) 115, the robotic system(s) 113, and/or the like in the surgical system 102. Surgical data may include tissue states used to assess leaks or perfusion of sealed tissue after a tissue sealing and cutting procedure, pathology data including images of samples of body tissue, anatomical structures of the body captured using a variety of sensors integrated with imaging devices and techniques (such as overlaying images captured by multiple imaging devices), image data, and/or the like. The surgical data may be analyzed to improve surgical procedure outcomes by determining whether further treatment is warranted, such as the application of endoscopic intervention, emerging technologies, targeted radiation, targeted intervention, or precise robotics to tissue-specific sites and conditions. Such data analysis may employ outcome analytics processing and, using standardized approaches, may provide beneficial feedback that either confirms the surgical treatments and the behavior of the surgeon or suggests modifications to them.



FIG. 2 shows one embodiment of the surgical system 102 in one embodiment of a surgical operating room 135. As illustrated in FIG. 2, a patient is being operated on by one or more HCPs. The HCP(s) are being monitored by one or more HCP sensing systems 120 worn by the HCP(s). The HCP(s) and the environment surrounding the HCP(s) may also be monitored by one or more environmental sensing systems including, for example, one or more cameras 121, one or more microphones 122, and other sensors that may be deployed in the operating room. The one or more HCP sensing systems 120 and the environmental sensing systems may be in communication with the surgical hub 106, which in turn may be in communication with the one or more cloud servers 109 of the cloud computing system 108, as shown in FIG. 1. The one or more environmental sensing systems may be used for measuring one or more environmental attributes, for example, HCP position in the surgical theater, HCP movements, ambient noise in the surgical theater, temperature/humidity in the surgical theater, etc.


As illustrated in FIG. 2, a primary display 123 and one or more audio output devices (e.g., speakers 119) are positioned in a sterile field of the surgical system 102 to be visible to an operator at an operating table 124. In addition, a visualization/notification tower 126 is positioned outside the sterile field. The visualization/notification tower 126 may include a first non-sterile human interactive device (HID) 127 and a second non-sterile HID 129, which may be displays and may face away from each other. The display 123 and the HIDs 127, 129 may include a touch screen allowing a human to interface directly with the HID 127, 129. A human interface system, guided by the surgical hub 106, may be configured to utilize the display 123 and the HIDs 127, 129 to coordinate information flow to operators inside and outside the sterile field. In an exemplary embodiment, the surgical hub 106 may cause an HID (e.g., the primary display 123) to display a notification and/or information about the patient and/or a surgical procedure step. In an exemplary embodiment, the surgical hub 106 may prompt for and/or receive input from personnel in the sterile field or in the non-sterile area. In an exemplary embodiment, the surgical hub 106 may cause one or more non-sterile HIDs 127, 129 to display a snapshot of a surgical site, as recorded by an imaging device 130, while maintaining a live feed of the surgical site on one or more sterile HIDs, e.g., the primary HID 123. The snapshot on the non-sterile HID(s) 127, 129 can permit a non-sterile operator to perform a diagnostic step relevant to the surgical procedure, for example.


The surgical hub 106 may be configured to route a diagnostic input or feedback entered by a non-sterile operator at the visualization tower 126 to the primary display 123 within the sterile field, where it can be viewed by a sterile operator at the operating table. In an exemplary embodiment, the input can be in the form of a modification to the snapshot displayed on the non-sterile HID(s) 127, 129, which can be routed to the one or more sterile HIDs, e.g., the primary display 123, by the surgical hub 106.


Various embodiments of surgical hubs are further described in, for example, U.S. Pat. App. Pub. No. 2024/0221937 entitled “Method For Advanced Algorithm Support” published Jul. 4, 2024, U.S. Pat. App. Pub. No. 2024/0112768 entitled “Method For Health Data And Consent Management” published Apr. 4, 2024, U.S. Pat. App. Pub. No. 2024/0220763 entitled “Data Volume Determination For Surgical Machine Learning Applications” published Jul. 2, 2024, U.S. Pat. App. Pub. No. 2019/0206564 entitled “Method For Facility Data Collection And Interpretation” published Jul. 4, 2019, U.S. Pat. App. Pub. No. 2023/0026634 entitled “Surgical Data System And Classification” published Jan. 26, 2023, U.S. Pat. App. Pub. No. 2019/0201115 entitled “Aggregation And Reporting Of Surgical Hub Data” published Jul. 4, 2019, U.S. Pat. App. Pub. No. 2023/0372030 entitled “Automatic Compilation, Annotation, And Dissemination Of Surgical Data To Systems To Anticipate Related Automated Operations” published Nov. 23, 2023, U.S. Pat. App. Pub. No. 2022/0104896 entitled “Interactive Information Overlay On Multiple Surgical Displays” published Apr. 7, 2022, U.S. Pat. No. 11,304,699 entitled “Method For Adaptive Control Schemes For Surgical Network Control And Interaction” issued Apr. 19, 2022, U.S. Pat. No. 10,849,697 entitled “Cloud Interface For Coupled Surgical Devices” issued Dec. 1, 2020, U.S. Pat. App. Pub. No. 2022/0239577 entitled “Ad Hoc Synchronization Of Data From Multiple Link Coordinated Sensing Systems” published Jul. 28, 2022, U.S. Pat. App. Pub. No. 2023/0025061 entitled “Surgical Data System And Management” published Jan. 26, 2023, U.S. Pat. App. Pub. No. 2023/0023083 entitled “Method Of Surgical System Power Management, Communication, Processing, Storage And Display” published Jan. 26, 2023, U.S. Pat. App. Pub. No. 2019/0206556 entitled “Real-Time Analysis Of Comprehensive Cost Of All Instrumentation Used In Surgery Utilizing Data Fluidity To Track Instruments Through Stocking And In-House Processes” published Jul. 4, 2019, U.S. Pat. App. Pub. No. 2023/0096268 entitled “Methods for Controlling Cooperative Surgical Instruments” filed Oct. 5, 2021, U.S. Pat. App. Pub. No. 2019/0200844 entitled “Method Of Hub Communication, Processing, Storage And Display” filed Dec. 4, 2018, U.S. Pat. App. Pub. No. 2019/0200981 entitled “Method Of Compressing Tissue Within A Stapling Device And Simultaneously Displaying The Location Of The Tissue Within The Jaws” filed Dec. 4, 2018, U.S. Pat. App. Pub. No. 2019/0201046 entitled “Method For Controlling Smart Energy Devices” filed Dec. 4, 2018, U.S. Pat. App. Pub. No. 2019/0201114 entitled “Adaptive Control Program Updates For Surgical Hubs” filed Mar. 29, 2018, U.S. Pat. App. Pub. No. 2019/0201140 entitled “Surgical Hub Situational Awareness” filed Mar. 29, 2018, U.S. Pat. App. Pub. No. 2019/0206004 entitled “Interactive Surgical Systems With Condition Handling Of Devices And Data Capabilities” filed Mar. 29, 2018, U.S. Pat. App. Pub. No. 2019/0206555 entitled “Cloud-based Medical Analytics For Customization And Recommendations To A User” filed Mar. 29, 2018, U.S. Pat. No. 11,678,881 entitled “Spatial Awareness Of Surgical Hubs In Operating Rooms” filed Mar. 29, 2018, and U.S. Pat. App. Pub. No. 2019/0207857 entitled “Surgical Network Determination Of Prioritization Of Communication, Interaction, Or Processing Based On System Or Device Needs” filed Nov. 6, 2018, which are hereby incorporated by reference in their entireties.


As in the illustrated embodiment of FIG. 2, one or more surgical instruments 131 may be used in the surgical procedure as part of the surgical system 102. The surgical hub 106 may be configured to coordinate information flow to at least one display showing the surgical instrument(s) 131. A diagnostic input or feedback entered by a non-sterile operator at the visualization tower 126 can be routed by the surgical hub 106 to the at least one display, e.g., the primary display 123, within the sterile field, where it can be viewed by the operator of the surgical instrument(s) 131.


Various embodiments of coordinating information flow and display and various embodiments of surgical instruments are described further in, for example, U.S. Pat. No. 11,937,769 entitled “Method Of Hub Communication, Processing, Storage And Display” issued Mar. 26, 2024, which is hereby incorporated by reference in its entirety.


Examples of surgical instruments include a surgical dissector, a surgical stapler, a surgical grasper, a surgical scope (e.g., an endoscope, a laparoscope, etc.), a surgical energy device (e.g., a mono-polar probe, a bi-polar probe, an ablation probe, an ultrasound device, an ultrasonic end effector, etc.), a surgical clip applier, etc.


Various embodiments of surgical instruments are described further in, for example, U.S. Pat. No. 11,723,642 entitled "Cooperative Access Hybrid Procedures" issued Aug. 14, 2023, U.S. Pat. App. Pub. No. 2013/0256377 entitled "Layer Comprising Deployable Attachment Members" filed Feb. 8, 2013, U.S. Pat. No. 8,393,514 entitled "Selectively Orientable Implantable Fastener Cartridge" filed Sep. 30, 2010, U.S. Pat. No. 8,317,070 entitled "Surgical Stapling Devices That Produce Formed Staples Having Different Lengths" filed Feb. 28, 2007, U.S. Pat. No. 7,143,925 entitled "Surgical Instrument Incorporating EAP Blocking Lockout Mechanism" filed Jun. 21, 2005, U.S. Pat. App. Pub. No. 2015/0134077 entitled "Sealing Materials for Use in Surgical Stapling" filed Nov. 8, 2013, U.S. Pat. App. Pub. No. 2015/0134076 entitled "Hybrid Adjunct Materials for Use in Surgical Stapling" filed Nov. 8, 2013, U.S. Pat. App. Pub. No. 2015/0133996 entitled "Positively Charged Implantable Materials and Method of Forming the Same" filed Nov. 8, 2013, U.S. Pat. App. Pub. No. 2015/0129634 entitled "Tissue Ingrowth Materials and Method of Using the Same" filed Nov. 8, 2013, U.S. Pat. App. Pub. No. 2015/0133995 entitled "Hybrid Adjunct Materials for Use in Surgical Stapling" filed Nov. 8, 2013, U.S. Pat. No. 9,913,642 entitled "Surgical Instrument Comprising a Sensor System" filed Mar. 26, 2014, U.S. Pat. No. 10,172,611 entitled "Adjunct Materials and Methods of Using Same in Surgical Methods for Tissue Sealing" filed Jun. 10, 2014, U.S. Pat. No. 8,989,903 entitled "Methods And Systems For Indicating A Clamping Prediction" filed Jan. 13, 2012, U.S. Pat. No. 9,072,535 entitled "Surgical Stapling Instruments With Rotatable Staple Deployment Arrangements" filed May 27, 2011, U.S. Pat. No. 9,072,536 entitled "Differential Locking Arrangements For Rotary Powered Surgical Instruments" filed Jun. 28, 2012, U.S. Pat. No. 10,531,929 entitled "Control Of Robotic Arm Motion Based On Sensed Load On Cutting Tool" filed Aug. 16, 2016, U.S. Pat. No. 10,709,516 entitled "Curved Cannula Surgical System Control" filed Apr. 2, 2018, U.S. Pat. No. 11,076,926 entitled "Manual Release For Medical Device Drive System" filed Mar. 21, 2018, U.S. Pat. No. 9,839,487 entitled "Method For Engaging Surgical Instrument With Teleoperated Actuator" filed Mar. 17, 2015, U.S. Pat. No. 10,543,051 entitled "Method For Engaging Surgical Instrument With Teleoperated Actuator" issued Jan. 28, 2020, U.S. Pat. No. 9,804,618 entitled "Systems And Methods For Controlling A Segmented Circuit" filed Mar. 25, 2014, U.S. Pat. No. 11,607,239 entitled "Systems And Methods For Controlling A Surgical Stapling And Cutting Instrument" filed Apr. 15, 2016, U.S. Pat. No. 10,052,044 entitled "Time Dependent Evaluation Of Sensor Data To Determine Stability, Creep, And Viscoelastic Elements Of Measures" filed Mar. 6, 2015, U.S. Pat. No. 9,439,649 entitled "Surgical Instrument Having Force Feedback Capabilities" filed Dec. 12, 2012, U.S. Pat. No. 10,751,117 entitled "Electrosurgical Instrument With Fluid Diverter" filed Sep. 23, 2016, U.S. Pat. No. 11,160,602 entitled "Control Of Surgical Field Irrigation" filed Aug. 29, 2017, U.S. Pat. No. 9,877,783 entitled "Energy Delivery Systems And Uses Thereof" filed Dec. 30, 2016, U.S. Pat. No. 11,266,458 entitled "Cryosurgical System With Pressure Regulation" filed Apr. 19, 2019, U.S. Pat. No. 10,314,649 entitled "Flexible Expandable Electrode And Method Of Intraluminal Delivery Of Pulsed Power" filed Aug. 2, 2012, U.S. Pat. App. Pub. No.
2023/0116781 entitled “Surgical Devices, Systems, And Methods Using Multi-Source Imaging” filed Oct. 5, 2021, U.S. Pat. App. Pub. No. 2023/0102358 entitled “Surgical Devices, Systems, And Methods Using Fiducial Identification And Tracking” filed Oct. 5, 2021, U.S. Pat. No. 10,413,373 entitled “Robotic Visualization And Collision Avoidance” filed Aug. 16, 2016, U.S. Pat. App. Pub. No. 2023/0077141 entitled “Robotically Controlled Uterine Manipulator” filed Sep. 21, 2021, and U.S. Pat. App. Pub. No. 2022/0273309 entitled “Stapler Reload Detection And Identification” filed May 16, 2022, which are hereby incorporated by reference herein in their entireties.


As shown in FIG. 2, the surgical system 102 can be used to perform a surgical procedure on the patient who is lying down on the operating table 124 in the surgical operating room 135. A robotic system 134 may be used in the surgical procedure as a part of the surgical system 102. The robotic system 134 may include a surgeon's console 136, a patient side cart 132 (surgical robot), and a surgical robotic hub 133. The patient side cart 132 can manipulate at least one removably coupled surgical instrument 137 through a minimally invasive incision in the body of the patient while the surgeon views the surgical site via the surgeon's console 136. An image of the surgical site can be obtained by the imaging device 130, which can be manipulated by the patient side cart 132 to orient the imaging device 130. The surgical robotic hub 133 can be used to process the images of the surgical site for subsequent display to the surgeon via the surgeon's console 136.


Various embodiments of robotic systems and various embodiments of surgical instruments are described further in, for example, U.S. Pat. No. 11,559,307 entitled “Method Of Robotic Hub Communication, Detection, And Control” issued Jan. 24, 2023, which is hereby incorporated by reference in its entirety.


The imaging device 130 may include at least one image sensor and one or more optical components. Suitable image sensors may include, but are not limited to, Charge-Coupled Device (CCD) sensors and Complementary Metal-Oxide Semiconductor (CMOS) sensors.


The optical components of the imaging device 130 may include one or more illumination sources and/or one or more lenses. The one or more illumination sources may be directed to illuminate portions of the surgical field. The one or more image sensors may receive light reflected or refracted from the surgical field, including light reflected or refracted from tissue and/or surgical instruments. The illumination source(s) may be configured to radiate electromagnetic energy in the visible spectrum as well as the invisible spectrum. The visible spectrum, sometimes referred to as the “optical spectrum” or the “luminous spectrum,” is the portion of the electromagnetic spectrum that is visible to (e.g., can be detected by) the human eye and may be referred to as “visible light” or simply “light.” A typical human eye will respond to wavelengths in air that range from about 380 nm to about 750 nm.


The invisible spectrum (e.g., the non-luminous spectrum) is the portion of the electromagnetic spectrum that lies below and above the visible spectrum (i.e., wavelengths below about 380 nm and above about 750 nm). The invisible spectrum is not detectable by the human eye. Wavelengths greater than about 750 nm are longer than the red visible spectrum, and they become invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths less than about 380 nm are shorter than the violet spectrum, and they become invisible ultraviolet, x-ray, and gamma ray electromagnetic radiation.


In various aspects, the imaging device 130 is configured for use in a minimally invasive procedure. Examples of imaging devices include an arthroscope, angioscope, bronchoscope, choledochoscope, colonoscope, cystoscope, duodenoscope, enteroscope, esophagogastro-duodenoscope (gastroscope), endoscope, laryngoscope, nasopharyngoscope, nephroscope, sigmoidoscope, thoracoscope, and ureteroscope.


The imaging device 130 may employ multi-spectrum monitoring to discriminate topography and underlying structures. A multi-spectral image is one that captures image data within specific wavelength ranges across the electromagnetic spectrum. The wavelengths may be separated by filters or by the use of instruments that are sensitive to particular wavelengths, including light from frequencies beyond the visible light range, e.g., infrared (IR) and ultraviolet (UV). Spectral imaging can allow extraction of additional information that the human eye fails to capture with its receptors for red, green, and blue. Multi-spectrum monitoring can be a useful tool in relocating a surgical field after a surgical task is completed to perform one or more of the previously described tests on the treated tissue.


It is axiomatic that strict sterilization of the operating room and surgical equipment is required during any surgery. The strict hygiene and sterilization conditions required in a "surgical theater," e.g., an operating or treatment room, necessitate the highest possible sterility of all medical devices and equipment. Part of that sterilization process is the need to sterilize anything that comes in contact with the patient or penetrates the sterile field, including the imaging device 130 and its attachments and components. It will be appreciated that the sterile field may be considered a specified area, such as within a tray or on a sterile towel, that is considered free of microorganisms, or the sterile field may be considered an area immediately around a patient who has been prepared for a surgical procedure. The sterile field may include the scrubbed team members, who are properly attired, and all furniture and fixtures in the area.


Various embodiments of multi-spectral imaging are described further in, for example, U.S. Pat. No. 11,937,769 entitled “Method Of Hub Communication, Processing, Storage And Display” filed Dec. 4, 2018, which is hereby incorporated by reference in its entirety.


The wearable sensing system(s) 111 illustrated in FIG. 1 may include the one or more HCP sensing systems 120 as shown in FIG. 2. The one or more HCP sensing systems 120 may include sensing system(s) to monitor and detect a set of physical states and/or a set of physiological states of an HCP. An HCP may be a surgeon or one or more healthcare personnel assisting the surgeon or other healthcare service providers in general. The HCP sensing system 120 may send the measurement data associated with a set of biomarkers and data associated with a physical state of the surgeon to the surgical hub 106 for further processing. In an exemplary embodiment, an HCP sensing system 120 may measure a set of biomarkers to monitor the heart rate of an HCP. In an exemplary embodiment, an HCP sensing system 120 worn on a surgeon's wrist (e.g., a watch or a wristband) may use an accelerometer to detect hand motion and/or shakes and determine the magnitude and frequency of tremors.
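

As a rough sketch of how such a wrist-worn system might estimate tremors, the snippet below applies an FFT to simulated accelerometer samples to find the dominant tremor frequency; the sample rate, the 9 Hz tremor, and the noise level are invented for illustration.

```python
# Rough sketch of the wrist-worn tremor measurement; the sample rate, the
# simulated 9 Hz tremor, and the noise level are invented for illustration.
import numpy as np

fs = 100.0                                  # sample rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)             # two seconds of samples
accel = 0.05 * np.sin(2 * np.pi * 9.0 * t)  # simulated 9 Hz tremor component
accel += 0.01 * np.random.randn(t.size)     # sensor noise

# Dominant FFT bin (excluding DC) approximates the tremor frequency;
# physiological tremor sits roughly in the 4-12 Hz band.
spectrum = np.abs(np.fft.rfft(accel - accel.mean()))
freqs = np.fft.rfftfreq(accel.size, d=1.0 / fs)
peak_hz = freqs[np.argmax(spectrum[1:]) + 1]
print(f"dominant tremor frequency ~{peak_hz:.1f} Hz, "
      f"magnitude ~{accel.std():.3f} g")
```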


The environmental sensing system(s) 115 shown in FIG. 1 may send environmental information to the surgical hub 106. In an exemplary embodiment, the environmental sensing system(s) 115 may include a camera 121 for detecting hand/body position of an HCP. The environmental sensing system(s) 115 may include one or more microphones 122 for measuring ambient noise in the surgical theater. Other environmental sensing system(s) 115 may include one or more devices, for example, a thermometer to measure temperature, a hygrometer to measure humidity of the surroundings in the surgical theater, etc. The surgeon biomarkers may include one or more of the following: stress, heart rate, etc. The environmental measurements from the surgical theater may include ambient noise level associated with the surgeon or the patient, surgeon and/or staff movements, surgeon and/or staff attention level, etc. The surgical hub 106, alone or in communication with the cloud computing system 108, may use the surgeon biomarker measurement data and/or environmental sensing information to modify control algorithms of hand-held instruments or the averaging delay of a robotic interface, for example, to minimize tremors.


The surgical hub 106 may use the surgeon biomarker measurement data associated with an HCP to adaptively control the one or more surgical instruments 131. For example, the surgical hub 106 may send a control program to one of the one or more surgical instruments 131 to control the surgical instrument's actuators to limit or compensate for fatigue and use of fine motor skills. The surgical hub 106 may send the control program based on situational awareness and/or the context on importance or criticality of a task. The control program may instruct the instrument to alter operation to provide more control when control is needed.



FIG. 3 shows an embodiment of the surgical system 102 including the surgical hub 106. The surgical hub 106 may be paired with, via a modular control, the one or more wearable sensing systems 111, the one or more environmental sensing systems 115, the one or more human interface systems 112, the one or more robotic systems 113, and the one or more intelligent instruments 114. As in this illustrated embodiment, the surgical hub 106 may include a display 148, an imaging module 149, a generator module 150 (e.g., an energy generator), a communication module 156, a processor module 157, a storage array 158, and an operating-room mapping module 159. In certain aspects, as illustrated in FIG. 3, the surgical hub 106 further includes a smoke evacuation module 154 and/or a suction/irrigation module 155. The various modules and systems may be connected to the modular control either directly via a router or via the communication module 156. The operating theater devices may be coupled to cloud computing resources and data storage, e.g., to the cloud computing system 108, via the modular control. The human interface system(s) 112 may include a display sub-system and a notification sub-system.


The modular control may be coupled to a non-contact sensor module. The non-contact sensor module may measure the dimensions of the operating theater and generate a map of the surgical theater using ultrasonic, laser-based, and/or other like non-contact measurement devices. Other distance sensors can be employed to determine the bounds of an operating room. The sensor module may be configured to determine the size of the operating theater and to adjust Bluetooth-pairing distance limits accordingly. For example, a laser-based non-contact sensor module may scan the operating theater by transmitting laser light pulses, receiving laser light pulses that bounce off the perimeter walls of the operating theater, and comparing the phase of each transmitted pulse to that of the corresponding received pulse to determine the size of the operating theater.
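As a rough illustration of the phase-comparison ranging described above, the following minimal Python sketch converts a measured phase shift into a wall distance and derives a pairing distance limit; the function names, the modulation frequency parameter, and the safety margin are illustrative assumptions rather than part of the disclosed system.

```python
import math

SPEED_OF_LIGHT_M_S = 299_792_458.0

def distance_from_phase_shift(phase_shift_rad: float, modulation_hz: float) -> float:
    # The round-trip distance follows from the fraction of a modulation period
    # represented by the phase shift; halve it for the one-way wall distance.
    round_trip_m = (phase_shift_rad / (2.0 * math.pi)) * (SPEED_OF_LIGHT_M_S / modulation_hz)
    return round_trip_m / 2.0

def pairing_distance_limit_m(wall_distances_m: list[float], margin_m: float = 0.5) -> float:
    # Limit Bluetooth pairing to devices plausibly inside the mapped theater
    # (margin_m is an assumed tolerance).
    return max(wall_distances_m) + margin_m
```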


An ultrasound-based non-contact sensor module may scan the operating theater by transmitting a burst of ultrasound and receiving the echo when it bounces off the perimeter walls of an operating theater as described further in, for example, U.S. Pat. No. 11,857,152 entitled “Surgical Hub Spatial Awareness To Determine Devices In Operating Theater” issued Jan. 2, 2024, U.S. Pat. No. 11,278,281 entitled “Interactive Surgical Platform” issued Mar. 22, 2022, and U.S. Prov. Pat. App. No. 62/611,341 entitled “Interactive Surgical Platform” filed Dec. 28, 2017, which are hereby incorporated by reference herein in their entireties.


During a surgical procedure, energy application to tissue, for sealing and/or cutting, may be associated with smoke evacuation, suction of excess fluid, and/or irrigation of the tissue. Fluid, power, and/or data lines from different sources may be entangled during the surgical procedure. Valuable time can be lost addressing this issue during a surgical procedure. Detangling the lines may necessitate disconnecting the lines from their respective modules, which may require resetting the modules. A hub modular enclosure 160 of the surgical hub 106 may offer a unified environment for managing the power, data, and fluid lines, which reduces the frequency of entanglement between such lines.


Energy may be applied to tissue at a surgical site. The surgical hub 106 may include the hub modular enclosure 160 and a combo generator module slidably receivable in a docking station of the hub modular enclosure 160. The docking station may include data and power contacts. The combo generator module may include two or more of: an ultrasonic energy generator component, a bipolar radiofrequency (RF) energy generator component, or a monopolar RF energy generator component that are housed in a single unit. The combo generator module may also include at least one energy delivery cable for connecting the combo generator module to a surgical instrument, at least one smoke evacuation component configured to evacuate smoke, fluid, and/or particulates generated by the application of therapeutic energy to the tissue, and a fluid line extending from the remote surgical site to the smoke evacuation component. The fluid line may be a first fluid line, and a second fluid line may extend from the remote surgical site to the suction/irrigation module 155 slidably received in the hub modular enclosure 160. The hub modular enclosure 160 may include a fluid interface.


The combo generator module may generate multiple energy types for application to the tissue. One energy type may be more beneficial for cutting the tissue, while another different energy type may be more beneficial for sealing the tissue. For example, a bipolar generator can be used to seal the tissue while an ultrasonic generator can be used to cut the sealed tissue. In an exemplary embodiment, the hub modular enclosure 160 may be configured to accommodate different generators and facilitate an interactive communication therebetween. The hub modular enclosure 160 may enable the quick removal and/or replacement of various modules.


The hub modular enclosure 160 may include a first energy-generator module, configured to generate a first energy for application to the tissue, and a first docking station comprising a first docking port that includes first data and power contacts, wherein the first energy-generator module is slidably movable into an electrical engagement with the first power and data contacts and wherein the first energy-generator module is slidably movable out of the electrical engagement with the first power and data contacts. The hub modular enclosure 160 may include a second energy-generator module configured to generate a second energy, different than the first energy, for application to the tissue, and a second docking station comprising a second docking port that includes second data and power contacts, wherein the second energy-generator module is slidably movable into an electrical engagement with the second power and data contacts, and wherein the second energy-generator module is slidably movable out of the electrical engagement with the second power and data contacts. In addition, the hub modular enclosure 160 may include a communication bus between the first docking port and the second docking port, configured to facilitate communication between the first energy-generator module and the second energy-generator module.


As shown in FIG. 3, the hub modular enclosure 160 may allow the modular integration of the generator module 150, the smoke evacuation module 154, and the suction/irrigation module 155. The hub modular enclosure 160 may facilitate interactive communication between the operating-room mapping, smoke evacuation, and suction/irrigation modules 159, 154, 155. The generator module 150 can include integrated monopolar, bipolar, and ultrasonic components supported in a single housing unit that is slidably insertable into the hub modular enclosure 160. The generator module 150 may connect to a monopolar device 151, a bipolar device 152, and an ultrasonic device 153. The generator module 150 may include a series of monopolar, bipolar, and/or ultrasonic generator modules that interact through the hub modular enclosure 160. The hub modular enclosure 160 may facilitate the insertion of multiple generators and interactive communication between the generators docked into the hub modular enclosure 160 so that the generators act as a single generator.


A surgical data network having a set of communication hubs may connect the sensing system(s) and the modular devices, located in one or more operating theaters of a healthcare facility, a patient recovery room, or a room in a healthcare facility specially equipped for surgical operations, to the cloud computing system 108.



FIG. 4 illustrates one embodiment of a situationally aware surgical system 200. Data sources 202 of the situationally aware surgical system 200 may include, for example, modular devices 204, databases 206 (e.g., an electronic medical records (EMR) database, such as of a hospital or other medical facility, containing patient records, etc.), patient monitoring devices 208 (e.g., a blood pressure (BP) monitor, an electrocardiography (EKG) monitor, one or more wearable sensing systems 111, etc.), HCP monitoring devices 210 (e.g., one or more wearable sensing systems 111, etc.), and/or environment monitoring devices 212 (e.g., one or more environmental sensing systems 115, etc.).


The modular devices 204 may include sensors configured to detect parameters associated with a patient, HCPs, the environment, and/or the modular device 204 itself. The modular devices 204 may include the one or more intelligent instrument(s) 114.


The data sources 202 may be in communication (e.g., wirelessly or wired) with a surgical hub 214, such as the surgical hub 106. The surgical hub 214 may derive contextual information pertaining to a surgical procedure from the data based upon, for example, the particular combination(s) of data received from the data sources 202 or the particular order in which the data is received from the data sources 202. The contextual information inferred from the received data can include, for example, the type of surgical procedure being performed, the particular step of the surgical procedure that a surgeon (and/or other HCP) is performing, the type of tissue being operated on, or a body cavity that is the subject of the surgical procedure. This ability of some aspects of the surgical hub 214 to derive or infer information related to the surgical procedure from received data can be referred to as “situational awareness.” For example, the surgical hub 214 can incorporate a situational awareness system, which may be the hardware and/or programming associated with the surgical hub 214 that derives contextual information pertaining to the surgical procedure from the received data and/or surgical plan information received from the edge computing system 216 or an enterprise cloud server 218, such as the cloud computing system 108. The contextual information derived from the data sources 202 may include, for example, what step of the surgical procedure is being performed, whether and how a particular modular device 204 is being used, and the patient's condition.


The surgical hub 214 may be connected to the databases 206 of the data sources 202 to retrieve therefrom data regarding the surgical procedure that is being performed or is to be performed. The data that may be received by the situational awareness system of the surgical hub 214 from the databases 206 may include, for example, start (or setup) time or operational information regarding the procedure (e.g., a segmentectomy in the upper right portion of the thoracic cavity). The surgical hub 214 may derive contextual information regarding the surgical procedure from this data alone or from the combination of this data and other data from the data sources 202.


The surgical hub 214 may be connected to (e.g., paired with) the patient monitoring devices 208 of the data sources 202. Examples of the patient monitoring devices 208 that can be paired with the surgical hub 214 may include a pulse oximeter (SpO2 monitor), a blood pressure (BP) monitor, and an electrocardiogram (EKG) monitor. Perioperative data that is received by the situational awareness system of the surgical hub 214 from the patient monitoring devices 208 may include, for example, the patient's oxygen saturation, blood pressure, heart rate, and/or other physiological parameters. The contextual information that may be derived by the surgical hub 214 from the perioperative data transmitted by the patient monitoring devices 208 may include, for example, whether the patient is located in the operating theater or under anesthesia. The surgical hub 214 may derive these inferences from the data from the patient monitoring devices 208 alone or in combination with other data from the data sources 202, such as a ventilator and/or other data source.


The surgical hub 214 may be connected to (e.g., paired with) the modular devices 204. Examples of the modular devices 204 that are paired with the surgical hub 214 may include a smoke evacuator, a medical imaging device such as the imaging device 130 shown in FIG. 2, an insufflator, a combined energy generator (for powering an ultrasonic surgical instrument and/or an RF electrosurgical instrument), and a ventilator.


The perioperative data received by the surgical hub 214 may be from a medical imaging device and/or other device(s). The perioperative data received by the surgical hub 214 from the medical imaging device may include, for example, whether the medical imaging device is activated and image data. The contextual information that is derived by the surgical hub 214 from the perioperative data sent by the medical imaging device may include, for example, whether the procedure is a video-assisted thoracic surgery (VATS) procedure (based on whether the medical imaging device is activated or paired to the surgical hub 214 at the beginning or during the course of the procedure). The image data (e.g., still image and/or video image) from the medical imaging device (or the data stream representing the video for a digital medical imaging device) may be processed by a pattern recognition system or a machine learning system to recognize features (e.g., organs or tissue types) in the field of view (FOV) of the medical imaging device, for example. The contextual information that is derived by the surgical hub 214 from the recognized features may include, for example, what type of surgical procedure (or step thereof) is being performed, what organ is being operated on, or what body cavity is being operated in.


The situational awareness system of the surgical hub 214 may derive the contextual information from the data received from the data sources 202 in a variety of different ways. For example, the situational awareness system can include a pattern recognition system, or a machine learning system (e.g., an artificial neural network), that has been trained on training data to correlate various inputs (e.g., data from the databases 206, the patient monitoring devices 208, the modular devices 204, the HCP monitoring devices 210, and/or the environment monitoring devices 212) to corresponding contextual information regarding a surgical procedure. For example, a machine learning system may accurately derive contextual information regarding a surgical procedure from the provided inputs. In examples, the situational awareness system can include a lookup table storing pre-characterized contextual information regarding a surgical procedure in association with one or more inputs (or ranges of inputs) corresponding to the contextual information. In response to a query with one or more inputs, the lookup table can return the corresponding contextual information for the situational awareness system for controlling one or more of the modular devices 204. In examples, the contextual information received by the situational awareness system of the surgical hub 214 can be associated with a particular control adjustment or set of control adjustments for one or more of the modular devices 204. In examples, the situational awareness system can include a machine learning system, lookup table, or other such system, which may generate or retrieve one or more control adjustments for one or more of the modular devices 204 when provided the contextual information as input.
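For concreteness, a lookup-table flavor of the situational awareness described above might be sketched as follows; the input names, thresholds, and control adjustments are hypothetical placeholders rather than the disclosed system's actual tables.

```python
# Each row pairs a predicate over received inputs with pre-characterized
# contextual information and an associated set of control adjustments.
SITUATION_TABLE = [
    (lambda d: d.get("insufflation_active") and d.get("imaging_active"),
     "laparoscopic procedure in progress",
     {"smoke_evacuator": "auto", "energy_level": "laparoscopic_profile"}),
    (lambda d: d.get("spo2_pct", 100) < 90,
     "patient desaturation",
     {"ventilator_o2": "increase"}),
]

def derive_context(inputs: dict):
    # Return contextual information and control adjustments for the first
    # matching row, or a default when no row matches.
    for predicate, context, adjustments in SITUATION_TABLE:
        if predicate(inputs):
            return context, adjustments
    return "unknown", {}

context, adjustments = derive_context({"insufflation_active": True,
                                       "imaging_active": True})
```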


For example, based on data from the data sources 202, the surgical hub 214 may determine what type of tissue was being operated on. The surgical hub 214 can infer whether a surgical procedure being performed is a thoracic or an abdominal procedure, allowing the surgical hub 214 to determine whether the tissue clamped by an end effector of a surgical stapling and cutting instrument is lung tissue (for a thoracic procedure) or stomach tissue (for an abdominal procedure). The surgical hub 214 may determine whether the surgical site is under pressure (by determining that the surgical procedure is utilizing insufflation) and determine the procedure type, allowing a consistent amount of smoke evacuation to be provided for both thoracic and abdominal procedures. Based on data from the data sources 202, the surgical hub 214 may determine what step of the surgical procedure is being performed or will subsequently be performed.


The surgical hub 214 may determine what type of surgical procedure is being performed and customize an energy level according to an expected tissue profile for the surgical procedure. The situationally aware surgical hub 214 may adjust the energy level for an ultrasonic surgical instrument or RF electrosurgical instrument throughout the course of a surgical procedure, rather than just on a procedure-by-procedure basis.


In examples, data can be drawn from one or more data sources 202 to improve the conclusions that the surgical hub 214 draws from another one of the data sources 202. The surgical hub 214 may augment data that it receives from the modular devices 204 with contextual information that it has built up regarding the surgical procedure from the other data sources 202.


The situational awareness system of the surgical hub 214 can consider the physiological measurement data to provide additional context in analyzing the visualization data. The additional context can be useful when the visualization data may be inconclusive or incomplete on its own.


The surgical hub 214 may determine whether a surgeon (and/or other HCP(s)) was making an error or otherwise deviating from the expected course of action during the course of a surgical procedure. For example, the surgical hub 214 may determine a type of surgical procedure being performed, retrieve a corresponding list of steps or order of equipment usage (e.g., from a memory of the surgical hub 214 or other computer system), and compare the steps being performed or the equipment being used during the course of the surgical procedure to the expected steps or equipment for the type of surgical procedure that the surgical hub 214 determined is being performed. The surgical hub 214 can provide an alert indicating that an unexpected action is being performed or an unexpected device is being utilized at the particular step in the surgical procedure.
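A minimal sketch of this step-versus-expectation comparison appears below; the procedure name, step list, and device identifiers are invented for illustration only.

```python
# Hypothetical expected equipment order for one procedure type.
EXPECTED_EQUIPMENT = {
    "segmentectomy": ["insufflator", "imaging_device", "energy_generator", "stapler"],
}

def check_for_deviation(procedure_type: str, observed_equipment: list[str]) -> list[str]:
    # Compare equipment observed in use against the expected order for the
    # inferred procedure type and collect alerts for unexpected devices.
    expected = EXPECTED_EQUIPMENT.get(procedure_type, [])
    alerts = []
    for step_index, device in enumerate(observed_equipment):
        expected_device = expected[step_index] if step_index < len(expected) else None
        if device != expected_device:
            alerts.append(f"Alert: unexpected device {device!r} at step "
                          f"{step_index + 1} (expected {expected_device!r}).")
    return alerts
```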


The surgical instruments (and other modular devices 204) may be adjusted for the particular context of each surgical procedure (such as adjusting to different tissue types), and actions may be validated during the surgical procedure. Next steps, data, and display adjustments may be provided to the surgical instruments (and other modular devices 204) in the surgical theater according to the specific context of the surgical procedure.


Embodiments of situational awareness systems and using situational awareness systems during performance of a surgical procedure are described further in, for example, U.S. patent application Ser. No. 16/729,772 entitled “Analyzing Surgical Trends By A Surgical System” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,747 entitled “Dynamic Surgical Visualization Systems” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,744 entitled “Visualization Systems Using Structured Light” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,778 entitled “System And Method For Determining, Adjusting, And Managing Resection Margin About A Subject Tissue” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,729 entitled “Surgical Systems For Proposing And Corroborating Organ Portion Removals” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,778 entitled “Surgical System For Overlaying Surgical Instrument Data Onto A Virtual Three Dimensional Construct Of An Organ” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,751 entitled “Surgical Systems For Generating Three Dimensional Constructs Of Anatomical Organs And Coupling Identified Anatomical Structures Thereto” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,740 entitled “Surgical Systems Correlating Visualization Data And Powered Surgical Instrument Data” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,737 entitled “Adaptive Surgical System Control According To Surgical Smoke Cloud Characteristics” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,796 entitled “Adaptive Surgical System Control According To Surgical Smoke Particulate Characteristics” filed Dec. 30, 2019, U.S. patent application Ser. No. 16/729,803 entitled “Adaptive Visualization By A Surgical System” filed Dec. 30, 2019, and U.S. patent application Ser. No. 16/729,807 entitled “Method Of Using Imaging Devices In Surgery” filed Dec. 30, 2019, which are hereby incorporated by reference in their entireties.



FIG. 5 illustrates one embodiment of a surgical system 300 that may include a surgical instrument 302, such as the surgical instrument 114 of FIG. 1 or the surgical instrument 131 of FIG. 2. The surgical instrument 302 can be in communication with a console 304 and/or a portable device 306 through a local area network (LAN) 308 and/or a cloud network 310, such as the cloud computing system 108 of FIG. 1, via a wired and/or wireless connection. The console 304 and the portable device 306 may be any suitable computing device.


The surgical instrument 302 may include a handle 312, an adapter 314, and a loading unit 316. The adapter 314 releasably couples to the handle 312 and the loading unit 316 releasably couples to the adapter 314 such that the adapter 314 transmits a force from one or more drive shafts to the loading unit 316. The adapter 314 or the loading unit 316 may include a force gauge (not explicitly shown in FIG. 5) disposed therein to measure a force exerted on the loading unit 316. In some embodiments, the adapter 314 is non-releasably attached to the handle 312. In some embodiments, the adapter 314 and the loading unit 316 are integral and may be releasably attachable to the handle 312 or non-releasably attached to the handle 312.


The loading unit 316 may include an end effector 318 having a first jaw 320 and a second jaw 322. The loading unit 316 may be an in-situ loaded or multi-firing loading unit (MFLU) that allows a clinician to fire a plurality of fasteners (e.g., staples, clips, etc.) multiple times without requiring the loading unit 316 to be removed from a surgical site to reload the loading unit 316. The first and second jaws 320, 322 may be configured to clamp tissue therebetween, fire fasteners through the clamped tissue, and sever the clamped tissue. The first jaw 320 may be configured to fire at least one fastener a plurality of times or may be configured to include a replaceable multi-fire fastener cartridge including a plurality of fasteners that may be fired more than one time prior to being replaced. The second jaw 322 may include an anvil that deforms or otherwise secures the fasteners, as the fasteners are ejected from the multi-fire fastener cartridge.


The surgical instrument 302 may include a motor, such as at the handle 312, that is coupled to the one or more drive shafts to effect rotation of the one or more drive shafts. The surgical instrument 302 may include a control interface, such as at the handle 312, to selectively activate the motor. The control interface may include buttons, switches, levers, sliders, touchscreens, and/or any other suitable input mechanisms or user interfaces, which can be engaged by a clinician to activate the motor.


The control interface of the surgical instrument 302 may be in communication with a controller 324 (e.g., a microprocessor or other controller) of the surgical instrument 302, shown in the embodiment of FIG. 5 disposed in the handle 312, to selectively activate the motor to effect rotation of the one or more drive shafts. The controller 324 may be configured to receive input from the control interface, adapter data from the adapter 314, and loading unit data from the loading unit 316. The controller 324 may analyze the input from the control interface and the data received from the adapter 314 and/or the loading unit 316 to selectively activate the motor. The surgical instrument 302 may also include a display, such as at the handle 312, that is viewable by a clinician during use of the surgical instrument 302. The display may be configured to display portions of the adapter data and/or loading unit data before, during, or after firing of the surgical instrument 302.


The adapter 314 may include an adapter identification device 326 disposed therein, and the loading unit 316 may include a loading unit identification device 328 disposed therein. The adapter identification device 326 may be in communication with the controller 324, and the loading unit identification device 328 may be in communication with the controller 324. It will be appreciated that the loading unit identification device 328 may be in communication with the adapter identification device 326, which relays or passes communication from the loading unit identification device 328 to the controller 324. In embodiments in which the adapter 314 and the loading unit 316 are integral, one of the adapter identification device 326 and the loading unit identification device 328 may be omitted.


The adapter 314 may also include one or more sensors 330 disposed thereabout to detect various conditions of the adapter 314 or of the environment (e.g., if the adapter 314 is connected to a loading unit, if the adapter 314 is connected to a handle, if the one or more drive shafts are rotating, a torque of the one or more drive shafts, a strain of the one or more drive shafts, a temperature within the adapter 314, a number of firings of the adapter 314, a peak force of the adapter 314 during firing, a total amount of force applied to the adapter 314, a peak retraction force of the adapter 314, a number of pauses of the adapter 314 during firing, etc.). The one or more sensors 330 may provide an input to the adapter identification device 326 (or to the loading unit identification device 328 if the adapter identification device 326 is omitted) in the form of data signals. The data signals of the one or more sensors 330 may be stored within or be used to update the adapter data stored within the adapter identification device 326 (or the loading unit identification device 328 if the adapter identification device 326 is omitted). The data signals of the one or more sensors 330 may be analog or digital. The one or more sensors 330 may include, for example, a force gauge to measure a force exerted on the loading unit 316 during firing.


The handle 312 and the adapter 314 can be configured to interconnect the adapter identification device 326 and the loading unit identification device 328 with the controller 324 via an electrical interface. The electrical interface may be a direct electrical interface (e.g., include electrical contacts that engage one another to transmit energy and signals therebetween). Additionally, or alternatively, the electrical interface may be a non-contact electrical interface to wirelessly transmit energy and signals therebetween (e.g., via inductive transfer). It is also contemplated that the adapter identification device 326 and the controller 324 may be in wireless communication with one another via a wireless connection separate from the electrical interface.


The handle 312 may include a transceiver 332 that is configured to transmit instrument data from the controller 324 to one or more other components of the surgical system 300 (e.g., the LAN 308, the cloud 310, the console 304, and/or the portable device 306). The controller 324 may also transmit instrument data and/or measurement data associated with the one or more sensors 330 to a surgical hub, such as the surgical hub 106 of FIGS. 1-3 or the surgical hub 214 of FIG. 4. The transceiver 332 may receive data (e.g., cartridge data, loading unit data, adapter data, and/or other notifications) from the surgical hub. The transceiver 332 may receive data (e.g., cartridge data, loading unit data, or adapter data) from the other components of the surgical system 300. For example, the controller 324 may transmit surgical instrument data including a serial number of an attached adapter (e.g., the adapter 314) attached to the handle 312, a serial number of a loading unit (e.g., the loading unit 316) attached to the adapter 314, and a serial number of a multi-fire fastener cartridge loaded into the loading unit 316, e.g., into one of the jaws 320, 322 at the end effector 318, to the console 304. Thereafter, the console 304 may transmit data (e.g., cartridge data, loading unit data, and/or adapter data) associated with the attached cartridge, the loading unit 316, and the adapter 314, respectively, back to the controller 324. The controller 324 can display messages on the local instrument display or transmit the messages, via the transceiver 332, to the console 304 or the portable device 306 to display the messages on a display 334 or a device screen of the portable device 306, respectively.


Various exemplary embodiments of aspects of smart surgical systems, for example how smart surgical systems choose to interact with each other, are described further in, for example, U.S. patent application Ser. No. 18/810,323 entitled “Method For Multi-System Interaction” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,036 entitled “Adaptive Interaction Between Smart Healthcare Systems” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,082 entitled “Control Redirection And Image Porting Between Surgical Systems” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/809,890 entitled “Synchronized Motion Of Independent Surgical Devices” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,133 entitled “Synchronization Of The Operational Envelopes Of Independent Surgical Devices” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,170 entitled “Synchronized Motion Of Independent Surgical Devices To Maintain Relational Field Of Views” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,208 entitled “Alignment And Distortion Compensation Of Reference Planes Used By Surgical Devices” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,230 entitled “Shared Set Of Object Registrations For Surgical Devices Using Independent Reference Planes” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,266 entitled “Coordinated Control Of Therapeutic Treatment Effects” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,283 entitled “Functional Restriction Of A System Based On Information From Another Independent System” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/809,960 entitled “Inter-Connectivity Of Data Flows Between Independent Smart Systems” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,041 entitled “Inter-Connectivity Of Data Flows Between Independent Smart Systems” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,119 entitled “Processing And Display Of Tissue Tension” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,175 entitled “Situational Control Of Smart Surgical Devices” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,222 entitled “Method For Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,274 entitled “Visual Data-Based Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,346 entitled “Electrical Data-Based Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,355 entitled “Mechanical Data-Based Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,361 entitled “Multi-Sourced Data-Based Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, U.S. patent application Ser. No. 18/810,407 entitled “Conflict Resolution For Activation Mode Determination Of An Energy Device” filed Aug. 20, 2024, and U.S. patent application Ser. No. 18/810,419 entitled “Controlling Patient Monitoring Devices” filed Aug. 20, 2024, which are each hereby incorporated by reference in their entireties.


Operating Intelligent Surgical Instruments

An intelligent surgical instrument, such as the surgical instrument 114 of FIG. 1, the surgical instrument 131 of FIG. 2, or the surgical instrument 302 of FIG. 5, can have an algorithm stored thereon, e.g., in a memory thereof, configured to be executable on board the intelligent surgical instrument, e.g., by a processor thereof, to control operation of the intelligent surgical instrument. In some embodiments, instead of or in addition to being stored on the intelligent surgical instrument, the algorithm can be stored on a surgical hub, e.g., in a memory thereof, that is configured to communicate with the intelligent surgical instrument.


The algorithm may be stored in the form of one or more sets of pluralities of data points defining and/or representing instructions, notifications, signals, etc. to control functions of the intelligent surgical instrument. In some embodiments, data gathered by the intelligent surgical instrument can be used by the intelligent surgical instrument, e.g., by a processor of the intelligent surgical instrument, to change at least one variable parameter of the algorithm. As discussed above, a surgical hub can be in communication with an intelligent surgical instrument, so data gathered by the intelligent surgical instrument can be communicated to the surgical hub and/or data gathered by another device in communication with the surgical hub can be communicated to the surgical hub, and data can be communicated from the surgical hub to the intelligent surgical instrument. Thus, instead of or in addition to the intelligent surgical instrument being configured to change a stored variable parameter, the surgical hub can be configured to communicate the changed at least one variable, alone or as part of the algorithm, to the intelligent surgical instrument and/or the surgical hub can communicate an instruction to the intelligent surgical instrument to change the at least one variable as determined by the surgical hub.


The at least one variable parameter may be among the algorithm's data points, e.g., may be included in instructions for operating the intelligent surgical instrument, and is thus able to be changed by changing one or more of the stored pluralities of data points of the algorithm. After the at least one variable parameter has been changed, subsequent execution of the algorithm can be according to the changed algorithm. As such, operation of the intelligent surgical instrument over time can be managed for a patient to increase the beneficial results of use of the intelligent surgical instrument by taking into consideration actual situations of the patient and actual conditions and/or results of the surgical procedure in which the intelligent surgical instrument is being used. Changing the at least one variable parameter may be automated to improve patient outcomes. Thus, the intelligent surgical instrument can be configured to provide personalized medicine based on the patient and the patient's surrounding conditions to provide a smart system. In a surgical setting in which the intelligent surgical instrument is being used during performance of a surgical procedure, automated changing of the at least one variable parameter may allow for the intelligent surgical instrument to be controlled based on data gathered during the performance of the surgical procedure, which may help ensure that the intelligent surgical instrument is used efficiently and correctly and/or may help reduce chances of patient harm, such as harm from damaging a critical anatomical structure.


The at least one variable parameter can be any of a variety of different operational parameters. Examples of variable parameters include motor speed, motor torque, energy level, energy application duration, tissue compression rate, jaw closure rate, cutting element speed, load threshold, etc.
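One plausible shape for such a changeable parameter set is sketched in Python below, under the assumption that the hub pushes parameter updates as simple key/value pairs; the parameter names and default values are invented for illustration.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class ControlAlgorithmParameters:
    # Illustrative variable parameters drawn from the examples above.
    motor_speed_rpm: float = 1200.0
    energy_level_w: float = 35.0
    jaw_closure_rate_mm_s: float = 2.0
    load_threshold_n: float = 60.0

def apply_hub_update(current: ControlAlgorithmParameters,
                     update: dict) -> ControlAlgorithmParameters:
    # Return a new parameter set with hub-supplied changes applied; the
    # algorithm's subsequent executions use the changed values.
    return replace(current, **update)

# Example: the hub lowers the load threshold for the remainder of the case.
params = apply_hub_update(ControlAlgorithmParameters(),
                          {"load_threshold_n": 45.0})
```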


Various embodiments of operating surgical instruments are described further in, for example, U.S. Pat. App. Pub. No. 2023/0096268 entitled “Methods for Controlling Cooperative Surgical Instruments” filed Oct. 5, 2021, U.S. Pat. App. Pub. No. 2023/0097906 entitled “Surgical Methods Using Multi-Source Imaging” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0095002 entitled “Surgical Methods Using Fiducial Identification And Tracking” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0101750 entitled “Surgical Methods For Control Of One Visualization With Another” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0100698 entitled “Methods For Controlling Cooperative Surgical Instruments” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0103005 entitled “Methods for Controlling Cooperative Surgical Instruments” published Mar. 30, 2023, and U.S. Pat. App. Pub. No. 2023/0098538 entitled “Cooperative Access Hybrid Procedures” published Mar. 30, 2023, which are hereby incorporated by reference in their entireties.


Data Pipelines

As discussed herein, data may be transmitted from one point to another point, such as during a performance of a surgical procedure on a patient. The data may be transmitted from a source system to a destination system using a data pipeline.


As shown in FIG. 6A, a data pipeline 400 may move data from a source 402 to a destination 404, each of which may be physical or virtual (transient). In some data pipelines, the destination 404 may be called a “sink” or a “target.” Any time data is processed between point A and point B (or between multiple points such as points B, C, and D), there is a data pipeline 400 between those points. In general, the data pipeline 400 can include a set of tools and processes, which may be referred to as “steps” or “processing steps,” used to automate the movement and transformation of data between the source 402 and the destination 404.


In some embodiments, the source 402 and the destination 404 are two different elements, such as a first element of a surgical system and a second element of a surgical system. The data from the source 402 may or may not be modified by the data pipeline 400 before being received at the destination 404. For example, the source 402 may be one of the wearable sensing system(s) 111, the environmental sensing system(s) 115, the robotic system(s) 113, the intelligent instrument(s) 114, or the human interface system(s) 112 of the surgical system 102 of FIGS. 1 and 3 and the destination 404 may be the surgical hub 106 of the surgical system 102 of FIGS. 1 and 3. For another example, the source 402 may be one of the wearable sensing system(s) 111, the environmental sensing system(s) 115, the intelligent instrument(s) 114, or the human interface system(s) 112 of one of the surgical systems 102, 103, 104 of FIG. 1 and the destination 404 may be the surgical hub 106 of another one of the surgical systems 102, 103, 104 of FIG. 1. For yet another example, the source 402 may be one of the wearable sensing system(s) 111, the environmental sensing system(s) 115, the robotic system(s) 113, the intelligent instrument(s) 114, or the human interface system(s) 112 of the surgical system 102 of FIGS. 1 and 3 and the destination 404 may be another one of the wearable sensing system(s) 111, the environmental sensing system(s) 115, the robotic system(s) 113, the intelligent instrument(s) 114, or the human interface system(s) 112 of the surgical system 102 of FIGS. 1 and 3. For still another example, the source 402 may be one of the surgical systems 102, 103, 104 of FIG. 1 and the destination 404 may be another one of the surgical systems 102, 103, 104 of FIG. 1.


In some embodiments, the source 402 and the destination 404 are the same element. The data pipeline 400 may thus be purely about modifying the data set between the source 402 and the destination 404.


As shown in FIG. 6B, the data pipeline 400 may include one or more data connectors 406 that extract data from the source 402 and load the extracted data into the destination 404. A plural “N” number of data connectors 406 are shown in FIG. 6B. In some embodiments, such as embodiments in which extract, transform, and load (ETL) processing of data is performed, as opposed to extract, load, and transform (ELT) processing of data, data may be transformed within the data pipeline 400 before the data is received by the destination 404. In other embodiments, such as embodiments in which ELT processing of data is performed, as opposed to ETL processing of data, the one or more data connectors 406 may simply load raw data to the destination 404. In some instances, light transformations may be applied to the data, such as normalizing and cleaning data or orchestrating transformations into models for analysts, before the destination 404 receives the data.
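The ETL-versus-ELT distinction above can be captured in a few lines; the sketch below is illustrative only, with the extract/transform/load callables standing in for whatever connectors a given pipeline actually uses.

```python
def etl_connector(extract, transform, load):
    # ETL: data is transformed within the pipeline before the destination
    # receives it.
    load(transform(extract()))

def elt_connector(extract, load, transform_at_destination):
    # ELT: raw data is loaded first; transformation happens at (or after)
    # the destination.
    raw = extract()
    load(raw)
    transform_at_destination(raw)

def light_transform(record: dict) -> dict:
    # Example light transformation: normalize keys and drop empty fields.
    return {key.strip().lower(): value
            for key, value in record.items() if value is not None}
```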


The data pipeline 400 can include physical elements like one or more wires or can include digital elements like one or more packets, network traffic, or internal processor paths/connections. Flexible data pipelines are portions of the overall system where redundant paths can be utilized: data, e.g., a data stream, can be sent down one path or parsed between multiple parallel paths (to increase capacity), and these multiple paths can be flexibly adjusted by the system to accommodate changes in the volume and details of the data streams as necessary.
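The following sketch illustrates one way such flexible parsing between parallel paths could behave, with a per-path capacity used as the trigger for activating a redundant path; the capacity model and path names are assumptions for illustration.

```python
def split_stream(packets, paths, capacity_per_path):
    # Assign packets of one data stream to the least-loaded active path,
    # activating additional redundant paths only when the active ones are
    # at capacity.
    assignments = {p: [] for p in paths}
    active = [paths[0]]
    for packet in packets:
        target = min(active, key=lambda p: len(assignments[p]))
        if len(assignments[target]) >= capacity_per_path and len(active) < len(paths):
            active.append(paths[len(active)])  # flexibly bring up a redundant path
            target = active[-1]
        assignments[target].append(packet)
    return assignments

routes = split_stream(range(5), ["path_a", "path_b", "path_c"], capacity_per_path=2)
```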


The data pipeline 400 can have a small code base that serves a very specific purpose. These types of applications are called microservices.


The data pipeline 400 can be a big data pipeline. There are five characteristics of big data: volume, variety, velocity, veracity, and value. Big data pipelines are data pipelines built to accommodate more than one of the five characteristics of big data. The velocity of big data makes it appealing to build streaming data pipelines for big data, so that data can be captured and processed in real time and some action can then occur. The volume of big data requires that data pipelines be scalable, as the volume can vary over time. In practice, there are likely to be many big data events that occur simultaneously or very close together, so the big data pipeline must be able to scale to process significant volumes of data concurrently. The variety of big data requires that big data pipelines be able to recognize and process data in many different formats: structured, unstructured, and semi-structured.


In general, an architecture design of a data pipeline, e.g., the data pipeline 400 of FIGS. 6A and 6B, can include interconnectivity between a first smart device and a second smart device, e.g., the source 402 and the destination 404 of FIGS. 6A and 6B. Data generated in one source system (e.g., the first smart device or the second smart device) may feed multiple data pipelines, which may have multiple other data pipelines dependent on their outputs.


The interconnectivity between the first smart device and the second smart device may be on a common/shared network, e.g., LAN, Wi-Fi, powerline networking, MoCA networking, cellular (e.g., 4G, 5G, etc.), low power wide area network (LPWAN), Zigbee, Z-Wave, etc.


The interconnectivity between the first smart device and the second smart device may be on a structured network. Traditionally, structured peer-to-peer (P2P) networks implement a distributed hash table (DHT). In order to route traffic efficiently through the network, nodes in a structured overlay must maintain lists of neighbors that satisfy specific criteria. This makes them less robust in networks with a high rate of churn (e.g., with large numbers of nodes frequently joining and leaving the network). DHT-based solutions may have a high cost of advertising/discovering resources and may have static and dynamic load imbalance.
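As a toy illustration of DHT-style resource ownership (not the disclosed networking stack), the ring lookup below shows why churn is costly: when nodes join or leave, the ring and every node's neighbor list must be recomputed. The node and key names are hypothetical.

```python
import hashlib
from bisect import bisect_right

def _node_hash(name: str) -> int:
    return int(hashlib.sha1(name.encode()).hexdigest(), 16)

class ToyDHT:
    # Each resource key is owned by the first node whose hash follows the
    # key's hash around a ring.
    def __init__(self, nodes: list[str]):
        self._ring = sorted((_node_hash(n), n) for n in nodes)

    def owner(self, resource_key: str) -> str:
        key_hash = _node_hash(resource_key)
        hashes = [h for h, _ in self._ring]
        index = bisect_right(hashes, key_hash) % len(self._ring)
        return self._ring[index][1]

dht = ToyDHT(["hub", "stapler", "generator"])
owner = dht.owner("tissue_properties_feed")
```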


The interconnectivity between the first smart device and the second smart device may be via cooperative networking. Cooperative networking utilizes a system that is a hybrid of a P2P network and a server-client network architecture, offloading serving to peers who have recently established direct interchanges of content.


The interconnectivity between the first smart device and the second smart device may be exclusive. For example, the interconnectivity may be exclusive via Bluetooth. For another example, the interconnectivity may be exclusive via network isolation, such as by using path isolation, a virtual private network (VPN), or a secure access service edge (SASE). The path isolation may include a software-defined wide area network (SD-WAN). SD-WANs rely on software and a centralized control function that can steer traffic across a WAN in a smarter way by handling traffic based on priority, security, and quality of service requirements. The VPN may involve creation of an independent secure network using common/shared open networks. Another network (a carrier network) is used to carry the data, which is encrypted. The carrier network will see packets of the data, which it routes. To users of the VPN, it will look like the systems are directly connected to each other.


For example, with interconnectivity between the first smart device and the second smart device being exclusive in a surgical context, an operating room (OR) may have a surgical hub and an established network from a first vendor. In order to secure against hacking or data leakage, the network may be an encrypted common network for which the first vendor supplies the keys. A surgical stapler in the OR may be from a second vendor that is different from the first vendor and that does not have the keys from the first vendor. The surgical stapler may want to link to other device(s) it relies on for functionality without risking data leakage. An advanced energy generator from the second vendor, with an accompanying smoke evacuator, may also be in the OR, and the generator and evacuator may form their own private network, such as by piggybacking on the first vendor's network to create a second encrypted VPN routed through the first vendor's network as a primary network, or by forming an independent wireless network for bi-directional communication between the advanced energy generator and the smoke evacuator. The surgical stapler may want to communicate with the advanced energy generator, e.g., so the surgical stapler may retrieve updated software from the advanced energy generator, receive tissue properties information from the advanced energy generator, log data for exportation, and receive energy from the advanced energy generator to apply to tissue, but may not want to communicate with the smoke evacuator, e.g., because the surgical stapler performs no smoke evacuation. The surgical stapler and a communication backplane of the advanced energy generator may therefore form an isolated network, with only the surgical stapler (first smart device) and the advanced energy generator (second smart device) able to communicate via the isolated network and with the surgical hub able to manage the data pipeline between the surgical stapler and the advanced energy generator.


In general, one or more steps may be performed along a data pipeline, e.g., the data pipeline 400 of FIGS. 6A and 6B. The steps in the data pipeline may include data transformation, data augmentation, data enrichment, data filtering, data grouping, data aggregating, and running algorithms against the data.
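A compact sketch of the filtering, grouping, and aggregating steps named above follows; the record fields and the choice of aggregation are illustrative assumptions.

```python
from itertools import groupby
from statistics import mean

def run_pipeline(records: list[dict]) -> dict:
    # Data filtering: drop records that carry no reading.
    valid = [r for r in records if r.get("value") is not None]
    # Data grouping: bucket readings by procedure sub-step.
    keyed = sorted(valid, key=lambda r: r["sub_step"])
    # Data aggregating: reduce each bucket to a summary value.
    return {step: mean(r["value"] for r in rows)
            for step, rows in groupby(keyed, key=lambda r: r["sub_step"])}
```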


The data aggregation may include segmentation of data into buckets (e.g., decomposition of a procedure into sub-steps), data fusion and interfacing, and mixing real-time data streams with archived data streams. Various embodiments of data aggregation are described further in, for example, U.S. Pat. App. Pub. No. 2022/0104896 entitled “Interactive Information Overlay On Multiple Surgical Displays” published Apr. 7, 2022, U.S. Pat. App. Pub. No. 2019/0206564 entitled “Method For Facility Data Collection And Interpretation” published Jul. 4, 2019, and U.S. Pat. App. Pub. No. 2024/0221937 entitled “Method For Advanced Algorithm Support” published Jul. 4, 2024, which are hereby incorporated by reference in their entireties.


In one embodiment, mixing real-time data streams with archived data streams may include, in a surgical context, pre-operative data/imaging evaluation. The evaluation may include displaying of static preoperative scan(s), overlaying of video with aligned 3D model, and registering a virtual view to a camera view.


In one embodiment, the display of a static pre-operative scan may include alignment based on the surgeon's (or other HCP's) position, for example, where the surgeon (or other HCP) is standing.


In one embodiment, registering the virtual view to the camera view may include identifying organs in a video and triangulating with the camera location and/or getting the camera location in reference to a coordinate system. For example, during performance of a surgical procedure, a camera location may be acquired with respect to a trocar by 3D tracking of the trocar, by camera insertion in the trocar (e.g., insertion depth and/or insertion angle), and/or by determination of which trocar is being used for the camera. An example of an insertion depth indicator is a marking on a shaft of the trocar, such as a graphical scale or a color gradient. Examples of insertion angle indicators in a trocar are 3D trocar orientation and 3D angle of attack.


In one embodiment, the pre-operative data/imaging evaluation may include using a machine learning (ML) algorithm to review preoperative scans of a patient to identify any abnormalities. A cloud-based source may be used for augmented reality (AR) using cloud-based data for surgical procedure planning.


In one embodiment, an ML algorithm may be used in an initial planning stage, e.g., initial planning for a surgical procedure to be performed on a patient. Preoperative scans may be used to facilitate surgical path planning. If the initial scans detect anomalies or diseased tissues, as analyzed by the ML algorithm, the anomalies or diseased tissues may be relayed to the surgeon for the upcoming surgical procedure, and a new surgical task order may be suggested based on how previous surgeons handled these problems. The information relayed to the surgeon may also include a recommended inventory list to have on hand based on this initial improved surgical task order.


For example, during preoperative scans for a sleeve gastrectomy, a small hernia may be discovered. This hernia may be highlighted during the surgical planning step, and the surgeon may be asked whether the surgeon wants to include a hernia repair in the initial sleeve gastrectomy plan. If the surgeon answers affirmatively, the hernia repair will be added into the surgical task order, and the overall inventory for this case will be updated to include relevant items for the added hernia repair. During performance of the sleeve gastrectomy, an ML algorithm may be used to detect diseased tissue or surgical anomalies. If a diseased tissue is discovered, the diseased tissue may be highlighted on a screen, e.g., on a HID, and a cutting path/angle may be recommended to avoid the tissue or make the tissue state more manageable. These recommendations may be based on how surgeons previously and successfully handled these situations. If a surgical anomaly is discovered, the system may either automatically update the task order or require the surgeon to give a verbal command (or other command) to update the task order, and the system may highlight the required additional inventory on the circulator's screen. For foreign bodies (such as bougies) that may be discovered, the foreign body may be highlighted on the screen, and a cutting path may be included to provide an ample margin around the foreign body, assuming the foreign body is anticipated. If the foreign body is not anticipated, the foreign body may be highlighted to draw the surgeon's (and/or other HCP's) attention to it.


In one embodiment, the pre-operative data/imaging evaluation may include a cloud comparison of scans taken periodically over time, with anatomic changes over that time indicating possible operative complications. A cloud-based source may be used for augmented reality (AR) using preoperative scans to enhance return surgeries.


Looking at a difference between current and previous surgical scans may help inform the surgeon and/or other HCP and improve patient outcomes. This information can be used in various ways, for example for disease detection, informing surgical task planning, and/or informing previous surgical success and healing.


With respect to disease detection, current and historical scans can be used to determine if various disease states or abnormalities have evolved between surgeries. One case where this could be particularly useful is cancer detection. If a scan initially picks up an abnormal growth for a patient, and the patient's HCP decides that it is benign but flags it for caution, a follow-up scan may confirm whether or not the abnormality is benign. The scan may also automatically highlight areas of concern (e.g., tissue growth) that were not flagged by the HCP initially.


With respect to informing surgical task planning, information about previous surgeries (e.g., potential areas of scar tissue, previously seen difficult tissue, etc.) can help facilitate surgical step and path planning. This information can also be used during the surgery to display areas of scarring, changes of tissue from previous surgeries that might need to be examined, foreign bodies, and/or new adhesions.


With respect to informing previous surgical success and healing, data from various scans over time can be used to determine how successfully patients were recovering or had recovered from previous surgeries. This information may be used by surgeons (and/or other HCPs) to help plan future procedures, assess previous work, and/or facilitate quicker patient recovery.


Data development may be performed as a step in a data pipeline and may include one or more of data modeling, database layout and configuration, implementation, data mapping, and correction.


Various data may be communicated using a data pipeline, e.g., the data pipeline 400 of FIGS. 6A and 6B, such as data from a local data source, data from a remote data source, and synthetically generated data.


Data from a local data source may include data collected by, used by, or resulting from the operation of aspects of the local data source, e.g., data gathered using a sensor (e.g., temperature data gathered using a temperature sensor, force data gathered using a force sensor, pressure measured using a pressure sensor, etc.), still and/or video image data (e.g., data gathered by a camera, etc.), operational parameters of a surgical instrument (e.g., energy level, energy type, motor current, cutting element speed, etc.), surgical instrument identification (e.g., instrument type, instrument serial number, etc.), etc.


Data from a local data source may have metadata, which may reflect aspects of a data stream, a device configuration, and/or system behavior that define information about the data. For example, metadata may include an auxiliary data location that is shared by two interconnected systems, e.g., first and second robotic systems, etc., to create a single “brain” instead of two distinct ones. Each of the interconnected systems may create a copy of its memory system and introduce it to the combined system or “collective.” Both of the interconnected systems may then use this new area for data exchange, for uni-directional communication, and to directly command control systems. The new combined system may become primary, and the individual robotic systems' memory areas may become secondary memory areas until the systems are “unpaired,” e.g., are no longer interconnected.


Data from a local data source may include a data stream that is monitored by at least one other system. In this way, data collected by, used by, or resulting from the operation of aspects of one system may be sourced to another system (the monitoring system).


In one embodiment, application programming interfaces (APIs) may be used to communicate data from a local source.


In one embodiment, data may be communicated from a local source in response to occurrence of a trigger event. In one embodiment, the trigger event is a digital trigger event. For example, in a surgical context, the trigger event may be a surgical instrument changing orientation after being in a predetermined static position, such as when the surgical instrument “wakes up.” For another example, in a surgical context, the trigger event may be a system's power or signal interruption, e.g., communicating data after the interruption has been resolved. For yet another example, in a surgical context, the trigger event may be a change in system status, capacity, or connectivity. For still another example, in a surgical context, the trigger event may relate to quality, calibrations, or conversion factors of a surgical instrument.
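One way to picture trigger-event-driven communication is the orientation-change (“wake up”) example below; the tilt threshold, polling interval, and the two callback hooks are illustrative assumptions rather than the disclosed instrument's interfaces.

```python
import time

WAKE_TILT_THRESHOLD_DEG = 15.0   # assumed wake-up threshold
POLL_INTERVAL_S = 0.1            # assumed polling interval

def monitor_orientation(read_orientation_deg, send_buffered_data):
    # Poll the instrument's orientation; a change beyond the threshold after
    # a predetermined static position is treated as the trigger event
    # ("wake up"), causing buffered data to be communicated.
    baseline = read_orientation_deg()
    while True:
        current = read_orientation_deg()
        if abs(current - baseline) > WAKE_TILT_THRESHOLD_DEG:
            send_buffered_data()   # trigger event: communicate data
            baseline = current
        time.sleep(POLL_INTERVAL_S)
```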


Data from a remote data source may include data collected by, used by, or resulting from the operation of aspects of the remote data source. One example of a remote data source includes a database, such as a relational database or a NoSQL database. Examples of data in a relational database relevant in a surgical context can include inventory, sterile services process status, billing codes, patient records (PHI), and previous procedure data.


One example of data from a remote data source, in a surgical context, includes a procedure plan, e.g., a plan for a surgical procedure to be performed on a patient. The procedure plan data can include, for example, instrument selection, port placement, adjuncts needed for devices, OR timing and local imaging needs, procedural steps, staff number and skill composition, and patient positioning.


Another example of data from a remote data source, in a surgical context, includes pre-operative imaging, such as a CT full body scan, external ultrasound, MRI, etc.


Another example of data from a remote data source includes software parameter updates, such as software parameter updates streaming from a cloud computing system. The software parameter updates can include, for example, original equipment manufacturer (OEM) updates to a device's operational aspects, e.g., updated basic input/output system (BIOS) controls, calibrations, updates on capabilities (e.g., recalls, limits/expansion of use, indications, contra-indications, etc.), etc.


Another example of data from a remote data source includes gold standard of care or outcomes improvement data, such as gold standard of care or outcomes improvement data from a cloud computing system. Gold standard of care or outcomes improvement data can include, for example, improved techniques of device use and/or device combinations.


In one embodiment, Apache® Hadoop®, which is an open source software framework, may be used for distributed processing of data across computer systems.


Examples of types of synthetically generated data may include synthetic text, media (e.g., video, image, sound, etc.), tabular data, and a calculated continuous stream of data. The calculated continuous stream of data may be randomly generated (bracketed by extreme thresholds) or may be based on another stream, e.g., a real continuous data stream that is modified to fit the stream limits of the expected synthetic stream. Reasons for using synthetically generated data can include for training data streams; because of missing data from an expected system that would otherwise draw a device error but is not relevant to the operation of the device or other dependent device; for data streams designed to verify the operation of the transforms or mathematic algorithms; for data streams intended to either verify security or prevent fraud/inauthenticity; for consecutive timing data for redaction of real-time from the relational data of the systems; for creation of trending data for replacement of legal-compliance-regulated data streams (e.g., producing datasets from partially synthetic data, where only a selection of the dataset is replaced with synthetic data); and/or for a sudden but anticipatable/explainable change in a data source's feed which is being used as a closed loop control for a destination.
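As a non-limiting illustration of the calculated continuous stream, the following Python sketch shows both variants described above: a randomly generated stream bracketed by extreme thresholds, and a stream derived from a real stream and modified to fit the expected limits. All names and values are hypothetical.

import random
from typing import Iterable, Iterator


def random_stream(low: float, high: float, n: int, seed: int = 0) -> Iterator[float]:
    """Randomly generated stream, bracketed by extreme thresholds."""
    rng = random.Random(seed)
    for _ in range(n):
        yield rng.uniform(low, high)


def derived_stream(real: Iterable[float], low: float, high: float) -> Iterator[float]:
    """Stream based on a real continuous stream, modified to fit the limits."""
    for value in real:
        yield min(max(value, low), high)


print(list(random_stream(36.0, 38.0, 3)))
print(list(derived_stream([35.1, 36.8, 39.4], 36.0, 38.0)))  # [36.0, 36.8, 38.0]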


As an example of use of synthetically generated data in a surgical context in which a surgical procedure is being performed on a patient, a PO2 sensor (data source) on the patient's finger may be being used as a means for controlling a closed loop feed of O2 through a ventilator (data destination). The ventilator also has an internal closed loop on CO2 outlet concentration, but since O2 blood saturation has the desired fixed relationship to O2 supplementation level, the ventilator is using the PO2 sensor from the patient monitoring system. There may be an abrupt change in the O2 level as measured by the PO2 sensor. The ventilator has two choices: either switch to the trailing indicator of CO2, which has not had an abrupt change, or examine other data sources to try to explain the O2 shift. When compared to the patient's core body temperature measure, it may be discovered that the patient's temperature has dropped across a threshold 1.5° C. below normal that usually induces vasoconstriction limiting blood flow to the body's extremities. The PO2 measure, by its metadata, is known by the ventilator to be a finger monitor and therefore on an extremity. Further comparison over time may show the O2 measure fairly constant before the shift and then fairly constant after the shift as well, reinforcing the idea that vasoconstriction induced the shift. The ventilator may then create a synthetic data stream based on the shift data pattern and behavior that compensates for the vasoconstriction shift so the ventilator can continue on the primary linked feeds but using a modified synthetic or “calculated” data stream based on a real stream. For example, current body temperature control systems, such as a Bair Hugger™ device, are open-loop user settable heat gradient controlled systems but are affected by local temperature and environment.
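A minimal sketch of that compensation idea follows, assuming a hypothetical shift threshold and a deliberately simple step-detection rule; an actual ventilator control loop would involve far more validation before substituting a calculated stream.

def compensated_stream(po2_values, shift_threshold=5.0):
    """Fold an abrupt, explainable step shift into a running offset so a
    downstream closed loop sees a continuous "calculated" stream."""
    offset = 0.0
    previous = None
    for value in po2_values:
        if previous is not None and abs(value - previous) > shift_threshold:
            offset += previous - value  # compensate for the vasoconstriction shift
        previous = value
        yield value + offset


print(list(compensated_stream([97, 96, 97, 88, 89, 88])))
# -> [97.0, 96.0, 97.0, 97.0, 98.0, 97.0]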


Geofences

In various aspects, the present disclosure provides methods, devices, and systems for geofencing. The geofencing may help determine what sources of data a system could use, e.g., determine one or more sources (e.g., the source 402 of FIGS. 6A and 6B) for a destination (e.g., the destination 404 of FIGS. 6A and 6B). The system, e.g., the destination, may thus have access to all available data, facilitating the most effective decision-making.


In general, a geofence (also referred to herein as a “fence”) is a virtual geographical boundary that may be defined by Global Positioning System (GPS) technology or by Radio-Frequency Identification (RFID) technology. In one embodiment, the methods, devices, and systems for geofencing described herein may include determination, creation, and utilization of a geofence surrounding a common surgical area.


As discussed above, various data may be communicated using a data pipeline, e.g., the data pipeline 400 of FIGS. 6A and 6B, such as data from a local data source. For example, an operating room (OR) may define a “local” site, with each possible data source in the OR being a local data source. In other words, an OR may define a common surgical area for a geofence.


For another example, a surgical site may define a “local” site, with each possible data source located at the surgical site being a local data source. In other words, a surgical site may define a common surgical area for a geofence.


Different types of geofences may be established, e.g., by geofence software being executed on a surgical hub, a cloud-based server, or other computer system. In one embodiment, a user may manually define a geofence around the boundary of a desired area, e.g., an OR, a surgical site, etc. In one embodiment, a geofence may be created automatically, e.g., by the geofence software, based on identifying different regions in a medical environment, e.g., an OR, a surgical site, etc. For example, the geofence software may create a geofence based on the different visual properties of the medical environment.


Examples of types of geofences include an absolute geofence, a relative geofence, and a localized or nested geofence. In general, an absolute geofence is a geofence created for geographically separate areas with no overlap. In general, a relative geofence is a geofence that has overlap between two or more geofences. In general, a localized or nested geofence may be established as a child to a parent geofence with the nested geofence being contained entirely within the parent geofence.
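The following non-limiting Python sketch models the three types with simple circular fences; the class and method names are hypothetical and chosen only to make the absolute/relative/nested distinctions concrete.

import math
from dataclasses import dataclass


@dataclass
class Geofence:
    cx: float
    cy: float
    radius: float

    def contains_point(self, x: float, y: float) -> bool:
        return math.hypot(x - self.cx, y - self.cy) <= self.radius

    def overlaps(self, other: "Geofence") -> bool:
        # Relative geofences: two fences that overlap at least partially.
        return math.hypot(other.cx - self.cx, other.cy - self.cy) < self.radius + other.radius

    def nests_within(self, parent: "Geofence") -> bool:
        # Nested (child) geofence: contained entirely within the parent.
        d = math.hypot(parent.cx - self.cx, parent.cy - self.cy)
        return d + self.radius <= parent.radius


operating_room = Geofence(0.0, 0.0, 10.0)
surgical_field = Geofence(2.0, 1.0, 2.0)
print(surgical_field.nests_within(operating_room))         # True: child in parent
print(operating_room.overlaps(Geofence(25.0, 0.0, 5.0)))   # False: absolute, no overlap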


In one embodiment, one or more nested geofences may be established as a child to a parent geofence. A nested geofence may be established because of localized interference preventing high precision in a region, e.g., a region outside of the nested geofence but within the parent geofence. For example, a parent geofence defined by an OR may include four child, nested geofences with the child geofences breaking the OR into quadrants.



FIG. 7 illustrates one embodiment of a parent geofence 1500 defined by an OR. Boundaries of the parent geofence 1500 may be defined by boundaries of the OR, e.g., the entire room. Any equipment entering or leaving the OR may thus be easily understood based on its presence in, movement into, or movement out of the parent geofence 1500. The parent geofence 1500 may tie into, e.g., be able to communicate with, operating room and inventory management systems to allow equipment within the parent geofence 1500 to understand the status of the OR. A child geofence 1502 defined by a surgical field in the OR may be nested in the parent geofence 1500. Any equipment entering or leaving the surgical field may thus be easily understood based on its presence in, movement into, or movement out of the child geofence 1502.


In an exemplary embodiment, properties of nested geofences may be enabled/disabled based on at least one of proximity, situational awareness, and hierarchical prioritization. For example, multiple discrete geofences within a common surgical area may interact when they coincide, e.g., overlap at least partially, with each other, differently than when they are completely separated. This nesting or overlapping of fences may have implications for one or more of the fences directly related to the area of overlap. For example, when first and second geofences overlap, the interaction may include the second geofence inheriting parameters of the first geofence, or may include adding or subtracting aspects of rules for operating within the first geofence.


Using the embodiment of FIG. 7 by way of example, a first surgical instrument 1504 in the parent geofence 1500 that enters the child geofence 1502 (as shown in FIG. 7) may gain one or more new permissions (e.g., the first surgical instrument 1504 may be allowed to power up and have its initial parameters uploaded, etc.) and/or now operate with different equipment (e.g., begin attempting to wirelessly connect with a surgical hub, etc.) due to entering the child geofence 1502, e.g., entering the surgical field. All these activities can be done during the first surgical instrument's movement, e.g., the first surgical instrument 1504 being on a back table 1510 outside the child geofence 1502 and being taken off the back table 1510 by a scrub tech and handed to a surgeon within the child geofence 1502. Similarly, the first surgical instrument 1504 leaving the child geofence 1502 but still being within the parent geofence 1500 may lose one or more existing permissions and/or now operate with different equipment due to leaving the child geofence 1502, e.g., leaving the surgical field. Similarly, the first surgical instrument 1504 leaving the parent geofence 1500 may lose one or more existing permissions and/or now operate with different equipment (e.g., no longer communicate with a second surgical instrument 1506 located in the child geofence 1502, etc.) due to leaving the parent geofence 1500, e.g., leaving the OR. Similarly, a third surgical instrument 1508 entering the parent geofence 1500 (as shown in FIG. 7) may gain one or more new permissions and/or now operate with different equipment (e.g., now communicate with the first surgical instrument 1504 located in the child geofence 1502, now communicate with a surgical hub in communication with a visualization system visualizing the surgical field, etc.) due to entering the parent geofence 1500, e.g., entering the OR.
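As a non-limiting illustration of permissions changing with these fence transitions, the following sketch uses hypothetical permission names; the specific permissions granted or revoked would depend on the implementation.

FENCE_PERMISSIONS = {
    "parent": {"wake_up"},  # e.g., granted anywhere in the OR
    "child": {"power_up", "upload_parameters", "pair_with_hub"},  # surgical field
}


def permissions_for(in_parent: bool, in_child: bool) -> set:
    granted = set()
    if in_parent:
        granted |= FENCE_PERMISSIONS["parent"]
    if in_child:
        granted |= FENCE_PERMISSIONS["child"]  # new permissions gained on entry
    return granted


print(permissions_for(in_parent=True, in_child=False))   # on the back table
print(permissions_for(in_parent=True, in_child=True))    # handed to the surgeon
print(permissions_for(in_parent=False, in_child=False))  # left the OR: set()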


For another example, a parent geofence, e.g., the parent geofence 1500 of FIG. 7, may be established as defined by an OR. A patient in the OR may define a plurality of nested geofences located in the parent geofence 1500. A head and neck area of the patient may define a first nested geofence of the plurality of nested geofences. A thoracic cavity of the patient may define a second nested geofence of the plurality of nested geofences. A stomach/intestine area of the patient may define a third nested geofence of the plurality of nested geofences.


In one embodiment, a geofence system including a parent geofence and one or more child, nested geofences may define initial positions of all HCPs, e.g., all OR staff and the surgeon, in an OR in which a surgical procedure is to be performed on a patient. Using a procedure plan, e.g., a predetermined plan for the surgical procedure, as a guide, a system controlling the geofences (e.g., a surgical hub, a cloud-based server, or other computer system) may recommend optimal positions of the HCPs around the patient in the OR.


In one embodiment, surgeon movement in a geofence system including a parent geofence and one or more nested geofences may trigger changes. For example, the surgeon entering a first one of the one or more nested geofences may trigger any “off” surgical instruments in the first nested geofence to “wake up” in anticipation of possible imminent use by the surgeon. For another example, the surgeon exiting the first one of the one or more nested geofences may trigger any “on” surgical instruments in the first nested geofence to “sleep” since the surgical instrument(s) are now out of the surgeon's reach. For yet another example, the surgeon entering a second one of the one or more nested geofences may trigger any surgical instruments in the second nested geofence to begin communicating data to a surgical hub in anticipation of possible imminent use by the surgeon. For still another example, the surgeon entering the parent geofence may trigger any “off” surgical instruments in the parent geofence to “wake up” in anticipation of a surgical procedure beginning soon.


In one embodiment, in a geofence system including a parent geofence and one or more nested geofences, a current step of a surgical procedure being performed may trigger changes. For example, the current step of the surgical procedure, which may be tracked by a surgical hub, a cloud-based server, etc., may automatically trigger an adjustment in where HCPs and devices should be positioned before the current step of the surgical procedure is performed. The HCPs may be informed of the adjustment by, for example, an indication of the positions being provided via one or more HIDs.


In one embodiment, in a geofence system including a parent geofence and one or more child, nested geofences, situational awareness may trigger changes. For example, if a first device in one of the one or more nested geofences malfunctions and needed data is consequently unavailable, that nested geofence may update, e.g., under control of a surgical hub, cloud-based server, etc., to include a second device that is within the parent geofence and capable of providing that necessary information.


In one embodiment, in a geofence system including a parent geofence and one or more child, nested geofences, potential hacking may trigger changes. For example, if a competitive smart system is observed trying to obtain data from a secondary smart system without permission, a nested geofence perimeter update, e.g., under control of a surgical hub, cloud-based server, etc., can exclude the competitive smart system that is attempting to steal information.


In a geofence system including a parent geofence and one or more child, nested geofences, a particular one of the nested geofences may inherit attributes related to its own position from the parent geofence. This is based on the parent geofence being an absolute geofence that may be fixed and have an absolute location, while the location of each of the one or more child geofences may be relative and thus each be a relative geofence.


For example, an absolute geofence may exist within an operating room, with a control system, e.g., a surgical hub, cloud-based server, etc., being able to detect pieces of capital equipment moving within the absolute geofence. However, due to obstructions and people moving in the absolute geofence, the control system may struggle to determine with high accuracy a location of each of a plurality of devices located near a surgical field within the absolute geofence. A relative geofence may also exist much closer to the surgical field, which may allow this struggle to be overcome. Initially, the absolute and relative geofences' positions are only relatively known to each other. Through cooperation of systems in the absolute geofence, however, the relative geofence can inherit the necessary distance parameters of its core components in relation to each other from the parent geofence. As a result, the relative geofence may transition to an absolute geofence.


For another example, as shown in FIG. 8, the parent geofence 1500 and the child geofence 1502 of FIG. 7 may be absolute and relative geofences, respectively. A distance 1512 between a first sensor 1514, which is located within the relative geofence 1502 (and thus also within the absolute geofence 1500), and a second sensor 1516, which is on a cart 1518 and located outside of the relative geofence 1502 and within the absolute geofence 1500, may be unknown by itself. However, the distance 1512 can be resolved by use of the absolute geofence 1500, e.g., by a control system (e.g., a surgical hub, cloud-based server, etc.) that encompasses both sensors 1514, 1516.
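A minimal sketch of that resolution follows, assuming hypothetical coordinates known to the control system in the absolute (OR) frame; the point is only that a distance unknown to either fence alone becomes computable once both sensors are expressed in the shared absolute frame.

import math

# Positions known to the control system in the absolute geofence's frame (meters).
first_sensor_abs = (3.0, 2.0)   # within the relative geofence 1502
second_sensor_abs = (8.0, 6.0)  # on the cart 1518, outside the relative geofence

distance_1512 = math.hypot(second_sensor_abs[0] - first_sensor_abs[0],
                           second_sensor_abs[1] - first_sensor_abs[1])
print(round(distance_1512, 2))  # 6.4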


Nested geofences may have multiple levels or multiple differing size geofences that have differing results. A reason for the multiple levels of zones may be safety or the type of interaction the fences control, e.g., strong local effects may have collateral damage implications, broader effects may have control loop or data handling implications, and broadest affected zones may have interference or sensing/control implications. For example, power level may adjust size or magnitude of the surrounding zone or zones.


As mentioned above, a nested geofence may inherit attributes (zones) related to its own position from a parent geofence in which the nested geofence is located. Inherited zones may take multiple forms.


In one embodiment, inherited zones can overwrite existing zones and thus replace behavior permanently. In such a scenario, when a first fence (e.g., a nested fence) is within a second fence (e.g., a parent fence or another nested fence), the second fence may take on all the parameters of the first fence. For example, in a surgical context, each of a plurality of specific smart devices may have its own localized geofence (e.g., nested geofence within a parent geofence) that moves with the smart device and affects more stationary, other geofences as the smart device is moved.


In one embodiment, inherited zones may have additive/subtractive effects of a first zone and a second zone and thus replace behavior conditionally. For example, in a surgical context in which a surgical procedure is being performed in an OR on a patient, a local subtractive geofence may be coupled to a monopolar blade and move with a surgical instrument that includes the monopolar blade. When the geofence of the surgical instrument overlaps with a local geofence of the patient (e.g., as indicated by patient monitoring sensors on the patient) and when the monopolar blade is actively energized, one or more smart devices within the overlapped area may need to modify their activity (e.g., ignore interference, prevent capacitive coupling, etc.). The geofence of the perturbance (in this case the local geofence of the surgical instrument with the monopolar blade origin of electromagnetic (EM) disturbance) may necessitate multiple levels of geofencing concentric to the same origin. Since the EM field falls off quadratically (inverse square law), a first area of effect (zone 0) may be indicative of the EM field creating electrical current in or near a metallic instrument (capacitive coupling), zone 1 might be EM interference with high signal strength sensors and may be up to 4-10 inches from the origin, and zone 2 could define interference with small signal (milliamp) sensing and may be 10-20 inches away. In this case these three “zones” or fences may move with the monopolar EM origin, and as these move into other preestablished geofence areas of the OR they may change the behavior or reaction of other smart systems within that zone by “inheriting” aspects of the mobile geofence.
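The following non-limiting sketch computes concentric zone radii scaled by power level under an inverse-square model; the base radii loosely follow the zone ranges suggested above, and the reference power is hypothetical.

import math

BASE_ZONE_RADII_INCHES = {
    "zone_0_capacitive_coupling": 4.0,
    "zone_1_high_signal_interference": 10.0,
    "zone_2_small_signal_interference": 20.0,
}


def zone_radii(power_watts: float, reference_watts: float = 30.0) -> dict:
    # With field strength falling off as power / distance**2, the radius at
    # which a fixed interference threshold is crossed grows as sqrt(power).
    scale = math.sqrt(power_watts / reference_watts)
    return {name: round(r * scale, 1) for name, r in BASE_ZONE_RADII_INCHES.items()}


print(zone_radii(30.0))   # base radii at the reference power
print(zone_radii(120.0))  # 4x power -> 2x radii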


In some aspects, an existing geofence relationship may be moved or changed. In one embodiment, the existing geofence relationship may be moved or changed by dynamically calibrating a geofence zone. In another embodiment, the existing geofence relationship may be moved or changed by dynamically moving a geofence.


With respect to dynamic calibration of a geofence zone, a nested geofence may be used to allow for dynamic calibration or re-orientation of a zone within a larger geofence, e.g., a parent geofence or another nested geofence in which the nested geofence is located. For example, in a surgical context in which a surgical procedure is being performed in an OR, as sensors in a child geofence (and thus also in a parent geofence that contains the child geofence) may each be attached to a specific entity, such as a patient bed, a surgical light, etc., the sensors may move with their respective specific entity as the entity moves throughout the OR. This creates the potential that the entity may be used as a zero point for always identifying a specific location within the operating room, and for locating devices relative to that origin point or region within the operating room.


With respect to dynamically moving a geofence, using nested geofences may allow for dynamic relocation of a child geofence relative to a parent geofence that contains the child geofence. Attaching the nested geofence to a device or person, such as, in a surgical context, capital equipment, a surgical bed, or another device or person within an operating room, may allow the nested geofence to establish a specific zone of interest relative to that device or person. This zone of interest can move with that device or person as it moves, ensuring that the appropriate zoning relative to it is maintained. For example, in a surgical context in which a surgical procedure is being performed in an OR, surgical staff for the surgical procedure may want to know all equipment that is within 3 feet of a surgical bed. For another example, in a surgical context in which a surgical procedure is being performed in an OR, an OR monitoring system may want to know if equipment has been placed on a back table to prepare for surgery or if equipment has been discarded into a waste recycling bin.
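A minimal sketch of such a dynamically moving geofence follows, with hypothetical equipment names and coordinates; the zone of interest is re-evaluated after the attached device moves.

import math


class MovingGeofence:
    """Hypothetical zone of interest attached to a device, e.g., a surgical bed."""

    def __init__(self, cx: float, cy: float, radius_ft: float) -> None:
        self.cx, self.cy, self.radius_ft = cx, cy, radius_ft

    def move_to(self, cx: float, cy: float) -> None:
        self.cx, self.cy = cx, cy  # the zone travels with the bed

    def equipment_inside(self, equipment: dict) -> list:
        return [name for name, (x, y) in equipment.items()
                if math.hypot(x - self.cx, y - self.cy) <= self.radius_ft]


bed_zone = MovingGeofence(0.0, 0.0, 3.0)  # "within 3 feet of the surgical bed"
equipment = {"energy_generator": (2.0, 1.0), "waste_bin": (9.0, 9.0)}
print(bed_zone.equipment_inside(equipment))  # ['energy_generator']
bed_zone.move_to(8.0, 8.5)
print(bed_zone.equipment_inside(equipment))  # ['waste_bin']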


Using geofencing may change communication interactions between first and second surgical systems directly versus communication interactions between the first and second surgical systems indirectly via a surgical hub. The change may be with respect to data rate, bandwidth capacity, triangulation, hierarchical prioritization, latency, precision/accuracy, and/or data transferability.


With respect to triangulation, an orientation or location of a device may be determined through a relationship of more than one sensed geofence, including a parent geofence and a child geofence nested in the parent geofence. Within a child, nested geofence, a triangulated device may be paired to a second device within a parent geofence that contains the nested geofence. As the triangulated device enters a region of the child or zoned geofence, pairing between the triangulated device and the second device can be maintained across the boundary between the child and parent geofences. However, the determinism to the zone (e.g., in a surgical context, a clinically relevant zone or scrub tech zone) may allow the system (e.g., a control system such as a surgical hub, cloud-based server, etc.) to prioritize communications or certain triggers.


With respect to hierarchical prioritization, hierarchy relationships between geofence zones may be used, e.g., by a control system such as a surgical hub, cloud-based server, etc., to determine which aspects are inherited from one geofence to another. Hierarchical geofence zones may include parent-child, as well as multi-level zones.


For example, in a parent-child hierarchy, in a surgical context, a relationship may be a harmonic energy device's cord length, where the harmonic energy device exceeds a maximum distance to a generator supplying energy to the harmonic energy device such that the harmonic energy device is no longer plugged into the generator.


In some aspects, a system (e.g., control system such as a surgical hub, cloud-based server, etc.) may allow for hierarchical zones to be established within a region of different nested geofences that may exist and/or overlap within a larger geofence. For example, in a surgical context in which a surgical procedure is being performed in an OR on a patient, a first zone defined by the entire OR may be low priority, a second zone defined by a back table in the OR may be medium priority, and a third zone defined by a sterile field in the OR may be high priority. With respect to the first zone, the system has an interest in knowing the devices it can pair with and discriminating against devices that are outside of the first zone, but communication may be infrequent and of a non-critical nature. Examples of information include heartbeats, beacon data, or low bandwidth identification data. With respect to the second zone, the system may expect information from this region to still be of a non-critical nature. However, the system may also need to be able to dynamically swap devices from this second zone into the third zone, which is the highest priority zone, e.g., at the top of the hierarchy. As a result, low-priority but higher bandwidth messages may be maintained, such as device configuration, identification, battery life and/or operating status, as well as any active alarm conditions. With respect to the third zone, information within this zone may be used to directly influence patient care, and as a result, may receive high bandwidth and minimized latency times to ensure messages are fully prioritized by the system.


Regarding relationships between zones, interdependencies may exist between smart systems. Local geofences may overlap and be re-prioritized based on which systems are actively interacting.


With respect to latency, messaging may be prioritized within a nested geofence. There exist constraints or limitations to zones, so message prioritization may help ensure that the higher priority messages are delivered/received. Message prioritization is discussed further below, but similar prioritization may be applied to all intercommunications and data packets.


In a surgical context in which a surgical procedure is being performed on a patient, a child geofence may be established by a control system, e.g., a surgical hub, cloud-based server, etc., around a clinically relevant area. Implementing message prioritization, the control system may prioritize messages to/from devices in the child geofence as appropriate based on a calculated client device location or a zone within the child geofence. As a result, devices within that clinically relevant area may be prioritized from a communication perspective and allow their message priorities to be elevated. Similarly, devices that are outside the child geofence may have their messages de-prioritized by the control system.


In one embodiment, a device may inherit message priority based on a geofence zone in which the device is located. For example, in a surgical context in which a surgical procedure is being performed on a patient, as a device is moving throughout the OR, the device may be constantly triangulated by a wireless triangulation system in the OR, and that system may continue to calculate the geofence location or zone the device is in. The system may continue to send that zone's identification back to the device, such as “Zone 1,” “Zone 2,” etc. The device may then use this geofence zone identification information to apply a corresponding priority to its own messages prior to transmission such that the receiving control system, e.g., a surgical hub, cloud-based server, etc., is informed of the message priority. The device thus knows or calculates the priority of its messages prior to transmission.
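As a non-limiting illustration, the following sketch shows a device tagging its own messages with a priority inherited from the zone identification it receives; the zone names and priority levels are hypothetical.

ZONE_PRIORITY = {
    "zone_sterile_field": 0,  # highest: directly influences patient care
    "zone_back_table": 1,     # medium: swap-ready but non-critical
    "zone_or_general": 2,     # low: heartbeats, beacons, identification
}


def tag_message(message: dict, current_zone: str) -> dict:
    # The device knows or calculates the priority prior to transmission.
    message["priority"] = ZONE_PRIORITY.get(current_zone, 3)
    return message


print(tag_message({"type": "telemetry", "battery": 0.82}, "zone_back_table"))
print(tag_message({"type": "alarm", "code": 17}, "zone_sterile_field"))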


Messaging may refer generally to any and all communications. Thus, as a device enters into a geofenced area with another device already in the geofenced area, a control system, e.g., a surgical hub, cloud-based server, etc., may direct the two devices to begin direct communications between the two devices as well as with the central system.


A change with respect to precision/accuracy may result in improved absolute accuracy in a local area. For example, a parent geofence may establish an absolute geofence within a room (e.g., an OR or other room) for discriminating devices within or outside of the room. However, a control system, e.g., a surgical hub, cloud-based server, etc., may lack the accuracy to determine local devices when sources of interference such as human bodies and metal equipment move throughout the OR, e.g., move within the parent geofence. A localized or child geofence may allow for greater accuracy in a specific zone of interest.


A change with respect to data transferability may allow for an ability to exclude a device or system from a geofence due to a regulation, such as a HIPAA regulation, which may enable patient data safety in instances where data transfer is prohibited. A geofence exclusion may change a defined shape of the geofence to be non-symmetric, thus excluding specific devices from the geofence.


As mentioned above, a relative geofence is a type of geofence that may be created where boundaries of the geofence are relative to objects creating the geofence, as opposed to an absolute geofence where boundaries of the geofence are relative to fixed positions within a room. For example, in the illustrated embodiment of FIG. 7, a relative geofence 1520 may be established, e.g., by a control system (e.g., a surgical hub, cloud-based server, etc.), to surround a specific object, such as surrounding an operating table 1522 located in the OR and thus also located in the parent geofence 1500. The relative geofence 1520 may be established via communication to a paired piece of capital equipment within the OR and thus within the parent geofence 1500 in which the control system may communicate with equipment. In this illustrated implementation, the relative geofence 1520 is established by triangulating the third surgical instrument 1508, the first sensor 1514, and the second sensor 1516. As a result, exact boundaries of the relative geofence 1520 may shift as there is movement of the capital equipment. However, sufficient accuracy may be retained to determine when devices have entered the vicinity of a sterile field, and may be in use by a surgeon, as opposed to being on a nearby table. This may also allow for co-location of adjacent instruments.


As mentioned above, an absolute geofence is a type of geofence. As also mentioned above, an absolute geofence may be fixed (e.g., immobile) and have an absolute location. In one embodiment, the absolute location may be a boundary of a room, such as a boundary of the OR in the illustrated embodiment of FIGS. 7 and 8 that includes the parent geofence 1500 as an absolute geofence.


In one embodiment, the absolute location may be defined by fixed sensors that allow for known or more precise implementation in fixed regions, such as the boundary of an OR or other room. For example, in the illustrated embodiment of FIGS. 7 and 8, a plurality of fixed sensors 1524, 1526, 1528, 1530 are located in the OR. Each of the fixed sensors 1524, 1526, 1528, 1530 is located in a corner of the OR and may be used by the control system to triangulate with another device in the OR, such as the third surgical instrument 1508, e.g., an antenna of the third surgical instrument 1508, as in the illustrated implementation of FIG. 9.
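A minimal, non-limiting sketch of planar trilateration from such fixed sensors follows; the room dimensions and measured ranges are hypothetical, and a real system would also have to account for measurement noise and the interference discussed elsewhere herein.

def trilaterate(p1, r1, p2, r2, p3, r3):
    """Solve for (x, y) from three known sensor positions and measured ranges
    by subtracting pairs of circle equations to obtain two linear equations."""
    ax, ay = p1
    bx, by = p2
    cx, cy = p3
    A = 2 * (bx - ax)
    B = 2 * (by - ay)
    C = r1**2 - r2**2 - ax**2 + bx**2 - ay**2 + by**2
    D = 2 * (cx - bx)
    E = 2 * (cy - by)
    F = r2**2 - r3**2 - bx**2 + cx**2 - by**2 + cy**2
    x = (C * E - F * B) / (E * A - B * D)
    y = (C * D - A * F) / (B * D - A * E)
    return x, y


# Fixed sensors in three corners of a 10 m x 8 m room, device actually at (4, 3).
print(trilaterate((0, 0), 5.0, (10, 0), 6.708, (0, 8), 6.403))  # ~(4.0, 3.0)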


Various aspects that may be involved in implementing a geofence are described further in, for example, U.S. Pat. No. 11,678,881 entitled “Spatial Awareness Of Surgical Hubs In Operating Rooms” filed Mar. 29, 2018, which is hereby incorporated by reference in its entirety.


In one embodiment, the methods, devices, and systems for geofencing described herein may include spatial orientation and navigation of devices located within a geofence. Data on orientation, location, relational data, and/or physical/communication interrelationships of the devices with respect to each other may be provided as an independent data feed from what is measured during the operation of each of the devices.


In one embodiment, spatial orientation and navigation of devices located within a geofence may include detection of a surgical instrument within a geofenced location. The detection of a surgical instrument within a geofenced location may include mutual triangulation with cooperating systems. For example, as shown in one embodiment illustrated in FIG. 10, a surgery may be performed with both a digitally enabled smart endocutter 1600 as well as a smart energy device 1602. The endocutter 1600 and the energy device 1602 may both be connected to respective systems, with the endocutter 1600 being connected to a surgical hub and the energy device 1602 being connected to a smart electrosurgical generator. Each of the endocutter 1600 and the energy device 1602 may be equipped with one or more wireless beacons 1604, 1606 to support triangulation. As a result, each of the endocutter 1600 and the energy device 1602 may be able to simultaneously triangulate its own position relative to the other of the endocutter 1600 and the energy device 1602 within the operating field and provide that information to its respective system. These respective systems, e.g., the surgical hub and the generator, may then be able to externally communicate or exchange that information.


In one embodiment, spatial orientation and navigation of devices located within a geofence may include device tracking. Examples of device tracking include time of flight, vision, Bluetooth/radar, confirmation/identification of a wireless beacon, sensing, physical feature/attribute, and a magnetic field to calculate distance/location.


In one embodiment, the time of flight may be achieved using microelectromechanical system (MEMS) ultrasound sensors and an algorithm (e.g., implementing the algorithm within a certain body cavity and with a certain number of sensors), tracking zones within a body cavity, creating a feedback loop of a patient's body cavity (e.g., an abdominal cavity, etc.) with sensors within the patient, and/or FPGA digital signals.


In one embodiment, the vision may include picking up a shape of surgical jaws/blade.


In one embodiment, the Bluetooth/radar may include identifying a location of a surgical instrument and then waiting until the surgical instrument moves into an imaging system's field of view, at which point vision may take over.


In one embodiment, the confirmation/identification of a wireless beacon may include the wireless beacon being within one zone (inside a surgical arena) then transitioning into another surgical zone (e.g., an abdominal cavity or other body cavity) which switches tracking over to a visual system.


In one embodiment, the sensing may include adding a material insert in a distal end of a top jaw of a surgical instrument, adding a sensor to jaws/distal end of a surgical instrument, and/or using MEMS ultrasound sensors.


In one embodiment, the physical feature/attribute may include creating a joint in a blade that is more robust (e.g., to an expected failure mode, etc.), a coating/tip/tracking feature that is always biologically inert, a load/resistance feedback at a distal end of a surgical instrument, and/or introducing in-instrument signaling (e.g., using tantalum beads, etc.). The load/resistance feedback at a distal end of a surgical instrument may include tension, such as using a reverse load to determine tension, e.g., using a strain gauge, forces of a distal end of a surgical instrument through testing/robotic controls, and/or motor resistance learning over time, e.g., using a robotic surgical system's tool driver portion and/or understanding a location of top/bottom jaws of a surgical instrument.


Various embodiments of aspects that may be involved in detection of a surgical instrument within a geofenced location are described further in, for example, U.S. Pat. No. 11,678,881 entitled “Spatial Awareness Of Surgical Hubs In Operating Rooms” filed Mar. 29, 2018, U.S. Pat. App. Pub. No. 2023/0096268 entitled “Methods for Controlling Cooperative Surgical Instruments” filed Oct. 5, 2021, and U.S. Pat. No. 11,559,307 entitled “Method Of Robotic Hub Communication, Detection, And Control” issued Jan. 24, 2023, which are hereby incorporated by reference herein in their entireties.


In one embodiment, device tracking may include, in a surgical context in which a surgical procedure is being performed on a patient, how to identify a location of an end of a hot blade, e.g., of an energy device, to know the blade's 3D position in space. Identifying the hot blade's location may allow an alarm to be provided to a surgeon and/or other HCP indicating that the hot blade is getting close to tissue. The location may be identified using a secondary system to identify geometry, location, and spatial location and/or using direct penetrant secondary imaging to locate both the instrument location and adjunct tissues, such as by using cone beam CT.


In one embodiment, using a secondary system to identify geometry, location, and spatial location may include use of a structured light system in combination with an imaging system to align local instrument coordinates to a global patient coordinate system, then use of local instrument measurement systems to track the instrument from that point forward, in combination with instrument detection of the temperature of its system via the phase shift of the natural frequency of the wave guide/blade. Various embodiments of a structured light system are described further in, for example, U.S. Pat. No. 11,219,501 entitled “Visualization Systems Using Structured Light” issued Jan. 11, 2022, which is hereby incorporated by reference in its entirety.


For example, use of the structured light system in combination with the imaging system can include registering when the blade, e.g., the energy device, enters the patient's fence, then establishing direct communication with a central system (e.g., a surgical hub, a cloud-based server, etc.). An accelerometer within the energy device may then begin to communicate with other devices, e.g., devices x, y, and z. A port placement may be mapped into the patient's geofence and, more specifically, mapped to a skin surface of the patient. When the energy device is handed to the surgeon, the energy device may alert the central system that it is the current device of choice and the accelerometer may increase its published location data. In addition, the energy device may communicate its dimensions to the central system as a secondary computational entity, which may verify all the positional calculations for energy device tip location, as the blade is located at the energy device's end effector defining a tip of the energy device. From the data from the accelerometer indicating device orientation, the location of the port, and the device's dimensions, a final internal location can be established by the energy device. Using the skin surface as a focal point, an algorithm can be used by the energy device to map the energy device's end effector in real time internally. Using this information overlaid on to pre-surgery internal scans, a determination can be made by the central system as to a risk level associated with the current location of the blade. The central system can be a second source of the output from the algorithm. Various methods may then be used to alert if the risk is too high. The alert may include, for example, a haptic buzz of the energy device, a warning on an overlaid monitor, an alarm tone, etc.
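A minimal sketch of the tip-location computation follows; the angle conventions, port coordinates, and insertion length are hypothetical and serve only to illustrate combining the port location, accelerometer orientation, and device dimensions into an internal tip position.

import math


def tip_location(port_xyz, pitch_deg, yaw_deg, insertion_length_cm):
    """Project the tip along the shaft direction from the mapped port point."""
    pitch, yaw = math.radians(pitch_deg), math.radians(yaw_deg)
    dx = math.cos(pitch) * math.cos(yaw)
    dy = math.cos(pitch) * math.sin(yaw)
    dz = -math.sin(pitch)  # pitched downward into the body cavity
    px, py, pz = port_xyz
    return (px + insertion_length_cm * dx,
            py + insertion_length_cm * dy,
            pz + insertion_length_cm * dz)


# Port on the skin surface at (10, 5, 0) cm, pitched 30 degrees down, 20 cm inserted.
print(tuple(round(c, 1) for c in tip_location((10, 5, 0), 30, 45, 20)))
# -> (22.2, 17.2, -10.0)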


In one embodiment, using direct penetrant secondary imaging to locate both the instrument location and the adjunct tissues may include using cone beam CT to identify instrument location and adjacency to the tissue or structures in a flexible endoscopy space for in-situ re-calibration of the tip. The cone beam CT data could be combined with the data stream from a generator or an advanced imaging system (e.g., IR spectrum monitoring) to determine blade temperature, and a combined feed may be used to display location and temperature.


In one embodiment, spatial orientation and navigation of devices located within a geofence may include determining location of relative orientations. Various embodiments of aspects that may be involved in determining location of relative orientations are described further in, for example, U.S. Pat. No. 11,678,881 entitled “Spatial Awareness Of Surgical Hubs In Operating Rooms” filed Mar. 29, 2018, U.S. Pat. Pub. No. 2022/0187163 entitled “Light-Transmission-Path-Spectrum Measurement Device, Light-Transmission-Path System, And Computer-Readable Medium” published Jun. 16, 2022, U.S. Pat. App. Pub. No. 2019/0200981 entitled “Method Of Compressing Tissue Within A Stapling Device And Simultaneously Displaying The Location Of The Tissue Within The Jaws” filed Dec. 4, 2018, and U.S. Pat. No. 11,559,307 entitled “Method Of Robotic Hub Communication, Detection, And Control” issued Jan. 24, 2023, which are hereby incorporated by reference herein in their entireties.


In one embodiment, determining location of relative orientations may include determining an absolute orientation and position of a device within a space. The absolute orientation and position may be determined in a 2D plane or a 3D plane.


The absolute orientation and position being determined in a 2D plane may include using fixed sensors with known geometries to accurately calculate orientation and location of a first device relative to a second device. Since the sensors are fixed within the respective first and second devices, their distance may be known absolutely to one another.



FIG. 11 illustrates one embodiment of using, e.g., a central system using, fixed sensors 1700, 1702 with known geometries to accurately calculate orientation and position of a first device 1704 relative to a second device 1706. A first distance D1 is known between the fixed sensors 1700 of the first device 1704. A second distance D6 is known between the fixed sensors 1702 of the second device 1706. Distances D2, D3, D4 may thus be determined between the fixed sensors 1700 of the first device 1704 and the fixed sensors 1702 of the second device 1706 by mapping to a 3D space. A model may then be created with a calculated center 1708 of the first device 1704, which is indicative of the first device's position, a calculated orientation 1710 of the first device 1704, a calculated center 1712 of the second device 1706, which is indicative of the second device's position, and a calculated orientation 1714 of the second device 1706.
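The following non-limiting sketch derives each device's calculated center and orientation from its pair of fixed sensors; the sensor coordinates are hypothetical.

import math


def center_and_orientation(sensor_a, sensor_b):
    """Midpoint of the fixed sensor pair gives the device's position; the
    vector between the sensors gives its orientation."""
    (x1, y1), (x2, y2) = sensor_a, sensor_b
    center = ((x1 + x2) / 2.0, (y1 + y2) / 2.0)
    orientation_deg = math.degrees(math.atan2(y2 - y1, x2 - x1))
    return center, orientation_deg


first_device = center_and_orientation((0.0, 0.0), (0.4, 0.0))   # sensors D1 apart
second_device = center_and_orientation((2.0, 1.0), (2.0, 1.5))  # sensors D6 apart
print(first_device)   # ((0.2, 0.0), 0.0)
print(second_device)  # ((2.0, 1.25), 90.0)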


Determining the absolute orientation and position in a 3D plane may include using a combination of RF antennas with an additional sensor.


One problem encountered during performance of a surgical procedure may be, for example, that RF field strength and intensity may be subject to ambient RF noise or other external conditions. Adjustments may be required to RF signal strength to accommodate local interference. This can impact signal calibration for received signal strength indication (RSSI). In one embodiment, determining location of relative orientations may include mitigations within an operating environment to account for varying signal strength and environmental factors.


Dynamic calibration of known distances can be another important aspect of spatial orientation and navigation within a geofence, and can include concerns with error detection, methods for redundancy, combination methods, combined uses with other objects, and co-location of independent beacons. Co-location of independent beacons can have several uses, including co-locating different devices with the trocar into which they have been inserted (doable with relative triangulation), triangulation only, and mapping of co-located devices such as with a Bluetooth trocar.


In one embodiment, spatial orientation and navigation of devices located within a geofence may include handling perspective and scale of two images from differing sources. In some aspects, an imaging system without distance measurement as part of the device may be able to differentiate perspective, scale, and size.


In one embodiment, an option can be adjusting a data transformation of one data source relative to a data transformation of another data source in order for their data streams to be aligned, which may allow overlay and combination in feeds to other systems. For example, in a surgical procedure, such as a gallbladder procedure in which ultrasound local imaging of stones, nodes, and ducts is converted to a laparoscopic coordinate system, or a procedure in which a serosal endoscopic visualized tumor is used for a laparoscopic measure of location, when two images or data streams are used to monitor a single discrete organ or device, one of the two likely will need to be adjusted so they both show the same size and orientation of the object being cooperatively viewed. The image that is currently being used by a surgeon to control the active surgical site is likely the prime or unadjusted feed, with the other secondary feeds being adjusted to look similar.


In one embodiment, deterministic assessment of dimensions based on entities within a field of view can be an option. The deterministic assessment may include stereoscopic imaging, relative sizing and scaling, identification of a known entity, and/or relative sizing with an unknown entity.


In one embodiment, stereoscopic imaging may include a primary camera system (such as a laparoscopic camera system) receiving data (such as wirelessly) from a hand-held endoscopic piece of equipment. Stereoscopic imaging has been integrated into robotic systems to provide depth perception. This may be performed with only one camera, or results may be improved as two imaging systems become more concentric to each other's origin. A non-visual medium (e.g., ultrasonic, infrared, wireless signal strength, connection vector, etc.) may be used to further enhance the visual perception. Movement tracking of field devices may be used to recreate the depth of field (e.g., tip to handle of a device or proximity to capital equipment). A secondary (or more) camera may be used to determine depth of field. Local identification of surgical space may be made through smart trocars (e.g., using Bluetooth) and communication between smart trocars and devices.



FIG. 12 illustrates one embodiment of using stereoscopic imaging. As shown in FIG. 12, a unique geometry may be used with a field of view that can be visualized by two independent sources 1800, 1802 such as two independent imaging systems. Establishing an absolute plane will require a predetermined size, but a relative plane can be established 1804 without a predetermined size. Model extraction and transformations in rotation and size may allow scale and orientation to be determined 1806, which becomes the primary image 1808 with a secondary image 1810 being rotated and scaled by the same transformations.


In one embodiment, relative sizing and scaling can be performed at a point in time. A first image, at the point in time, with some instrument movement may be used, and a second image, at the point in time, may be used to adjust perspective of a visualized device. As with many of the other aspects of data streams, a data stream over time or during the known motion of another object can be used to add an additional layer of detail or data. An object may be getting bigger or smaller as the imaging system (e.g., a scope) is moved a known distance, or the object may get larger or smaller as an out-of-field-of-view shape or size of an organ changes as the imaging system is moved or positioned. This aspect of time can provide context to the initial image frame detail.


In one embodiment, identification of a known entity can include utilization of a localized method of triangulation to partner with visual mapping of a patient's anatomy. For example, as shown in one embodiment illustrated in FIG. 13, as a first surgical system 1900 touches or clamps on tissue 1902, this may effectively create a characterized or known distance between the first surgical system 1900 and a second surgical system 1904 (a scope in this illustrated embodiment), to create a proxy for depth perception. A central system (e.g., a surgical hub, a cloud-based server, etc.) in communication with the first and second surgical systems 1900, 1904 may then simultaneously use a triangulation method, e.g., using data from sensors 1908, 1910 of the first and second surgical systems 1900, 1904, to create a 3D map 1906 in space or characterize the coordinates of that tissue. The central system may use a size of a first object and then a second closely located other object to provide relativity of the sizes and shapes, which can enable the central system to understand if the first object is merely bigger than expected or if it is just closer to the visualizing surgical system, e.g., the second surgical system 1904. Relative mapping of key landmarks or anatomy can provide understanding of the size and orientation of related organs. In this way the map can be scalable since the landmarks from one person to the next are more common even if the actual sizes vary.


In one embodiment, relative sizing with an unknown entity may include identification of unique elements (e.g., inserted or imported features and/or naturally occurring features of an image) within an object and/or creation of depth with an external entity. FIGS. 14 and 15 illustrate one embodiment of identification of unique elements within an object, which in this illustrated embodiment includes a stomach 2000 of a patient. FIG. 14 shows a first image 2002 gathered by a first imaging device and showing the object. FIG. 15 shows a second image 2004 gathered from a different perspective by a second imaging device and showing the object at a same time as in the first image 2002. A first distance X1 between first and second features 2006, 2008 on the object, as shown in the first image 2002, equals a delta times a second distance X2 between the first and second features 2006, 2008 on the object, as shown in the second image 2004.


In one embodiment, relative sizing with an unknown entity may include creation of depth with an external entity. In one embodiment of the creation of depth, a relative scale of measure within a system may be based on what the system can see. For example, in a surgical procedure including ultrasound local imaging of stones, nodes, and ducts converted to a laparoscopic coordinate system, a size of an ultrasonically sensed object may be used in conjunction with density and fixation (if the objects were moved around slightly) to distinguish one organ or node from a stone, which has to be within a duct or tube. In another embodiment of the creation of depth, an endoscopic system may handle orientation (radial) relative to a jejunum to coordinate to a laparoscopic system. For example, in a surgical procedure including ultrasound local imaging of stones, nodes, and ducts converted to a laparoscopic coordinate system, the device may provide a physical feature or light which is keyed, e.g., by a central system (e.g., surgical hub, cloud-based server, etc.), to an orientation of a scope relative to the organ, which may be sensed by another system to allow the second system to associate orientation with location of the scope.


In one embodiment, spatial orientation and navigation of devices located within a geofence may include using light detection and ranging (LIDAR) and structured light to image distance and geometric shape surfacing, used to determine orientation relative to a source, not merely a distance from the source. Relative references may be defined over time in order to confirm that the objects seen are the same object from differing perspectives, such as secondary distance measures (e.g., real measurements like LIDAR) being used by a central system (e.g., surgical hub, cloud-based server, etc.) to establish one or more real measures that may then be cooperated to adjacent devices. A relational parameter of devices may be determined by the central system via markers or aspects of a surgical instrument that allow the relational parameter to be calibrated to real measurements. Once one instrument has these parameters set, the other instruments that are nearby may be used by the central system to relate a real measure of the first instrument to other surgical instruments or organs.


Various embodiments of aspects that may be involved in spatial orientation and navigation are described further in, for example, U.S. Pat. App. Pub. No. 2023/0116781 entitled “Surgical Devices, Systems, And Methods Using Multi-Source Imaging” filed Oct. 5, 2021, U.S. Pat. App. Pub. No. 2023/0100698 entitled “Methods For Controlling Cooperative Surgical Instruments” published Mar. 30, 2023, U.S. Pat. App. Pub. No. 2023/0096268 entitled “Methods for Controlling Cooperative Surgical Instruments” filed Oct. 5, 2021, U.S. Pat. No. 11,723,642 entitled “Cooperative Access Hybrid Procedures” issued Aug. 14, 2023, U.S. Pat. No. 11,678,881 entitled “Spatial Awareness Of Surgical Hubs In Operating Rooms” filed Mar. 29, 2018, U.S. Pat. No. 11,559,307 entitled “Method Of Robotic Hub Communication, Detection, And Control” issued Jan. 24, 2023, U.S. Pat. App. Pub. No. 2019/0200981 entitled “Method Of Compressing Tissue Within A Stapling Device And Simultaneously Displaying The Location Of The Tissue Within The Jaws” filed Dec. 4, 2018, U.S. Pat. No. 10,758,310 entitled “Wireless Pairing Of A Surgical Device With Another Device Within A Sterile Surgical Field Based On The Usage And Situational Awareness Of Devices” issued Sep. 1, 2020, and U.S. Pat. App. Pub. No. 2022/0104896 entitled “Interactive Information Overlay On Multiple Surgical Displays” published Apr. 7, 2022, which are hereby incorporated by reference herein in their entireties.



FIG. 16 illustrates one embodiment of a method 2100 of geofencing. The method 2100 may include, during performance of a surgical procedure on a patient, adjusting 2102 processing of a dataflow associated with a surgical system, which is located within a first digital fence (e.g., a first geofence) surrounding an aspect of the surgical procedure, in response to the surgical system moving into a second digital fence (e.g., a second geofence) that is nested within the first digital fence. The dataflow may include data regarding a measured patient parameter that at least one of a surgical hub and the surgical system is configured to use in performing a function during the performance of the surgical procedure.


The adjusting 2102 of the processing may include at least one of: adjusting 2104 a data flow rate of the dataflow, adjusting 2106 a bandwidth capacity of the dataflow, adjusting 2108 a latency of the dataflow, matching 2110 the processing of the dataflow with a processing of a second dataflow of a second surgical system located in the second digital fence, preventing 2112 transmission of the dataflow, processing 2114 the dataflow in accordance with a predefined second set of rules for processing data that is associated with the second digital fence in addition to a predefined first set of rules for processing data that is associated with the first digital fence, and suspending 2116 at least one rule in a predefined third set of rules for processing data that is associated with the first digital fence such that the dataflow is configured to be processed in accordance with a predefined fourth set of rules for processing data that is associated with the second digital fence and with a subset of the predefined third set of rules.
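As a non-limiting illustration of how a surgical hub might dispatch such adjustments when a surgical system moves into the nested fence, the following sketch uses hypothetical field names and adjustment factors; the actual adjustments 2104-2116 would be procedure- and fence-specific.

from dataclasses import dataclass


@dataclass
class Dataflow:
    rate_hz: float
    bandwidth_kbps: float
    max_latency_ms: float
    blocked: bool = False


def adjust_on_fence_entry(flow: Dataflow) -> Dataflow:
    flow.rate_hz *= 2          # 2104: adjust the data flow rate
    flow.bandwidth_kbps *= 2   # 2106: adjust the bandwidth capacity
    flow.max_latency_ms /= 2   # 2108: adjust (tighten) the latency
    flow.blocked = False       # 2112 would instead set blocked = True
    return flow


flow = Dataflow(rate_hz=10.0, bandwidth_kbps=64.0, max_latency_ms=200.0)
print(adjust_on_fence_entry(flow))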


Computer Systems

A computer system may be suitable for use in implementing computerized components described herein. In broad overview of an exemplary embodiment, the computer system may include a processor configured to perform actions in accordance with instructions, and memory devices configured to store instructions and data. The processor may be in communication, via a bus, with the memory (and/or incorporates the memory) and with at least one network interface controller with a network interface for connecting to external devices, e.g., a computer system (such as a mobile phone, a tablet, a laptop, a server, etc.). The processor may also be configured to be in communication, via the bus, with any other processor(s) of the computer system and with any I/O devices at an I/O interface. Generally, a processor will execute instructions received from the memory. In some embodiments, the computer system can be configured within a cloud computing environment, a virtual or containerized computing environment, and/or a web-based microservices environment.


In more detail, the processor can be any logic circuitry that processes instructions, e.g., instructions fetched from the memory. In many embodiments, the processor may be an embedded processor, a microprocessor unit (MPU), microcontroller unit (MCU), field-programmable gate array (FPGA), or special purpose processor. The computer system can be based on any processor, e.g., a suitable digital signal processor (DSP), or set of processors, capable of operating as described herein. In some embodiments, the processor can be a single core or multi-core processor. In some embodiments, the processor can be composed of multiple processors.


The memory can be any device suitable for storing computer readable data. The memory can be a device with fixed storage or a device for reading removable storage media. Examples include all forms of volatile and non-volatile memory, media, and memory devices, such as semiconductor memory devices (e.g., SDRAM, EPROM, EEPROM, flash memory devices, and other solid state memory), magnetic disks, and magneto-optical disks. A computer system can have any number of memory devices.


The memory also can include a cache memory, which is generally a form of high-speed computer memory placed in close proximity to the processor for fast read/write times. In some embodiments, the cache memory is part of, or on the same chip as, the processor.


The network interface controller may be configured to manage data exchanges via the network interface. The network interface controller may handle the physical, media access control, and data link layers of the Open Systems Interconnection (OSI) model for network communication. In some embodiments, some of the network interface controller's tasks may be handled by the processor. In some embodiments, the network interface controller may be part of the processor. In some embodiments, a computer system may have multiple network interface controllers. In some implementations, the network interface may be a connection point for a physical network link, e.g., an RJ45 connector. In some embodiments, the network interface controller may support wireless network connections, and an interface port may be a wireless Bluetooth transceiver. Generally, a computer system can be configured to exchange data with other network devices via physical or wireless links to the network interface. In some embodiments, the network interface controller may implement a network protocol such as Ethernet (TCP/IP), LTE, IEEE 802.11, IEEE 802.16, Bluetooth, or the like.


In some uses, the I/O interface may support an input device and/or an output device. In some uses, the input device and the output device may be integrated into the same hardware, e.g., as in a touch screen. In some uses, such as in a server context, there may be no I/O interface or the I/O interface may not be used. In some uses, additional other components may be in communication with the computer system, e.g., external devices connected via a universal serial bus (USB). In some embodiments, an I/O device may be incorporated into the computer system, e.g., a touch screen on a tablet device.


In some implementations, a computer device may include an additional device such as a co-processor, e.g., a math co-processor configured to assist the processor with high precision or complex calculations.


CONCLUSION

Certain illustrative implementations have been described to provide an overall understanding of the principles of the structure, function, manufacture, and use of the systems, devices, and methods disclosed herein. One or more examples of these implementations have been illustrated in the accompanying drawings. Those skilled in the art will understand that the systems, devices, and methods specifically described herein and illustrated in the accompanying drawings are non-limiting illustrative implementations and that the scope of the present invention is defined solely by the claims. The features illustrated or described in connection with one illustrative implementation may be combined with the features of other implementations. Such modifications and variations are intended to be included within the scope of the present invention. Further, in the present disclosure, like-named components of the implementations generally have similar features, and thus within a particular implementation each feature of each like-named component is not necessarily fully elaborated upon.


Approximating language, as used herein throughout the specification and claims, may be applied to modify any quantitative representation that can permissibly vary without resulting in a change in the basic function to which it is related. Accordingly, a value modified by a term or terms such as “about,” “approximately,” and “substantially” is not to be limited to the precise value specified. In at least some instances, the approximating language may correspond to the precision of an instrument for measuring the value. Here and throughout the specification and claims, range limitations may be combined and/or interchanged; such ranges are identified and include all the sub-ranges contained therein unless context or language indicates otherwise.


One skilled in the art will appreciate further features and advantages of the invention based on the above-described implementations. Accordingly, the present application is not to be limited by what has been particularly shown and described, except as indicated by the appended claims. All publications and references cited herein are expressly incorporated herein by reference in their entirety for all purposes.

Claims
  • 1. A surgical data management system, comprising: a surgical hub comprising: a processor; and a memory storing instructions that, when executed by the processor, cause the processor to perform operations comprising: during a performance of a surgical procedure on a patient, adjust processing of a dataflow associated with a surgical system, which is located within a first digital fence surrounding an aspect of the surgical procedure, in response to the surgical system moving into a second digital fence that is nested within the first digital fence; wherein the dataflow includes data regarding a measured patient parameter that at least one of the surgical hub and the surgical system is configured to use in performing a function during the performance of the surgical procedure.
  • 2. The surgical data management system of claim 1, wherein the first digital fence represents a fence surrounding a physical space.
  • 3. The surgical data management system of claim 2, wherein the physical space is an operating room in which the surgical procedure is to be performed; and the second digital fence represents a fence surrounding a partial portion of the operating room.
  • 4. The surgical data management system of claim 3, wherein the second digital fence represents a fence surrounding a surgical field in the operating room.
  • 5. The surgical data management system of claim 1, wherein the first digital fence represents a fence surrounding a temporal space; and the temporal space is a total amount of time in which the surgical procedure is to be performed; and the second digital fence represents a fence surrounding a portion of the total amount of time.
  • 6. The surgical data management system of claim 5, wherein: the instructions, when executed by the processor, further cause the processor to perform operations comprising: adjust processing of the dataflow associated with the surgical system, which is located within the first digital fence, in response to the surgical system moving into a third digital fence that is nested within the first digital fence; and the third digital fence represents a fence surrounding a second, different portion of the total amount of time.
  • 7. The surgical data management system of claim 1, wherein adjusting the processing comprises at least one of: adjusting a data flow rate of the dataflow, adjusting a bandwidth capacity of the dataflow, adjusting a latency of the dataflow, matching the processing of the dataflow with a processing of a second dataflow of a second surgical system located in the second digital fence, and preventing transmission of the dataflow.
  • 8. The surgical data management system of claim 1, wherein the instructions, when executed by the processor, further cause the processor to perform operations comprising: adjust processing of the dataflow associated with the surgical system, which is located within the first digital fence, in response to the surgical system moving into a third digital fence that is nested within the first digital fence and is different from the second digital fence.
  • 9. The surgical data management system of claim 8, wherein the first digital fence represents a fence surrounding a physical space; the physical space is an operating room in which the surgical procedure is to be performed; the second digital fence represents a fence surrounding a first portion of a surgical field in the operating room; and the third digital fence represents a fence surrounding a second, different portion of the surgical field in the operating room.
  • 10. The surgical data management system of claim 1, wherein the instructions, when executed by the processor, further cause the processor to perform operations comprising: adjust processing of the dataflow associated with the surgical system, which is located within the second digital fence, in response to the surgical system moving out of the second digital fence and remaining in the first digital fence.
  • 11. The surgical data management system of claim 10, wherein a predefined first set of rules for processing data is associated with the first digital fence; a predefined second set of rules for processing data is associated with the second digital fence; the adjusting in response to the surgical system moving into the second digital fence comprises the dataflow being processed in accordance with the predefined first set of rules and the predefined second set of rules; and the adjusting in response to the surgical system moving out of the second digital fence comprises the dataflow being processed in accordance with the predefined first set of rules and not in accordance with the predefined second set of rules.
  • 12. The surgical data management system of claim 1, wherein a predefined first set of rules for processing data is associated with the first digital fence; a predefined second set of rules for processing data is associated with the second digital fence; and the adjusting comprises processing the dataflow in accordance with the predefined second set of rules in addition to the predefined first set of rules.
  • 13. The surgical data management system of claim 12, wherein the instructions, when executed by the processor, further cause the processor to perform operations comprising: adjust processing of the dataflow associated with the surgical system, which is located within the second digital fence, in response to the surgical system moving out of the second digital fence such that the dataflow is processed in accordance with the predefined first set of rules and is no longer processed in accordance with the predefined second set of rules.
  • 14. The surgical data management system of claim 1, wherein a predefined first set of rules for processing data is associated with the first digital fence; a predefined second set of rules for processing data is associated with the second digital fence; and the adjusting comprises suspending at least one of the rules in the predefined first set of rules such that the dataflow is configured to be processed in accordance with the predefined second set of rules and with a subset of the predefined first set of rules.
  • 15. The surgical data management system of claim 14, wherein the instructions, when executed by the processor, further cause the processor to perform operations comprising: adjust processing of the dataflow associated with the surgical system, which is located within the second digital fence, in response to the surgical system moving out of the second digital fence such that the dataflow is configured to be processed in accordance with the predefined first set of rules instead of in accordance with the subset of the predefined first set of rules.
  • 16. The surgical data management system of claim 1, wherein a predefined first set of rules for processing data is associated with the first digital fence; a predefined second set of rules for processing data is associated with the second digital fence; and the adjusting comprises adding to at least one of the rules in the predefined first set of rules such that the dataflow is configured to be processed in accordance with the predefined second set of rules and with the added-to predefined first set of rules.
  • 17. The surgical data management system of claim 16, wherein the instructions, when executed by the processor, further cause the processor to perform operations comprising: adjust processing of the dataflow associated with the surgical system, which is located within the second digital fence, in response to the surgical system moving out of the second digital fence such that the dataflow is configured to be processed in accordance with the predefined first set of rules instead of in accordance with the added-to predefined first set of rules.
  • 18. The surgical data management system of claim 1, wherein the instructions, when executed by the processor, further cause the processor to perform operations comprising: receive data characterizing the surgical procedure, and establish, prior to a start of the performance of the surgical procedure on the patient and using the received data: the first digital fence and the second digital fence; wherein the received data includes at least one of information regarding a plan for the surgical procedure, a total amount of time for the surgical procedure, a location where the surgical procedure is to be performed, a layout of a location where the surgical procedure is to be performed, and at least one surgical tool to be used in the surgical procedure on the patient.
  • 19. The surgical data management system of claim 1, wherein boundaries of the first and second digital fences are defined using a global positioning system (GPS) or a radio frequency identification (RFID) system; the surgical system is one of a hospital network, a database, a surgical instrument, or a surgical cart; and/or the surgical hub is configured to be operatively coupled to a robotic surgical system.
  • 20. A surgical data management system, comprising: a cloud-based server comprising: a processor; and a memory storing instructions that, when executed by the processor, cause the processor to perform operations comprising: during a performance of a surgical procedure on a patient, adjust processing of a dataflow associated with a surgical system, which is located within a first digital fence surrounding an aspect of the surgical procedure, in response to the surgical system moving into a second digital fence that is nested within the first digital fence; wherein the dataflow includes data regarding a measured patient parameter that at least one of the cloud-based server and the surgical system is configured to use in performing a function during the performance of the surgical procedure.
CROSS REFERENCE TO RELATED APPLICATIONS

The present application claims priority to U.S. Provisional Patent Application No. 63/603,031 entitled “Smart Surgical Systems” filed Nov. 27, 2023, which is hereby incorporated by reference in its entirety. The subject matter of the present application is related to the following patent applications filed on Nov. 26, 2024, which are hereby incorporated by reference in their entireties: U.S. application Ser. No. 18/960,006 entitled “Methods For Smart Surgical Systems,” U.S. application Ser. No. 18/960,032 entitled “Data Flow Management Between Surgical Systems,” U.S. application Ser. No. 18/960,047 entitled “Mapping Data Pipelines For Surgical Systems,” U.S. application Ser. No. 18/960,059 entitled “Broadcast And Peer-To-Peer Communication For Surgical Systems,” U.S. application Ser. No. 18/960,070 entitled “Data Lifecycle Management For Surgical Systems,” U.S. application Ser. No. 18/960,081 entitled “Data Transformation For Surgical Systems,” U.S. application Ser. No. 18/960,107 entitled “Information Discrimination For Surgical Instruments,” and U.S. application Ser. No. 18/960,117 entitled “Adaptation Of Data Pipelines For Surgical Systems.”

Provisional Applications (1)
Number: 63/603,031; Date: Nov. 2023; Country: US