Patient care is generally improved when tailored to the individual. Every person has different needs, so surgical and interventional solutions that center on each patient's unique journey may represent efficient, groundbreaking pathways to healing. At the same time, the high stakes of patient care, particularly of surgical processes, often drive a focus on conservative, repeatable activities.
Innovative medical technology, such as advanced surgical support computing systems and intelligent surgical instruments, may improve approaches to patient care and address the particular needs of health care providers.
The ever-increasing availability of data and computing resources has made non-traditional algorithms, such as machine learning algorithms, a specific technical opportunity in health care systems. But incorporating such non-traditional algorithms into any medical technology presents many challenges.
Some surgical systems may include a centralized surgical computing device that may interact with a plurality of surgical devices. It may be desirable to have a surgical system configured in a manner that avoids, or partially avoids, the use of a centralized computing device.
Systems, methods, and instrumentalities may be provided for a smart surgical instrument or a surgical device that monitors other surgical instruments or surgical devices in a peer-to-peer interconnected surgical ecosystem. The monitoring and/or recording may be performed by a surgical device that may be configured as a monitoring surgical device. The monitoring surgical device may use the peer-to-peer surgical ecosystem to monitor and/or record surgical information associated with a surgical task on a peer surgical instrument, for example, without a central surgical hub.
In an example, a surgical device may determine that it has a capability of monitoring and recording surgical information associated with a surgical task of a surgical procedure being performed at a second surgical device. The surgical device may determine whether it has a capability of being configured as a monitoring surgical device. The capability of the surgical device being the monitoring surgical device may include the surgical device having a capability of accessing surgical information associated with a surgical task on a second peer surgical device and/or a capability of setting (e.g., remotely setting) a surgical parameter associated with a surgical task on the second surgical device, or its own surgical parameter, based at least on the surgical information accessed and/or received from the second peer surgical device. The accessed surgical information may include surgical data associated with a patient, a healthcare professional, a surgical task, and/or the second peer surgical device. Based on the determination, the surgical device may configure itself as a monitoring surgical device. In an example, the roles of the first surgical device being a monitoring surgical device and the second surgical device being a peer surgical instrument may be based on a negotiation between the monitoring surgical device and the second peer surgical device.
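The capability check and role negotiation described above might be sketched as follows; the device class, capability flags, and the pairing rule are illustrative assumptions rather than the claimed implementation:

```python
# Minimal sketch of the monitoring-role negotiation described above.
# Device names, capability flags, and the negotiation rule are assumptions.
from dataclasses import dataclass

@dataclass
class SurgicalDevice:
    device_id: str
    can_access_peer_info: bool = False   # can read surgical info from a peer
    can_set_parameters: bool = False     # can set a surgical parameter
    can_gather_and_report: bool = False  # can gather data and report it
    role: str = "unassigned"

    def can_monitor(self) -> bool:
        # A device qualifies as a monitoring device if it can access peer
        # surgical information and set a surgical parameter based on it.
        return self.can_access_peer_info and self.can_set_parameters

    def can_be_peer(self) -> bool:
        # A device qualifies as a monitored peer if it can gather surgical
        # data and report it to the monitoring device.
        return self.can_gather_and_report

def negotiate_roles(a: SurgicalDevice, b: SurgicalDevice) -> None:
    """Assign monitoring/peer roles based on advertised capabilities."""
    if a.can_monitor() and b.can_be_peer():
        a.role, b.role = "monitoring", "peer"
    elif b.can_monitor() and a.can_be_peer():
        b.role, a.role = "monitoring", "peer"
    else:
        raise RuntimeError("no viable monitoring/peer pairing")

stapler = SurgicalDevice("stapler-01", can_access_peer_info=True, can_set_parameters=True)
energy = SurgicalDevice("energy-02", can_gather_and_report=True)
negotiate_roles(stapler, energy)
print(stapler.role, energy.role)  # monitoring peer
```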
The monitoring surgical device may determine that the second surgical device has a capability of being a peer surgical device that may be monitored by it. The capability of being a peer surgical device may include having a capability of establishing a peer-to-peer connection with a monitoring surgical device and/or having a capability of gathering surgical data associated with a patient, a healthcare professional, or a surgical device and reporting the gathered surgical data to the monitoring surgical device. The surgical device may configure the second surgical device as a peer surgical device.
The surgical device may establish a peer-to-peer connection with the second surgical device. The peer-to-peer connection may be established between the first surgical device and the second surgical device for the first surgical device to monitor and record information associated with a surgical task on the second surgical device. The peer-to-peer connection between the two surgical devices may be established using a wired interface (e.g., over a local area network (LAN)), a wireless interface (e.g., a WiFi interface (WiFi 6, WiFi 6E, etc.), a Bluetooth interface, etc.), and/or an optical interface (e.g., a fiber optic-based LAN). The two surgical devices may exchange surgical information and other parameters without a central surgical computing device (e.g., a surgical hub).
After a peer-to-peer connection is established between the two surgical devices, the monitoring surgical device may begin monitoring and recording surgical information, and the two devices may exchange surgical information with each other. In an example, the monitoring surgical device may monitor and/or record surgical information associated with a surgical task on the peer surgical device using the established peer-to-peer connection. In an example, the peer surgical device may, periodically or aperiodically (e.g., based on a trigger), send/report surgical information associated with a surgical task to the monitoring surgical device using the established peer-to-peer connection.
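A minimal sketch of this hub-less reporting flow, assuming a loopback TCP link, a JSON-lines message format, and a fixed reporting period (none of which are specified by the disclosure):

```python
# Sketch: the peer device sends surgical information directly to the
# monitoring device over a peer-to-peer TCP link, with no central hub.
import json, socket, threading, time

HOST, PORT = "127.0.0.1", 50007
server = socket.create_server((HOST, PORT))  # monitoring device listens

def monitoring_device():
    conn, _ = server.accept()
    with conn:
        for raw in conn.makefile():
            record = json.loads(raw)
            print("recorded:", record)  # stand-in for durable recording

threading.Thread(target=monitoring_device, daemon=True).start()

# The peer device gathers task data and reports it over the P2P link;
# here it reports periodically, though a trigger could also prompt a report.
with socket.create_connection((HOST, PORT)) as sock:
    for seq in range(3):
        info = {"task": "firing", "seq": seq, "force_n": 42.0 + seq}
        sock.sendall((json.dumps(info) + "\n").encode())
        time.sleep(0.1)
time.sleep(0.2)  # let the monitor drain the last report before exit
```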
Surgical systems 102, 103, 104 may each include computer-enabled surgical equipment and devices. For example, surgical systems 102, 103, 104 may include a wearable sensing system 111, a human interface system 112, a robotic system 113, one or more intelligent instruments 114, an environmental sensing system 115, and/or the like. The wearable sensing system 111 may include one or more devices used to sense aspects of an individual's status and activity within a surgical environment. For example, the wearable sensing system 111 may include health care provider sensing systems and/or patient sensing systems.
The human interface system 112 may include devices that enable an individual to interact with the surgical system 102, 103, 104 and/or the cloud computing system 108. The human interface system 112 may include a human interface device.
The robotic system 113 may include surgical robotic devices, such as a surgical robot. The robotic system 113 may enable robotic surgical procedures. The robotic system 113 may receive information, settings, programming, controls, and the like from the surgical hub 106, for example. The robotic system 113 may send data, such as sensor data, feedback information, video information, operational logs, and the like to the surgical hub 106.
The environmental sensing system 115 may include one or more devices, for example, used for measuring one or more environmental attributes, for example, as further described in
The surgical system 102 may be in communication with a remote server 109 that may be part of a cloud computing system 108. In an example, the surgical system 102 may be in communication with the remote server 109 via a networked connection, such as an internet connection (e.g., business internet service, T3, cable/FIOS networking node, and the like). The surgical system 102 and/or a component therein may communicate with the remote server 109 via a cellular transmission/reception point (TRP) or a base station using one or more of the following cellular protocols: GSM/GPRS/EDGE (2G), UMTS/HSPA (3G), long term evolution (LTE) or 4G, LTE-Advanced (LTE-A), new radio (NR) or 5G.
In an example, the surgical hub 106 may facilitate displaying the image from a surgical imaging device, like a laparoscopic scope for example. The surgical hub 106 may have cooperative interactions with the other local systems to facilitate displaying information relevant to those local systems. The surgical hub 106 may interact with one or more sensing systems 111, 115, one or more intelligent instruments 114, and/or multiple displays. For example, the surgical hub 106 may be configured to gather measurement data from the one or more sensing systems 111, 115 and send notifications or control messages to the one or more sensing systems 111, 115. The surgical hub 106 may send and/or receive information, including notification information, to and/or from the human interface system 112. The human interface system 112 may include one or more human interface devices (HIDs). The surgical hub 106 may send notification information and/or control information to audio, display, and/or control devices that are in communication with the surgical hub.
For example, the sensing systems 111, 115 may include the wearable sensing system 111 (which may include one or more HCP sensing systems and one or more patient sensing systems) and the environmental sensing system 115. The one or more sensing systems 111, 115 may measure data relating to various biomarkers. The one or more sensing systems 111, 115 may measure the biomarkers using one or more sensors, for example, photosensors (e.g., photodiodes, photoresistors), mechanical sensors (e.g., motion sensors), acoustic sensors, electrical sensors, electrochemical sensors, thermoelectric sensors, infrared sensors, etc. The one or more sensors may measure the biomarkers as described herein using one or more of the following sensing technologies: photoplethysmography, electrocardiography, electroencephalography, colorimetry, impedimetry, potentiometry, amperometry, etc.
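As a toy illustration of turning raw sensor samples into a biomarker value, the sketch below estimates heart rate from a photoplethysmography-style waveform; the sampling rate, threshold, and synthetic signal are assumptions:

```python
# Toy sketch: derive a biomarker (heart rate) from raw sensor samples by
# counting upward threshold crossings as beats. Values are illustrative.
import math

FS = 50  # samples per second (assumed)
# Synthetic pulse waveform at 1.2 Hz, i.e., 72 beats per minute.
signal = [math.sin(2 * math.pi * 1.2 * n / FS) for n in range(FS * 10)]

# Count upward crossings of the 0.9 threshold as beats.
beats = sum(1 for prev, cur in zip(signal, signal[1:]) if prev < 0.9 <= cur)
heart_rate_bpm = beats * 60 / (len(signal) / FS)
print(f"estimated heart rate: {heart_rate_bpm:.0f} bpm")  # ~72 bpm
```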
The biomarkers measured by the one or more sensing systems 111, 115 may include, but are not limited to, sleep, core body temperature, maximal oxygen consumption, physical activity, alcohol consumption, respiration rate, oxygen saturation, blood pressure, blood sugar, heart rate variability, blood potential of hydrogen, hydration state, heart rate, skin conductance, peripheral temperature, tissue perfusion pressure, coughing and sneezing, gastrointestinal motility, gastrointestinal tract imaging, respiratory tract bacteria, edema, mental aspects, sweat, circulating tumor cells, autonomic tone, circadian rhythm, and/or menstrual cycle.
The biomarkers may relate to physiologic systems, which may include, but are not limited to, behavior and psychology, cardiovascular system, renal system, skin system, nervous system, gastrointestinal system, respiratory system, endocrine system, immune system, tumor, musculoskeletal system, and/or reproductive system. Information from the biomarkers may be determined and/or used by the computer-implemented patient and surgical system 100, for example, to improve said systems and/or to improve patient outcomes. The one or more sensing systems 111, 115, biomarkers, and physiological systems are described in more detail in U.S. application Ser. No. 17/156,287 (attorney docket number END9290USNP1), titled METHOD OF ADJUSTING A SURGICAL PARAMETER BASED ON BIOMARKER MEASUREMENTS, filed Jan. 22, 2021, the disclosure of which is herein incorporated by reference in its entirety.
As illustrated in
In one aspect, the surgical hub 206 may be configured to route a diagnostic input or feedback entered by a non-sterile operator at the visualization tower 226 to the primary display 223 within the sterile field, where it can be viewed by a sterile operator at the operating table. In one example, the input can be in the form of a modification to the snapshot displayed on the non-sterile display 227 or 229, which can be routed to the primary display 223 by the surgical hub 206.
Referring to
Other types of robotic systems can be readily adapted for use with the surgical system 202. Various examples of robotic systems and surgical tools that are suitable for use with the present disclosure are described in U.S. Patent Application Publication No. US 2019-0201137 A1 (U.S. patent application Ser. No. 16/209,407), titled METHOD OF ROBOTIC HUB COMMUNICATION, DETECTION, AND CONTROL, filed Dec. 4, 2018, the disclosure of which is herein incorporated by reference in its entirety.
Various examples of cloud-based analytics that are performed by the cloud computing system 208, and are suitable for use with the present disclosure, are described in U.S. Patent Application Publication No. US 2019-0206569 A1 (U.S. patent application Ser. No. 16/209,403), titled METHOD OF CLOUD BASED DATA ANALYTICS FOR USE WITH THE HUB, filed Dec. 4, 2018, the disclosure of which is herein incorporated by reference in its entirety.
In various aspects, the imaging device 230 may include at least one image sensor and one or more optical components. Suitable image sensors may include, but are not limited to, Charge-Coupled Device (CCD) sensors and Complementary Metal-Oxide Semiconductor (CMOS) sensors.
The optical components of the imaging device 230 may include one or more illumination sources and/or one or more lenses. The one or more illumination sources may be directed to illuminate portions of the surgical field. The one or more image sensors may receive light reflected or refracted from the surgical field, including light reflected or refracted from tissue and/or surgical instruments.
The one or more illumination sources may be configured to radiate electromagnetic energy in the visible spectrum as well as the invisible spectrum. The visible spectrum, sometimes referred to as the optical spectrum or luminous spectrum, is the portion of the electromagnetic spectrum that is visible to (i.e., can be detected by) the human eye and may be referred to as visible light or simply light. A typical human eye will respond to wavelengths in air that range from about 380 nm to about 750 nm.
The invisible spectrum (e.g., the non-luminous spectrum) is the portion of the electromagnetic spectrum that lies below and above the visible spectrum (i.e., wavelengths below about 380 nm and above about 750 nm). The invisible spectrum is not detectable by the human eye. Wavelengths greater than about 750 nm are longer than the red visible spectrum, and they become invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths less than about 380 nm are shorter than the violet spectrum, and they become invisible ultraviolet, x-ray, and gamma ray electromagnetic radiation.
In various aspects, the imaging device 230 is configured for use in a minimally invasive procedure. Examples of imaging devices suitable for use with the present disclosure include, but are not limited to, an arthroscope, angioscope, bronchoscope, choledochoscope, colonoscope, cystoscope, duodenoscope, enteroscope, esophagogastro-duodenoscope (gastroscope), endoscope, laryngoscope, nasopharyngoscope, nephroscope, sigmoidoscope, thoracoscope, and ureteroscope.
The imaging device may employ multi-spectrum monitoring to discriminate topography and underlying structures. A multi-spectral image is one that captures image data within specific wavelength ranges across the electromagnetic spectrum. The wavelengths may be separated by filters or by the use of instruments that are sensitive to particular wavelengths, including light from frequencies beyond the visible light range, e.g., IR and ultraviolet. Spectral imaging can allow extraction of additional information that the human eye fails to capture with its receptors for red, green, and blue. The use of multi-spectral imaging is described in greater detail under the heading “Advanced Imaging Acquisition Module” in U.S. Patent Application Publication No. US 2019-0200844 A1 (U.S. patent application Ser. No. 16/209,385), titled METHOD OF HUB COMMUNICATION, PROCESSING, STORAGE AND DISPLAY, filed Dec. 4, 2018, the disclosure of which is herein incorporated by reference in its entirety. Multi-spectrum monitoring can be a useful tool in relocating a surgical field after a surgical task is completed to perform one or more of the previously described tests on the treated tissue.

It is axiomatic that strict sterilization of the operating room and surgical equipment is required during any surgery. The strict hygiene and sterilization conditions required in a “surgical theater,” i.e., an operating or treatment room, necessitate the highest possible sterility of all medical devices and equipment. Part of that sterilization process is the need to sterilize anything that comes in contact with the patient or penetrates the sterile field, including the imaging device 230 and its attachments and components. It will be appreciated that the sterile field may be considered a specified area, such as within a tray or on a sterile towel, that is considered free of microorganisms, or the sterile field may be considered an area immediately around a patient who has been prepared for a surgical procedure. The sterile field may include the scrubbed team members, who are properly attired, and all furniture and fixtures in the area.
Wearable sensing system 211 illustrated in
The surgical hub 206 may use the surgeon biomarker measurement data associated with an HCP to adaptively control one or more surgical instruments 231. For example, the surgical hub 206 may send a control program to a surgical instrument 231 to control its actuators to limit or compensate for fatigue and use of fine motor skills. The surgical hub 206 may send the control program based on situational awareness and/or the context, such as the importance or criticality of a task. The control program may instruct the instrument to alter operation to provide more control when control is needed.
The modular control may be coupled to a non-contact sensor module. The non-contact sensor module may measure the dimensions of the operating theater and generate a map of the surgical theater using ultrasonic, laser-type, and/or similar non-contact measurement devices. Other distance sensors can be employed to determine the bounds of an operating room. An ultrasound-based non-contact sensor module may scan the operating theater by transmitting a burst of ultrasound and receiving the echo when it bounces off the perimeter walls of an operating theater as described under the heading “Surgical Hub Spatial Awareness Within an Operating Room” in U.S. Provisional Patent Application Ser. No. 62/611,341, titled INTERACTIVE SURGICAL PLATFORM, filed Dec. 28, 2017, which is herein incorporated by reference in its entirety. The sensor module may be configured to determine the size of the operating theater and to adjust Bluetooth-pairing distance limits. A laser-based non-contact sensor module may scan the operating theater by transmitting laser light pulses, receiving laser light pulses that bounce off the perimeter walls of the operating theater, and comparing the phase of the transmitted pulse to the received pulse to determine the size of the operating theater and to adjust Bluetooth pairing distance limits, for example.
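The time-of-flight arithmetic behind the ultrasound-based approach might look like the following sketch, where the echo delays and the pairing-limit rule are illustrative assumptions:

```python
# Sketch: echo delay gives the distance to a perimeter wall, and the
# Bluetooth pairing distance limit is derived from the measured room size.
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees C

def wall_distance(echo_delay_s: float) -> float:
    # Sound travels to the wall and back, so halve the round-trip distance.
    return SPEED_OF_SOUND_M_S * echo_delay_s / 2

# Assumed echo delays (seconds) from four perimeter walls.
delays = {"north": 0.012, "south": 0.015, "east": 0.020, "west": 0.018}
distances = {wall: wall_distance(t) for wall, t in delays.items()}

room_length = distances["north"] + distances["south"]
room_width = distances["east"] + distances["west"]
print(f"operating theater approx {room_length:.1f} m x {room_width:.1f} m")

# Bound Bluetooth pairing to devices inside the room (illustrative rule).
pairing_limit_m = max(room_length, room_width)
print(f"Bluetooth pairing distance limit: {pairing_limit_m:.1f} m")
```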
During a surgical procedure, energy application to tissue, for sealing and/or cutting, is generally associated with smoke evacuation, suction of excess fluid, and/or irrigation of the tissue. Fluid, power, and/or data lines from different sources are often entangled during the surgical procedure. Valuable time can be lost addressing this issue during a surgical procedure. Detangling the lines may necessitate disconnecting the lines from their respective modules, which may require resetting the modules. The hub modular enclosure 360 offers a unified environment for managing the power, data, and fluid lines, which reduces the frequency of entanglement between such lines. Aspects of the present disclosure present a surgical hub 306 for use in a surgical procedure that involves energy application to tissue at a surgical site.
The surgical hub 306 includes a hub enclosure 360 and a combo generator module slidably receivable in a docking station of the hub enclosure 360. The docking station includes data and power contacts. The combo generator module includes two or more of an ultrasonic energy generator component, a bipolar RF energy generator component, and a monopolar RF energy generator component that are housed in a single unit. In one aspect, the combo generator module also includes at least one energy delivery cable for connecting the combo generator module to a surgical instrument, at least one smoke evacuation component configured to evacuate smoke, fluid, and/or particulates generated by the application of therapeutic energy to the tissue, and a fluid line extending from the remote surgical site to the smoke evacuation component. In one aspect, the fluid line may be a first fluid line, and a second fluid line may extend from the remote surgical site to a suction and irrigation module 355 slidably received in the hub enclosure 360. In one aspect, the hub enclosure 360 may include a fluid interface.
Certain surgical procedures may require the application of more than one energy type to the tissue. One energy type may be more beneficial for cutting the tissue, while another different energy type may be more beneficial for sealing the tissue. For example, a bipolar generator can be used to seal the tissue while an ultrasonic generator can be used to cut the sealed tissue. Aspects of the present disclosure present a solution where a hub modular enclosure 360 is configured to accommodate different generators and facilitate an interactive communication therebetween. The hub modular enclosure 360 may enable the quick removal and/or replacement of various modules.

Aspects of the present disclosure present a modular surgical enclosure for use in a surgical procedure that involves energy application to tissue. The modular surgical enclosure includes a first energy-generator module, configured to generate a first energy for application to the tissue, and a first docking station comprising a first docking port that includes first data and power contacts, wherein the first energy-generator module is slidably movable into an electrical engagement with the first power and data contacts and wherein the first energy-generator module is slidably movable out of the electrical engagement with the first power and data contacts.

Further to the above, the modular surgical enclosure also includes a second energy-generator module configured to generate a second energy, different than the first energy, for application to the tissue, and a second docking station comprising a second docking port that includes second data and power contacts, wherein the second energy-generator module is slidably movable into an electrical engagement with the second power and data contacts, and wherein the second energy-generator module is slidably movable out of the electrical engagement with the second power and data contacts. In addition, the modular surgical enclosure also includes a communication bus between the first docking port and the second docking port, configured to facilitate communication between the first energy-generator module and the second energy-generator module.

Referring to
As illustrated in
The computer system 463 may comprise a processor and a network interface. The processor may be coupled to a communication module, storage, memory, non-volatile memory, and input/output (I/O) interface via a system bus. The system bus can be any of several types of bus structure(s) including the memory bus or memory controller, a peripheral bus or external bus, and/or a local bus using any variety of available bus architectures including, but not limited to, 9-bit bus, Industrial Standard Architecture (ISA), Micro-Channel Architecture (MSA), Extended ISA (EISA), Intelligent Drive Electronics (IDE), VESA Local Bus (VLB), Peripheral Component Interconnect (PCI), USB, Advanced Graphics Port (AGP), Personal Computer Memory Card International Association bus (PCMCIA), Small Computer Systems Interface (SCSI), or any other proprietary bus.
The processor may be any single-core or multicore processor such as those known under the trade name ARM Cortex by Texas Instruments. In one aspect, the processor may be an LM4F230H5QR ARM Cortex-M4F Processor Core, available from Texas Instruments, for example, comprising an on-chip memory of 256 KB single-cycle flash memory, or other non-volatile memory, up to 40 MHz, a prefetch buffer to improve performance above 40 MHz, a 32 KB single-cycle serial random access memory (SRAM), an internal read-only memory (ROM) loaded with StellarisWare® software, a 2 KB electrically erasable programmable read-only memory (EEPROM), and/or one or more pulse width modulation (PWM) modules, one or more quadrature encoder inputs (QEI) analogs, one or more 12-bit analog-to-digital converters (ADCs) with 12 analog input channels, details of which are available in the product datasheet.
In an example, the processor may comprise a safety controller comprising two controller-based families such as TMS570 and RM4x, known under the trade name Hercules ARM Cortex R4, also by Texas Instruments. The safety controller may be configured specifically for IEC 61508 and ISO 26262 safety critical applications, among others, to provide advanced integrated safety features while delivering scalable performance, connectivity, and memory options.
It is to be appreciated that the computer system 463 may include software that acts as an intermediary between users and the basic computer resources described in a suitable operating environment. Such software may include an operating system. The operating system, which can be stored on the disk storage, may act to control and allocate resources of the computer system. System applications may take advantage of the management of resources by the operating system through program modules and program data stored either in the system memory or on the disk storage. It is to be appreciated that various components described herein can be implemented with various operating systems or combinations of operating systems.
A user may enter commands or information into the computer system 463 through input device(s) coupled to the I/O interface. The input devices may include, but are not limited to, a pointing device such as a mouse, trackball, stylus, touch pad, keyboard, microphone, joystick, game pad, satellite dish, scanner, TV tuner card, digital camera, digital video camera, web camera, and the like. These and other input devices connect to the processor through the system bus via interface port(s). The interface port(s) include, for example, a serial port, a parallel port, a game port, and a USB. The output device(s) use some of the same types of ports as input device(s). Thus, for example, a USB port may be used to provide input to the computer system 463 and to output information from the computer system 463 to an output device. An output adapter may be provided to illustrate that there can be some output devices like monitors, displays, speakers, and printers, among other output devices that may require special adapters. The output adapters may include, by way of illustration and not limitation, video and sound cards that provide a means of connection between the output device and the system bus. It should be noted that other devices and/or systems of devices, such as remote computer(s), may provide both input and output capabilities.
The computer system 463 can operate in a networked environment using logical connections to one or more remote computers, such as cloud computer(s), or local computers. The remote cloud computer(s) can be a personal computer, server, router, network PC, workstation, microprocessor-based appliance, peer device, or other common network node, and the like, and typically includes many or all of the elements described relative to the computer system. For purposes of brevity, only a memory storage device is illustrated with the remote computer(s). The remote computer(s) may be logically connected to the computer system through a network interface and then physically connected via a communication connection. The network interface may encompass communication networks such as local area networks (LANs) and wide area networks (WANs). LAN technologies may include Fiber Distributed Data Interface (FDDI), Copper Distributed Data Interface (CDDI), Ethernet/IEEE 802.3, Token Ring/IEEE 802.5, and the like. WAN technologies may include, but are not limited to, point-to-point links, circuit-switching networks like Integrated Services Digital Networks (ISDN) and variations thereon, packet-switching networks, and Digital Subscriber Lines (DSL).
In various examples, the computer system 463 may comprise an image processor, image-processing engine, media processor, or any specialized digital signal processor (DSP) used for the processing of digital images. The image processor may employ parallel computing with single instruction, multiple data (SIMD) or multiple instruction, multiple data (MIMD) technologies to increase speed and efficiency. The digital image-processing engine can perform a range of tasks. The image processor may be a system on a chip with multicore processor architecture.
The communication connection(s) may refer to the hardware/software employed to connect the network interface to the bus. While the communication connection is shown for illustrative clarity inside the computer system 463, it can also be external to the computer system 463. The hardware/software necessary for connection to the network interface may include, for illustrative purposes only, internal and external technologies such as modems, including regular telephone-grade modems, cable modems, optical fiber modems, and DSL modems, ISDN adapters, and Ethernet cards. In some examples, the network interface may also be provided using an RF interface.
A surgical data network associated with the surgical hub system 460 may be configured as passive, intelligent, or switching. A passive surgical data network serves as a conduit for the data, enabling it to go from one device (or segment) to another and to the cloud computing resources. An intelligent surgical data network includes additional features to enable the traffic passing through the surgical data network to be monitored and to configure each port in the network hub 461 or network switch 462. An intelligent surgical data network may be referred to as a manageable hub or switch. A switching hub reads the destination address of each packet and then forwards the packet to the correct port.
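A switching hub of the kind described, which learns source addresses and forwards each packet only to the port matching its destination, might be sketched as follows (the addresses and port numbers are illustrative):

```python
# Sketch of the switching behavior described above: the switch learns which
# port each source address lives on, then forwards packets only to the port
# matching the destination address.
class SwitchingHub:
    def __init__(self):
        self.table: dict[str, int] = {}  # device address -> port

    def handle(self, src: str, dst: str, in_port: int) -> list[int]:
        self.table[src] = in_port  # learn the sender's port
        if dst in self.table:
            return [self.table[dst]]  # forward to the one correct port
        # Unknown destination: flood to every other known port as a fallback.
        return [p for p in set(self.table.values()) if p != in_port]

hub = SwitchingHub()
hub.handle("device-1a", "device-2m", in_port=1)   # learns device-1a on port 1
out = hub.handle("device-2m", "device-1a", in_port=4)
print(out)  # [1] -- forwarded only to the correct port
```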
Modular devices 1a-1n located in the operating theater may be coupled to the modular communication hub 465. The network hub 461 and/or the network switch 462 may be coupled to a network router 466 to connect the devices 1a-1n to the cloud computing system 464 or the local computer system 463. Data associated with the devices 1a-1n may be transferred to cloud-based computers via the router for remote data processing and manipulation. Data associated with the devices 1a-1n may also be transferred to the local computer system 463 for local data processing and manipulation. Modular devices 2a-2m located in the same operating theater also may be coupled to a network switch 462. The network switch 462 may be coupled to the network hub 461 and/or the network router 466 to connect the devices 2a-2m to the cloud 464. Data associated with the devices 2a-2m may be transferred to the cloud computing system 464 via the network router 466 for data processing and manipulation. Data associated with the devices 2a-2m may also be transferred to the local computer system 463 for local data processing and manipulation.
As illustrated in
The microcontroller 521 may be any single-core or multicore processor such as those known under the trade name ARM Cortex by Texas Instruments. In one aspect, the main microcontroller 521 may be an LM4F230H5QR ARM Cortex-M4F Processor Core, available from Texas Instruments, for example, comprising an on-chip memory of 256 KB single-cycle flash memory, or other non-volatile memory, up to 40 MHz, a prefetch buffer to improve performance above 40 MHz, a 32 KB single-cycle SRAM, an internal ROM loaded with StellarisWare® software, a 2 KB EEPROM, one or more PWM modules, one or more QEI analogs, and/or one or more 12-bit ADCs with 12 analog input channels, details of which are available in the product datasheet.
The microcontroller 521 may comprise a safety controller comprising two controller-based families such as TMS570 and RM4x, known under the trade name Hercules ARM Cortex R4, also by Texas Instruments. The safety controller may be configured specifically for IEC 61508 and ISO 26262 safety critical applications, among others, to provide advanced integrated safety features while delivering scalable performance, connectivity, and memory options.
The microcontroller 521 may be programmed to perform various functions such as precise control over the speed and position of the knife and articulation systems. In one aspect, the microcontroller 521 may include a processor 522 and a memory 523. The electric motor 530 may be a brushed direct current (DC) motor with a gearbox and mechanical links to an articulation or knife system. In one aspect, a motor driver 529 may be an A3941 available from Allegro Microsystems, Inc. Other motor drivers may be readily substituted for use in the tracking system 528 comprising an absolute positioning system. A detailed description of an absolute positioning system is described in U.S. Patent Application Publication No. 2017/0296213, titled SYSTEMS AND METHODS FOR CONTROLLING A SURGICAL STAPLING AND CUTTING INSTRUMENT, which published on Oct. 19, 2017, which is herein incorporated by reference in its entirety.
The microcontroller 521 may be programmed to provide precise control over the speed and position of displacement members and articulation systems. The microcontroller 521 may be configured to compute a response in the software of the microcontroller 521. The computed response may be compared to a measured response of the actual system to obtain an “observed” response, which is used for actual feedback decisions. The observed response may be a favorable, tuned value that balances the smooth, continuous nature of the simulated response with the measured response, which can detect outside influences on the system.
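One plausible reading of the "observed" response described above is a weighted blend of the computed (simulated) response and the measured response; in the sketch below, the weighting factor and signals are assumptions:

```python
# Minimal sketch of the "observed" response: a weighted blend of the
# simulated (computed) response and the measured response.
def observed_response(simulated: float, measured: float, alpha: float = 0.7) -> float:
    # alpha near 1 favors the smooth simulated response; lower alpha lets
    # the measured response (and outside influences it captures) dominate.
    return alpha * simulated + (1 - alpha) * measured

simulated = [0.0, 1.0, 2.0, 3.0]   # model-predicted knife position (mm)
measured = [0.0, 0.9, 2.3, 3.4]    # noisy sensor readings (mm)
observed = [observed_response(s, m) for s, m in zip(simulated, measured)]
print(observed)  # values used for the actual feedback decisions
```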
The motor 530 may be controlled by the motor driver 529 and can be employed by the firing system of the surgical instrument or tool. In various forms, the motor 530 may be a brushed DC driving motor having a maximum rotational speed of approximately 25,000 RPM. In some examples, the motor 530 may include a brushless motor, a cordless motor, a synchronous motor, a stepper motor, or any other suitable electric motor. The motor driver 529 may comprise an H-bridge driver comprising field-effect transistors (FETs), for example. The motor 530 can be powered by a power assembly releasably mounted to the handle assembly or tool housing for supplying control power to the surgical instrument or tool. The power assembly may comprise a battery which may include a number of battery cells connected in series that can be used as the power source to power the surgical instrument or tool. In certain circumstances, the battery cells of the power assembly may be replaceable and/or rechargeable. In at least one example, the battery cells can be lithium-ion batteries which can be couplable to and separable from the power assembly.
The motor driver 529 may be an A3941 available from Allegro Microsystems, Inc. A3941 may be a full-bridge controller for use with external N-channel power metal-oxide semiconductor field-effect transistors (MOSFETs) specifically designed for inductive loads, such as brush DC motors. The driver 529 may comprise a unique charge pump regulator that can provide full (>10 V) gate drive for battery voltages down to 7 V and can allow the A3941 to operate with a reduced gate drive, down to 5.5 V. A bootstrap capacitor may be employed to provide the above battery supply voltage required for N-channel MOSFETs. An internal charge pump for the high-side drive may allow DC (100% duty cycle) operation. The full bridge can be driven in fast or slow decay modes using diode or synchronous rectification. In the slow decay mode, current recirculation can be through the high-side or the low-side FETs. The power FETs may be protected from shoot-through by resistor-adjustable dead time. Integrated diagnostics provide indications of undervoltage, overtemperature, and power bridge faults and can be configured to protect the power MOSFETs under most short circuit conditions. Other motor drivers may be readily substituted for use in the tracking system 528 comprising an absolute positioning system.
The tracking system 528 may comprise a controlled motor drive circuit arrangement comprising a position sensor 525 according to one aspect of this disclosure. The position sensor 525 for an absolute positioning system may provide a unique position signal corresponding to the location of a displacement member. In some examples, the displacement member may represent a longitudinally movable drive member comprising a rack of drive teeth for meshing engagement with a corresponding drive gear of a gear reducer assembly. In some examples, the displacement member may represent the firing member, which could be adapted and configured to include a rack of drive teeth. In some examples, the displacement member may represent a firing bar or the I-beam, each of which can be adapted and configured to include a rack of drive teeth. Accordingly, as used herein, the term displacement member can be used generically to refer to any movable member of the surgical instrument or tool such as the drive member, the firing member, the firing bar, the I-beam, or any element that can be displaced. In one aspect, the longitudinally movable drive member can be coupled to the firing member, the firing bar, and the I-beam. Accordingly, the absolute positioning system can, in effect, track the linear displacement of the I-beam by tracking the linear displacement of the longitudinally movable drive member. In various aspects, the displacement member may be coupled to any position sensor 525 suitable for measuring linear displacement. Thus, the longitudinally movable drive member, the firing member, the firing bar, or the I-beam, or combinations thereof, may be coupled to any suitable linear displacement sensor. Linear displacement sensors may include contact or non-contact displacement sensors. Linear displacement sensors may comprise linear variable differential transformers (LVDT), differential variable reluctance transducers (DVRT), a slide potentiometer, a magnetic sensing system comprising a movable magnet and a series of linearly arranged Hall effect sensors, a magnetic sensing system comprising a fixed magnet and a series of movable, linearly arranged Hall effect sensors, an optical sensing system comprising a movable light source and a series of linearly arranged photodiodes or photodetectors, an optical sensing system comprising a fixed light source and a series of movable, linearly arranged photodiodes or photodetectors, or any combination thereof.
The electric motor 530 can include a rotatable shaft that operably interfaces with a gear assembly that is mounted in meshing engagement with a set, or rack, of drive teeth on the displacement member. A sensor element may be operably coupled to a gear assembly such that a single revolution of the position sensor 525 element corresponds to some linear longitudinal translation of the displacement member. An arrangement of gearing and sensors can be connected to the linear actuator, via a rack and pinion arrangement, or a rotary actuator, via a spur gear or other connection. A power source may supply power to the absolute positioning system and an output indicator may display the output of the absolute positioning system. The displacement member may represent the longitudinally movable drive member comprising a rack of drive teeth formed thereon for meshing engagement with a corresponding drive gear of the gear reducer assembly. The displacement member may represent the longitudinally movable firing member, firing bar, I-beam, or combinations thereof.
A single revolution of the sensor element associated with the position sensor 525 may be equivalent to a longitudinal linear displacement d1 of the displacement member, where d1 is the longitudinal linear distance that the displacement member moves from point “a” to point “b” after a single revolution of the sensor element coupled to the displacement member. The sensor arrangement may be connected via a gear reduction that results in the position sensor 525 completing one or more revolutions for the full stroke of the displacement member. The position sensor 525 may complete multiple revolutions for the full stroke of the displacement member.
A series of n switches, where n is an integer greater than one, may be employed alone or in combination with a gear reduction to provide a unique position signal for more than one revolution of the position sensor 525. The state of the switches may be fed back to the microcontroller 521, which applies logic to determine a unique position signal corresponding to the longitudinal linear displacement d1+d2+ . . . +dn of the displacement member. The output of the position sensor 525 is provided to the microcontroller 521. The position sensor 525 of the sensor arrangement may comprise a magnetic sensor, an analog rotary sensor like a potentiometer, or an array of analog Hall-effect elements, which output a unique combination of position signals or values.
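The displacement arithmetic above can be made concrete with a short sketch: whole revolutions (decoded, for example, from the switch states) contribute d1 of travel each, and the in-revolution sensor angle contributes the fractional part. The per-revolution travel and sample values are assumed:

```python
# Worked sketch of the revolution-to-displacement arithmetic described
# above. D1_MM and the sample inputs are illustrative assumptions.
D1_MM = 10.0  # linear travel per sensor revolution (assumed)

def absolute_position(revolutions: int, angle_deg: float) -> float:
    """Unique position = whole revolutions plus the fractional revolution."""
    return revolutions * D1_MM + (angle_deg / 360.0) * D1_MM

# e.g., switch states decode to 2 completed revolutions, sensor at 90 deg.
print(absolute_position(2, 90.0))  # 22.5 mm of displacement
```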
The position sensor 525 may comprise any number of magnetic sensing elements, such as, for example, magnetic sensors classified according to whether they measure the total magnetic field or the vector components of the magnetic field. The techniques used to produce both types of magnetic sensors may encompass many aspects of physics and electronics. The technologies used for magnetic field sensing may include search coil, fluxgate, optically pumped, nuclear precession, SQUID, Hall-effect, anisotropic magnetoresistance, giant magnetoresistance, magnetic tunnel junctions, giant magnetoimpedance, magnetostrictive/piezoelectric composites, magnetodiode, magnetotransistor, fiber-optic, magneto-optic, and microelectromechanical systems-based magnetic sensors, among others.
The position sensor 525 for the tracking system 528 comprising an absolute positioning system may comprise a magnetic rotary absolute positioning system. The position sensor 525 may be implemented as an AS5055EQFT single-chip magnetic rotary position sensor available from Austria Microsystems, AG. The position sensor 525 is interfaced with the microcontroller 521 to provide an absolute positioning system. The position sensor 525 may be a low-voltage and low-power component and may include four Hall-effect elements in an area of the position sensor 525 that may be located above a magnet. A high-resolution ADC and a smart power management controller may also be provided on the chip. A coordinate rotation digital computer (CORDIC) processor, also known as the digit-by-digit method and Volder's algorithm, may be provided to implement a simple and efficient algorithm to calculate hyperbolic and trigonometric functions that require only addition, subtraction, bit-shift, and table lookup operations. The angle position, alarm bits, and magnetic field information may be transmitted over a standard serial communication interface, such as a serial peripheral interface (SPI) interface, to the microcontroller 521. The position sensor 525 may provide 12 or 14 bits of resolution. The position sensor 525 may be an AS5055 chip provided in a small QFN 16-pin 4×4×0.85 mm package.
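A CORDIC of the kind named above can recover an angle using only addition, subtraction, shifts (expressed here as division by powers of two in floating point), and a small arctangent table; the iteration count and inputs below are illustrative:

```python
# Illustrative CORDIC (vectoring mode) sketch: recovers the magnet angle
# from two field components by rotating the vector toward the x-axis while
# accumulating the rotation angle, using only add/subtract/shift/table ops.
import math

ATAN_TABLE = [math.atan(2.0 ** -i) for i in range(16)]  # lookup table

def cordic_angle(x: float, y: float) -> float:
    angle = 0.0
    for i, atan_i in enumerate(ATAN_TABLE):
        if y > 0:   # rotate clockwise toward the x-axis
            x, y = x + y / 2**i, y - x / 2**i
            angle += atan_i
        else:       # rotate counterclockwise
            x, y = x - y / 2**i, y + x / 2**i
            angle -= atan_i
    return angle

# Hall-element readings for a magnet at 30 degrees (assumed values).
theta = math.radians(30)
print(math.degrees(cordic_angle(math.cos(theta), math.sin(theta))))  # ~30.0
```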
The tracking system 528 comprising an absolute positioning system may comprise and/or be programmed to implement a feedback controller, such as a PID, state feedback, and adaptive controller. A power source converts the signal from the feedback controller into a physical input to the system: in this case the voltage. Other examples include a PWM of the voltage, current, and force. Other sensor(s) may be provided to measure physical parameters of the physical system in addition to the position measured by the position sensor 525. In some aspects, the other sensor(s) can include sensor arrangements such as those described in U.S. Pat. No. 9,345,481, titled STAPLE CARTRIDGE TISSUE THICKNESS SENSOR SYSTEM, which issued on May 24, 2016, which is herein incorporated by reference in its entirety; U.S. Patent Application Publication No. 2014/0263552, titled STAPLE CARTRIDGE TISSUE THICKNESS SENSOR SYSTEM, which published on Sep. 18, 2014, which is herein incorporated by reference in its entirety; and U.S. patent application Ser. No. 15/628,175, titled TECHNIQUES FOR ADAPTIVE CONTROL OF MOTOR VELOCITY OF A SURGICAL STAPLING AND CUTTING INSTRUMENT, filed Jun. 20, 2017, which is herein incorporated by reference in its entirety. In a digital signal processing system, an absolute positioning system is coupled to a digital data acquisition system where the output of the absolute positioning system will have a finite resolution and sampling frequency. The absolute positioning system may comprise a compare-and-combine circuit to combine a computed response with a measured response using algorithms, such as a weighted average and a theoretical control loop, that drive the computed response towards the measured response. The computed response of the physical system may take into account properties like mass, inertia, viscous friction, inductance resistance, etc., to predict what the states and outputs of the physical system will be by knowing the input.
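A minimal PID feedback controller of the kind named above might be sketched as follows, driving a displacement toward a setpoint through a toy first-order plant; the gains, timestep, and plant model are illustrative assumptions, not tuned values:

```python
# Sketch of a PID feedback controller whose output (here, a voltage) is the
# physical input to the system, as described above. Gains are illustrative.
def make_pid(kp: float, ki: float, kd: float, dt: float):
    integral, prev_error = 0.0, 0.0
    def step(setpoint: float, measured: float) -> float:
        nonlocal integral, prev_error
        error = setpoint - measured
        integral += error * dt
        derivative = (error - prev_error) / dt
        prev_error = error
        return kp * error + ki * integral + kd * derivative  # e.g., volts
    return step

pid = make_pid(kp=2.0, ki=0.5, kd=0.05, dt=0.01)
position = 0.0
for _ in range(500):
    voltage = pid(setpoint=10.0, measured=position)
    position += 0.01 * voltage  # toy first-order plant response
print(round(position, 2))  # approaches the 10.0 mm setpoint
```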
The absolute positioning system may provide an absolute position of the displacement member upon power-up of the instrument, without retracting or advancing the displacement member to a reset (zero or home) position as may be required with conventional rotary encoders that merely count the number of steps forwards or backwards that the motor 530 has taken to infer the position of a device actuator, drive bar, knife, or the like.
A sensor 526, such as, for example, a strain gauge or a micro-strain gauge, may be configured to measure one or more parameters of the end effector, such as, for example, the amplitude of the strain exerted on the anvil during a clamping operation, which can be indicative of the closure forces applied to the anvil. The measured strain may be converted to a digital signal and provided to the processor 522. Alternatively, or in addition to the sensor 526, a sensor 527, such as, for example, a load sensor, can measure the closure force applied by the closure drive system to the anvil. The sensor 527, such as, for example, a load sensor, can measure the firing force applied to an I-beam in a firing stroke of the surgical instrument or tool. The I-beam is configured to engage a wedge sled, which is configured to upwardly cam staple drivers to force out staples into deforming contact with an anvil. The I-beam also may include a sharpened cutting edge that can be used to sever tissue as the I-beam is advanced distally by the firing bar. Alternatively, a current sensor 531 can be employed to measure the current drawn by the motor 530. The force required to advance the firing member can correspond to the current drawn by the motor 530, for example. The measured force may be converted to a digital signal and provided to the processor 522.
For example, the strain gauge sensor 526 can be used to measure the force applied to the tissue by the end effector. A strain gauge can be coupled to the end effector to measure the force on the tissue being treated by the end effector. A system for measuring forces applied to the tissue grasped by the end effector may comprise a strain gauge sensor 526, such as, for example, a micro-strain gauge, that can be configured to measure one or more parameters of the end effector, for example. In one aspect, the strain gauge sensor 526 can measure the amplitude or magnitude of the strain exerted on a jaw member of an end effector during a clamping operation, which can be indicative of the tissue compression. The measured strain can be converted to a digital signal and provided to a processor 522 of the microcontroller 521. A load sensor 527 can measure the force used to operate the knife element, for example, to cut the tissue captured between the anvil and the staple cartridge. A magnetic field sensor can be employed to measure the thickness of the captured tissue. The measurement of the magnetic field sensor also may be converted to a digital signal and provided to the processor 522.
The measurements of the tissue compression, the tissue thickness, and/or the force required to close the end effector on the tissue, as respectively measured by the sensors 526, 527, can be used by the microcontroller 521 to characterize the selected position of the firing member and/or the corresponding value of the speed of the firing member. In one instance, a memory 523 may store a technique, an equation, and/or a lookup table which can be employed by the microcontroller 521 in the assessment.
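The lookup-table assessment mentioned above might be sketched as follows; the thresholds and speeds are illustrative assumptions, not clinical values:

```python
# Sketch: measured tissue thickness and compression select a firing-member
# speed from a lookup table, as the text describes. Values are assumed.
# (max thickness in mm, max compression in N) -> firing speed in mm/s
FIRING_SPEED_TABLE = [
    (1.5, 40.0, 12.0),                  # thin, lightly compressed: fast
    (3.0, 80.0, 7.0),                   # moderate: medium
    (float("inf"), float("inf"), 3.0),  # thick or heavily compressed: slow
]

def firing_speed(thickness_mm: float, compression_n: float) -> float:
    for max_thickness, max_compression, speed in FIRING_SPEED_TABLE:
        if thickness_mm <= max_thickness and compression_n <= max_compression:
            return speed
    return FIRING_SPEED_TABLE[-1][2]

print(firing_speed(2.1, 55.0))  # 7.0 mm/s for moderate tissue
```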
The control system 520 of the surgical instrument or tool also may comprise wired or wireless communication circuits to communicate with a surgical hub, such as surgical hub 460 for example, as shown in
The first and second jaws 691, 690 may be configured to clamp tissue therebetween, fire fasteners through the clamped tissue, and sever the clamped tissue. The first jaw 691 may be configured to fire at least one fastener a plurality of times or may be configured to include a replaceable multi-fire fastener cartridge including a plurality of fasteners (e.g., staples, clips, etc.) that may be fired more than one time prior to being replaced. The second jaw 690 may include an anvil that deforms or otherwise secures the fasteners, as the fasteners are ejected from the multi-fire fastener cartridge.
The handle 697 may include a motor that is coupled to the drive shaft to effect rotation of the drive shaft. The handle 697 may include a control interface to selectively activate the motor. The control interface may include buttons, switches, levers, sliders, touchscreens, and any other suitable input mechanisms or user interfaces, which can be engaged by a clinician to activate the motor.
The control interface of the handle 697 may be in communication with a controller 698 of the handle 697 to selectively activate the motor to effect rotation of the drive shafts. The controller 698 may be disposed within the handle 697 and may be configured to receive input from the control interface and adapter data from the adapter 685 or loading unit data from the loading unit 687. The controller 698 may analyze the input from the control interface and the data received from the adapter 685 and/or loading unit 687 to selectively activate the motor. The handle 697 may also include a display that is viewable by a clinician during use of the handle 697. The display may be configured to display portions of the adapter or loading unit data before, during, or after firing of the instrument 682.
The adapter 685 may include an adapter identification device 684 disposed therein and the loading unit 687 may include a loading unit identification device 688 disposed therein. The adapter identification device 684 may be in communication with the controller 698, and the loading unit identification device 688 may be in communication with the controller 698. It will be appreciated that the loading unit identification device 688 may be in communication with the adapter identification device 684, which relays or passes communication from the loading unit identification device 688 to the controller 698.
The adapter 685 may also include a plurality of sensors 686 (one shown) disposed thereabout to detect various conditions of the adapter 685 or of the environment (e.g., if the adapter 685 is connected to a loading unit, if the adapter 685 is connected to a handle, if the drive shafts are rotating, the torque of the drive shafts, the strain of the drive shafts, the temperature within the adapter 685, a number of firings of the adapter 685, a peak force of the adapter 685 during firing, a total amount of force applied to the adapter 685, a peak retraction force of the adapter 685, a number of pauses of the adapter 685 during firing, etc.). The plurality of sensors 686 may provide an input to the adapter identification device 684 in the form of data signals. The data signals of the plurality of sensors 686 may be stored within or be used to update the adapter data stored within the adapter identification device 684. The data signals of the plurality of sensors 686 may be analog or digital. The plurality of sensors 686 may include a force gauge to measure a force exerted on the loading unit 687 during firing.
The handle 697 and the adapter 685 can be configured to interconnect the adapter identification device 684 and the loading unit identification device 688 with the controller 698 via an electrical interface. The electrical interface may be a direct electrical interface (i.e., include electrical contacts that engage one another to transmit energy and signals therebetween). Additionally, or alternatively, the electrical interface may be a non-contact electrical interface to wirelessly transmit energy and signals therebetween (e.g., inductively transfer). It is also contemplated that the adapter identification device 684 and the controller 698 may be in wireless communication with one another via a wireless connection separate from the electrical interface.
The handle 697 may include a transceiver 683 that is configured to transmit instrument data from the controller 698 to other components of the system 680 (e.g., the LAN 692, the cloud 693, the console 694, or the portable device 696). The controller 698 may also transmit instrument data and/or measurement data associated with one or more sensors 686 to a surgical hub. The transceiver 683 may receive data (e.g., cartridge data, loading unit data, adapter data, or other notifications) from the surgical hub 670. The transceiver 683 may receive data (e.g., cartridge data, loading unit data, or adapter data) from the other components of the system 680. For example, the controller 698 may transmit instrument data including a serial number of an attached adapter (e.g., adapter 685) attached to the handle 697, a serial number of a loading unit (e.g., loading unit 687) attached to the adapter 685, and a serial number of a multi-fire fastener cartridge loaded into the loading unit to the console 694. Thereafter, the console 694 may transmit data (e.g., cartridge data, loading unit data, or adapter data) associated with the attached cartridge, loading unit, and adapter, respectively, back to the controller 698. The controller 698 can display messages on the local instrument display or transmit the message, via transceiver 683, to the console 694 or the portable device 696 to display the message on the display 695 or portable device screen, respectively.
The surgical information may include information related to the configuration and/or control of devices being used in the surgery (e.g., device operational information 729). Such device operational information 729 may include information about the initial settings of surgical devices. Device operational information 729 may include information about changes to the settings of surgical devices. Device operational information 729 may include information about controls sent to the devices from the surgical hub 704 and information flows related to such controls.
The surgical information may include information generated during the surgery itself (e.g., surgery information 727). Such surgery information 727 may include any information generated by a surgical data source 726. The data sources 726 may include any device in a surgical context that may generate useful surgery information 727. This surgery information 727 may present itself as observable qualities of the data source 726. The observable qualities may include static qualities, such as a device's model number, serial number, and the like. The observable qualities may include dynamic qualities, such as the state of configurable settings of the device. The surgery information 727 may present itself as the result of sensor observations, for example. Sensor observations may include those from specific sensors within the surgical theatre, sensors for monitoring conditions such as patient condition, sensors embedded in surgical devices, and the like. The sensor observations may include information used during the surgery, such as video, audio, and the like. The surgery information 727 may present itself as device event data. Surgical devices may generate notifications and/or may log events, and such events may be included in surgery information 727 for communication to the surgical hub 704. The surgery information 727 may present itself as the result of manual recording, for example. A healthcare professional may make a record during the surgery, such as asking that a note be taken, capturing a still image from a display, and the like.
The surgical data sources 726 may include modular devices (e.g., which can include sensors configured to detect parameters associated with the patient, HCPs and environment and/or the modular device itself), local databases (e.g., a local EMR database containing patient records), patient monitoring devices (e.g., a blood pressure (BP) monitor and an electrocardiography (EKG) monitor), HCP monitoring devices, environment monitoring devices, surgical instruments, surgical support equipment, and the like.
The surgical hub 704 can be configured to derive the contextual information pertaining to the surgical procedure from the data based upon, for example, the particular combination(s) of received data or the particular order in which the data is received from the data sources 726. The contextual information inferred from the received data can include, for example, the type of surgical procedure being performed, the particular step of the surgical procedure that the surgeon is performing, the type of tissue being operated on, or the body cavity that is the subject of the procedure. This ability by some aspects of the surgical hub 704 to derive or infer information related to the surgical procedure from received data can be referred to as “situational awareness.” For example, the surgical hub 704 can incorporate a situational awareness system, which is the hardware and/or programming associated with the surgical hub 704 that derives contextual information pertaining to the surgical procedure from the received data and/or surgical plan information received from the edge computing system 714 or a healthcare data system 716 (e.g., an enterprise cloud server).
In operation, this matrix of surgical information may be present as one or more information flows. For example, surgical information may flow from the surgical data sources 726 to the surgical hub 704. Surgical information may flow from the surgical hub 704 to the surgical data sources 726 (e.g., surgical devices). Surgical information may flow between the surgical hub 704 and one or more healthcare data systems 716. Surgical information may flow between the surgical hub 704 and one or more edge computing devices 714.
Surgical information, as presented in its one or more information flows, may be used in connection with one or more artificial intelligence (AI) systems to further enhance the operation of the surgical system 700. For example, a machine learning system, such as that described herein, may operate on one or more information flows to further enhance the operation of the surgical system 700.
For example, a generator module 734 (and corresponding energy device) may exchange surgical information with the surgical computing device 704. Such information may include information from the generator module 734 (and corresponding energy device), such as electrical information (e.g., current, voltage, impedance, frequency, wattage), activity state information, sensor information such as temperature, current settings, system events, active time duration, and activation timestamp, and the like. The generator module 734 may receive information from the surgical computing device 704, such as control information, configuration information, changes to the nature of the visible and audible notifications to the healthcare professional (e.g., changing the pitch, duration, and melody of audible tones), electrical application profiles and/or application logic that may instruct the generator module to provide energy with a defined characteristic curve over the application time, operational updates (such as software/firmware), and the like.
For example, a smoke evacuator 735 may exchange surgical information with the surgical computing device 704. Such information may include information from the smoke evacuator 735, such as operational information (e.g., revolutions per minute), activity state information, sensor information such as air temperature, current settings, system events, active time duration, and activation timestamp, and the like. The smoke evacuator 735 may receive information from the surgical computing device 704, such as control information, configuration information, operational updates (such as software/firmware), and the like.
For example, a suction/irrigation module 736 may exchange surgical information with the surgical computing device 704. Such information may include information from the suction/irrigation module 736, such as operational information (e.g., liters per minute), activity state information, internal sensor information, current settings, system events, active time duration, and activation timestamp, and the like. The suction/irrigation module 736 may receive information from the surgical computing device 704, such as control information, configuration information, operational updates (such as software/firmware), and the like.
For example, a communication module 739, a processor module 737, and/or a storage array 738 may exchange surgical information with the surgical computing device 704. In an example, the communication module 739, the processor module 737, and/or the storage array 738 may constitute all or part of the computing platform upon which the surgical computing device 704 runs. In an example, the communication module 739, the processor module 737, and/or the storage array 738 may provide local computing resources to other devices in the surgical system 730. Information from the communication module 739, the processor module 737, and/or the storage array 738 to the surgical computing device 704 may include logical computing-related reports, such as processing load, processing capacity, process identification, CPU %, CPU time, threads, GPU %, GPU time, memory utilization, memory thread, memory ports, energy usage, bandwidth related information, packets in, packets out, data rate, channel utilization, buffer status, packet loss information, system events, other state information, and the like. The communication module 739, the processor module 737, and/or the storage array 738 may receive information from the surgical computing device 704, such as control information, configuration information, operational updates (such as software/firmware), and the like. The communication module 739, the processor module 737, and/or the storage array 738 may also receive information from the surgical computing device 704 generated by another element or device of the surgical system 730. For example, data source information may be sent to and stored in the storage array. For example, data source information may be processed by the processor module 737.
For example, an intelligent instrument 740 (with or without a corresponding display) may exchange surgical information with the surgical computing device 704. Such information may include information from the intelligent instrument 740 relative to the instrument's operation, such as device electrical and/or mechanical information (e.g., current, voltage, impedance, frequency, wattage, torque, force, pressure, etc.), load state information (e.g., information regarding the identity, type, and/or status of reusables, such as staple cartridges), internal sensor information such as clamping force, tissue compression pressure and/or time, system events, active time duration, and activation timestamp, and the like. The intelligent instrument 740 may receive information from the surgical computing device 704, such as control information, configuration information, changes to the nature of the visible and audible notifications to the healthcare professional (e.g., changing the pitch, duration, and melody of audible tones), mechanical application profiles and/or application logic that may instruct a mechanical component of the instrument to operate with a defined characteristic (e.g., blade/anvil advance speed, mechanical advantage, firing time, etc.), operational updates (such as software/firmware), and the like.
For example, a sensor module 741 may exchange surgical information with the surgical computing device 704. Such information may include information from the sensor module 741 relative to its sensor function, such as sensor results themselves, observational frequency and/or resolution, observational type, device alerts such as alerts for sensor failure, observations exceeding a defined range, observations exceeding an observable range, and the like. The sensor module 741 may receive information from the surgical computing device 704, such as control information, configuration information, changes to the nature of observation (e.g., frequency, resolution, observational type etc.), triggers that define specific events for observation, on control, off control, data buffering, data preprocessing algorithms, operational updates (such as software/firmware), and the like.
For example, a visualization system 742 may exchange surgical information with the surgical computing device 704. Such information may include information from the visualization system 742, such as visualization data itself (e.g., still image, video, advanced spectrum visualization, etc.) and visualization metadata (e.g., visualization type, resolution, frame rate, encoding, bandwidth, etc.). The visualization system 742 may receive information from the surgical computing device 704, such as control information, configuration information, changes to the video settings (e.g., visualization type, resolution, frame rate, encoding, etc.), visual display overlay data, data buffering size, data preprocessing algorithms, operational updates (such as software/firmware), and the like.
For example, a surgical robot 743 may exchange surgical information with the surgical computing device 704. Information from the surgical robot 743 may include any aforementioned information as applied to robotic instruments, sensors, and devices. Information from the surgical robot 743 may also include information related to the robotic operation or control of such instruments, such as electrical/mechanical feedback of robot articulators, system events, system settings, mechanical resolution, control operation log, articulator path information, and the like. The surgical robot 743 may receive information from the surgical computing device 704, such as control information, configuration information, operational updates (such as software/firmware), and the like.
Here, the two surgical computing systems 704a, 704b request permission from a surgical operator for the second surgical computing system 704b (with the corresponding surgical robot 756) to take control of the operating room from the existing surgical computing system 704a. The second surgical computing system 704b presents in the operating theater with control of the corresponding surgical robot 756, a robot visualization tower 758, a mono hat tool 759, and a robot stapler 749. The permission can be requested through a surgeon interface or console 751. Once permission is granted, the second surgical computing system 704b messages the existing surgical computing system 704a with a request to transfer control of the operating room.
In an example, the surgical computing systems 704a, 704b can negotiate the nature of their interaction without external input based on previously gathered data. For example, the surgical computing systems 704a, 704b may collectively determine that the next surgical task requires use of a robotic system. Such determination may cause the existing surgical computing system 704a to autonomously surrender control of the operating room to the second surgical computing system 704b. Upon completion of the surgical task, the second surgical computing system 704b may then autonomously return the control of the operating room to the existing surgical computing system 704a.
As illustrated, surgical information 762 from a plurality of surgical procedures 764 (e.g., a subset of surgical information from each procedure) may be collected. The surgical information 762 may be collected from the plurality of surgical procedures 764 by collecting data represented by the one or more information flows disclosed herein, for example.
To illustrate, an example instance of surgical information 766 may be generated from the example procedure 768 (e.g., a lung segmentectomy procedure as shown on a timeline 769). Surgical information 766 may be generated during the preoperative planning and may include patient record information. Surgical information 766 may be generated from the data sources (e.g., data sources 726) during the course of the surgical procedure, including data generated each time medical personnel utilize a modular device that is paired with the surgical computing system (e.g., surgical computing system 704). The surgical computing system may receive this data from the paired modular devices and other data sources. The surgical computing system itself may generate surgical information as part of its operation during the procedure. For example, the surgical computing system may record information relating to configuration and control operations. The surgical computing system may record information related to situational awareness activities. For example, the surgical computing system may record the recommendations, prompts, and/or other information provided to the healthcare team (e.g., provided via a display screen) that may be pertinent for the next procedural step. For example, the surgical computing system may record configuration and control changes (e.g., the adjusting of modular devices based on the context). Such configuration and control changes may include activating monitors, adjusting the field of view (FOV) of a medical imaging device, changing the energy level of an ultrasonic surgical instrument or RF electrosurgical instrument, or the like.
At 770, the hospital staff members retrieve the patient's EMR from the hospital's EMR database. Based on select patient data in the EMR, the surgical computing system determines that the procedure to be performed is a thoracic procedure.
At 771, the staff members scan the incoming medical supplies for the procedure. The surgical computing system may cross-reference the scanned supplies with a list of supplies that are utilized in various types of procedures. The surgical computing system may confirm that the mix of supplies corresponds to a thoracic procedure. Further, the surgical computing system may determine that the procedure is not a wedge procedure (because the incoming supplies either lack certain supplies that are necessary for a thoracic wedge procedure or do not otherwise correspond to a thoracic wedge procedure). The medical personnel may also scan the patient band via a scanner that is communicably connected to the surgical computing system. The surgical computing system may confirm the patient's identity based on the scanned data.
At 774, the medical staff turns on the auxiliary equipment. The auxiliary equipment being utilized can vary according to the type of surgical procedure and the techniques to be used by the surgeon. In this example, the auxiliary equipment may include a smoke evacuator, an insufflator, and a medical imaging device. When activated, the auxiliary equipment may pair with the surgical computing system. The surgical computing system may derive contextual information about the surgical procedure based on the types of paired devices. In this example, the surgical computing system determines that the surgical procedure is a VATS procedure based on this particular combination of paired devices. The contextual information about the surgical procedure may be confirmed by the surgical computing system via information from the patient's EMR.
The surgical computing system may retrieve the steps of the procedure to be performed. For example, the steps may be associated with a procedural plan (e.g., a procedural plan specific to this patient's surgery, a procedural plan associated with a particular surgeon, a procedural plan template for the procedure generally, or the like).
At 775, the staff members attach the EKG electrodes and other patient monitoring devices to the patient. The EKG electrodes and other patient monitoring devices pair with the surgical computing system. The surgical computing system may receive data from the patient monitoring devices.
At 776, the medical personnel induce anesthesia in the patient. The surgical computing system may record information related to this procedural step such as data from the modular devices and/or patient monitoring devices, including EKG data, blood pressure data, ventilator data, or combinations thereof, for example.
At 777, the patient's lung subject to operation is collapsed (ventilation may be switched to the contralateral lung). The surgical computing system may determine that this procedural step has commenced and may collect surgical information accordingly, including, for example, ventilator data, one or more timestamps, and the like.
At 778, the medical imaging device (e.g., a scope) is inserted and video from the medical imaging device is initiated. The surgical computing system may receive the medical imaging device data (i.e., video or image data) through its connection to the medical imaging device. The data from the medical imaging device may include imaging data and/or imaging metadata, such as the angle at which the medical imaging device is oriented with respect to the visualization of the patient's anatomy, the number of medical imaging devices presently active, and the like. The surgical computing system may record positioning information of the medical imaging device. For example, one technique for performing a VATS lobectomy places the camera in the lower anterior corner of the patient's chest cavity above the diaphragm. Another technique for performing a VATS segmentectomy places the camera in an anterior intercostal position relative to the segmental fissure.
Using pattern recognition or machine learning techniques, for example, the surgical computing system may be trained to recognize the positioning of the medical imaging device according to the visualization of the patient's anatomy. For example, one technique for performing a VATS lobectomy utilizes a single medical imaging device. Another technique for performing a VATS segmentectomy uses multiple cameras. Yet another technique for performing a VATS segmentectomy uses an infrared light source (which may be communicably coupled to the surgical computing system as part of the visualization system).
At 779, the surgical team begins the dissection step of the procedure. The surgical computing system may collect data from the RF or ultrasonic generator indicating that an energy instrument is being fired. The surgical computing system may cross-reference the received data with the retrieved steps of the surgical procedure to determine that an energy instrument being fired at this point in the process (i.e., after the completion of the previously discussed steps of the procedure) corresponds to the dissection step. In an example, the energy instrument may be an energy tool mounted to a robotic arm of a robotic surgical system.
At 780, the surgical team proceeds to the ligation step of the procedure. The surgical computing system may collect surgical information 766 with regard to the surgeon ligating arteries and veins based on receiving data from the surgical stapling and cutting instrument indicating that such instrument is being fired. Next, the segmentectomy portion of the procedure is performed. The surgical computing system may collect information relating to the surgeon transecting the parenchyma. For example, the surgical computing system may receive surgical information 766 from the surgical stapling and cutting instrument, including data regarding its cartridge, settings, firing details, and the like.
At 782, the node dissection step is then performed. The surgical computing system may collect surgical information 766 with regard to the surgical team dissecting the node and performing a leak test. For example, the surgical computing system may collect data received from the generator indicating that an RF or ultrasonic instrument is being fired and including the electrical and status information associated with the firing. Surgeons regularly switch back and forth between surgical stapling/cutting instruments and surgical energy (i.e., RF or ultrasonic) instruments depending upon the particular step in the procedure. The surgical computing system may collect surgical information 766 in view of the particular sequence in which the stapling/cutting instruments and surgical energy instruments are used. In an example, robotic tools may be used for one or more steps in a surgical procedure. The surgeon may alternate between robotic tools and handheld surgical instruments and/or can use the devices concurrently, for example.
Next, the incisions are closed up and the post-operative portion of the procedure begins. At 784, the patient's anesthesia is reversed. The surgical computing system may collect surgical information regarding the patient emerging from the anesthesia based on ventilator data (e.g., the patient's breathing rate begins increasing), for example.
At 785, the medical personnel remove the various patient monitoring devices from the patient. The surgical computing system may collect information regarding the conclusion of the procedure. For example, the surgical computing system may collect information related to the loss of EKG, BP, and other data from the patient monitoring devices.
The surgical information 762 (including the surgical information 766) may be structured and/or labeled. The surgical computing system may provide such structure and/or labeling inherently in the data collection. For example, surgical information 762 may be labeled according to a particular characteristic, a desired result (e.g., efficiency, patient outcome, cost, and/or a combination of the same, or the like), a certain surgical technique, an aspect of instrument use (e.g., selection, timing, and activation of a surgical instrument, the instrument's settings, the nature of the instrument's use, etc.), the identity of the health care professionals involved, a specific patient characteristic, or the like, each of which may be present in the data collection.
Surgical information (e.g., surgical information 762 collected across procedures 764) may be used in connection with one or more artificial intelligence (AI) systems. AI may be used to perform computer cognitive tasks. For example, AI may be used to perform complex tasks based on observations of data. AI may be used to enable computing systems to perform cognitive tasks and solve complex tasks. AI may include using machine learning and machine learning techniques. ML techniques may include performing complex tasks, for example, without being programmed (e.g., explicitly programmed). For example, a ML technique may improve over time based on completing tasks with different inputs. A ML process may train itself, for example using input data and/or a learning dataset.
Machine learning (ML) techniques may be employed, for example, in the medical field. For example, ML may be used on a set of data (e.g., a set of surgical data) to produce an output (e.g., reduced surgical data, processed surgical data). In examples, the output of a ML process may include identified trends or relationships of the data that were input for processing. The outputs may include verifying results and/or conclusions associated with the input data. In examples, an input to a ML process may include medical data, such as surgical images and patient scans. The ML process may output a determined medical condition based on the input surgical images and patient scans. The ML process may be used to diagnose medical conditions, for example, based on the surgical scans.
ML processes may improve themselves, for example, using the historic data that trained the ML processes and/or the input data. Therefore, ML processes may be constantly improving with added inputs and processing. The ML processes may update based on input data. For example, over time, a ML process that produces medical conclusions based on medical data may improve and become more accurate and consistent in medical diagnoses.
ML processes may be used to solve different complex tasks (e.g., medical tasks). For example, ML processes may be used for data reduction, data preparation, data processing, trend identification, conclusion determination, medical diagnoses, and/or the like. For example, ML processes may take in surgical data as an input and process the data to be used for medical analysis. The processed data may be used to determine a medical diagnosis. In the end, the ML processes may take raw surgical data and generate useful medical information (e.g., medical trends and/or diagnoses) associated with the raw surgical data.
ML processes may be combined to perform different discrete tasks on an input data set. For example, a ML process may include testing different combinations of ML sub-processes performing discrete tasks to determine which combination of ML sub-processes performs the best (e.g., competitive usage of different process/algorithm types and training to determine the best combination for a dataset). For example, the ML process may include sub-process (e.g., algorithm) control and monitoring to improve and/or verify results and/or conclusions (e.g., error bounding).
A ML process may be initialized and/or setup to perform tasks. For example, the ML process may be initialized based on initialization configuration information. The initialized ML process may be untrained and/or a base ML process for performing the task. The untrained ML process may be inaccurate in performing the designated tasks. As the ML process becomes trained, the tasks may be performed more accurately.
The initialization configuration information for a ML process may include initial settings and/or parameters. For example, the initial settings and/or parameters may include defined ranges for the ML process to employ. The ranges may include ranges for manual inputs and/or received data. The ranges may include default ranges and/or randomized ranges for variables not received, for example, which may be used to complete a dataset for processing. For example, if a dataset is missing a data range, the default data range may be used as a substitute to perform the ML process.
The initialization configuration information for a ML process may include data storage locations. For example, locations of data storages and/or databases associated with data interactions may be included. The databases associated with data interactions may be used to identify trends in datasets. The databases associated with data interactions may include mappings of data to a medical condition. For example, a database associated with data interactions may include a mapping for heart rate data to medical conditions, such as, for example, arrhythmia and/or the like.
The initialization configuration information may include parameters associated with defining the system. The initialization configuration information may include instructions (e.g., methods) associated with displaying, confirming, and/or providing information to a user. For example, the initialization configuration may include instructions to the ML process to output the data in a specific format for visualization for a user.
ML techniques may be used, for example, to perform data reduction. ML techniques for data reductions may include using multiple different data reduction techniques. For example, ML techniques for data reductions may include using one or more of the following: CUR matrix decomposition; a decision tree; expectation-maximization (EM) processes (e.g., algorithms); explicit semantic analysis (ESA); exponential smoothing forecast; generalized linear model; k-means clustering (e.g., nearest neighbor); Naive Bayes; neural network processes; a multivariate analysis; an o-cluster; a singular value decomposition; Q-learning; a temporal difference (TD); deep adversarial networks; support vector machines (SVM); linear regression; reducing dimensionality; linear discriminant analysis (LDA); adaptive boosting (e.g., AdaBoost); gradient descent (e.g., Stochastic gradient descent (SGD)); outlier detection; and/or the like.
ML techniques may be used to perform data reduction, for example, using CUR matrix decompositions. A CUR matrix decomposition may include using a matrix decomposition model (e.g., process, algorithm), such as a low-rank matrix decomposition model. For example, CUR matrix decomposition may include a low-rank matrix decomposition process that is expressed (e.g., explicitly expressed) in a number (e.g., small number) of columns and/or rows of a data matrix (e.g., the CUR matrix decomposition may be interpretable). CUR matrix decomposition may include selecting columns and/or rows associated with statistical leverage and/or a large influence in the data matrix. Using CUR matrix decomposition may enable identification of attributes and/or rows in the data matrix. The simplification of a larger dataset (e.g., using CUR matrix decomposition) may enable review and interaction (e.g., with the data) by a user. CUR matrix decomposition may facilitate regression, classification, clustering, and/or the like.
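For illustration, the following is a minimal sketch of one common way to realize a CUR-style reduction: selecting the columns with the largest statistical leverage scores. It assumes NumPy and a synthetic data matrix; the matrix, rank, and variable names are illustrative assumptions only, not part of the systems described herein.

```python
# Illustrative sketch (not from the source): leverage-score column
# selection for a CUR-style decomposition, assuming NumPy and a
# synthetic data matrix.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(100, 20))   # synthetic data matrix
k = 5                            # illustrative target rank

# Leverage scores come from the top-k right singular vectors.
_, _, Vt = np.linalg.svd(A, full_matrices=False)
leverage = np.sum(Vt[:k, :] ** 2, axis=0) / k   # one score per column

# Keep the columns with the largest statistical leverage.
cols = np.argsort(leverage)[-k:]
C = A[:, cols]                   # the "C" factor of a CUR decomposition
print("selected columns:", sorted(cols.tolist()))
```

Selecting high-leverage rows proceeds symmetrically on the left singular vectors, and because the retained factors are actual columns and rows of the data matrix, the result stays interpretable.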
ML techniques may be used to perform data reduction, for example, using decision trees (e.g., decision tree model). Decision trees may be used, for example, as a framework to quantify values of outcomes and/or the probabilities of outcomes occurring. Decision trees may be used, for example, to calculate the value of uncertain outcome nodes (e.g., in a decision tree). Decision trees may be used, for example, to calculate the value of decision nodes (e.g., in a decision tree). A decision tree may be a model to enable classification and/or regression (e.g., adaptable to classification and/or regression problems). Decision trees may be used to analyze numerical (e.g., continuous values) and/or categorical data. Decision trees may be more successful with large data sets and/or may be more efficient (e.g., as compared to other data reduction techniques).
Decision trees may be used in combination with other decision trees. For example, a random forest may refer to a collection of decision trees (e.g., ensemble of decision trees). A random forest may include a collection of decision trees whose results may be aggregated into a result. A random forest may be a supervised learning algorithm. A random forest may be trained, for example, using a bagging training process.
A random decision forest (e.g., random forest) may add randomness (e.g., additional randomness) to a model, for example, while growing the trees. A random forest may be used to search for a best feature among a random subset of features, for example, rather than searching for the most important feature (e.g., while splitting a node). Searching for the best feature among a random subset of features may result in a wide diversity that may result in a better (e.g., more efficient and/or accurate) model.
A random forest may include using parallel ensembling. Parallel ensembling may include fitting (e.g., several) decision tree classifiers in parallel, for example, on different data set sub-samples. Parallel ensembling may include using majority voting or averages for outcomes or final results. Parallel ensembling may be used to minimize overfitting and/or increase prediction accuracy and control. A random forest with multiple decision trees may (e.g., generally) be more accurate than a single decision tree-based model. A series of decision trees with controlled variation may be built, for example, by combining bootstrap aggregation (e.g., bagging) and random feature selection.
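As a concrete sketch of parallel ensembling, the following fits a random forest on synthetic data, assuming scikit-learn; the dataset and hyperparameters are illustrative assumptions.

```python
# Illustrative sketch: a random forest as a parallel ensemble of
# decision trees, assuming scikit-learn and synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each tree is trained on a bootstrap sub-sample with a random feature
# subset; class predictions are aggregated by majority vote.
forest = RandomForestClassifier(n_estimators=100, max_features="sqrt",
                                random_state=0)
forest.fit(X_train, y_train)
print("test accuracy:", forest.score(X_test, y_test))
```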
ML techniques may be used to perform data reduction, for example, using an expectation maximization (EM) model (e.g., process, algorithm). For example, an EM model may be used to find a likelihood (e.g., local maximum likelihood) parameter of a statistical model. An EM model may be used for cases where equations may not be solved directly. An EM model may consider latent variables and/or unknown parameters and known data observations. For example, the EM model may determine that missing values exist in a data set. The EM model may receive configuration information indicating that it should assume the existence of missing (e.g., unobserved) data points in a data set.
An EM model may use component clustering. For example, component clustering may enable the grouping of EM components into high-level clusters. Components may be treated as clusters, for example, if component clustering is disabled (e.g., in an EM model).
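A minimal sketch of EM in practice, assuming scikit-learn: a Gaussian mixture model whose fit() routine alternates expectation steps (assigning responsibilities for the latent component memberships) and maximization steps (updating the component parameters). The two-component synthetic data is an illustrative assumption.

```python
# Illustrative sketch: expectation-maximization via a Gaussian mixture,
# assuming scikit-learn and synthetic data with two latent components.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, size=(200, 2)),
               rng.normal(5.0, 1.0, size=(200, 2))])

# fit() alternates E-steps (latent-variable responsibilities) and
# M-steps (maximum-likelihood parameter updates) until convergence.
gm = GaussianMixture(n_components=2, random_state=0).fit(X)
print("means:", gm.means_.round(2))
print("converged:", gm.converged_)
```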
ML techniques may be used to perform data reduction, for example, using explicit semantic analysis (ESA). ESA may be used at a level of semantics (e.g., meaning) rather than on vocabulary (e.g., surface form vocabulary) of words or a document. ESA may focus on the meaning of a set of text, for example, as a combination of the concepts found in the text. ESA may be used in document classification. ESA may be used for a semantic relatedness calculation (e.g., how similar in meaning words or pieces of text are to each other). ESA may be used for information retrieval.
ESA may be used in document classification, for example. Document classification may include tagging documents for managing and sorting. Tagging a document (e.g., with a keyword) may allow for easier searching. Keyword tagging (e.g., only using keyword tagging) may limit the accuracy and/or efficiency of document classification. For example, using keyword tagging may uncover (e.g., only uncover) documents with the keywords and not documents with words with similar meaning to the keywords. Classifying text semantically (e.g., using ESA) may improve a model's understanding of text. Classifying text semantically may include representing documents as concepts and lowering dependence on specific keywords.
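ESA proper scores relatedness in a concept space built from a knowledge base (e.g., encyclopedia articles). As a minimal stand-in for the semantic relatedness calculation, the following sketch compares short documents with TF-IDF vectors and cosine similarity, assuming scikit-learn; it illustrates the relatedness-scoring idea rather than ESA itself, and the documents are illustrative assumptions.

```python
# Illustrative stand-in (not ESA proper): pairwise document relatedness
# via TF-IDF vectors and cosine similarity, assuming scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = ["the stapler fired three times during dissection",
        "energy instrument activation during the dissection step",
        "the patient monitoring devices were removed"]

vectors = TfidfVectorizer().fit_transform(docs)
similarity = cosine_similarity(vectors)   # pairwise relatedness scores
print(similarity.round(2))
```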
ML techniques may be used to perform data reduction, for example, using an exponential smoothing forecast model. Exponential smoothing may be used to smooth time series data, for example, using an exponential window function. For example, in a moving average, past observations may be weighted equally, but exponential functions may be used to assign exponentially decreasing weights over time.
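A minimal sketch of simple exponential smoothing, assuming NumPy; the smoothing factor and series are illustrative assumptions. Each smoothed value is s_t = αx_t + (1 − α)s_{t−1}, so past observations receive exponentially decreasing weights.

```python
# Illustrative sketch: simple exponential smoothing of a time series,
# assuming NumPy and an illustrative smoothing factor alpha.
import numpy as np

def exponential_smoothing(x, alpha=0.3):
    s = np.empty_like(x, dtype=float)
    s[0] = x[0]
    for t in range(1, len(x)):
        # Recent observations get exponentially larger weights.
        s[t] = alpha * x[t] + (1 - alpha) * s[t - 1]
    return s

series = np.array([10.0, 12.0, 11.0, 15.0, 14.0, 18.0])
print(exponential_smoothing(series).round(2))
```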
ML techniques may be used to perform data reduction, for example, using linear regression. Linear regression may be used to predict continuous outcomes. For example, linear regression may be used to predict the value of a variable (e.g., dependent variable) based on the value of a different variable (e.g., independent variable). Linear regression may apply a linear approach for modeling a relationship between a scalar response and one or more explanatory variables (e.g., dependent and/or independent variables). Simple linear regression may refer to linear regression use cases associated with one explanatory variable. Multiple linear regression may refer to linear regression use cases associated with more than one explanatory variable. Linear regression may model relationships, for example, using linear predictor functions. The linear predictor functions may estimate unknown model parameters from a data set.
For example, linear regression may be used to identify patterns within a training dataset. The identified patterns may relate to values and/or label groupings. The model may learn a relationship between the (e.g., each) label and the expected outcomes. After training, the model may be used on raw data outside the training data set (e.g., data without a mapped and/or known output). The trained model using linear regression may determine calculated predictions associated with the raw data, for example, such as identifying seasonal changes in sales data.
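For example, the following minimal sketch fits a linear predictor to synthetic data and applies it to a new input, assuming scikit-learn; the slope, intercept, and noise level are illustrative assumptions.

```python
# Illustrative sketch: fitting and applying a linear regression model,
# assuming scikit-learn and synthetic data with a known linear trend.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))                # explanatory variable
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 0.5, 100)    # dependent variable

model = LinearRegression().fit(X, y)   # estimate the linear predictor
print("slope:", model.coef_[0].round(2),
      "intercept:", model.intercept_.round(2))
print("prediction at x=4:", model.predict([[4.0]]).round(2))
```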
ML techniques may be used to perform data reduction, for example, using a generalized linear model (GLM). A GLM may be used as a flexible generalization of linear regression. A GLM may generalize linear regression, for example, by enabling the linear model to be related to the response variable through a link function.
ML techniques may be used to perform data reduction, for example, using k-means clustering (e.g., a nearest neighbor model). K-means clustering may be used for vector quantization. K-means clustering may be used in signal processing. K-means clustering may be aimed at partitioning n observations into k clusters, for example, where each observation is classified into a cluster with the closest mean.
K-means clustering may include K-Nearest Neighbors (KNN) learning. KNN may be an instance-based learning method (e.g., non-generalized learning, lazy learning). KNN may refrain from constructing a general internal model. KNN may include storing instances corresponding to training data in an n-dimensional space. KNN may use data and classify data points, for example, based on similarity measures (e.g., a Euclidean distance function). Classification may be computed, for example, based on a majority vote of the k nearest neighbors of a (e.g., each) point. KNN may be robust for noisy training data. Accuracy may depend on data quality (e.g., for KNN). KNN may include choosing a number of neighbors to be considered (e.g., an optimal number of neighbors to be considered). KNN may be used for classification and/or regression.
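A minimal sketch of both techniques, assuming scikit-learn and synthetic blob data: k-means partitions the observations into k clusters by closest mean, and KNN classifies points by majority vote of their nearest neighbors. The cluster count and neighbor count are illustrative assumptions.

```python
# Illustrative sketch: k-means partitioning and KNN classification,
# assuming scikit-learn and a synthetic dataset.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.neighbors import KNeighborsClassifier

X, y = make_blobs(n_samples=300, centers=3, random_state=0)

# k-means: partition observations into k clusters by closest mean.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("cluster centers:", kmeans.cluster_centers_.round(2))

# KNN: classify each point by majority vote of its k nearest neighbors.
knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)
print("KNN accuracy on training data:", knn.score(X, y))
```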
ML techniques may be used to perform data reduction, for example, using a Naive Bayes model (e.g., process). A Naive Bayes model may be used, for example, to construct classifiers. A Naive Bayes model may be used to assign class labels to problem instances (e.g., represented as vectors of feature values). The class labels may be drawn from a set (e.g., a finite set). Different processes (e.g., algorithms) may be used to train the classifiers. A family of processes (e.g., a family of algorithms) may be used. The family of processes may be based on a common principle: Naive Bayes classifiers (e.g., all Naive Bayes classifiers) assume that the value of a feature is independent of the value of any other feature, given the class variable.
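A minimal sketch of such a classifier, assuming scikit-learn: Gaussian Naive Bayes assigns class labels to feature vectors under the conditional-independence assumption described above. The synthetic dataset is an illustrative assumption.

```python
# Illustrative sketch: a Gaussian Naive Bayes classifier, assuming
# scikit-learn; features are treated as independent given the class.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=400, n_features=6, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = GaussianNB().fit(X_train, y_train)   # assigns class labels to vectors
print("test accuracy:", clf.score(X_test, y_test))
```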
ML techniques may be used to perform data reduction, for example, using a neural network. Neural networks may learn (e.g., be trained) by processing examples, for example, to perform other tasks (e.g., similar tasks). A processing example may include an input and a result (e.g., input mapped to a result). The neural network may learn by forming probability-weighted associations between the input and the result. The probability-weighted associations may be stored within a data structure of the neural network. The training of the neural network from a given example may be conducted by determining the difference between a processed output of the network (e.g., prediction) and a target output. The difference may be the error. The neural network may adjust the weighted associations (e.g., stored weighted associations), for example, according to a learning rule and the error value.
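A minimal sketch of such training, assuming scikit-learn: fit() repeatedly compares the network's predictions against the target outputs and adjusts the weighted associations to reduce the error. The layer sizes and synthetic dataset are illustrative assumptions.

```python
# Illustrative sketch: training a small neural network classifier,
# assuming scikit-learn; weights are adjusted from prediction error.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

net = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000,
                    random_state=0).fit(X_train, y_train)
print("test accuracy:", net.score(X_test, y_test))
```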
ML techniques may be used to perform data reduction, for example, using multivariate analysis. Multivariate analysis may include performing multivariate state estimation and/or non-negative matrix factorization.
ML techniques may be used to perform data reduction, for example, using support vector machines (SVMs). SVMs may be used in a multi-dimensional space (e.g., high-dimensional space, infinite-dimensional space). SVMs may be used to construct a hyper-plane (e.g., a set of hyper-planes). A hyper-plane that has the greatest distance (e.g., compared to the other constructed hyper-planes) from a nearest training data point in a class (e.g., any class) may achieve a strong separation (e.g., in general, the greater the margin, the lower the classifier's generalization error). SVMs may be effective in high-dimensional spaces. SVMs may behave differently, for example, based on different mathematical functions (e.g., the kernel, kernel functions). For example, kernel functions may include one or more of the following: linear, polynomial, radial basis function (RBF), sigmoid, etc. A kernel function may be used within an SVM classifier. SVMs may be limited in use cases, for example, where a data set contains high amounts of noise (e.g., overlapping target classes).
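A minimal sketch comparing the kernel functions named above, assuming scikit-learn; the synthetic dataset is an illustrative assumption.

```python
# Illustrative sketch: SVM classifiers with different kernel functions,
# assuming scikit-learn and a synthetic dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The choice of kernel changes how the separating hyper-plane behaves.
for kernel in ("linear", "poly", "rbf", "sigmoid"):
    clf = SVC(kernel=kernel).fit(X_train, y_train)
    print(kernel, "test accuracy:", round(clf.score(X_test, y_test), 3))
```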
ML techniques may be used to perform data reduction, for example, such as reducing dimensionality. Reducing dimensionality of a sample of data (e.g., unlabeled data) may help refine groups and/or clusters. Reducing a number of variables in a model may simplify data trends. Simplified data trends may enable more efficient processing. Reducing dimensionality may be used, for example, if many (e.g., too many) dimensions are clouding (e.g., negatively affecting) insights, trends, patterns, conclusions, and/or the like.
Reducing dimensionality may include using principal component analysis (PCA). PCA may be used to establish principal components that govern a relationship between data points. PCA may focus on simplifying (e.g., only simplifying) the principal components. Reducing dimensionality (e.g., PCA) may be used to maintain the variety of data grouping in a data set, but streamline the number of separate groups.
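A minimal sketch of PCA-based dimensionality reduction, assuming scikit-learn; the synthetic data and component count are illustrative assumptions.

```python
# Illustrative sketch: reducing dimensionality with principal component
# analysis, assuming scikit-learn and synthetic high-dimensional data.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA

X, _ = make_classification(n_samples=300, n_features=20, random_state=0)

# Project onto the components that explain the most variance.
pca = PCA(n_components=3).fit(X)
X_reduced = pca.transform(X)
print("reduced shape:", X_reduced.shape)
print("explained variance ratio:", pca.explained_variance_ratio_.round(3))
```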
ML techniques may be used to perform data reduction, for example, using linear discriminant analysis (LDA). LDA may refer to a linear decision boundary classifier, for example, that may be created by fitting class conditional densities to data (e.g., and applying Bayes' rule). LDA may include a generalization of Fisher's linear discriminant (e.g., projecting a given dataset into lower-dimensional space, for example, to reduce dimensionality and minimize complexity of a model and reduce computational costs). An LDA model (e.g., a standard LDA model) may suit a class with a Gaussian density. The LDA model may assume that the classes (e.g., all classes) share a covariance matrix. LDA may be similar to analysis of variance (ANOVA) processes and/or regression analysis. For example, LDA may be used to express a dependent variable as a linear combination of other features and/or measurements.
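A minimal sketch of LDA used both as a linear classifier and as a projection into a lower-dimensional space, assuming scikit-learn; the synthetic three-class dataset is an illustrative assumption.

```python
# Illustrative sketch: linear discriminant analysis for classification
# and dimensionality reduction, assuming scikit-learn.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = make_classification(n_samples=300, n_features=10, n_classes=3,
                           n_informative=5, random_state=0)

lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)
print("training accuracy:", lda.score(X, y))
# The projection has at most (number of classes - 1) dimensions.
print("projected shape:", lda.transform(X).shape)
```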
ML techniques may be used to perform data reduction, for example, such as adaptive boosting (e.g., AdaBoost). Adaptive boosting may include creating a classifier (e.g., a powerful classifier). Adaptive boosting may include creating a classifier by combining multiple classifiers (e.g., poorly performing classifiers), for example, to obtain a resulting classifier with high accuracy. AdaBoost may be an adaptive classifier that improves the efficiency of a classifier. AdaBoost may trigger overfitting. AdaBoost may be used (e.g., best used) to boost the performance of decision trees, base estimator(s), binary classification problems, and/or the like. AdaBoost may be sensitive to noisy data and/or outliers.
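A minimal sketch of boosting weak learners into a stronger classifier, assuming scikit-learn; the synthetic dataset and estimator count are illustrative assumptions.

```python
# Illustrative sketch: adaptive boosting, assuming scikit-learn; the
# default weak learner is a depth-1 decision tree (a "stump"), and
# successive learners focus on previously misclassified examples.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

booster = AdaBoostClassifier(n_estimators=100, random_state=0)
booster.fit(X_train, y_train)
print("test accuracy:", booster.score(X_test, y_test))
```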
ML techniques may be used to perform data reduction, for example, such as stochastic gradient descent (SGD). SGD may include an iterative process used to optimize a function (e.g., an objective function). SGD may be used to optimize an objective function, for example, with certain smoothness properties. Stochastic may refer to random probability. SGD may be used to reduce computational burden, for example, in high-dimensional optimization problems. SGD may be used to enable faster iterations, for example, in exchange for a lower convergence rate. A gradient may refer to the slope of a function, for example, which measures a variable's degree of change in response to another variable's changes. Gradient descent may refer to iteratively stepping along the partial derivatives of the objective function with respect to its input parameters. For example, with learning rate α and J_i denoting the cost of the i-th training example, the update w_j := w_j − α ∂J_i/∂w_j represents the stochastic gradient descent update of the j-th weight. In large-scale ML and sparse ML, SGD may be applied to problems in text classification and/or natural language processing (NLP). SGD may be sensitive to feature scaling (e.g., it may need a range of hyperparameters, for example, such as a regularization parameter and a number of iterations).
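The update above can be written out directly. The following minimal sketch applies the per-example SGD update to a small linear regression problem, assuming NumPy; the learning rate, iteration count, and synthetic data are illustrative assumptions.

```python
# Illustrative sketch: the stochastic gradient descent weight update
# w := w - alpha * grad(J_i), assuming NumPy; one randomly chosen
# training example updates the weights per iteration.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(0, 0.1, 200)

w = np.zeros(3)
alpha = 0.1                            # learning rate
for _ in range(2000):
    i = rng.integers(len(X))           # a single training example
    grad = (X[i] @ w - y[i]) * X[i]    # gradient of J_i at the current w
    w -= alpha * grad                  # w_j := w_j - alpha * dJ_i/dw_j
print("estimated weights:", w.round(2))
```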
ML techniques may be used to perform data reduction, for example, such as using outlier detection. An outlier may be a data point that contains information (e.g., useful information) on an abnormal behavior of a system described by the data. Outlier detection processes may include univariate processes and multivariate processes.
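A minimal sketch of a univariate outlier detection process, assuming NumPy; the data and the two-standard-deviation threshold are illustrative assumptions.

```python
# Illustrative sketch: univariate outlier detection with a z-score
# rule, assuming NumPy and a threshold of two standard deviations.
import numpy as np

data = np.array([10.1, 9.8, 10.3, 10.0, 9.9, 25.0, 10.2, 9.7])
z = (data - data.mean()) / data.std()
outliers = data[np.abs(z) > 2.0]   # flag points far from the mean
print("outliers:", outliers)
```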
ML processes may be trained, for example, using one or more training methods. For example, ML processes may be trained using one or more of the following training techniques: supervised learning; unsupervised learning; semi-supervised learning; reinforcement learning; and/or the like.
Machine learning may be supervised (e.g., supervised learning). A supervised learning algorithm may create a mathematical model from a training dataset (e.g., labeled training data).
Machine learning may be unsupervised (e.g., unsupervised learning). An unsupervised learning algorithm may identify structure (e.g., groupings or clusters) in a dataset without labeled outcomes.
Machine learning may be semi-supervised (e.g., semi-supervised learning). A semi-supervised learning algorithm may be used in scenarios where a cost to label data is high (e.g., because it requires skilled experts to label the data) and there are limited labels for the data. Semi-supervised learning models may exploit an idea that although group memberships of unlabeled data are unknown, the data still carries important information about the group parameters.
Machine learning may include reinforcement learning, which may be an area of machine learning that may be concerned with how software agents may take actions in an environment to maximize a notion of cumulative reward. Reinforcement learning algorithms may not assume knowledge of an exact mathematical model of the environment (e.g., represented by Markov decision process (MDP)) and may be used when exact models may not be feasible. Reinforcement learning algorithms may be used in autonomous vehicles or in learning to play a game against a human opponent. Example algorithms may include Q-Learning, Temporal Difference (TD), Deep Adversarial Networks, and/or the like.
Reinforcement learning may include an algorithm (e.g., agent) continuously learning from the environment in an iterative manner. In the training process, the agent may learn from experiences of the environment until the agent explores the full range of states (e.g., possible states). Reinforcement learning may be defined by a type of problem. Solutions of reinforcement learning may be classed as reinforcement learning algorithms. In a problem, an agent may decide an action (e.g., the best action) to select based on the agent's current state. If a step is repeated, the problem may be referred to as an MDP.
For example, reinforcement learning may include operational steps. An operation step in reinforcement learning may include the agent observing an input state. An operation step in reinforcement learning may include using a decision making function to make the agent perform an action. An operation step may include (e.g., after an action is performed) the agent receiving a reward and/or reinforcement from the environment. An operation step in reinforcement learning may include storing the state-action pair information about the reward.
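As a minimal sketch of these operational steps, the following tabular Q-learning loop runs on a toy five-state chain, assuming NumPy. The environment, rewards, and hyperparameters are illustrative assumptions; because Q-learning is off-policy, even a uniformly random behavior policy lets the stored state-action values converge toward the optimal ones.

```python
# Illustrative sketch: tabular Q-learning on a toy five-state chain,
# assuming NumPy. The agent observes a state, takes an action, receives
# a reward, and stores what it learned in the Q-table.
import numpy as np

n_states, n_actions = 5, 2             # actions: 0 = left, 1 = right
Q = np.zeros((n_states, n_actions))
alpha, gamma = 0.1, 0.9
rng = np.random.default_rng(0)

for _ in range(500):                   # episodes
    s = 0
    for _ in range(50):                # cap the episode length
        a = int(rng.integers(n_actions))   # random behavior policy
        s_next = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
        r = 1.0 if s_next == n_states - 1 else 0.0
        # Update toward reward plus discounted best future estimate.
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        if s_next == n_states - 1:     # rightmost state is terminal
            break
        s = s_next
print(Q.round(2))                      # "right" should dominate each row
```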
Machine learning may be a part of a technology platform called cognitive computing (CC), which may constitute various disciplines such as computer science and cognitive science. CC systems may be capable of learning at scale, reasoning with purpose, and interacting with humans naturally. By means of self-teaching algorithms that may use data mining, visual recognition, and/or natural language processing, a CC system may be capable of solving problems and optimizing human processes.
The output of machine learning's training process may be a model for predicting outcome(s) on a new dataset. For example, a linear regression learning algorithm may be a cost function that may minimize the prediction errors of a linear prediction function during the training process by adjusting the coefficients and constants of the linear prediction function. When a minimum is reached, the linear prediction function with adjusted coefficients may be deemed trained and may constitute the model the training process has produced. For example, a neural network (NN) algorithm (e.g., multilayer perceptrons (MLP)) for classification may include a hypothesis function represented by a network of layers of nodes that are assigned with biases and interconnected with weight connections. The hypothesis function may be a non-linear function (e.g., a highly non-linear function) that may include linear functions and logistic functions nested together with the outermost layer consisting of one or more logistic functions. The NN algorithm may include a cost function to minimize classification errors by adjusting the biases and weights through a process of feedforward propagation and backward propagation. When a global minimum is reached, the optimized hypothesis function with its layers of adjusted biases and weights may be deemed trained and may constitute the model the training process has produced.
Data collection may be performed for machine learning as a first stage of the machine learning lifecycle. Data collection may include steps such as identifying various data sources, collecting data from the data sources, integrating the data, and the like. For example, for training a machine learning model for predicting surgical complications and/or post-surgical recovery rates, data sources containing pre-surgical data, such as a patient's medical conditions and biomarker measurement data, may be identified. Such data sources may be a patient's electronic medical records (EMR), a computing system storing the patient's pre-surgical biomarker measurement data, and/or other like datastores. The data from such data sources may be retrieved and stored in a central location for further processing in the machine learning lifecycle. The data from such data sources may be linked (e.g., logically linked) and may be accessed as if they were centrally stored. Surgical data and/or post-surgical data may be similarly identified and collected. Further, the collected data may be integrated. In examples, a patient's pre-surgical medical record data, pre-surgical biomarker measurement data, pre-surgical data, surgical data, and/or post-surgical data may be combined into a record for the patient. The record for the patient may be an EMR.
Data preparation may be performed for machine learning as another stage of the machine learning lifecycle. Data preparation may include data preprocessing steps such as data formatting, data cleaning, and data sampling. For example, the collected data may not be in a data format suitable for training a model. Such data records may be converted to a flat file format for model training. Such data may be mapped to numeric values for model training. Identifying data may be removed before model training. For example, identifying data may be removed for privacy reasons. As another example, data may be removed because there may be more data available than may be used for model training. In such case, a subset of the available data may be randomly sampled and selected for model training and the remainder may be discarded.
Data preparation may include data transforming procedures (e.g., after preprocessing), such as scaling and aggregation. For example, the preprocessed data may include data values in a mixture of scales. These values may be scaled up or down, for example, to be between 0 and 1 for model training. For example, the preprocessed data may include data values that carry more meaning when aggregated.
Model training may be another aspect of the machine learning lifecycle. The model training process as described herein may be dependent on the machine learning algorithm used. A model may be deemed suitably trained after it has been trained, cross validated, and tested. Accordingly, the dataset from the data preparation stage (e.g., an input dataset) may be divided into a training dataset (e.g., 60% of the input dataset), a validation dataset (e.g., 20% of the input dataset), and a test dataset (e.g., 20% of the input dataset). After the model has been trained on the training dataset, the model may be run against the validation dataset to reduce overfitting. If the accuracy of the model decreases when run against the validation dataset while its accuracy on the training dataset is still increasing, this may indicate a problem of overfitting. The test dataset may be used to test the accuracy of the final model to determine whether it is ready for deployment or more training may be required.
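A minimal sketch of the 60/20/20 division described above, assuming scikit-learn; the dataset is synthetic, and the two-stage split proportions are chosen to reproduce the 60/20/20 example.

```python
# Illustrative sketch: a 60/20/20 train/validation/test split,
# assuming scikit-learn and a synthetic dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=0)

# First carve out 20% for the final test set...
X_rest, X_test, y_rest, y_test = train_test_split(X, y, test_size=0.2,
                                                  random_state=0)
# ...then split the remainder 75/25 to get 60/20 of the original.
X_train, X_val, y_train, y_val = train_test_split(X_rest, y_rest,
                                                  test_size=0.25,
                                                  random_state=0)
print(len(X_train), len(X_val), len(X_test))   # 600 200 200
```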
Model deployment may be another aspect of the machine learning lifecycle. The model may be deployed as a part of a standalone computer program. The model may be deployed as a part of a larger computing system. A model may be deployed with model performance parameter(s). Such performance parameters may monitor the model accuracy as it is used for predicting on a dataset in production. For example, such parameters may keep track of false positives and false negatives for a classification model. Such parameters may further store the false positives and false negatives for further processing to improve the model's accuracy.
Post-deployment model updates may be another aspect of the machine learning lifecycle. For example, a deployed model may be updated as false positives and/or false negatives are predicted on production data. In an example, for a deployed MLP model for classification, as false positives occur, the deployed MLP model may be updated to increase the probability cutoff for predicting a positive to reduce false positives. In an example, for a deployed MLP model for classification, as false negatives occur, the deployed MLP model may be updated to decrease the probability cutoff for predicting a positive to reduce false negatives. In an example, for a deployed MLP model for classification of surgical complications, as both false positives and false negatives occur, the deployed MLP model may be updated to decrease the probability cutoff for predicting a positive to reduce false negatives because it may be less critical to predict a false positive than a false negative.
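As an illustrative sketch only, the following hypothetical helper captures the cutoff-adjustment logic described above; the function name, step size, and counts are assumptions, not part of any deployed system described herein.

```python
# Illustrative, hypothetical sketch: nudging a classifier's probability
# cutoff based on observed false positives and false negatives.
def adjust_cutoff(cutoff, false_positives, false_negatives, step=0.01):
    """Raise the cutoff when false positives dominate; lower it when
    false negatives dominate (e.g., when missing a surgical
    complication is the costlier error)."""
    if false_positives > false_negatives:
        return min(1.0, cutoff + step)
    if false_negatives > false_positives:
        return max(0.0, cutoff - step)
    return cutoff

print(adjust_cutoff(0.5, false_positives=12, false_negatives=3))  # 0.51
print(adjust_cutoff(0.5, false_positives=2, false_negatives=9))   # 0.49
```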
For example, a deployed model may be updated as more live production data become available as training data. In such case, the deployed model may be further trained, validated, and tested with such additional live production data. In an example, the updated biases and weights of a further-trained MLP model may update the deployed MLP model's biases and weights. Those skilled in the art recognize that post-deployment model updates may not be a one-time occurrence and may occur as frequently as suitable for improving the deployed model's accuracy.
ML techniques may be used independently of each other or in combination. Different problems and/or datasets may benefit from using different ML techniques (e.g., combinations of ML techniques). Different training types for models may be better suited for a certain problem and/or dataset. An optimal algorithm (e.g., combination of ML techniques) and/or training type may be determined for a specific usage, problem, and/or dataset. For example, a process may be performed for one or more of the following: choose a data reduction type, choose a configuration for a model and/or algorithm, determine a location for the data reduction, choose an efficiency of the reduction and/or result, and/or the like.
For example, a ML technique and/or combination of ML techniques may be determined for a particular problem and/or use case. Multiple data reduction and/or data analysis processes may be performed to determine accuracy, efficiency, and/or compatibility associated with a dataset. For example, a first ML technique (e.g., first set of combined ML techniques) may be used on a dataset to perform data reduction and/or data analysis. The first ML technique may produce a first output. Similarly, a second ML technique (e.g., second set of combined ML techniques) may be used on the dataset (e.g., same dataset) to perform data reduction and/or data analysis. The second ML technique may produce a second output. The first output may be compared with the second output to determine which ML technique produced more desirable results (e.g., more efficient results, more accurate results). Multiple ML techniques may be compared with the same dataset to determine the optimal ML technique(s) to use on a future similar dataset and/or problem.
In examples, in a medical context, a surgeon or healthcare professional may give feedback to ML techniques and/or models used on a dataset. The surgeon may input feedback to weighted results of a ML model. The feedback may be used as an input by the model to determine a reduction method for future analyses.
In examples, a data analysis method (e.g., the ML techniques to be used in the data analysis method) may be determined based on the dataset itself. For example, the origin of the data may influence the type of data analysis method to be used on the dataset. Available system resources may be used to determine the data analysis method to be used on a given dataset. The data magnitude, for example, may be considered in determining a data analysis method. For example, whether datasets exterior to the local processing level are needed, or the magnitude of the operational response, may be considered (e.g., small device changes may be made with local data, while major device operation changes may require global compilation and verification).
Such ML techniques may be applied to surgical information (e.g., a combination of information flows of surgical information described herein).
Referring to
Other types of robotic systems may be readily adapted for use with the surgical system. Various examples of robotic systems and surgical tools that are suitable for use with the present disclosure are described in U.S. Patent Application Publication No. US 2019-0201137 A1 (U.S. patent application Ser. No. 16/209,407), titled METHOD OF ROBOTIC HUB COMMUNICATION, DETECTION, AND CONTROL, filed Dec. 4, 2018, the disclosure of which is herein incorporated by reference in its entirety.
In an example, cloud-based analytics may be deployed to analyze surgical information and/or perform various surgical tasks. Various examples of cloud-based analytics that are performed by the cloud, and are suitable for use with the present disclosure, are described in U.S. Patent Application Publication No. US 2019-0206569 A1 (U.S. patent application Ser. No. 16/209,403), titled METHOD OF CLOUD BASED DATA ANALYTICS FOR USE WITH THE HUB, filed Dec. 4, 2018, the disclosure of which is herein incorporated by reference in its entirety.
In various aspects, an imaging device may be used in the surgical system and may include at least one image sensor and one or more optical components. Suitable image sensors may include, but are not limited to, Charge-Coupled Device (CCD) sensors and Complementary Metal-Oxide Semiconductor (CMOS) sensors.
The optical components of the imaging device may include one or more illumination sources and/or one or more lenses. The one or more illumination sources may be directed to illuminate portions of the surgical field. The one or more image sensors may receive light reflected or refracted from the surgical field, including light reflected or refracted from tissue and/or surgical instruments.
The one or more illumination sources may be configured to radiate electromagnetic energy in the visible spectrum as well as the invisible spectrum. The visible spectrum, sometimes referred to as the optical spectrum or luminous spectrum, is that portion of the electromagnetic spectrum that is visible to (e.g., can be detected by) the human eye and may be referred to as visible light or simply light. A typical human eye will respond to wavelengths in air that are from about 380 nm to about 750 nm.
The invisible spectrum (e.g., the non-luminous spectrum) is that portion of the electromagnetic spectrum that lies below and above the visible spectrum (i.e., wavelengths below about 380 nm and above about 750 nm). The invisible spectrum is not detectable by the human eye. Wavelengths greater than about 750 nm are longer than the red visible spectrum, and they become invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths less than about 380 nm are shorter than the violet spectrum, and they become invisible ultraviolet, x-ray, and gamma ray electromagnetic radiation.
In various aspects, the imaging device may be configured for use in a minimally invasive procedure. Examples of imaging devices suitable for use with the present disclosure include, but are not limited to, an arthroscope, angioscope, bronchoscope, choledochoscope, colonoscope, cystoscope, duodenoscope, enteroscope, esophagogastro-duodenoscope (gastroscope), endoscope, laryngoscope, nasopharyngoscope, nephroscope, sigmoidoscope, thoracoscope, and ureteroscope.
The imaging device may employ multi-spectrum monitoring to discriminate topography and underlying structures. A multi-spectral image is one that captures image data within specific wavelength ranges across the electromagnetic spectrum. The wavelengths may be separated by filters or by the use of instruments that are sensitive to particular wavelengths, including light from frequencies beyond the visible light range, e.g., IR and ultraviolet. Spectral imaging can allow extraction of additional information that the human eye fails to capture with its receptors for red, green, and blue. The use of multi-spectral imaging is described in greater detail under the heading “Advanced Imaging Acquisition Module” in U.S. Patent Application Publication No. US 2019-0200844 A1 (U.S. patent application Ser. No. 16/209,385), titled METHOD OF HUB COMMUNICATION, PROCESSING, STORAGE AND DISPLAY, filed Dec. 4, 2018, the disclosure of which is herein incorporated by reference in its entirety. Multi-spectrum monitoring can be a useful tool in relocating a surgical field after a surgical task is completed to perform one or more of the previously described tests on the treated tissue.
It is axiomatic that strict sterilization of the operating room and surgical equipment is required during any surgery. The strict hygiene and sterilization conditions required in a “surgical theater,” i.e., an operating or treatment room, necessitate the highest possible sterility of all medical devices and equipment. Part of that sterilization process is the need to sterilize anything that comes in contact with the patient or penetrates the sterile field, including the imaging device and its attachments and components. It will be appreciated that the sterile field may be considered a specified area, such as within a tray or on a sterile towel, that is considered free of microorganisms, or the sterile field may be considered an area immediately around a patient who has been prepared for a surgical procedure. The sterile field may include the scrubbed team members, who are properly attired, and all furniture and fixtures in the area.
As shown in
In an example, multiple surgical hub/edge devices may be associated with respective operating rooms. A patient 53005 may be undergoing a surgery in the operating room. The operating room(s) may include one or more surgical devices (e.g., surgical instruments A 53010, B 53015, and C 53020). The terms surgical devices and surgical instruments may be used interchangeably herein. The surgical devices may be used (e.g., autonomously or manually by a healthcare professional) to perform various tasks associated with a surgical procedure on a patient. How a surgical instrument operates autonomously is described in greater detail under the heading “METHOD OF CONTROLLING AUTONOMOUS OPERATIONS IN A SURGICAL SYSTEM” in U.S. patent application Ser. No. 17/747,806, filed May 18, 2022, the disclosure of which is herein incorporated by reference in its entirety. For example, the surgical device may be an endocutter. The surgical device may be in communication with the surgical hub/edge device 53000 located within the operating room. The surgical hub/edge device 53000 may instruct the surgical device about information related to the surgery being performed on the patient 53005. In examples, the surgical hub/edge device 53000 may set a parameter of the surgical instrument (e.g., device) by sending the surgical device a message, which may be in response to the surgical instrument sending a request message to the surgical hub/edge device 53000 for the parameter. For example, the surgical hub/edge device 53000 may send the surgical device information indicative of a firing rate for the endocutter to be set to during a stage of the surgery.
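One possible shape for this request/response exchange is sketched below in Python; the message field names (e.g., "parameter", "value") and the firing-rate figure are assumptions for illustration, not a defined protocol:

    def parameter_request(instrument_id, parameter):
        # The surgical instrument asks the hub/edge device for a parameter value.
        return {"type": "parameter_request", "from": instrument_id,
                "parameter": parameter}

    def hub_handle_request(request, stage_plan):
        # The hub/edge device replies with the value appropriate to the
        # current stage of the surgery.
        return {"type": "parameter_set", "to": request["from"],
                "parameter": request["parameter"],
                "value": stage_plan[request["parameter"]]}

    stage_plan = {"firing_rate": 2.5}  # hypothetical endocutter firing rate
    print(hub_handle_request(parameter_request("endocutter-01", "firing_rate"),
                             stage_plan))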
Surgical information (e.g., surgical data associated with a patient/healthcare professional/surgical device) that is associated with a surgical procedure may be generated (e.g., by a monitoring subsystem located at the surgical hub/edge device 53000 or locally by the surgical device). For example, the surgical information may be based on the performance of the surgical instrument. For example, the surgical data may be associated with physical measurements, physiological measurements, and/or the like. The measurements are described in greater detail under the heading “Monitoring Of Adjusting A Surgical Parameter Based On Biomarker Measurements” in U.S. patent application Ser. No. 17/156,28, filed Nov. 10, 2021, the disclosure of which is herein incorporated by reference in its entirety.
Surgical information related to a surgical procedure being performed in the operating room may be sent to the local surgical hub/edge device 53000. For example, the operating room may include a surgical display. As the surgical procedure is being performed (e.g., by the healthcare professionals), surgical data (e.g., surgical data associated with measurements taken from a surgical display) may be sent to the surgical hub/edge device 53000 where it may be analyzed. The surgical hub/edge device 53000 may further send the surgical information for analysis to an enterprise cloud server (not shown).
As shown in
The surgical hub/edge device 53000, based on a surgical procedure, for example, may break down the surgical procedure into surgical tasks or surgical steps. The surgical hub/edge device 53000 may maintain the sequence of the surgical tasks or surgical steps in a subsystem or a module (e.g., a surgical plan module) located locally at the surgical hub/edge device 53000. The surgical hub/edge device 53000 (e.g., as a part of a discovery process) may perform discovery of surgical devices or surgical instruments that are associated with the surgical procedure and/or the surgical steps of the surgical procedure. For example, the surgical hub/edge device 53000 may identify that a colectomy is being performed and that the first step of the colectomy is severing tissue that is attached to the colon, thereby mobilizing the colon. Based on this information, the surgical hub/edge device 53000 may send one or more discovery request messages to various surgical devices or surgical instruments that are to be used during the surgical procedure. The surgical hub/edge device 53000, in response to the request messages, may receive response messages from various surgical instruments. The response messages from the surgical instruments may include respective identification (e.g., which may be referred to as type 53025) and surgical instrument capabilities (e.g., which may be referred to as parameters 53030), as described with respect to
The discovery request message may include an indication that the surgical hub/edge device 53000 is requesting information (e.g., characteristics and capabilities) associated with the surgical instrument. In response, the surgical instrument may include the requested information (e.g., surgical characteristics and/or surgical parameters 53030) associated with the surgical instrument. For example, the characteristics may include a range of frequencies that the surgical instrument is capable of operating in. The surgical characteristics may include a power rating associated with the surgical instrument. In an example, the surgical hub/edge device 53000 may perform discovery of instruments based on a surgical procedure plan associated with the current surgical procedure.
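A minimal sketch of this discovery exchange follows; the message structure and the example capability fields (frequency range, power rating) are assumptions rather than a defined protocol:

    def discovery_request(hub_id):
        # The hub/edge device asks an instrument for its characteristics
        # and capabilities.
        return {"type": "discovery_request", "from": hub_id}

    def discovery_response(instrument):
        # The instrument answers with its type and its capabilities.
        return {"type": "discovery_response",
                "instrument_type": instrument["type"],
                "parameters": instrument["capabilities"]}

    instruments = [
        {"type": "endocutter",
         "capabilities": {"freq_range_hz": (50, 500), "power_rating_w": 60}},
        {"type": "energy_device",
         "capabilities": {"freq_range_hz": (100, 1000), "power_rating_w": 120}},
    ]
    for response in map(discovery_response, instruments):
        print(response)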
The surgical hub/edge device 53000, based on the characteristics or parameters and the type of the surgical instrument, for example, may determine whether to establish a connection with the surgical instrument. For example, the surgical hub/edge device 53000 may determine that one of the responsive surgical instruments is an endocutter with a frequency operating range that is not suitable for the anastomosis step of the colectomy. Based on this determination, the surgical hub/edge device 53000 may determine not to establish a connection with the endocutter.
In an example, as a part of the discovery process, the surgical hub/edge device 53000 may assign an identification to the surgical instruments (e.g., each of the surgical instruments) that are involved in a surgical procedure and with which the surgical hub/edge device 53000 may establish a connection. For example, after determining whether to establish a connection with a surgical instrument based on the surgical type 53025 and the parameters 53030, the surgical hub/edge device 53000 may assign an identification tag 53035 to the surgical instrument and may send the identification tag 53035 to the surgical instrument. As described herein, the identification tag 53035 may be used by the surgical hub/edge device 53000 and/or by the monitoring surgical instrument when requesting data associated with a surgical instrument.
In an example, a surgical hub/edge device 53000 may determine a role (e.g., monitoring surgical instrument or a peer surgical instrument that is being monitored) associated with each of the surgical instruments that are part of a surgical ecosystem. For example, the surgical hub/edge device 53000 may assign one surgical instrument to be a monitoring surgical instrument and another surgical instrument a peer surgical instrument that is being monitored by the monitoring surgical instrument. The assignment of a role may include assignment of respective privileges associated with a surgical instrument, as described with respect to
In an example, a surgical instrument may be preconfigured with a configuration that may enable it to assume the role of a monitoring surgical instrument or a peer surgical instrument that is being monitored. The surgical instrument may be configured and enabled as a monitoring surgical instrument or a peer surgical instrument. In an example, a surgical instrument may determine or select its role based on one or more of the following: the type of the peer surgical instrument or the role assumed by the peer surgical instrument, the surgical instrument capabilities of the peer surgical instrument, the surgical step being performed, the surgical procedure being performed, and/or the like. In an example, the surgical instrument may be configured with such information or may request such information from the surgical hub it has established a connection with. After selecting or enabling a particular role, the surgical instrument may send an indication of its selected role to another surgical instrument and/or the surgical hub.
In an example, if two or more of the surgical instruments indicate that they have assumed the monitoring role, the surgical instruments involved may negotiate to determine which of the surgical instruments should stay in the monitoring role and which of the surgical instruments should change its role to a peer role or have no role. The negotiation may be based at least on the type of the surgical instruments involved, the surgical instrument capabilities of the surgical instruments involved, the surgical step being performed, the surgical procedure being performed, and/or the like. In an example, two surgical instruments may agree that both can operate in a monitoring role. In an example, a surgical instrument may not have a capability of assuming a monitoring role or a peer role. In such a case, no role may be assigned to the surgical instrument and the surgical instrument may not be connected with another surgical instrument.
In an example, the negotiation between the two surgical instruments may comprise transfer of data between the two devices and the application of one or more rules to determine the assignment of roles (e.g., the monitor role vs the monitored role or peer role). The determination may depend on speed or capability of each of the devices, memory capacity of the devices, timing (for example, which device sent the discovery request), an attribute of connectivity between the surgical instruments or surgical devices, etc. The determination may be based on whether the surgical instrument type or surgical device type is used in a surgical task of the surgical procedure and, optionally, the capabilities of the surgical device type required for that task, or the capabilities of the monitoring surgical instrument (e.g., higher processing speed for processing the data, more up-to-date models for processing the data, greater memory, etc.).
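One plausible rule ordering for such a negotiation is sketched below; the priority of tie-breakers (processing speed, then memory, then model version, then discovery timing) is an assumption, not a prescribed policy:

    def negotiate_monitor(a, b):
        # Compare capability fields in priority order; the winner takes the
        # monitoring role and the other device takes the peer role.
        for key in ("processing_speed", "memory_mb", "model_version"):
            if a[key] != b[key]:
                return (a, b) if a[key] > b[key] else (b, a)
        # Final tie-breaker: the device that sent its discovery request first.
        return (a, b) if a["discovery_time"] <= b["discovery_time"] else (b, a)

    dev_a = {"id": "A", "processing_speed": 2.0, "memory_mb": 512,
             "model_version": 3, "discovery_time": 1}
    dev_b = {"id": "B", "processing_speed": 2.0, "memory_mb": 256,
             "model_version": 4, "discovery_time": 0}
    monitor, peer = negotiate_monitor(dev_a, dev_b)
    print(monitor["id"], "monitors", peer["id"])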
In an example, a surgical instrument may be powered on during a surgical procedure in a surgical operating room, for example, after one of the surgical instruments in the surgical operating room has been configured as a monitoring surgical instrument. In such a case, the newly powered surgical instrument may determine that one of the surgical instruments is acting as a monitoring surgical instrument and it may then assume its role as a peer surgical instrument and establish a connection with the existing monitoring surgical instrument. In an example, the existing monitoring surgical instrument may indicate to the newly added surgical instrument its status of being a monitoring surgical instrument.
In an example, once a surgical instrument assumes its role as a monitoring surgical instrument, it may then have the ability to directly monitor the performance and pull data directly from the surgical instruments without the use of the surgical hub/edge device 53000. In an example, a monitoring surgical instrument may request information about peer surgical instruments from the surgical hub/edge device 53000. For example, as shown in
In an example, the surgical instrument A 53010 may monitor surgical data (at surgical instruments B 53015 and/or C 53020) that is associated with a surgical step of a surgical procedure. In an example, surgical instrument A 53010 may request surgical information or surgical data (e.g., send a message requesting data) associated with the performance of surgical tasks being performed on each of the surgical instruments B 53015 and C 53020 directly from surgical instruments B 53015 and C 53020 without involving the surgical hub/edge device 53000. In an example, surgical instrument A 53010 may request data associated with the performance of surgical tasks being performed on each of surgical instrument B 53015 and surgical instrument C 53020 from the surgical hub/edge device 53000 or via the surgical hub/edge device 53000. As described with respect to
Monitoring a surgical device or a surgical instrument may include the monitoring surgical device (e.g., the monitoring surgical instrument on its own or the monitoring surgical instrument in collaboration with the surgical hub/edge device 53000) gathering surgical information associated with the patient, the healthcare provider, and/or a surgical task being performed by a surgical instrument that is being monitored. The surgical information associated with the patient and/or the healthcare professional may include measurements related to physical conditions, physiological conditions, and/or the like. The surgical information associated with the surgical instrument may include performance metrics associated with the surgical instrument or a task being performed by the surgical instrument.
Whether the monitoring surgical instrument is capable of directly interacting with the peer surgical instruments may be determined by a machine learning model 53040 located at the surgical hub/edge device 53000, as described herein in
In an example, the monitoring surgical instrument may receive (e.g., from the surgical hub/edge device 53000) a list of potential peer surgical instruments it may monitor. The monitoring surgical instrument may also receive an indication identifying the peer surgical instruments that the monitoring surgical instrument may be able to monitor directly and the peer surgical instruments that the monitoring surgical instrument may be able to monitor in collaboration with the surgical hub/edge device 53000.
In an example, the monitoring surgical instrument may receive an indication to monitor a set of peer surgical instruments. The indication may include a list of the identification tags 53035 associated with the peer surgical instruments. The monitoring surgical instrument may store the list of the peer surgical instruments to be monitored locally (e.g., in local memory).
In an example, the surgical hub/edge device 53000 may obtain a list of surgical instruments that may be utilized during a surgical procedure. As part of the surgical procedure, for example, the surgical hub/edge device 53000 may determine the roles to be assigned to the surgical instruments. The surgical hub/edge device 53000 may communicate the roles to the devices involved, for example, by sending messages to the surgical instruments.
In an example, the surgical hub/edge device 53000 may update roles and/or privileges assigned to the surgical instruments. For example, the roles may be updated during transitioning from one surgical step of a surgical procedure to another surgical step of the surgical procedure. In an example, a surgical instrument that may have been previously assigned a monitoring role may be updated to a peer surgical instrument and may be monitored by another surgical instrument, for example, a newly powered surgical instrument. The surgical hub/edge device 53000 may send an update message to the surgical instrument indicating for the surgical instrument to change its role from a monitoring surgical instrument to a peer surgical instrument. The surgical hub/edge device 53000 may also indicate to the surgical instrument an identification of a new monitoring surgical instrument.
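A role-update message of the kind described above might look as follows; the field names are illustrative assumptions:

    def role_update(instrument_id, new_role, new_monitor_id=None):
        # Sent by the hub/edge device at a surgical step transition.
        message = {"type": "role_update", "to": instrument_id, "role": new_role}
        if new_monitor_id is not None:
            # A demoted monitor is told which instrument now monitors it.
            message["monitor_id"] = new_monitor_id
        return message

    print(role_update("instrument-A", "peer", new_monitor_id="instrument-D"))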
In an example, surgical instrument A 53010 may receive surgical information directly from surgical instrument C 53020. The surgical instrument A 53010 may receive the surgical information periodically or aperiodically (e.g., based on completion of a surgical task at the surgical instrument B 53015 or C 53020, or based on a triggering condition being met (for example, commencing and/or finishing certain instrument operations such as clamping, firing, etc., or when a derived parameter falls outside of an expected range/threshold)). For example, surgical instrument A 53010 may request and/or receive surgical parameters related to a tissue it may be dissecting or mobilizing. In an example, surgical instrument A 53010 may request and/or receive surgical information associated with a surgical task of a surgical procedure from surgical instrument B 53015 indirectly via the surgical hub/edge device 53000.
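The triggering conditions mentioned above might be sketched as follows; the event names and the impedance range are placeholders, not defined values:

    def should_report(event, derived_value, expected_range):
        # Report on completion of certain instrument operations, or whenever
        # a derived parameter falls outside its expected range/threshold.
        low, high = expected_range
        return (event in ("clamping_complete", "firing_complete")
                or not (low <= derived_value <= high))

    # Out-of-range tissue impedance triggers a report even without an event.
    print(should_report(event=None, derived_value=130.0,
                        expected_range=(20.0, 120.0)))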
In an example, as described with respect to
In an example, a local machine learning model 53040 located within the surgical hub/edge server device 53000 may use surgical information and surgical parameters associated with a patient, a healthcare professional and/or a surgical instrument to predict settings for a surgical instrument or identify a surgical instrument part (e.g., a cartridge) as an outcome. The surgical hub/edge device 53000 may send the predicted outcome to the monitoring surgical instrument.
In an example, a local machine learning model may reside in a peer surgical instrument, as described herein in
In an example, a local machine learning model may reside in a monitoring surgical instrument as described herein in
In an example, the surgical procedure to be performed may be a colectomy. At the anastomosis step of the surgical procedure, an endocutter may be configured or configure itself to be the monitoring surgical instrument, while an energy device may be configured to be a peer surgical instrument, to be monitored by the monitoring surgical instrument. The energy device (being a surgical instrument that is being monitored) may send surgical information to the endocutter (the monitoring device). The surgical information may include information about the anatomy of the tissue it observes, such as the tissue's thickness. In an example, the energy device may send the surgical data based on a request it receives from the endocutter. In an example, the energy device may send surgical information to the endocutter based on a triggering condition being met, as described herein. In an example, the surgical information may be sent periodically to the endocutter (e.g., based on a timer configured at the energy device). The endocutter may store the surgical data and perform analysis on the surgical data, as described herein. In examples, the monitoring surgical instrument (e.g., the endocutter) may provide recommendations to the surgical instrument being monitored (e.g., the energy device) to adjust one or more of its parameters (e.g., surgical instrument parameters) based on the analysis of the tissue thickness. For example, the endocutter may analyze the tissue thickness and identify an irregularity in the tissue thickness. Based on this analysis, the endocutter may send a recommendation (e.g., an updated recommendation) to the energy device to set its power settings accordingly, for example, when performing a surgical task of the surgical procedure.
Analysis performed in the endocutter may involve a machine learning model 53040, which may take the data (e.g., measurements) from the energy device as input and output recommendations for setting one or more instrument parameters. The endocutter may, based on the surgical data (e.g., surgical measurements) received from the energy device, send a recommendation to a third device performing or assisting in performing the surgical step at hand. For example, measurements received by the endocutter from the energy device may indicate that the tissue thickness of the patient is larger than average. Based on this, the endocutter may send a message to a third device, such as a robotic arm or a clamp, to reorient itself in a different position (e.g., based on the large tissue thickness), which may allow the energy device more freedom to operate within the surgical site. The endocutter may, based on the surgical information (e.g., surgical measurements) received from the energy device, send a recommendation to a device performing or assisting in performing a surgical task (e.g., a future surgical task).
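As an illustration, the endocutter-side analysis might reduce to something like the following sketch; the thickness threshold, power figure, and device names are assumptions for illustration only:

    def recommend(tissue_thickness_mm, average_mm=2.0):
        # Thicker-than-average tissue: recommend a power setting to the
        # energy device and a repositioning to an assisting third device.
        recommendations = []
        if tissue_thickness_mm > average_mm:
            recommendations.append({"to": "energy_device",
                                    "set": {"power_w": 45}})
            recommendations.append({"to": "robotic_arm", "action": "reorient"})
        return recommendations

    for message in recommend(tissue_thickness_mm=3.1):
        print(message)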
In an example, the surgical hub/edge device 53045 may statically obtain a list of surgical instruments present in the operating room and information about their respective surgical instrument type and/or surgical instrument capabilities from a surgical procedure plan or a surgical instrument list associated with a surgical procedure (e.g., a list of surgical instruments that have been activated and are to be used in a surgical procedure).
In an example, as illustrated in
At 53072, each of the surgical instruments may determine its surgical instrument type and surgical instrument capabilities. In an example, the surgical instrument may be configured (e.g., pre-configured) with a surgical instrument type and a set of surgical instrument capabilities.
At 53075, each of the surgical instruments, in response to the discovery request message 53070, may send a response message 53075 to the surgical hub/edge server 53045. The response message 53075 may include an indication of the surgical instrument type and the surgical instrument capabilities associated with the surgical instrument sending the response message. The surgical instrument capabilities may include qualities related to the performance and/or the intelligence of the surgical instrument. Qualities related to the performance and/or intelligence of the surgical instrument are described in greater detail under the heading “Monitoring Of Adjusting A Surgical Parameter Based On Biomarker Measurements” in U.S. patent application Ser. No. 17/156,28, filed Nov. 10, 2021, the disclosure of which is herein incorporated by reference in its entirety.
The surgical hub/edge device 53045, for example, based on the response message 53075 from the surgical instruments (e.g., each of the surgical instruments), may assign roles to the available surgical instruments B or C. A surgical instrument may be assigned a role as a monitoring surgical instrument (e.g., surgical instrument A) or a peer surgical instrument (e.g., surgical instrument B or C) that is being monitored by the monitoring surgical instrument.
In an example, a surgical instrument (e.g., surgical instrument D 53065), based on its surgical instrument type and/or surgical instrument capabilities information, may not be assigned a monitoring or a peer surgical instrument role. For example, the surgical instrument D 53065 may lack a capability of establishing a point-to-point connection with another surgical instrument. In an example, the surgical hub/edge device 53045, after receiving a response from the surgical instrument D 53065, may determine that a capability of the surgical instrument (e.g., operating power) is not within an acceptable operational range and, therefore, may not assign the surgical instrument a monitoring or a peer role.
In an example, based on a surgical instrument's capabilities, the surgical hub/edge device 53045 may determine that the surgical instrument (e.g., surgical instrument A 53050) is a smart surgical instrument and, therefore, may be assigned the role of a monitoring surgical instrument. Based on at least the determination that the surgical instrument is a smart surgical instrument (e.g., has sufficient processing capability and memory capability for performing monitoring and recording of a surgical task being performed at a peer surgical instrument), the surgical hub/edge device 53045 may assign the surgical instrument the role of a monitoring surgical instrument.
At 53080, the surgical hub/edge device 53045 may send an assignment message to the surgical instrument A 53050 indicating that it has been assigned the role of a monitoring surgical instrument. In the assignment message, the surgical hub/edge device 53045 may include an indication that surgical instrument A 53050 can establish a direct peer-to-peer connection with the surgical instrument B 53055. The surgical hub/edge device 53045 may send another assignment message 53082 to surgical instruments B 53055 and C 53060 indicating that the respective surgical instruments have been assigned the role of a peer surgical instrument. The assignment message 53082 may indicate to surgical instrument B 53055 to establish a direct peer-to-peer connection with the surgical instrument A 53050. The assignment message 53082 may indicate to surgical instrument C 53060 to establish a direct peer-to-peer connection with the surgical instrument A 53050.
As described with respect to
In an example, the local surgical hub/edge device 53045 may indicate that the monitoring surgical instrument may connect indirectly to a peer surgical instrument, for example, such that the monitoring surgical instrument may access the peer surgical instrument's data via the surgical hub/edge device 53045. As described with respect to
At 53084, the monitoring surgical instrument A 53050 may establish peer-to-peer connections with the peer surgical instruments B 53055 and C 53060. The established peer-to-peer connections may be utilized to monitor and/or record surgical information associated with a surgical task being performed on the peer surgical instruments.
In an example, the monitoring surgical instrument may establish connections with the peer surgical instruments at the beginning of a surgical procedure. For example, if the surgical procedure includes surgical steps 1 through K, the peer-to-peer connection establishment may occur as a part of surgical step 1.
In an example, the roles assigned to a surgical instrument may be altered at a transition from one surgical step to a subsequent surgical step. For example, during the transition from surgical step one to surgical step two of the surgical procedure, the assigned role of surgical instrument A 53050 may be altered from a monitoring surgical instrument to a peer surgical instrument. In such a case, during surgical step two, surgical instrument A 53050, with its newly assigned role, may no longer have the privileges of a monitoring surgical instrument.
As the surgical instruments perform their respective surgical tasks associated with the surgical step, they may generate surgical information related to how they are performing their surgical tasks. This surgical data may be sent to or accessed by the monitoring surgical instrument 53050, either directly without involving the surgical hub/edge device 53045 or indirectly via the surgical hub/edge device 53045.
At 53091, peer surgical instruments B 53055 and C 53060 may generate surgical information associated with a patient, a healthcare professional, or a surgical task performed by a surgical instrument. At 53092, the peer surgical instrument B 53055 may send the surgical information to the monitoring surgical instrument A 53050 using the peer-to-peer connection established at 53084, for example. At 53093, the peer surgical instrument C 53060 may send the surgical information to the monitoring surgical instrument A 53050 using the peer-to-peer connection established at 53084, for example. The surgical information transfer between the monitoring surgical instrument A 53050 and the peer surgical instruments B 53055 and/or C 53060 may be performed under supervision of the surgical hub/edge device 53045.
As illustrated in
In an example, the surgical instrument A 53095 may be a smart surgical instrument. For example, the surgical instrument may determine that it is capable of operating independently, identifying surgical instruments other than itself, and communicating with the identified surgical instruments over a network. The network may be a local area network (LAN), a wireless interface (e.g., a WiFi interface (WiFi 6, WiFi6E, etc.), a Bluetooth X interface, etc.), and/or an optical interface (e.g., a fiber optic-based LAN). The devices in the network may include a smart computing device (e.g., a smart surgical hub) or a server (e.g., an edge server) at the center of the network. The network may be located inside a secured boundary (e.g., a HIPAA boundary).
In an example, the surgical instrument may identify and/or monitor other devices without utilizing the centralized computing device. In such a configuration, surgical information (e.g., surgical information associated with a surgical task) may be exchanged directly between the smart surgical instruments without utilizing a central surgical computing device or a server. In an example, the surgical instrument may determine that it has a capability of being a monitoring device, i.e., monitoring and/or recording surgical information associated with one or more surgical tasks being performed at other surgical instruments (e.g., other peer surgical instruments). In an example, the surgical instrument may be capable of monitoring communication between two smart devices and recording aspects of their interaction or streams to monitor their operation. In an example, the surgical instrument may be capable of monitoring its own operation. Based on at least these determinations, the surgical instrument A 53095 may configure itself as a monitoring surgical instrument.
In an example, the surgical instrument A 53095 may analyze the surgical instrument capabilities information it may receive from a set of peer surgical instruments (e.g., surgical instrument B 53105 and surgical instrument C 53110). Based on the analysis of the surgical instrument capabilities information (e.g., limitations of the peer surgical instruments) associated with the set of peer surgical instruments, the surgical instrument A 53095 may determine that it is the only smart surgical instrument, or one of the smart surgical instruments, to be utilized during the surgical procedure. Accordingly, the surgical instrument A may configure itself as a monitoring surgical instrument.
In an example, one of the smart surgical instruments being utilized in a surgical procedure may determine that a plurality of other smart surgical instruments is also being utilized in the surgical procedure. The smart surgical instrument, as a part of a discovery procedure for example, may obtain the firmware/software versions (e.g., the version of ML software) running on each of the smart surgical instruments being utilized in the surgical procedure. The smart surgical instrument may compare its firmware/software version with the firmware/software versions of the other surgical instruments and determine that it is running the latest version of firmware/software. Based on this determination, the smart surgical instrument may configure itself as a monitoring surgical instrument.
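A sketch of that version comparison follows, assuming versions are exchanged as comparable tuples during discovery; the instrument names and version numbers are placeholders:

    def newest_instrument(instruments):
        # The instrument running the latest ML software version configures
        # itself as the monitoring surgical instrument.
        return max(instruments, key=lambda i: i["ml_version"])

    fleet = [{"id": "stapler", "ml_version": (2, 4, 1)},
             {"id": "energy_device", "ml_version": (2, 3, 9)}]
    print(newest_instrument(fleet)["id"], "assumes the monitoring role")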
The surgical instrument A 53095 may initiate a discovery procedure. The surgical instrument A 53095 may obtain (e.g., obtain from a pre-configuration or obtain from a surgical hub/edge device 53100) a list of the surgical instruments that may be utilized during a surgical procedure. At 53120, the instrument A 53095 may send discovery message(s) to one or more surgical instruments and/or surgical hub/edge devices that may be part of a surgical procedure, for example.
In an example, the surgical hub/edge device 53100 may assign the roles of the surgical instruments (e.g., as described with respect to
At 53122, the surgical instruments may determine their respective surgical instrument type and surgical instrument capabilities. In an example, the surgical instrument may be configured (e.g., pre-configured) with a surgical instrument type and/or a set of surgical instrument capabilities. The surgical instrument type and surgical instrument capabilities may be stored in the surgical instrument's local memory.
At 53125, each of the surgical instruments and the surgical hub that received a discovery message from a monitoring surgical device may respond with a response message. The response message sent by each of the surgical instruments or received by the monitoring surgical device may include an indication of the surgical instrument type and surgical instrument capabilities, e.g., as determined at 53122. The surgical instrument capabilities may include qualities related to the performance and/or the intelligence of the surgical instrument, which may be described in greater detail under the heading “Monitoring Of Adjusting A Surgical Parameter Based On Biomarker Measurements” in U.S. patent application Ser. No. 17/156,28, filed Nov. 10, 2021, the disclosure of which is herein incorporated by reference in its entirety.
The monitoring surgical instrument (e.g., surgical instrument A 53095), for example, based on the response messages from the surgical instruments, may assign a role of a peer surgical instrument to the available surgical instruments B/C/D and/or the surgical hub/edge device 53100. The peer surgical instrument role assignment may be based on selection criteria that may include the surgical instrument type, the surgical instrument capabilities, the surgical step of the surgical procedure, the surgical procedure, etc.
In an example, a surgical instrument (e.g., surgical instrument D 53115), based on its surgical instrument type and/or surgical instrument capabilities information, may not be assigned a peer surgical instrument role. For example, the surgical instrument D 53115 may lack a capability of establishing a point-to-point connection with another surgical instrument. In an example, the monitoring surgical instrument 53095, after receiving a response from the surgical instrument D 53115, may determine that a capability of the surgical instrument (e.g., operating power) is not within an acceptable operational range and, therefore, may not assign the surgical instrument a peer role.
At 53130, the monitoring surgical instrument A 53095 may send an assignment message to each of the surgical instruments B/C and surgical hub/edge device 53100 indicating that each of the surgical instruments B/C and surgical hub/edge device 53100 has been assigned the role of a peer surgical instrument. In an example, the assignment message may include the privileges associated with the peer role that has been assigned to a surgical instrument and/or the surgical hub. For example, the monitoring surgical instrument may assign surgical instrument B 53105 and surgical instrument C 53110 as peer surgical instruments.
In an example, in the assignment message, the monitoring surgical instrument A 53095 may include an indication that the surgical instrument may establish a peer-to-peer connection with surgical instrument A 53095. In an example, as part of the establishment of the peer-to-peer connection, the surgical instrument A 53095 and the peer surgical instrument may optimize various parameters of the peer-to-peer connection (e.g., surgical data sharing, data transfer speeds, etc.).
At 53131, the monitoring surgical instrument A 53095 may establish a peer-to-peer connection with a surgical computing device/edge server 53100. The established peer-to-peer connection may be utilized to monitor and/or record surgical information on the surgical computing device/edge server 53100.
At 53132, the monitoring surgical instrument A 53095 may establish a peer-to-peer connection with a peer surgical instrument B 53105. The established peer-to-peer connection may be utilized to monitor and/or record surgical information on the peer surgical instrument B 53105.
At 53133, the monitoring surgical instrument 53095 may establish a peer-to-peer connection with a peer surgical instrument C 53110. The established peer-to-peer connection may be utilized to monitor and/or record surgical information on the peer surgical instrument C 53110.
In an example, the monitoring surgical instrument A 53095 may establish direct peer-to-peer connections with the peer surgical instruments at the beginning of a surgical procedure. For example, if the surgical procedure includes surgical steps 1 through K, the peer-to-peer connection establishment may occur as a part of surgical step 1.
At 53126, peer surgical instruments B 53105 and C 53110 may generate surgical information associated with a patient, healthcare professional, or a surgical instrument. At 53127, the surgical instrument may send the surgical information to the monitoring surgical instrument A 53095.
In an example, a monitoring surgical instrument, for example, a smart surgical stapling device, may identify a surgical energy device to be used during a surgical procedure in an operating room. The smart surgical stapling device may retrieve the capabilities of the surgical energy device and configure it as a peer surgical instrument to be monitored by the smart surgical stapling device. The smart surgical stapling device may establish a peer-to-peer connection with the surgical energy device. As part of a surgical task, the surgical energy device may be used for dissecting and/or mobilizing a tissue. During this surgical task, the energy device may record and/or process the tissue viability, for example, based on feedback of the various surgical parameters collected by the surgical energy device. The surgical parameters may include power, time, impedance, etc. The smart surgical stapling device may directly obtain the information collected by the energy device (e.g., parameters including power, time, impedance) via the established peer-to-peer connection. In an example, the energy device may calculate surgical instrument settings, such as the initial starting speed of the motor for firing the staples, and send them to the smart surgical stapling device. In an example, based on the information directly obtained from the energy device, the smart surgical stapling device may calculate the initial starting speed of the motor for firing the staples. In an example, based at least on the information directly obtained from the energy device, the smart surgical stapler may identify an optimal location for tissue dissection with a stapling device. The location for tissue dissection may be based on tissue properties, the disease state of the tissue, or areas selected to avoid vessels. In an example, based at least on the area dissected, the energy device may identify the cartridge (e.g., the size of the cartridge (45 mm or 60 mm) and/or the color of the cartridge (e.g., blue)) based on tissue thickness collected on the jaw. The energy device may communicate the cartridge identification information directly to the smart surgical stapling device using the peer-to-peer connection between the energy device and the smart surgical stapling device.
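The cartridge identification at the end of this example might be sketched as follows; the size and color cutoffs are placeholders, not clinical values:

    def identify_cartridge(tissue_thickness_mm, dissected_length_mm):
        # Choose the cartridge size from the dissected area and the color
        # from the tissue thickness collected on the jaw (thresholds
        # illustrative).
        size_mm = 60 if dissected_length_mm > 45 else 45
        color = "blue" if tissue_thickness_mm < 2.0 else "green"
        return {"size_mm": size_mm, "color": color}

    print(identify_cartridge(tissue_thickness_mm=1.6, dissected_length_mm=50))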
In an example, the interconnections may be altered at a transition from one surgical step to a subsequent surgical step. For example, from surgical step one to surgical step two, the interconnections and the assignments of the privileges may be adjusted. For example, with respect to
In an example, as the surgical instruments are performing their respective surgical tasks associated with the surgical step, they may generate surgical data related to how they are performing their surgical tasks, which may be described in greater detail under the heading “Monitoring Of Adjusting A Surgical Parameter Based On Biomarker Measurements” in U.S. patent application Ser. No. 17/156,28, filed Nov. 10, 2021, the disclosure of which is herein incorporated by reference in its entirety. This surgical data may be accessed by the monitoring surgical instrument, either directly as described here with respect to
The surgical information or surgical data measurements may be associated with one or more actuators located within the operating room. For example, surgical information may be generated from potentiometer readings. This surgical information may be associated with an orientation of the surgical instrument. The surgical information may be used in evaluating how the surgical instrument is performing its individual surgical tasks as described with respect to
As illustrated in
The processor 53137 in the surgical computing device or a monitoring surgical device 53135 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) circuits, any other type of integrated circuit (IC), a state machine, and the like. The processor 53137 may perform data processing, authentication, input/output processing, and/or any other functionality that may enable the surgical computing device or a monitoring surgical device 53135 to operate in an environment that is suitable for performing surgical procedures. The processor 53137 may be coupled with a transceiver (not shown). The processor 53137 may use the transceiver (not shown in the figure) to communicate with the peer surgical instrument 53140.
The memory 53139 in the surgical computing device or the monitoring surgical instrument 53135 may be used to store where surgical information was sent. For example, the memory may be used to recall that surgical information was sent to the peer surgical instrument 53140. The memory may include a database and/or lookup table. The memory may include virtual memory which may be linked to servers located within the protected network.
The processor 53137 in the surgical computing device or the monitoring surgical instrument 53135 may access information from, and store data in, any type of suitable memory (e.g., a non-removable memory and/or the removable memory). The non-removable memory may include random-access memory (RAM), read-only memory (ROM), a hard disk, a solid-state drive or any other type of memory storage device. The removable memory may include secure digital memory.
The processor 53137 in the surgical computing device or a monitoring surgical device 53135 may access information from, and store data in, an extended storage 53144 (e.g., a non-removable memory and/or the removable memory). In an example, the processor 53137 may access information from, and store data in, memory that is not physically located on the surgical computing device or the monitoring surgical instrument 53135, such as on a server or a secondary edge computing system (not shown).
The processor 53137 in the surgical computing device or a monitoring surgical device 53135 may utilize the machine learning model 53143 to predict parameters associated with a surgical instrument or identify a part of a surgical instrument (e.g., a stapler cartridge), as described herein. The processor 53137 may use the transceiver (not shown in the figure) to directly communicate the surgical information, the predicted surgical parameters, or the predicted identification of a surgical part to the peer surgical instrument 53140. The direct communication between the surgical computing device or the monitoring surgical instrument 53135 and the peer surgical instrument 53140 may occur using the established peer-to-peer connection 53145.
As further illustrated in
The processor 53136 in the peer surgical instrument 53140 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) circuits, any other type of integrated circuit (IC), a state machine, and the like. The processor 53136 may perform data processing, authentication, input/output processing, and/or any other functionality that may enable the peer surgical instrument 53140 to operate in an environment that is suitable for performing surgical procedures. The processor 53136 may be coupled with a transceiver (not shown). The processor 53136 may use the transceiver (not shown in the figure) to communicate with the surgical computing device or the monitoring surgical instrument 53135.
The memory 53138 in the peer surgical instrument 53140 may be used to store where surgical information was sent. For example, the memory may be used to recall that surgical information was sent to the monitoring surgical instrument 53135. The memory may include a database and/or lookup table. The memory may include virtual memory which may be linked to servers located within the protected network.
The processor 53136 in the peer surgical instrument 53140 may access information from, and store data in, any type of suitable memory (e.g., a non-removable memory and/or the removable memory). The non-removable memory may include random-access memory (RAM), read-only memory (ROM), a hard disk, a solid-state drive or any other type of memory storage device. The removable memory may include secure digital memory.
The processor 53136 in the peer surgical instrument 53140 may access information from, and store data in, an extended storage 53148 (e.g., a non-removable memory and/or the removable memory). In an example, the processor 53136 may access information from, and store data in, memory that is not physically located on the peer surgical instrument 53140, such as on a server or a secondary edge computing system (not shown).
The processor 53136 in the peer surgical instrument 53140 may utilize the local machine learning model 53145 to predict parameters associated with a surgical instrument or identify a part of a surgical instrument (e.g., a stapler cartridge), as described herein. The processor 53136 may use the transceiver (not shown in the figure) to directly communicate to the monitoring surgical instrument 53135 the surgical information, the predicted surgical parameters, or the predicted identification of a surgical part. The direct communication between the peer surgical instrument 53140 and the surgical computing device or the monitoring surgical instrument 53135 may occur using the established peer-to-peer connection over interface 53145.
At 53152, the surgical instrument may determine that the second surgical instrument has capability of being a peer surgical instrument that may be monitored by it. The capability of being a peer surgical instrument may include having a capability to establish a peer-to-peer connection with a monitoring surgical instrument and/or having a capability of gathering surgical data associated with a patient, a healthcare professional, or a surgical instrument and sending gathered surgical information to the monitoring surgical instrument. The surgical instrument may configure the second surgical instrument as a peer surgical instrument.
At 53154, the surgical instrument may establish a peer-to-peer connection with the second surgical instrument. The peer-to-peer connection is established between the first surgical instrument and the second surgical instrument for the first surgical instrument to monitor and record surgical information associated with a surgical task on the second surgical instrument.
At 53156, the surgical instrument may begin monitoring and recording of surgical data associated with the second surgical instrument using the established peer-to-peer connection with the second surgical instrument.
At 53159, the first surgical instrument may receive an indication of a response message from each of the set of second surgical instrument(s). The indication of the response message may include an indication of a surgical instrument type and an indication of a capability of each of the second surgical instruments. Based on the surgical instrument type and the capability of the surgical instrument, the first surgical instrument may determine each of the second surgical instrument(s) to be a peer surgical instrument. An assignment message from the first surgical instrument to each of the set of second surgical instrument(s) may indicate the assigned role.
At 53160, based at least on the surgical instrument type and capability of each of the second surgical instruments, the first surgical instrument may determine that each of the set of second surgical instrument(s) is a peer surgical instrument.
The first surgical instrument may be able to monitor one of the second surgical instruments. The first surgical instrument may be a monitoring surgical instrument and may be able to access data of the one of the second surgical instruments that has been assigned the role of a peer surgical instrument. In an example, the first surgical instrument may be able to set a parameter of the second surgical instrument based on the accessed surgical data.
In an example, the roles of the first surgical instrument and the second surgical instrument may be determined based on a negotiation between the surgical instrument and each of the second surgical instrument(s).
In an example, the first surgical instrument, based at least on its own surgical instrument type and capabilities information, may assume the role of a monitoring surgical instrument.
In an example, the first surgical instrument may be a smart surgical instrument (e.g., operating within an interconnected network) and may be capable of understanding the limitations of the second surgical instruments used in the surgical procedure. This may include the instrument realizing that it is the only smart instrument in the procedure, as well as identifying other instruments that have surgical instrument capabilities for sharing data.
In an example, a set of surgical instruments may be utilized in performing a surgical procedure. Some of the surgical instruments, for example, a smart stapling device, may be smarter and/or more advanced than others, for example, an energy device. The advancement of the smart stapling device over the energy device may be based on the revision or level of software (e.g., machine learning software) installed on each of the surgical devices.
In an example, during the startup of the procedure, the smart surgical stapler may obtain information about other surgical instruments that may be active and/or inter-connected to the ecosystem. The smart surgical stapler may have confirmation of other device availability, which may be identified based on the instruments available and the operations capable of being performed during the surgery based on the instruments in the operating room. Based on the identification of the available instruments, the smart surgical stapler may attempt to connect directly to the other instruments to have a peer-to-peer connection, which may optimize data sharing, transfer speeds, and/or the like. For example, an energy device may be used for dissecting and mobilizing tissue. During a process, it may be recording/processing the tissue viability based on feedback of the parameters collected by the energy device (e.g., power, time, impedance, etc.). The information collected from the energy device may be communicated to the surgical stapler to indicate the initial starting speed of the motor for firing the staples. This data may be sent to the surgical stapler directly, which may identify an optimal location for tissue dissection with a stapling device based on tissue properties and/or a disease state of the tissue, or areas selected to avoid vessels. For example, based on the area dissected, the device may process and communicate to the surgical stapler which cartridge should be used (e.g., 45 mm or 60 mm, and/or a color based on tissue thickness collected on the jaw).
This application is related to the following, filed contemporaneously, the contents of each of which are incorporated by reference herein: Attorney Docket No. END9438USNP1, concurrently filed, titled A METHOD FOR ADVANCED ALGORITHM SUPPORT.