This disclosure relates to surgery and, more particularly, to systems and methods for detecting perfusion in surgery.
Adequate perfusion, or blood supply, at a surgical site is important in order to increase the likelihood of faster and favorable post-surgery healing. For example, one of the main prerequisites for favorable anastomotic healing in low anterior resection (LAR) surgery is to ensure that adequate perfusion is present. Poor perfusion can lead to a symptomatic anastomotic leak (AL) after LAR surgery. ALs after LAR surgery are associated with a high level of morbidity and a leak-related mortality rate as high as 39%.
Any or all of the aspects and features detailed herein, to the extent consistent, may be used in conjunction with any or all of the other aspects and features detailed herein.
Provided in accordance with aspects of this disclosure is a surgical system for detecting perfusion. The surgical system includes at least one surgical camera and a computing device. The at least one surgical camera is configured to obtain image data of tissue at a surgical site including first image data and second image data that is temporally-spaced relative to the first image data. The computing device is configured to receive the image data from the at least one surgical camera and includes a non-transitory computer-readable storage medium storing instructions configured to cause the computing device to detect differences between the first and second image data, determine a level of perfusion in the tissue based on the detected differences between the first and second image data, and provide an output indicative of the determined level of perfusion in the tissue.
In an aspect of this disclosure, the computing device is further caused to amplify the detected differences between the first and second image data. In such aspects, the level of perfusion in the tissue may be determined based on the amplified detected differences between the first and second image data.
In another aspect of this disclosure, the at least one surgical camera includes first and second surgical cameras such that the image data is stereographic image data from the first and second surgical cameras.
In still another aspect of this disclosure, the surgical system further includes an ultraviolet light source configured to illuminate the tissue at the surgical site such that the image data includes ultraviolet-enhanced image data.
In yet another aspect of this disclosure, the image data is video image data, infrared image data, thermal image data, or ultrasound image data.
In still yet another aspect of this disclosure, the level of perfusion is determined by a machine learning algorithm of the computing device. The machine learning algorithm, in such aspects, may be configured to receive the detected differences between the first and second image data and determine the level of perfusion based on the detected differences between the first and second image data. Alternatively, the machine learning algorithm, in such aspects, may be configured to receive the first and second image data, detect the differences between the first and second image data, and determine the level of perfusion based on the detected differences between the first and second image data.
In another aspect of this disclosure, the output indicative of the determined level of perfusion in the tissue includes a visual indicator on a display configured to display a video feed of the surgical site.
In another aspect of this disclosure, the output indicative of the determined level of perfusion in the tissue includes a visual overlay, on a display, over a video feed of the surgical site.
A method for detecting perfusion in surgery in accordance with aspects of this disclosure includes obtaining, from at least one surgical camera, first image data of tissue at a surgical site; obtaining, from the at least one surgical camera, second image data of the tissue at the surgical site that is temporally-spaced relative to the first image data; detecting differences between the first and second image data; determining a level of perfusion based on the detected differences between the first and second image data; and providing an output indicative of the determined level of perfusion.
In an aspect of this disclosure, the method further includes amplifying the detected differences between the first and second image data before determining the level of perfusion in the tissue such that the level of perfusion in the tissue is determined based on the amplified detected differences between the first and second image data.
In another aspect of this disclosure, the at least one surgical camera includes first and second surgical cameras such that obtaining the first and second image data includes obtaining first and second stereographic image data, respectively.
In still another aspect of this disclosure, the method further includes illuminating the tissue at the surgical site with ultraviolet light such that the first image data is ultraviolet-enhanced image data and the second image data is ultraviolet-enhanced image data.
In yet another aspect of this disclosure, obtaining each of the first and second image data includes obtaining video image data, infrared image data, thermal image data, or ultrasound image data.
In still yet another aspect of this disclosure, determining the level of perfusion based on the detected differences between the first and second image data includes implementing a machine learning algorithm. In such aspects, the machine learning algorithm may be configured to receive the detected differences between the first and second image data and determine the level of perfusion based on the detected differences between the first and second image data. Alternatively, in such aspects, the machine learning algorithm may be configured to receive the first and second image data, to detect the differences between the first and second image data, and to determine the level of perfusion based on the detected differences between the first and second image data.
In another aspect of this disclosure, providing the output indicative of the determined level of perfusion in the tissue includes providing a visual indicator on a display configured to display a video feed of the surgical site.
In another aspect of this disclosure, providing the output indicative of the determined level of perfusion in the tissue includes providing a visual overlay, on a display, over a video feed of the surgical site.
The above and other aspects and features of this disclosure will become more apparent in view of the following detailed description when taken in conjunction with the accompanying drawings wherein like reference numerals identify similar or identical elements.
This disclosure provides systems and methods for detecting perfusion during surgery. Although detailed herein with respect to a low anterior resection (LAR) surgical procedure, it is understood that the present disclosure is equally applicable for use in any other suitable surgical procedure.
Referring to
The at least one surgical instrument 11 may include, for example, a first surgical instrument 12a for manipulating and/or treating tissue, a second surgical instrument 12b for manipulating and/or treating tissue, and/or a third surgical instrument 13 for visualizing and/or providing access to a surgical site. The first and/or second surgical instruments 12a, 12b may include: energy-based surgical instruments for grasping, sealing, and dividing tissue such as, for example, an electrosurgical forceps, an ultrasonic dissector, etc.; energy-based surgical instruments for tissue dissection, resection, ablation and/or coagulation such as, for example, an electrosurgical pencil, a resection wire, an ablation (microwave, radiofrequency, cryogenic, etc.) device, etc.; mechanical surgical instruments configured to clamp and close tissue such as, for example, a surgical stapler, a surgical clip applier, etc.; mechanical surgical instruments configured to facilitate manipulation and/or cutting of tissue such as, for example, a surgical grasper, surgical scissors, a surgical retractor, etc.; and/or any other suitable surgical instruments. Although first and second surgical instruments 12a, 12b are shown in
The third surgical instrument 13 may include, for example, an endoscope or other suitable surgical camera to enable visualizing into a surgical site. The third surgical instrument 13 may additionally or alternatively include one or more access channels to enable insertion of first and second surgical instruments 12a, 12b, aspiration/irrigation, insertion of any other suitable surgical tools, etc. The third surgical instrument 13 may be coupled, via wired or wireless connection, to controller 14 (and/or computing device 18) for processing the video data for displaying the same on display 17. Although one third surgical instrument 13 is shown in
Surgical system 10, in aspects, also includes at least one surgical camera 19 such as, for example, one or more surgical cameras 19 configured to collect imaging data from a surgical site, e.g., using still picture imaging, video imaging, thermal imaging, infrared imaging, ultrasound imaging, etc. In aspects, the at least one surgical camera 19 is provided in addition to or as an alternative to the one or more third surgical instruments 13. In other aspects, third surgical instrument(s) 13 provide the functionality of surgical camera(s) 19. Surgical camera(s) 19 is coupled, via wired or wireless connection, to computing device 18 for providing the image data thereto, e.g., in real time, to enable the computing device 18 to process the received image data, e.g., in real time, and provide a suitable output based thereon, as detailed below.
Continuing with reference to
Although computing device 18 is shown as a single unit disposed on control tower 16, computing device 18 may include one or more local, remote, and/or virtual computers that communicate with one another and/or the other devices of surgical system 10 using any suitable communication network based on wired or wireless communication protocols. Computing device 18, more specifically, may include, by way of non-limiting examples, one or more: server computers, desktop computers, laptop computers, notebook computers, Internet appliances, mobile smartphones, tablet computers, personal digital assistants, embedded computers, and the like. Computing device 18 further includes an operating system configured to perform executable instructions. The operating system is, for example, software, including programs and data, which manages the device's hardware and provides services for execution of applications. Those of skill in the art will recognize that suitable server operating systems include, by way of non-limiting examples, FreeBSD, OpenBSD, NetBSD®, Linux, Apple® Mac OS X Server®, Oracle® Solaris®, Windows Server®, Novell® NetWare®, and the like. In aspects, the operating system may be provided by cloud computing.
Computing device 18 includes a storage implemented as one or more physical apparatus used to store data or programs on a temporary or permanent basis. The storage may be volatile memory, which requires power to maintain stored information, or non-volatile memory, which retains stored information even when the computing device 18 is not powered on. In aspects, the volatile memory includes dynamic random-access memory (DRAM), and the non-volatile memory includes flash memory, ferroelectric random-access memory (FRAM), and phase-change random-access memory (PRAM). In aspects, the storage may include, by way of non-limiting examples, CD-ROMs, DVDs, flash memory devices, magnetic disk drives, magnetic tape drives, optical disk drives, solid-state drives, universal serial bus (USB) drives, and cloud computing-based storage. In aspects, the storage may be any combination of storage media such as those disclosed herein.
Computing device 18 further includes a processor, an extension, an input/output device, and a network interface, although additional or alternative components are also contemplated. The processor executes instructions which implement tasks or functions of programs. When a user executes a program, the processor reads the program stored in the storage, loads the program on the RAM, and executes instructions prescribed by the program. Although referred to herein in the singular, it is understood that the term processor includes multiple similar or different processors distributed locally, remotely, or both locally and remotely.
The processor of computing device 18 may include a field programmable gate array (FPGA), a digital signal processor (DSP), a central processing unit (CPU), a graphical processing unit (GPU), a microprocessor, an application specific integrated circuit (ASIC), and combinations thereof, each of which includes electronic circuitry within a computer that carries out instructions of a computer program by performing the basic arithmetic, logical, control and input/output (I/O) operations specified by the instructions. Those skilled in the art will appreciate that the processor may be substituted with any logic processor (e.g., control circuit) adapted to execute the algorithms, calculations, and/or set of instructions described herein.
In aspects, the extension may include several ports, such as one or more USBs, IEEE 1394 ports, parallel ports, and/or expansion slots such as peripheral component interconnect (PCI) and PCI express (PCIe). The extension is not limited to the list but may include other slots or ports that can be used for appropriate purposes. The extension may be used to install hardware or add additional functionalities to a computer that may facilitate the purposes of the computer. For example, a USB port can be used for adding additional storage to the computer and/or an IEEE 1394 may be used for receiving moving/still image data.
The network interface is used to communicate with other computing devices, wirelessly or via a wired connection following suitable communication protocols. Through the network interface, computing device 18 may transmit, receive, modify, and/or update data from and to an outside computing device, server, or cloud space. Suitable communication protocols may include, but are not limited to, transmission control protocol/internet protocol (TCP/IP), user datagram protocol/internet protocol (UDP/IP), and/or datagram congestion control protocol (DCCP). Wireless communication may be achieved via one or more wireless configurations, e.g., radio frequency, millimeter wave, optical, Wi-Fi, Bluetooth® (an open wireless protocol for exchanging data over short distances, using short-length radio waves, from fixed and mobile devices, creating personal area networks (PANs)), and/or ZigBee® (a specification for a suite of high-level communication protocols using small, low-power digital radios based on the IEEE 802.15.4-2003 standard for wireless personal area networks (WPANs)).
Turning to
Referring to
With reference to
As shown in
Turning to
As indicated at 620, differences between the temporally spaced first and second image data are detected. For example, differences in pixel color and/or intensity between the first image data and the second image data may be detected. As another example, movement and/or change in the size (expansion, contraction, etc.) of identified structures between the first image data and the second image data may be detected. In aspects, these differences are amplified so as to exaggerate, for example, the differences in pixel colors and/or intensities between the first image data and the second image data, and/or movements and/or size changes of identified structures between the first image data and the second image data. This amplification may be performed, for example, as detailed in U.S. Pat. Nos. 9,805,475 and/or 9,811,901, each of which is hereby incorporated herein by reference. In other aspects, the differences are not amplified. In either configuration, the differences may be further processed to facilitate analysis.
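The disclosure does not specify a particular implementation of the difference detection and amplification; as one illustrative sketch in Python/NumPy (the signed-difference measure and the gain factor are assumptions for illustration only, not the disclosed amplification technique):

```python
import numpy as np

def detect_differences(frame_a, frame_b, gain=1.0):
    """Per-pixel intensity differences between two temporally spaced
    frames, optionally amplified by a gain factor to exaggerate subtle
    color/intensity changes (gain=1.0 leaves the differences unscaled)."""
    a = frame_a.astype(np.float32)
    b = frame_b.astype(np.float32)
    diff = b - a            # signed per-pixel change between the frames
    return diff * gain      # amplified (or raw) difference signal

# Two tiny synthetic 4x4 grayscale "frames" with a subtle change
first = np.zeros((4, 4), dtype=np.uint8)
second = first.copy()
second[1:3, 1:3] = 10       # small intensity change in a central region

raw = detect_differences(first, second)
amplified = detect_differences(first, second, gain=5.0)
```

A gain greater than 1.0 plays the role of the amplification step described above, exaggerating changes that would otherwise be too subtle to analyze reliably.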
At 630, a level of perfusion is determined based on the detected differences between the temporally spaced first and second image data (whether or not amplified or processed in any other suitable manner). More specifically, the detected differences between the temporally spaced first and second image data enable the detection of pulsations (expansions and contractions) of tissue such as blood vessels within or on the surface of tissue, e.g., the rectum "R" and colon "C" (see
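One way to sketch this determination, assuming a simple peak-to-peak pulsation amplitude over a region's mean intensity and a purely illustrative threshold (neither is specified by the disclosure):

```python
import numpy as np

def perfusion_level(region_means, threshold=0.5):
    """Estimate perfusion from the pulsatile component of a region's
    mean intensity over time. The peak-to-peak swing serves as a crude
    pulsation amplitude; the threshold is illustrative only."""
    signal = np.asarray(region_means, dtype=np.float32)
    pulsatile = signal - signal.mean()             # remove the DC component
    amplitude = pulsatile.max() - pulsatile.min()  # peak-to-peak swing
    return "adequate" if amplitude >= threshold else "poor"

# Simulated region means over 60 frames: a well-perfused region pulses
# with the cardiac cycle, a poorly perfused one stays nearly flat.
t = np.linspace(0, 2 * np.pi, 60)
pulsing = 100 + 2.0 * np.sin(3 * t)    # visible pulsation
flat = 100 + 0.05 * np.sin(3 * t)      # barely any pulsation
```

Here strong periodic swings in the difference signal map to adequate perfusion, while a nearly flat signal maps to poor perfusion, mirroring the pulsation-based reasoning above.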
Continuing with reference to
Referring to
Based on the input data 702, 706, the machine learning algorithm 708 determines, as an output 710, a level of perfusion. The machine learning algorithm 708 may implement one or more of: supervised learning, semi-supervised learning, unsupervised learning, reinforcement learning, association rule learning, decision tree learning, anomaly detection, feature learning, computer vision, etc., and may be modeled as one or more of a neural network, Bayesian network, support vector machine, genetic algorithm, etc. The machine learning algorithm 708 may be trained based on empirical data and/or other suitable data and may be trained prior to deployment for use during a surgical procedure or may continue to learn based on usage data after deployment and use in a surgical procedure(s).
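The disclosure leaves the model family open (neural network, Bayesian network, support vector machine, etc.); as a minimal stand-in, a nearest-centroid classifier over hypothetical difference-derived features (the feature names, values, and labels below are all assumptions for illustration):

```python
import numpy as np

class NearestCentroidPerfusion:
    """Toy stand-in for the disclosed machine learning algorithm 708:
    classifies a feature vector derived from inter-frame differences
    (e.g., pulsation amplitude, mean intensity change) by the nearest
    class centroid learned from labeled training examples."""

    def fit(self, features, labels):
        self.classes_ = sorted(set(labels))
        feats = np.asarray(features, dtype=np.float32)
        labs = np.asarray(labels)
        # One centroid per class: the mean feature vector of its examples
        self.centroids_ = np.stack(
            [feats[labs == c].mean(axis=0) for c in self.classes_]
        )
        return self

    def predict(self, feature):
        d = np.linalg.norm(
            self.centroids_ - np.asarray(feature, dtype=np.float32), axis=1
        )
        return self.classes_[int(np.argmin(d))]

# Hypothetical training data: [pulsation amplitude, mean intensity delta]
X = [[4.0, 1.2], [3.5, 1.0], [0.2, 0.1], [0.4, 0.2]]
y = ["adequate", "adequate", "poor", "poor"]
model = NearestCentroidPerfusion().fit(X, y)
```

A deployed system would likely use a richer model trained on empirical surgical data, as the passage above notes; the sketch only shows the input/output shape of such an algorithm.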
Referring to
Regardless of the particular configuration of indicator 810, method 600 (
Turning to
It is understood that the various aspects disclosed herein may be combined in different combinations than the combinations specifically presented hereinabove and in the accompanying drawings. In addition, while certain aspects of the present disclosure are described as being performed by a single module or unit for purposes of clarity, it is understood that the techniques of this disclosure may be performed by a combination of units or modules associated with, for example, a surgical system.
Accordingly, although several aspects and features of the disclosure are shown in the drawings, it is not intended that the disclosure be limited thereto, as it is intended that the disclosure be as broad in scope as the art will allow and that the specification be read likewise. Therefore, the above description should not be construed as limiting, but merely as exemplifications of particular aspects and features. Those skilled in the art will envision other modifications within the scope and spirit of the claims appended hereto.