The present embodiments relate generally to apparatuses and methods for tracking and control in surgery and interventional medical procedures.
There is currently no technology for robust image guidance in automated surgery. What is available in the market as so-called “robotic surgery” is in reality robot-assisted surgery, because the robot only follows the direct commands of the surgeon with very little intelligence or autonomy. Some research groups have investigated closing the control loop for surgical robots with existing sensors; however, the special conditions and considerations that apply to operations in vivo make it extremely difficult to achieve such goals.
The present embodiments address at least this problem by introducing a robust tracking technique which requires minimal changes to the current robot-assisted surgical workflow and which closes the loop with an effector function.
The foregoing paragraphs have been provided by way of general introduction, and are not intended to limit the scope of the following claims. The described embodiments, together with further advantages, will be best understood by reference to the following detailed description taken in conjunction with the accompanying drawings.
A more complete appreciation of the embodiments described herein, and many of the attendant advantages thereof, will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings.
The terms “a” or “an”, as used herein, are defined as one or more than one. The term “plurality”, as used herein, is defined as two or more than two. The term “another”, as used herein, is defined as at least a second or more. The terms “including” and/or “having”, as used herein, are defined as comprising (i.e., open language). Reference throughout this document to “one embodiment”, “certain embodiments”, “an embodiment”, “an implementation”, “an example” or similar terms means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, the appearances of such phrases in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments without limitation.
According to an embodiment, the present disclosure describes a system for tracking and control in medical procedures. The system includes a device configured to deploy fluorescent material on at least one of an organ under surgery and a surgical tool, a visual light source, a fluorescent light source corresponding to an excitation wavelength of the fluorescent material, an image acquisition and control element configured to control the visual light source and the fluorescent light source, and configured to capture and digitize at least one of resulting visual images and fluorescent images, and an image-based tracking module configured to apply image processing to the visual and fluorescent images, the image processing detecting fluorescent markers on at least one of the organ and the surgical tool.
According to another embodiment of the present disclosure, the system further includes a surgical robot and a visual servoing control module configured to receive tracking information from the image-based tracking module and to control the surgical robot, based on the tracking information, to perform a surgical operation.
According to another embodiment of the present disclosure, the system further includes a manual control module configured to enable manual control of the surgical robot in place of control by the visual servoing control module.
According to another embodiment of the present disclosure, the visual servoing control module is further configured to receive manual input and to control the surgical robot, based on the manual input, to perform a surgical operation.
According to another embodiment of the present disclosure, the system further includes a surgical robot and a manual control module configured to receive manual input and execute master-slave control of the surgical robot.
According to another embodiment of the present disclosure, the system further includes a display configured to display at least one of the visual images and the fluorescent images.
According to another embodiment of the present disclosure, the image-based tracking module further identifies the organ or the surgical tool based on the detected fluorescent markers.
According to another embodiment of the present disclosure, the image acquisition and control element further includes a dynamic tunable filter configured to alternately pass visual light and light emitted by the fluorescent material, and a charge-coupled device (CCD) configured to capture at least one of visual images and fluorescent images.
According to another embodiment of the present disclosure, the display is stereoscopic or monoscopic.
According to another embodiment of the present disclosure, the image acquisition and control element generates stereoscopic or monoscopic images.
According to another embodiment of the present disclosure, the stereoscopic display is further configured to display visual images and a color coded overlay of fluorescent images.
According to another embodiment of the present disclosure, the stereoscopic display is further configured to display an augmented reality image by overlaying target points detected by the image-based tracking module.
According to another embodiment of the present disclosure, the system is configured to provide at least one of visual, audio, and haptic feedback to a system operator, based on information provided by the image-based tracking module.
According to another embodiment of the present disclosure, the system is configured to operate in each of a manual mode, a semi-autonomous mode, and an autonomous mode.
According to another embodiment of the present disclosure, the image-based tracking module identifies virtual boundaries based on the detected fluorescent markers to designate critical structures.
According to another embodiment of the present disclosure, the system further includes a detection device configured to determine whether a surgical tool has passed a boundary and to provide constraints on motion or provide alarms when the boundary has been crossed in order to protect the critical structures.
According to another embodiment of the present disclosure, the fluorescent light source is a near-infrared (NIR) light source.
According to another embodiment of the present disclosure, the image acquisition and control element includes two charge coupled devices (CCDs), one assigned to a visual spectrum and one assigned to a NIR spectrum.
According to another embodiment of the present disclosure, light generated by the visual light source and the fluorescent light source is split by either a beam-splitting prism or a dichroic prism.
According to another embodiment of the present disclosure, light generated by the visual light source and the fluorescent light source is provided separate light paths to the two CCDs.
According to one embodiment, the present disclosure describes a method for performing a medical procedure. The method includes the steps of deploying fluorescent material on at least one of an organ under surgery and a surgical tool, illuminating the organ, the surgical tool, or both, with a visual light source and a fluorescent light source, the fluorescent light source corresponding to an excitation wavelength of the fluorescent material, capturing and digitizing images resulting from the illumination by the visual light source and the fluorescent light source, and applying image processing to the digitized images, the image processing detecting fluorescent markers on at least one of the organ and the surgical tool.
According to another embodiment, the method of the present disclosure further includes generating tracking information by tracking the organ, the surgical tool, or both based on the detected fluorescent markers.
According to another embodiment, the method of the present disclosure further includes controlling a surgical robot, based on the tracking information, to perform a surgical operation.
According to another embodiment, the method of the present disclosure further includes receiving manual input and controlling the surgical robot, based on the manual input, to perform the surgical operation.
According to another embodiment, the method of the present disclosure further includes receiving manual input and executing master-slave control of a surgical robot based on the manual input.
According to another embodiment, the method of the present disclosure further includes providing a stereoscopic or monoscopic display of the digitized images.
According to another embodiment, the method of the present disclosure further includes generating stereoscopic or monoscopic images as part of capturing and digitizing the images.
According to another embodiment, the method of the present disclosure further includes displaying visual images and a color coded overlay of fluorescent images.
According to another embodiment, the method of the present disclosure further includes displaying an augmented reality image by overlaying target points detected by the image-based tracking module.
According to another embodiment, the method of the present disclosure further includes providing at least one of visual, audio, or haptic feedback to a system operator, based on the tracking information.
According to another embodiment, the method of the present disclosure further includes identifying the organ or the surgical tool based on the detected fluorescent markers.
According to another embodiment, the method of the present disclosure further includes performing a surgical procedure based on the detected fluorescent markers.
According to another embodiment, the method of the present disclosure further includes designating critical structures by identifying virtual boundaries based on the detected fluorescent markers.
According to another embodiment, the method of the present disclosure further includes determining whether a surgical tool has passed a boundary and providing constraints on motion or providing alarms when the boundary has been crossed in order to protect the critical structures.
According to one embodiment, the present disclosure describes a system for tracking and control in medical procedures. The system includes means for deploying fluorescent material on at least one of an organ under surgery and a surgical tool, a visual light source, a fluorescent light source corresponding to an excitation wavelength of the fluorescent material, means for controlling the visual light source and the fluorescent light source, means for capturing and digitizing at least one of resulting visual images and fluorescent images, and means for applying image processing to the visual and fluorescent images, the image processing detecting fluorescent markers on at least one of the organ and the surgical tool.
The embodiments disclosed herein may be applied in the field of automated anastomosis, where tubular structures (vessels, bile ducts, urinary tract, etc.) are coapted, or brought into contact, and sealed. Anastomosis is one of the four major steps of every surgery, which include (1) access through incision, (2) exposure and dissection, (3) resection and removal of pathology, and (4) reconstruction and closure, or anastomosis. An anastomosis is currently achieved by suturing or by applying clips or glue to the anastomosis site. The anastomosis procedure itself may be performed manually or by using robots through master-slave control. Both of these techniques can be time consuming and cumbersome in an arena where these parameters are at a premium. The present embodiments make it possible for the surgeon to mark the anastomosis site by applying fluorescent markers via, for instance, miniature clips, spray, paint, tapes, and the like, which can be detected and tracked using the dual-spectrum imaging technology. In addition, a robotic system can be controlled through visual servoing using this tracking information, in order to apply sutures, clips, glue, welds, and the like at specified positions.
According to embodiments, several other applications are contemplated, including but not limited to the variants listed below. For instance, the technology can be used with multiple dyes having different excitation/emission wavelengths, so that different markers can be used for tracking multiple objects. In an embodiment, fluorescent dyes A and B can be used to mark the two sides of a tubular structure prior to automated anastomosis.
In an embodiment, dyes, or markers, can be applied to internal as well as external targets. The fluorescent dye can be attached to the target by clips, staples, or glue, or can be applied by painting or spraying. The dye can also be injected into the tissue to mark specific points or can be injected into the bloodstream. The dye can be selected to bind with specific types of cells in order to mark specific structures such as, for instance, tumors.
In an embodiment, “no-fly zones” or “virtual fixtures” can be provided to prevent surgical tools from approaching critical structures. For instance, the surgeon can mark a critical structure prior to initiating a task, and the marked borders can be tracked using the dual-mode imaging technology. The tracked coordinates can then be used to impose constraints on the motion of the surgical tools during the automated or semi-automated task. The same boundary information can also be used to provide alarms in manual tasks, including, among others, visual alarms, audio alarms, or haptic alarms.
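As a minimal sketch of how such a boundary constraint might be enforced in software, the following checks a tracked tool-tip position against the marker points outlining a critical structure; the names (tool_xy, zone_markers) and the margin value are illustrative assumptions, not elements of the disclosure.

```python
import numpy as np

def distance_to_zone(tool_xy, zone_markers):
    """Distance (pixels) from the tracked tool tip to the closest of the
    fluorescent marker points outlining the no-fly zone."""
    pts = np.asarray(zone_markers, dtype=float)
    return np.linalg.norm(pts - np.asarray(tool_xy, dtype=float), axis=1).min()

def enforce_no_fly(tool_xy, zone_markers, margin_px=25.0):
    """Return True if the tool is within the safety margin of the zone,
    so the caller can clamp the commanded motion or raise an alarm."""
    if distance_to_zone(tool_xy, zone_markers) < margin_px:
        print("ALARM: tool approaching critical structure")  # visual/audio/haptic cue
        return True
    return False
```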
In an embodiment, the imaging system can be monoscopic and provide the two-dimensional locations of the tracked points, which can be used for image-based visual servoing. In another embodiment, the imaging system can be stereoscopic and provide the three-dimensional locations of the tracked structures, thereby enabling image-based or position-based visual servoing.
Embodiments of the present disclosure can be applied to automated or semi-automated applications. The system of the present disclosure can also provide guidance for manual operations through visual feedback, audio feedback, haptic feedback, and the like.
By way of introduction, automation of a surgical procedure is a challenging task. The surgical scene is dynamic: deformable organs may occlude a surgeon's view, and variations in illumination make it difficult to robustly track the many targets and objects inside the patient's body. Several attempts have been made to develop image-based tracking algorithms for minimally invasive and/or open surgeries, but these attempts rely on special conditions, are not robust, and therefore cannot be practically used to control surgical tools or to automate steps of a surgery.
According to an embodiment, the present disclosure addresses these limitations by using a dual-spectrum imaging device capable of imaging in the visual spectrum as well as in the near-infrared (NIR) spectrum. To this end, a surgeon places fluorescent markers on the locations to be tracked (e.g., tools and tissue). Excitation light generated by the imaging device causes the fluorophores to emit NIR light, which is detected by the imaging device. Due to the limited auto-fluorescence of tissue compared to the fluorescent dyes, and the lack of other NIR sources in the patient's body, the acquired light has a high signal-to-noise ratio (SNR). This high SNR makes the tracking algorithm robust and reliable. Moreover, NIR light penetrates tissue to a greater depth than visible light, making it possible to track an object even when it is occluded by another organ, flipped over, covered by blood, and the like. A combination of visual and NIR images, therefore, can be used to make image-based tracking algorithms even more robust.
According to an embodiment, the present disclosure describes a system for automation of surgical tasks. The system is based on deploying fluorescent markers on the organ under surgery and/or on the surgical tool, tracking the fluorescent markers in real time, and controlling the surgical tool via visual servoing.
According to an embodiment, the image-based tracking module 107 also includes a tracking module that performs pre-processing of the NIR image and visual tracking based on the processed image information. In an embodiment, the pre-processing algorithm can include image processing algorithms such as image smoothing to mitigate the effect of sensor noise, image histogram equalization to enhance the pixel intensity values, and image segmentation based on pixel intensity values to extract templates for the NIR markers. The visual trackers are initialized first. Initialization of the visual trackers starts with detection and segmentation of the NIR markers. Segmentation can be based on applying an adaptive intensity threshold to the enhanced NIR image to obtain a binary template for the NIR markers. A two-dimensional (2D) median filter and additional morphology-based binary operators (binary image processing algorithms such as image erosion and dilation) may be applied to the binary template to remove segmentation noise. The binary template may be used as a starting base for visual tracking of the NIR markers using visual tracking algorithms. After pre-processing and segmentation, the NIR template can be a white blob on a darker background, the darker background representing the remainder of the surgical field in the NIR image.
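The following is a minimal sketch of this pre-processing and segmentation pipeline using OpenCV; the specific function choices and parameter values (kernel sizes, Otsu's method as the adaptive intensity threshold) are illustrative assumptions rather than values taken from the disclosure.

```python
import cv2
import numpy as np

def segment_nir_markers(nir_gray):
    """nir_gray: single-channel 8-bit NIR frame -> binary marker template."""
    smoothed = cv2.GaussianBlur(nir_gray, (5, 5), 0)   # smoothing against sensor noise
    enhanced = cv2.equalizeHist(smoothed)              # histogram equalization
    # Adaptive intensity threshold: Otsu's method separates the bright
    # fluorescent blobs from the darker surgical background.
    _, binary = cv2.threshold(enhanced, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    binary = cv2.medianBlur(binary, 5)                 # 2D median filter
    kernel = np.ones((3, 3), np.uint8)
    binary = cv2.erode(binary, kernel, iterations=1)   # morphology: remove speckle
    binary = cv2.dilate(binary, kernel, iterations=1)  # morphology: restore blob size
    return binary  # white blobs on a dark background, one per NIR marker
```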
In autonomous mode, according to an embodiment, the tracked visual markers can be used to guide the motion of the robot. Each visual marker can be represented by a vector of numbers, which is typically called a visual feature. Examples of visual features are the coordinates of the centers of the NIR markers extracted from the binary image, and/or their higher-order image moments, such as their area in terms of pixels.
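As an illustration of such feature extraction, the sketch below derives a centroid and a pixel-area feature for each segmented blob via image moments; the use of OpenCV contours is an assumption made for the example.

```python
import cv2

def marker_features(binary):
    """binary: marker template from segmentation -> list of (cx, cy, area)."""
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    features = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:                         # skip degenerate blobs
            cx = m["m10"] / m["m00"]             # centroid from first-order moments
            cy = m["m01"] / m["m00"]
            features.append((cx, cy, m["m00"]))  # m00 = blob area in pixels
    return features
```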
Robotic motion can be performed by transforming the sensor measurements into the robot's global Cartesian coordinate frame. In one embodiment, the NIR and tool markers can be tracked in the stereo images to compute the 3D coordinates of the marker or tool with respect to the surgical field.
NIR-based robot motion control is a core technology which has not been developed in the past. Previous methods and apparatuses for NIR-based imaging (without robot control; Frangioni 2012, U.S. Pat. No. 8,229,548 B2) and NIR-based display (Mohr and Mohr, US 2011/0082369) fail to consider robot motion control or any control whatsoever. With a stereo imaging system consisting of two NIR cameras with appropriate filters, a properly excited NIR agent can be seen in both stereo images. Image processing and visual tracking algorithms, such as those described above as being implemented by the image-based tracking module 107, can be utilized to visually track each NIR marker in the image. The 3D estimate of a marker position can be found by triangulation of the NIR marker image as seen in both the left 701 and right 703 NIR stereo image pairs. The 3D estimate of the NIR marker can then be re-projected as an overlay in the RGB image 702. The tool position can also be found from the stereo image pair. The stereo NIR system can be replaced by a 3D sensing camera capable of NIR observation.
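The triangulation step can be sketched as follows, assuming the stereo rig has been calibrated so that the left and right 3x4 projection matrices (called P_left and P_right here for illustration) are known.

```python
import cv2
import numpy as np

def triangulate_marker(p_left, p_right, P_left, P_right):
    """p_left, p_right: (u, v) marker centroids in the left and right NIR images."""
    xl = np.asarray(p_left, dtype=float).reshape(2, 1)
    xr = np.asarray(p_right, dtype=float).reshape(2, 1)
    X_h = cv2.triangulatePoints(P_left, P_right, xl, xr)  # homogeneous 4x1 result
    return (X_h[:3] / X_h[3]).ravel()  # Euclidean (x, y, z), re-projectable as overlay
```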
The embodiments described herein can also be applied in non-stereo applications. For example, the system can be implemented for mono camera applications, as well as in manual and master-slave modes of operation.
Certain portions or all of the disclosed processing, such as the image processing and visual tracking algorithms, for example, can be implemented using some form of computer microprocessor. As one of ordinary skill in the art would recognize, the computer processor can be implemented as discrete logic gates, as an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Complex Programmable Logic Device (CPLD). An FPGA or CPLD implementation may be coded in VHDL, Verilog or any other hardware description language and the code may be stored in an electronic memory directly within the FPGA or CPLD, or as a separate electronic memory. Further, the electronic memory may be non-volatile, such as ROM, EPROM, EEPROM or FLASH memory. The electronic memory may also be volatile, such as static or dynamic RAM, and a processor, such as a microcontroller or microprocessor, may be provided to manage the electronic memory as well as the interaction between the FPGA or CPLD and the electronic memory.
Alternatively, the computer processor may execute a computer program including a set of computer-readable instructions that perform the functions described herein, the program being stored in any of the above-described non-transitory electronic memories and/or a hard disk drive, CD, DVD, FLASH drive or any other known storage media. Further, the computer-readable instructions may be provided as a utility application, background daemon, or component of an operating system, or combination thereof, executing in conjunction with a processor, such as a Xeon processor from Intel of America or an Opteron processor from AMD of America, and an operating system, such as Microsoft VISTA, UNIX, Solaris, LINUX, Apple MAC-OSX and other operating systems known to those skilled in the art.
In addition, certain features of the embodiments can be implemented using a computer-based system. In an embodiment, such a system includes a computer 1000 having a bus B, a main processing unit (CPU 1004), a memory unit 1003, and the storage and communication components described below.
The computer 1000 may also include a disk controller coupled to the bus B to control one or more storage devices for storing information and instructions, such as mass storage 1002, and drive device 1006 (e.g., floppy disk drive, read-only compact disc drive, read/write compact disc drive, compact disc jukebox, tape drive, and removable magneto-optical drive). The storage devices may be added to the computer 1000 using an appropriate device interface (e.g., small computer system interface (SCSI), integrated device electronics (IDE), enhanced-IDE (E-IDE), direct memory access (DMA), or ultra-DMA).
The computer 1000 may also include special purpose logic devices (e.g., application specific integrated circuits (ASICs)) or configurable logic devices (e.g., simple programmable logic devices (SPLDs), complex programmable logic devices (CPLDs), and field programmable gate arrays (FPGAs)).
The computer 1000 may also include a display controller coupled to the bus B to control a display, such as a cathode ray tube (CRT), for displaying information to a computer user. The computer system includes input devices, such as a keyboard and a pointing device, for interacting with a computer user and providing information to the processor. The pointing device, for example, may be a mouse, a trackball, or a pointing stick for communicating direction information and command selections to the processor and for controlling cursor movement on the display. In addition, a printer may provide printed listings of data stored and/or generated by the computer system.
The computer 1000 performs at least a portion of the processing steps of the invention in response to the CPU 1004 executing one or more sequences of one or more instructions contained in a memory, such as the memory unit 1003. Such instructions may be read into the memory unit from another computer readable medium, such as the mass storage 1002 or a removable media 1001. One or more processors in a multi-processing arrangement may also be employed to execute the sequences of instructions contained in memory unit 1003. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions. Thus, embodiments are not limited to any specific combination of hardware circuitry and software.
As stated above, the computer 1000 includes at least one computer readable medium 1001 or memory for holding instructions programmed according to the teachings of the invention and for containing data structures, tables, records, or other data described herein. Examples of computer readable media are compact discs (e.g., CD-ROM), hard disks, floppy disks, tape, magneto-optical disks, PROMs (EPROM, EEPROM, flash EPROM), DRAM, SRAM, SDRAM, or any other magnetic or optical medium from which a computer can read.
Stored on any one or on a combination of computer readable media, the present invention includes software for controlling the main processing unit 1004, for driving a device or devices for implementing the invention, and for enabling the main processing unit 1004 to interact with a human user. Such software may include, but is not limited to, device drivers, operating systems, development tools, and applications software. Such computer readable media further includes the computer program product of the present invention for performing all or a portion (if processing is distributed) of the processing performed in implementing the invention.
The computer code elements on the medium of the present invention may be any interpretable or executable code mechanism, including but not limited to scripts, interpretable programs, dynamic link libraries (DLLs), Java classes, and complete executable programs. Moreover, parts of the processing of the present invention may be distributed for better performance, reliability, and/or cost.
The term “computer readable medium” as used herein refers to any medium that participates in providing instructions to the CPU 1004 for execution. A computer readable medium may take many forms, including but not limited to, non-volatile media and volatile media. Non-volatile media includes, for example, optical disks, magnetic disks, and magneto-optical disks, such as the mass storage 1002 or the removable media 1001. Volatile media includes dynamic memory, such as the memory unit 1003.
Various forms of computer readable media may be involved in carrying out one or more sequences of one or more instructions to the CPU 1004 for execution. For example, the instructions may initially be carried on a magnetic disk of a remote computer. An input coupled to the bus B can receive the data and place the data on the bus B. The bus B carries the data to the memory unit 1003, from which the CPU 1004 retrieves and executes the instructions. The instructions received by the memory unit 1003 may optionally be stored on mass storage 1002 either before or after execution by the CPU 1004.
The computer 1000 also includes a communication interface 1005 coupled to the bus B. The communication interface 1005 provides a two-way data communication coupling to a network that is connected to, for example, a local area network (LAN), or to another communications network such as the Internet. For example, the communication interface 1005 may be a network interface card to attach to any packet switched LAN. As another example, the communication interface 1005 may be an asymmetrical digital subscriber line (ADSL) card, an integrated services digital network (ISDN) card or a modem to provide a data communication connection to a corresponding type of communications line. Wireless links may also be implemented. In any such implementation, the communication interface 1005 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
The network typically provides data communication through one or more networks to other data devices. For example, the network may provide a connection to another computer through a local network (e.g., a LAN) or through equipment operated by a service provider, which provides communication services through a communications network. The local network and the communications network use, for example, electrical, electromagnetic, or optical signals that carry digital data streams, and the associated physical layer (e.g., CAT 5 cable, coaxial cable, optical fiber, etc.). Moreover, the network may provide a connection to a mobile device such as a personal digital assistant (PDA), laptop computer, or cellular telephone.
In order to be used in the surgical field, a marker should be robust to occlusion from collateral tissue, blood, and smoke, among other surgical conditions, so as to remain visible. The marker should also be easily segmented from surrounding tissue, such that it is distinguishable from natural features in the surgical field and can be tracked. Additionally, the marker should be stable such that it can be visualized for the entirety of a procedure. In the exemplary embodiment described above, a marker can be made from, at least, a near infrared (NIR) fluorophore such as indocyanine green (ICG). The use of ICG is favorable for three reasons. First, ICG is an FDA approved drug that is routinely administered in a number of surgeries, including cardiac procedures, as a contrast agent. Second, the fluorescence of ICG is in the NIR spectrum, allowing it to be easily segmented from surrounding features in a NIR image. Third, once activated, the fluorescence of ICG will remain stable for multiple hours, allowing it to be used in most surgical procedures. It can be appreciated that the use of ICG is merely exemplary of a variety of NIR fluorescent compounds that could be used in creating a marker. For instance, the NIR fluorescent compound may be Prussian blue. In addition to the near infrared spectrum, the marker may contain compounds that allow fluorescence in other spectra as well. For instance, an ultraviolet (UV) fluorescent compound may have similar benefits in the surgical field as the NIR compounds discussed above. Fluorescence in the UV range would allow the markers to be segmented and tracked for the entirety of the surgical procedure, including instances when the marker may be occluded by blood or surrounding tissue. Additionally, the marker may contain a compound that improves visibility of the marker in the visible range. One example of this technique would be to add a bluish/cyan pigment, such as methyl blue, to the marker composition. While visualization is not as robust when occluded by blood or tissue in the surgical field, the pigment would create high contrast between the marker and tissue, which would allow it to be segmented from the background.
In addition to visibility, a marker should remain bound to the target tissue for the duration of a procedure.
In an embodiment, the marker may contain a compound that chemically or biologically binds to the target tissue. In one example, the marker may contain an algae compound that binds and creates a fibrous network when applied to a cut edge of tissue. In another embodiment, the marker may contain a coagulant compound such as aluminum sulfate that chemically binds to a bleeding edge of tissue. In another embodiment, the marker may contain a compound that can selectively bind to any number of proteins that are exposed along a cut edge of tissue. Finally, the marker may be designed such that it mechanically binds to target tissue. In these embodiments, it is assumed that the marker is applied to the target tissue in a solid state.
According to an embodiment, and depending on the target location, it may be desirable to enhance the visualization of the surgical markers.
The fluorescent, binding, and stability compounds described above may be combined in varying proportions to suit a particular surgical application.
According to an embodiment, the shape of a marker can affect functionality in its application. For instance, a small circular marker may be ideal for tracking the motion of a point of interest, while a linear marker may be ideal for tracking the boundary of diseased tissue. Further still, an irregular marker may be ideal for separating background from foreground tissue. A surgical marker according to the present disclosure can be applied to target tissue in one of two states, either as a liquid or as a solid. In the event that a liquid marker is applied to the surgical site, it may remain a liquid for the duration of the procedure, functionally bind to the surgical site, or undergo a chemical reaction to solidify on the target tissue.
According to an embodiment, the surgical markers may be applied to the target tissue in the solid state. Applying a solid marker to the surgical field has the advantage of predefining and creating a marker shape with tight and repeatable tolerances. Additionally, a solid marker may be more robust in maintaining a predefined shape for the duration of a surgical procedure.
According to an embodiment, the solid markers can be manufactured into any arbitrary and predefined shape through standard injection molding with plastic and silicone polymers, for instance.
Of equal import to the composition and shape of the surgical marker is the apparatus used to apply it, which will now be described. The marking apparatus can take a variety of forms including, for example, a tool used for marking target tissue in open surgical procedures and a tool designed to mark tissue for laparoscopic surgical tasks. In addition, features of the marking tool that allow components of a surgical marker to be mixed prior to application are disclosed.
In an embodiment, the marking apparatus may dispense liquid marker by an ink pen mechanism with felt tip, cotton nib, foam nib, or other suitable tip. An advantage of this method is that the shape and size of the marking may be controlled by hand. It may be particularly useful for segmentation tasks and automation, such as drawing “no-fly zones” or drawing lines along which to cut.
In another embodiment, a liquid marker may be dispensed to the target tissue by application of a compressive force to the outside of the tool.
In an embodiment, the marker components may be separated into distinct ampules prior to their use, then shaken together to mix before application, thereby ensuring that volatile components retain their activity until application.
Additionally, the dispensing tool may be compatible with a laparoscopic hand tool for use in a laparoscopic surgery.
In another embodiment, the dispensing tool may be combined with traditional surgical tools in order to facilitate relevant marking during a procedure. For example, a pair of surgical scissors may be modified so that they automatically dispense ICG along the cut edge, perhaps for tracking and later sewing together the new edges after a resection procedure. In another embodiment, a stapler may be configured such that it automatically dispenses ICG along the stapled edge, applicable for a myriad of purposes including tracking and tissue analysis.
In another embodiment, the dispensing tool may be a device capable of spraying the liquid marker on the target tissue in a fine mist. This device could be a spray bottle configured for dispensing a precise amount of liquid marker, or a pressurized gun applicator allowing control of the rate of liquid application. By spraying target tissue, one may be able to mark large areas of tissue with a speckled pattern.
In another embodiment, the marking apparatus may be used to embed liquid or solid markers in or beneath a superficial layer of target tissue.
In addition to using an application tool, one could develop a system to deliver the marker to target tissue using biological processes in the body. In one embodiment, the marker could be encapsulated in a nanoparticle and injected into the periphery of the patient. The nanoparticle may contain a structure capable of binding to receptor sites at the target tissue. The markers would travel through the body until they reach the target binding sites. Alternatively, instead of injecting the marker into the patient, the markers could be encapsulated and ingested by the patient as a liquid or pill. After the marker is ingested, it can be absorbed by the blood stream and distributed throughout the body. In an alternative embodiment, the markers may be encapsulated in a structure capable of binding to proteins present at the surgical site. The proteins may be inherent to target tissue or present because of a surgical task. Such proteins can include proteins for coagulation after a cut or proteins embedded in tissue layers that are exposed after a cut. The markers could then be applied to the entire surgical scene and selectively bind to target tissue sites.
In another embodiment, a mask or stencil could be used to allow marking in a predefined region. This could allow differentiation between marker types, 2D/3D registration and tracking of specific markers and analysis of the deformation of underlying tissue. For example, the deformation of the heart could be imaged using video cameras and, after a single dispersion of marker through a calibrated stencil, the motions of the heart could be quantified and the underlying 3D geometry known, in a technique similar to structured light approaches.
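As a sketch of how such quantification might proceed, the following computes a per-marker displacement field from stencil-marker positions tracked at two time instants; the inputs and names are assumptions made for illustration.

```python
import numpy as np

def displacement_field(markers_t0, markers_t1):
    """markers_t0, markers_t1: (N, 3) marker positions at two instants, rows matched.
    Returns per-marker displacement vectors and their magnitudes."""
    p0 = np.asarray(markers_t0, dtype=float)
    p1 = np.asarray(markers_t1, dtype=float)
    d = p1 - p0                             # motion of each stencil point
    return d, np.linalg.norm(d, axis=1)     # scalar deformation per marker
```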
In a similar embodiment to the stencil approach mentioned previously, precise marker geometry could be molded and used in a similar fashion for tracking and deformation estimation. It could be used for tool or tissue tracking applications, as reference markings for a manual or robotic surgical procedure, or as a way of precisely defining different, easily differentiable markers to allow segmentation of features based on marker type. For example, tool tracking applications may prefer a premolded geometry which allows 3D tracking and registration.
Another embodiment of the present disclosure includes a NIRF clip that hooks into tissue, providing tracking while possibly simultaneously holding the tissue in place. NIRF clips could be used similarly to optical trackers but with the advantages described above, including resistance to occlusion. Clips could be disposable and/or dissolvable, or removed after the procedure is finished.
Understanding the location of sutures can be highly useful for manual and robotic procedures. Such knowledge can allow a surgeon to more easily keep track of a thread during a procedure. Additionally, robotic systems may benefit from thread tracking in difficult thread management applications. An automated segmentation routine could be used to determine suture position in an image, as sketched below.
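One possible form of such a routine, sketched under the assumption of a bright NIR-fluorescent thread against a darker background, thresholds the NIR frame and skeletonizes the result to a one-pixel-wide centerline (scikit-image is an assumed dependency).

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

def suture_centerline(nir_gray):
    """nir_gray: 2D NIR frame -> (M, 2) array of (row, col) thread points."""
    binary = nir_gray > threshold_otsu(nir_gray)   # bright thread vs. background
    skeleton = skeletonize(binary)                 # thin the thread to a centerline
    return np.column_stack(np.nonzero(skeleton))   # pixel coordinates of the suture
```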
In another embodiment, the 2D information from a 2D NIR camera could be registered to a 3D camera and used to define and track markers in 3D. The 3D camera could be a plenoptic, structured light, stereo camera, lidar, or any other camera capable of imaging with depth information. This 3D information can then be used as part of an algorithm to aid the surgeon or robot during a procedure. For example, a robot could be commanded to approach a 3D marker with a tool or a surgeon could be informed about proper spacing and fixation of NIR sutures.
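For illustration, the sketch below lifts a 2D NIR marker detection into 3D using an aligned depth image and pinhole intrinsics; the NIR-to-depth alignment and the calibration values (fx, fy, cx, cy) are assumed to be available.

```python
import numpy as np

def pixel_to_3d(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with metric depth into camera coordinates."""
    z = depth_m[int(v), int(u)]    # depth lookup at the marker centroid
    x = (u - cx) * z / fx          # pinhole back-projection
    y = (v - cy) * z / fy
    return np.array([x, y, z])     # 3D marker position for guidance algorithms
```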
In another embodiment, positional information gleaned from the markers can be used to control the camera itself. This could be as simple as adjusting the intensity or gain of the image to maintain robust tracking, or changing the focus to maintain a crisp image of the marker; or it could be more advanced. For example, the camera may be mounted on a robotic arm, and the marker position and orientation could in turn control the position and orientation of the camera. This may allow the marker to always be in a suitable field-of-view or angle relative to the camera. Additionally, the tracking of markers on tools could prevent tool-camera occlusions or collisions.
When applying liquid markers, it is desirable that the applied solution is uniform. Ideally, the liquid markers would be premixed and placed in sterile packaging. In an alternative embodiment, the individual components of the marker may be provided to the user as separate components. One advantage of this approach is an increase in the shelf-life of the liquid markers.
The marker composition, shape, and apparatus of the present disclosure have all been described in the context of an open or laparoscopic surgical procedure. Herein, the potential applications of these markers in the surgical space are discussed.
In another embodiment, markers may be placed in regions to designate one or more boundaries where a surgical tool should not be placed.
In addition to identifying tissue types, markers can be placed in the surgical field to mark a tissue's landmarks and features. Knowing where specific features are on a tissue may make tissue motion easier to track in space. Additionally, marking features of a tissue such as a mesenteric edge may allow one to recreate tissue geometries in the surgical field.
By interpolating between markers, one can reconstruct the entire tissue edge and use this information to plan a surgical procedure. Additionally, reconstructing a tissue's geometry would be beneficial for automating a surgical task, as sketched below.
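As one possible sketch of such interpolation, a parametric spline can be fit through ordered marker positions to recover a continuous edge for planning (SciPy is an assumed dependency; at least four markers are needed for the default cubic spline).

```python
import numpy as np
from scipy.interpolate import splprep, splev

def interpolate_edge(marker_pts, n_samples=100):
    """marker_pts: (N, 3) ordered marker coordinates along the tissue edge."""
    pts = np.asarray(marker_pts, dtype=float)
    tck, _ = splprep(pts.T, s=0)              # interpolating parametric spline
    u = np.linspace(0.0, 1.0, n_samples)
    x, y, z = splev(u, tck)
    return np.column_stack([x, y, z])         # densely sampled edge for planning
```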
In an additional embodiment, markers may be placed in the surgical scene for automatic positioning of a robotic platform “on the fly” during a surgical procedure by tracking tools or other instruments inside the body. Today, visual tracking of robotic tools is performed by visualization of optical markers placed on the back end of the tool. The liquid markers proposed here can be placed directly on the tool tip of the robotic instrument at any time during a procedure. Having the trackers directly in the surgical scene makes them more robust to surgical occlusion from equipment and personnel in the operating room. Additionally, a second marker can be placed on target tissue, and a closed loop positioning algorithm could be used to bring the robotic tool directly on top of the target tissue using a method known as visual servoing.
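A closed-loop positioning step of this kind might be sketched as follows; the robot interface (robot.move_relative) is a hypothetical placeholder rather than an API from the disclosure, and the gain and tolerance are illustrative.

```python
import numpy as np

def servo_step(tool_xyz, target_xyz, robot, gain=0.5, tol_mm=1.0):
    """One visual-servoing iteration; returns True once the tool has converged."""
    error = np.asarray(target_xyz, dtype=float) - np.asarray(tool_xyz, dtype=float)
    if np.linalg.norm(error) < tol_mm:
        return True                       # tool tip is on top of the target tissue
    robot.move_relative(gain * error)     # hypothetical call: proportional correction
    return False
```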
In an additional embodiment, combinations of these marker types and applications can be used to define a surgical procedure in the context of an autonomous or semi-autonomous robotic procedure. For example, markers may define a cut edge for a robotic resection algorithm, or a series of markers may be used to define locations for a suturing robot. The added contrast and the ability to discern between marker types can define more complex procedure plans using combinations of the embodiments described above, such as a “no-fly zone” combined with a cutting task, a resection plan that updates based on interaction rules for different tissue types, or a suturing plan to connect two disparate tissues.
The specific embodiments described above have been shown by way of example in a surgical case and it should be understood that these embodiments may be susceptible to various modifications and alternative forms. It should be further understood that the claims are not intended to be limited to the particular forms disclosed, but rather to cover all modifications, equivalents, and alternatives falling within the spirit and scope of this disclosure.
As used herein, the terms “comprises,” “comprising,” “including,” and “includes” are to be construed as being inclusive and open-ended. Specifically, when used in this document, the terms “comprises,” “comprising,” “including,” “includes,” and variations thereof, mean the specified features, steps or components included in the described features of the present disclosure. These terms are not to be interpreted to exclude the presence of other features, steps or components.
The following description includes features of the various embodiments.
(1) A system for tracking and control in medical procedures, the system comprising: a device configured to deploy fluorescent material on at least one of an organ under surgery and a surgical tool; a visual light source; a fluorescent light source corresponding to an excitation wavelength of the fluorescent material; an image acquisition and control element configured to control the visual light source and the fluorescent light source, and configured to capture and digitize at least one of resulting visual images and fluorescent images; and an image-based tracking module configured to apply image processing to the visual and fluorescent images, the image processing detecting fluorescent markers on at least one of the organ and the surgical tool.
(2) The system of (1), further comprising: a surgical robot; and a visual servoing control module configured to receive tracking information from the image-based tracking module and to control the surgical robot, based on the tracking information, to perform a surgical operation.
(3) The system of (2), further comprising: a manual control module configured to enable manual control of the surgical robot in place of control by the visual servoing control module.
(4) The system of (2), wherein the visual servoing control module is further configured to receive manual input and to control the surgical robot, based on the manual input, to perform a surgical operation.
(5) The system of (1), further comprising: a surgical robot; and a manual control module configured to receive manual input and execute master-slave control of the surgical robot.
(6) The system of (1), further comprising: a display configured to display at least one of the visual images and the fluorescent images.
(7) The system of (1), wherein the image-based tracking module further identifies the organ or the surgical tool based on the detected fluorescent markers.
(8) The system of (1), wherein the image acquisition and control element further comprises: a dynamic tunable filter configured to alternately pass visual light and light emitted by the fluorescent material; and a charge-coupled device configured to capture at least one of visual images and fluorescent images.
(9) The system of (6), wherein the display is stereoscopic or monoscopic.
(10) The system of (1), wherein the image acquisition and control element generates stereoscopic or monoscopic images.
(11) The system of (6), wherein the stereoscopic display is further configured to display visual images and a color coded overlay of fluorescent images.
(12) The system of (6), wherein the stereoscopic display is further configured to display an augmented reality image by overlaying target points detected by the image-based tracking module.
(13) The system of (1), wherein the system is configured to provide at least one of visual, audio, and haptic feedback to a system operator, based on information provided by the image-based tracking module.
(14) The system of (1) wherein the system is configured to operate in each of a manual mode, a semi-autonomous mode, and an autonomous mode.
(15) The system of (1), wherein the image-based tracking module identifies virtual boundaries based on the detected fluorescent markers to designate critical structures.
(16) The system of (15), further comprising: a detection device configured to determine whether a surgical tool has passed a boundary and to provide constraints on motion or provide alarms when the boundary has been crossed in order to protect the critical structures.
(17) The system of (1), wherein the fluorescent light source is a near-infrared (NIR) light source.
(18) The system of (1), wherein the device that deploys the fluorescent material is configured to deploy the fluorescent material by spraying, painting, attachment, tissue injection, or intravenous injection.
(19) A method for performing a medical procedure, the method comprising the steps of: deploying fluorescent material on at least one of an organ under surgery and a surgical tool; illuminating the organ, the surgical tool, or both with a visual light source and a fluorescent light source, the fluorescent light source corresponding to an excitation wavelength of the fluorescent material; capturing and digitizing images resulting from the illumination by the visual light source and the fluorescent light source; and applying image processing to the digitized images, the image processing detecting fluorescent markers on at least one of the organ and the surgical tool.
(20) The method according to (19), further comprising: generating tracking information by tracking the organ, the surgical tool, or both based on the detected fluorescent markers.
(21) The method of (19), further comprising: controlling a surgical robot, based on the tracking information, to perform a surgical operation.
(22) The method of (21), further comprising: receiving manual input and controlling the surgical robot, based on the manual input, to perform the surgical operation.
(23) The method of (19), further comprising: receiving manual input and executing master-slave control of a surgical robot based on the manual input.
(24) The method of (19), further comprising: providing a stereoscopic or monoscopic display of the digitized images.
(25) The method of (19), wherein the step of capturing and digitizing images further comprises generating stereoscopic or monoscopic images.
(26) The method of (24), further comprising: displaying visual images and a color coded overlay of fluorescent images.
(27) The method of (24), further comprising: displaying an augmented reality image by overlaying target points detected by the image-based tracking module.
(28) The method of (19), further comprising: providing at least one of visual, audio, or haptic feedback to a system operator, based on the tracking information.
(29) The method of (19), further comprising: identifying the organ or the surgical tool based on the detected fluorescent markers.
(30) The method of (19), further comprising: performing a surgical procedure based on the detected fluorescent markers.
(31) The method of (19), further comprising: designating critical structures by identifying virtual boundaries based on the detected fluorescent markers.
(32) The method of (31), further comprising: determining whether a surgical tool has passed a boundary and providing constraints on motion or providing alarms when the boundary has been crossed in order to protect the critical structures.
(33) A system for tracking and control in medical procedures, the system comprising: means for deploying fluorescent material on at least one of an organ under surgery and a surgical tool; a visual light source; a fluorescent light source corresponding to an excitation wavelength of the fluorescent material; means for controlling the visual light source and the fluorescent light source; means for capturing and digitizing at least one of resulting visual images and fluorescent images; and means for applying image processing to the visual and fluorescent images, the image processing detecting fluorescent markers on at least one of the organ and the surgical tool.
(34) A system for tracking and control in medical procedures, the system comprising:
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel methods and systems described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions, and changes in the form of the methods and systems described herein may be made without departing from the spirit of the inventions. As used herein, the words “a” and “an” and the like carry the meaning of “one or more.” The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
The present application claims priority to U.S. Provisional Application No. 62/647,141, filed on Mar. 23, 2018, the teaching of which is hereby incorporated by reference in its entirety for all purposes. The present application further claims priority to U.S. application Ser. No. 13/863,954, filed on Apr. 16, 2013, which claims priority to U.S. Provisional Application No. 61/624,665, filed on Apr. 16, 2012, the teachings of which are hereby incorporated by reference in their entirety for all purposes.
Provisional applications:

Number | Date | Country
---|---|---
62/647,141 | Mar. 23, 2018 | US
61/624,665 | Apr. 16, 2012 | US

Parent and child applications:

Relation | Number | Date | Country
---|---|---|---
Parent | 13/863,954 | Apr. 16, 2013 | US
Child | 16/364,067 | | US