In an aspect, a method may be used for tracking a medical instrument. The method may include capturing image data from various sources. The method may include capturing ultrasound data. The ultrasound data may be captured via an ultrasound probe. The method may include dewarping the image data. The method may include searching for a marker in the dewarped image data. If it is determined that the marker is found, the method may include extracting an identification. The method may include comparing fiducials with a known geometry. The method may include determining a pose. The method may include determining a location of the medical instrument relative to the ultrasound probe, determining ultrasound data, obtaining an ultrasound image, or any combination thereof. The method may include overlaying a three-dimensional projection of the medical instrument onto the ultrasound data, the ultrasound image, or both.
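By way of illustration only, the dewarping, searching, and identification steps of such a method could be sketched as follows in Python, with OpenCV's ArUco module (version 4.7 or later) standing in for the marker system; the function and variable names are assumptions of the sketch, not a statement of any particular implementation. The pose, location, and overlay steps are sketched later in this description.

```python
# Illustrative sketch only: dewarp a captured camera frame, search it for
# a marker, and extract the marker's identification. OpenCV's ArUco module
# is an assumed stand-in for the marker system described herein.
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

def find_marker(frame, camera_matrix, dist_coeffs):
    # Dewarp the image data using the camera's intrinsic calibration.
    dewarped = cv2.undistort(frame, camera_matrix, dist_coeffs)
    # Search for a marker in the dewarped image data.
    corners, ids, _rejected = detector.detectMarkers(dewarped)
    if ids is None:
        return None  # marker not found
    # Extract an identification and the detected fiducial pixel locations.
    return int(ids[0][0]), corners[0][0]
```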
Another aspect may include a system for tracking a medical instrument. The system may include an ultrasound probe, a camera, a marker, and a computing device. The ultrasound probe may be configured to capture ultrasound data. The camera may be coupled to the ultrasound probe. The camera may be configured to capture marker image data. The marker may be coupled to the medical instrument. The computing device may be configured to dewarp the image data. The computing device may be configured to search for the marker in the dewarped image data. The computing device may be configured to extract an identification. The computing device may be configured to compare fiducials with a known geometry. The computing device may be configured to determine a pose. The computing device may be configured to determine a location of the marker and medical instrument relative to the ultrasound probe, determine ultrasound data, obtain an ultrasound image, or any combination thereof. The computing device may be configured to overlay a three-dimensional projection of the marker and medical instrument onto the ultrasound data, the ultrasound image, or both.
The disclosure is best understood from the following detailed description when read in conjunction with the accompanying drawings. It is emphasized that, according to common practice, the various features of the drawings are not to scale. On the contrary, the dimensions of the various features are arbitrarily expanded or reduced for clarity.
Reference will now be made in greater detail to a preferred embodiment of the invention, an example of which is illustrated in the accompanying drawings. Wherever possible, the same reference numerals will be used throughout the drawings and the description to refer to the same or like parts.
Many medical procedures require the placement of a needle before the procedure can be performed. These procedures include, but are not limited to, central line access, peripheral venous access, peripheral nerve blocks, and core needle biopsies. For example, vessels near the surface of the skin can be easily seen; in some cases, however, the target vessel is too deep to see from the surface, giving the medical provider no indication of the needle's position relative to the target vessel. The medical provider may be a physician, a physician assistant, a nurse, a nurse practitioner, or any other qualified medical personnel. In some cases, the medical provider may be a robot or robot-assisted clinician. Ultrasound is a standard method to identify subsurface vessels and tissues for prospective needle placement in deep tissue. Ultrasound guidance provides a cross-section of the target. Using ultrasound guidance, care providers may obtain live feedback of the position of an instrument relative to the target location once the needle appears in the image data from the ultrasound probe. Ultrasound guidance may reduce the risk of missing targeted tissue, reduce potential complications, and increase the ability of a care provider to access previously inaccessible areas; however, it cannot locate and track the tip of the needle in real-time, either prior to skin insertion or during insertion before the needle is imaged by the ultrasound probe. If the provider advances the needle too deep, the ultrasound image will appear to indicate that the needle is placed correctly in the target vessel, when in fact the needle has penetrated and passed through the intended target. Due to the limitations of a single two-dimensional plane ultrasound imaging system, it is difficult to co-locate the trajectory of an instrument and the target tissue or vessel, both prior to and after skin insertion.
Typical solutions use an electromagnetic field to track the instrument tip location. An external antenna placed near the patient emits an electromagnetic field. These solutions require that a sensor be placed in the tip of the instrument to be tracked. The sensor is connected, via a wired connection, to a device configured to resolve the orientation of the sensor in three-dimensional space. These solutions require a second sensor attached to the ultrasound probe to determine the orientation of the ultrasound probe. These solutions are expensive and require a large antenna field footprint. In addition, the components are non-disposable and require sterilization, which increases the risk of spreading infection. Having a wire connected to the instrument may impede the functionality of the instrument. In addition, these solutions require the use of proprietary instruments, potentially increasing ongoing costs. Some solutions include a physical needle guide that may be clipped onto the ultrasound probe; however, these solutions are impractical in use. The embodiments disclosed herein offer a low-cost solution by providing the care provider one or more synchronized, co-located, optimal views of the target tissue and instrument. The embodiments disclosed herein are compatible with any existing ultrasound equipment and any medical instrument. The embodiments disclosed herein incur minimal disruption to standard operating procedures.
As used herein, the terminology “instrument” indicates any device that may be used for ultrasound-guided applications, including, but not limited to, central venous cannulation, local/regional nerve block, cyst aspiration, fine needle aspiration (FNA), core needle biopsy, peripherally inserted central catheter (PICC) line placement, arterial line placement, peripheral venous cannulation, and radio frequency (RF) ablation. In some embodiments, the instrument may include a needle or any type of device that is configured for insertion into a patient.
As used herein, the terminology “computer” or “computing device” includes any unit, or combination of units, capable of performing any method, or any portion or portions thereof, disclosed herein.
As used herein, the terminology “processor” indicates one or more processors, such as one or more special purpose processors, one or more microprocessors, one or more controllers, one or more microcontrollers, one or more application processors, one or more central processing units (CPUs), one or more graphics processing units (GPUs), one or more digital signal processors (DSPs), one or more application specific integrated circuits (ASICs), one or more application specific standard products, one or more field programmable gate arrays, any other type or combination of integrated circuits, one or more state machines, or any combination thereof.
As used herein, the terminology “memory” indicates any computer-usable or computer-readable medium or device that can tangibly contain, store, communicate, or transport any signal or information that may be used by or in connection with any processor. For example, a memory may be one or more read only memories (ROM), one or more random access memories (RAM), one or more registers, one or more low power double data rate (LPDDR) memories, one or more cache memories, one or more semiconductor memory devices, one or more magnetic media, one or more optical media, one or more magneto-optical media, or any combination thereof.
As used herein, the terminology “instructions” may include directions or expressions for performing any method, or any portion or portions thereof, disclosed herein, and may be realized in hardware, software, or any combination thereof. For example, instructions may be implemented as information, such as a computer program, stored in memory that may be executed by a processor to perform any of the respective methods, algorithms, aspects, or combinations thereof, as described herein. Instructions, or a portion thereof, may be implemented as a special purpose processor, or circuitry, that may include specialized hardware for carrying out any of the methods, algorithms, aspects, or combinations thereof, as described herein. In some implementations, portions of the instructions may be distributed across multiple processors on a single device or on multiple devices, which may communicate directly or across a network, such as a local area network, a wide area network, the Internet, or a combination thereof.
As used herein, the terminology “determine” and “identify,” or any variations thereof, includes selecting, ascertaining, computing, looking up, receiving, determining, establishing, obtaining, or otherwise identifying or determining in any manner whatsoever using one or more of the devices and methods shown and described herein.
As used herein, the terminology “example,” “embodiment,” “implementation,” “aspect,” “feature,” or “element” indicates serving as an example, instance, or illustration. Unless expressly indicated, any example, embodiment, implementation, aspect, feature, or element is independent of each other example, embodiment, implementation, aspect, feature, or element and may be used in combination with any other example, embodiment, implementation, aspect, feature, or element.
As used herein, the terminology “or” is intended to mean an inclusive “or” rather than an exclusive “or.” That is, unless specified otherwise, or clear from context, “X includes A or B” is intended to indicate any of the natural inclusive permutations. That is, if X includes A; X includes B; or X includes both A and B, then “X includes A or B” is satisfied under any of the foregoing instances. In addition, the articles “a” and “an” as used in this application and the appended claims should generally be construed to mean “one or more” unless specified otherwise or clear from the context to be directed to a singular form.
Further, for simplicity of explanation, although the figures and descriptions herein may include sequences or series of steps or stages, elements of the methods disclosed herein may occur in various orders or concurrently. Additionally, elements of the methods disclosed herein may occur with other elements not explicitly presented and described herein. Furthermore, not all elements of the methods described herein may be required to implement a method in accordance with this disclosure. Although aspects, features, and elements are described herein in particular combinations, each aspect, feature, or element may be used independently or in various combinations with or without other aspects, features, and elements.
The ultrasound device 110 includes a probe 120. The probe 120 may be a handheld probe. The probe 120 is configured to obtain a two-dimensional planar image of a portion of a patient. The probe 120 may be configured to use ultrasound, magnetic resonance, light, radio frequency (RF), or any other suitable diagnostic method capable of visualizing an internal portion of a patient. In this example, the probe 120 may communicate with the ultrasound device 110 via an ultrasound data cable. In some embodiments, the probe 120 may communicate with the ultrasound device 110 wirelessly, for example using any IEEE 802 technology, Bluetooth, near-field communication (NFC), or any other suitable wireless technology.
The probe 120 may be configured with a camera 130. The camera 130 may be removably attached to the probe 120, or it may be integrated with the probe 120. In some examples, the probe 120 may include two or more cameras. The camera 130 is configured to capture image data and send the image data to the computing device 140. The image data may be transmitted via a wired or wireless communication link. In an example, the camera 130 may be configured to rotate or flip such that the angle of the camera is adjustable and configurable by the user based on an angle of approach or user preference.
The ultrasound device 110 is configured to obtain ultrasound data via the probe 120. The ultrasound device 110 may include a processor 115 that is configured to process the ultrasound data and generate a video output. The ultrasound device 110 is configured to send the video output to the computing device 140. The ultrasound device 110 may transmit the video output via a wired or wireless communication link.
The computing device 140 is configured to receive the video output from the ultrasound device 110 and the image data from the camera 130. The computing device 140 may include a processor 145 that is configured to determine a position of a medical instrument (not shown) based on the image data from the camera 130 in real-time. The processor 145 of the computing device 140 may be configured to generate an overlay image that includes the determined position of the medical instrument in real-time. The processor 145 of the computing device 140 may be configured to merge the overlay image with the received video output from the ultrasound device 110 in real-time. The computing device 140 may be configured to overlay the positional information on the video stream and output the merged image to the monitor 150 for display in real-time. The computing device 140 may be configured to output the merged image in real-time via a wired or wireless communication link.
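As an illustrative sketch of the merge step only, and assuming the ultrasound video frame and the generated overlay arrive as same-sized BGR arrays, the compositing could be performed as follows; the function name and blending weight are assumptions of the sketch.

```python
# Minimal sketch of merging an instrument-position overlay onto an
# ultrasound video frame; names and the alpha value are illustrative.
import cv2
import numpy as np

def merge_frames(ultrasound_frame, overlay, alpha=0.6):
    # Blend the overlay onto the ultrasound frame only where the overlay
    # actually drew something, leaving the rest of the image untouched.
    blended = cv2.addWeighted(ultrasound_frame, 1.0 - alpha, overlay, alpha, 0.0)
    mask = overlay.any(axis=2)      # pixels with overlay content
    merged = ultrasound_frame.copy()
    merged[mask] = blended[mask]
    return merged                   # ready to output to the monitor
```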
The ultrasound device 210 includes a probe 220. The probe 220 may be a handheld probe. The probe 220 is configured to obtain a two-dimensional planar image of a portion of a patient. The probe 220 may be configured to use ultrasound, magnetic resonance, light, radio frequency (RF), or any other suitable diagnostic method capable of visualizing an internal portion of a patient. In this example, the probe 220 may communicate with the ultrasound device 210 via an ultrasound data cable. In some embodiments, the probe 220 may communicate with the ultrasound device 210 wirelessly, for example using any IEEE 802 technology, Bluetooth, NFC, or any other suitable wireless technology.
The probe 220 may be configured with a camera 230. The camera 230 may be removably attached to the probe 220, or it may be integrated with the probe 220. In some examples, the probe 220 may include two or more cameras. The camera 230 is configured to capture image data and send the image data to the ultrasound device 210. The image data may be transmitted via a wired or wireless communication link. In an example, the camera 230 may be configured to rotate or flip such that the angle of the camera is adjustable and configurable by the user based on an angle of approach or user preference.
The ultrasound device 210 is configured to obtain ultrasound data via the probe 220. The ultrasound device 210 may include a processor 215 that is configured to process the ultrasound data and generate a video output. The ultrasound device 210 may be configured to receive the image data from the camera 230. The processor 215 may be configured to determine a position of a medical instrument (not shown) based on the image data from the camera 230. The processor 215 of the ultrasound device 210 may be configured to generate an overlay image that includes the determined position of the medical instrument. The processor 215 of the ultrasound device 210 may be configured to merge the overlay image with the generated video output. The ultrasound device 210 may be configured to output the merged image to the monitor 250 for display. The ultrasound device 210 may be configured to output the merged image via a wired or wireless communication link.
In some examples, the camera 230 may include a processor (not shown) that is configured to determine a position of a medical instrument (not shown) based on the image data from the camera 230. The processor of the camera 230 may be configured to generate an overlay image that includes the determined position of the medical instrument and transmit the overlay image to the ultrasound device 210. The processor 215 of the ultrasound device 210 may be configured to merge the overlay image with the generated video output. The ultrasound device 210 may be configured to output the merged image to the monitor 250 for display. The ultrasound device 210 may be configured to output the merged image via a wired or wireless communication link.
The optical system 700 includes an instrument 740. In this example, the instrument 740 may be a needle, as shown in FIG. 7. A body part 760 of a patient is shown as an example in FIG. 7.
The monitor display 800 includes a projected side view 840 of the instrument trajectory. The projected side view 840 may show the distance between a current instrument position 850 and a side view of the plane of the ultrasound cross section 860. In this example, the current instrument position 850 may correspond with a tip of the instrument, for example the tip of a needle. The ultrasound cross section 805 is a front view of the ultrasound cross section 860. The projected side view 840 includes a trajectory 870 of the instrument.
The target area is shown as the point where the trajectory 870 and the ultrasound cross section 860 intersect. The current instrument position 850 is used to track the depth of the tip of the instrument. The projected side view 840 may be used to determine whether the current instrument position 850 passes the ultrasound cross section 860 along the trajectory 870, beyond the target area.
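Purely as an illustration, the target area and the tip's depth relative to the imaging plane could be computed as in the following sketch; the coordinate frame (the probe's frame) and all names are assumptions of the sketch.

```python
# Geometric sketch: intersect the trajectory 870 with the plane of the
# ultrasound cross section 860, and measure the tip's signed depth.
import numpy as np

def trajectory_vs_plane(tip, direction, plane_point, plane_normal):
    """All inputs are 3-vectors in the probe's coordinate frame (assumed).

    Returns the target area (trajectory/plane intersection) and the signed
    distance from the tip to the plane; the sign flips once the tip has
    passed through the cross section.
    """
    d = direction / np.linalg.norm(direction)
    n = plane_normal / np.linalg.norm(plane_normal)
    denom = d @ n
    if abs(denom) < 1e-9:
        return None, None             # trajectory parallel to the plane
    t = ((plane_point - tip) @ n) / denom
    target_area = tip + t * d         # intersection with cross section 860
    signed_depth = (tip - plane_point) @ n  # basis for a depth gauge
    return target_area, signed_depth
```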
In another example, a depth gauge 880 may be displayed as an overlay on the monitor display 800, as shown in FIG. 8.
In this example, one or more of the points 1030A-D (shown in dotted lines) of the marker 1020 may be used as a reference for the software to determine the three-dimensional position of the marker 1020. The points 1030A-D may be referred to as fiducials of the marker. In this example, the points 1030A-D are shown as the four corners of the marker 1020; however, the points 1030A-D may represent any one or more points of the marker 1020 and are not limited to the four corners. In this example, the marker 1020 may be a square marker; however, the marker may have any number of sides, from three (e.g., a triangular marker) upward. The three-dimensional position of the marker 1020 may be used in conjunction with the identification of the marker 1020 to determine the location of the tip of the needle 1010.
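For illustration, comparing the detected fiducials with a known geometry to recover the marker's three-dimensional position could be sketched with OpenCV's solvePnP; the marker edge length, the corner ordering, and the names are assumptions of the sketch.

```python
# Sketch of comparing detected fiducials with a known marker geometry,
# assuming a square marker of known edge length and a calibrated camera.
import cv2
import numpy as np

MARKER_SIZE = 0.01  # assumed marker edge length in meters

# Known geometry: fiducial coordinates in the marker's own frame, centered
# on the marker, ordered to match the detector's corner output.
KNOWN_GEOMETRY = np.array([
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],   # top-left
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],   # top-right
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],   # bottom-right
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],   # bottom-left
], dtype=np.float32)

def marker_pose(fiducials, camera_matrix, dist_coeffs):
    # fiducials: 4x2 pixel coordinates of points such as 1030A-D.
    ok, rvec, tvec = cv2.solvePnP(KNOWN_GEOMETRY,
                                  np.asarray(fiducials, dtype=np.float32),
                                  camera_matrix, dist_coeffs)
    return rvec, tvec
```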
As shown in FIG. 10, a translation vector (tvec) and a rotational vector (rvec) may be determined. The tvec is associated with the (x, y, z) location of the center 1040 of the marker relative to the center of the camera, where z may be the distance away from the camera. The rvec may be associated with the Euler angles describing how the marker 1020 is rotated about each of the axes; for example, the x-axis may represent the pitch, the y-axis may represent the yaw, and the z-axis may represent the roll angle.
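As one illustrative possibility, and assuming OpenCV's pose conventions, the tvec and rvec could be interpreted as follows; the Euler decomposition below matches the pitch/yaw/roll convention described above.

```python
# Sketch of interpreting tvec/rvec: tvec gives the marker center relative
# to the camera, and rvec is an axis-angle vector expanded with Rodrigues.
import cv2
import numpy as np

def interpret_pose(rvec, tvec):
    x, y, z = tvec.ravel()          # center 1040 relative to the camera;
                                    # z is the distance away from the camera
    rot, _ = cv2.Rodrigues(rvec)    # 3x1 axis-angle -> 3x3 rotation matrix
    # Euler angles under a z-y-x decomposition, using the convention in
    # the text: rotation about x = pitch, about y = yaw, about z = roll.
    pitch = np.arctan2(rot[2, 1], rot[2, 2])
    yaw = np.arctan2(-rot[2, 0], np.hypot(rot[2, 1], rot[2, 2]))
    roll = np.arctan2(rot[1, 0], rot[0, 0])
    return (x, y, z), (pitch, yaw, roll)
```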
The processor of the camera, the processor 115 or the processor 145 of FIG. 1, or the processor 215 of FIG. 2 may be configured to determine the tvec and the rvec.
The computing device may be configured to compare 1135 the fiducials with one or more previously known geometries. The computing device may be configured to determine 1140 a pose, for example, as discussed with reference to FIG. 10.
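A minimal illustrative sketch of carrying the determined pose into the probe's coordinate frame to locate the instrument tip follows; the fixed camera-to-probe transform (obtained from a one-time calibration) and the per-identification tip offset are assumptions of the sketch.

```python
# Sketch of locating the instrument tip relative to the ultrasound probe
# by composing the marker pose with an assumed camera-to-probe transform.
import cv2
import numpy as np

def pose_to_matrix(rvec, tvec):
    # Pack an rvec/tvec pair into a 4x4 homogeneous transform.
    T = np.eye(4)
    T[:3, :3] = cv2.Rodrigues(rvec)[0]
    T[:3, 3] = np.ravel(tvec)
    return T

def tip_in_probe_frame(rvec, tvec, T_probe_camera, tip_offset):
    # tip_offset: the instrument tip expressed in the marker's frame,
    # looked up from the extracted marker identification (assumed).
    T_probe_marker = T_probe_camera @ pose_to_matrix(rvec, tvec)
    tip = T_probe_marker @ np.append(np.asarray(tip_offset, dtype=float), 1.0)
    return tip[:3]  # tip location relative to the ultrasound probe
```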
Although some embodiments herein refer to methods, it will be appreciated by one skilled in the art that they may also be embodied as a system or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “processor,” “device,” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable mediums having computer readable program code embodied thereon. Any combination of one or more computer readable mediums may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to CDs, DVDs, wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks. The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
While the disclosure has been described in connection with certain embodiments, it is to be understood that the disclosure is not to be limited to the disclosed embodiments but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims, which scope is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures as is permitted under the law.
Filing Document      Filing Date   Country   Kind
PCT/US2020/039058    6/23/2020     WO

Number     Date       Country
62865375   Jun 2019   US