Ultrasound imaging is a widely accepted tool for guiding interventional instruments, such as needles, to targets such as blood vessels or organs in the human body. To successfully guide, for example, a needle to a blood vessel using ultrasound imaging, the needle is monitored in real time both immediately before and after a percutaneous puncture so that a clinician can determine the distance and orientation of the needle relative to the blood vessel and ensure successful access thereto. Current needle guiding systems have various limitations. Mechanical needle guides that attach to ultrasound probes restrict needle movement. Ultrasound images displayed on a screen require the user to look away from the insertion site while inserting the needle. Magnetic needle tracking systems require the added expense of magnetized needles and magnetometers.
Disclosed herein are systems, devices, and methods that address these and other limitations associated with utilizing ultrasound imaging to provide guidance during vascular access procedures.
Disclosed herein is an ultrasound probe that, according to some embodiments, includes a probe head having an array of ultrasonic transducers configured to (i) emit generated ultrasound signals into a target area of a patient, (ii) receive reflected ultrasound signals from the patient, and (iii) convert the reflected ultrasound signals into corresponding electrical signals. The ultrasound probe further includes a light source configured to project a visual indication onto a skin surface of the patient and a console coupled with the probe head and the light source. The console includes a signal converter configured to convert the electrical signals into ultrasound image data including an ultrasound image of the target area, one or more processors, and a non-transitory computer-readable medium having logic stored thereon. The logic, when executed by the one or more processors, causes operations of the probe that include (i) performing a determination process on the ultrasound image data to determine when an anatomical target is present within the ultrasound image and (ii) activating the light source to project the visual indication onto the skin surface, where the visual indication includes one or more visual characteristics based on one or more characteristics of the anatomical target.
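By way of a non-limiting illustration only, the following Python sketch shows one way the described operations could be organized in software. The class names, the placeholder detector, and the green/red color choices are assumptions made for this example and are not part of the disclosed embodiments.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DetectedTarget:
    identity: str             # e.g., "vein" or "other" (illustrative labels)
    lateral_offset_mm: float  # signed distance from the probe's central axis

class LightSource:
    """Stand-in for the light source; real hardware would drive an LED/laser projector."""
    def project(self, shape: str, color: str, offset_mm: float) -> None:
        print(f"projecting {color} {shape} at {offset_mm:+.1f} mm")
    def deactivate(self) -> None:
        print("light source off")

def determination_process(image) -> Optional[DetectedTarget]:
    """Placeholder detector; an actual implementation might run segmentation or a trained model."""
    return None  # no target found in this stub

def process_frame(image, light: LightSource) -> None:
    """One pass over a frame: detect an anatomical target, then project or switch off."""
    target = determination_process(image)
    if target is None:
        light.deactivate()  # optional behavior when no target is present
        return
    color = "green" if target.identity == "vein" else "red"  # illustrative first/second colors
    light.project(shape="line", color=color, offset_mm=target.lateral_offset_mm)
```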
In some embodiments, the ultrasound probe includes a button configured to enable a user to selectively activate and deactivate the light source, and in some embodiments, the operations further include deactivating the light source when the anatomical target is not present within the ultrasound image.
In some embodiments, the light source includes a separate light source module attached to and operably coupled with the ultrasound probe. The separate light source module may be configured to attach to and operably couple with the ultrasound probe when a sterile barrier is covering the probe, where the sterile barrier is disposed between the separate light source module and the ultrasound probe. The separate light source module may be wirelessly coupled with the ultrasound probe, and the separate light source module may be configured for single use.
In some embodiments, the one or more visual characteristics include at least one of a dot, a line, and/or a number of colors. In some embodiments, the one or more visual characteristics include the line, and the line may extend away from the ultrasound probe in a direction perpendicular to a front face of the ultrasound probe. In some embodiments, the one or more characteristics of the anatomical target include an identity of the anatomical target and/or a location of the anatomical target with respect to the ultrasound probe.
In some embodiments, the operations further include performing a location process on the ultrasound image data to determine the location of the anatomical target with respect to the ultrasound probe, and in some embodiments, activating the light source includes projecting the visual indication onto the skin surface at a location above the anatomical target. In some embodiments, the location of the visual indication defines an optimal or preferred insertion site for a needle to access the anatomical target.
In some embodiments, the operations further include performing an identification process on the ultrasound image data to identify the anatomical target as a vein or as an anatomical element other than a vein. In some embodiments, the one or more visual characteristics include a first color when the identification process identifies the anatomical target as a vein, and the one or more visual characteristics include a second color different from the first color when the identification process identifies the anatomical target as the anatomical element other than a vein.
In some embodiments, the one or more visual characteristics include a third color when the location process determines that the anatomical target is centrally located with respect to the ultrasound probe, and the one or more visual characteristics include a fourth color different from the third color when the location process determines that the anatomical target is located away from a center of the ultrasound probe.
In some embodiments, performing the location process includes applying a first trained machine-learning model to the ultrasound image data resulting in the determination of the location of the anatomical target with respect to the ultrasound probe. In some embodiments, performing the identification process includes applying a second trained machine-learning model to the ultrasound image data resulting in the identification of the anatomical target as a vein or as an anatomical element other than a vein.
In some embodiments, the ultrasound probe is operably coupled with a needle tracking system configured to determine a location and an orientation of a trackable needle with respect to the ultrasound probe, where the operations further include receiving needle tracking data from the needle tracking system and performing a tracking process on the ultrasound image data in combination with the needle tracking data to determine a location of the trackable needle with respect to the anatomical target. In such embodiments, the one or more visual characteristics include visual characteristics based on the location of the trackable needle with respect to the anatomical target. The visual characteristics based on the location of the trackable needle may be configured to indicate when the trackable needle is aligned with the anatomical target. In some embodiments, the visual characteristics based on the location of the trackable needle include (i) a fifth color when the tracking process determines that the trackable needle is not aligned with the anatomical target and (ii) a sixth color different from the fifth color when the tracking process determines that the trackable needle is aligned with the anatomical target.
Also disclosed herein is an ultrasound system that includes an ultrasound probe according to any of the embodiments described above except that the ultrasound probe is coupled with a headset (e.g., an augmented or virtual reality headset) in lieu of the light source.
Also disclosed herein is a computerized method that, according to some embodiments, includes receiving ultrasound image data converted from electrical signals generated by an ultrasound probe head of an ultrasound probe, where the ultrasound probe head is placed on a skin surface of a patient over a target area, and where the ultrasound probe head includes an array of ultrasonic transducers configured to (i) emit generated ultrasound signals into the target area of the patient, (ii) receive reflected ultrasound signals from the patient, and (iii) convert the reflected ultrasound signals into corresponding electrical signals. The method further includes performing a determination process on the ultrasound image data to determine when an anatomical target is present within the ultrasound image and activating a light source of the ultrasound probe to project a visual indication onto the skin surface, where the visual indication includes one or more visual characteristics based on one or more characteristics of the anatomical target.
In some embodiments, the method further includes performing a location process on the ultrasound image data to determine the location of the anatomical target within the target area with respect to the ultrasound probe, where activating the light source further includes projecting the visual indication onto the skin surface at a location above the anatomical target, and in some embodiments, performing the location process includes applying a first trained machine-learning model to the ultrasound image data resulting in the determination of the location of the anatomical target with respect to the ultrasound probe.
In some embodiments, the method further includes performing an identification process on the ultrasound image data to identify the anatomical target as a vein or as an anatomical element other than a vein, where activating the light source further includes at least one of (i) projecting the visual indication having a first color when the identification process identifies the anatomical target as a vein or (ii) projecting the visual indication having a second color, different from the first color, when the identification process identifies the anatomical target as the anatomical element other than a vein, and in some embodiments, performing the identification process includes applying a second trained machine-learning model to the ultrasound image data resulting in the identification of the anatomical target as the vein or as the anatomical element other than a vein.
Also disclosed herein is an ultrasound imaging system that, according to some embodiments, includes a plurality of ultrasound probes, where each ultrasound probe includes a probe head having an array of ultrasonic transducers configured to (i) emit generated ultrasound signals into a target area of a patient, (ii) receive reflected ultrasound signals from the patient, and (iii) convert the reflected ultrasound signals into corresponding electrical signals. Each ultrasound probe further includes a light source configured to project a visual indication onto a skin surface of the patient and a console coupled with the probe head and the light source. The console includes a signal converter configured to convert the electrical signals into ultrasound image data including an ultrasound image of the target area. The console further includes one or more processors and a non-transitory computer-readable medium having logic stored thereon. The logic, when executed by the one or more processors, causes operations of the probe that include (i) performing a location process on the ultrasound image data to determine a location of an anatomical target with respect to the ultrasound probe, where performing the location process includes applying a first trained machine-learning (ML) model to the ultrasound image data, and (ii) activating the light source to project the visual indication onto the skin surface at a location above the anatomical target. The system further includes a computing system coupled with each of the plurality of ultrasound probes, where the computing system includes a non-transitory computer-readable medium having ML logic stored thereon. The ML logic, when executed by one or more processors, performs ML operations that include performing a first ML algorithm on historical ultrasound image data sets to define the first trained ML model. The historical ultrasound image data sets include anatomical target location data sets received from the ultrasound probes and actual anatomical target location data sets, and each actual anatomical target location data set corresponds to an anatomical target location data set in a one-to-one relationship.
In some embodiments of the system, the operations further include performing an identification process on the ultrasound image data to determine an identity of the anatomical target as a vein or as an anatomical element other than a vein, and performing the identification process includes applying a second trained ML model to the ultrasound image data. The operations further include (i) activating the light source to project the visual indication having a first color when the identity of the anatomical target includes a vein and/or (ii) activating the light source to project the visual indication having a second color when the identity of the anatomical target includes the anatomical element other than a vein, where the second color is different from the first color. The ML operations further include performing a second ML algorithm on the historical ultrasound image data sets to define the second trained ML model, where the historical ultrasound image data sets further include anatomical target identification data sets received from the ultrasound probes and actual anatomical target identification data sets, and where each actual anatomical target identification data set corresponds to an anatomical target identification data set in a one-to-one relationship.
These and other features of the concepts provided herein will become more apparent to those of ordinary skill in the art in view of the accompanying drawings and the following description, which describe particular embodiments of such concepts in greater detail. Further details and features of the concepts provided herein may be disclosed in one or more of U.S. Pat. No. 10,322,230 and U.S. Published Application No. 2021-0085282, each of which is incorporated by reference in its entirety into this application.
Embodiments of the disclosure are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:
Before some particular embodiments are disclosed in greater detail, it should be understood that the particular embodiments disclosed herein do not limit the scope of the concepts provided herein. It should also be understood that a particular embodiment disclosed herein can have features that can be readily separated from the particular embodiment and optionally combined with or substituted for features of any of a number of other embodiments disclosed herein.
Regarding terms used herein, it should also be understood the terms are for the purpose of describing some particular embodiments, and the terms do not limit the scope of the concepts provided herein. Ordinal numbers (e.g., first, second, third, etc.) are generally used to distinguish or identify different features or steps in a group of features or steps, and do not supply a serial or numerical limitation. For example, “first,” “second,” and “third” features or steps need not necessarily appear in that order, and the particular embodiments including such features or steps need not necessarily be limited to the three features or steps. Labels such as “left,” “right,” “top,” “bottom,” “front,” “back,” and the like are used for convenience and are not intended to imply, for example, any particular fixed location, orientation, or direction. Instead, such labels are used to reflect, for example, relative location, orientation, or directions. Singular forms of “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
The term “logic” may be representative of hardware, firmware or software that is configured to perform one or more functions. As hardware, the term logic may refer to or include circuitry having data processing and/or storage functionality. Examples of such circuitry may include, but are not limited or restricted to a hardware processor (e.g., microprocessor, one or more processor cores, a digital signal processor, a programmable gate array, a microcontroller, an application specific integrated circuit “ASIC”, etc.), a semiconductor memory, or combinatorial elements.
Additionally, or in the alternative, the term logic may refer to or include software such as one or more processes, one or more instances, Application Programming Interface(s) (API), subroutine(s), function(s), applet(s), servlet(s), routine(s), source code, object code, shared library/dynamic link library (dll), or even one or more instructions. This software may be stored in any type of a suitable non-transitory storage medium, or transitory storage medium (e.g., electrical, optical, acoustical or other form of propagated signals such as carrier waves, infrared signals, or digital signals). Examples of a non-transitory storage medium may include, but are not limited or restricted to a programmable circuit; non-persistent storage such as volatile memory (e.g., any type of random access memory “RAM”); or persistent storage such as non-volatile memory (e.g., read-only memory “ROM”, power-backed RAM, flash memory, phase-change memory, etc.), a solid-state drive, hard disk drive, an optical disc drive, or a portable memory device. As firmware, the logic may be stored in persistent storage.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by those of ordinary skill in the art.
The phrases “connected to,” “coupled with,” and “in communication with” refer to any form of interaction between two or more entities, including but not limited to mechanical, electrical, magnetic, electromagnetic, fluid, and thermal interaction. Two components may be coupled with each other even though they are not in direct contact with each other. For example, two components may be coupled with each other through an intermediate component.
Any methods disclosed herein include one or more steps or actions for performing the described method. The method steps and/or actions may be interchanged with one another. In other words, unless a specific order of steps or actions is required for proper operation of the embodiment, the order and/or use of specific steps and/or actions may be modified. Moreover, sub-routines or only a portion of a method described herein may be a separate method within the scope of this disclosure. Stated otherwise, some methods may include only a portion of the steps described in a more detailed method. Additionally, all embodiments disclosed herein are combinable and/or interchangeable unless stated otherwise or such combination or interchange would be contrary to the stated operability of either embodiment.
Although not required, the probe 100 may be coupled, via a wired or wireless connection, with a display 140 so that an ultrasound image 141 as defined by the ultrasound image data may be depicted on the display 140. In the illustrated embodiment, the ultrasound image 141 depicts an anatomical target image 150 of the anatomical target 50. As shown, the anatomical target image 150 is centrally located within the ultrasound image 141 (i.e., the position 151 of the anatomical target image 150 is aligned with a central axis 145 of the ultrasound image 141) consistent with the central location of the anatomical target 50 with respect to the probe 100. As also shown, the anatomical target image 150 may be depicted at locations 152 or 153 with respect to the central axis 145 consistent with the respective positions 52 or 53 of the anatomical target 50 with respect to the probe 100.
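As a non-limiting aid to understanding, the following sketch illustrates how a target's column position within the ultrasound image could be converted into a signed lateral offset from the image's central axis. The linear pixel-to-millimeter mapping and the specific dimensions are illustrative assumptions rather than properties of the disclosed probe.

```python
def lateral_offset_mm(target_column_px: int, image_width_px: int,
                      image_width_mm: float) -> float:
    """Signed lateral distance of an imaged target from the image's central axis.

    Assumes a linear-array image whose columns map linearly onto the probe
    footprint; negative values lie to one side of the axis, positive to the other.
    """
    mm_per_px = image_width_mm / image_width_px
    center_px = image_width_px / 2.0
    return (target_column_px - center_px) * mm_per_px

# A target imaged at column 400 of a 512-pixel-wide, 38 mm-wide image sits
# roughly 10.7 mm to one side of the central axis.
print(lateral_offset_mm(400, 512, 38.0))
```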
The probe 100 further includes a light source 120 configured to project a visual indication onto the skin surface as described further below.
The visual indication 210 may include a shape such as a line 224 or a dot 222 configured to indicate a location on the skin surface 41. In some embodiments, the line 224 or the dot 222 may be projected in alignment with a second central axis 205 of the probe 100, where the second central axis 205 intersects the central axis 105.
When the visual indication 210 includes the line 224, the line 224 may indicate the presence of the anatomical target 50 directly beneath the line 224. As such, the user 30 may be confident that a needle 60 inserted into the patient along the line 224 will intersect the anatomical target 50. Similarly, when the visual indication 210 includes the dot 222, the dot 222 may indicate the presence of the anatomical target 50 directly beneath the dot 222. As such, the user 30 may be confident that the needle 60 inserted into the patient at the dot 222 will intersect the anatomical target 50. In some embodiments, the dot 222 may be projected at a defined distance from the front face 102 to indicate an optimal or preferred insertion site for the needle 60. In some embodiments, the visual indication 210 may include a set of graduation lines 226 (or other indicium) that indicate defined distances from the front face 102, such as 0.5 cm, 1 cm, 1.5 cm, and 2 cm, for example. Of course, other distances may be indicated by the graduation lines 226 as may be contemplated by one of ordinary skill.
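The following sketch illustrates, under stated assumptions, how graduation distances from the front face could translate into projection angles for a light source mounted at a fixed height above the skin. The 3 cm mounting height is a hypothetical value used only for the example and is not a disclosed dimension.

```python
import math

def projection_angles_deg(distances_cm, source_height_cm=3.0):
    """Angles from vertical at which a skin-facing light source mounted a given
    height above the skin would aim to place marks at the stated distances from
    the probe's front face. The 3 cm mounting height is purely illustrative.
    """
    return [math.degrees(math.atan(d / source_height_cm)) for d in distances_cm]

# Graduation distances named above: 0.5 cm, 1 cm, 1.5 cm, and 2 cm
print(projection_angles_deg([0.5, 1.0, 1.5, 2.0]))
# -> approximately [9.5, 18.4, 26.6, 33.7] degrees
```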
The visual indication 210 may also include a number of colors to indicate characteristics of the anatomical target 50. In some embodiments, a characteristic of the anatomical target 50 may include an identity. In the illustrated embodiment, the logic may determine the identity of the anatomical target 50 as a blood vessel and may further identify the blood vessel as a vein or some other anatomical element, such as an artery. As such, the visual indication 210 may also include a color or some other visual characteristic in accordance with the identity of the anatomical target 50. According to one embodiment, the logic may determine that the anatomical target 50 is a vein and project the visual indication 210 having a first color (e.g., green). Similarly, the logic may determine that the anatomical target 50 is an anatomical element other than a vein (e.g., an artery) and project the visual indication 210 having a second color (e.g., red) that is different from the first color.
According to another embodiment, the logic may determine that the anatomical target 50 is located beneath the line 224 (i.e., centrally located with respect to the ultrasound probe) and project the visual indication 210 having a third color. Similarly, the logic may determine that the anatomical target 50 is located at a position spaced away from the line 224 and project the visual indication 210 having a fourth color that is different from the third color.
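A minimal sketch of one possible color-selection rule combining the identity-based (first/second) and centrality-based (third/fourth) color schemes described above follows. The specific colors and the 2 mm centering tolerance are illustrative assumptions, not the disclosed scheme.

```python
def indication_color(identity: str, lateral_offset_mm: float,
                     center_tolerance_mm: float = 2.0) -> str:
    """Choose a projection color from the target's identity and its offset from
    the probe's central axis. The colors and 2 mm tolerance are illustrative.
    """
    if identity != "vein":
        return "red"      # second color: anatomical element other than a vein
    if abs(lateral_offset_mm) <= center_tolerance_mm:
        return "blue"     # third color: vein centrally located under the probe
    return "yellow"       # fourth color: vein located away from the center

print(indication_color("vein", 0.8))    # -> "blue"
print(indication_color("vein", 6.5))    # -> "yellow"
print(indication_color("artery", 0.0))  # -> "red"
```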
In some instances of use, the user may deploy the probe 100 to find a vein to be accessed by the needle 60. In such instances, the user may adjust the position of the probe 100 on the skin surface until the probe 100 projects the visual indication having the third color, in which case the user may have confidence that the needle 60, when inserted into the patient 40 along the line 224 or at the dot 222, will intersect the vein.
Other visual characteristics of the visual indication 210 are also considered as may be contemplated by one of ordinary skill, such as a blinking or flashing light, color variation, textual messages, indicia, shapes, light intensity, or multiple projections, for example, to indicate the characteristics of the anatomical target 50 described above, or other characteristics such as ease of access, depth from the skin surface 41, or the presence of an obstruction, for example.
The console 115 includes an interface module 332 (e.g., a connector set) configured to enable operative coupling of the console 115 with the probe head 110 and/or the light source 120. A signal conditioner 331 converts electrical signals from the probe head 110 to ultrasound image data for processing by the one or more processors 310 according to the logic. Similarly, the signal conditioner 331 converts digital data from the processors 310 to electrical signals for the probe head 110 and/or the light source 120.
The determination logic 322 receives ultrasound image data from the probe head 110 and performs a determination process on the ultrasound image data to detect/determine the presence of the anatomical target 50 within the target area 45. Upon detection of the anatomical target 50, the location logic 324 performs a location process on the ultrasound image data to determine the position of the anatomical target 50 with respect to the probe 100, such as at the positions 51, 52, or 53, for example.
Further upon detection of the anatomical target 50, the identification logic 326 may perform an identification process on the ultrasound image data to identify the anatomical target 50, i.e., determine if the anatomical target 50 is a vein or is some other anatomical element, such as a bone, a cluster of nerves, an artery, or a bifurcation of a blood vessel, for example. According to one embodiment, the ultrasound image data may include Doppler ultrasound data and the identification logic 326 may be configured to identify the anatomical target 50 based at least partially on the Doppler ultrasound data, where the Doppler ultrasound data is configured to detect/determine a motion of the anatomical target 50 or portion thereof. Such motion may include pulsing of at least a portion of the anatomical target 50 or a flow of blood within the anatomical target 50.
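By way of illustration, a toy pulsatility-based discriminator of the general kind such Doppler-based identification could resemble is sketched below. The pulsatility index and threshold are assumptions for the example and would not, on their own, constitute a clinically robust identification process.

```python
import statistics

def classify_from_doppler(velocity_trace_cm_s, pulsatility_threshold=0.5):
    """Toy vein/artery discriminator from a Doppler velocity trace.

    Arteries typically show strongly pulsatile flow and veins comparatively
    steady flow; this sketch uses a crude pulsatility index (peak-to-trough
    over mean) with an arbitrary threshold.
    """
    mean_v = statistics.fmean(abs(v) for v in velocity_trace_cm_s)
    if mean_v == 0:
        return "other"  # no detectable flow
    pulsatility = (max(velocity_trace_cm_s) - min(velocity_trace_cm_s)) / mean_v
    return "artery" if pulsatility > pulsatility_threshold else "vein"

print(classify_from_doppler([18, 60, 25, 15, 55, 20]))  # pulsatile -> "artery"
print(classify_from_doppler([9, 10, 11, 10, 9, 10]))    # steady -> "vein"
```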
According to one embodiment, the memory 320 may optionally include a location trained machine-learning (ML) model 325 and performing the location process on the ultrasound image data may include applying the location trained ML model 325 to the ultrasound image data. A result of applying the location trained ML model 325 to the ultrasound image data may include the determination of the location of the anatomical target 50 with respect to the probe 100.
According to one embodiment, the memory 320 may optionally include an identification trained machine-learning (ML) model 327 and performing the identification process on the ultrasound image data may include applying the identification trained ML model 327 to the ultrasound image data. A result of applying the identification trained ML model 327 to the ultrasound image data may include the determination of the identity of the anatomical target 50 as a vein, or some other anatomical element, such as an artery, for example.
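For illustration only, the following sketch shows how the two trained models might be applied to a single frame. The scikit-learn-style predict interface and the naive flattening of the frame into a feature vector are assumptions of the example, not a description of the trained ML models 325 and 327.

```python
import numpy as np

def apply_trained_models(frame: np.ndarray, location_model, identification_model):
    """Apply the location and identification models to a single ultrasound frame.

    The models are assumed to expose a scikit-learn-style ``predict`` method and
    to have been trained on naively flattened frames; both are assumptions of
    this sketch.
    """
    features = frame.astype(np.float32).reshape(1, -1)
    lateral_offset_mm = float(location_model.predict(features)[0])  # regression output
    identity = str(identification_model.predict(features)[0])       # e.g., "vein" or "other"
    return lateral_offset_mm, identity
```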
The method 400 may further include performing a location process on the ultrasound image data to determine the location of the anatomical target within the target area with respect to the ultrasound probe and projecting the visual indication onto the skin surface at a location above the anatomical target (block 440). The method 400 may further include applying a first trained machine-learning model to the ultrasound image data resulting in the determination of the location of the anatomical target with respect to the ultrasound probe (block 450).
The method 400 may further include performing an identification process on the ultrasound image to identify the anatomical target as a vein or some other anatomical element and projecting the visual indication having a first color when the identification process identifies the anatomical target as a vein (block 460). The method 400 may further include projecting the visual indication having a second color, different from the first color, when the identification process identifies the anatomical target as an anatomical element other than a vein, such as an artery. The method 400 may further include applying a second trained machine-learning model to the ultrasound image data resulting in the identification of the anatomical target as a vein or some other anatomical element (block 470).
The external computing device 330 includes a database 530 and machine-learning (ML) logic 532 stored in memory 510 (e.g., a non-transitory computer-readable medium). The ML logic 532 is configured to acquire historical ultrasound image data sets from the plurality of probes 100 and/or the EMR system 550 to form a training data set 531 stored in the database 530. The ML logic 532 is further configured to apply an ML algorithm 534 to the training data set 531 to define the location trained ML model 325 and/or the identification trained ML model 327, where the ML logic 532 may be composed of or configured to execute a plurality of ML algorithms 534 (e.g., predictive algorithms such as linear regression, logistic regression, classification and regression trees, Naïve Bayes, K-nearest neighbors, etc.). The historical ultrasound image data sets include location data sets and/or identification data sets received from the plurality of probes 100 and actual anatomical target data sets that correspond individually (i.e., according to a one-to-one relationship) to the ultrasound image data sets. More specifically, each ultrasound image data set corresponds with an actual anatomical target data set for a single ultrasound imaging event.
The location data set for the ultrasound imaging event includes the determined location of the anatomical target, i.e., the location of the anatomical target 50 with respect to the probe 100 such as one of the positions 51-53 (see
The external computing device 330 may be coupled with the EMR system 550, and the ML logic 532 may acquire the actual anatomical target data sets from the EMR system 550. The location trained ML model 325 and/or the identification trained ML model 327 may be stored in the memory 520 of the external computing device 330. The ML logic 532 may transmit or communicate the location trained ML model 325 and/or the identification trained ML model 327 to the probes 100 for storage in the memory 320.
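As a non-limiting sketch of the training flow performed by the ML logic 532, the following example pairs historical frames with their actual anatomical target data in a one-to-one relationship and fits two models. The use of scikit-learn and of linear/logistic regression is an illustrative choice among the predictive algorithms listed above, not a statement about the disclosed ML algorithms 534.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

def train_models(historical_frames, actual_offsets_mm, actual_identities):
    """Fit a location model and an identification model from paired historical data.

    historical_frames:  one ultrasound frame (2-D array) per imaging event
    actual_offsets_mm:  ground-truth lateral offsets, one per frame (1:1 pairing)
    actual_identities:  ground-truth labels such as "vein"/"other", one per frame
    Linear and logistic regression are stand-ins; any of the predictive
    algorithms named above (trees, naive Bayes, k-nearest neighbors, ...)
    could be substituted.
    """
    X = np.stack([f.astype(np.float32).ravel() for f in historical_frames])
    location_model = LinearRegression().fit(X, np.asarray(actual_offsets_mm))
    identification_model = LogisticRegression(max_iter=1000).fit(
        X, np.asarray(actual_identities))
    return location_model, identification_model
```

Models produced this way could, for example, serve as the two arguments of the inference sketch shown earlier.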
The ultrasound probe (probe) 600 includes a light source module 610 that is a separate component from the probe 600. The light source module 610 is attachable to the probe 600. In the illustrated embodiments, the light source module 610 is configured to attach to the front face 602 of the probe 600. However, in other embodiments, the light source module 610 may be attached to the probe 600 at other locations, such as the right side, left side, or back side, for example. The light source module 610 may also be detachable from the probe 600. In some embodiments, the light source module 610 may be configured for single use, i.e., the light source module 610 may be a disposable component. The light source module 610 includes the light source 620. The light source module 610 may be attached to the probe 600 in any suitable fashion, such as via a strap, a clip, a clamp, an adhesive, or one or more magnets, for example.
The light source module 610 is configured to operably couple with the probe 600 when the light source module 610 is attached to the probe 600. However, in some embodiments, the light source module 610 may operably couple with the probe 600 even when the light source module 610 is not physically attached to the probe 600. In some embodiments, the light source module 610 may include a number of electrical connecting members (e.g., pins) configured to make electrical contact with corresponding electrical connecting members (e.g., sockets) of the probe 600.
According to one embodiment, the light source module 610 may be configured to wirelessly couple with the probe 600. As such, the light source module 610 may include console components, such as a battery, a processor, memory, and a wireless module, for example, to enable the light source module 610 to operably couple with the probe 600.
In some embodiments, the probe 600 may include a sterile barrier 630, such as a plastic or elastomeric covering (e.g., a bag), that covers the probe 600 including the front face 602. In such embodiments, the light source module 610 may attach to the probe 600, where the sterile barrier 630 is disposed between the light source module 610 and the probe 600. In other words, the light source module 610 is configured to attach to the probe 600 without compromising the sterile barrier 630.
The needle tracking system 880 is configured to magnetically track the trackable needle 881. The trackable needle 881 includes a number (e.g., 1, 2, 3, or more) of magnetic elements 882 configured to generate one or more magnetic fields 883. The needle tracking system 880 further includes a number (e.g., 1, 2, 3, or more) of magnetometers 885 configured to detect the one or more magnetic fields 883. In the illustrated embodiment, the console 815 may in some respects resemble the components and features of the console 115 described above.
A visual indication 810 may include all or any subset of the features of the visual indication 210 and may further include one or more visual characteristics based on the location of the trackable needle 881 with respect to the anatomical target 50. In the illustrated embodiment, the visual characteristics are configured to indicate when the trackable needle 881 is aligned with the anatomical target 50. More specifically, the visual characteristics based on the location of the trackable needle 881 with respect to the anatomical target 50 are configured to indicate when a location and orientation of the trackable needle 881 with respect to the anatomical target 50 are such that insertion of the trackable needle 881 into the patient 40 will enter or intersect the anatomical target 50. In some embodiments, the visual characteristics based on the location of the trackable needle 881 include (i) a fifth color (e.g., red) when the tracking process determines that the trackable needle is not aligned with the anatomical target and (ii) a sixth color (e.g., green) different from the fifth color when the tracking process determines that the trackable needle is aligned with the anatomical target.
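A minimal sketch of one way alignment of the trackable needle with the anatomical target could be evaluated and mapped to the fifth and sixth colors follows. The closest-approach geometry, the 2 mm target radius, and the colors are assumptions of the example rather than the disclosed tracking process.

```python
import numpy as np

def needle_alignment_color(tip_mm, direction, target_mm, target_radius_mm=2.0):
    """Return "green" (sixth color) when the needle's straight-line trajectory
    passes within the target radius, else "red" (fifth color). The geometry,
    radius, and colors are illustrative assumptions.
    """
    tip = np.asarray(tip_mm, dtype=float)
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    to_target = np.asarray(target_mm, dtype=float) - tip
    along = float(np.dot(to_target, d))
    if along <= 0:                                        # target lies behind the tip
        return "red"
    miss = float(np.linalg.norm(to_target - along * d))   # closest-approach distance
    return "green" if miss <= target_radius_mm else "red"

# A needle aimed straight down toward a vessel about 20 mm below its tip:
print(needle_alignment_color([0, 0, 0], [0, 0, 1], [0.5, 0.0, 20.0]))  # -> "green"
```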
Further details regarding the needle tracking system 880 can be found in the following U.S. patents and patent application publications: U.S. Published Application Nos. 2014/0257080 and 2014/0257104; and U.S. Pat. Nos. 9,155,517; 9,257,220; 9,459,087; and 9,597,008, each of which is incorporated by reference in its entirety into this application.
While some particular embodiments have been disclosed herein, and while the particular embodiments have been disclosed in some detail, it is not the intention for the particular embodiments to limit the scope of the concepts provided herein. Additional adaptations and/or modifications can appear to those of ordinary skill in the art, and, in broader aspects, these adaptations and/or modifications are encompassed as well. Accordingly, departures may be made from the particular embodiments disclosed herein without departing from the scope of the concepts provided herein.
This application claims the benefit of priority to U.S. Provisional Application No. 63/529,217, filed Jul. 27, 2023, which is incorporated by reference in its entirety into this application.