Ultrasound system with pressure and flow determination capability

Information

  • Patent Grant
  • Patent Number
    12,165,315
  • Date Filed
    Tuesday, November 30, 2021
  • Date Issued
    Tuesday, December 10, 2024
Abstract
An ultrasound imaging system is disclosed that includes an ultrasound probe including a plurality of ultrasound transducers configured to acquire ultrasound images, a processor and non-transitory computer-readable medium having stored thereon logic that, when executed by the processor, is configured to perform operations including receiving ultrasound imaging data, detecting one or more blood vessels within the ultrasound imaging data, identifying at least one blood vessel of the one or more blood vessels as an artery or as a vein, and rendering a visualization of at least a subset of the one or more blood vessels on a display. The logic may, when executed by the processor, cause performance of further operations including identifying the at least one blood vessel as the artery based on at least one differentiating characteristic of a plurality of differentiating characteristics of blood vessel type.
Description
BACKGROUND

A variety of existing ultrasound systems include wired or wireless ultrasound probes connected to visual displays. A clinician may hold and manipulate the ultrasound probe of such a system to place a vascular access device (VAD), such as a catheter, in a patient. Ultrasound imaging is commonly used to guide a needle to a target such as a vein of the patient. The needle may be monitored in real time prior to and after a percutaneous insertion. In this way, a clinician may determine the distance and orientation of the needle in relation to a target vein and ensure accurate insertion with minimal discomfort to the patient. In some instances, a target vein may be difficult to distinguish from an adjacent artery, and the clinician may inadvertently puncture the adjacent artery. A puncture of an artery may cause harm and/or discomfort to the patient and may also require unplanned intervention by the clinician to stop bleeding from the artery. Thus, distinguishing a vein from an artery prior to needle insertion may inhibit harm and discomfort to a patient and may be logistically advantageous for the clinician.


Accordingly, disclosed herein are ultrasound imaging systems and methods that distinguish arteries from veins based on, at least, blood vessel differentiating characteristics.


SUMMARY OF THE INVENTION

Briefly summarized, disclosed herein is an ultrasound system including an ultrasound probe comprising a plurality of ultrasound transducers configured to acquire ultrasound images. The system further includes a processor and a non-transitory computer-readable medium having stored thereon a logic module that, when executed by the processor, is configured to perform operations including receiving real-time ultrasound imaging data, detecting one or more blood vessels within the ultrasound imaging data, 1) identifying at least one blood vessel of the one or more blood vessels as an artery or 2) identifying at least one blood vessel of the one or more blood vessels as a vein, and rendering a visualization of at least a subset of the one or more blood vessels on a display of a console.


In some embodiments, the processor, the non-transitory computer-readable medium, and the display comprise a console. In some embodiments, the operations of the ultrasound imaging system include identifying the blood vessel as an artery.


In some embodiments, the system employs at least one differentiating characteristic among a plurality of differentiating characteristics when identifying the blood vessel. The differentiating characteristics are defined by the logic module and include a blood vessel diameter, a blood vessel wall thickness, an image pulsatility of a blood vessel, a depth of a blood vessel with respect to a skin surface of a patient, a location of a first blood vessel in relation to a location of a second blood vessel, and a cross-sectional shape of a blood vessel.
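The set of differentiating characteristics above can be illustrated as a simple feature record; this is a minimal sketch, and all field names and threshold values are assumptions for illustration, not values disclosed herein.

```python
from dataclasses import dataclass

# Hypothetical feature record for one detected vessel; the field names
# and units are illustrative, not drawn from the disclosure.
@dataclass
class VesselFeatures:
    diameter_mm: float              # blood vessel diameter
    wall_thickness_mm: float        # blood vessel wall thickness
    image_pulsatility: float        # fractional diameter change per cardiac cycle
    depth_mm: float                 # depth below the skin surface
    cross_section_roundness: float  # 1.0 = circular, <1.0 = flattened

def arterial_hints(v: VesselFeatures) -> list:
    """Return which characteristics point toward an artery.

    The numeric thresholds are placeholders for illustration only.
    """
    hints = []
    if v.wall_thickness_mm > 0.8:
        hints.append("thick wall")
    if v.image_pulsatility > 0.05:
        hints.append("pulsatile")
    if v.cross_section_roundness > 0.9:
        hints.append("round cross-section")
    return hints
```

A vessel exhibiting several such hints at once would be a stronger artery candidate than one exhibiting a single hint.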


In some embodiments, the system employs at least two differentiating characteristics when identifying the blood vessel.


In some embodiments, the logic module defines one or more thresholds employed when identifying a blood vessel.


In some embodiments, the logic module includes an operation of identifying at least one differentiating characteristic of the blood vessel within the real-time imaging data.


In some embodiments, the logic module includes an operation of comparing the real-time imaging data pertaining to a differentiating characteristic with one or more thresholds defined by the logic module resulting in a confidence level for the identification of the blood vessel.


In some embodiments, the real-time imaging data includes image pulsatility data of the blood vessel, and wherein identifying the blood vessel includes comparing the image pulsatility data to one or more image pulsatility thresholds to obtain the confidence level for the identification of the blood vessel.
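The compare-to-threshold step described above can be sketched as a simple mapping from a measured image pulsatility to a confidence level; the threshold values and the linear ramp between them are assumptions for illustration only.

```python
def pulsatility_confidence(pulsatility, artery_threshold=0.05, vein_threshold=0.01):
    """Map an image-pulsatility measurement to a confidence level that the
    vessel is an artery. Threshold values are illustrative placeholders."""
    if pulsatility >= artery_threshold:
        return 1.0   # strongly pulsatile: consistent with an artery
    if pulsatility <= vein_threshold:
        return 0.0   # little or no pulsatility: consistent with a vein
    # Between the two thresholds, interpolate linearly.
    return (pulsatility - vein_threshold) / (artery_threshold - vein_threshold)
```

Measurements between the two thresholds yield an intermediate confidence rather than a hard artery/vein decision.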


In some embodiments, the ultrasound probe of the ultrasound imaging system includes a pressure sensor configured to obtain pressure pulsatility data of the blood vessel and the logic module includes an operation of receiving real-time pressure pulsatility data in coordination with receiving real-time imaging data. The logic module includes a differentiating characteristic pertaining to pressure pulsatility of a blood vessel, and an operation of the logic module includes comparing the pressure pulsatility data of the blood vessel to one or more pressure pulsatility thresholds to obtain a confidence level for the identification of the blood vessel.


In some embodiments, comparing the pressure pulsatility data is performed in combination with comparing the image pulsatility data to obtain a combined confidence level for the identification of the blood vessel.
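One way to realize the combined confidence level described above is a weighted average of the per-modality confidences; the weighting scheme and weight values here are assumptions, not disclosed values.

```python
def combined_confidence(image_conf, pressure_conf, w_image=0.6, w_pressure=0.4):
    """Combine image-pulsatility and pressure-pulsatility confidence levels
    into a single confidence. The weights are illustrative assumptions."""
    return w_image * image_conf + w_pressure * pressure_conf
```

Other combination rules (e.g., taking the maximum, or a Bayesian update) would serve the same purpose; the weighted average is merely the simplest.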


In some embodiments, the logic module includes an operation of rendering a visualization of one or more blood vessels on the display of the console and includes rendering indicia on the display to indicate to the clinician if any of the blood vessels is an artery.


In some embodiments, the logic module includes operations of tracking a position of a needle tip in relation to the one or more blood vessels, and generating an alert to a clinician if the needle tip is positioned within a perimeter threshold of an artery. The alert may further include rendering indicia on the display that includes a text notification or an arrow indicating a direction to move the needle away from the artery.
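The proximity-alert operation above can be sketched as a distance check in the image plane; the coordinate convention, margin value, and direction heuristic are illustrative assumptions.

```python
import math

def needle_alert(needle_tip, artery_center, artery_radius_mm, margin_mm=2.0):
    """Return an alert string if the tracked needle tip lies within a
    perimeter threshold (artery radius plus a safety margin) of an artery.

    Coordinates are (x, y) positions in mm in the image plane; the names
    and the left/right direction hint are illustrative only.
    """
    dist = math.dist(needle_tip, artery_center)
    if dist <= artery_radius_mm + margin_mm:
        # Direction hint: move the needle away from the artery center.
        dx = needle_tip[0] - artery_center[0]
        direction = "right" if dx >= 0 else "left"
        return f"Warning: needle near artery - move {direction}"
    return None
```

In a full system the returned string would drive the on-display indicia (text notification or arrow) described above.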


In some embodiments, the ultrasound imaging system includes an artificial intelligence module configured to define the thresholds. In some embodiments, the artificial intelligence module may define one or more of the differentiating characteristics.


Disclosed herein is a method for identifying an artery among a plurality of blood vessels, including obtaining ultrasound images via an ultrasound imaging system. The ultrasound imaging system includes an ultrasound probe having a plurality of ultrasound transducers configured to acquire ultrasound images and a console. The console includes a processor and non-transitory computer-readable medium having stored thereon a logic module configured to perform operations. The operations include receiving real-time ultrasound imaging data, detecting one or more blood vessels within the ultrasound imaging data, identifying at least one blood vessel of the one or more blood vessels as an artery or identifying at least one blood vessel of the one or more blood vessels as a vein.


In some embodiments of the method, the operations include identifying the blood vessel as an artery.


In some embodiments of the method, the logic module includes an operation of rendering a visualization of at least a subset of the one or more blood vessels on a display of the console, wherein the subset includes the blood vessel. The rendering operation may further include rendering indicia on the display indicating that the blood vessel is an artery.


In some embodiments of the method, the operations include identifying at least one differentiating characteristic of the blood vessel within the real-time imaging data, and comparing the real-time imaging data to one or more thresholds defined by the logic module to obtain a confidence level that the blood vessel is an artery.


In some embodiments of the method, identifying a blood vessel as an artery includes, receiving image pulsatility data for the blood vessel, and comparing the image pulsatility data to one or more image pulsatility thresholds to obtain the confidence level that the blood vessel is an artery.


In some embodiments of the method, the ultrasound probe further comprises a pressure sensor configured to obtain pressure pulsatility data in combination with acquiring ultrasound images, and wherein identifying a blood vessel includes, receiving pressure pulsatility data for the blood vessel, and comparing the pressure pulsatility data to one or more pressure pulsatility thresholds defined by the logic module to obtain an additional confidence level for the identification of the blood vessel.


These and other features of the concepts provided herein will become more apparent to those of skill in the art in view of the accompanying drawings and following description, which describe particular embodiments of such concepts in greater detail.





BRIEF DESCRIPTION OF DRAWINGS

A more particular description of the present disclosure will be rendered by reference to specific embodiments thereof that are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. Example embodiments of the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:



FIG. 1 illustrates an ultrasound imaging system including an ultrasound probe connected to an image processing device, in accordance with some embodiments.



FIG. 2 is a front perspective view of an ultrasound probe assembly including a pressure sensor, in accordance with some embodiments.



FIG. 3A illustrates the ultrasound probe of FIG. 2 together with a cross-sectional portion of a patient, in accordance with some embodiments.



FIG. 3B illustrates the ultrasound probe of FIG. 2 together with a cross-sectional portion of a patient when a force is applied to the ultrasound probe, in accordance with some embodiments.



FIG. 4 is a block diagram of the ultrasound imaging system of FIG. 1, in accordance with some embodiments.





DETAILED DESCRIPTION

Before some particular embodiments are disclosed in greater detail, it should be understood that the particular embodiments disclosed herein do not limit the scope of the concepts provided herein. It should also be understood that a particular embodiment disclosed herein can have features that can be readily separated from the particular embodiment and optionally combined with or substituted for features of any of a number of other embodiments disclosed herein.


Regarding terms used herein, it should also be understood the terms are for the purpose of describing some particular embodiments, and the terms do not limit the scope of the concepts provided herein. Ordinal numbers (e.g., first, second, third, etc.) are generally used to distinguish or identify different features or steps in a group of features or steps, and do not supply a serial or numerical limitation. For example, “first,” “second,” and “third” features or steps need not necessarily appear in that order, and the particular embodiments including such features or steps need not necessarily be limited to the three features or steps. Labels such as “left,” “right,” “top,” “bottom,” “front,” “back,” “upward,” “downward,” and the like are used for convenience and are not intended to imply, for example, any particular fixed location, orientation, or direction. Instead, such labels are used to reflect, for example, relative location, orientation, or directions. Singular forms of “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise. Also, the words “including,” “has,” and “having,” as used herein, including the claims, shall have the same meaning as the word “comprising.”


Lastly, in the following description, the terms “or” and “and/or” as used herein are to be interpreted as inclusive or meaning any one or any combination. As an example, “A, B or C” or “A, B and/or C” mean “any of the following: A; B; C; A and B; A and C; B and C; A, B and C.” An exception to this definition will occur only when a combination of elements, components, functions, steps or acts are in some way inherently mutually exclusive.


Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by those of ordinary skill in the art.



FIG. 1 illustrates an ultrasound imaging system 100 including an ultrasound probe 101 connected to console 105, in accordance with some embodiments. In this embodiment, the ultrasound probe 101 is connected to the console 105 over a wired connection. In other embodiments, a wireless connection may be used. The ultrasound probe 101 includes a body that may house a console (FIG. 4) operatively connected to the console 105. The ultrasound probe 101 may be configured to assist a user such as a clinician with insertion of an access device such as a needle 112 into a target of a patient such as the vein 110.


Ultrasonic transducers 130 located in a probe head 102 of the ultrasound probe 101 are configured to capture 2-D ultrasound images to be visualized on a display of the console 105 in an ultrasound image window 122. The ultrasonic transducers 130 may be arranged and/or activated in a linear array or a 2-D array. The ultrasonic transducers 130 may be implemented as piezoelectric transducers or capacitive micro-machined ultrasonic transducers (CMUTs). When the ultrasound probe 101 is configured with the 2-D array of the ultrasonic transducers 130, a subset of the ultrasonic transducers may be linearly activated as a linear array as may be beneficial for ultrasound imaging based on ultrasound-imaging data being captured. The ultrasound transducers 130 may be configured to maintain the target, such as the vein 110, in an image plane parallel to a medical-device plane or switch to a different image plane perpendicular to the medical-device plane. In some embodiments, the ultrasound probe 101 may be configured with a moveable linear array of the ultrasonic transducers 130 that may be activated for ultrasound imaging.


In use, the probe head 102 may be placed against the skin 104 of a patient proximate to a needle-insertion site so the activated ultrasonic transducers 130 in the probe head 102 may generate and emit the ultrasound signals into the patient as a sequence of pulses. Transmitters within the probe head 102 (not shown) may receive reflected ultrasound signals (i.e., reflections of the generated ultrasonic pulses from the patient's body). The reflected ultrasound signals may be converted into corresponding electrical signals for processing into ultrasound images by a console of the ultrasound imaging system 100. Thus, a clinician may employ the ultrasound imaging system 100 to determine a suitable insertion site and establish vascular access to the target vein 110 with the needle 112 or another medical device.


As discussed above, the exemplary ultrasound imaging system 100 may be capable of detecting and identifying a vein 110 and an artery 120 and providing a visualization of veins 110 and arteries 120 in the ultrasound window 122. While a single vein 110 and a single artery 120 are shown in FIG. 1, the ultrasound imaging system 100 may be capable of obtaining ultrasound images including multiple veins 110 and arteries 120 and providing a visualization of multiple veins 110 and arteries 120 in the ultrasound window 122. As discussed above, the exemplary ultrasound imaging system 100 may provide visualization of an ultrasound image in real-time. In other words, the ultrasound imaging system 100 may provide a real-time ultrasound image in the ultrasound window 122 during an ultrasound imaging procedure.


Additionally, as will be discussed below, the ultrasound imaging system 100 may include one or more logic modules that, upon processing by a processor, are capable of processing the obtained ultrasound images, identifying one or more blood vessels (e.g., the artery 120 and the vein 110) and distinguishing the artery 120 from the vein 110. For example, the logic module(s) of the ultrasound imaging system 100 may identify an artery 120 and distinguish the artery 120 from the vein 110 according to a plurality of differentiating characteristics. As one illustrative embodiment, the artery 120 may have a smaller diameter and a thicker wall than the vein 110, which may be detected and identified by the logic modules. As another illustrative embodiment, the artery 120 may have a higher internal pressure than the vein 110 and may have detectable variations of its diameter due to pressure pulses within the artery 120, whereas the vein 110 may have small or non-detectable pressure pulses. The ultrasound imaging system 100 may provide real-time imaging information of a blood vessel, e.g., size, cross-sectional shape, wall thickness, and movement of the blood vessel or portions thereof. The ultrasound imaging system 100, via processing of the ultrasound images by the one or more logic modules, may provide detection and identification of the vein 110 and/or the artery 120 for general monitoring, diagnostics, or assessment prior to and/or post an intervention. In some embodiments, the one or more logic modules may identify fluid or body cavities (e.g., pleural space); therefore, the discussion provided herein pertaining to detection and identification of blood vessels may also apply to pleural space. It will be understood by those having skill in the art that certain embodiments described herein may only be applicable to blood vessels (e.g., distinguishing an artery from a vein).


In some embodiments, the ultrasound imaging system 100 may be configured to detect and track a position of the needle 112. In one embodiment, the ultrasound imaging system 100 may include positional tracking of the needle 112 with respect to a target, e.g., the vein 110. The clinician may adjust the position of the needle 112 for correct placement of the needle 112 in relation to the vein 110 in response to ultrasound imaging displayed in the ultrasound window 122. In some embodiments, the needle tracking can be implemented using the teachings of one or more of the following: U.S. Pat. Nos. 9,456,766, 9,492,097, 9,521,961, 9,554,716, U.S. Ser. No. 10/524,691, U.S. Ser. No. 10/449,330, and US 2018/0116551, each of which is incorporated by reference in its entirety into this application.



FIG. 2 is a front perspective view of the ultrasound probe 101 including the probe head 102 comprising the ultrasound transducers 130. In some embodiments, the ultrasound probe 101 may include a pressure sensor 240 as a component of the probe head 102. The pressure sensor 240 may be integrated into a front surface of the probe head 102 of the ultrasound probe 101. The pressure sensor 240 may be configured to provide an indication of pressure at the skin surface. In other words, the pressure sensor 240 may indicate a pressure between the probe head 102 and the skin 104 when the clinician urges the ultrasound probe 101 against the skin 104. In some embodiments, the pressure sensor 240 may detect pressure pulses originating beneath the skin surface. In other words, a pressure pulse within the artery 120 may travel through body tissue from the artery 120 to the surface of the skin 104 to be detected by the pressure sensor 240. In some embodiments, the pressure sensor 240 may comprise one or more pressure or force transducers. In other embodiments, the pressure sensor 240 may comprise one or more of a strain gauge, a piezo-resistive strain gauge, a capacitive gauge, an electromagnetic gauge, an optical sensor, or any other suitable device for converting pressure at the probe head 102 into an electrical signal. The pressure sensor 240 may be coupled to the console to provide electrical signals to the console.



FIG. 3A is a side view illustration of the ultrasound probe 101 projecting ultrasound signals 330 through the skin 104 and body tissue 304 of a patient to the vein 110 and the artery 120. As discussed above, the probe head 102 comprises ultrasound transducers 130 and a pressure sensor 240. As shown in this exemplary illustration, the vein 110 has a cross-sectional diameter 310 that is larger than a cross-sectional diameter 320 of the artery 120. As may be anatomically typical, the artery 120 is illustrated at a greater depth from the skin 104 than the vein 110. The difference in depth between the artery 120 and the vein 110 may be used as a differentiating characteristic between the artery 120 and the vein 110. The artery 120 has a higher internal pressure than the vein 110, and pulsating blood pressure within the artery 120 causes the diameter 320 of the artery 120 to expand as shown by an arrow and the dotted lines depicting the pulsatility of the artery 120. Changes of the diameter 320 in real time may be used for identification of the artery 120 and/or for distinguishing the artery 120 from the vein 110 via processing of the ultrasound signals 330 and pressure pulses 340, discussed below.


As described above, in some embodiments, the probe head 102 of the ultrasound probe 101 may include the pressure sensor 240 capable of and configured to detect pressure pulses 340 emanating from the artery 120. FIG. 3A illustrates the pressure pulses 340 traveling through the body tissue 304 to the surface of the skin 104 and the pressure sensor 240. In some embodiments, signals from the pressure sensor 240 may be used to distinguish the artery 120 from the vein 110. In other words, differentiating characteristics between the artery 120 and the vein 110 may include pressure pulsatility as detected by the pressure sensor 240.



FIG. 3B is another side view illustration of the ultrasound probe 101 positioned in relation to the skin 104 of the patient similar to FIG. 3A. FIG. 3B differs from FIG. 3A in that an indicated downward force is applied to the ultrasound probe 101 to provide a compression force, i.e., a compressive pressure 350, to the skin 104. In some embodiments, the compressive pressure 350 may be quantifiably measurable by the pressure sensor 240. As shown, the compressive pressure 350 translates through the body tissue 304 to the vein 110 and the artery 120, causing the vein 110 and the artery 120 to at least partially collapse. In some instances, the compressive pressure 350 may be sufficient to cause the vein 110 and/or the artery 120 to totally collapse, occluding blood flow. In such an instance, the compressive pressure 350, as measured by the pressure sensor 240, may be indicative of a pressure within the vein 110 and/or the artery 120. In some embodiments, the cross-sectional shape of the vein 110 and/or the artery 120 may be detectable in the real-time ultrasound image data. In some embodiments, a difference between a compressive pressure 350 sufficient to cause the vein 110 to collapse and a compressive pressure 350 sufficient to cause the artery 120 to collapse may be a differentiating characteristic between the vein 110 and the artery 120.


In some embodiments, the logic modules of the ultrasound imaging system 100 may be configured to determine a pressure within the vein 110 and/or the artery 120 based on the pressure pulses 340 (and optionally, the ultrasound signals 330). In one instance, the compressive pressure 350 may be sufficient to cause the artery 120 to totally collapse only in the absence of a pressure pulse. In such an instance, the compressive pressure 350 as measured by the pressure sensor 240 may be indicative of a diastolic blood pressure within the artery 120. In a corresponding instance, the compressive pressure 350 may be sufficient to cause the artery 120 to totally collapse and remain collapsed throughout the pressure pulse. In this corresponding instance, the compressive pressure 350 as measured by the pressure sensor 240 may be indicative of a systolic blood pressure within the artery 120.
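The diastolic/systolic reasoning above can be sketched as follows, assuming a procedure (not specified in the disclosure) in which the applied compressive pressure is gradually increased while collapse behavior is observed; the data format and function name are illustrative.

```python
def estimate_blood_pressure(observations):
    """Estimate (diastolic, systolic) pressure from collapse observations.

    `observations` is a list of tuples:
        (applied_pressure_mmHg,
         collapses_between_pulses,   # artery collapsed when no pulse present
         stays_collapsed_in_pulse)   # artery remained collapsed through a pulse

    Per the text: the lowest pressure at which the artery collapses between
    pulses approximates diastolic pressure; the lowest pressure at which it
    remains collapsed throughout a pulse approximates systolic pressure.
    This sketch is illustrative only, not a disclosed algorithm.
    """
    diastolic = None
    systolic = None
    for pressure, between_pulses, during_pulse in sorted(observations):
        if between_pulses and diastolic is None:
            diastolic = pressure
        if during_pulse and systolic is None:
            systolic = pressure
    return diastolic, systolic
```

For example, observations at 60, 85, and 125 mmHg in which collapse first appears between pulses at 85 and persists through pulses at 125 would yield estimates of 85/125.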


Referring to FIG. 4, a block diagram of the ultrasound imaging system 100 in accordance with some embodiments is shown. The console 105 may include a variety of components of the ultrasound imaging system 100. A processor 416 and memory (e.g., non-transitory, computer-readable medium) 418 such as random-access memory (RAM) or non-volatile memory, e.g., electrically erasable programmable read-only memory (EEPROM), may be included in the console 105 for controlling functions of the ultrasound imaging system 100, as well as for executing various logic operations or algorithms during operation of the ultrasound imaging system 100 in accordance with a logic module 420 stored in the memory 418 for execution by the processor 416. For example, the console 105 may be configured to instantiate by way of the logic module 420 one or more processes for adjusting 1) a distance of activated ultrasonic transducers 130 from a predefined target, e.g., a target vein 110 or area, or 2) an orientation of the activated ultrasonic transducers 130 with respect to the predefined target or area, or 3) both the distance and the orientation of the activated ultrasonic transducers 130 with respect to the predefined target or area. Additional operations of the logic module(s) 420 upon execution by the processor 416 are discussed below. The console 105 may also be configured to process electrical signals from the ultrasound probe 101 into ultrasound images. The activated ultrasonic transducers 130 may be adjusted using ultrasound-imaging data, magnetic-field data, shape-sensing data, or a combination thereof received by the console 105. The console 105 may activate certain ultrasonic transducers of a 2-D array of the ultrasonic transducers 130 or convert the already activated transducers into a linear array of the ultrasonic transducers 130.


A digital controller/analog interface 422 may be included with the console 105 and be in communication with both the processor 416 and other system components to govern interfacing between the ultrasound probe 101 and other system components set forth herein. The ultrasound imaging system 100 further includes ports 424 for connection with additional components such as optional components 426 including a printer, storage media, keyboard, etc. The ports 424 may be implemented as universal serial bus (USB) ports, though other types of ports can be used for this connection or any other connections shown or described herein. A power connection 428 is included with the console 105 to enable operable connection to an external power supply 430. An internal power supply 432 (e.g., a battery) may also be employed either with or exclusive of the external power supply 430. Power management circuitry 434 is included with the digital controller/analog interface 422 of the console 105 to regulate power use and distribution. Optionally, a stand-alone optical interrogator 454 may be communicatively coupled to the console 105 by way of one of the ports 424. Alternatively, the console 105 may include an optical interrogator integrated into the console 105. Such an optical interrogator may be configured to emit input optical signals into a companion optical-fiber stylet 456 for shape sensing with the ultrasound imaging system 100. The optical-fiber stylet 456, in turn, may be configured to be inserted into a lumen of a medical device such as the needle 112 (FIG. 1) and may convey the input optical signals from the optical interrogator 454 to a number of fiber Bragg grating (FBG) sensors along a length of the optical-fiber stylet 456. 
The optical interrogator 454 may be also configured to receive reflected optical signals conveyed by the optical-fiber stylet 456 reflected from the number of the FBG sensors, the reflected optical signals may be indicative of a shape of the optical-fiber stylet 456.


The optical interrogator 454 may be configured to convert the reflected optical signals into corresponding electrical signals for processing by the console 105 into distance and orientation information with respect to the target and for dynamically adjusting a distance of the activated ultrasonic transducers 130, an orientation of the activated ultrasonic transducers 130, or both the distance and the orientation of the activated ultrasonic transducers 130 with respect to the target (e.g., the target vein 110, depicted in FIG. 1) or the medical device (e.g., the needle 112, also see FIG. 1) when it is brought into proximity of the target. For example, the distance and orientation of the activated ultrasonic transducers 130 may be adjusted with respect to the vein 110 as the target. An image plane may be established by the activated ultrasonic transducers 130 being perpendicular or parallel to the vein 110 based on the orientation of the vein 110. In another example, when a medical device such as the needle 112 is brought into proximity of the ultrasound probe 101, an image plane can be established by the activated ultrasonic transducers 130 being perpendicular to a medical-device plane including the needle 112. The distance and orientation information may also be used for displaying an iconographic representation of the medical device on the display 404.


The display 404 may be integrated into (or connected to) the console 105 to provide a graphical user interface (GUI) and display information for a clinician in a form of ultrasound images acquired by the ultrasound probe 101. In addition, the ultrasound imaging system 100 may enable the distance and orientation of a magnetized medical device such as the needle 112 to be superimposed in real-time atop an ultrasound image, thus enabling the clinician to accurately guide the magnetized medical device toward an intended target (e.g., the vein 110) and/or away from an artery 120 (FIG. 1). As discussed above, the display 404 may alternatively be separate from the console 105 and communicatively (e.g., wirelessly) coupled thereto. A console button interface 436 may be used to selectively call up a desired mode to the display 404 by the clinician for assistance with an ultrasound-based medical procedure. In some embodiments, the display 404 may be implemented as an LCD device. The ultrasound probe 101 may optionally include an inertial measurement unit (IMU) 458 that may house an accelerometer 460, a gyroscope 462, and a magnetometer 464.


The ultrasound probe 101 may be employed in connection with ultrasound-based visualization of a target such as the vein 110 (FIG. 1) in preparation for inserting the needle 112 or another medical device into the target. Such visualization gives real-time ultrasound guidance and assists in reducing complications typically associated with such insertion, including inadvertent arterial puncture (e.g., of the artery 120), hematoma, pneumothorax, etc. The ultrasound probe 101 may be configured to provide to the console 105 electrical signals corresponding to the ultrasound-imaging data, the magnetic-field data, the shape-sensing data, or a combination thereof for the real-time ultrasound needle guidance.


As stated above, the ultrasound probe 101 includes ultrasonic transducers 130 (see FIG. 1) and a pressure sensor 240 (see FIG. 2). The ultrasound probe 101 may further include a button and memory controller 438. The ultrasound probe 101 may be coupled to the console 105 via a probe input/output (I/O) interface 440 including a button and memory (I/O) interface 444.


The ultrasound imaging system 100 includes at least one logic module 420 configured to perform various operations when executed by the processor 416. The logic module 420 may, when executed by the processor 416, perform operations including receiving ultrasound imaging data, detecting one or more blood vessels within the ultrasound imaging data, identifying a blood vessel as an artery, and generating a visualization from the ultrasound image that renders the blood vessel as an artery. The operations may further include generating an alert indicating to a clinician that a procedure, such as a needle-insertion procedure, may need to be halted or adjusted to prevent puncturing an artery, for example.


Identifying a blood vessel as an artery may include comparing imaging data for the blood vessel with thresholds stored in the memory 418 pertaining to arteries and/or veins. The logic module 420 may further define a result of the comparison that the blood vessel is an artery or a vein according to a level of confidence. In some embodiments, indicia rendered on the display may include the level of confidence, such as 90% confidence, for example.
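The patent does not specify how such a threshold comparison is implemented; as a minimal sketch, the comparison might look like the following, where the function name, the range-based thresholds, and the confidence values are all illustrative assumptions rather than details from the disclosure:

```python
def classify_by_threshold(measured, artery_range, vein_range):
    """Compare a measured characteristic (e.g., wall thickness) to stored
    artery/vein ranges and return a label with a rough confidence level.
    Ranges and confidence values here are illustrative only."""
    in_artery = artery_range[0] <= measured <= artery_range[1]
    in_vein = vein_range[0] <= measured <= vein_range[1]
    if in_artery and not in_vein:
        return "artery", 0.9
    if in_vein and not in_artery:
        return "vein", 0.9
    # Overlapping or out-of-range measurements yield low confidence.
    return "indeterminate", 0.5

# Example: wall thickness in mm (hypothetical ranges)
label, confidence = classify_by_threshold(
    1.1, artery_range=(0.8, 1.5), vein_range=(0.2, 0.7))
```

In practice the stored thresholds would be drawn from the memory 418, and the resulting confidence would feed the indicia rendered on the display.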


In one embodiment, a logic module 420 may be configured to detect a blood vessel size such as a cross-sectional diameter of the blood vessel. The logic module 420 may further compare the size of the vessel to one or more thresholds pertaining to the blood vessel diameter of arteries and/or veins. The logic module 420 may further define a result of the comparison that the blood vessel is an artery or a vein according to a level of confidence.


In one embodiment, a logic module 420 may be configured to detect a vessel wall thickness. The logic module 420 may further compare the vessel wall thickness to one or more thresholds pertaining to the blood vessel wall thicknesses of arteries and/or veins. The logic module 420 may further define a result of the comparison that the blood vessel is an artery or a vein according to a level of confidence.


In one embodiment, a logic module 420 may be configured to detect image pulsatility of a vessel, i.e., the changing of the vessel diameter due to pressure pulsing within the blood vessel. The logic module 420 may further compare the image pulsatility to one or more thresholds pertaining to the image pulsatility of arteries and/or veins. The logic module 420 may further define a result of the comparison that the blood vessel is an artery or a vein according to a level of confidence.


In one embodiment, a logic module 420 may be configured to detect a depth of the blood vessel with respect to the skin surface of the patient. The logic module 420 may further compare the depth to one or more thresholds pertaining to the depth of arteries and/or veins. The logic module 420 may further define a result of the comparison that the blood vessel is an artery or a vein according to a level of confidence.


In one embodiment, a logic module 420 may be configured to detect a difference in depth between a first blood vessel and a second blood vessel. The logic module 420 may further compare the depth difference to one or more thresholds pertaining to the depth of arteries with respect to the depth of veins. The logic module 420 may further define a result of the comparison that the blood vessel is an artery or a vein according to a level of confidence.


In one embodiment, a logic module 420 may be configured to detect a cross-sectional shape of a blood vessel. In some instances, a cross-sectional shape of a vein may be non-round, i.e., oval or flattened, due to the lower pressure within veins, contact with bones or other structures within the body, etc. By way of contrast, the pressure within an artery may generally cause the cross-sectional shape of an artery to be substantially round. The logic module 420 may further compare the cross-sectional shape to one or more thresholds pertaining to the cross-sectional shapes of veins and/or arteries. The logic module 420 may further define a result of the comparison that the blood vessel is an artery or a vein according to a level of confidence.
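One conventional way to quantify how round a cross-section is — not stated in the patent, but a plausible stand-in for a shape threshold — is the isoperimetric quotient (circularity), which is 1.0 for a perfect circle and lower for flattened or oval shapes:

```python
import math

def circularity(area, perimeter):
    """Isoperimetric quotient: 4*pi*A / P^2.
    Equals 1.0 for a perfect circle; decreases as the
    cross-section flattens (e.g., a compressed vein)."""
    return 4.0 * math.pi * area / (perimeter ** 2)

# A circle of radius 3 scores 1.0; a flat 6x1 rectangle scores ~0.38.
round_score = circularity(math.pi * 9.0, 2.0 * math.pi * 3.0)
flat_score = circularity(6.0, 14.0)
```

A circularity above some stored threshold (e.g., near 1.0) would then weigh toward an artery identification, and a low value toward a vein.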


In some embodiments, a logic module 420 may be configured to identify a blood vessel as an artery or as a vein based on one or more differentiating characteristics of the blood vessel within an anatomical frame of reference, wherein the anatomical frame of reference may include an ultrasound image that renders visualization of the blood vessel in relation to other anatomical characteristics such as bones, tendons, ligaments, organs, a shape of an extremity, a skin surface, and/or other blood vessels, for example. In some embodiments, a logic module 420 may be configured to further identify the blood vessel as an artery or as a vein and/or obtain a greater level of confidence by changing the anatomic frame of reference. Changing the anatomic frame of reference may include altering the visualization to include more or fewer other anatomical characteristics and/or generating the ultrasound image at a different angle relative to the blood vessel or fluid or body cavities (e.g., pleural space).


In one embodiment, a logic module 420 may be configured to receive pressure pulsatility data detected by the pressure sensor 240 at the skin surface, and thereby, detect a pressure pulsatility of a blood vessel. The logic module 420 may further compare the pressure pulsatility to one or more thresholds pertaining to the pressure pulsatility of arteries and/or veins. The logic module 420 may further define a result of the comparison that the blood vessel is an artery or a vein according to a level of confidence.


In one embodiment, a logic module 420 may be configured to receive contact pressure data detected by the pressure sensor 240 at the skin surface, the contact pressure data reflecting a compressive pressure of the probe head 102 (see FIG. 3B) against the skin surface. The logic module 420 may detect a cross-sectional shape of a blood vessel in combination with the detection of the compressive pressure. The logic module 420 may further compare the cross-sectional shape of the blood vessel, when a compressive pressure is applied, to one or more thresholds pertaining to the cross-sectional shapes of veins and/or arteries when a compressive pressure is applied. The logic module 420 may further define a result of the comparison that the blood vessel is an artery or a vein according to a level of confidence.
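Clinically, veins tend to collapse readily under probe pressure while arteries resist compression. A simple sketch of how compressed versus resting diameter might be turned into a classification — the function names and the 0.5 collapse threshold are assumptions for illustration — is:

```python
def compressibility_score(d_rest, d_compressed):
    """Fractional reduction in vessel diameter when the probe head
    applies a known compressive pressure to the skin surface."""
    return (d_rest - d_compressed) / d_rest

def classify_by_compression(d_rest, d_compressed, collapse_threshold=0.5):
    """Veins typically collapse under modest probe pressure;
    arteries largely retain their diameter."""
    score = compressibility_score(d_rest, d_compressed)
    return "vein" if score >= collapse_threshold else "artery"

# A vessel that collapses from 4.0 mm to 1.0 mm behaves like a vein.
result = classify_by_compression(4.0, 1.0)
```

The measured diameters would come from the ultrasound imaging data, gated by the contact pressure reported by the pressure sensor 240.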


In one embodiment, a logic module 420 may be configured to receive needle-tracking imaging data in combination with identifying a blood vessel. In some embodiments, the logic module 420 may utilize needle tracking to facilitate selection of a blood vessel as a target, such as a target vein. In some embodiments, the clinician may select the target blood vessel by positioning the needle 112 in close proximity to the target blood vessel, i.e., within a predetermined threshold distance from the target blood vessel. Once selected, the logic module 420 may provide feedback to the clinician that the target blood vessel is selected, which feedback may include rendering visualization indicia indicating that the target blood vessel is selected.


In some embodiments, the logic module 420 may detect a position of a tip of the needle 112 (see FIG. 1) with respect to an identified artery. The logic module 420 may further compare the position of the tip to one or more thresholds pertaining to a perimeter of the identified artery, such as a safe distance away from the identified artery, for example. The logic module 420 may further generate feedback to the clinician if the position of the needle tip exceeds a perimeter threshold. In some embodiments, the feedback may include rendering visual indicia, such as a text notification or an arrow indicating a direction to move the needle 112, for example. As may be appreciated by one of ordinary skill, the logic module 420 may be configured to provide other feedback to the clinician in the forms of visual indicia, audio alerts, etc., pertaining to aspects of a needle position with respect to a blood vessel, which other feedback is included in this disclosure.
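A perimeter-threshold check of this kind reduces to a distance test in the image plane. The following sketch assumes a circular artery cross-section and 2D coordinates; the geometry, names, and alert format are illustrative, not taken from the patent:

```python
import math

def needle_alert(tip_xy, artery_center_xy, artery_radius, safety_margin):
    """Alert if the needle tip enters the safety perimeter (artery
    radius plus margin) around an identified artery, and report the
    direction pointing away from the artery center."""
    dx = tip_xy[0] - artery_center_xy[0]
    dy = tip_xy[1] - artery_center_xy[1]
    dist = math.hypot(dx, dy)
    if dist < artery_radius + safety_margin:
        away_deg = math.degrees(math.atan2(dy, dx))
        return {"alert": True, "move_away_deg": away_deg}
    return {"alert": False}

# Tip 2.5 mm from the center of a 2 mm-radius artery with a 1 mm margin
# lies inside the 3 mm perimeter and triggers an alert.
status = needle_alert((0.0, 2.5), (0.0, 0.0), 2.0, 1.0)
```

The returned direction could drive the on-screen arrow indicating which way to move the needle 112.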


The logic module 420 may combine individual confidence levels (i.e., confidence levels associated with individual differentiating characteristics) to produce a combined confidence level that is greater than any of the individual confidence levels.
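The patent does not name a combination rule. If each individual confidence is treated as an independent probability of a correct identification, a noisy-OR combination has exactly the stated property — the combined confidence is never lower than the largest individual one. A sketch under that assumption:

```python
def combine_confidences(confidences):
    """Noisy-OR combination of per-characteristic confidences.
    The result is 1 minus the probability that every individual
    identification is wrong, so it is always >= max(confidences)."""
    p_all_wrong = 1.0
    for c in confidences:
        p_all_wrong *= (1.0 - c)
    return 1.0 - p_all_wrong

# Diameter (0.8), wall thickness (0.7), and pulsatility (0.6)
# combine to a higher overall confidence.
combined = combine_confidences([0.8, 0.7, 0.6])
```

Here 1 − (0.2 × 0.3 × 0.4) = 0.976, greater than any of the three inputs, matching the behavior described above.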


The ultrasound imaging system 100 may include an artificial intelligence (AI) module 405 (which may be a sub-module of the logic module 420) that may be employed for identifying a blood vessel as an artery or a vein or otherwise distinguishing an artery from a vein. The AI module 405 may be integrated into the console 105, coupled to the console 105, or accessed remotely on a separate server. The AI module 405 may be configured to receive and process training data sets that include data pertaining to differentiating characteristics, which may comprise size or diameter of the blood vessel, position of the blood vessel relative to a skin surface or other body structure, relative position between adjacent blood vessels, motion or changing diameter of the blood vessel in response to pressure pulsing within the blood vessel, cross-sectional shape of the blood vessel, cross-sectional shape of the blood vessel in response to applied pressure, and wall thickness of the blood vessel.


In one example, processing of the AI module 405 may include generation of a machine-learning (ML) model and training of the ML model using the received one or more training data sets. The ML model may then be deployed to score ultrasound signals and/or pressure pulse data to detect and/or identify particular targets, such as blood vessels and, more specifically, veins or arteries. In some embodiments, the trained ML model may be stored in the memory 418.
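The disclosure does not specify the model family. As a deliberately minimal stand-in, a nearest-centroid classifier over the differentiating characteristics illustrates the fit/predict workflow described above; the class name, feature ordering, and training values are all hypothetical:

```python
class NearestCentroidVesselModel:
    """Toy stand-in for the trained ML model: classifies a feature
    vector (e.g., [diameter_mm, wall_thickness_mm]) by squared
    distance to per-class centroids learned from labeled data."""

    def fit(self, features, labels):
        sums, counts = {}, {}
        for x, y in zip(features, labels):
            acc = sums.setdefault(y, [0.0] * len(x))
            for i, v in enumerate(x):
                acc[i] += v
            counts[y] = counts.get(y, 0) + 1
        # Per-class mean of each feature dimension.
        self.centroids = {
            y: [s / counts[y] for s in acc] for y, acc in sums.items()
        }
        return self

    def predict(self, x):
        def dist2(centroid):
            return sum((a - b) ** 2 for a, b in zip(x, centroid))
        return min(self.centroids, key=lambda y: dist2(self.centroids[y]))

# Hypothetical training data: [diameter_mm, wall_thickness_mm]
model = NearestCentroidVesselModel().fit(
    [[4.0, 1.2], [3.8, 1.1], [6.0, 0.4], [6.2, 0.5]],
    ["artery", "artery", "vein", "vein"],
)
```

A production system would instead train on large annotated ultrasound data sets and could persist the fitted model in the memory 418 for deployment-time scoring.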


The AI module 405 may apply algorithms or other logic operations to define a set of thresholds for any or all of the differentiating characteristics. The AI module 405 may define an initial default set of thresholds via an initial training set of AI data. The logic module 420 may apply default thresholds defined by the AI module 405 in identifying a blood vessel as an artery or a vein or otherwise distinguishing an artery from a vein. In some instances, the AI module 405 may incorporate additional AI data to add precision to the set of thresholds or otherwise redefine a set of thresholds. In some instances, the AI module 405 may define additional differentiating characteristics together with associated thresholds.


In some embodiments, the logic module 420 may store, monitor, and/or analyze data pertaining to confidence levels generated during comparison of real-time data with a set of thresholds. The logic module 420 may then initiate a process of incorporating additional imaging data into the AI process to generate a refined set of thresholds.


Embodiments of the invention may be embodied in other specific forms without departing from the spirit of the present disclosure. The described embodiments are to be considered in all respects only as illustrative, not restrictive. The scope of the embodiments is, therefore, indicated by the appended claims rather than by the foregoing description. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims
  • 1. An ultrasound imaging system comprising: an ultrasound probe comprising a plurality of ultrasound transducers configured to acquire ultrasound images;a processor; andnon-transitory computer-readable medium having stored thereon logic that, when executed by the processor, is configured to perform operations including: receiving ultrasound imaging data,detecting one or more blood vessels within the ultrasound imaging data,identifying at least one blood vessel of the one or more blood vessels as an artery or as a vein,rendering a visualization of at least a subset of the one or more blood vessels on a display,tracking a position of a needle tip in relation to the one or more blood vessels, andgenerating an alert to a clinician if the needle tip is positioned within a perimeter threshold of the artery.
  • 2. The ultrasound imaging system of claim 1, wherein the logic that, when executed by the processor, causes performances of further operations including: identifying the at least one blood vessel as the artery based on at least one differentiating characteristic of a plurality of differentiating characteristics of blood vessel type.
  • 3. The ultrasound imaging system of claim 2, wherein the at least one differentiating characteristic of the plurality of differentiating characteristics is employed when identifying a blood vessel, the plurality of differentiating characteristics comprising: a diameter of the blood vessel,a wall thickness of the blood vessel,an image pulsatility of the blood vessel,a depth of the blood vessel with respect to a skin surface of a patient,a location of the blood vessel in relation to a location of a second blood vessel, ora cross-sectional shape of the blood vessel.
  • 4. The ultrasound imaging system of claim 3, wherein at least two differentiating characteristics are employed when identifying the at least one blood vessel.
  • 5. The ultrasound imaging system of claim 1, wherein the logic defines one or more thresholds employed when identifying the at least one blood vessel.
  • 6. The ultrasound imaging system of claim 5, wherein identifying the at least one blood vessel includes identifying at least one differentiating characteristic of the blood vessel within the ultrasound imaging data.
  • 7. The ultrasound imaging system of claim 6, wherein identifying the blood vessel includes comparing the ultrasound imaging data pertaining to the at least one differentiating characteristic to one or more thresholds defined by the logic.
  • 8. The ultrasound imaging system of claim 7, wherein a result of comparing the ultrasound imaging data to the one or more thresholds is a confidence level for identification of the at least one blood vessel.
  • 9. The ultrasound imaging system of claim 8, wherein the ultrasound imaging data includes image pulsatility data of the blood vessel, and wherein identifying the blood vessel includes comparing the image pulsatility data to one or more image pulsatility thresholds to obtain the confidence level for the identification of the at least one blood vessel.
  • 10. The ultrasound imaging system of claim 3, wherein the ultrasound probe comprises a pressure sensor configured to obtain pressure pulsatility data of the at least one blood vessel.
  • 11. The ultrasound imaging system of claim 10, wherein the logic, when executed by the processor, causes performance of further operations including receiving pressure pulsatility data in coordination with receiving the ultrasound imaging data.
  • 12. The ultrasound imaging system of claim 11, wherein the plurality of differentiating characteristics further comprises a pressure pulsatility of the at least one blood vessel.
  • 13. The ultrasound imaging system of claim 12, wherein identifying the at least one blood vessel includes comparing the pressure pulsatility data of the at least one blood vessel to one or more pressure pulsatility thresholds to obtain a confidence level for identification of the at least one blood vessel.
  • 14. The ultrasound imaging system of claim 13, wherein comparing the pressure pulsatility data is performed in combination with comparing image pulsatility data to obtain a combined confidence level for identification of the at least one blood vessel.
  • 15. The ultrasound imaging system of claim 1, wherein rendering the visualization of the one or more blood vessels on the display includes rendering indicia on the display to indicate to the clinician whether the at least one blood vessel is the artery.
  • 16. The ultrasound imaging system of claim 1, wherein generating the alert includes rendering indicia on the display that includes a text notification or an arrow indicating a direction to move the needle tip away from the artery.
  • 17. The ultrasound imaging system of claim 1, wherein the logic includes an artificial intelligence module configured to generate and train a model for scoring the ultrasound imaging data and pressure pulsatility data in order to detect the at least one blood vessel and identify the at least one blood vessel as either the artery or the vein.
  • 18. The ultrasound imaging system of claim 17, wherein the artificial intelligence module is further configured to utilize one or more of a plurality of differentiating characteristics.
  • 19. An ultrasound imaging system comprising: an ultrasound probe comprising a plurality of ultrasound transducers configured to acquire ultrasound images;a processor; andnon-transitory computer-readable medium having stored thereon logic that, when executed by the processor, is configured to perform operations including: receiving ultrasound imaging data,detecting one or more blood vessels within the ultrasound imaging data,identifying at least one blood vessel of the one or more blood vessels as an artery or as a vein, andrendering a visualization of at least a subset of the one or more blood vessels on a display,wherein the logic includes an artificial intelligence module configured to generate and train a model for scoring the ultrasound imaging data and pressure pulsatility data in order to detect the at least one blood vessel and identify the at least one blood vessel as either the artery or the vein.
  • 20. The ultrasound imaging system of claim 19, wherein the logic that, when executed by the processor, causes performances of further operations including: identifying the at least one blood vessel as the artery based on at least one differentiating characteristic of a plurality of differentiating characteristics of blood vessel type.
  • 21. The ultrasound imaging system of claim 20, wherein the at least one differentiating characteristic of a plurality of differentiating characteristics is employed when identifying a blood vessel, the plurality of differentiating characteristics comprising: a diameter of the blood vessel,a wall thickness of the blood vessel,an image pulsatility of the blood vessel,a depth of the blood vessel with respect to a skin surface of a patient,a location of the blood vessel in relation to a location of a second blood vessel, ora cross-sectional shape of the blood vessel.
  • 22. The ultrasound imaging system of claim 21, wherein at least two differentiating characteristics are employed when identifying the blood vessel.
  • 23. The ultrasound imaging system of claim 19, wherein the logic defines one or more thresholds employed when identifying the at least one blood vessel.
  • 24. The ultrasound imaging system of claim 19, wherein identifying a blood vessel includes identifying at least one differentiating characteristic of the blood vessel within the ultrasound imaging data.
PRIORITY

This application claims the benefit of priority to U.S. Provisional Application No. 63/120,053, filed Dec. 1, 2020, which is incorporated by reference in its entirety into this application.

US Referenced Citations (357)
Number Name Date Kind
3697917 Orth et al. Oct 1972 A
5148809 Biegeleisen-Knight et al. Sep 1992 A
5181513 Touboul et al. Jan 1993 A
5325293 Dorne Jun 1994 A
5349865 Kavli et al. Sep 1994 A
5441052 Miyajima Aug 1995 A
5549554 Miraki Aug 1996 A
5573529 Haak et al. Nov 1996 A
5775322 Silverstein et al. Jul 1998 A
5879297 Haynor et al. Mar 1999 A
5897503 Lyon et al. Apr 1999 A
5908387 LeFree et al. Jun 1999 A
5967984 Chu et al. Oct 1999 A
5970119 Hofmann Oct 1999 A
6004270 Urbano et al. Dec 1999 A
6019724 Gronningsaeter et al. Feb 2000 A
6068599 Saito et al. May 2000 A
6074367 Hubbell Jun 2000 A
6129668 Haynor et al. Oct 2000 A
6132379 Patacsil et al. Oct 2000 A
6216028 Haynor et al. Apr 2001 B1
6233476 Strommer et al. May 2001 B1
6245018 Lee Jun 2001 B1
6263230 Haynor et al. Jul 2001 B1
6375615 Flaherty et al. Apr 2002 B1
6436043 Bonnefous Aug 2002 B2
6498942 Esenaliev et al. Dec 2002 B1
6503205 Manor et al. Jan 2003 B2
6508769 Bonnefous Jan 2003 B2
6511458 Milo et al. Jan 2003 B2
6524249 Moehring et al. Feb 2003 B2
6543642 Milliorn Apr 2003 B1
6554771 Buil et al. Apr 2003 B1
6592520 Peszynski et al. Jul 2003 B1
6592565 Twardowski Jul 2003 B2
6601705 Molina et al. Aug 2003 B2
6612992 Hossack et al. Sep 2003 B1
6613002 Clark et al. Sep 2003 B1
6623431 Sakuma et al. Sep 2003 B1
6641538 Nakaya et al. Nov 2003 B2
6647135 Bonnefous Nov 2003 B2
6687386 Ito et al. Feb 2004 B1
6733458 Steins et al. May 2004 B1
6749569 Pellegretti Jun 2004 B1
6754608 Svanerudh et al. Jun 2004 B2
6755789 Stringer et al. Jun 2004 B2
6840379 Franks-Farah et al. Jan 2005 B2
6857196 Dalrymple Feb 2005 B2
6979294 Selzer et al. Dec 2005 B1
7074187 Selzer et al. Jul 2006 B2
7244234 Ridley et al. Jul 2007 B2
7359554 Klingensmith et al. Apr 2008 B2
7534209 Abend et al. May 2009 B2
7599730 Hunter et al. Oct 2009 B2
7637870 Flaherty et al. Dec 2009 B2
7681579 Schwartz Mar 2010 B2
7691061 Hirota Apr 2010 B2
7699779 Sasaki et al. Apr 2010 B2
7720520 Willis May 2010 B2
7727153 Fritz et al. Jun 2010 B2
7734326 Pedain et al. Jun 2010 B2
7831449 Ying et al. Nov 2010 B2
7905837 Suzuki Mar 2011 B2
7925327 Weese Apr 2011 B2
7927278 Selzer et al. Apr 2011 B2
8014848 Birkenbach et al. Sep 2011 B2
8038619 Steinbacher Oct 2011 B2
8060181 Rodriguez Ponce et al. Nov 2011 B2
8075488 Burton Dec 2011 B2
8090427 Eck et al. Jan 2012 B2
8105239 Specht Jan 2012 B2
8172754 Watanabe et al. May 2012 B2
8175368 Sathyanarayana May 2012 B2
8200313 Rambod et al. Jun 2012 B1
8211023 Swan et al. Jul 2012 B2
8228347 Beasley et al. Jul 2012 B2
8298147 Huennekens et al. Oct 2012 B2
8303505 Webler et al. Nov 2012 B2
8323202 Roschak et al. Dec 2012 B2
8328727 Miele et al. Dec 2012 B2
8388541 Messerly et al. Mar 2013 B2
8409103 Grunwald et al. Apr 2013 B2
8449465 Nair et al. May 2013 B2
8553954 Saikia Oct 2013 B2
8556815 Pelissier et al. Oct 2013 B2
8585600 Liu et al. Nov 2013 B2
8622913 Dentinger et al. Jan 2014 B2
8706457 Hart et al. Apr 2014 B2
8727988 Flaherty et al. May 2014 B2
8734357 Taylor May 2014 B2
8744211 Owen Jun 2014 B2
8754865 Merritt et al. Jun 2014 B2
8764663 Smok et al. Jul 2014 B2
8781194 Malek et al. Jul 2014 B2
8781555 Burnside et al. Jul 2014 B2
8790263 Randall et al. Jul 2014 B2
8849382 Cox et al. Sep 2014 B2
8939908 Suzuki et al. Jan 2015 B2
8961420 Zhang Feb 2015 B2
9022940 Meier May 2015 B2
9138290 Hadjicostis Sep 2015 B2
9199082 Yared et al. Dec 2015 B1
9204858 Pelissier et al. Dec 2015 B2
9220477 Urabe et al. Dec 2015 B2
9295447 Shah Mar 2016 B2
9320493 Visveshwara Apr 2016 B2
9357980 Toji et al. Jun 2016 B2
9364171 Harris et al. Jun 2016 B2
9427207 Sheldon et al. Aug 2016 B2
9445780 Hossack et al. Sep 2016 B2
9456766 Cox et al. Oct 2016 B2
9456804 Tamada Oct 2016 B2
9468413 Hall et al. Oct 2016 B2
9492097 Wilkes et al. Nov 2016 B2
9521961 Silverstein et al. Dec 2016 B2
9554716 Burnside et al. Jan 2017 B2
9582876 Specht Feb 2017 B2
9610061 Ebbini et al. Apr 2017 B2
9636031 Cox May 2017 B2
9649037 Lowe et al. May 2017 B2
9649048 Cox et al. May 2017 B2
9702969 Hope Simpson et al. Jul 2017 B2
9715757 Ng et al. Jul 2017 B2
9717415 Cohen et al. Aug 2017 B2
9731066 Liu et al. Aug 2017 B2
9814433 Benishti et al. Nov 2017 B2
9814531 Yagi et al. Nov 2017 B2
9861337 Patwardhan et al. Jan 2018 B2
9895138 Sasaki Feb 2018 B2
9913605 Harris et al. Mar 2018 B2
9949720 Southard et al. Apr 2018 B2
10043272 Forzoni et al. Aug 2018 B2
10449330 Newman et al. Oct 2019 B2
10524691 Newman et al. Jan 2020 B2
10751509 Misener Aug 2020 B2
11564861 Gaines Jan 2023 B1
20020038088 Imran et al. Mar 2002 A1
20030047126 Tomaschko Mar 2003 A1
20030106825 Molina et al. Jun 2003 A1
20030120154 Sauer et al. Jun 2003 A1
20030125629 Ustuner Jul 2003 A1
20030135115 Burdette et al. Jul 2003 A1
20030149366 Stringer et al. Aug 2003 A1
20040015080 Kelly et al. Jan 2004 A1
20040055925 Franks-Farah et al. Mar 2004 A1
20040197267 Black et al. Oct 2004 A1
20050000975 Carco et al. Jan 2005 A1
20050049504 Lo et al. Mar 2005 A1
20050165299 Kressy et al. Jul 2005 A1
20050251030 Azar et al. Nov 2005 A1
20050267365 Sokulin et al. Dec 2005 A1
20060004290 Smith et al. Jan 2006 A1
20060013523 Childlers et al. Jan 2006 A1
20060015039 Cassidy et al. Jan 2006 A1
20060020204 Serra et al. Jan 2006 A1
20060047617 Bacioiu et al. Mar 2006 A1
20060079781 Germond-Rouet et al. Apr 2006 A1
20060184029 Haim et al. Aug 2006 A1
20060210130 Germond-Rouet et al. Sep 2006 A1
20060241463 Shau et al. Oct 2006 A1
20070043341 Anderson et al. Feb 2007 A1
20070049822 Bunce et al. Mar 2007 A1
20070073155 Park et al. Mar 2007 A1
20070167738 Timinger et al. Jul 2007 A1
20070199848 Ellswood et al. Aug 2007 A1
20070239120 Brock et al. Oct 2007 A1
20070249911 Simon Oct 2007 A1
20070287886 Saadat Dec 2007 A1
20080021322 Stone et al. Jan 2008 A1
20080033293 Beasley et al. Feb 2008 A1
20080033759 Finlay Feb 2008 A1
20080051657 Rold Feb 2008 A1
20080108930 Weitzel et al. May 2008 A1
20080125651 Watanabe et al. May 2008 A1
20080146915 McMorrow Jun 2008 A1
20080177186 Slater et al. Jul 2008 A1
20080221425 Olson et al. Sep 2008 A1
20080269605 Nakaya Oct 2008 A1
20080294037 Richter Nov 2008 A1
20080300491 Bonde et al. Dec 2008 A1
20090012399 Sunagawa et al. Jan 2009 A1
20090012401 Steinbacher Jan 2009 A1
20090074280 Lu et al. Mar 2009 A1
20090124903 Osaka May 2009 A1
20090137887 Shariati et al. May 2009 A1
20090137907 Takimoto May 2009 A1
20090143672 Harms et al. Jun 2009 A1
20090143684 Cermak et al. Jun 2009 A1
20090156926 Messerly et al. Jun 2009 A1
20090281413 Boyden et al. Nov 2009 A1
20090306509 Pedersen et al. Dec 2009 A1
20100010348 Halmann Jan 2010 A1
20100211026 Sheetz et al. Aug 2010 A2
20100249598 Smith et al. Sep 2010 A1
20100286515 Gravenstein et al. Nov 2010 A1
20100312121 Guan Dec 2010 A1
20100324423 El-Aklouk et al. Dec 2010 A1
20110002518 Ziv-Ari et al. Jan 2011 A1
20110026796 Hyun et al. Feb 2011 A1
20110071404 Schmitt et al. Mar 2011 A1
20110074244 Osawa Mar 2011 A1
20110087107 Lindekugel et al. Apr 2011 A1
20110166451 Blaivas et al. Jul 2011 A1
20110282188 Burnside et al. Nov 2011 A1
20110295108 Cox et al. Dec 2011 A1
20110313293 Lindekugel et al. Dec 2011 A1
20120165679 Orome et al. Jun 2012 A1
20120179038 Meurer et al. Jul 2012 A1
20120179042 Fukumoto et al. Jul 2012 A1
20120179044 Chiang et al. Jul 2012 A1
20120197132 O'Connor Aug 2012 A1
20120220865 Brown et al. Aug 2012 A1
20120277576 Lui Nov 2012 A1
20130041250 Pelissier Feb 2013 A1
20130102889 Southard et al. Apr 2013 A1
20130131499 Chan et al. May 2013 A1
20130131502 Blaivas et al. May 2013 A1
20130150724 Blaivas et al. Jun 2013 A1
20130188832 Ma et al. Jul 2013 A1
20130197367 Smok et al. Aug 2013 A1
20130218024 Boctor et al. Aug 2013 A1
20130323700 Samosky et al. Dec 2013 A1
20130338503 Cohen et al. Dec 2013 A1
20130338508 Nakamura et al. Dec 2013 A1
20130345566 Weitzel et al. Dec 2013 A1
20140005530 Liu et al. Jan 2014 A1
20140031694 Solek Jan 2014 A1
20140066779 Nakanishi Mar 2014 A1
20140073976 Fonte et al. Mar 2014 A1
20140100440 Cheline et al. Apr 2014 A1
20140114194 Kanayama et al. Apr 2014 A1
20140170620 Savitsky et al. Jun 2014 A1
20140180098 Flaherty et al. Jun 2014 A1
20140180116 Lindekugel et al. Jun 2014 A1
20140188133 Misener Jul 2014 A1
20140188440 Donhowe et al. Jul 2014 A1
20140276059 Sheehan Sep 2014 A1
20140276069 Amble et al. Sep 2014 A1
20140276081 Tegels Sep 2014 A1
20140276085 Miller Sep 2014 A1
20140276690 Grace Sep 2014 A1
20140343431 Vajinepalli et al. Nov 2014 A1
20140357994 Jin et al. Dec 2014 A1
20150005738 Blacker Jan 2015 A1
20150011887 Ahn et al. Jan 2015 A1
20150065916 Maguire et al. Mar 2015 A1
20150073279 Cai et al. Mar 2015 A1
20150112200 Oberg et al. Apr 2015 A1
20150141821 Yoshikawa et al. May 2015 A1
20150190111 Fry Jul 2015 A1
20150209113 Burkholz et al. Jul 2015 A1
20150209510 Burkholz et al. Jul 2015 A1
20150209526 Matsubara et al. Jul 2015 A1
20150257735 Ball et al. Sep 2015 A1
20150282890 Cohen et al. Oct 2015 A1
20150294497 Ng et al. Oct 2015 A1
20150297097 Matsubara et al. Oct 2015 A1
20150342572 Tahmasebi Maraghoosh et al. Dec 2015 A1
20150359520 Shan et al. Dec 2015 A1
20150359991 Dunbar et al. Dec 2015 A1
20160000367 Lyon Jan 2016 A1
20160026894 Nagase Jan 2016 A1
20160029995 Navratil et al. Feb 2016 A1
20160113699 Sverdlik et al. Apr 2016 A1
20160120607 Sorotzkin et al. May 2016 A1
20160157831 Kang et al. Jun 2016 A1
20160166232 Merritt Jun 2016 A1
20160202053 Walker et al. Jul 2016 A1
20160211045 Jeon et al. Jul 2016 A1
20160213398 Liu Jul 2016 A1
20160220124 Grady et al. Aug 2016 A1
20160259992 Knodt et al. Sep 2016 A1
20160278869 Grunwald Sep 2016 A1
20160287214 Ralovich et al. Oct 2016 A1
20160296208 Sethuraman et al. Oct 2016 A1
20160374644 Mauldin, Jr. et al. Dec 2016 A1
20170014105 Chono Jan 2017 A1
20170020561 Cox et al. Jan 2017 A1
20170079548 Silverstein et al. Mar 2017 A1
20170143312 Hedlund et al. May 2017 A1
20170164923 Matsumoto Jun 2017 A1
20170172666 Govari et al. Jun 2017 A1
20170215842 Ryu et al. Aug 2017 A1
20170252004 Broad et al. Sep 2017 A1
20170328751 Lemke Nov 2017 A1
20170367678 Sirtori et al. Dec 2017 A1
20180015256 Southard et al. Jan 2018 A1
20180116723 Hettrick et al. May 2018 A1
20180125450 Blackbourne et al. May 2018 A1
20180161502 Nanan et al. Jun 2018 A1
20180199914 Ramachandran et al. Jul 2018 A1
20180214119 Mehrmohammadi et al. Aug 2018 A1
20180228465 Southard et al. Aug 2018 A1
20180235649 Elkadi Aug 2018 A1
20180235709 Donhowe et al. Aug 2018 A1
20180289927 Messerly Oct 2018 A1
20180296185 Cox et al. Oct 2018 A1
20180310955 Lindekugel et al. Nov 2018 A1
20180344293 Raju et al. Dec 2018 A1
20190060001 Kohli et al. Feb 2019 A1
20190060014 Hazelton et al. Feb 2019 A1
20190090855 Kobayashi et al. Mar 2019 A1
20190125210 Govari et al. May 2019 A1
20190200951 Meier Jul 2019 A1
20190239848 Bedi et al. Aug 2019 A1
20190307419 Durfee Oct 2019 A1
20190307515 Naito et al. Oct 2019 A1
20190365347 Abe Dec 2019 A1
20190365348 Toume et al. Dec 2019 A1
20190365354 Du Dec 2019 A1
20200069929 Mason et al. Mar 2020 A1
20200113540 Gijsbers et al. Apr 2020 A1
20200163654 Satir et al. May 2020 A1
20200200900 Asami et al. Jun 2020 A1
20200229795 Tadross et al. Jul 2020 A1
20200230391 Burkholz et al. Jul 2020 A1
20200237403 Southard et al. Jul 2020 A1
20200281563 Muller et al. Sep 2020 A1
20200359990 Poland et al. Nov 2020 A1
20200390416 Swan Dec 2020 A1
20210059639 Howell Mar 2021 A1
20210077058 Mashood et al. Mar 2021 A1
20210137492 Imai May 2021 A1
20210161510 Sasaki et al. Jun 2021 A1
20210186467 Urabe et al. Jun 2021 A1
20210212668 Li et al. Jul 2021 A1
20210267570 Ulman et al. Sep 2021 A1
20210295048 Buras et al. Sep 2021 A1
20210315538 Brandl et al. Oct 2021 A1
20210378627 Yarmush et al. Dec 2021 A1
20220039777 Durfee Feb 2022 A1
20220039829 Zijlstra et al. Feb 2022 A1
20220071593 Tran Mar 2022 A1
20220096053 Sethuraman et al. Mar 2022 A1
20220096797 Prince Mar 2022 A1
20220104791 Matsumoto Apr 2022 A1
20220104886 Blanchard et al. Apr 2022 A1
20220117582 McLaughlin et al. Apr 2022 A1
20220160434 Messerly et al. May 2022 A1
20220168050 Sowards et al. Jun 2022 A1
20220296303 McLeod et al. Sep 2022 A1
20220330922 Sowards et al. Oct 2022 A1
20220334251 Sowards et al. Oct 2022 A1
20220361840 Matsumoto et al. Nov 2022 A1
20230107629 Sowards et al. Apr 2023 A1
20230132148 Sowards et al. Apr 2023 A1
20230135562 Misener et al. May 2023 A1
20230138970 Sowards et al. May 2023 A1
20230148872 Sowards et al. May 2023 A1
20230201539 Howell Jun 2023 A1
20230277153 Sowards et al. Sep 2023 A1
20230277154 Sowards et al. Sep 2023 A1
20230293143 Sowards et al. Sep 2023 A1
20230338010 Sturm Oct 2023 A1
20230371928 Rajguru et al. Nov 2023 A1
20230397900 Prince Dec 2023 A1
20240065673 Sowards et al. Feb 2024 A1
Foreign Referenced Citations (48)
Number Date Country
102871645 Jan 2013 CN
105107067 May 2018 CN
0933063 Aug 1999 EP
1504713 Feb 2005 EP
1591074 May 2008 EP
2823766 Jan 2015 EP
3181083 Jun 2017 EP
3870059 Sep 2021 EP
2000271136 Oct 2000 JP
2007222291 Sep 2007 JP
2014150928 Aug 2014 JP
2018175547 Nov 2018 JP
20180070878 Jun 2018 KR
102176196 Nov 2020 KR
2010029521 Mar 2010 WO
2010076808 Jul 2010 WO
2013059714 Apr 2013 WO
2014115150 Jul 2014 WO
2015017270 Feb 2015 WO
2016081023 May 2016 WO
2017096487 Jun 2017 WO
2017214428 Dec 2017 WO
2018026878 Feb 2018 WO
2018134726 Jul 2018 WO
2019232451 Dec 2019 WO
2020002620 Jan 2020 WO
2020016018 Jan 2020 WO
2019232454 Feb 2020 WO
2020044769 Mar 2020 WO
2020067897 Apr 2020 WO
2020083660 Apr 2020 WO
2020186198 Sep 2020 WO
2021198226 Oct 2021 WO
2022072727 Apr 2022 WO
2022081904 Apr 2022 WO
2022119853 Jun 2022 WO
2022115479 Jun 2022 WO
2022119856 Jun 2022 WO
2022221703 Oct 2022 WO
2022221714 Oct 2022 WO
2023059512 Apr 2023 WO
2023076268 May 2023 WO
2023081220 May 2023 WO
2023081223 May 2023 WO
2023091424 May 2023 WO
2023167866 Sep 2023 WO
2023177718 Sep 2023 WO
2024044277 Feb 2024 WO
Non-Patent Literature Citations (84)
Entry
U.S. Appl. No. 17/020,476, filed Sep. 14, 2020 Advisory Action dated Aug. 19, 2022.
U.S. Appl. No. 17/020,476, filed Sep. 14, 2020 Non-Final Office Action dated Sep. 23, 2022.
U.S. Appl. No. 17/471,015, filed Sep. 9, 2021 Non-Final Office Action dated Aug. 16, 2022.
PCT/US2022/025097 filed Apr. 15, 2022 International Preliminary Report on Patentability dated Oct. 26, 2023.
PCT/US2023/030970 filed Aug. 23, 2023 International Search Report and Written Opinion dated Oct. 30, 2023.
U.S. Appl. No. 17/468,318, filed Sep. 7, 2021 Advisory Action dated Nov. 6, 2023.
U.S. Appl. No. 17/468,318, filed Sep. 7, 2021 Notice of Allowance dated Jan. 18, 2024.
U.S. Appl. No. 17/534,099, filed Nov. 23, 2021 Advisory Action dated Dec. 8, 2023.
U.S. Appl. No. 17/538,911, filed Nov. 30, 2021 Advisory Action dated Nov. 22, 2023.
U.S. Appl. No. 17/684,180, filed Mar. 1, 2022 Final Office Action dated Jan. 18, 2024.
U.S. Appl. No. 17/722,111, filed Apr. 15, 2022 Non-Final Office Action dated Dec. 22, 2023.
U.S. Appl. No. 17/722,151, filed Apr. 15, 2022 Advisory Action dated Jan. 2, 2024.
U.S. Appl. No. 17/722,151, filed Apr. 15, 2022 Final Office Action dated Nov. 6, 2023.
U.S. Appl. No. 17/894,460, filed Aug. 24, 2022 Non-Final Office Action dated Nov. 6, 2023.
Lu, Zhenyu et al. "Recent advances in robot-assisted echography: combining perception, control and cognition," Cognitive Computation and Systems, The Institution of Engineering and Technology, Michael Faraday House, Six Hills Way, Stevenage, Herts. SG1 2AY, UK, vol. 2, No. 3, Sep. 2, 2020.
PCT/US2021/045218 filed Aug. 9, 2021 International Search Report and Written Opinion dated Nov. 23, 2021.
PCT/US2021/049123 filed Sep. 3, 2021 International Search Report and Written Opinion dated Feb. 4, 2022.
PCT/US2021/053018 filed Sep. 30, 2021 International Search Report and Written Opinion dated May 3, 2022.
PCT/US2021/060622 filed Nov. 23, 2021 International Search Report and Written Opinion dated Mar. 3, 2022.
PCT/US2021/061267 filed Nov. 30, 2021 International Search Report and Written Opinion dated Mar. 9, 2022.
PCT/US2021/061276 filed Nov. 30, 2021 International Search Report and Written Opinion dated Mar. 9, 2022.
Sebastian Vogt: "Real-Time Augmented Reality for Image-Guided Interventions", Oct. 5, 2009, XP055354720, Retrieved from the Internet: URL: https://opus4.kobv.de/opus4-fau/frontdoor/deliver/index/docId/1235/file/SebastianVogtDissertation.pdf.
U.S. Appl. No. 15/951,903, filed Apr. 12, 2018 Board Decision dated Apr. 20, 2022.
U.S. Appl. No. 15/951,903, filed Apr. 12, 2018 Notice of Allowance dated May 2, 2022.
William F. Garrett et al.: "Real-time incremental visualization of dynamic ultrasound volumes using parallel BSP trees", Visualization '96. Proceedings, IEEE, NE, Oct. 27, 1996, pp. 235-ff, XP058399771, ISBN: 978-0-89791-864-0; abstract, figures 1-7, pp. 236-240.
Pagoulatos, N. et al. "New spatial localizer based on fiber optics with applications in 3D ultrasound imaging," Proceedings of SPIE, vol. 3976 (Apr. 18, 2000).
PCT/US2022/025082 filed Apr. 15, 2022 International Search Report and Written Opinion dated Jul. 11, 2022.
PCT/US2022/025097 filed Apr. 15, 2022 International Search Report and Written Opinion dated Jul. 8, 2022.
U.S. Appl. No. 17/020,476, filed Sep. 14, 2020 Final Office Action dated Jun. 9, 2022.
EP 20866520.8 filed Apr. 5, 2022 Extended European Search Report dated Aug. 22, 2023.
U.S. Appl. No. 17/468,318, filed Sep. 7, 2021 Final Office Action dated Sep. 8, 2023.
U.S. Appl. No. 17/471,015, filed Sep. 9, 2021 Final Office Action dated Oct. 12, 2023.
U.S. Appl. No. 17/534,099, filed Nov. 23, 2021 Final Office Action dated Sep. 29, 2023.
U.S. Appl. No. 17/538,911, filed Nov. 30, 2021 Final Office Action dated Sep. 13, 2023.
U.S. Appl. No. 17/684,180, filed Mar. 1, 2022 Non-Final Office Action dated Jul. 28, 2023.
U.S. Appl. No. 17/722,151, filed Apr. 15, 2022 Non-Final Office Action dated Sep. 7, 2023.
PCT/US12/61182 International Search Report and Written Opinion dated Mar. 11, 2013.
PCT/US2021/049294 filed Sep. 7, 2021 International Search Report and Written Opinion dated Dec. 8, 2021.
PCT/US2021/049712 filed Sep. 9, 2021 International Search Report and Written Opinion dated Dec. 14, 2021.
PCT/US2021/052055 filed Sep. 24, 2021 International Search Report and Written Opinion dated Dec. 20, 2021.
U.S. Appl. No. 13/656,563, filed Oct. 19, 2012 Decision on Appeal dated Nov. 1, 2017.
U.S. Appl. No. 13/656,563, filed Oct. 19, 2012 Examiner's Answer dated Nov. 16, 2015.
U.S. Appl. No. 13/656,563, filed Oct. 19, 2012 Final Office Action dated Dec. 5, 2014.
U.S. Appl. No. 13/656,563, filed Oct. 19, 2012 Non-Final Office Action dated Jul. 18, 2014.
U.S. Appl. No. 15/650,474, filed Jul. 14, 2017 Final Office Action dated Jun. 2, 2020.
U.S. Appl. No. 15/650,474, filed Jul. 14, 2017 Non-Final Office Action dated Dec. 16, 2019.
U.S. Appl. No. 15/650,474, filed Jul. 14, 2017 Notice of Allowance dated Dec. 11, 2020.
U.S. Appl. No. 15/650,474, filed Jul. 14, 2017 Notice of Allowance dated Mar. 1, 2021.
U.S. Appl. No. 15/951,903, filed Apr. 12, 2018 Advisory Action dated Dec. 22, 2020.
U.S. Appl. No. 15/951,903, filed Apr. 12, 2018 Examiner's Answer dated Jun. 3, 2021.
U.S. Appl. No. 15/951,903, filed Apr. 12, 2018 Final Office Action dated Oct. 13, 2020.
U.S. Appl. No. 15/951,903, filed Apr. 12, 2018 Non-Final Office Action dated May 22, 2020.
U.S. Appl. No. 17/020,476, filed Sep. 14, 2020 Non-Final Office Action dated Feb. 9, 2022.
PCT/US2023/014143 filed Feb. 28, 2023 International Search Report and Written Opinion dated Jun. 12, 2023.
PCT/US2023/015266 filed Mar. 15, 2023 International Search Report and Written Opinion dated May 25, 2023.
U.S. Appl. No. 17/684,180, filed Mar. 1, 2022 Restriction Requirement dated May 19, 2023.
PCT/US2022/048716 filed Nov. 2, 2022 International Search Report and Written Opinion dated Feb. 24, 2023.
PCT/US2022/048722 filed Nov. 2, 2022 International Search Report and Written Opinion dated Feb. 24, 2023.
PCT/US2022/049983 filed Nov. 15, 2022 International Search Report and Written Opinion dated Mar. 29, 2023.
PCT/US2022/047727 filed Oct. 25, 2022 International Search Report and Written Opinion dated Jan. 25, 2023.
Saxena, Ashish et al. "Thermographic venous blood flow characterization with external cooling stimulation," Infrared Physics and Technology, Elsevier Science, GB, vol. 90, Feb. 9, 2018, pp. 8-19, XP085378852.
U.S. Appl. No. 17/020,476, filed Sep. 14, 2020 Final Office Action dated Jan. 5, 2023.
U.S. Appl. No. 17/020,476, filed Sep. 14, 2020 Notice of Allowance dated Apr. 28, 2022.
U.S. Appl. No. 17/468,318, filed Sep. 7, 2021 Non-Final Office Action dated Apr. 12, 2023.
U.S. Appl. No. 17/471,015, filed Sep. 9, 2021 Non-Final Office Action dated Mar. 30, 2023.
U.S. Appl. No. 17/534,099, filed Nov. 23, 2021 Non-Final Office Action dated Mar. 31, 2023.
U.S. Appl. No. 17/538,911, filed Nov. 30, 2021 Non-Final Office Action dated Mar. 2, 2023.
M. Ikhsan, K. K. Tan, A. S. Putra, C. F. Kong, et al., "Automatic identification of blood vessel cross-section for central venous catheter placement using a cascading classifier," 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), pp. 1489-1492 (Year: 2017).
U.S. Appl. No. 17/471,015, filed Sep. 9, 2021 Advisory Action dated Feb. 2, 2024.
U.S. Appl. No. 17/471,015, filed Sep. 9, 2021 Non-Final Office Action dated Mar. 28, 2024.
U.S. Appl. No. 17/534,099, filed Nov. 23, 2021 Non-Final Office Action dated Mar. 14, 2024.
U.S. Appl. No. 17/538,911, filed Nov. 30, 2021 Notice of Allowance dated Mar. 14, 2024.
U.S. Appl. No. 17/684,180, filed Mar. 1, 2022 Advisory Action dated Apr. 4, 2024.
U.S. Appl. No. 17/722,151, filed Apr. 15, 2022 Non-Final Office Action dated Mar. 25, 2024.
U.S. Appl. No. 17/894,460, filed Aug. 24, 2022 Advisory Action dated Apr. 4, 2024.
U.S. Appl. No. 17/894,460, filed Aug. 24, 2022 Final Office Action dated Jan. 31, 2024.
U.S. Appl. No. 18/238,281, filed Aug. 25, 2023 Non-Final Office Action dated Mar. 22, 2024.
U.S. Appl. No. 17/684,180, filed Mar. 1, 2022 Non-Final Office Action dated May 8, 2024.
U.S. Appl. No. 17/722,111, filed Apr. 15, 2022 Final Office Action dated Jul. 12, 2024.
U.S. Appl. No. 17/979,564, filed Nov. 2, 2022 Non-Final Office Action dated Jun. 5, 2024.
U.S. Appl. No. 18/238,281, filed Aug. 25, 2023 Notice of Allowance dated Jul. 16, 2024.
PCT/US2022/045372 filed Sep. 30, 2022 International Search Report and Written Opinion dated Jan. 14, 2023.
U.S. Appl. No. 17/957,562, filed Sep. 30, 2022 Non-Final Office Action dated Jun. 20, 2024.
U.S. Appl. No. 17/979,601, filed Nov. 2, 2022 Non-Final Office Action dated Aug. 20, 2024.
Related Publications (1)
Number Date Country
20220172354 A1 Jun 2022 US
Provisional Applications (1)
Number Date Country
63120053 Dec 2020 US