Ultrasound probe with target tracking capability

Information

  • Patent Grant
  • Patent Number
    12,048,491
  • Date Filed
    Tuesday, November 30, 2021
  • Date Issued
    Tuesday, July 30, 2024
Abstract
An ultrasound imaging system is disclosed that can include an ultrasound probe including a transducer array configured to acquire ultrasound images, and a console including a processor and non-transitory computer-readable medium having stored thereon a plurality of logic modules that, when executed by the processor, are configured to perform operations including receiving an ultrasound image, detecting one or more targets within the ultrasound image, and generating a visualization from the ultrasound image to center the one or more detected targets within a displayed portion of the ultrasound image. Generating the visualization may include cropping the ultrasound image to center the one or more detected targets within a displayed portion of the ultrasound image. Generating the visualization may include increasing a magnification of a cropped portion of the ultrasound image to center the one or more detected targets within a displayed portion of the ultrasound image.
Description
BACKGROUND

There is currently a variety of existing ultrasound systems that include wired or wireless ultrasound probes connected to visual displays. A clinician may hold and manipulate the ultrasound probe of such a system to place a vascular access device (VAD) such as a catheter in a patient. Ultrasound imaging is commonly used for guiding a needle to targets such as veins of the patient. The needle may be monitored in real-time prior to and after a percutaneous insertion. This way a clinician may be able to determine the distance and the orientation of the needle to the target vein and ensure accurate insertion with minimal discomfort to the patient. However, inadvertent and unintentional movements of an ultrasound probe during the ultrasound imaging may occur. Such movement may cause the clinician to lose sight of the target vein and the needle. Finding and locating the needle and the target vein on a screen of the visual display may be difficult and may waste valuable time. The distance and orientation of the needle right before the percutaneous insertion may be difficult to determine since a needle plane including the needle is perpendicular (or near perpendicular) to an image plane of the ultrasound probe.


It may be easier to monitor the distance and orientation of the needle immediately after the percutaneous insertion when the needle plane is parallel to the image plane. By inadvertently moving or shifting the ultrasound probe, the clinician can lose the vein and/or the needle when adjusting the image plane before and after the percutaneous insertion, which can result in a loss of valuable time. The existing ultrasound systems do not provide a convenient needle guidance capability that takes into account the inadvertent movement or shifting of the ultrasound probe. Thus, what is needed are a method and system for ultrasound image target tracking that account for the inadvertent movements or shifting of the ultrasound probe to facilitate efficient needle guidance.


Accordingly, disclosed herein are methods and systems for analyzing ultrasound images to detect targets including anatomic targets and medical devices appearing within an ultrasound imaging area and generating a cropped image to maintain a location of the detected targets in the cropped image in the event of a shift of the ultrasound probe head.


SUMMARY OF THE INVENTION

Briefly summarized, disclosed herein is an ultrasound probe including, in some embodiments, an image target tracking capability. The ultrasound probe system may provide a consistent ultrasound view throughout an ultrasound guided procedure while compensating for inadvertent movements of the ultrasound probe. The exemplary tracking feature advantageously allows for incidental movement of the ultrasound probe during the procedure without drastic movement of the most important imaging data on the screen.


In some embodiments, an ultrasound imaging system is disclosed comprising an ultrasound probe including a transducer array configured to acquire ultrasound images, and a console including a processor and non-transitory computer-readable medium having stored thereon a plurality of logic modules that, when executed by the processor, are configured to perform operations including receiving an ultrasound image, detecting one or more targets within the ultrasound image, and generating a visualization from the ultrasound image to center the one or more detected targets within a displayed portion of the ultrasound image. In some embodiments, generating the visualization includes cropping the ultrasound image to center the one or more detected targets within a displayed portion of the ultrasound image. In some embodiments, generating the visualization includes increasing a magnification of a cropped portion of the ultrasound image to center the one or more detected targets within a displayed portion of the ultrasound image.


In some embodiments, the ultrasound probe is operatively connected to the console via a wired or wireless connection. In some embodiments, the console includes a display, and the plurality of logic modules, when executed by the processor, are configured to perform further operations including rendering the visualization of the cropped ultrasound image on the display. In some embodiments, detecting the one or more targets includes distinguishing a component within the ultrasound image according to varying color saturation within the ultrasound image. In specific embodiments, detecting the one or more targets includes identifying each of the one or more targets as a blood vessel, bone, organ or medical device. In other embodiments, identifying each of the one or more targets includes comparing characteristics of each of the one or more targets to thresholds set to define organs, blood vessels, bones or medical devices.


In some embodiments, the characteristics include one or more of a detected pulsatility upon analysis of the ultrasound image and a prior ultrasound image, dimensions of each of the one or more targets or color saturation of each of the one or more targets. In some embodiments, a result of comparing the characteristics to the one or more thresholds is a confidence level for each of the one or more targets indicating a likelihood of an identification of a particular target. In specific embodiments, the plurality of logic modules, when executed by the processor, are configured to perform further operations including detecting that at least a first target of the one or more targets is within a threshold distance of an edge of the ultrasound image.


In some embodiments, the plurality of logic modules, when executed by the processor, are configured to perform further operations including generating an alert indicating to a clinician that the first target is within the threshold distance of the edge of the ultrasound image. In some embodiments, the alert includes a text notification or an arrow indicating a direction to move the ultrasound probe. In other embodiments, the one or more targets includes a blood vessel and a needle. In yet other embodiments, the one or more targets includes a distal tip of the needle.


In some embodiments, a method for obtaining ultrasound images by an ultrasound imaging system is disclosed where the ultrasound imaging system includes an ultrasound probe including a transducer array configured to acquire ultrasound images, and a console including a processor and non-transitory computer-readable medium having stored thereon a plurality of logic modules that, when executed by the processor, are configured to perform operations including receiving an ultrasound image, detecting one or more targets within the ultrasound image, and generating a visualization from the ultrasound image by cropping the ultrasound image around the one or more detected targets. In some embodiments, the method comprises receiving an ultrasound image, detecting one or more targets within the ultrasound image, and generating a visualization from the ultrasound image to center the one or more detected targets within a displayed portion of the ultrasound image. In some embodiments, generating the visualization includes cropping the ultrasound image to center the one or more detected targets within a displayed portion of the ultrasound image. In some embodiments, generating the visualization includes increasing a magnification of a cropped portion of the ultrasound image to center the one or more detected targets within a displayed portion of the ultrasound image.


In some embodiments, the ultrasound probe is operatively connected to the console via a wired or wireless connection. In some embodiments, the console includes a display, and the plurality of logic modules, when executed by the processor, are configured to perform further operations including rendering the visualization of the cropped ultrasound image on the display. In some embodiments, detecting the one or more targets includes distinguishing a component within the ultrasound image according to varying color saturation within the ultrasound image. In specific embodiments, detecting the one or more targets includes identifying each of the one or more targets as a blood vessel, bone, organ or medical device. In other embodiments, identifying each of the one or more targets includes comparing characteristics of each of the one or more targets to thresholds set to define organs, blood vessels, bones or medical devices.


In some embodiments, the characteristics include one or more of a detected pulsatility upon analysis of the ultrasound image and a prior ultrasound image, dimensions of each of the one or more targets or color saturation of each of the one or more targets. In some embodiments, a result of comparing the characteristics to the one or more thresholds is a confidence level for each of the one or more targets indicating a likelihood of an identification of a particular target. In specific embodiments, the plurality of logic modules, when executed by the processor, are configured to perform further operations including detecting that at least a first target of the one or more targets is within a threshold distance of an edge of the ultrasound image.


In some embodiments, the plurality of logic modules, when executed by the processor, are configured to perform further operations including generating an alert indicating to a clinician that the first target is within the threshold distance of the edge of the ultrasound image. In some embodiments, the alert includes a text notification or an arrow indicating a direction to move the ultrasound probe. In other embodiments, the one or more targets includes a blood vessel and a needle. In yet other embodiments, the one or more targets includes a distal tip of the needle.


These and other features of the concepts provided herein will become more apparent to those of skill in the art in view of the accompanying drawings and following description, which describe particular embodiments of such concepts in greater detail.





BRIEF DESCRIPTION OF DRAWINGS

A more particular description of the present disclosure will be rendered by reference to specific embodiments thereof that are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. Example embodiments of the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:



FIG. 1 illustrates a block diagram of the ultrasound imaging system in accordance with some embodiments.



FIG. 2A illustrates a probe connected to a console in accordance with some embodiments.



FIG. 2B illustrates a probe connected to a console displaying a target vein in a cropped image in accordance with some embodiments.



FIG. 3A illustrates a view of visualization of a cropped image of a target vein in a cropped image in accordance with some embodiments.



FIG. 3B illustrates a view of visualization of a cropped image of a target vein when the probe shifts in accordance with some embodiments.



FIG. 3C illustrates a view of visualization of a cropped image of a target vein including a movement warning when the probe shifts in accordance with some embodiments.



FIG. 3D illustrates a view of a warning message displayed on a console display when the probe shifts and no longer captures the target vein in accordance with some embodiments.



FIG. 4A illustrates a probe connected to a console displaying a target vein and a needle in accordance with some embodiments.



FIG. 4B illustrates a probe connected to a console displaying a target vein and a needle in a cropped image in accordance with some embodiments.



FIG. 5A illustrates a view of visualization of a cropped image of a target vein and tracking of needle projection in accordance with some embodiments.



FIG. 5B illustrates a view of visualization of a cropped image of a target vein and tracking of needle projection when the probe shifts in accordance with some embodiments.



FIG. 5C illustrates a view of visualization of a cropped image of a target vein and tracking of the needle tip projection including a movement warning when the probe shifts in accordance with some embodiments.





DETAILED DESCRIPTION

Before some particular embodiments are disclosed in greater detail, it should be understood that the particular embodiments disclosed herein do not limit the scope of the concepts provided herein. It should also be understood that a particular embodiment disclosed herein can have features that can be readily separated from the particular embodiment and optionally combined with or substituted for features of any of a number of other embodiments disclosed herein.


Regarding terms used herein, it should also be understood the terms are for the purpose of describing some particular embodiments, and the terms do not limit the scope of the concepts provided herein. Ordinal numbers (e.g., first, second, third, etc.) are generally used to distinguish or identify different features or steps in a group of features or steps, and do not supply a serial or numerical limitation. For example, “first,” “second,” and “third” features or steps need not necessarily appear in that order, and the particular embodiments including such features or steps need not necessarily be limited to the three features or steps. Labels such as “left,” “right,” “top,” “bottom,” “front,” “back,” and the like are used for convenience and are not intended to imply, for example, any particular fixed location, orientation, or direction. Instead, such labels are used to reflect, for example, relative location, orientation, or directions. Singular forms of “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.


For clarity, it is to be understood that the word “distal” refers to a direction relatively closer to a patient on which a medical device is to be used as described herein, while the word “proximal” refers to a direction relatively further from the patient. Also, the words “including,” “has,” and “having,” as used herein, including the claims, shall have the same meaning as the word “comprising.”


Lastly, in the following description, the terms “or” and “and/or” as used herein are to be interpreted as inclusive or meaning any one or any combination. As an example, “A, B or C” or “A, B and/or C” mean “any of the following: A; B; C; A and B; A and C; B and C; A, B and C.” An exception to this definition will occur only when a combination of elements, components, functions, steps or acts are in some way inherently mutually exclusive.


Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by those of ordinary skill in the art.


Embodiments disclosed herein are directed to an ultrasound imaging system to be used for ultrasound imaging while placing a needle into a target vein of a patient. The ultrasound imaging system includes, in some embodiments, an image target tracking capability. The ultrasound imaging system may provide a consistent ultrasound view throughout an ultrasound guided procedure while compensating for inadvertent movements of an ultrasound probe. The exemplary tracking feature advantageously allows for incidental movement of the ultrasound probe during the procedure without drastic movement of the most important imaging data on the screen. The ultrasound imaging system, according to the exemplary embodiments, may be primarily used for insertion of an access device such as a needle. The image tracking provides for the precise placement of the needle into a target vein or another anatomic target regardless of the inadvertent movements of the ultrasound probe.


Referring to FIG. 1, a block diagram of the ultrasound imaging system 100 is shown in accordance with some embodiments. The console 102 may house a variety of components of the ultrasound imaging system 100. A processor 116 and memory 118 such as random-access memory (RAM) or non-volatile memory (e.g., electrically erasable programmable read-only memory (EEPROM)) may be included in the console 102 for controlling functions of the ultrasound imaging system 100, as well as for executing various logic operations or algorithms during operation of the ultrasound imaging system 100 in accordance with executable instructions 120 stored in the memory 118 for execution by the processor 116. For example, the console 102 is configured to instantiate by way of the instructions 120 one or more processes for adjusting a distance of activated ultrasonic transducers 148 from a predefined target (e.g., a target vein) or an area, an orientation of the activated ultrasonic transducers 148 to the predefined target or area, or both the distance and the orientation of the activated ultrasonic transducers 148 with respect to the predefined target or area, as well as process electrical signals from the ultrasound probe 106 into ultrasound images. The activated ultrasonic transducers 148 may be adjusted using ultrasound-imaging data, magnetic-field data, fiber-optic shape-sensing data, or a combination thereof received by the console 102. The console 102 may activate certain ultrasonic transducers of a 2-D array of the ultrasonic transducers 148 or move the already activated transducers in a linear array of the ultrasonic transducers 148.


A digital controller/analog interface 122 may be also included with the console 102 and is in communication with both the processor 116 and other system components to govern interfacing between the ultrasound probe 106 and other system components set forth herein. The ultrasound imaging system 100 further includes ports 124 for connection with additional components such as optional components 126 including a printer, storage media, keyboard, etc. The ports 124 can be implemented as universal serial bus (USB) ports, though other types of ports can be used for this connection or any other connections shown or described herein. A power connection 128 is included with the console 102 to enable operable connection to an external power supply 130. An internal power supply 132 (e.g., a battery) can also be employed either with or exclusive of the external power supply 130. Power management circuitry 134 is included with the digital controller/analog interface 122 of the console 102 to regulate power use and distribution. Optionally, a stand-alone optical interrogator 154 may be communicatively coupled to the console 102 by way of one of the ports 124. Alternatively, an optical interrogator may be integrated into the console 102. Such an optical interrogator is configured to emit input optical signals into a companion optical-fiber stylet 156 for shape sensing with the ultrasound imaging system 100. The optical-fiber stylet 156, in turn, may be configured to be inserted into a lumen of a medical device such as the needle and may convey the input optical signals from the optical interrogator 154 to a number of fiber Bragg grating (FBG) sensors along a length of the optical-fiber stylet 156. The optical interrogator 154 may be also configured to receive reflected optical signals conveyed by the optical-fiber stylet 156 and reflected from the number of the FBG sensors, where the reflected optical signals may be indicative of a shape of the optical-fiber stylet 156.


The optical interrogator 154 may be also configured to convert the reflected optical signals into corresponding electrical signals for processing by the console 102 into distance and orientation information with respect to the target and for dynamically adjusting a distance of the activated ultrasonic transducers 148, an orientation of the activated ultrasonic transducers 148, or both the distance and the orientation of the activated ultrasonic transducers 148 with respect to the target (e.g., a target vein) or the medical device (e.g., a needle) when it is brought into proximity of the target. For example, the distance and orientation of the activated ultrasonic transducers 148 may be adjusted with respect to the vein as the target. An image plane may be established by the activated ultrasonic transducers 148 being disposed at a particular angle to the target vein based on the orientation of the target vein (e.g., perpendicular or parallel among other configurations). In another example, when a medical device such as the needle is brought into proximity of the ultrasound probe 106, an image plane can be established by the activated ultrasonic transducers 148 being perpendicular to a medical-device plane including the needle 204. The distance and orientation information may also be used for displaying an iconographic representation of the medical device on the display.


The display screen 104 may be integrated into (or connected to) the console 102 to provide a GUI and display information for a clinician in a form of ultrasound images of the target acquired by the ultrasound probe 106. In addition, the ultrasound imaging system 100 may enable the distance and orientation of a magnetized medical device such as the needle to be superimposed in real-time atop an ultrasound image of the target, thus enabling a clinician to accurately guide the magnetized medical device to the intended target (e.g., the vein). The display screen 104 can alternatively be separate from the console 102 and communicatively (e.g., wirelessly) coupled thereto. A console button interface 136 may be used by the clinician to immediately call up a desired mode to the display screen 104 for assistance in an ultrasound-based medical procedure. In some embodiments, the display screen 104 may be implemented as an LCD device. The ultrasound probe 106 may optionally include an inertial measurement unit (IMU) 158 that may house an accelerometer 160, a gyroscope 162 and a magnetometer 164.


The ultrasound probe 106 may be employed in connection with ultrasound-based visualization of a target such as the vein in preparation for inserting the needle or another medical device into the target. Such visualization gives real-time ultrasound guidance and assists in reducing complications typically associated with such insertion, including inadvertent arterial puncture, hematoma, pneumothorax, etc. The ultrasound probe 106 may be configured to provide to the console 102 electrical signals corresponding to the ultrasound-imaging data, the magnetic-field data, the shape-sensing data, or a combination thereof for the real-time ultrasound needle guidance.


In one embodiment, target detection logic 166 may be executed by the processor 116 to detect vessels and other anatomic targets in the ultrasound images. The target detection logic 166 may include pulsatility detection logic 168 and component identification logic 170. The pulsatility detection logic 168 may compare a sequence of images of a vessel to detect pulses indicated by periodic changes in dimensions of the vessel (e.g., expansions in a diameter of the vessel). The target detection logic 166 may also detect bones by identifying tissues with high density based on color saturation in the ultrasound images. The component identification logic 170 may analyze reflection of echoes in each ultrasound image. This can be implemented, for example, using thresholds set to identify organs, blood vessels and bones. The respective logics 166, 168 and 170 may be stored on a non-transitory computer-readable medium of the console 102. An image cropping logic 172 may be executed on the processor 116 to crop images with the detected anatomic target (e.g., a target vein) so the anatomic target is in the center of the cropped image, which is a subset of the total ultrasound imaging area, as will be discussed in more detail herein. Herein, "cropping" may refer to reducing the amount of the ultrasound image that is displayed. Further, cropping may include increasing a magnification of the cropped portion of the ultrasound image. The target detection logic 166 and image cropping logic 172 may collectively be referred to as "console logic" or "logic of the console 102"; however, the term console logic may also include reference to any of the other logic modules illustrated in FIG. 1.
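

The pulsatility check lends itself to a compact signal-processing illustration. The following is a minimal sketch, not the patent's implementation: it assumes a per-frame vessel-diameter measurement is already available from upstream image analysis, and the function name detect_pulsatility, the frame-rate parameter and the thresholds are illustrative assumptions. A vessel is flagged as pulsatile when a meaningful share of the diameter trace's power falls in the cardiac-frequency band.

import numpy as np

def detect_pulsatility(diameters_mm, frame_rate_hz,
                       min_bpm=40, max_bpm=180, power_ratio_threshold=0.2):
    """Return True if a per-frame vessel-diameter trace varies periodically
    at a plausible heart rate (hypothetical helper, not from the patent)."""
    trace = np.asarray(diameters_mm, dtype=float)
    trace = trace - trace.mean()                 # drop the DC component
    spectrum = np.abs(np.fft.rfft(trace)) ** 2   # power spectrum
    freqs = np.fft.rfftfreq(trace.size, d=1.0 / frame_rate_hz)
    band = (freqs >= min_bpm / 60.0) & (freqs <= max_bpm / 60.0)
    if not band.any() or spectrum.sum() == 0.0:
        return False
    # Pulsatile if the cardiac band holds a large share of the signal power.
    return spectrum[band].sum() / spectrum.sum() >= power_ratio_threshold

An autocorrelation or peak-counting test over the same diameter trace would be an equally reasonable realization of the periodicity check.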


Referring to FIG. 2A, a probe connected to a console is shown in accordance with some embodiments. In this example, the ultrasound probe 200 is connected to a console 102 over a wired connection. In one embodiment, a wireless connection may be used. The ultrasound probe 200 includes a body that may house a console operatively connected to an ultrasound imaging device 220. The ultrasound probe 200 may be configured to assist a user such as a clinician in insertion of an access device such as a needle into a target vein 210 of a patient. Ultrasonic transducers located in a head 202 of the ultrasound probe are configured to capture 2-D ultrasound images 222 to be displayed on a screen 104 of the console 102. The head 202 may house a linear array of the ultrasonic transducers (not shown) or a 2-D array of the ultrasonic transducers. The ultrasonic transducers may be implemented as piezoelectric transducers or capacitive micro-machined ultrasonic transducers (CMUTs). When the ultrasound probe 200 is configured with the 2-D array of the ultrasonic transducers, a subset of the ultrasonic transducers may be linearly activated as needed for ultrasound imaging based on ultrasound-imaging data being captured.


The transducers may be configured to maintain the target in an image plane or switch to a different image plane (e.g., from a perpendicular plane to a medical-device plane to a plane parallel to the medical-device plane) including the target. If the ultrasound probe 200 is configured with the moveable linear array of the ultrasonic transducers, the ultrasonic transducers may be already activated for ultrasound imaging. For example, a subset of the ultrasonic transducers or all of the available ultrasonic transducers may be moved together on the moveable linear array as needed for ultrasound imaging based on the ultrasound-imaging data to maintain the target in an image plane established by the activated ultrasonic transducers or to switch to a different image plane including the target.


The probe head 202 may be placed against the skin of a patient proximate to a needle-insertion site so the activated ultrasonic transducers in the probe head 202 may generate and emit ultrasound signals into the patient as a sequence of pulses. Then, the transducers may receive reflected ultrasound signals (i.e., reflections of the generated ultrasonic pulses from the patient's body). The reflected ultrasound signals may be converted into corresponding electrical signals for processing into ultrasound images by the console of the probe 200. Thus, a clinician may employ the ultrasound imaging system 100 depicted in FIG. 2A to determine a suitable insertion site and establish vascular access to the target vein 210 with a needle or another medical device.


The ultrasound imaging system 100 depicted in FIG. 2A is capable of target vein 210 visualization in the total available ultrasound image 222 shown on a display 104 of a console 102. In one embodiment, the image data is received from the probe 200 into the console 102 depicted in FIG. 1. The target detection logic 166 may process the image data to render the ultrasound images in the total available ultrasound image 222. As discussed above with reference to FIG. 1, the target detection logic 166 may use pulsatility detection logic 168 and component identification logic 170. The pulsatility detection logic 168 may compare a sequence of images of a vessel to detect pulses indicated by periodic changes in dimensions of the vessel (e.g., expansions in a diameter of the vessel). The component identification logic 170 may also detect bones by identifying tissues with high density based on color saturation in the ultrasound images. The component identification logic 170 may analyze reflection of echoes in each image. This can be implemented using thresholds set to define organs, blood vessels and bones. The respective logics may be stored on a non-transitory computer-readable medium of the console 102. As discussed above, the target detection logic 166 may process the image data including the target vein 210 to render the ultrasound images 222.


The ultrasound imaging system 100 depicted in FIG. 2A may be used for insertion procedure site assessment. Note that while the ultrasound probe assembly depicted in FIG. 2A has a generic shape, the ultrasound probe 200 may be of a different shape as long as the probe captures the insertion site and a target vein 210.


Referring now to FIG. 2B, the probe 200 is shown as being connected to the console 102, which is displaying a target vein 210 in a cropped image 224 of a total available ultrasound image. As discussed above with reference to FIG. 2A, the ultrasound probe 200 is connected to the console 102 over a wired connection. In one embodiment, a wireless connection may be used. The ultrasound probe 200 includes a body that may house a console operatively connected to an ultrasound imaging device 220. The ultrasound probe 200 may be configured to assist a user such as a clinician in insertion of an access device such as a needle into a target vein 210 of a patient. The probe head 202 may be placed against the skin of a patient proximate to a needle-insertion site so the activated ultrasonic transducers in the probe head 202 may generate and emit ultrasound signals into the patient as a sequence of pulses. Then, the transducers may receive reflected ultrasound signals (i.e., reflections of the generated ultrasonic pulses from the patient's body). The reflected ultrasound signals may be converted into corresponding electrical signals for processing into ultrasound images by the console of the probe 200. Thus, a clinician may employ the ultrasound imaging system 100 depicted in FIG. 2B to determine a suitable insertion site and establish vascular access to the target vein 210 with a needle or another medical device.


The ultrasound imaging system 100 depicted in FIG. 2B is capable of imaging and detecting a target vein 210 and providing visualizations as a cropped image 224 shown on the display 104 of the console 102. The cropped image 224 is a subset of the total ultrasound image. In one embodiment, the image data is received from the probe 200 by the console 102 as depicted in FIG. 1. The target detection logic 166 running on the console 102 may process the image data to detect an anatomic target (the target vein 210) within the ultrasound images.


The target detection logic 166 may use pulsatility detection logic 168 and component identification logic 170 depicted in FIG. 1. The pulsatility detection logic 168 may compare a sequence of images of a vessel to detect pulses indicated by periodic changes in dimensions of the vessel (e.g., expansions in a diameter of the vessel). The component identification logic 170 may also detect bones by identifying tissues with high density based on color saturation in the ultrasound images. The component identification logic 170 may analyze reflection of echoes in each ultrasound image. This can be implemented using thresholds set to define anatomic targets such as organs, blood vessels, bones, etc. In one embodiment, the image cropping logic 172 depicted in FIG. 1 may crop the ultrasound image capturing the imaging area 300 such that the detected anatomic target (e.g., the target vein 210) is located at the center of the cropped image 224. Then, the cropped image 224 includes the vein 210 at its center and is displayed on the display 104 of the console 102. In addition to cropping the ultrasound image capturing the imaging area 300, the cropped image 224 may be magnified to fill, or substantially fill, the display 104. As seen in a comparison of FIGS. 2A-2B, the image of the target vein 210 in FIG. 2B appears larger than the image of the target vein 210 in FIG. 2A, which indicates a magnification has occurred with the cropped image 224.


Referring now to FIG. 3A, a view of a display of a cropped image of a target vein is shown in accordance with some embodiments. As discussed with reference to FIGS. 2A-2B, the ultrasound probe 200 includes a body and a head 202 that houses transducers that may generate and emit ultrasound signals into the patient. The ultrasound imaging system 100 depicted in FIG. 3A is configured to obtain ultrasound images, detect the target vein 210 and render a cropped visualization of the target vein 210 on a display. In this example, the ultrasound probe 200 emits ultrasound pulses causing the ultrasound probe 200 to receive reflected data encompassing an imaging area 300, which includes the target vein 210.


The ultrasound image of the imaging area 300 is provided to the console 102 where the console logic processes the ultrasound image. Specifically, the target detection logic 166 analyzes the ultrasound image to detect the target vein 210. For example, the target detection logic 166 may place a bounding box surrounding the target vein 210 or may detect coordinates of a box around the target vein 210. It should be understood that the term “box” is not limited to a square or rectangle but may refer to any other shape, such as a circle, oval, etc. The image cropping logic 172 then crops the ultrasound image illustrating imaging area 300 around the image of the target vein 210 in such a way that the target vein 210 is located in the center of the cropped image 320. For example, the image cropping logic 172, when executed by the processor 116, may crop the ultrasound image illustrating the imaging area 300 at the bounding box or coordinates determined by the target detection logic 166. Then, the cropped image 320 containing the target vein 210 may be displayed on the screen of the console 102.
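

A minimal sketch of this cropping step follows, assuming the detection stage has already produced a center point for the bounding box. The function name crop_centered, the window size and the integer zoom factor are hypothetical; clamping the window at the image boundary is one reasonable way to keep the crop inside the imaging area when the target nears an edge.

import numpy as np

def crop_centered(image, center_xy, crop_w=256, crop_h=256, zoom=2):
    """Crop a 2-D ultrasound frame so the detected target is centered,
    then magnify the cropped portion (hypothetical helper)."""
    h, w = image.shape
    cx, cy = center_xy
    # Clamp the window so it stays inside the full imaging area; the target
    # stays as close to the center as the image boundary allows.
    x0 = int(np.clip(cx - crop_w // 2, 0, max(w - crop_w, 0)))
    y0 = int(np.clip(cy - crop_h // 2, 0, max(h - crop_h, 0)))
    cropped = image[y0:y0 + crop_h, x0:x0 + crop_w]
    # Nearest-neighbor magnification of the cropped portion.
    return np.repeat(np.repeat(cropped, zoom, axis=0), zoom, axis=1)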


Referring to FIG. 3B, a view of visualization of a cropped image of a target vein when the probe shifts is shown in accordance with some embodiments. In this example, the probe 200 inadvertently shifts in a first direction, e.g., to the left. The shift of the probe 200 may produce a corresponding shift of the location of the target vein 210 within the imaging area 300, where the corresponding shift of the target vein 210 may be thought of as being in a second direction opposite the first direction.


However, according to an exemplary embodiment, logic of the ultrasound imaging system 100 is configured to detect the target vein 210 and display an image on the console 102 where the target vein 210 is displayed in the center of the image (i.e., compensating for the shift of the probe 200). Therefore, even as the probe 200 may be inadvertently shifted, the image displayed by the console 102 maintains the target vein 210 at the center of the displayed image, thus enabling the clinician to continue focusing on the target vein 210 itself as opposed to focusing on the inadvertent shift of the probe 200.


In other words, the cropped image 320 advantageously does not change in response to the shifting of the probe 200. The ultrasound imaging system 100 may identify and distinguish anatomic targets such as the target vein 210. Then, the ultrasound imaging system 100 may identify the anatomic target and perform image tracking of that target. The console 102 of the ultrasound imaging system 100 may employ console logic (e.g., target detection logic 166 and image cropping logic 172, as referenced above) to receive a sequence of ultrasound images (or a continuous signal) from the probe 200. Further, the console logic may repeatedly detect the target vein 210 within each image and may crop the current image for visualization (e.g., as the cropped image 320). This way, the display of the cropped image 320 of the target vein 210 remains unaffected by the shifting of the probe 200 allowing the clinician to advantageously maintain sight of the visualization of the target vein 210.
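

The repeated detect-and-crop cycle can be summarized as a short per-frame loop. In this sketch, detect_target stands in for the target detection logic 166, crop_centered is the helper sketched above for the image cropping logic 172, and holding the previous view when a detection momentarily drops out is an assumption rather than a behavior stated herein.

def track_and_display(frames, detect_target, display):
    """Detect the target in each incoming ultrasound frame and re-crop
    around it, so the displayed view stays centered on the target even as
    the probe (and hence the full image) shifts (hypothetical helper)."""
    last_center = None
    for frame in frames:
        center = detect_target(frame)            # (x, y) or None
        if center is None:
            center = last_center                 # hold view on a missed frame
        if center is not None:
            display(crop_centered(frame, center))
            last_center = center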


Referring now to FIG. 3C, a view of a display of a cropped image of a target vein with a movement warning when the probe shifts is shown in accordance with some embodiments. In one embodiment, the ultrasound imaging system 100 may retain and communicate information on the location of the identified target vein 210. For example, the console logic may determine that a shift of the probe 200 has resulted in the target vein 210 being within a threshold distance from an edge of the imaging area 300 and in response, generate a warning or indication (e.g., such as the warning 322) that is configured to inform the user (e.g., a clinician) that the target vein 210 may soon be out of sight of the probe 200 due to shifts or movement of the probe 200. In some embodiments, the warning or indication may be a visual warning displayed by the console 102 such as the warning 322 as seen in FIG. 3C. The warning 322 may include text, e.g., “Movement warning” and/or an indication of a direction in which to move the probe 200 relative to the target vein 210 (e.g., the arrow 324) in order to more centrally locate the ultrasound imaging area 300 over the target vein 210.


For example, the console logic may detect the target vein 210 in each ultrasound image received from the probe 200. When the ultrasound probe 200 accidentally moves in such a way that the probe head 202 is about to stop capturing the target vein 210, the console logic provides a “movement warning” alert that is displayed on the display 104, e.g., as an overlay on the cropped image 320. The console logic may detect the location of the target vein 210 relative to boundaries of the total ultrasound image area 300. The visual alert may be accompanied by an arrow indicating which way to move the probe to get away from the edge of the screen. This way, the clinician is alerted in time before losing sight of the visualization of the target vein 210. In one embodiment, the “movement warning” alert may be generated by an alert generating logic component of the console logic. In some embodiments, the warning or alert may be an audio alert such as beeping. In some embodiments, the warning or alert may be a vibration of the probe 200, in which case the probe 200 would include a vibration motor that is communicatively coupled to the console logic. In some embodiments, the warning or alert may be any combination of a visual alert, an audio alert and/or a vibration.
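

The edge-proximity test behind the "movement warning" reduces to a margin check against the boundaries of the total imaging area. The margin value below is arbitrary, and the returned labels name the edge(s) the target is approaching; translating that into the on-screen arrow telling the clinician which way to move the probe depends on probe orientation and is omitted here.

def movement_warning(center_xy, image_shape, margin_px=32):
    """Return (warn, edges): warn is True when the target center lies
    within margin_px of any edge of the total imaging area, and edges
    names the approached edge(s) (hypothetical helper)."""
    h, w = image_shape
    x, y = center_xy
    edges = []
    if x < margin_px:
        edges.append("left")
    elif x > w - margin_px:
        edges.append("right")
    if y < margin_px:
        edges.append("top")
    elif y > h - margin_px:
        edges.append("bottom")
    return bool(edges), edges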


Referring now to FIG. 3D, a view of a warning message displayed on a console display when the probe shifts and no longer captures the target vein is shown in accordance with some embodiments. As discussed above, the ultrasound imaging system 100 may retain and communicate information on the location of the identified target vein 210. For example, the console logic may inform the clinician that the target vein 210 has moved off the screen in a specific direction, e.g., by analyzing an ultrasound image, failing to detect the target vein 210, and generating a visualization to be rendered on the display 104. When the ultrasound probe 200 moves in such a way (shown by double arrows) that it is no longer capturing the target vein 210 (i.e., the target vein is not within the imaging area 300), the console logic can generate an alert configured to be rendered on the display 104 for viewing by the clinician.


In the example depicted in FIG. 3D, console logic may analyze each ultrasound image received from the probe 200 in order to detect the target vein 210. When the ultrasound probe 200 accidentally moves in such a way that the probe head 202 is no longer capturing the target vein 210 (e.g., the target vein 210 is outside of the imaging area 300), the console logic provides a “movement warning” alert that is displayed on the display 104, e.g., as an overlay on the cropped image 320. For instance, the console logic may provide a “Move Probe” message alert 326 that is displayed as an overlay over a rendering of the total imaging area 300. In some embodiments, the console logic may also provide an arrow 324 that indicates the direction in which the probe needs to be moved in order to resume capturing of the target vein 210. This way, the clinician is alerted to move the probe and resume the visualization of the target vein 210.
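

When detection fails outright, a direction hint can only come from history. The sketch below assumes the console logic caches the last detected target location and suggests the edge nearest to it as the likely exit direction; this heuristic and the message format are illustrative, not taken from the description above.

def lost_target_alert(last_center_xy, image_shape):
    """Build a "Move Probe" alert from the last known target location
    (hypothetical helper)."""
    if last_center_xy is None:
        return "Move Probe"                      # no history: generic alert
    h, w = image_shape
    x, y = last_center_xy
    # Distance from the last known center to each edge of the imaging area;
    # the nearest edge is the most likely direction the target exited.
    distances = {"left": x, "right": w - x, "top": y, "bottom": h - y}
    nearest = min(distances, key=distances.get)
    return "Move Probe (" + nearest + ")"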


Referring to FIG. 4A, the probe connected to the console of FIG. 2A, including imaging of a target vein and a needle, is shown in accordance with some embodiments. As noted above, the ultrasound probe 200 may be connected to the console 102 via a wired or wireless connection. As illustrated, the probe head 202 may be placed against the skin of a patient proximate to a needle-insertion site so the activated ultrasonic transducers in the probe head 202 may generate and emit ultrasound signals into the patient as a sequence of pulses. Then, the transducers may receive reflected ultrasound signals (i.e., reflections of the generated ultrasonic pulses from the patient's body). The reflected ultrasound signals may be converted into corresponding electrical signals for processing into ultrasound images by the console of the probe 200. Thus, a clinician may employ the ultrasound imaging system 100 depicted in FIG. 4A to determine a suitable insertion site and establish vascular access to the target vein 210 with a needle or another medical device. Further, the reflected ultrasound signals may include reflections from the needle 404, thus enabling the ultrasound imaging system 100 to display an ultrasound image illustrating the imaging area 400.


Referring now to FIG. 4B, the probe 200 is shown as being connected to the console 102, which is displaying a target vein 210 and a portion of the needle 404 in a cropped image 420 of a total available ultrasound image. As discussed, the ultrasound imaging system 100 may be configured to obtain an ultrasound image illustrating an ultrasound imaging area 400 and render a cropped image 420 illustrating a portion of the ultrasound imaging area 400 on the display 104 of the console 102. In some such embodiments, target detection logic 166 of the console 102 may process the image data (ultrasound reflection data) to crop the ultrasound images and cause the rendering of the cropped image 420. Specifically, the target detection logic 166 may use pulsatility detection logic 168 and component identification logic 170. The pulsatility detection logic 168 may compare a sequence of images of a vessel to detect pulses indicated by periodic changes in dimensions of the vessel (e.g., expansions in a diameter of the vessel). The component identification logic 170 may also detect bones by identifying tissues with high density based on color saturation in the ultrasound images.


The component identification logic 170 may analyze reflection of echoes in each image. The identification of components may be implemented based on comparing characteristics of detected components (e.g., pulsatility over a plurality of images, dimensions of the detected components, color saturation, etc.) to thresholds set to define organs, blood vessels and bones. Based on the result of comparing the characteristics of detected components to the one or more thresholds, a confidence level (or score) may be determined indicating a likelihood of an identification of a particular component (e.g., a confidence score that a particular detected component is a bone or a blood vessel).
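

One simple realization of this comparison scores each detected component against per-class acceptance ranges and reports the best match, with the fraction of satisfied thresholds serving as the confidence level. The feature names, classes and ranges below are invented for illustration; the description above does not specify the scoring function.

def classify_component(features, profiles):
    """Score a detected component against per-class (low, high) threshold
    ranges and return (best_class, confidence) (hypothetical helper).

    features: e.g. {"pulsatility": 0.8, "saturation": 0.3, "diameter_mm": 4.1}
    profiles: e.g. {"blood vessel": {"pulsatility": (0.2, 1.0),
                                     "diameter_mm": (1.0, 20.0)},
                    "bone":         {"saturation": (0.7, 1.0)}}
    """
    scores = {}
    for cls, ranges in profiles.items():
        hits = sum(lo <= features.get(name, float("nan")) <= hi
                   for name, (lo, hi) in ranges.items())
        scores[cls] = hits / len(ranges)   # fraction of thresholds satisfied
    best = max(scores, key=scores.get)
    return best, scores[best]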


Further and in a similar manner, the target detection logic 166 may also be configured to detect a needle within an ultrasound image. A needle, such as the needle 404, may include specific and known reflection characteristics (e.g., dimensions, color saturation, etc.) such that the component identification logic 170 of the target detection logic 166 may detect and identify a needle in the same manner as discussed with respect to vessel and bone detection. Thus, the ultrasound probe 200 may be configured to assist a user such as a clinician in insertion of an access device, e.g., the needle 404, into a target vein 210 of a patient. The probe head 202 may be placed against the skin of a patient proximate to a needle-insertion site so the activated ultrasonic transducers in the probe head 202 may generate and emit ultrasound signals into the patient as a sequence of pulses. Then, the transducers may receive reflected ultrasound signals (i.e., reflections of the generated ultrasonic pulses from the patient's body). The reflected ultrasound signals may be converted into corresponding electrical signals for processing into ultrasound images by the console of the probe 200. Thus, a clinician may employ the ultrasound imaging system 100 depicted in FIG. 4B to determine a suitable insertion site and establish vascular access to the target vein 210 with the needle 404 or another medical device.


Following detection and identification of components included within the imaging area 400, the ultrasound imaging system 100 may be configured to generate a cropped image, such as the cropped image 420, which includes both the target vein 210 and a portion of the needle 404. In one embodiment, the image cropping logic 172 depicted in FIG. 1 may crop the ultrasound image illustrating the imaging area 400 such that the detected anatomic target (e.g., the target vein 210) is located at the center of the cropped image 420. Then, the cropped image 420 includes the vein 210 at its center and is displayed on the display 104 of the console 102. In addition to cropping the ultrasound image illustrating the imaging area 400, the cropped image 420 may be magnified to fill, or substantially fill, the display 104. As seen in a comparison of FIGS. 4A-4B, the image of the target vein 210 in FIG. 4B appears larger than the image of the target vein 210 in FIG. 4A, which indicates a magnification has occurred with the cropped image 420.


In some embodiments, a determination of a boundary at which to crop the ultrasound image illustrating the imaging area 400 includes determination of the positioning of the needle 404 and its distance from the target vein 210. For example, the cropped image 420 may consist of a smaller bounding box surrounding the target vein 210 when the needle 404 is in close proximity to the target vein 210 and consist of a larger bounding box when the needle 404 is further away from the target vein 210. Therefore, in both situations, the cropped image 420 illustrates the target vein 210 and the needle 404. However, in other embodiments, the bounding box upon which the cropped image 420 is created is a predetermined size and cropping would not take into consideration a location of the needle 404. As noted above, it should be understood that the term “box” is not limited to a square or rectangle but may refer to any other shape, such as a circle, oval, etc.
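

The needle-distance-dependent sizing can be sketched as scaling a square crop window with the separation between the vein and the needle tip, clamped to fixed bounds. The padding factor and the size limits are illustrative assumptions.

import numpy as np

def adaptive_crop_size(vein_xy, needle_tip_xy,
                       min_size=160, max_size=512, pad=1.5):
    """Pick a square crop size (in pixels) that keeps both the target vein
    and the needle tip in view: smaller when the needle is close to the
    vein, larger when it is far away (hypothetical helper)."""
    vein = np.asarray(vein_xy, dtype=float)
    tip = np.asarray(needle_tip_xy, dtype=float)
    separation = float(np.linalg.norm(tip - vein))
    # The window must span the separation plus padding, within fixed bounds.
    return int(np.clip(pad * 2.0 * separation, min_size, max_size))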


Referring now to FIG. 5A, a view of a display of a cropped image of a target vein and a portion of a needle is shown in accordance with some embodiments. As discussed with reference to FIGS. 4A-4B, the ultrasound imaging system 100 is configured to obtain ultrasound images and detect the target vein 210 and the needle 404, including the distal tip 501 of the needle 404. Additionally, the ultrasound imaging system 100 may be configured to render a cropped visualization, e.g., the cropped image 520, illustrating the target vein 210 and the needle 404 on a display. In some embodiments, the needle tip tracking can be implemented using the teachings of one or more of U.S. Pat. Nos. 5,775,322; 5,879,297; 6,129,668; 6,216,028; and 6,263,230, each of which is incorporated by reference in its entirety into this application.


Referring to FIG. 5B, a view of visualization of a cropped image of a target vein and a portion of a needle is shown in accordance with some embodiments. In this example, the probe 200 inadvertently shifts in a first direction, e.g., to the left. The shift of the probe 200 may produce a corresponding shift of the location of the target vein 210 within the imaging area 500, where the corresponding shift of the target vein 210 may be thought of as being in a second direction opposite the first direction. However, according to the exemplary embodiment, no change occurs in the cropped image 520 of the vein 210 and the distal tip 501 shown to a clinician following the accidental shifting of the probe 200. In other words, the cropped image 520 advantageously does not change in response to the shifting of the probe 200. The ultrasound imaging system 100 may identify and distinguish anatomic targets such as the target vein 210 and the distal tip 501 of the needle 404 in order to perform image tracking of the distal tip 501. The ultrasound imaging system 100 may employ console logic to receive a sequence of ultrasound images (or a continuous signal) from the probe 200. Then, the console logic may repeatedly detect the target vein 210 within each image and may crop the current image for visualization in the cropped image 520. This way the target vein 210 displayed in the cropped image 520 remains unaffected by the shifting of the probe 200. In other words, focus is maintained on the target vein 210 and on the tracking of the needle tip 501. Thus, the clinician advantageously does not lose sight of the visualization of the target vein 210 and tracking of the distal tip 501.


In other words, the cropped image 520 advantageously does not change in response to the shifting of the probe 200. The ultrasound imaging system 100 may identify and distinguish anatomic targets such as the target vein 210. Then, the ultrasound imaging system 100 may identify the anatomic target and perform image tracking of that target. The console 102 of the ultrasound imaging system 100 may employ console logic (e.g., target detection logic 166 and image cropping logic 172, as referenced above) to receive a sequence of ultrasound images (or a continuous signal) from the probe 200. Further, the console logic may repeatedly detect the target vein 210 and the needle 404 within each image and may crop the current image for visualization (e.g., as the cropped image 520). This way, the display of the cropped image 520 of the target vein 210 remains unaffected by the shifting of the probe 200 allowing the clinician to advantageously maintain sight of the visualization of the target vein 210 and the needle 404 as the needle 404 approaches the target vein 210.


Referring now to FIG. 5C, a view of a display of a cropped image of a target vein and a portion of a needle including a movement warning when the probe shifts is shown in accordance with some embodiments. In one embodiment, the ultrasound imaging system 100 may retain and communicate information on the location of the identified target vein 210. For example, the console logic may determine that a shift of the probe 200 has resulted in the target vein 210 and/or a distal tip 501 of the needle 404 being within a threshold distance from an edge of the imaging area 500 and in response, generate a warning or indication (e.g., the warning 522) that is configured to inform the user (e.g., a clinician) that the target vein 210 or the distal tip 501 of the needle 404 may soon be out of sight of the probe 200 due to shifts or movement of the probe 200. In some embodiments, the warning or indication may be a visual warning displayed by the console 102 such as the warning 522 as seen in FIG. 5C. The warning 522 may include text, e.g., "Movement warning" and/or an indication of a direction in which to move the probe 200 relative to the target vein 210 (e.g., the arrow 324) in order to more centrally locate the ultrasound imaging area 500 over the target vein 210.


For example, the console logic may detect the target vein 210 and the needle 404, including a distal tip 501 of the needle 404, in each ultrasound image received from the probe 200. When the ultrasound probe 200 accidentally moves in such a way that the probe head 202 is about to stop capturing the target vein 210 or the distal tip 501 of the needle 404, the console logic provides a "movement warning" alert that is displayed on the display 104, e.g., as an overlay on the cropped image 520. The console logic may detect the location of the target vein 210 relative to boundaries of the total ultrasound image area 500. The visual alert may be accompanied by an arrow indicating which way to move the probe to get away from the edge of the screen. This way, the clinician is alerted in time before losing sight of the visualization of the target vein 210 or the distal tip 501 of the needle 404. In one embodiment, the "movement warning" alert may be generated by an alert generating logic component of the console logic. In some embodiments, the warning or alert may be an audio alert such as beeping. In some embodiments, the warning or alert may be a vibration of the probe 200, in which case the probe 200 would include a vibration motor that is communicatively coupled to the console logic. In some embodiments, the warning or alert may be any combination of a visual alert, an audio alert and/or a vibration.


Having a system that not only provides for ultrasound imaging but also ensures that the needle is precisely inserted into a target based on ultrasound image tracking, regardless of accidental shifting of the ultrasound probe, advantageously reduces a risk of puncturing the patient's skin in a wrong place or even in several places.


Embodiments of the invention may be embodied in other specific forms without departing from the spirit of the present disclosure. The described embodiments are to be considered in all respects only as illustrative, not restrictive. The scope of the embodiments is, therefore, indicated by the appended claims rather than by the foregoing description. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims
  • 1. An ultrasound imaging system comprising: an ultrasound probe configured to acquire a series of ultrasound images during a needle insertion procedure; and a console communicatively coupled to the ultrasound probe and including a processor and non-transitory computer-readable medium having stored thereon logic that, when executed by the processor, is configured to perform operations including: repeatedly receiving ultrasound images comprising the series of ultrasound images acquired by the ultrasound probe during the needle insertion procedure, repeatedly detecting a first target blood vessel within the ultrasound images by: detecting pulses within the ultrasound images that indicate periodic changes in a dimension of the first target blood vessel, detecting that the first target blood vessel of one or more targets detected is within a threshold distance of an edge of a first ultrasound image of the series of ultrasound images, and analyzing color saturation within the ultrasound images with respect to one or more thresholds indicative of a blood vessel, and generating a visualization from the ultrasound images including (i) determining a bounding box surrounding the first target blood vessel, wherein the first target blood vessel is centered in the bounding box, (ii) cropping the ultrasound images at the bounding box, and (iii) displaying cropped ultrasound images on a display screen during the needle insertion procedure, wherein the display screen is integrated into or connected to the console, and wherein the visualization is repeatedly regenerated during the needle insertion procedure based on repeated receipt of the ultrasound images.
  • 2. The ultrasound imaging system of claim 1, wherein generating the visualization includes increasing a magnification of the cropped ultrasound images to center the one or more targets detected within the cropped ultrasound image.
  • 3. The ultrasound imaging system of claim 1, wherein the ultrasound probe is communicatively connected to the console via a wired or wireless connection.
  • 4. The ultrasound imaging system of claim 1, wherein the console includes the display screen, and wherein the logic that, when executed by the processor, is configured to perform further operations including rendering the visualization of the cropped ultrasound images on the display screen.
  • 5. The ultrasound imaging system of claim 1, wherein detecting the one or more targets includes distinguishing a component within the ultrasound images according to varying color saturation within the ultrasound images.
  • 6. The ultrasound imaging system of claim 1, wherein detecting the one or more targets includes identifying each of the one or more targets as the blood vessel, a bone, an organ or a medical device.
  • 7. The ultrasound imaging system of claim 6, wherein identifying each of the one or more targets includes comparing characteristics of each of the one or more targets to thresholds set to define organs, blood vessels, bones or medical devices.
  • 8. The ultrasound imaging system of claim 7, wherein the characteristics include one or more of a detected pulsatility upon analysis of the first ultrasound image of the series of ultrasound images and a prior ultrasound image of the series of ultrasound images, dimensions of each of the one or more targets or color saturation of each of the one or more targets.
  • 9. The ultrasound imaging system of claim 8, wherein a result of comparing the characteristics to the one or more thresholds is a confidence level for each of the one or more targets indicating a likelihood of an identification of a particular target.
  • 10. The ultrasound imaging system of claim 1, wherein the logic that, when executed by the processor, is configured to perform further operations including: generating an alert indicating to a clinician that the first target blood vessel is within the threshold distance of the edge of the first ultrasound image.
  • 11. The ultrasound imaging system of claim 10, wherein the alert includes a text notification or an arrow indicating a direction to move the ultrasound probe.
  • 12. The ultrasound imaging system of claim 1, wherein the one or more targets includes the blood vessel and a needle.
  • 13. The ultrasound imaging system of claim 12, wherein the one or more targets includes a distal tip of the needle.
  • 14. A method for obtaining ultrasound images by an ultrasound imaging system including an ultrasound probe configured to acquire a series of ultrasound images during a needle insertion procedure, and a console communicatively coupled to the ultrasound probe and including a processor and non-transitory computer-readable medium having stored thereon logic that, when executed by the processor, is configured to perform operations, the method comprising: repeatedly receiving ultrasound images comprising the series of ultrasound images acquired by the ultrasound probe during the needle insertion procedure, repeatedly detecting a first target blood vessel within the ultrasound images by: detecting pulses within the ultrasound images that indicate periodic changes in a dimension of the first target blood vessel, detecting that the first target blood vessel of one or more targets detected is within a threshold distance of an edge of a first ultrasound image of the series of ultrasound images, and analyzing color saturation within the ultrasound images with respect to one or more thresholds indicative of a blood vessel, and generating a visualization from the ultrasound images including (i) determining a bounding box surrounding the first target blood vessel, wherein the first target blood vessel is centered in the bounding box, (ii) cropping the ultrasound images at the bounding box, and (iii) displaying cropped ultrasound images on a display screen during the needle insertion procedure, wherein the display screen is integrated into or connected to the console, and wherein the visualization is repeatedly regenerated during the needle insertion procedure based on repeated receipt of the ultrasound images.
  • 15. The method of claim 14, wherein generating the visualization includes increasing a magnification of the cropped ultrasound images to center the one or more targets detected within the cropped ultrasound images.
  • 16. The method of claim 14, wherein the ultrasound probe is communicatively connected to the console via a wired or wireless connection.
  • 17. The method of claim 14, wherein the console includes the display screen, and wherein the logic that, when executed by the processor, is configured to perform further operations including rendering the visualization of the cropped ultrasound images on the display screen.
  • 18. The method of claim 14, wherein detecting the one or more targets includes distinguishing a component within the ultrasound images according to varying color saturation within the ultrasound images.
  • 19. The method of claim 14, wherein detecting the one or more targets includes identifying each of the one or more targets as the blood vessel, a bone, an organ or a medical device.
  • 20. The method of claim 19, wherein identifying each of the one or more targets includes comparing characteristics of each of the one or more targets to thresholds set to define organs, blood vessels, bones or medical devices.
  • 21. The method of claim 20, wherein the characteristics include one or more of a detected pulsatility upon analysis of the first ultrasound image of the series of ultrasound images and a prior ultrasound image of the series of ultrasound images, dimensions of each of the one or more targets or color saturation of each of the one or more targets.
  • 22. The method of claim 21, wherein a result of comparing the characteristics to the one or more thresholds is a confidence level for each of the one or more targets indicating a likelihood of an identification of a particular target.
  • 23. The method of claim 14, further comprising: generating an alert indicating to a clinician that the first target blood vessel is within the threshold distance of the edge of the first ultrasound image.
  • 24. The method of claim 23, wherein the alert includes a text notification or an arrow indicating a direction to move the ultrasound probe.
  • 25. The method of claim 14, wherein the one or more targets includes the blood vessel and a needle.
  • 26. The method of claim 25, wherein the one or more targets includes a distal tip of the needle.
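The pulsatility detection recited in claims 1 and 14 can be illustrated with a short sketch. The following assumes a per-frame measurement of a candidate vessel's diameter is already available and tests for a dominant periodic component with a band-power check; the function name, sampling rate, frequency band, and power-ratio threshold are illustrative assumptions, not claim limitations.

    import numpy as np

    def is_pulsatile(diameters, frame_rate_hz, lo_hz=0.8, hi_hz=3.0, power_ratio=0.2):
        """Return True if the diameter trace varies periodically in a
        plausible heart-rate band (~48-180 beats per minute)."""
        d = np.asarray(diameters, dtype=float)
        d = d - d.mean()                              # remove the DC component
        spectrum = np.abs(np.fft.rfft(d)) ** 2        # power per frequency bin
        freqs = np.fft.rfftfreq(len(d), d=1.0 / frame_rate_hz)
        band = (freqs >= lo_hz) & (freqs <= hi_hz)
        return spectrum[band].sum() > power_ratio * spectrum[1:].sum()

    # Example: a 6 mm vessel pulsing at ~1.2 Hz, sampled at 30 frames/second.
    t = np.arange(0, 4, 1 / 30)
    trace = 6.0 + 0.4 * np.sin(2 * np.pi * 1.2 * t)
    print(is_pulsatile(trace, frame_rate_hz=30))      # -> True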
PRIORITY

This application claims the benefit of priority to U.S. Provisional Application No. 63/119,829, filed Dec. 1, 2020, which is incorporated by reference in its entirety into this application.
