Ultrasound system with target and medical instrument awareness

Information

  • Patent Grant
  • Patent Number
    12,213,746
  • Date Filed
    Tuesday, November 23, 2021
  • Date Issued
    Tuesday, February 4, 2025
Abstract
Blood vessel recognition and needle guidance systems, components, and methods thereof are disclosed. A console can be configured to instantiate a target recognition process for recognizing an anatomical target, such as a blood vessel, of a patient and a needle guidance process for guiding insertion of a needle into the anatomical target using ultrasound-imaging data received by the console. The system can perform target identification based on artificial intelligence (AI), which can be trained to recognize targets in ultrasound images. The ultrasound probe can be configured to provide to the console electrical signals corresponding to the ultrasound-imaging data. The ultrasound probe can include an array of transducers and, optionally, an array of magnetic sensors respectively configured to convert reflected ultrasound signals from the patient and magnetic signals from the needle, when magnetized, into the electrical signals.
Description
SUMMARY

Briefly summarized, embodiments disclosed herein are directed to systems, methods and apparatuses for ultrasound target identification and medical instrument tracking. One potential problem that arises with ultrasound imaging during catheter insertion is identifying the proper targets. For example, challenges can include differentiating veins from arteries, finding a blood vessel suitable for catheter insertion, or finding a particular blood vessel. Locating the correct targets may consume clinical time and resources, and errors could cause potentially dangerous complications.


A second potential problem is locating and guiding the needle tip, particularly in its approach to the target blood vessel. For example, when inserting a peripherally inserted central catheter (PICC), the needle must be guided over a large distance from its peripheral insertion site through a vein or artery to the target, and it can be challenging to correctly track or estimate the needle's position. Moreover, the target may often be an essential, central blood vessel, such as the superior vena cava (SVC) or the cavoatrial junction (CAJ) thereof. Accordingly, it can be important to know the location and orientation of the needle precisely, in order to target the catheter insertion in a minimally invasive way and with the least risk of harm.


Other potential problems include navigation of the catheter tip, e.g., a PICC tip, during insertion, and confirmation of the tip location following insertion. In particular, it is very important to track the location of a PICC, central venous catheter (CVC), another catheter, or another medical device, in order to perform implantation minimally invasively and with minimal risk. It is likewise crucial to confirm that a medical device, such as a PICC, has been implanted in the correct target, such as the SVC. However, some methods for tip tracking and confirmation require fluoroscopy or X-ray exposure, and may also require exposure to harmful contrast media.


Disclosed herein is a target recognition and needle guidance system. The target recognition and needle guidance system comprises a console including memory and a processor, and an ultrasound probe. The console is configured to instantiate a target recognition process for recognizing an anatomical target of a patient, by applying an artificial intelligence model to features of candidate targets in ultrasound-imaging data to determine a recognition score. The console is further configured to instantiate a needle guidance process for guiding insertion of a needle into the anatomical target using the ultrasound-imaging data received by the console. The ultrasound probe is configured to provide to the console electrical signals corresponding to the ultrasound-imaging data. The ultrasound probe includes an array of transducers (optionally piezoelectric) configured to convert reflected ultrasound signals from the patient into an ultrasound-imaging portion of the electrical signals.


In some embodiments, the artificial intelligence model includes a supervised learning model trained based on previously-identified training targets.


In some embodiments, the recognition score comprises a probability or confidence level associated with a classification of the candidate targets in the ultrasound-imaging data.


In some embodiments, the artificial intelligence model comprises one or more of the following supervised learning methods: logistic regression; other linear classifiers; support vector machines; quadratic classifiers; kernel estimation; decision trees; neural networks; or learning vector quantization.


In some embodiments, the artificial intelligence model includes the logistic regression, and the score is determined as a weighted sum of the features of the candidate targets.


In some embodiments, the artificial intelligence model includes the neural network, and the score is determined based on one or more nonlinear activation functions of the features of the candidate targets.


In some embodiments, the supervised learning model is trained by iteratively minimizing an error for classifications predicted by a candidate model compared to actual classifications of the previously-identified training targets.


In some embodiments, the anatomical target comprises a blood vessel.


In some embodiments, the blood vessel comprises a superior vena cava.


In some embodiments, the ultrasound probe provides a first ultrasound image prior to the insertion of the needle. The artificial intelligence model is based in part on a trajectory of the needle.


In some embodiments, the features of the ultrasound-imaging data include one or more of the following features: a shape of a candidate blood vessel of the candidate targets in the ultrasound-imaging data; a size of the candidate blood vessel; a number of branches of the candidate blood vessel; or a complexity of branches of the candidate blood vessel.


In some embodiments, recognizing the anatomical target comprises one or more of: providing identifying information characterizing one or more potential targets to a user; suggesting the one or more potential targets for a selection by the user; or selecting a target of the one or more potential targets.


In some embodiments, recognizing the anatomical target comprises providing identifying information characterizing one or more potential targets to a user. The identifying information comprises a predicted classification of the one or more potential targets and a confidence level associated with the classification.


In some embodiments, the target recognition and needle guidance system further comprises a display screen configured for graphically guiding the insertion of the needle into the anatomical target of the patient.


In some embodiments, a catheter is disposed within the needle. The catheter is implanted into the anatomical target following the insertion of the needle.


In some embodiments, the target recognition and needle guidance system further includes a catheter magnetic sensor or a catheter light detector. The catheter magnetic sensor can be configured to provide to the console electrical signals corresponding to magnetic information about the catheter. The catheter light detector can be configured to provide to the console electrical signals corresponding to optical information about the catheter. The processor is further configured to instantiate a magnetic catheter tip tracking process based on the catheter magnetic information or an optical catheter tip tracking process based on the catheter optical information.


In some embodiments, the target recognition and needle guidance system further includes one or more electrocardiography probes configured to provide to the console electrical signals corresponding to electrocardiography information. The processor is further configured to instantiate a catheter tip placement confirmation process based on the electrocardiography information.


In some embodiments, the anatomical target comprises a biopsy target, a nerve to be blocked, or an abscess to be drained.


In some embodiments, the system further comprises a needle magnetizer incorporated into the console configured to magnetize a needle to obtain a magnetized needle.


In some embodiments, the needle guidance process for guiding insertion of the magnetized needle into the anatomical target uses a combination of the ultrasound-imaging data and magnetic-field data received by the console.


In some embodiments, the ultrasound probe is configured to provide to the console the magnetic-field data, the ultrasound probe further including an array of magnetic sensors configured to convert magnetic signals from the magnetized needle into a magnetic-field portion of the electrical signals.


Also disclosed herein is a method of a target recognition and needle guidance system. The method includes, in some embodiments, instantiating in memory of a console a target recognition process and a needle guidance process. The target recognition process can be for recognizing an anatomical target of a patient, by applying an artificial intelligence model to features of candidate targets in ultrasound-imaging data to determine a recognition score. The needle guidance process can be for guiding insertion of a needle into the anatomical target using the ultrasound-imaging data received by the console. The method further includes loading the ultrasound-imaging data in the memory, the ultrasound-imaging data corresponding to electrical signals received from an ultrasound probe. The method further includes processing the ultrasound-imaging data with a processor of the console according to the target recognition process and the needle guidance process. The method further includes guiding the insertion of the needle, according to the needle guidance process, into the anatomical target of the patient.


Also disclosed herein is a method of a target recognition and needle guidance system. The method includes, in some embodiments, obtaining a needle. The method further includes imaging an anatomical region of a patient with an ultrasound probe to produce ultrasound-imaging data. The method further includes recognizing, by the console, an anatomical target of the anatomical region of the patient, by applying an artificial intelligence model to features of candidate targets in ultrasound-imaging data to determine a recognition score. The method further includes orienting the needle for insertion into the anatomical target of the patient while imaging the anatomical region with the ultrasound probe. The method further includes inserting the needle into the anatomical target of the anatomical region in accordance with guidance provided by a needle guidance process instantiated by the console upon processing the ultrasound-imaging data.


In some embodiments, a catheter is disposed within the needle. The method further comprises implanting the catheter into the anatomical target following the insertion of the needle. In some embodiments, the needle guidance process for guiding insertion of a magnetized needle into the anatomical target uses a combination of the ultrasound-imaging data and magnetic-field data received by the console.


In some embodiments, the ultrasound probe is configured to provide to the console the magnetic-field data, the ultrasound probe further including an array of magnetic sensors configured to convert magnetic signals from a magnetized needle into a magnetic-field portion of the electrical signals.


These and other features of the concepts provided herein will become more apparent to those of skill in the art in view of the accompanying drawings and following description, which disclose particular embodiments of such concepts in greater detail.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the disclosure are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:



FIG. 1 illustrates an example ultrasound probe and needle guidance system with a patient, according to some embodiments;



FIG. 2 illustrates insertion of a needle into a blood vessel using an ultrasound probe and needle guidance system, according to some embodiments;



FIG. 3 displays an example optical catheter tip tracking system and a catheter, according to some embodiments;



FIG. 4A illustrates an example ultrasound probe, target identification system, and needle guidance system with a patient, according to some embodiments;



FIG. 4B illustrates an example console display of an ultrasound, needle guidance, and target identification system, according to some embodiments;



FIG. 5 shows a block diagram of a needle guidance and target identification system, in accordance with some embodiments;



FIG. 6A illustrates a target identification system in use in conjunction with magnetic catheter tip tracking, in accordance with some embodiments;



FIG. 6B illustrates a target identification system in use in conjunction with optical catheter tip tracking, in accordance with some embodiments;



FIG. 7 shows catheter tip confirmation, in accordance with some embodiments;



FIG. 8 shows a flowchart of an example method for ultrasound imaging, target identification, and needle guidance, according to some embodiments; and



FIG. 9 shows a flowchart of an example method for target identification, according to some embodiments.





DETAILED DESCRIPTION

Before some particular embodiments are disclosed in greater detail, it should be understood that the particular embodiments disclosed herein do not limit the scope of the concepts provided herein. It should also be understood that a particular embodiment disclosed herein can have features that can be readily separated from the particular embodiment and optionally combined with or substituted for features of any of a number of other embodiments disclosed herein.


Regarding terms used herein, it should also be understood the terms are for the purpose of describing some particular embodiments, and the terms do not limit the scope of the concepts provided herein. Ordinal numbers (e.g., first, second, third, etc.) are generally used to distinguish or identify different features or steps in a group of features or steps, and do not supply a serial or numerical limitation. For example, “first,” “second,” and “third” features or steps need not necessarily appear in that order, and the particular embodiments including such features or steps need not necessarily be limited to the three features or steps. Labels such as “left,” “right,” “top,” “bottom,” “front,” “back,” and the like are used for convenience and are not intended to imply, for example, any particular fixed location, orientation, or direction. Instead, such labels are used to reflect, for example, relative location, orientation, or directions. Singular forms of “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.


With respect to “proximal,” a “proximal portion” or a “proximal end portion” of, for example, a probe disclosed herein includes a portion of the probe intended to be near a clinician when the probe is used on a patient. Likewise, a “proximal length” of, for example, the probe includes a length of the probe intended to be near the clinician when the probe is used on the patient. A “proximal end” of, for example, the probe includes an end of the probe intended to be near the clinician when the probe is used on the patient. The proximal portion, the proximal end portion, or the proximal length of the probe can include the proximal end of the probe; however, the proximal portion, the proximal end portion, or the proximal length of the probe need not include the proximal end of the probe. That is, unless context suggests otherwise, the proximal portion, the proximal end portion, or the proximal length of the probe is not a terminal portion or terminal length of the probe.


With respect to “distal,” a “distal portion” or a “distal end portion” of, for example, a probe disclosed herein includes a portion of the probe intended to be near or in a patient when the probe is used on the patient. Likewise, a “distal length” of, for example, the probe includes a length of the probe intended to be near or in the patient when the probe is used on the patient. A “distal end” of, for example, the probe includes an end of the probe intended to be near or in the patient when the probe is used on the patient. The distal portion, the distal end portion, or the distal length of the probe can include the distal end of the probe; however, the distal portion, the distal end portion, or the distal length of the probe need not include the distal end of the probe. That is, unless context suggests otherwise, the distal portion, the distal end portion, or the distal length of the probe is not a terminal portion or terminal length of the probe.


The term “logic” may be representative of hardware, firmware or software that is configured to perform one or more functions. As hardware, the term logic may refer to or include circuitry having data processing and/or storage functionality. Examples of such circuitry may include, but are not limited or restricted to a hardware processor (e.g., microprocessor, one or more processor cores, a digital signal processor, a programmable gate array, a microcontroller, an application specific integrated circuit “ASIC”, etc.), a semiconductor memory, or combinatorial elements.


Additionally, or in the alternative, the term logic may refer to or include software such as one or more processes, one or more instances, Application Programming Interface(s) (API), subroutine(s), function(s), applet(s), servlet(s), routine(s), source code, object code, shared library/dynamic link library (dll), or even one or more instructions. This software may be stored in any type of a suitable non-transitory storage medium, or transitory storage medium (e.g., electrical, optical, acoustical or other form of propagated signals such as carrier waves, infrared signals, or digital signals). Examples of a non-transitory storage medium may include, but are not limited or restricted to a programmable circuit; non-persistent storage such as volatile memory (e.g., any type of random access memory “RAM”); or persistent storage such as non-volatile memory (e.g., read-only memory “ROM”, power-backed RAM, flash memory, phase-change memory, etc.), a solid-state drive, hard disk drive, an optical disc drive, or a portable memory device. As firmware, the logic may be stored in persistent storage.


Disclosed herein is a system, apparatus and method directed to an ultrasound probe, target identification system, and needle guidance system. One frequent challenge in procedures and therapies utilizing ultrasound imaging, such as catheter insertion, biopsy, nerve blocks, or drainage, is identifying the proper targets. A second challenge is locating and guiding a needle tip, particularly in its approach to a target blood vessel. Additional challenges can include navigation of a catheter tip, such as a peripherally inserted central catheter (PICC) tip, during insertion in the target, and confirmation of the tip location following insertion.



FIG. 1 illustrates an example ultrasound probe 106 included within a needle guidance system 100 with a patient P, according to some embodiments. System 100 performs ultrasound imaging of an insertion target, in this example a vasculature of patient P, together with magnetic needle guidance. The needle guidance system 100 can have a variety of uses, including placement of needles in preparation for inserting the catheter 112 or other medical components into a body of a patient. In this example, a clinician employs ultrasound probe 106 and needle guidance system 100 while performing a procedure to place a catheter 112 into a vasculature of the patient P through a skin insertion site S. Together, the ultrasound imaging and needle guidance functionalities enable the clinician to guide the needle accurately to its intended target.


However, there remains a need for assistance identifying the insertion target efficiently and accurately. In particular, there remains a need to reduce target identification errors that could lead to dangerous complications for the patient, while also reducing the costs and clinical time necessary to locate, identify, and insert the needle into the target. The system and methods disclosed herein below can address these needs.



FIG. 1 further depicts a needle-based device, namely a catheter-insertion device 144, used to gain initial access to the vasculature of the patient P via the insertion site S to deploy the catheter 112. Placement of a needle into a vasculature at the insertion site S is typically performed prior to insertion of the catheter 112. The catheter 112 generally includes a proximal portion 114 that remains exterior to the patient and a distal portion 116 that resides within the patient P's vasculature after placement is complete. The needle guidance system 100 is employed to ultimately position a distal tip 118 of the catheter 112 in a desired position within the vasculature. The proximal portion 114 of the catheter 112 further includes a Luer connector 120 configured to operably connect the catheter 112 with one or more other medical devices or systems.


The needle of the catheter-insertion device is configured to cooperate with the needle guidance system 100 in enabling the needle guidance system 100 to detect the position, orientation, and advancement of the needle during an ultrasound-based placement procedure. In some examples, other needles or medical devices may also be magnetized and used with the needle guidance system 100.


The display screen 104 is integrated into the console 102 and can display information to the clinician during the placement procedure, such as an ultrasound image of the targeted internal body portion obtained by ultrasound probe 106. In particular, the needle guidance process of the needle guidance system 100 can graphically guide insertion of a needle into the target by way of the display screen 104.



FIG. 2 illustrates insertion of a needle into a blood vessel using an ultrasound probe 106 and needle guidance system, in accordance with some embodiments. The ultrasound probe 106 includes a sensor array for detecting the position, orientation, and movement of a needle during ultrasound imaging procedures. The sensor array includes a number of magnetic sensors 258 embedded within or included on the housing of the ultrasound probe 106. The magnetic sensors 258 detect a magnetic field or magnetic signals associated with the needle when the needle is magnetized and in proximity to the sensor array. Sensors 258 also convert the magnetic signals from the magnetized needle into a magnetic-field portion of the electrical signals provided to the console 102. (See the illustrated magnetic field B of the needle.) Accordingly, the sensor array enables the needle guidance system 100 to track a magnetized needle or the like.


Though illustrated in this example as magnetic sensors, sensors 258 may be sensors of other types and configurations. Also, in some examples, the magnetic sensors 258 of the sensor array can be included in a component separate from the ultrasound probe 106, such as a separate handheld device. In some embodiments, the magnetic sensors 258 are disposed in an annular configuration about the head 146 of the ultrasound probe 106, though it is appreciated that the magnetic sensors 258 can be arranged in other configurations, such as in an arched, planar, or semi-circular arrangement.


Five magnetic sensors 258 are included in the sensor array so as to enable detection of a needle in three spatial dimensions (i.e., X, Y, Z coordinate space), as well as the pitch and yaw orientation of the needle itself. More generally, the number, size, type, and placement of the magnetic sensors 258 of the sensor array may vary from what is explicitly shown here.
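By way of a non-limiting illustration, one common way to recover a five-degree-of-freedom needle pose (position plus pitch and yaw) from such a sensor array is to fit a point-dipole field model to the sensor readings. The sketch below assumes that approach; the sensor coordinates, dipole moment magnitude, and function names are illustrative assumptions rather than details of the disclosed system.

```python
# Minimal sketch: estimating needle position (x, y, z) and orientation
# (pitch, yaw) from readings of five 3-axis magnetic sensors, assuming
# the magnetized needle is well approximated by a point dipole.
# Sensor positions and the moment magnitude are illustrative assumptions.
import numpy as np
from scipy.optimize import least_squares

SENSOR_POSITIONS = np.array([            # sensor locations in probe coordinates (m)
    [0.00, 0.02, 0.0], [0.02, 0.00, 0.0], [0.00, -0.02, 0.0],
    [-0.02, 0.00, 0.0], [0.00, 0.00, 0.01]])
MOMENT_MAGNITUDE = 1e-3                  # assumed dipole moment of the needle (A*m^2)

def dipole_field(tip, direction, sensor):
    """Magnetic flux density of a point dipole at a sensor location."""
    mu0_4pi = 1e-7
    r = sensor - tip
    dist = np.linalg.norm(r)
    r_hat = r / dist
    m = MOMENT_MAGNITUDE * direction
    return mu0_4pi * (3.0 * r_hat * np.dot(m, r_hat) - m) / dist**3

def residuals(params, measured):
    x, y, z, pitch, yaw = params
    direction = np.array([np.cos(pitch) * np.cos(yaw),
                          np.cos(pitch) * np.sin(yaw),
                          np.sin(pitch)])
    tip = np.array([x, y, z])
    predicted = np.array([dipole_field(tip, direction, s) for s in SENSOR_POSITIONS])
    return (predicted - measured).ravel()

def estimate_needle_pose(measured_fields, initial_guess=(0.0, 0.0, 0.05, 0.0, 0.0)):
    """Fit tip position and pitch/yaw to the five 3-axis sensor readings."""
    fit = least_squares(residuals, initial_guess, args=(measured_fields,))
    return fit.x  # (x, y, z, pitch, yaw)
```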


A needle formed of a magnetizable material can be magnetized by the magnetizer 108 and later tracked by the needle guidance system 100 when the needle is percutaneously inserted into a body of a patient (e.g., the body of the patient P) during a procedure to insert the needle or an associated medical device (e.g., the catheter 112 of the catheter-insertion device 144) into the body of the patient P. For example, the needle may be composed of a stainless steel such as SS 304, or of some other suitable needle material capable of being magnetized, and is not limited by the present disclosure. The needle material may be ferromagnetic or paramagnetic. Accordingly, the needle can produce a magnetic field or signals detectable by the sensor array of the ultrasound probe 106 so as to enable the location, orientation, and movement of the needle to be tracked by the needle guidance system 100.


Needle guidance system 100 and magnetic sensors 258 are described further in US 2021/0169585, which is incorporated by reference in its entirety into this application.


In addition to tracking the location of the insertion needle, it also can be useful to track a medical device, such as the tip and/or stylet of a PICC, central venous catheter (CVC), or other catheter, during insertion. For example, such tip tracking can improve the accuracy and efficiency of catheter placement, improve patient outcomes, and reduce the risk of complications or injuries. Accordingly, in some examples, the system can use passive magnetic catheter tip tracking to locate and/or track tips of medical devices like PICCs, catheters, and/or stylets, and can thereby determine where such tips are located in relation to their target anatomical structures. This will be discussed further in the example of FIG. 6A below. In some cases, magnetic and electromagnetic means for tracking the tips of the medical devices are prone to interference. Accordingly, optical tip-tracking methods can also be used.



FIG. 3 displays an example optical catheter tip tracking system 300 and a catheter 350, according to some embodiments. As shown, the optical tip-tracking system 300 includes a light-emitting stylet 310, a light detector 320, and a console 102 configured to operably connect to the light-emitting stylet 310 and the light detector 320. Optical tip-tracking system 300 also includes a display screen 104, which may be a standalone screen or be integrated into console 102. Optical tip-tracking system 300 can also include a medical device, such as the catheter 350 in this example.


In this example, the catheter 350 is a peripherally inserted central catheter (PICC). In another example catheter 350 can be a central venous catheter (CVC). A light-emitting stylet 310 including a light source, such as one or more LEDs in a distal-end portion (e.g., a tip) of the stylet 310, can be disposed in one or more lumens of catheter 350. Alternatively, the light-emitting stylet 310 can convey light (e.g., via an optical fiber), from an external source (e.g., a light within console 102) to emit light from the distal-end portion (e.g., the tip) of stylet 310.


In this example, catheter 350 is a diluminal catheter including a catheter tube 352, a bifurcated hub 354, two extension legs 356, and two Luer connectors 358. Alternatively, catheter 350 may be a monoluminal catheter, or a multi-luminal catheter with three or more lumens. In this example, two lumens extend through the diluminal catheter 350, each formed of adjoining lumen portions that pass through the catheter tube 352 and the bifurcated hub 354. Each of the two extension legs 356 has an extension-leg lumen fluidly connected to one of the two hub lumens. Either lumen extending through the catheter 350 can accommodate a light-emitting stylet 310 disposed therein.


The system 300 can further include light detector 320, which includes a plurality of photodetectors 122 disposed within the light detector 320, and configured to detect the light emitted from the light source of light-emitting stylet 310. The photodetectors 122 are arranged in an array such that the light emitted from light-emitting stylet 310 remains detectable by at least one of the photodetectors 122, even when the light is anatomically blocked (e.g., by a rib) from another one or more of the photodetectors 122.


Optical catheter tip tracking system 300 and light-emitting stylet 310 are described further in US 2021/0154440, which is incorporated by reference in its entirety into this application.


However, an additional problem commonly arising with ultrasound imaging is identifying the proper anatomical targets. For example, some challenges faced by clinicians while performing ultrasound-assisted catheter insertion include differentiating veins from arteries, finding a particular target artery or vein, such as the superior vena cava (SVC), and generally finding a suitable target blood vessel for catheter insertion. Locating the correct targets may consume clinical time and resources, and errors could cause potentially dangerous complications. The disclosed system and methods can address these issues by providing a target identification system along with an ultrasound probe and a needle guidance system. Accordingly, the disclosed system can simultaneously determine the location of anatomical targets and medical instruments in relation to the ultrasound probe. The disclosed embodiments can also be applied in other ultrasound-based procedures and therapies, for example for biopsy, nerve blocks, or drainage procedures.



FIG. 4A illustrates an example ultrasound probe 406 and a target identification and needle guidance system 400 with a patient, according to some embodiments. In this example, system 400 includes both target identification and needle guidance functionalities. In particular, system 400 can perform target identification based on artificial intelligence (AI) methods, which can be trained to recognize targets in the ultrasound image.


For example, the system may recognize specific blood vessels or classes of blood vessels, or recognize targets in ultrasound-based biopsy, nerve block, or drainage procedures, based on AI. In the case of blood vessel targets, system 400 may be trained using ultrasound data from correctly-identified blood vessels, and may classify or segment blood vessels based on features such as shape (e.g., eccentricity, interior angles, or aspect ratio), size (e.g., minor or major axis length, diameter, or perimeter), number and complexity of branches, and the like. In an example, the training may proceed by iteratively minimizing an error or loss function for predicted compared to actual classifications in a training set, as a function of model weights. Hyper-parameters of the model may further be tuned with respect to a validation data set, and the model may be further evaluated against a testing data set. The trained model may then be deployed for use in a clinical setting.
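As a non-limiting illustration of the training workflow just described, the following sketch trains a vessel classifier on pre-extracted geometric features, tunes a hyper-parameter against validation folds, and evaluates on a held-out test set. The feature files, labels, and use of scikit-learn are assumptions for illustration only, not the patented implementation.

```python
# Minimal sketch of the described train/validate/test workflow for a
# vessel classifier, assuming pre-extracted geometric features (shape,
# size, branch count) and labels from previously identified vessels.
import numpy as np
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.linear_model import LogisticRegression

# X: one row per candidate vessel (eccentricity, aspect ratio, diameter,
# perimeter, branch count, ...); y: vessel class labels (e.g., "SVC").
X = np.load("vessel_features.npy")      # hypothetical pre-extracted features
y = np.load("vessel_labels.npy")        # hypothetical expert-assigned labels

# Hold out a test set, then tune hyper-parameters on validation folds.
X_trainval, X_test, y_trainval, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

search = GridSearchCV(
    LogisticRegression(max_iter=1000),    # iteratively minimizes log-loss
    param_grid={"C": [0.01, 0.1, 1.0, 10.0]},
    cv=5)                                 # hyper-parameter tuning via validation folds
search.fit(X_trainval, y_trainval)

# Final evaluation on the held-out test set before clinical deployment.
print("test accuracy:", search.score(X_test, y_test))
```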


In a typical usage example, a medical practitioner may use ultrasound probe 406 and target identification and needle guidance system 400 while inserting a PICC into the superior vena cava of patient P. For example, the PICC can be advanced through patient P's right basilic vein, right axillary vein, right subclavian vein, right brachiocephalic vein, and into the superior vena cava. In this example, the system 400 can help the practitioner identify the superior vena cava in ultrasound images, while also helping guide the insertion needle toward the superior vena cava. In some embodiments, system 400 can additionally track the position of the PICC and confirm correct insertion of the PICC into the superior vena cava, as described herein below.


In an embodiment, the system 400 can assist the clinician to identify targets by providing the clinician with information characterizing potential targets, for example a predicted classification and a confidence level associated with the classification. Alternatively, system 400 can suggest one or more potential targets for the clinician's approval, or even select the target automatically.


The system can perform classification or segmentation of targets based on any AI or machine learning (ML) method, and is not limited by the present disclosure. In some embodiments, the AI or machine learning methods include supervised classification methods and/or image segmentation methods. For example, the system may use logistic regression or other linear classifiers, support vector machines, quadratic classifiers, kernel estimation, decision trees, neural networks, convolutional neural networks, or learning vector quantization. In a typical embodiment, system 400 uses a supervised learning method to perform image segmentation or classification of potential targets. For example, system 400 may identify specific targets, such as affirmatively identifying the superior vena cava or cavoatrial junction.


Alternatively, system 400 may provide a probability that a potential target is the user's desired target, such as an 80% or 90% probability that a particular blood vessel is the SVC. In another example, system 400 may classify targets more broadly, such as by identifying whether a blood vessel is a vein or artery.


The AI or machine learning methods may produce models that are trained, as described above, before clinical use on a training set including ultrasound images of known targets. For example, the clinician may download training data and/or a trained and tuned model from a centralized repository. In such an example, the clinician may occasionally download new training data or a new trained model in order to recognize new targets or classes of targets, or to improve the recognition accuracy. Alternatively, the clinician can train system 400, for example by inputting information about the characteristics or identities of potential targets to system 400, or correcting any mistaken identifications of targets. In some embodiments, the system may alternatively use unsupervised learning methods to segment or classify potential targets, and is not limited by the present disclosure.


In addition to models that are only pre-trained (e.g., models trained prior to utilization), AI or machine learning methods may be utilized to generate models that may be dynamically trained, e.g., where results of scoring the model are utilized as training data.


By providing ultrasound-based target identification together with needle guidance, system 400 can provide accurate, safe, and expeditious methods for ensuring a needle is implanted in the intended target. System 400 can automate key steps of ultrasound-based therapies, such as identifying an anatomical target and advancing the needle during catheter insertion. This automation thereby reduces time and effort for clinicians, and may improve clinical accuracy and reduce the rate of errors and complications.


In addition, FIG. 4A shows the general relation of system 400 to the patient P during a procedure to place a catheter 412 into a vasculature of the patient P through a skin insertion site S. As in the example of FIG. 1 above, the catheter 412 includes proximal portion 414, which remains exterior to the patient, and distal portion 416, which resides within the vasculature after placement is complete. System 400 is employed to position a distal tip 418 of the catheter 412 in a desired position within the vasculature, with the aid of ultrasound imaging, magnetic needle guidance, and target identification.


Placement of a needle into a vasculature of a patient such as the patient P at the insertion site S is typically performed prior to insertion of the catheter 412. It should be appreciated the target identification and needle guidance system 400 has a variety of uses, including placement of needles in preparation for inserting the catheter 412 or other medical components into the body of patient P, such as X-ray or ultrasound markers, biopsy sheaths, nerve blocks, drainage, ablation components, bladder scanning components, vena cava filters, etc. For example, endoscopic ultrasound can be used to perform fine-needle aspiration biopsy, by guiding the placement of the biopsy needle. In this case, the disclosed ultrasound and target identification and needle guidance system 400 can be used to image and identify the target of the biopsy, and to guide the biopsy needle to the target. In another example, ultrasound can be used to guide needle placement in nerve block procedures. In this case, the disclosed system 400 can be used to image and identify the targeted nerve, and to guide the needle to the target. In a third example, ultrasound can be used to guide drainage procedures, e.g., to guide placement of a drainage catheter for draining collected fluid, such as from an abscess or a post-surgical collection of fluid. In this case, the disclosed system 400 can be used to image and identify the targeted abscess, and to guide the needle and/or catheter to the target.


Display screen 404 is integrated into the console 402 and is used to display information to the clinician, such as an ultrasound image of the targeted internal body portion obtained by the ultrasound probe 406. Indeed, the needle guidance process of target identification and needle guidance system 400 graphically guides insertion of a needle into a target (e.g., a blood vessel) of a patient by way of the display screen 404. In some examples, the display screen 404 may be separate from the console 402. In some embodiments, the display screen 404 is an LCD device.


In this example, display 404 shows the ultrasound image obtained by ultrasound probe 406. In particular, the image may include the insertion target of the needle, for example a target blood vessel for insertion of a PICC. System 400 can apply AI methods to identify this target by applying a supervised learning model to process features of the target in the ultrasound image, and thereby determine a classification of the target and an associated confidence score, as disclosed herein. In addition, the system can use magnetic needle guidance in concert with ultrasound, thereby informing the clinician when the needle reaches the vicinity of the target, for example by an ultrasound reflection or “flash” from the needle. This will be described further in the example of FIG. 4B below. In some embodiments, system 400 can further provide PICC tracking and PICC placement confirmation, as described in the examples of FIGS. 6A, 6B, and 7 below.



FIG. 4A additionally depicts a needle-based device, namely a catheter-insertion device 444, used to gain initial access to the vasculature of the patient P via the insertion site S to deploy the catheter 412. The needle of the catheter-insertion device 444 can cooperate with the needle guidance system 400 in enabling the needle guidance system 400 to detect the position, orientation, and advancement of the needle during an ultrasound-based placement procedure. Note that the needle of the catheter-insertion device 444 is merely one example of a needle or medical device that may be magnetized and used with the needle guidance system 400.


The needle magnetizer 408 is configured to magnetize all or a portion of a needle such as the needle of the catheter-insertion device 444 so as to enable the needle to be tracked during a placement procedure. The needle magnetizer 408 can include a single permanent magnet, a single electromagnet, a plurality of permanent magnets, a plurality of electromagnets, or a combination thereof within a body of needle magnetizer 408. For example, if a single permanent magnet, the permanent magnet can be a hollow cylindrical magnet disposed within the body of the needle magnetizer. If more than one permanent magnet, the permanent magnets can be disposed within the body of the needle magnetizer 408 in a multipole arrangement. Alternatively, the permanent magnets are annular magnets disposed within the body of the needle magnetizer in a stacked arrangement. In some embodiments, the needle magnetizer 408 can also be used to magnetize part or all of a catheter, such as the catheter's tip.



FIG. 4B illustrates an example console display 450 of an ultrasound, needle guidance, and target identification system, according to some embodiments. In this example, a clinician is inserting a needle into blood vessel 460, as shown in the ultrasound image 465 in console display 450. The ultrasound image 465 shows a depth-wise section of the patient's tissue, which is located under ultrasound probe 406 and corresponds to ultrasound beam 260 in the example of FIG. 2. The tissue section includes the target blood vessel 460. Depth scale 470 shows the advancing depth along the vertical dimension of the displayed tissue section. Display 450 also shows an outline of the needle shaft (solid lines), as the needle advances. The system can determine the location of the needle shaft via magnetic or optical needle tracking methods, as described herein. In addition, display 450 shows a projection of the needle's trajectory (dotted lines).


In this example, the system can apply artificial intelligence (AI) methods to identify target blood vessel 460. In a typical embodiment, the system uses a supervised learning method to segment or classify potential targets. For example, the system can use logistic regression or other linear classifiers, support vector machines, quadratic classifiers, kernel estimation, decision trees, neural networks, convolutional neural networks, or learning vector quantization. In some embodiments, the ultrasound image 465 can be filtered (e.g., by a linear filter such as a mean filter, or a non-linear filter such as an order statistic filter) in order to improve image quality. In some embodiments, the image 465 can be segmented into smaller blocks, or an active contour model can be applied. In some embodiments, pattern matching based on Markov random fields or neural networks can be applied.
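As a non-limiting illustration of the pre-processing options just mentioned, the sketch below applies a linear mean filter and a non-linear order-statistic (median) filter to an ultrasound frame and then splits the result into blocks; the frame source, kernel sizes, and block size are illustrative assumptions.

```python
# Minimal sketch of the pre-processing described above: smoothing an
# ultrasound frame with a linear (mean) filter or a non-linear
# order-statistic (median) filter before segmentation.
import numpy as np
from scipy.ndimage import uniform_filter, median_filter

frame = np.load("ultrasound_frame.npy")           # hypothetical 2-D image array

mean_filtered = uniform_filter(frame, size=5)     # linear mean filter
median_filtered = median_filter(frame, size=5)    # order-statistic filter

# Optionally split the filtered frame into smaller blocks for block-wise
# segmentation or pattern matching.
blocks = [median_filtered[i:i + 64, j:j + 64]
          for i in range(0, median_filtered.shape[0], 64)
          for j in range(0, median_filtered.shape[1], 64)]
```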


In an embodiment, system 400 can classify potential targets based on an AI model. In applying the model, system 400 can evaluate the model based on features corresponding to observable properties of the potential targets, such as blood vessels, in image 465. For example, the features may include shape of a blood vessel (e.g., eccentricity, interior angles, or aspect ratio), size (e.g., minor or major axis length, diameter, or perimeter), number and complexity of branches, and the like. In an example, the model is a regression model, and system 400 evaluates the model as a probability function of the feature values, such as a weighted sum. In another example, the model is an artificial neural network such as a convolutional neural network, and system 400 evaluates the model by successively applying one or more hidden layers of neurons on the feature values. Each neuron can comprise a nonlinear activation function on its inputs. In particular, the first layer of neurons can apply nonlinear activation functions on all the feature values, and subsequent layers can apply nonlinear activation functions on all the outputs of the previous layers. Such an artificial neural network can adjust its weighted associations according to a learning rule based on an error value or another cost function. In the case of multiple hidden layers, system 400 can identify abstract features, based on the original set of input features, via deep learning.
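As a non-limiting illustration of the two evaluations just described, the sketch below computes a logistic-regression score as a weighted sum of a candidate vessel's features and, for comparison, a small two-layer neural-network forward pass that applies nonlinear activation functions to the same features. The feature values and weights are placeholders, not trained parameters of the disclosed system.

```python
# Minimal sketch contrasting the two model evaluations described above.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Features of one candidate vessel: eccentricity, aspect ratio,
# major-axis length (mm), perimeter (mm), branch count (placeholders).
x = np.array([0.35, 1.2, 21.0, 68.0, 2.0])

# Logistic regression: probability from a weighted sum of the features.
w = np.array([-1.1, 0.4, 0.08, 0.01, 0.3])   # placeholder learned weights
b = -1.5
p_logistic = sigmoid(w @ x + b)

# Two-layer neural network: each hidden neuron applies a nonlinear
# activation to a weighted sum of all features; the output layer then
# applies a nonlinear activation to the hidden outputs.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 5)), np.zeros(8)   # placeholder weights
W2, b2 = rng.normal(size=(1, 8)), np.zeros(1)
hidden = np.tanh(W1 @ x + b1)
p_network = sigmoid(W2 @ hidden + b2)[0]
```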


Based on AI, the system may identify target 460, and/or may assist the clinician to identify targets by providing the clinician with identifying information characterizing potential targets. For example, the clinician may inform the system that the desired target is the patient's superior vena cava. In this example, the target identification logic may then determine that a particular blood vessel is the superior vena cava, and may accordingly flag this blood vessel for recommendation to the clinician as the insertion target. Alternatively, the target identification logic may automatically assume this identified blood vessel is the target, and may proceed to guide the clinician based on this assumption. In identifying a particular target, such as the SVC, the console may receive the ultrasound image 465, and may use image 465 as input into an AI-based model previously trained on ultrasound images of identified blood vessels. Evaluation of the AI-based model can result in a probability or confidence score that a blood vessel within the ultrasound image is the particular vessel requested by the user, e.g., the SVC.


In an alternative example, the clinician may not inform the system of a particular desired target. In this case, the target identification logic may then identify all potential targets present in the ultrasound image 465, and may flag all these potential targets for communication to the clinician. In an example, the target identification logic may rank the potential targets, for example by relevance or importance as an insertion target, by confidence of the classification, etc., and may communicate the recommendations to the clinician in the order of the ranking. In identifying all potential targets present in the ultrasound image 465, the console may receive image 465, and may use image 465 as input into an AI-based model trained with ultrasound images of identified blood vessels. The console can evaluate the AI-based model for each respective potential target in the received image 465, resulting in a respective probability or confidence score that each respective potential target is any of a set of known targets.
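A minimal sketch of such ranking behavior might look like the following, where each flagged candidate carries per-class confidence scores from the model and the candidates are listed for the clinician in order of their best score; the labels and scores are illustrative placeholders.

```python
# Minimal sketch: rank flagged candidate targets by their best confidence
# score and present them in that order.
candidates = [
    {"label": "vessel A", "scores": {"SVC": 0.97, "right brachiocephalic vein": 0.02}},
    {"label": "vessel B", "scores": {"right renal vein": 0.75, "right suprarenal vein": 0.25}},
]

def best_score(candidate):
    return max(candidate["scores"].values())

for rank, candidate in enumerate(sorted(candidates, key=best_score, reverse=True), start=1):
    top_class = max(candidate["scores"], key=candidate["scores"].get)
    print(f"{rank}. {candidate['label']}: {top_class} "
          f"({candidate['scores'][top_class]:.0%} confidence)")
```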


Once the potential targets have been identified and/or flagged for display, the console can direct the display 450 to output identifying information about these targets to the user. For example, display 450 may show a predicted classification of target 460, such as a blood vessel type, location, or name, and may show a confidence level associated with this classification. The predicted classification can be based on the AI model, e.g. by choosing the classification with the highest confidence score or a classification with a confidence score higher than a threshold value, such as 95%. Alternatively, system 400 can suggest one or more potential targets for the clinician's approval, or even select the target automatically. For example, if more than one potential target is present in the depth-wise section of the patient's tissue on display 450, system 400 can indicate the potential targets to the user, e.g. by highlighting them on display 450, or can provide the user with a menu of one or more suggested targets, or select the target automatically. Such suggestions can be based on the output of the AI model, for example based on having a high confidence score.


The system can display the predicted classification of target 460 and/or of a list of potential targets based on the probability or confidence score from evaluating the AI-based segmentation or classification model using image 465 as input. For example, if image 465 contains two potential targets, and the system has been trained to recognize 90 identified target blood vessels, the system can evaluate a classification probability for each of the two potential targets being each of the 90 known targets.


In an example, the system may determine with high confidence that the first potential target is a particular blood vessel, such as the SVC, and may accordingly identify the first potential target as such via display 450. The system may identify the first potential target when the probability or confidence score from the AI-based model exceeds a threshold, for example greater than 97% confidence. In another example, the system may identify that the second potential target has a 75% chance of being the right renal vein, and a 25% chance of being the right suprarenal vein, and may accordingly display both of these probabilities via display 450. The system may display multiple probabilities when the scores from the AI-based model are lower than a threshold, for example lower than 97% confidence.
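A minimal sketch of this display decision might look like the following, where a single identification is shown when the top class probability clears a confidence threshold and the leading alternatives are shown with their probabilities otherwise; the 97% threshold and example probabilities mirror the values above, while the function name is illustrative.

```python
# Minimal sketch: show one identification above the confidence threshold,
# otherwise show the leading alternatives with their probabilities.
CONFIDENCE_THRESHOLD = 0.97

def describe_target(class_probabilities):
    """class_probabilities: dict mapping known target names to probabilities."""
    ranked = sorted(class_probabilities.items(), key=lambda kv: kv[1], reverse=True)
    top_name, top_prob = ranked[0]
    if top_prob >= CONFIDENCE_THRESHOLD:
        return f"{top_name} ({top_prob:.0%} confidence)"
    return "; ".join(f"{name}: {prob:.0%}" for name, prob in ranked[:2])

print(describe_target({"SVC": 0.99, "right brachiocephalic vein": 0.01}))
print(describe_target({"right renal vein": 0.75, "right suprarenal vein": 0.25}))
```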


In some embodiments, the console can display the targets by overlaying a visual indicator (e.g., a circle or square, or another icon) on the target 460 in ultrasound image 465. In an embodiment, the display 450 can indicate whether the needle trajectory will intersect with the target 460, e.g. by highlighting the intersection. In some embodiments, the system can determine the target in part based on the needle's projected trajectory, for example the system may identify potential targets that are in, or close to, the needle's trajectory. For example, the AI model may incorporate one or more features measuring proximity of a potential target to the needle's trajectory. In this case, the AI model may be trained based on these proximity features, together with geometrical features of the potential targets. In an embodiment, the console can determine the distance from the needle's current location to the target, and display this distance. In an embodiment, the console can also estimate the time until the needle will advance to the target, and display this estimate. In various embodiments, the distance and/or time to the target can be estimated by a formula, and/or based on AI methods. For example, the system may be trained to estimate the time based on a training set of previous implantations.
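As a non-limiting illustration of the trajectory-related quantities just mentioned, the sketch below computes the perpendicular distance from a candidate target to the needle's projected trajectory (usable as a proximity feature for the model) and a simple distance-over-speed estimate of the time to target; the coordinates and assumed advance speed are illustrative only.

```python
# Minimal sketch: proximity of a candidate target to the needle's projected
# trajectory, plus a simple distance/time-to-target estimate.
import numpy as np

def distance_to_trajectory(target, tip, direction):
    """Perpendicular distance from a target point to the needle's projected line."""
    d = direction / np.linalg.norm(direction)
    offset = target - tip
    return np.linalg.norm(offset - np.dot(offset, d) * d)

needle_tip = np.array([0.0, 0.0, 0.0])           # current tip location (mm)
needle_direction = np.array([0.0, 0.6, 0.8])     # advance direction
target_center = np.array([1.0, 15.0, 21.0])      # candidate target centroid (mm)

proximity = distance_to_trajectory(target_center, needle_tip, needle_direction)
distance_remaining = np.linalg.norm(target_center - needle_tip)
advance_speed_mm_per_s = 2.0                      # assumed insertion speed
eta_seconds = distance_remaining / advance_speed_mm_per_s
```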


Note that insertion can be performed out-of-plane (i.e., when the needle's trajectory is not in the plane of the ultrasound beam 260, as shown in the example of FIG. 2) or in-plane (i.e., when the needle's trajectory is in the plane of the ultrasound beam). During in-plane insertion, the solid lines show the outline of the needle's actual location. However, during out-of-plane insertion, because part or all of the needle may not actually be located in the plane of the ultrasound beam, the solid lines show a projection of the needle's outline into the plane of the beam. In both cases, the location of the needle may be determined via magnetic or optical needle tracking, as described herein, rather than directly from the ultrasound image. In the case of out-of-plane insertion, display 450 can also show an Intersection Window, which is the intersection between the needle shaft and the plane of the ultrasound beam.


During out-of-plane insertion, adjusting the angle of the needle up or down or changing the distance between the needle and probe may change the position of the Intersection Window. A flat angle is required for shallow insertion, while a steep angle is required for deeper insertion.


When the needle overlay advances through the Intersection Window, a needle flash appears in the Intersection Window. The needle flash is the reflection of the ultrasound energy from the needle. The presence of the needle flash is an indication the needle is at the target, because it shows the needle tip has reached the plane of the ultrasound beam.


In some embodiments, the system 100 of FIG. 1 may include an alternative-reality (AR) headset and corresponding AR anatomy representation logic. The AR anatomy representation logic may be configured to generate a virtual object that is presented in an AR environment such as a virtual reality, augmented reality or a mixed reality in which the virtual object overlays an image produced by the system 100 (e.g., an ultrasound image) or a series of images (e.g., a video) associated with a real-world setting (e.g., video including a patient or a body part of the patient, a physical structure, etc.). The virtual object may be visible through the AR headset or visible on a display of the console 102 without the AR headset. For example, the display 450 of FIG. 4B may be represented as a virtual object visible through the AR headset or visible on a display of the console 102. Alternatively, in lieu of the system 100 as described herein, it is contemplated that a magnetic field imaging system may be deployed. It is contemplated that components and functions of the console 102 described in reference to the system 100 should be understood to apply to the magnetic field imaging system or a similar system. Notwithstanding the foregoing, in some embodiments of the system 100, at least a portion of the functionality of the AR anatomy representation logic may be deployed within the AR headset in lieu of the console 102. Herein, the AR headset or another component operating in cooperation with the AR headset may serve as the console or perform the functions (e.g., processing) thereof.


With respect to “alternative reality,” the term alternative reality may pertain to virtual reality, augmented reality, and mixed reality unless context suggests otherwise. “Virtual reality” includes virtual content in a virtual setting, which setting can be a fantasy or a real-world simulation. “Augmented reality” and “mixed reality” include virtual content in a real-world setting such as a real depiction of a portion of a patient's body including the anatomical element. Augmented reality includes the virtual content in the real-world setting, but the virtual content is not necessarily anchored in the real-world setting. For example, the virtual content can be information overlying the real-world setting. The information can change as the real-world setting changes due to time or environmental conditions in the real-world setting, or the information can change as a result of a consumer of the augmented reality moving through the real-world setting; however, the information remains overlying the real-world setting. Mixed reality includes the virtual content anchored in every dimension of the real-world setting. For example, the virtual content can be a virtual object anchored in the real-world setting. The virtual object can change as the real-world setting changes due to time or environmental conditions in the real-world setting, or the virtual object can change to accommodate the perspective of a consumer of the mixed reality as the consumer moves through the real-world setting. The virtual object can also change in accordance with any interactions with the consumer or another real-world or virtual agent. Unless the virtual object is moved to another location in the real-world setting by the consumer of the mixed reality, or some other real-world or virtual agent, the virtual object remains anchored in the real-world setting. Mixed reality does not exclude the foregoing information overlying the real-world setting described in reference to augmented reality.


An alternative-reality (AR) headset and corresponding AR anatomy representation logic are described further in U.S. patent application Ser. No. 17/397,486, filed Aug. 9, 2021, which is incorporated by reference in its entirety into this application.



FIG. 5 shows a block diagram of a needle guidance and target identification system 500, in accordance with some embodiments. A console button interface 540 and control buttons 142 (see FIG. 1) included on the ultrasound probe 406 can be used to immediately call up a desired mode to the display screen 404 by the clinician to assist in the placement procedure. As shown, the ultrasound probe 406 further includes a button and memory controller 548 for governing button and ultrasound probe operation. The button and memory controller 548 can include non-volatile memory (e.g., EEPROM). The button and memory controller 548 is in operable communication with a probe interface 550 of the console 402, which includes a piezo input/output component 552 for interfacing with the probe piezoelectric array and a button and memory input/output component 554 for interfacing with the button and memory controller 548.


The console 402 houses a variety of components of the needle guidance and target identification system 500, and console 402 may take a variety of forms. A processor 522, including memory 524 such as random-access memory (“RAM”) and non-volatile memory (e.g., electrically erasable programmable read-only memory (“EEPROM”)) is included in the console 402 for controlling system function and executing various algorithms during operation of the needle guidance and target identification system 500.


For example, the console 402 is configured with target awareness logic 560 to instantiate a target identification process for recognizing an anatomical target (e.g., a blood vessel such as the SVC) of a patient. The target identification process involves processing of ultrasound images by the console 402 with artificial intelligence methods to identify anatomical targets, as disclosed herein. In particular, the target awareness logic 560 can apply image processing logic 562 to receive the ultrasound images and locate potential targets within the images. It can then apply target identification logic 564 to identify the potential targets, for example by determining the identity of the individual potential targets, or by matching one potential target to a desired target. Target identification logic 564 may use a supervised learning heuristic, as described in the example of FIGS. 4A and 4B above. Finally, target awareness logic 560 can apply target recommendation logic 566 to suggest one or more targets to the user. For example, target recommendation logic 566 can provide the clinician with information identifying and/or characterizing potential targets, for example a predicted classification of a potential target and a confidence level associated with the classification. Alternatively, target recommendation logic 566 can suggest one or more potential targets for the clinician's approval, or even select a particular target automatically.
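A minimal sketch of this three-stage flow might look like the following, with stages corresponding to the image processing logic 562, the target identification logic 564, and the target recommendation logic 566; the data structures, the placeholder detection, and the classify callable are illustrative assumptions rather than the actual logic blocks.

```python
# Minimal sketch of the target awareness flow: locate candidates, score
# them against known vessel classes, then rank suggestions for the clinician.
from dataclasses import dataclass, field

@dataclass
class Candidate:
    region: tuple                                   # (row, col, height, width) in the image
    scores: dict = field(default_factory=dict)      # class name -> confidence

def image_processing(frame):
    """Locate candidate target regions in the ultrasound frame."""
    return [Candidate(region=(120, 200, 40, 40))]   # placeholder detection

def target_identification(candidates, classify):
    """Score each candidate against the known target classes."""
    for c in candidates:
        c.scores = classify(c.region)               # e.g., returns {"SVC": 0.98, ...}
    return candidates

def target_recommendation(candidates, desired=None):
    """Suggest targets, optionally matching the clinician's desired target."""
    if desired:
        candidates = [c for c in candidates if desired in c.scores]
    return sorted(candidates, key=lambda c: max(c.scores.values()), reverse=True)
```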


In addition, the console 402 is configured to instantiate a needle guidance process for guiding insertion of a needle into the anatomical target. The needle guidance process uses a combination of ultrasound-imaging data and magnetic-field data received by the console 402 for guiding the insertion of the needle into the target of the patient.


A digital controller/analog interface 526 is also included with the console 402 and is in communication with the processor 522 and other system components to govern interfacing between the ultrasound probe 406, the needle magnetizer 408, the RFID-tag reader 410, and other system components.


The RFID-tag reader 410 is configured to read at least passive RFID tags included with needles or other medical devices (e.g., catheter-insertion device 144, as shown in FIG. 1), which enables the needle guidance and target identification system 500 to customize its operation to particular needle or medical-device parameters (e.g., type, size, etc.). For example, the needle-guidance process, once instantiated by the needle guidance and target identification system 500, is configured to adjust needle-guidance parameters in accordance with electronically stored information read from an RFID tag for a needle. To read such an RFID tag, the RFID-tag reader is configured to emit interrogating radio waves toward the RFID tag and read the electronically stored information from the RFID tag.
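
As one hedged illustration of this behavior, the sketch below maps tag contents to needle-guidance parameters; the tag field names and parameter names are hypothetical and are used only to show the idea of tailoring guidance to the identified device.

```python
# Hypothetical mapping from RFID tag contents to needle-guidance parameters.
# Field names ("device_type", "gauge", "length_mm") and parameters are assumptions.
def guidance_parameters_from_tag(tag: dict) -> dict:
    params = {
        "needle_gauge": tag.get("gauge", 21),
        "projected_path_length_mm": tag.get("length_mm", 50),  # overlay length on display
    }
    if tag.get("device_type") == "catheter-insertion device":
        params["depth_warning_mm"] = 30  # warn if insertion depth exceeds this value
    return params


print(guidance_parameters_from_tag(
    {"device_type": "catheter-insertion device", "gauge": 21, "length_mm": 70}))
```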


The needle guidance and target identification system 500 further includes ports 528 for connection with additional components such as optional components 530 including a printer, storage media, keyboard, etc. The ports 528 in some embodiments are universal serial bus (“USB”) ports, though other port types or a combination of port types can be used for this and the other interface connections described herein. A power connection 532 is included with the console 402 to enable operable connection to an external power supply 534. An internal power supply 536 (e.g., a battery) can also be employed either with or exclusive of the external power supply 534. Power management circuitry 538 is included with the digital controller/analog interface 526 of the console 402 to regulate power use and distribution.


In addition to tracking the location of the insertion needle, it also can be useful to track a medical device during insertion, such as the tip and/or stylet of a PICC, central venous catheter (CVC), or other catheter. For example, such tip tracking can improve the accuracy and efficiency of catheter placement, improve patient outcomes, and reduce the risk of complications or injuries.



FIG. 6A illustrates a target identification system 600 in use together with magnetic catheter tip tracking, in accordance with some embodiments. In some embodiments, the system 600 can use passive magnetic catheter tip tracking to locate and/or track tips of medical devices, such as PICCs, catheters, and/or stylets, and can thereby determine where such tips are located in relation to their target anatomical structures. In this example, PICC 412 is magnetically tracked by system 600 while being inserted into patient P. Alternatively, another catheter, or another medical instrument, may be tracked. Operationally, the magnetic tip tracking can function similarly to the magnetic needle guidance in the example of FIG. 4B above. In particular, PICC 412, or the tip of stylet 635 disposed within PICC 412, can be magnetized. Sensor 610, which can be positioned on the chest of patient P, can detect the magnetic field from PICC 412.


In this example, sensor 610 of the tip-tracking system can detect the location of PICC 412, and can send signals representing the location of PICC 412 to console 402 to display on screen 404. In conjunction with the tip-tracking system, console 402 can also apply AI to recognize targets, such as target blood vessels. Display screen 404 can display an ultrasound image of target blood vessel 630 together with the location of PICC 412 and/or the insertion needle.


Initially, the tip of PICC 412, or the tip of stylet 635 disposed within PICC 412, may be outside the range of sensor 610. As the stylet tip approaches the sensor 610, the console can display the location, orientation, and depth of the stylet 635 in relation to the sensor 610. The PICC 412 must be advanced slowly (e.g., 1 cm per second) and steadily in order to be tracked accurately. A clinician can continue to advance PICC 412 slowly until PICC 412 has been inserted to the required, externally measured length.
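
To illustrate the general principle of passive magnetic tracking (without reproducing the actual algorithm), the sketch below treats the magnetized tip as a point dipole whose field magnitude falls off roughly as 1/r³ at the chest sensor; the dipole moment value and the single-sensor, on-axis assumption are placeholders, and a real system may fit a full vector model to an array of sensors.

```python
# Illustrative dipole-based distance estimate for passive magnetic tip tracking.
# Assumes the magnetized stylet tip acts as a point dipole measured on-axis by a
# single sensor; the dipole moment value is a placeholder, not a device parameter.
MU0_OVER_4PI = 1e-7        # T*m/A
DIPOLE_MOMENT = 0.05       # A*m^2, assumed magnetization of the stylet tip


def field_magnitude(distance_m: float) -> float:
    """On-axis dipole field magnitude, which falls off as 1/r^3."""
    return 2.0 * MU0_OVER_4PI * DIPOLE_MOMENT / distance_m ** 3


def estimate_distance(measured_tesla: float) -> float:
    """Invert the 1/r^3 falloff to recover an approximate sensor-to-tip distance."""
    return (2.0 * MU0_OVER_4PI * DIPOLE_MOMENT / measured_tesla) ** (1.0 / 3.0)


# The measured field grows rapidly as the tip approaches the chest sensor:
for d_cm in (15, 10, 5):
    b = field_magnitude(d_cm / 100.0)
    print(f"{d_cm} cm -> {b:.2e} T -> back-estimated {estimate_distance(b) * 100:.1f} cm")
```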



FIG. 6A also shows electrocardiography (ECG) probes 620 and 625. In an embodiment, during or following insertion of PICC 412 into the target blood vessel (or other target anatomical structure), the system can also be used to perform catheter tip confirmation based on ECG. Tip confirmation can confirm that the catheter tip has been inserted properly into the target. This will be discussed in the example of FIG. 7 below.


Fluoroscopic methods for guiding medical devices, such as guidewires and catheters, and for confirming their placement typically expose clinicians and patients to potentially harmful X-ray radiation and contrast media. The magnetic tip-tracking methods disclosed in this example, the optical tip-tracking methods (see FIG. 6B), and the ECG-based tip-confirmation methods (see FIG. 7) can accurately locate and confirm medical devices like guidewires and catheters without such exposure. Accordingly, by providing ultrasound-based target identification, needle guidance, and catheter tip tracking, system 600 provides safe and efficient methods for ensuring PICC 412 is correctly inserted in its intended target. The combination of target identification, needle guidance, and magnetic or optical catheter tracking can reduce clinical time and effort, and may improve patient outcomes.



FIG. 6B illustrates a target identification system in use together with optical catheter tip tracking, in accordance with some embodiments. In some cases, magnetic and electromagnetic means for tracking the tips of medical devices are prone to interference. Accordingly, optical tip-tracking methods, such as in the example of FIG. 3 above, can also be used in conjunction with the disclosed system and methods. As with magnetic catheter tracking, the combination of target identification, needle guidance, and optical catheter tracking can reduce clinical time and effort, and improve outcomes.


In this example, light detector 660 of the optical tip-tracking system can detect the location of light-emitting stylet 635, and send a signal representing the location of light-emitting stylet 635 to console 402 to display on screen 404. In conjunction with the optical tip-tracking system, console 402 can also apply AI to recognize targets, such as target blood vessels. Accordingly, display screen 404 displays an ultrasound image of target blood vessel 630 together with the location 640 of light-emitting stylet 635.


A clinician can place light detector 660 of the optical tip-tracking system over the patient P, for example, over the patient's chest, as shown. A sterile drape 801 can also be draped over both the patient P and the light detector 660. The clinician can pierce the sterile drape 801 with a drape-piercing connector of the light-emitting stylet 635. The clinician can advance the catheter 412 from an insertion site to a destination within a vasculature of the patient P while emitting light from the light source (e.g., one or more LEDs) and detecting the light with photodetectors of light detector 660. The light source of stylet 635 may distally extend beyond a distal end of the catheter 412.


When the catheter 412 is a PICC, the PICC can be advanced with the light-emitting stylet 635 disposed therein through a right basilic vein, a right axillary vein, a right subclavian vein, a right brachiocephalic vein, and into an SVC of patient P. When the catheter 412 is a CVC, the CVC can be advanced with stylet 635 disposed therein through a right internal jugular vein, a right brachiocephalic vein, and into an SVC. The clinician can view the display screen 404 of the optical tip-tracking system while the display screen 404 graphically tracks the distal-end portion of the light-emitting stylet 635 through the vasculature of the patient P. The clinician can cease to advance the catheter 412 through the vasculature of the patient P after determining the distal-end portion of light-emitting stylet 635 is located at the destination by way of the display screen 404. In particular, the clinician can recognize the destination with the aid of the target identification system. For example, the target identification system can provide the clinician with information identifying and/or characterizing potential targets, for example a predicted classification of target blood vessel 630 and a confidence level associated with the classification. Alternatively, system 400 can suggest one or more potential targets for the clinician's approval, or even select the target (such as target blood vessel 630) automatically.



FIG. 7 shows catheter tip confirmation based on electrocardiography (ECG), in accordance with some embodiments. In some embodiments, the system can use changes in an adult patient's measured ECG signal as an alternative to methods of PICC tip placement confirmation, like chest X-ray and fluoroscopy. In particular, ECG-based tip confirmation can accurately confirm the proper implantation of a catheter or PICC in a target in proximity to the cavoatrial junction, providing high confidence that the catheter or PICC is placed within 1 cm of the cavoatrial junction. ECG-based tip confirmation also reduces the exposure of both patient P and the clinician to harmful radiation, such as X-rays, and reduces clinical costs and time associated with taking such images. Moreover, clinicians can use ultrasound probe 406 for vessel access during insertion, and then easily transition to ECG to confirm catheter placement without the need for additional equipment. Because delays in therapy may negatively impact clinical care, the disclosed system potentially improves patient outcomes by expeditiously readying patients for treatment. Accordingly, the combination of target identification, needle guidance, and catheter placement confirmation can reduce clinical time and effort, as well as improving patient outcomes.


The cavoatrial junction is the point at which the superior vena cava meets the right atrium. At this intersection, blood volume and turbulence are high, which creates a favorable location for medicinal delivery to the body. Accordingly, by helping to position the PICC tip in proximity to the cavoatrial junction, the system can facilitate compliance with guidelines for proper PICC placement, which recommend placing PICC tips at the lower third of the superior vena cava, close to the cavoatrial junction. In patients where alterations of cardiac rhythm change the presentation of the P-wave (e.g., in atrial fibrillation, atrial flutter, severe tachycardia, and pacemaker-driven rhythm), an additional confirmation method may be required.


ECG probes 620 and 625 can be respectively placed on the patient's lower right shoulder and lower left side along the mid-axillary line (see FIG. 6A), with good skin-electrode contact. The system can display an ECG signal, such as signals 700 and 710 in this example, which is detected by intravascular electrodes and the body electrodes, such as ECG probes 620 and 625 in the example of FIG. 6A above. In various embodiments, the system may display signals 700 and 710 on display 404 of console 402, or on another display. In an embodiment, display 404 can provide simultaneous views of both catheter tip tracking and ECG signals 700 and 710. In an embodiment, the system displays two distinct ECG waveforms: waveform 700 is an external rhythm that establishes the baseline ECG, and waveform 710 is an intravascular waveform that shows changes in P-wave amplitude as the catheter tip approaches the cavoatrial junction.


In patients with a distinct P-wave ECG signal, the P-wave will increase in amplitude as the catheter approaches the cavoatrial junction. In an embodiment, the system can highlight the P-wave on the displayed signals. As the catheter advances into the right atrium, the P-wave in displayed signal 710 will decrease in amplitude and may become biphasic or inverted.
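
As a rough sketch of how the console might summarize this P-wave behavior for the clinician, the example below tracks successive P-wave amplitudes from the intravascular waveform 710 and reports whether the amplitude is rising, stable, or falling after a peak. The window length, thresholds, and the assumption that P-wave amplitudes have already been extracted from the ECG are all illustrative.

```python
# Illustrative trend monitor for intravascular P-wave amplitude (waveform 710).
# The window length and thresholds are assumptions; extracting the P-wave from a
# raw ECG signal is outside the scope of this sketch.
from collections import deque


class PWaveTrend:
    def __init__(self, window: int = 5):
        self.amplitudes = deque(maxlen=window)

    def update(self, p_wave_amplitude_mv: float) -> str:
        self.amplitudes.append(p_wave_amplitude_mv)
        if len(self.amplitudes) < self.amplitudes.maxlen:
            return "collecting baseline"
        first, last, peak = self.amplitudes[0], self.amplitudes[-1], max(self.amplitudes)
        if last < 0.8 * peak:
            return "P-wave falling after peak: tip may have advanced into the right atrium"
        if last > 1.2 * first:
            return "P-wave rising: tip approaching the cavoatrial junction"
        return "P-wave stable"


monitor = PWaveTrend()
for amplitude in (0.10, 0.12, 0.15, 0.20, 0.30, 0.45, 0.40, 0.30):
    print(monitor.update(amplitude))
```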


The clinician can observe the P-wave in order to confirm that the PICC has been properly placed. Both waveforms 700 and 710 can be frozen on the display 404 reference screen to compare changes in P-wave magnitude over time. To document proper catheter placement, the clinician can print or save a procedural record on the system.



FIG. 8 shows a flowchart of an example method 800 for ultrasound imaging, target identification, and needle guidance, according to some embodiments. Each block illustrated in FIG. 8 represents an operation performed in the method 800 of ultrasound imaging, target identification, and needle guidance. The method can be performed by a target identification and needle guidance system, such as target identification and needle guidance system 400 in the examples of FIGS. 4A and 5 above. In an embodiment, the method can be performed by a console of such a system, such as console 402 in the examples of FIGS. 4A and 5 above.


As an initial step in the method 800, the needle can be magnetized (block 810). For example, a needle magnetizer such as needle magnetizer 408 may be configured to magnetize all or a portion of a needle, such as the needle of the catheter-insertion device, so as to enable the needle to be tracked during a placement procedure. The needle magnetizer can include a single permanent magnet, a single electromagnet, a plurality of permanent magnets, a plurality of electromagnets, or a combination thereof within a body of the needle magnetizer.


Next, the console receives an ultrasound image and magnetic field measurements associated with the needle (block 820). The console can receive the ultrasound image from an ultrasound probe, such as ultrasound probe 406 in the example of FIG. 4A above. The console can receive the magnetic field measurements from magnetic sensors, such as magnetic sensors 258 in the example of FIG. 2 above.


Next, the console can process the ultrasound image and magnetic field measurements to determine the needle's position and orientation, and to identify the target (block 830). Based on the processed magnetic field measurements, the console can display the location, orientation, and depth of the needle in relation to the ultrasound probe. Processing the ultrasound image to identify the target will be described further in FIG. 9 below.


Next, the console can identify a target based on the ultrasound image (block 840). In a typical example, the target may be a target blood vessel for the eventual implantation of a medical device, such as a PICC or other catheter. However, any other type of target is possible, and is not limited by the present disclosure. In some embodiments, any of the AI-methods disclosed herein may be utilized to identify a target. As one example, any of the AI-methods disclosed herein may be utilized to identify a target insertion vessel (e.g., a vein) immediately prior to insertion of the needle into a patient. In such an example, the method 900 of FIG. 9 may be utilized. In an alternative example, any of the AI-methods disclosed herein may be utilized to identify a target during advancement of the needle within a patient's vasculature. For example, the ultrasound probe 406 may be utilized to obtain ultrasound images with the intention that the images track the advancement of the needle through the vasculature, e.g., along the arm of patient P as seen in FIG. 4A. Upon receipt of the ultrasound images, the target awareness logic 560, processing on the console 402, may analyze the ultrasound images using an AI-based model trained to identify particular targets, such as the SVC. Thus, as the console 402 continues to receive ultrasound images from the probe 406, the display 404 may provide a graphic indicating a confidence that the SVC is present within the ultrasound image rendered on the display 404. As an example, the target awareness logic 560, and particularly the target identification logic 564, may identify the SVC within an ultrasound image with at least a certain confidence score that was derived via processing of an AI-based model (e.g., by providing the ultrasound image as an input to the model) and, in response, generate an alert to the clinician that is rendered on the display 404 and/or render an indicator around the entry to the SVC. For example, such a rendering may include an intersection window around the entry to the SVC, similar to the intersection window illustrated in FIG. 4B.
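
A minimal sketch of the per-frame alerting behavior described above could look like the following, where `model` stands in for a trained AI-based model that returns a confidence that the SVC appears in the frame. The 0.9 threshold, the model interface, and the overlay dictionary are assumptions, not disclosed parameters.

```python
# Illustrative per-frame SVC detection alert. The model interface (a callable
# returning a confidence in [0, 1]) and the threshold are assumptions.
from typing import Callable


def process_frame(frame, model: Callable[[object], float],
                  threshold: float = 0.9) -> dict:
    confidence = model(frame)
    overlay = {"svc_confidence": confidence, "alert": False, "intersection_window": None}
    if confidence >= threshold:
        overlay["alert"] = True
        # In the described system, an indicator (e.g., an intersection window) would be
        # rendered around the entry to the SVC on display 404.
        overlay["intersection_window"] = "render around SVC entry"
    return overlay


# Example with a stand-in model that returns a fixed confidence for any frame.
print(process_frame(frame=None, model=lambda f: 0.93))
```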


Finally, the console can guide insertion of the magnetized needle into the target vessel (block 850). For example, the console can continue to update the displayed location, orientation, and depth of the needle as the clinician continues to advance the needle toward the target. The console can also continue to update the displayed ultrasound image of the patient's tissue containing the target. The console can also continue to display and identify the target via any of the AI-methods disclosed herein. In some embodiments, the console may also display a magnetically and/or optically tracked location of the catheter tip, such as a PICC tip, while the clinician advances the catheter. Finally, in some embodiments, the console can receive and display an ECG signal from ECG probes, which can enable the clinician to confirm correct insertion of the catheter.



FIG. 9 shows a flowchart of an example method 900 for target identification, according to some embodiments. Each block illustrated in FIG. 9 represents an operation performed in the method 900 of target identification. In various embodiments, method 900 can be performed by a target identification and needle guidance system, such as target identification and needle guidance system 400 in the examples of FIGS. 4A and 5 above, by a console of such a system, such as console 402 in the examples of FIGS. 4A and 5 above, and/or by a target awareness logic, such as target awareness logic 560 in the example of FIG. 5 above.


As an initial step in the method 900, the console and/or target awareness logic can obtain an ultrasound image (block 910). In an embodiment, the console and/or target awareness logic can receive the ultrasound image from an ultrasound probe, such as ultrasound probe 406 in the example of FIG. 4A above.


Next, the image processing logic of the target awareness logic can process the image with a supervised learning heuristic (block 920). The image processing logic can apply a supervised learning method to segment or classify potential targets within the ultrasound image data. For example, the image processing logic can use logistic regression or other linear classifiers, support vector machines, quadratic classifiers, kernel estimation, decision trees, neural networks, convolutional neural networks, or learning vector quantization. In the cases of logistic regression or other linear classifiers, the model may determine a predicted probability as a weighted sum of feature values. For a neural network or other deep learning process, the model may transform the features via one or more layers of non-linear functions, such as artificial neurons, to determine a final predicted probability.
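
The two model families mentioned above can be sketched as follows: a linear (logistic) classifier forms a weighted sum of feature values and squashes it into a probability, while a small neural network passes the features through a layer of non-linear units before the output. The weights, features, and layer sizes below are invented solely for illustration.

```python
# Illustrative predicted-probability computations for a linear classifier and a
# tiny neural network. All weights, biases, and feature values are invented.
import math


def logistic_probability(features, weights, bias):
    """Linear classifier: weighted sum of feature values, squashed into (0, 1)."""
    z = sum(w * x for w, x in zip(weights, features)) + bias
    return 1.0 / (1.0 + math.exp(-z))


def tiny_network_probability(features, w_hidden, b_hidden, w_out, b_out):
    """One hidden layer of non-linear (ReLU) units followed by a sigmoid output."""
    hidden = [max(0.0, sum(w * x for w, x in zip(row, features)) + b)
              for row, b in zip(w_hidden, b_hidden)]
    z = sum(w * h for w, h in zip(w_out, hidden)) + b_out
    return 1.0 / (1.0 + math.exp(-z))


features = [0.6, 1.2, 0.3]  # e.g., eccentricity, normalized diameter, branch count
print(logistic_probability(features, weights=[1.5, -0.4, 0.9], bias=-0.2))
print(tiny_network_probability(features,
                               w_hidden=[[0.5, 0.1, -0.3], [-0.2, 0.7, 0.4]],
                               b_hidden=[0.0, 0.1],
                               w_out=[1.0, -0.8],
                               b_out=0.05))
```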


In an example where the targets are blood vessels, the system may be trained using ultrasound data from correctly-identified blood vessels, and may classify or segment blood vessels based on features such as shape (e.g., eccentricity, interior angles, or aspect ratio), size (e.g., minor or major axis length, diameter, or perimeter), number and complexity of branches, and the like. In an embodiment, the training may proceed by iteratively minimizing an error or loss function as a function of model weights, for predicted classifications compared to actual classifications in a training data set. Hyper-parameters of the model may further be tuned with respect to a validation data set, and the model may be further evaluated against a testing data set. The trained model may then be deployed for use in a clinical setting.
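
A compact, self-contained sketch of such a training loop is shown below: it fits logistic-regression weights to toy vessel features by stochastic gradient descent on the logistic loss. The toy data, labels, learning rate, and epoch count are invented, and the validation-set tuning and test-set evaluation described above are omitted for brevity.

```python
# Illustrative training loop: iteratively minimize the logistic loss over model
# weights for labeled vessel features. The data and hyper-parameters are invented;
# a real workflow would also tune hyper-parameters on a validation set and
# evaluate the final model on a held-out test set.
import math
import random


def predict(w, b, x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    z = max(-60.0, min(60.0, z))          # clamp for numerical stability
    return 1.0 / (1.0 + math.exp(-z))


def train(samples, labels, lr=0.05, epochs=100):
    w, b = [0.0] * len(samples[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):  # stochastic gradient descent
            err = predict(w, b, x) - y     # gradient factor of the logistic loss
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b


random.seed(0)
# Toy features [eccentricity, diameter_mm]; label 1 = suitable vessel, 0 = not (invented).
train_x = [[random.random(), random.uniform(2, 6)] for _ in range(40)] + \
          [[random.random(), random.uniform(6, 12)] for _ in range(40)]
train_y = [0] * 40 + [1] * 40
w, b = train(train_x, train_y)
print("Predicted suitability for a 10 mm vessel:", round(predict(w, b, [0.5, 10.0]), 3))
```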


Next, the target identification logic of the target awareness logic can identify one or more insertion targets based on the processing (block 930). Based on the classification or segmentation from AI, the target identification logic may determine the specific identity of one or more potential targets. For example, the clinician may inform the system that the desired target is the patient's superior vena cava. In this example, the target identification logic may then determine that a particular blood vessel is the superior vena cava, and may accordingly flag this blood vessel for recommendation to the clinician as the insertion target. Alternatively, the target identification logic may automatically assume this identified blood vessel is the target, and may proceed to guide the clinician based on this assumption.


In another alternative example, the clinician may not inform the system of a particular desired target. In this alternative example, the target identification logic may then identify all potential targets present in the ultrasound image, and may flag all these potential targets for communication to the clinician. In an example, the target identification logic may rank the potential targets, for example by relevance or importance as an insertion target, by confidence of the classification, etc., and may communicate the recommendations to the clinician in the order of the ranking. In a final example, the target identification logic may simply attempt to identify all blood vessels, and inform the clinician of the best estimate classification of each blood vessel, as well as any other possible classifications and the confidence level of each classification.
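
The ranking described in this paragraph could be sketched as follows, combining each candidate's classification confidence with an assumed clinical-relevance weight per vessel type; the weights and candidate fields are illustrative assumptions only.

```python
# Illustrative ranking of potential targets when no desired target has been specified.
# The relevance weights per vessel type and the candidate fields are assumptions.
RELEVANCE = {"SVC": 1.0, "basilic vein": 0.8, "cephalic vein": 0.6, "brachial artery": 0.1}


def rank_targets(candidates):
    """Order candidates by clinical relevance weighted by classification confidence."""
    return sorted(candidates,
                  key=lambda c: RELEVANCE.get(c["label"], 0.0) * c["confidence"],
                  reverse=True)


candidates = [
    {"label": "basilic vein", "confidence": 0.91},
    {"label": "brachial artery", "confidence": 0.97},
    {"label": "cephalic vein", "confidence": 0.75},
]
for rank, c in enumerate(rank_targets(candidates), start=1):
    print(rank, c["label"], c["confidence"])
```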


Finally, the target recommendation logic of the target awareness logic can recommend an insertion target (block 940). Based on AI, the target recommendation logic may identify the target to the clinician, and/or may assist the clinician to identify targets by providing the clinician with information about potential targets. For example, the console display may show a predicted classification of the target, such as a blood vessel type, location, or name, and may show a confidence level associated with this classification. Alternatively, the console can suggest one or more potential targets for the clinician's approval, or even select the target automatically. For example, if more than one potential target is present in the ultrasound image, the console can indicate a set of potential targets to the user, e.g., by highlighting them on the display. Alternatively, the console can provide the user with a menu of one or more suggested targets, or select the target automatically.


While some particular embodiments have been disclosed herein, and while the particular embodiments have been disclosed in some detail, it is not the intention for the particular embodiments to limit the scope of the concepts provided herein. Additional adaptations and/or modifications can appear to those of ordinary skill in the art, and, in broader aspects, these adaptations and/or modifications are encompassed as well. Accordingly, departures may be made from the particular embodiments disclosed herein without departing from the scope of the concepts provided herein.

Claims
  • 1. A target recognition and needle guidance system, comprising: a console including a processor and non-transitory, computer-readable medium having stored thereon logic that, when executed by the processor, is configured to initiate: a target recognition process configured to (i) receive user input indicating a desired target, and (ii) identify a blood vessel of a patient corresponding to the desired target by applying a machine learning model to features of candidate targets in ultrasound-imaging data, wherein the blood vessel identified as the desired target is associated with a highest confidence score determined by the machine learning model, and a needle guidance process for guiding insertion of a needle into the blood vessel identified as the desired target by the machine learning model, wherein the machine learning model determines a proximity of a trajectory of the needle to the blood vessel identified as the desired target based on magnetic information about the needle, and wherein a catheter is implanted into the blood vessel identified as the desired target following the insertion of the needle; and an ultrasound probe configured to provide to the console electrical signals corresponding to the ultrasound-imaging data, the ultrasound probe including: an array of transducers configured to convert reflected ultrasound signals from the patient into an ultrasound-imaging portion of the electrical signals, and a magnetic sensor configured to provide to the console second electrical signals corresponding to the magnetic information about the needle.
  • 2. The target recognition and needle guidance system of claim 1, wherein the machine learning model includes a supervised learning model trained based on previously-identified training targets.
  • 3. The target recognition and needle guidance system of claim 1, wherein each of the candidate targets in the ultrasound-imaging data is associated with a confidence score determined by the machine learning model.
  • 4. The target recognition and needle guidance system of claim 1, wherein the machine learning model includes one or more of the following supervised learning models: logistic regression, other linear classifiers, support vector machines, quadratic classifiers, kernel estimation, decision trees, neural networks, or learning vector quantization.
  • 5. The target recognition and needle guidance system of claim 4, wherein the machine learning model includes the logistic regression, and wherein a confidence score of a first candidate target is determined as a weighted sum of the features of the candidate targets.
  • 6. The target recognition and needle guidance system of claim 4, wherein the machine learning model includes the neural networks, and wherein a confidence score of a first candidate target is determined based on one or more nonlinear activation functions of the features of the candidate targets.
  • 7. The target recognition and needle guidance system of claim 2, wherein the supervised learning model is trained by iteratively minimizing an error, for classifications predicted by a candidate model compared to actual classifications of the previously-identified training targets.
  • 8. The target recognition and needle guidance system of claim 1, wherein the blood vessel comprises a superior vena cava.
  • 9. The target recognition and needle guidance system of claim 1, wherein the ultrasound probe provides a first ultrasound image prior to insertion of the needle, and wherein the machine learning model is based in part on the trajectory of the needle.
  • 10. The target recognition and needle guidance system of claim 1, wherein the features of the candidate targets include one or more of the following features: a shape of a candidate blood vessel of the candidate targets in the ultrasound-imaging data, a size of the candidate blood vessel, a number of branches of the candidate blood vessel, or a complexity of branches of the candidate blood vessel.
  • 11. The target recognition and needle guidance system of claim 1, wherein: recognizing the blood vessel identified as the desired target comprises providing identifying information characterizing one or more candidate targets to a user, wherein the identifying information comprises a predicted classification of one or more potential targets and a confidence level associated with the predicted classification.
  • 12. The target recognition and needle guidance system of claim 1, further comprising a display screen configured for graphically guiding the insertion of the needle into the blood vessel identified as the desired target.
  • 13. The target recognition and needle guidance system of claim 1, further comprising: one or more electrocardiogram probes configured to provide to the console electrical signals corresponding to electrocardiogram information, wherein the logic, when executed by the processor, is further configured to initiate a catheter tip placement confirmation process based on the electrocardiogram information.
  • 14. The target recognition and needle guidance system of claim 1, further comprising: a needle magnetizer incorporated into the console configured to magnetize the needle.
  • 15. The target recognition and needle guidance system of claim 14, wherein the needle guidance process for guiding insertion of the magnetized needle into the blood vessel identified as the desired target uses a combination of the ultrasound-imaging data and magnetic-field data received by the console.
  • 16. The target recognition and needle guidance system of claim 15, wherein the ultrasound probe is configured to provide to the console the magnetic-field data, the ultrasound probe further including an array of magnetic sensors configured to convert magnetic signals from the magnetized needle into a magnetic-field portion of the electrical signals.
  • 17. A target recognition and needle guidance system, comprising: a console including a processor and non-transitory, computer-readable medium having stored thereon logic that, when executed by the processor, is configured to initiate: a target recognition process configured to (i) receive user input indicating a desired target, and (ii) identify a blood vessel of a patient corresponding to the desired target by applying a machine learning model to features of candidate targets in ultrasound-imaging data, wherein the blood vessel identified as the desired target is associated with a highest confidence score determined by the machine learning model, and a needle guidance process for guiding insertion of a needle into the blood vessel identified as the desired target by the machine learning model, and wherein a catheter is implanted into the blood vessel identified as the desired target following the insertion of the needle, wherein the machine learning model determines a proximity of a trajectory of the needle to the blood vessel identified as the desired target based on optical information about the catheter; an ultrasound probe configured to provide to the console electrical signals corresponding to the ultrasound-imaging data, the ultrasound probe including: an array of transducers configured to convert reflected ultrasound signals from the patient into an ultrasound-imaging portion of the electrical signals; and a magnetic sensor configured to provide to the console second electrical signals corresponding to the optical information about the catheter.
PRIORITY

This application claims the benefit of priority to U.S. Patent Application No. 63/117,883, filed Nov. 24, 2020, which is incorporated by reference in its entirety into this application.

Related Publications (1)
Number Date Country
20220160434 A1 May 2022 US
Provisional Applications (1)
Number Date Country
63117883 Nov 2020 US