A variety of existing ultrasound systems include wired or wireless ultrasound probes connected to visual displays. With these systems, a clinician holds and manipulates the ultrasound probe to place a vascular access device (VAD), such as a catheter, in a patient. Ultrasound imaging is commonly used for guiding a needle to targets such as veins of the patient. The needle may be monitored in real time prior to and after a percutaneous insertion. This way, a clinician may be able to determine the distance and the orientation of the needle relative to the target vein and ensure accurate insertion with minimal discomfort to the patient. However, inadvertent and unintentional movements of the ultrasound probe may occur during ultrasound imaging. Such movement may cause the clinician to lose sight of the target vein and the needle, and relocating the needle and the target vein on a screen of the visual display may be difficult and may waste valuable time. The distance and orientation of the needle immediately before the percutaneous insertion may be difficult to determine because the needle plane including the needle is perpendicular (or nearly perpendicular) to an image plane of the ultrasound probe.
It may be easier to monitor the distance and orientation of the needle immediately after the percutaneous insertion, when the needle plane is parallel to the image plane. While inadvertently moving or shifting the ultrasound probe, the clinician can lose the vein and/or the needle when adjusting the image plane before and after the percutaneous insertion, which can result in a loss of valuable time. Existing ultrasound systems do not provide a convenient needle-guidance capability that takes into account inadvertent movement or shifting of the ultrasound probe. Thus, what is needed are a method and system for ultrasound image target tracking that account for inadvertent movements or shifting of the ultrasound probe to facilitate efficient needle guidance.
Accordingly, disclosed herein are methods and systems for analyzing ultrasound images to detect targets, including anatomic targets and medical devices, appearing within an ultrasound imaging area, and for generating a cropped image that maintains the location of the detected targets in the cropped image in the event of a shift of the ultrasound probe head.
Briefly summarized, disclosed herein is an ultrasound probe including, in some embodiments, an image target tracking capability. The ultrasound imaging system may provide a consistent ultrasound view throughout an ultrasound-guided procedure while compensating for inadvertent movements of the ultrasound probe. The exemplary tracking feature advantageously allows for incidental movement of the ultrasound probe during the procedure without drastic movement of the most important imaging data on the screen.
In some embodiments, an ultrasound imaging system is disclosed comprising an ultrasound probe including a transducer array configured to acquire ultrasound images, and a console including a processor and a non-transitory computer-readable medium having stored thereon a plurality of logic modules that, when executed by the processor, are configured to perform operations including receiving an ultrasound image, detecting one or more targets within the ultrasound image, and generating a visualization from the ultrasound image to center the one or more detected targets within a displayed portion of the ultrasound image. In some embodiments, generating the visualization includes cropping the ultrasound image to center the one or more detected targets within the displayed portion of the ultrasound image. In some embodiments, generating the visualization includes increasing a magnification of a cropped portion of the ultrasound image to center the one or more detected targets within the displayed portion of the ultrasound image.
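By way of illustration only, the receive → detect → crop-to-center flow described above might be sketched as follows. The `detect_targets` callable, the `zoom` parameter, and the use of NumPy are assumptions made for the sketch, not elements of the disclosure.

```python
import numpy as np

def center_on_targets(image: np.ndarray, detect_targets, zoom: float = 1.0) -> np.ndarray:
    """Detect targets in one 2-D ultrasound frame and return a cropped view
    that keeps the detected targets centered; zoom > 1 magnifies the crop."""
    centers = detect_targets(image)                  # hypothetical: [(row, col), ...]
    cy, cx = np.mean(centers, axis=0).astype(int)    # centroid of all detected targets
    h, w = image.shape
    half_h, half_w = int(h / (2 * zoom)), int(w / (2 * zoom))
    top = int(np.clip(cy - half_h, 0, h - 2 * half_h))    # shift crop to stay in bounds
    left = int(np.clip(cx - half_w, 0, w - 2 * half_w))
    return image[top:top + 2 * half_h, left:left + 2 * half_w]
```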
In some embodiments, the ultrasound probe is operatively connected to the console via a wired or wireless connection. In some embodiments, the console includes a display, and the plurality of logic modules, when executed by the processor, are configured to perform further operations including rendering the visualization of the cropped ultrasound image on the display. In some embodiments, detecting the one or more targets includes distinguishing a component within the ultrasound image according to varying color saturation within the ultrasound image. In specific embodiments, detecting the one or more targets includes identifying each of the one or more targets as a blood vessel, bone, organ or medical device. In other embodiments, identifying each of the one or more targets includes comparing characteristics of each of the one or more targets to thresholds set to define organs, blood vessels, bones or medical devices.
In some embodiments, the characteristics include one or more of a detected pulsatility upon analysis of the ultrasound image and a prior ultrasound image, dimensions of each of the one or more targets, or color saturation of each of the one or more targets. In some embodiments, a result of comparing the characteristics to the one or more thresholds is a confidence level for each of the one or more targets indicating a likelihood of an identification of a particular target. In specific embodiments, the plurality of logic modules, when executed by the processor, are configured to perform further operations including detecting that at least a first target of the one or more targets is within a threshold distance of an edge of the ultrasound image.
In some embodiments, the plurality of logic modules, when executed by the processor, are configured to perform further operations including generating an alert indicating to a clinician that the first target is within the threshold distance of the edge of the ultrasound image. In some embodiments, the alert includes a text notification or an arrow indicating a direction to move the ultrasound probe. In other embodiments, the one or more targets include a blood vessel and a needle. In yet other embodiments, the one or more targets include a distal tip of the needle.
In some embodiments, a method for obtaining ultrasound images by an ultrasound imaging system is disclosed, where the ultrasound imaging system includes an ultrasound probe including a transducer array configured to acquire ultrasound images, and a console including a processor and a non-transitory computer-readable medium having stored thereon a plurality of logic modules that, when executed by the processor, are configured to perform operations including receiving an ultrasound image, detecting one or more targets within the ultrasound image, and generating a visualization from the ultrasound image by cropping the ultrasound image around the one or more detected targets. In some embodiments, the method comprises receiving an ultrasound image, detecting one or more targets within the ultrasound image, and generating a visualization from the ultrasound image to center the one or more detected targets within a displayed portion of the ultrasound image. In some embodiments, generating the visualization includes cropping the ultrasound image to center the one or more detected targets within the displayed portion of the ultrasound image. In some embodiments, generating the visualization includes increasing a magnification of a cropped portion of the ultrasound image to center the one or more detected targets within the displayed portion of the ultrasound image.
In some embodiments, the ultrasound probe is operatively connected to the console via a wired or wireless connection. In some embodiments, the console includes a display, and the plurality of logic modules, when executed by the processor, are configured to perform further operations including rendering the visualization of the cropped ultrasound image on the display. In some embodiments, detecting the one or more targets includes distinguishing a component within the ultrasound image according to varying color saturation within the ultrasound image. In specific embodiments, detecting the one or more targets includes identifying each of the one or more targets as a blood vessel, bone, organ or medical device. In other embodiments, identifying each of the one or more targets includes comparing characteristics of each of the one or more targets to thresholds set to define organs, blood vessels, bones or medical devices.
In some embodiments, the characteristics include one or more of a detected pulsatility upon analysis of the ultrasound image and a prior ultrasound image, dimensions of each of the one or more targets, or color saturation of each of the one or more targets. In some embodiments, a result of comparing the characteristics to the one or more thresholds is a confidence level for each of the one or more targets indicating a likelihood of an identification of a particular target. In specific embodiments, the plurality of logic modules, when executed by the processor, are configured to perform further operations including detecting that at least a first target of the one or more targets is within a threshold distance of an edge of the ultrasound image.
In some embodiments, the plurality of logic modules, when executed by the processor, are configured to perform further operations including generating an alert indicating to a clinician that the first target is within the threshold distance of the edge of the ultrasound image. In some embodiments, the alert includes a text notification or an arrow indicating a direction to move the ultrasound probe. In other embodiments, the one or more targets include a blood vessel and a needle. In yet other embodiments, the one or more targets include a distal tip of the needle.
These and other features of the concepts provided herein will become more apparent to those of skill in the art in view of the accompanying drawings and following description, which describe particular embodiments of such concepts in greater detail.
A more particular description of the present disclosure will be rendered by reference to specific embodiments thereof that are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. Example embodiments of the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
Before some particular embodiments are disclosed in greater detail, it should be understood that the particular embodiments disclosed herein do not limit the scope of the concepts provided herein. It should also be understood that a particular embodiment disclosed herein can have features that can be readily separated from the particular embodiment and optionally combined with or substituted for features of any of a number of other embodiments disclosed herein.
Regarding terms used herein, it should also be understood the terms are for the purpose of describing some particular embodiments, and the terms do not limit the scope of the concepts provided herein. Ordinal numbers (e.g., first, second, third, etc.) are generally used to distinguish or identify different features or steps in a group of features or steps, and do not supply a serial or numerical limitation. For example, “first,” “second,” and “third” features or steps need not necessarily appear in that order, and the particular embodiments including such features or steps need not necessarily be limited to the three features or steps. Labels such as “left,” “right,” “top,” “bottom,” “front,” “back,” and the like are used for convenience and are not intended to imply, for example, any particular fixed location, orientation, or direction. Instead, such labels are used to reflect, for example, relative location, orientation, or directions. Singular forms of “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
For clarity, it is to be understood that the word “distal” refers to a direction relatively closer to a patient on which a medical device is to be used as described herein, while the word “proximal” refers to a direction relatively further from the patient. Also, the words “including,” “has,” and “having,” as used herein, including the claims, shall have the same meaning as the word “comprising.”
Lastly, in the following description, the terms “or” and “and/or” as used herein are to be interpreted as inclusive or meaning any one or any combination. As an example, “A, B or C” or “A, B and/or C” mean “any of the following: A; B; C; A and B; A and C; B and C; A, B and C.” An exception to this definition will occur only when a combination of elements, components, functions, steps or acts are in some way inherently mutually exclusive.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by those of ordinary skill in the art.
Embodiments disclosed herein are directed to an ultrasound imaging system to be used for ultrasound imaging while placing a needle into a target vein of a patient. In some embodiments, the ultrasound imaging system includes an image target tracking capability. The ultrasound imaging system may provide a consistent ultrasound view throughout an ultrasound-guided procedure while compensating for inadvertent movements of an ultrasound probe. The exemplary tracking feature advantageously allows for incidental movement of the ultrasound probe during the procedure without drastic movement of the most important imaging data on the screen. The ultrasound imaging system, according to the exemplary embodiments, may be primarily used for insertion of an access device such as a needle. The image tracking provides for precise placement of the needle into a target vein or another anatomic target regardless of inadvertent movements of the ultrasound probe.
Referring to
A digital controller/analog interface 122 may also be included with the console 102 and is in communication with both the processor 116 and other system components to govern interfacing between the ultrasound probe 106 and the other system components set forth herein. The ultrasound imaging system 100 further includes ports 124 for connection with additional components such as optional components 126 including a printer, storage media, keyboard, etc. The ports 124 can be implemented as universal serial bus (USB) ports, though other types of ports can be used for this connection or any other connections shown or described herein. A power connection 128 is included with the console 102 to enable operable connection to an external power supply 130. An internal power supply 132 (e.g., a battery) can also be employed either with or exclusive of the external power supply 130. Power management circuitry 134 is included with the digital controller/analog interface 122 of the console 102 to regulate power use and distribution. Optionally, a stand-alone optical interrogator 154 may be communicatively coupled to the console 102 by way of one of the ports 124. Alternatively, the console 102 may include an optical interrogator integrated into the console 102. Such an optical interrogator is configured to emit input optical signals into a companion optical-fiber stylet 156 for shape sensing with the ultrasound imaging system 100. The optical-fiber stylet 156, in turn, may be configured to be inserted into a lumen of a medical device such as the needle and may convey the input optical signals from the optical interrogator 154 to a number of fiber Bragg grating (FBG) sensors along a length of the optical-fiber stylet 156. The optical interrogator 154 may also be configured to receive reflected optical signals conveyed by the optical-fiber stylet 156 from the number of FBG sensors; the reflected optical signals may be indicative of a shape of the optical-fiber stylet 156.
The optical interrogator 154 may also be configured to convert the reflected optical signals into corresponding electrical signals for processing by the console 102 into distance and orientation information with respect to the target, and for dynamically adjusting a distance of the activated ultrasonic transducers 148, an orientation of the activated ultrasonic transducers 148, or both with respect to the target (e.g., a target vein) or the medical device (e.g., a needle) when it is brought into proximity of the target. For example, the distance and orientation of the activated ultrasonic transducers 148 may be adjusted with respect to the vein as the target. An image plane may be established by the activated ultrasonic transducers 148 being disposed at a particular angle to the target vein based on the orientation of the target vein (e.g., perpendicular or parallel, among other configurations). In another example, when a medical device such as the needle is brought into proximity of the ultrasound probe 106, an image plane can be established by the activated ultrasonic transducers 148 being perpendicular to a medical-device plane including the needle 204. The distance and orientation information may also be used for displaying an iconographic representation of the medical device on the display.
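For orientation only, the relation between a Bragg wavelength shift and local bend is conventional FBG physics; the minimal sketch below assumes illustrative values for the photo-elastic coefficient and the radial offset of the sensing core, neither of which is specified by the disclosure.

```python
PHOTOELASTIC_COEFF = 0.22   # typical value for silica fiber (assumption)
CORE_OFFSET_M = 35e-6       # assumed radial offset of the FBG core from the neutral axis

def strain_from_shift(reflected_nm: float, base_nm: float) -> float:
    """Axial strain at one FBG sensor from its Bragg wavelength shift:
    delta_lambda / lambda = (1 - p_e) * strain."""
    return (reflected_nm - base_nm) / (base_nm * (1.0 - PHOTOELASTIC_COEFF))

def local_curvature(reflected_nm: float, base_nm: float) -> float:
    """Bend curvature (1/m) at the sensor: kappa = strain / core offset.
    Integrating curvature along the stylet yields an estimate of its shape."""
    return strain_from_shift(reflected_nm, base_nm) / CORE_OFFSET_M
```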
The display screen 104 may be integrated into (or connected to) the console 102 to provide a GUI and display information for a clinician in the form of ultrasound images of the target acquired by the ultrasound probe 106. In addition, the ultrasound imaging system 100 may enable the distance and orientation of a magnetized medical device such as the needle to be superimposed in real time atop an ultrasound image of the target, thus enabling a clinician to accurately guide the magnetized medical device to the intended target (e.g., the vein). The display screen 104 can alternatively be separate from the console 102 and communicatively (e.g., wirelessly) coupled thereto. A console button interface 136 may be used by the clinician to immediately call up a desired mode to the display screen 104 for assistance in an ultrasound-based medical procedure. In some embodiments, the display screen 104 may be implemented as an LCD device. The ultrasound probe 106 may optionally include an inertial measurement unit (IMU) 158 that may house an accelerometer 160, a gyroscope 162 and a magnetometer 164.
The ultrasound probe 106 may be employed in connection with ultrasound-based visualization of a target such as the vein in preparation for inserting the needle or another medical device into the target. Such visualization gives real-time ultrasound guidance and assists in reducing complications typically associated with such insertion, including inadvertent arterial puncture, hematoma, pneumothorax, etc. The ultrasound probe 106 may be configured to provide to the console 102 electrical signals corresponding to the ultrasound-imaging data, the magnetic-field data, the shape-sensing data, or a combination thereof for the real-time ultrasound needle guidance.
In one embodiment, target detection logic 166 may be executed by the processor 116 to detect vessels and other anatomic targets in the ultrasound images. The target detection logic 166 may include, and may use, pulsatility detection logic 168 and component identification logic 170. The pulsatility detection logic 168 may compare a sequence of images of a vessel to detect pulses indicated by periodic changes in dimensions of the vessel (e.g., expansions in a diameter of the vessel). The target detection logic 166 may also detect bones by identifying tissues with high density based on color saturation in the ultrasound images. The component identification logic 170 may analyze the reflection of echoes in each ultrasound image. This can be implemented, for example, using thresholds set to identify organs, blood vessels and bones. The respective logics 166, 168 and 170 may be stored on a non-transitory computer-readable medium of the console 102. Image cropping logic 172 may be executed on the processor 116 to crop images containing the detected anatomic target (e.g., a target vein) so the anatomic target is in the center of the cropped image, which is a portion of the total ultrasound imaging area, as will be discussed in more detail herein. Herein, “cropping” may refer to reducing the amount of the ultrasound image that is displayed. Further, cropping may include increasing a magnification of the cropped portion of the ultrasound image. The target detection logic 166 and image cropping logic 172 may collectively be referred to as “console logic” or “logic of the console 102”; however, the term console logic may also include reference to any of the other logic modules illustrated in
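As one possible sketch of the pulsatility detection logic 168, the diameter of a candidate vessel could be traced across frames and tested for a dominant periodic component in the physiologic heart-rate band. The band limits and amplitude threshold below are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def is_pulsatile(diameters_mm, frame_rate_hz, min_bpm=40.0, max_bpm=180.0,
                 min_rel_amplitude=0.02):
    """Return True when a vessel's diameter trace shows a dominant periodic
    component in the heart-rate band (all thresholds are illustrative)."""
    d = np.asarray(diameters_mm, dtype=float)
    mean_d = d.mean()
    spectrum = np.abs(np.fft.rfft(d - mean_d))       # remove DC, take magnitude
    freqs = np.fft.rfftfreq(d.size, 1.0 / frame_rate_hz)
    band = (freqs >= min_bpm / 60.0) & (freqs <= max_bpm / 60.0)
    if not band.any():
        return False                                 # trace too short to judge
    peak_amplitude = 2.0 * spectrum[band].max() / d.size
    return peak_amplitude >= min_rel_amplitude * mean_d
```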
Referring to
The transducers may be configured to maintain the target in an image plane or switch to a different image plane including the target (e.g., from a plane perpendicular to a medical-device plane to a plane parallel to the medical-device plane). If the ultrasound probe 200 is configured with the moveable linear array of the ultrasonic transducers, the ultrasonic transducers may already be activated for ultrasound imaging. For example, a subset of the ultrasonic transducers or all of the available ultrasonic transducers may be moved together on the moveable linear array as needed for ultrasound imaging based on the ultrasound-imaging data to maintain the target in an image plane established by the activated ultrasonic transducers or to switch to a different image plane including the target.
The probe head 202 may be placed against the skin of a patient proximate to a needle-insertion site so the activated ultrasonic transducers in the probe head 202 may generate and emit ultrasound signals into the patient as a sequence of pulses. The ultrasonic transducers may then receive reflected ultrasound signals (i.e., reflections of the generated ultrasonic pulses from the patient's body). The reflected ultrasound signals may be converted into corresponding electrical signals for processing into ultrasound images by the console 102. Thus, a clinician may employ the ultrasound imaging system 100 depicted in
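The conversion of reflected echoes into image pixels is not detailed by the disclosure; a conventional delay-and-sum beamformer, sketched below under a plane-wave transmit assumption, is one common way such electrical signals become pixel intensities.

```python
import numpy as np

def delay_and_sum_pixel(rf, element_x, fs_hz, c_mps, px, pz):
    """Beamform one pixel at lateral position px and depth pz (meters).

    rf:        (n_elements, n_samples) array of received echo traces
    element_x: (n_elements,) lateral element positions (meters)
    Assumes a plane-wave transmit, so the transmit delay is simply pz / c
    (c is roughly 1540 m/s in soft tissue).
    """
    t_tx = pz / c_mps                                         # transmit travel time
    t_rx = np.sqrt((element_x - px) ** 2 + pz ** 2) / c_mps   # echo return times
    idx = np.clip(np.round((t_tx + t_rx) * fs_hz).astype(int), 0, rf.shape[1] - 1)
    return float(rf[np.arange(rf.shape[0]), idx].sum())       # coherent sum
```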
The ultrasound imaging system 100 depicted in
The ultrasound imaging system 100 depicted in
Referring now to
The ultrasound imaging system 100 depicted in
The target detection logic 166 may use pulsatility detection logic 168 and component identification logic 170 depicted in
Referring now to
The ultrasound image of the imaging area 300 is provided to the console 102, where the console logic processes the ultrasound image. Specifically, the target detection logic 166 analyzes the ultrasound image to detect the target vein 210. For example, the target detection logic 166 may place a bounding box surrounding the target vein 210 or may detect coordinates of a box around the target vein 210. It should be understood that the term “box” is not limited to a square or rectangle but may refer to any other shape, such as a circle, oval, etc. The image cropping logic 172 then crops the ultrasound image illustrating the imaging area 300 around the image of the target vein 210 in such a way that the target vein 210 is located in the center of the cropped image 320. For example, the image cropping logic 172, when executed by the processor 116, may crop the ultrasound image illustrating the imaging area 300 at the bounding box or coordinates determined by the target detection logic 166. Then, the cropped image 320 containing the target vein 210 may be displayed on the screen of the console 102.
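A minimal sketch of how the image cropping logic 172 might turn the detected bounding box into centered crop bounds follows; the padding margin is an assumption, and the padded box is assumed to fit within the full imaging area.

```python
def centered_crop_bounds(img_h, img_w, box, pad=20):
    """Return (top, left, bottom, right) crop bounds that keep the detected
    box centered, shifted as needed to stay inside the full imaging area.

    box: (top, left, bottom, right) pixel coordinates from target detection."""
    t, l, b, r = box
    cy, cx = (t + b) // 2, (l + r) // 2                  # center of the detected box
    half_h, half_w = (b - t) // 2 + pad, (r - l) // 2 + pad
    top = min(max(cy - half_h, 0), img_h - 2 * half_h)   # clamp to image bounds
    left = min(max(cx - half_w, 0), img_w - 2 * half_w)
    return top, left, top + 2 * half_h, left + 2 * half_w
```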
Referring to
However, according to an exemplary embodiment, logic of the ultrasound imaging system 100 is configured to detect the target vein 210 and display an image on the console 102 in which the target vein 210 appears in the center of the image (i.e., compensating for the shift of the probe 200). Therefore, even as the probe 200 is inadvertently shifted, the image displayed by the console 102 maintains the target vein 210 at the center of the displayed image, thus enabling the clinician to continue focusing on the target vein 210 itself as opposed to focusing on the inadvertent shift of the probe 200.
In other words, the cropped image 320 advantageously does not change in response to the shifting of the probe 200. The ultrasound imaging system 100 may identify and distinguish anatomic targets such as the target vein 210. Then, the ultrasound imaging system 100 may perform image tracking of the identified anatomic target. The console 102 of the ultrasound imaging system 100 may employ console logic (e.g., the target detection logic 166 and the image cropping logic 172, as referenced above) to receive a sequence of ultrasound images (or a continuous signal) from the probe 200. Further, the console logic may repeatedly detect the target vein 210 within each image and may crop the current image for visualization (e.g., as the cropped image 320). This way, the display of the cropped image 320 of the target vein 210 remains unaffected by the shifting of the probe 200, allowing the clinician to advantageously maintain sight of the visualization of the target vein 210.
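The per-frame behavior described above might be organized as the following loop, where `detect`, `crop` and `render` are hypothetical stand-ins for the target detection logic 166, the image cropping logic 172 and the display path, respectively.

```python
def track_and_display(frames, detect, crop, render):
    """Re-detect and re-crop every incoming frame so the displayed view
    stays centered on the target vein even as the probe shifts."""
    last_box = None
    for frame in frames:
        box = detect(frame) or last_box   # fall back to the last known location
        if box is None:
            render(frame)                 # nothing detected yet: show the full view
            continue
        last_box = box
        render(crop(frame, box))
```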
Referring now to
For example, the console logic may detect the target vein 210 in each ultrasound image received from the probe 200. When the ultrasound probe 200 accidentally moves in such a way that the probe head 202 is about to stop capturing the target vein 210, the console logic provides a “movement warning” alert that is displayed on the display 104, e.g., as an overlay on the cropped image 320. The console logic may detect the location of the target vein 210 relative to boundaries of the total ultrasound image area 300. The visual alert may be accompanied by an arrow indicating which way to move the probe so that the target vein 210 moves away from the edge of the imaging area. This way, the clinician is alerted in time, before losing sight of the visualization of the target vein 210. In one embodiment, the “movement warning” alert may be generated by an alert-generating logic component of the console logic. In some embodiments, the warning or alert may be an audio alert such as beeping. In some embodiments, the warning or alert may be a vibration of the probe 200, in which case the probe 200 would include a vibration motor that is communicatively coupled to the console logic. In some embodiments, the warning or alert may be any combination of a visual alert, an audio alert and/or a vibration.
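One way the alert condition could be tested is sketched below; the pixel margin is an assumed threshold, and mapping each edge to a probe-movement direction is left to the caller since it depends on probe orientation.

```python
def movement_warning(box, img_h, img_w, margin=30):
    """Return a warning string when the tracked target nears an edge of the
    total imaging area, else None. box = (top, left, bottom, right) pixels."""
    t, l, b, r = box
    if l < margin:
        return "movement warning: target near left edge"
    if img_w - r < margin:
        return "movement warning: target near right edge"
    if t < margin:
        return "movement warning: target near top edge"
    if img_h - b < margin:
        return "movement warning: target near bottom edge"
    return None
```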
Referring now to
In the example depicted in
Referring to
Referring now to
The component identification logic 170 may analyze the reflection of echoes in each image. The identification of components may be implemented by comparing characteristics of detected components (e.g., pulsatility over a plurality of images, dimensions of the detected components, color saturation, etc.) to thresholds set to define organs, blood vessels and bones. Based on the result of comparing the characteristics of detected components to the one or more thresholds, a confidence level (or score) may be determined indicating a likelihood of an identification of a particular component (e.g., a confidence score that a particular detected component is a bone or a blood vessel).
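The threshold comparison might be realized as per-class feature ranges, with the confidence level taken as the fraction of ranges a detected component satisfies. The feature names and ranges below are illustrative assumptions, not clinically tuned values.

```python
def classify_component(features, profiles):
    """Score a detected component against per-class threshold profiles.

    features: e.g., {"pulsatility": 0.1, "diameter_mm": 4.2, "brightness": 0.3}
    profiles: e.g., {"vein": {"pulsatility": (0.0, 0.3), "diameter_mm": (1.0, 12.0)},
                     "bone": {"brightness": (0.7, 1.0)}}
    Returns (best_class, confidence), with confidence the fraction of
    thresholds met for the winning class."""
    scores = {
        name: sum(lo <= features.get(key, float("nan")) <= hi
                  for key, (lo, hi) in ranges.items()) / len(ranges)
        for name, ranges in profiles.items()
    }
    best = max(scores, key=scores.get)
    return best, scores[best]
```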
Further, and in a similar manner, the target detection logic 166 may also be configured to detect a needle within an ultrasound image. A needle, such as the needle 404, may have specific and known reflection characteristics (e.g., dimensions, color saturation, etc.) such that the component identification logic 170 of the target detection logic 166 may detect and identify a needle in the same manner as discussed with respect to vessel and bone detection. Thus, the ultrasound probe 200 may be configured to assist a user such as a clinician in insertion of an access device, e.g., the needle 404, into a target vein 210 of a patient. The probe head 202 may be placed against the skin of the patient proximate to a needle-insertion site so the activated ultrasonic transducers in the probe head 202 may generate and emit ultrasound signals into the patient as a sequence of pulses. The ultrasonic transducers may then receive reflected ultrasound signals (i.e., reflections of the generated ultrasonic pulses from the patient's body). The reflected ultrasound signals may be converted into corresponding electrical signals for processing into ultrasound images by the console 102. Thus, a clinician may employ the ultrasound imaging system 100 depicted in
Following detection and identification of components included within the imaging area 400, the ultrasound imaging system 100 may be configured to generate a cropped image, such as the cropped image 420, which includes both the target vein 210 and a portion of the needle 404. In one embodiment, the image cropping logic 172 depicted in
In some embodiments, a determination of a boundary at which to crop the ultrasound image illustrating the imaging area 400 includes determination of the positioning of the needle 404 and its distance from the target vein 210. For example, the cropped image 420 may consist of a smaller bounding box surrounding the target vein 210 when the needle 404 is in close proximity to the target vein 210 and consist of a larger bounding box when the needle 404 is further away from the target vein 210. Therefore, in both situations, the cropped image 420 illustrates the target vein 210 and the needle 404. However, in other embodiments, the bounding box upon which the cropped image 420 is created is a predetermined size and cropping would not take into consideration a location of the needle 404. As noted above, it should be understood that the term “box” is not limited to a square or rectangle but may refer to any other shape, such as a circle, oval, etc.
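The distance-dependent sizing could be sketched as follows, with the pad margin as an assumption; the box grows with the needle's distance from the vein so that both remain in the cropped view (clamping to the image bounds is omitted for brevity).

```python
import math

def adaptive_crop_box(vein_center, vein_radius, needle_tip, pad=15):
    """Return a square (top, left, bottom, right) box around the target vein,
    sized so the needle tip also stays in view: tight when the needle is
    close, larger while it is far away. Coordinates are (row, col) pixels."""
    vy, vx = vein_center
    ny, nx = needle_tip
    reach = max(math.hypot(ny - vy, nx - vx), float(vein_radius))
    half = int(reach + vein_radius + pad)
    return vy - half, vx - half, vy + half, vx + half
```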
Referring now to
Referring to
In other words, the cropped image 520 advantageously does not change in response to the shifting of the probe 200. The ultrasound imaging system 100 may identify and distinguish anatomic targets such as the target vein 210. Then, the ultrasound imaging system 100 may perform image tracking of the identified anatomic target. The console 102 of the ultrasound imaging system 100 may employ console logic (e.g., the target detection logic 166 and the image cropping logic 172, as referenced above) to receive a sequence of ultrasound images (or a continuous signal) from the probe 200. Further, the console logic may repeatedly detect the target vein 210 and the needle 404 within each image and may crop the current image for visualization (e.g., as the cropped image 520). This way, the display of the cropped image 520 of the target vein 210 remains unaffected by the shifting of the probe 200, allowing the clinician to advantageously maintain sight of the visualization of the target vein 210 and the needle 404 as the needle 404 approaches the target vein 210.
Referring now to
For example, the console logic may detect the target vein 210 and the needle 404, including a distal tip 501 of the needle 404, in each ultrasound image received from the probe 200. When the ultrasound probe 200 accidentally moves in such a way that the probe head 202 is about to stop capturing the target vein 210 or the distal tip 501 of the needle 404, the console logic provides a “movement warning” alert that is displayed on the display 104, e.g., as an overlay on the cropped image 520. The console logic may detect the location of the target vein 210 relative to boundaries of the total ultrasound image area 500. The visual alert may be accompanied by an arrow indicating which way to move the probe so that the target vein 210 and the distal tip 501 move away from the edge of the imaging area. This way, the clinician is alerted in time, before losing sight of the visualization of the target vein 210 or the distal tip 501 of the needle 404. In one embodiment, the “movement warning” alert may be generated by an alert-generating logic component of the console logic. In some embodiments, the warning or alert may be an audio alert such as beeping. In some embodiments, the warning or alert may be a vibration of the probe 200, in which case the probe 200 would include a vibration motor that is communicatively coupled to the console logic. In some embodiments, the warning or alert may be any combination of a visual alert, an audio alert and/or a vibration.
Having a system that not only provides ultrasound imaging but also ensures that the needle is precisely inserted into a target based on ultrasound image tracking, regardless of accidental shifting of the ultrasound probe, advantageously reduces the risk of puncturing the patient's skin in the wrong place or even in several places.
Embodiments of the invention may be embodied in other specific forms without departing from the spirit of the present disclosure. The described embodiments are to be considered in all respects only as illustrative, not restrictive. The scope of the embodiments is, therefore, indicated by the appended claims rather than by the foregoing description. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope.
This application claims the benefit of priority to U.S. Provisional Application No. 63/119,829, filed Dec. 1, 2020, which is incorporated by reference in its entirety into this application.