A variety of ultrasound systems exist, including systems with wired or wireless ultrasound probes for ultrasound imaging. Whether wired or wireless, an ultrasound system such as the foregoing requires a clinician to switch his or her spatial attention between different spatial regions, particularly between 1) a relatively close ultrasound probe being used for ultrasound imaging and 2) a relatively distant display rendering corresponding ultrasound images. Having to switch spatial attention between the ultrasound probe and the display can be difficult when performing ultrasound imaging while simultaneously attempting to establish an insertion site with a needle, place a vascular access device (“VAD”) such as a catheter in a blood vessel of a patient at the insertion site, or the like. Such difficulties can be pronounced for less experienced clinicians, older clinicians having reduced lens flexibility in their eyes, etc. Ultrasound systems are needed that do not require clinicians to continuously switch their spatial attention between different spatial regions.
Disclosed herein are ultrasound systems and methods for sustained spatial attention in one or more spatial regions.
Disclosed herein is an ultrasound probe including, in some embodiments, a probe body, a probe head extending from a distal end of the probe body, and a camera integrated into a side of the ultrasound probe. The probe head includes a plurality of ultrasonic transducers arranged in an array. The camera is configured for recording one or more still or moving images of a procedural field with a depth of field including a plane of a distal end of the probe head and a field of view including a spatial region about the probe head.
In some embodiments, the ultrasound probe further includes a light-pattern projector integrated into the side of the ultrasound probe including the camera. The light-pattern projector is configured to project a light pattern in the spatial region about the probe head focused in the plane of the distal end of the probe head. The light pattern is configured for guided insertion of a needle into an anatomical target under the probe head in the procedural field.
In some embodiments, the light pattern includes periodic hash marks along one or more rays radiating from a central axis of the ultrasound probe in the plane of the probe head. Each hash mark of the hash marks corresponds to a depth under the probe head accessible by the needle along an associated ray at a needle-insertion angle with respect to the plane of the probe head.
In some embodiments, the light pattern includes periodic concentric circular arcs bound between two or more rays radiating from a central axis of the ultrasound probe in the plane of the probe head. Each circular arc of the circular arcs corresponds to a depth under the probe head accessible by the needle along an associated ray at a needle-insertion angle with respect to the plane of the probe head.
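By way of illustration only (the trigonometric model below is an assumption added for clarity, not a recitation of the embodiments), the correspondence between a mark's radial position and the depth accessible by the needle can be written as:

```latex
% Assumed model: a needle inserted at a hash mark or circular arc located a
% radial distance r from the central axis of the ultrasound probe, at
% needle-insertion angle \theta with respect to the plane of the probe head,
% reaches depth d directly under the probe head:
d = r \tan\theta \qquad \Longleftrightarrow \qquad r = \frac{d}{\tan\theta}
```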
In some embodiments, the ultrasound probe further includes a needle-guide holder extending from a side of the probe head in common with the side of the ultrasound probe including the camera.
In some embodiments, the ultrasound probe further includes a single-use needle guide coupled to the needle-guide holder. The needle-guide holder, the needle guide, or a combination of the needle-guide holder and the needle guide includes at least one degree of freedom enabling the needle guide to swivel between sides of the ultrasound probe.
Also disclosed herein is an ultrasound system including, in some embodiments, a console and an ultrasound probe. The console includes a display configured to render on a display screen thereof ultrasound images and one or more still or moving images of a procedural field. The ultrasound probe includes a probe body, a probe head extending from a distal end of the probe body, and a camera integrated into a side of the ultrasound probe. The probe head includes a plurality of ultrasonic transducers arranged in an array. The camera is configured for recording the one-or-more still or moving images of the procedural field with a depth of field including a plane of a distal end of the probe head and a field of view including a spatial region about the probe head.
In some embodiments, the ultrasound probe further includes a needle-guide holder extending from a side of the probe head in common with the side of the ultrasound probe including the camera.
In some embodiments, the ultrasound probe further includes a single-use needle guide coupled to the needle-guide holder. The needle-guide holder, the needle guide, or a combination of the needle-guide holder and the needle guide includes at least one degree of freedom enabling the needle guide to swivel between sides of the ultrasound probe.
In some embodiments, the ultrasound probe further includes a light-pattern projector integrated into the side of the ultrasound probe including the camera. The light-pattern projector is configured to project a light pattern in the spatial region about the probe head focused in the plane of the distal end of the probe head. The light pattern is configured for guided insertion of a needle into an anatomical target under the probe head in the procedural field.
In some embodiments, the light pattern includes periodic hash marks along one or more rays radiating from a central axis of the ultrasound probe in the plane of the probe head. Each hash mark of the hash marks corresponds to a depth under the probe head accessible by the needle along an associated ray at a needle-insertion angle with respect to the plane of the probe head.
In some embodiments, the light pattern includes periodic concentric circular arcs bound between two or more rays radiating from a central axis of the ultrasound probe in the plane of the probe head. Each circular arc of the circular arcs corresponds to a depth under the probe head accessible by the needle along an associated ray at a needle-insertion angle with respect to the plane of the probe head.
In some embodiments, the one-or-more still or moving images show both the light pattern in the spatial region about the probe head and the needle in relation to the light pattern when both the light pattern and the needle are present in the spatial region about the probe head. The one-or-more still or moving images show both the light pattern and the needle in relation to the light pattern for the guided insertion of the needle into the anatomical target under the probe head, optionally on the display.
In some embodiments, the display is further configured to render on the display screen one or more overlying needle trajectories lying over the ultrasound images in accordance with one or more depths accessible by the needle indicated by the light pattern. The one-or-more overlying needle trajectories are configured for the guided insertion of the needle into the anatomical target under the probe head on the display.
In some embodiments, the display is further configured to render on the display screen an overlying pattern lying over the one-or-more still or moving images. The overlying pattern is configured for guided insertion of a needle into an anatomical target under the probe head on the display.
In some embodiments, the overlying pattern includes periodic hash marks along one or more rays radiating from a central axis of the ultrasound probe in the plane of the probe head. Each hash mark of the hash marks corresponds to a depth under the probe head accessible by the needle along an associated ray at a needle-insertion angle with respect to the plane of the probe head.
In some embodiments, the overlying pattern includes periodic concentric circular arcs bound between two or more rays radiating from a central axis of the ultrasound probe in the plane of the probe head. Each circular arc of the circular arcs corresponds to a depth under the probe head accessible by the needle along an associated ray at a needle-insertion angle with respect to the plane of the probe head.
In some embodiments, the one-or-more still or moving images show the needle in relation to the overlying pattern when the needle is present in the spatial region about the probe head. The one-or-more still or moving images show the needle in relation to the overlying pattern for the guided insertion of the needle into the anatomical target under the probe head, optionally on the display.
In some embodiments, the display is further configured to render on the display screen one or more overlying needle trajectories lying over the ultrasound images in accordance with one or more depths accessible by the needle indicated by the overlying pattern. The one-or-more overlying needle trajectories are configured for the guided insertion of the needle into an anatomical target under the probe head on the display.
Also disclosed herein is an ultrasound probe including, in some embodiments, a probe body, a probe head extending from a distal end of the probe body, and a display integrated into a side of the ultrasound probe. The probe head includes a plurality of ultrasonic transducers arranged in an array. The display is configured to render on a display screen thereof ultrasound images and one or more overlying needle trajectories lying over the ultrasound images. The one-or-more overlying needle trajectories are configured for guided insertion of a needle into an anatomical target under the probe head on the display.
In some embodiments, the ultrasound probe further includes a light-pattern projector integrated into the side of the ultrasound probe including the display. The light-pattern projector is configured to project a light pattern in a spatial region about the probe head focused in a plane of a distal end of the probe head. The light pattern is configured for the guided insertion of the needle into the anatomical target under the probe head in the procedural field.
In some embodiments, the light pattern includes periodic hash marks along one or more rays radiating from a central axis of the ultrasound probe in the plane of the probe head. Each hash mark of the hash marks corresponds to a depth under the probe head accessible by the needle along an associated ray at a needle-insertion angle with respect to the plane of the probe head.
In some embodiments, the light pattern includes periodic concentric circular arcs bound between two or more rays radiating from a central axis of the ultrasound probe in the plane of the probe head. Each circular arc of the circular arcs corresponds to a depth under the probe head accessible by the needle along an associated ray at a needle-insertion angle with respect to the plane of the probe head.
In some embodiments, the one-or-more overlying needle trajectories lying over the ultrasound images are in accordance with one or more depths accessible by the needle indicated by the light pattern.
In some embodiments, the ultrasound probe further includes a needle-guide holder extending from the side of the ultrasound probe including the display.
In some embodiments, the ultrasound probe further includes a single-use needle guide coupled to the needle-guide holder. The needle-guide holder, the needle guide, or a combination of the needle-guide holder and the needle guide includes at least one degree of freedom enabling the needle guide to swivel between sides of the ultrasound probe.
Also disclosed herein is a method of an ultrasound system including, in some embodiments, an ultrasound probe-obtaining step, an ultrasound probe-moving step, a recording step, an ultrasound image-monitoring step, and a needle-inserting step. The ultrasound probe-obtaining step includes obtaining an ultrasound probe. The ultrasound probe includes a probe body, a probe head extending from a distal end of the probe body, and a camera integrated into a side of the ultrasound probe. The ultrasound probe-moving step includes moving the ultrasound probe over a patient while the ultrasound probe emits generated ultrasound signals into the patient from ultrasonic transducers in the probe head and receives reflected ultrasound signals from the patient by the ultrasonic transducers. The recording step includes recording one or more still or moving images of a procedural field with a depth of field including a plane of a distal end of the probe head and a field of view including a spatial region about the probe head. The ultrasound image-monitoring step includes monitoring ultrasound images rendered on a display screen of a display associated with a console of the ultrasound system to identify an anatomical target of the patient under the probe head. The needle-inserting step includes inserting a needle into the anatomical target. Optionally, the inserting of the needle is guided by the display with reference to the one-or-more still or moving images rendered on the display screen thereof.
In some embodiments, the method further includes a needle guide-attaching step. The needle guide-attaching step includes attaching a needle guide to a needle-guide holder extending from the probe body. The needle guide includes a needle through hole configured to direct the needle into the patient under the probe head at a needle-insertion angle with respect to the plane of the probe head.
In some embodiments, the method further includes a needle guide-swiveling step. The needle guide-swiveling step includes swiveling the needle guide between sides of the ultrasound probe to find a suitable needle trajectory before the needle-inserting step. The needle-guide holder, the needle guide, or a combination of the needle-guide holder and the needle guide includes at least one degree of freedom enabling the swiveling of the needle guide.
In some embodiments, the needle is guided in the procedural field during the needle-inserting step in accordance with a light pattern in the spatial region about the probe head. The light pattern is projected from a light-pattern projector integrated into the side of the ultrasound probe including the camera and focused in the plane of the distal end of the probe head for guiding the needle in the procedural field.
In some embodiments, the light pattern includes periodic hash marks along one or more rays radiating from a central axis of the ultrasound probe in the plane of the probe head. Each hash mark of the hash marks corresponds to a depth under the probe head accessible by the needle along an associated ray at a needle-insertion angle with respect to the plane of the probe head.
In some embodiments, the light pattern includes periodic concentric circular arcs bound between two or more rays radiating from a central axis of the ultrasound probe in the plane of the probe head. Each circular arc of the circular arcs corresponds to a depth under the probe head accessible by the needle along an associated ray at a needle-insertion angle with respect to the plane of the probe head.
In some embodiments, the needle is further guided on the display during the needle-inserting step. The one-or-more still or moving images show both the light pattern in the spatial region about the probe head and the needle in relation to the light pattern for guiding the needle on the display.
In some embodiments, the needle is further guided on the display during the needle-inserting step. The ultrasound images show one or more overlying needle trajectories in accordance with one or more depths accessible by the needle indicated by the light pattern for guiding the needle on the display.
In some embodiments, the needle is guided on the display during the needle-inserting step in accordance with an overlying pattern rendered over the one-or-more still or moving images on the display screen for guiding the needle on the display.
In some embodiments, the overlying pattern includes periodic hash marks along one or more rays radiating from a central axis of the ultrasound probe in the plane of the probe head. Each hash mark of the hash marks corresponds to a depth under the probe head accessible by the needle along an associated ray at a needle-insertion angle with respect to the plane of the probe head.
In some embodiments, the overlying pattern includes periodic concentric circular arcs bound between two or more rays radiating from a central axis of the ultrasound probe in the plane of the probe head. Each circular arc of the circular arcs corresponds to a depth under the probe head accessible by the needle along an associated ray at a needle-insertion angle with respect to the plane of the probe head.
In some embodiments, the needle is further guided on the display during the needle-inserting step. The ultrasound images show one or more overlying needle trajectories in accordance with one or more depths accessible by the needle indicated by the overlying pattern for guiding the needle on the display.
These and other features of the concepts provided herein will become more apparent to those of skill in the art in view of the accompanying drawings and following description, which describe particular embodiments of such concepts in greater detail.
Before some particular embodiments are disclosed in greater detail, it should be understood that the particular embodiments disclosed herein do not limit the scope of the concepts provided herein. It should also be understood that a particular embodiment disclosed herein can have features that can be readily separated from the particular embodiment and optionally combined with or substituted for features of any of a number of other embodiments disclosed herein.
Regarding terms used herein, it should also be understood that the terms are for the purpose of describing some particular embodiments, and the terms do not limit the scope of the concepts provided herein. Ordinal numbers (e.g., first, second, third, etc.) are generally used to distinguish or identify different features or steps in a group of features or steps, and do not supply a serial or numerical limitation. For example, “first,” “second,” and “third” features or steps need not necessarily appear in that order, and the particular embodiments including such features or steps need not necessarily be limited to the three features or steps. In addition, any of the foregoing features or steps can, in turn, further include one or more features or steps unless indicated otherwise. Labels such as “left,” “right,” “top,” “bottom,” “front,” “back,” and the like are used for convenience and are not intended to imply, for example, any particular fixed location, orientation, or direction. Instead, such labels are used to reflect, for example, relative location, orientation, or directions. Singular forms of “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
With respect to “proximal,” a “proximal portion” or “proximal section” of, for example, a catheter includes a portion or section of the catheter intended to be near a clinician when the catheter is used on a patient. Likewise, a “proximal length” of, for example, the catheter includes a length of the catheter intended to be near the clinician when the catheter is used on the patient. A “proximal end” of, for example, the catheter includes an end of the catheter intended to be near the clinician when the catheter is used on the patient. The proximal portion, the proximal section, or the proximal length of the catheter can include the proximal end of the catheter; however, the proximal portion, the proximal section, or the proximal length of the catheter need not include the proximal end of the catheter. That is, unless context suggests otherwise, the proximal portion, the proximal section, or the proximal length of the catheter is not a terminal portion or terminal length of the catheter.
With respect to “distal,” a “distal portion” or a “distal section” of, for example, a catheter includes a portion or section of the catheter intended to be near or in a patient when the catheter is used on the patient. Likewise, a “distal length” of, for example, the catheter includes a length of the catheter intended to be near or in the patient when the catheter is used on the patient. A “distal end” of, for example, the catheter includes an end of the catheter intended to be near or in the patient when the catheter is used on the patient. The distal portion, the distal section, or the distal length of the catheter can include the distal end of the catheter; however, the distal portion, the distal section, or the distal length of the catheter need not include the distal end of the catheter. That is, unless context suggests otherwise, the distal portion, the distal section, or the distal length of the catheter is not a terminal portion or terminal length of the catheter.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by those of ordinary skill in the art.
As set forth above, a variety of ultrasound systems exist, including systems with wired or wireless ultrasound probes for ultrasound imaging. Whether wired or wireless, an ultrasound system such as the foregoing requires a clinician to switch his or her spatial attention between different spatial regions, particularly between 1) a relatively close ultrasound probe being used for ultrasound imaging and 2) a relatively distant display rendering corresponding ultrasound images. Having to switch spatial attention between the ultrasound probe and the display can be difficult when performing ultrasound imaging while simultaneously attempting to establish an insertion site with a needle, place a VAD such as a catheter in a blood vessel of a patient at the insertion site, or the like. Such difficulties can be pronounced for less experienced clinicians, older clinicians having reduced lens flexibility in their eyes, etc. Ultrasound systems are needed that do not require clinicians to continuously switch their spatial attention between different spatial regions.
Disclosed herein are ultrasound systems and methods for sustained spatial attention. For example, an ultrasound system can include a console and an ultrasound probe. A display of the console can be configured to display ultrasound images and one or more still or moving images of a procedural field. The ultrasound probe can include a camera integrated into the ultrasound probe for recording the one-or-more still or moving images of the procedural field with a depth of field including a distal end of a probe head and a field of view including a spatial region about the probe head. With the one-or-more still or moving images displayed along with the ultrasound images, a clinician need not switch his or her spatial attention between spatial regions such as the procedural field and the display quite as frequently as with existing ultrasound systems, thereby sustaining spatial attention in one or more spatial regions. These and other features will become more apparent to those of skill in the art in view of the accompanying drawings and following description, which describe particular embodiments in greater detail.
Ultrasound Systems
As shown, the ultrasound probe 104 includes a probe body 106, a probe head 108 extending from a distal end of the probe body 106, and a plurality of ultrasonic transducers 110 arranged in an array in the probe head 108.
The ultrasound probe 104 can also include a camera 112 integrated into a side of the ultrasound probe 104, a light-pattern projector 114 (e.g., a laser light-pattern projector) integrated into the side of the ultrasound probe 104, or both the camera 112 and the light-pattern projector 114 integrated into the side of the ultrasound probe 104. Notably, the side of the ultrasound probe 104 including the camera 112 or the light-pattern projector 114 is shown in the accompanying drawings; the foregoing side can be a major side or a minor side of the ultrasound probe 104.
The camera 112 is configured for recording one or more still or moving images 120 of a procedural field including a subject portion of a patient therein. The one-or-more still or moving images 120 are recorded with a depth of field including a plane of a distal end of the probe head 108 and a field of view including a spatial region about the probe head 108.
The light-pattern projector 114 is configured to project a light pattern 122 in the spatial region about the probe head 108 focused in the plane of the distal end of the probe head 108, thereby including the foregoing subject portion of the patient in the procedural field. The light pattern 122 is configured for guided insertion of the needle 116 into an anatomical target under the probe head 108 in the procedural field. Similar to the one-or-more still or moving images 120 when rendered on the display screen of the display 158, the light pattern 122 when projected in the spatial region about the probe head 108 allows a clinician to sustain spatial attention in the procedural field when establishing an insertion site with the needle 116 as set forth in the method below, thereby obviating the need for the clinician to frequently switch his or her spatial attention between the procedural field and the display 158 as done with existing ultrasound systems.
As shown, the light pattern 122a or 122b includes periodic hash marks 124 along one or more rays 126 radiating from a central axis of the ultrasound probe 104 in the plane of the probe head 108. Indeed, the light pattern 122a includes the hash marks 124 along one ray 126 radiating from the central axis of the ultrasound probe 104, whereas the light pattern 122b includes the hash marks 124 along three rays 126 radiating from the central axis of the ultrasound probe 104. Each hash mark of the hash marks 124 corresponds to a depth under the probe head 108 accessible by the needle 116 along an associated ray 126 at a needle-insertion angle with respect to the plane of the probe head 108.
As shown, the light pattern 122c includes periodic concentric circular arcs 128 bound between two or more rays 126 radiating from the central axis of the ultrasound probe 104 in the plane of the probe head 108. Indeed, the light pattern 122c includes the circular arcs 128 bound between three rays 126 radiating from the central axis of the ultrasound probe 104. Each circular arc of the circular arcs 128 corresponds to a depth under the probe head 108 accessible by the needle 116 along an associated ray 126 at a needle-insertion angle with respect to the plane of the probe head 108.
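For illustration, a minimal sketch of where such hash marks 124 or circular arcs 128 would be placed, assuming the trigonometric correspondence noted earlier (the function name and units are hypothetical, not part of the disclosure):

```python
import math

def mark_radii_mm(depths_mm, insertion_angle_deg):
    """Radial distances from the probe's central axis, in the plane of the
    probe head, at which to place hash marks or arcs so that each mark
    corresponds to one target depth: r = d / tan(theta)."""
    theta = math.radians(insertion_angle_deg)
    return [depth / math.tan(theta) for depth in depths_mm]

# Example: marks for depths of 5, 10, 15, and 20 mm at a 45-degree
# needle-insertion angle land 5, 10, 15, and 20 mm from the central axis.
print(mark_radii_mm([5, 10, 15, 20], 45))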
The ultrasound probe 104 can also include a needle-guide holder 130 extending from the side of the probe head 108 in common with the side of the ultrasound probe 104 including the camera 112, whether the foregoing side is the major or minor side of the ultrasound probe 104 including the camera 112 or the light-pattern projector 114.
The ultrasound probe 104 can also include a single-use needle guide 132 configured to couple to the needle-guide holder 130. The needle guide 132, the needle-guide holder 130, or a combination of the needle guide 132 and the needle-guide holder 130 can include at least one degree of freedom enabling the needle guide 132 to swivel between sides of the ultrasound probe 104. Indeed, the needle guide 132 can swivel between minor sides of the ultrasound probe 104 if the needle-guide holder 130 extends from a major side of the ultrasound probe 104. The needle guide 132 can alternatively swivel between major sides of the ultrasound probe 104 if the needle-guide holder 130 extends from a minor side of the ultrasound probe 104. To enable the needle guide 132 to swivel between the foregoing sides of the ultrasound probe 104, the needle guide 132 and the needle-guide holder 130 can include a joint (e.g., ball joint) formed therebetween that provides the degree of freedom needed. If the needle guide 132 is used with the needle 116 to establish an insertion site, the needle guide 132 can be advantageously swiveled along each circular arc of the circular arcs 128 of the light pattern 122c. The needle 116 can be subsequently inserted along any existing or envisioned ray of the light pattern 122c to establish an insertion site.
As shown, the ultrasound probe 204 includes a probe body 206, a probe head 208 extending from a distal end of the probe body 206, and the plurality of ultrasonic transducers 110 arranged in an array in the probe head 208. In addition, the ultrasound probe 204 can include the camera 112 integrated into a side of the ultrasound probe 204, the light-pattern projector 114 integrated into the side of the ultrasound probe 204, or both the camera 112 and the light-pattern projector 114 integrated into the side of the ultrasound probe 204. As such, the ultrasound probe 204 is like the ultrasound probe 104 in certain ways. Therefore, the description set forth above for the ultrasound probe 104 likewise applies to the ultrasound probe 204.
The ultrasound probe 204 also includes a display 134 integrated into the side of the ultrasound probe 204, specifically the top side (or front face) of the ultrasound probe 204, which differentiates the ultrasound probe 204 from the ultrasound probe 104. The display 134 is configured to render ultrasound images 136 on a display screen thereof, which allows a clinician to sustain spatial attention in the procedural field when establishing an insertion site with the needle 116 as set forth in the method below, thereby obviating the need for the clinician to frequently switch his or her spatial attention between the procedural field, which includes the display 134, and another display (e.g., the display 158 of the console 102) as done with existing ultrasound systems. In addition, the display 134 is configured to render one or more overlying needle trajectories 138 over the ultrasound images 136 for guided insertion of the needle 116 into an anatomical target under the probe head 208 on the display 134.
Notably, the ultrasound probe 104 or 204 can include magnetic sensors to enhance guided insertion of the needle 116 into an anatomical target as set forth herein with magnetic-based needle guidance. Such magnetic-based needle guidance is disclosed in U.S. Pat. Nos. 8,388,541; 8,781,555; 8,849,382; 9,456,766; 9,492,097; 9,521,961; 9,554,716; 9,636,031; 9,649,048; 10,449,330; 10,524,691; and 10,751,509, each of which is incorporated by reference in its entirety into this application.
As shown, the console 102 includes a variety of components including a processor 140 and memory 142 such as random-access memory (“RAM”) or non-volatile memory (e.g., electrically erasable programmable read-only memory [“EEPROM”]) for controlling various functions of the ultrasound system 100 during operation thereof. Indeed, the console 102 is configured to instantiate by way of executable instructions 144 stored in the memory 142 and executed by the processor 140 various processes for controlling the various functions of the ultrasound system 100.
As to the various processes for controlling the various functions of the ultrasound system 100, the various processes can include beamforming by way of a beamformer configured to drive the ultrasonic transducers 110, wherein driving the ultrasonic transducers 110 includes emitting generated ultrasound signals as well as receiving, amplifying, and digitizing reflected ultrasound signals; signal processing by way of a signal processor configured to detect an amplitude of each of the foregoing reflected ultrasound signals or the digitized signals corresponding thereto; and image processing by way of an image processor configured to manage storage of detected amplitudes and send the ultrasound images 136 corresponding to collections of the detected amplitudes to the display screen of the display 134 or 158 upon completion of the ultrasound images 136.
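As a hedged illustration of the amplitude-detection and image-formation stages named above (a sketch only; the beamforming stage is omitted, and the function name, array shapes, and dynamic-range parameter are assumptions):

```python
import numpy as np
from scipy.signal import hilbert

def rf_to_bmode(rf_lines, dynamic_range_db=60.0):
    """Detect the amplitude of digitized reflected ultrasound signals and
    log-compress the result for display. `rf_lines` is a 2-D array of
    samples x scan lines."""
    envelope = np.abs(hilbert(rf_lines, axis=0))   # amplitude detection
    envelope /= envelope.max() + 1e-12             # normalize to [0, 1]
    bmode_db = 20.0 * np.log10(envelope + 1e-12)   # log compression
    return np.clip(bmode_db, -dynamic_range_db, 0.0)  # display dynamic range
```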
Further to the various processes for controlling the various functions of the ultrasound system 100, the various processes can include processing electrical signals corresponding to color and brightness data from an image sensor of the camera 112 of the ultrasound probe 104 or 204 into the one-or-more still or moving images 120; determining depths for various anatomical structures in the ultrasound images 136 by way of delays in time between emitting the generated ultrasound signals from the ultrasonic transducers 110 and receiving the reflected ultrasound signals by the ultrasonic transducers 110; adjusting a scale of the light pattern 122 projected from the light-pattern projector 114 in accordance with both the depths for the various anatomical structures in the ultrasound images 136 and a needle-insertion angle, wherein the needle-insertion angle is selected from a single ultrasound system-defined needle-insertion angle, a clinician-selected needle-insertion angle among various ultrasound system-defined needle-insertion angles, and a dynamic needle-insertion angle determined by way of magnetic-based needle guidance; adjusting a scale of the overlying pattern 160 lying over the one-or-more still or moving images 120 in accordance with both the depths for the various anatomical structures in the ultrasound images 136 and the needle-insertion angle; and adjusting a scale of the one-or-more needle trajectories 138 lying over the ultrasound images 136 in accordance with both the depths for various anatomical structures in the ultrasound images 136 and the needle-insertion angle.
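A sketch of the depth and scale computations described above, assuming a nominal soft-tissue speed of sound and the trigonometric model noted earlier (names and units are illustrative, not recitations of the disclosure):

```python
import math

SPEED_OF_SOUND_M_S = 1540.0  # assumed nominal speed of sound in soft tissue

def depth_from_echo_delay(delay_s):
    """One-way depth of an anatomical structure from the round-trip delay
    between emitting a generated ultrasound signal and receiving its
    reflection: d = c * t / 2."""
    return SPEED_OF_SOUND_M_S * delay_s / 2.0

def pattern_scale(target_depth_m, needle_insertion_angle_deg):
    """Radial extent the projected or overlying pattern must span so its
    outermost mark corresponds to the target depth at the selected
    needle-insertion angle."""
    return target_depth_m / math.tan(math.radians(needle_insertion_angle_deg))

# Example: a 26-microsecond round trip corresponds to a depth of about 2 cm;
# at 45 degrees, the outermost mark sits about 2 cm from the central axis.
depth = depth_from_echo_delay(26e-6)
print(round(depth, 4), round(pattern_scale(depth, 45), 4))
```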
The console 102 also includes a digital controller/analog interface 146 in communication with both the processor 140 and other system components to govern interfacing between the ultrasound probe 104 or 204 and the foregoing system components. Ports 148 are also included in the console 102 for connection with additional system components. The ports 148 can be universal serial bus (“USB”) ports, though other types of ports can be used for these connections or any other connections shown or described herein.
A power connection 150 is included with the console 102 to enable an operable connection to an external power supply 152. An internal power supply 154 (e.g., a battery) can also be employed either with or exclusive of the external power supply 152. Power management circuitry 156 is included with the digital controller/analog interface 146 of the console 102 to regulate power use and distribution.
A display 158 integrated into the console 102 is configured to render on a display screen thereof a graphical user interface (“GUI”), the ultrasound images 136 attained by the ultrasound probe 104 or 204, the one-or-more still or moving images 120 of the procedural field attained by the camera 112 of the ultrasound probe 104 or 204, an overlying pattern 160 lying over the one-or-more still or moving images 120, the one-or-more needle trajectories 138 lying over the ultrasound images 136, etc. That said, the display 158 can alternatively be separate from the console 102 and communicatively coupled thereto. Regardless, control buttons can be used for selecting a desired mode of the ultrasound system 100.
When rendered on the display screen, the one-or-more still or moving images 120 show at least the needle 116 when the needle 116 is present in the spatial region about the probe head 108 or 208, which, even alone, allows a clinician to sustain spatial attention on the display 158 when establishing an insertion site with the needle 116. If the ultrasound probe 104 or 204 includes the light-pattern projector 114, however, the one-or-more still or moving images 120 can show both the light pattern 122 in the spatial region about the probe head 108 or 208 and the needle 116 in relation to the light pattern 122 for guided insertion of the needle 116 into an anatomical target under the probe head 108 or 208 on the display 158. Having both the light pattern 122 and the needle 116 shown in the one-or-more still or moving images 120 further allows a clinician to sustain spatial attention on the display 158 when establishing the insertion site with the needle 116, thereby obviating the need for the clinician to frequently switch his or her spatial attention between the display 158 and the procedural field as done with existing ultrasound systems.
Following on the foregoing, if the ultrasound probe 104 or 204 does not include the light-pattern projector 114, or if a clinician prefers not to use the light-pattern projector 114 of the ultrasound probe 104 or 204, the one-or-more still or moving images 120 can show the overlying pattern 160 lying thereover. When the needle 116 is present in the spatial region about the probe head 108 or 208, the one-or-more still or moving images 120 can thus show both the overlying pattern 160 and the needle 116 in relation to the overlying pattern 160 for guided insertion of the needle 116 into an anatomical target under the probe head 108 or 208 on the display 158. Having both the overlying pattern 160 and the needle 116 shown in the one-or-more still or moving images 120 further allows a clinician to sustain spatial attention on the display 158 when establishing the insertion site with the needle 116, thereby obviating the need for the clinician to frequently switch his or her spatial attention between the display 158 and the procedural field as done with existing ultrasound systems.
Like the light pattern 122a or 122b, the overlying pattern 160a or 160b includes the periodic hash marks 124 along one or more rays 126 radiating from the central axis of the ultrasound probe 104 or 204 in the plane of the probe head 108 or 208; however, unlike the light pattern 122a or 122b, the hash marks 124 and the one-or-more rays 126 are virtual, existing only on the display screen. By analogy to the light pattern 122a, the overlying pattern 160a likewise includes the hash marks 124 along one ray 126 radiating from the central axis of the ultrasound probe 104 or 204, and, by analogy to the light pattern 122b, the overlying pattern 160b likewise includes the hash marks 124 along three rays 126 radiating from the central axis of the ultrasound probe 104 or 204. Each hash mark of the hash marks 124 corresponds to a depth under the probe head 108 or 208 accessible by the needle 116 along an associated ray 126 at a needle-insertion angle with respect to the plane of the probe head 108 or 208.
Like the light pattern 122c, the overlying pattern 160c includes periodic concentric circular arcs 128 bound between two or more rays 126 radiating from a central axis of the ultrasound probe 104 or 204 in the plane of the probe head 108 or 208; however, unlike the light pattern 122c, the circular arcs 128 and the two-or-more rays 126 are virtual, existing only on the display screen. By analogy to the light pattern 122c, the overlying pattern 160c likewise includes the circular arcs 128 bound between three rays 126 radiating from the central axis of the ultrasound probe 104 or 204. Each circular arc of the circular arcs 128 corresponds to a depth under the probe head 108 or 208 accessible by the needle 116 along an associated ray 126 at a needle-insertion angle with respect to the plane of the probe head 108 or 208. Notably, the associated ray 126 can be an intervening ray between the two-or-more rays 126 of the overlying pattern 160c radiating from the central axis of the ultrasound probe 104 or 204. The intervening ray need not be a visible ray of the overlying pattern 160c; the intervening ray can be envisioned between the two-or-more rays 126 of the overlying pattern 160c and followed with the needle 116 when establishing an insertion site therewith as set forth in the method below.
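Because the overlying pattern 160 is virtual, its marks must be drawn at display coordinates. A minimal sketch (the camera image scale and all names are assumptions) of sampling one such arc between two bounding rays:

```python
import math

def arc_points_px(depth_mm, insertion_angle_deg, ray1_deg, ray2_deg,
                  px_per_mm, n=32):
    """Sample points along one virtual circular arc of the overlying pattern,
    bound between two rays, at the pixel radius where the physical mark for
    `depth_mm` would appear in the camera image."""
    radius_px = px_per_mm * depth_mm / math.tan(math.radians(insertion_angle_deg))
    a0, a1 = math.radians(ray1_deg), math.radians(ray2_deg)
    step = (a1 - a0) / (n - 1)
    return [(radius_px * math.cos(a0 + i * step),
             radius_px * math.sin(a0 + i * step)) for i in range(n)]
```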
As set forth above, the display 158 is configured to render on the display screen thereof the one-or-more needle trajectories 138 lying over the ultrasound images 136. The one-or-more needle trajectories 138 are configured for guided insertion of the needle 116 into an anatomical target under the probe head 108 or 208 on the display 158. Indeed, as shown in the accompanying drawings, the needle trajectories 138 are labeled (e.g., ‘1,’ ‘2,’ and ‘3’) in accordance with the one-or-more depths accessible by the needle 116 indicated by the light pattern 122 or the overlying pattern 160.
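A sketch, under the same assumed geometry (names and units illustrative), of the two endpoints of one such needle trajectory in the ultrasound image plane:

```python
import math

def trajectory_endpoints_mm(entry_offset_mm, insertion_angle_deg):
    """Endpoints of one overlying needle trajectory: the entry point at the
    skin surface, offset laterally from the probe's central axis, and the
    point reached directly under the central axis at depth
    offset * tan(angle)."""
    theta = math.radians(insertion_angle_deg)
    return (-entry_offset_mm, 0.0), (0.0, entry_offset_mm * math.tan(theta))

# Example: entering 10 mm from the probe at 45 degrees reaches a depth of
# 10 mm under the central axis.
print(trajectory_endpoints_mm(10, 45))
```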
Adverting briefly back to the ultrasound probe 104 or 204, the ultrasound probe 104 or 204 includes the buttons 118 for operating the ultrasound probe 104 or 204 or the ultrasound system 100 of which the ultrasound probe 104 or 204 is part. For example, the buttons 118 can be configured for selecting a desired mode of the ultrasound system 100 as set forth above. The ultrasound probe 104 or 204 includes a button-and-memory controller 164 configured for operable communication with a probe interface 166 of the console 102, which probe interface 166 includes an input/output (“I/O”) component 168 for interfacing with the ultrasonic transducers 110 and a button-and-memory I/O component 170 for interfacing with the button-and-memory controller 164.
Methods
Methods include a method of using the ultrasound system 100 to establish an insertion site for access to an anatomical structure (e.g., blood vessel) of a patient. The method includes one or more steps selected from an ultrasound probe-obtaining step, an ultrasound probe-moving step, a recording step, an ultrasound image-monitoring step, a needle guide-attaching step, a needle guide-swiveling step, and a needle-inserting step.
The ultrasound probe-obtaining step includes obtaining the ultrasound probe 104. As set forth above, the ultrasound probe 104 includes the probe body 106, the probe head 108 extending from the distal end of the probe body 106, and the camera 112 integrated into the side of the ultrasound probe 104.
The needle guide-attaching step includes attaching the needle guide 132 to the needle-guide holder 130 extending from the probe body 106. The needle guide 132 includes a needle through hole configured to direct the needle 116 into the patient under the probe head 108 at the needle-insertion angle defined by the needle guide 132.
The ultrasound probe-moving step includes moving the ultrasound probe 104 over the patient while the ultrasound probe 104 emits generated ultrasound signals into the patient from the ultrasonic transducers 110 in the probe head 108 and receives reflected ultrasound signals from the patient by the ultrasonic transducers 110.
The recording step includes recording the one-or-more still or moving images 120 of the procedural field including a subject portion of the patient therein. As set forth above, the one-or-more still or moving images 120 are recorded with a depth of field including the plane of the distal end of the probe head 108 and the field of view including the spatial region about the probe head 108.
The ultrasound image-monitoring step includes monitoring ultrasound images 136 rendered on the display screen of the display 158 associated with the console 102 of the ultrasound system 100 to identify an anatomical target of the patient under the probe head 108.
The needle guide-swiveling step includes swiveling the needle guide 132 between sides of the ultrasound probe 104 to find a suitable needle trajectory before the needle-inserting step. The needle-guide holder 130, the needle guide 132, or a combination of the needle-guide holder 130 and the needle guide 132 such as the joint formed therebetween includes at least one degree of freedom enabling the swiveling of the needle guide 132.
The needle-inserting step includes inserting the needle 116 into the anatomical target. The inserting of the needle 116 into the anatomical target during the needle-inserting step is guided in the procedural field with reference to the light pattern 122 in the spatial region about the probe head 108, on the display 158 with reference to the one-or-more still or moving images 120 or the one-or-more needle trajectories 138 rendered on the display screen thereof, or a combination thereof.
As to guidance in the procedural field with reference to the light pattern 122, the light pattern 122 is projected into the spatial region about the probe head 108 from the light-pattern projector 114 and focused in the plane of the distal end of the probe head 108 for guiding the needle 116 in the procedural field. As set forth above, the light pattern 122a or 122b includes the periodic hash marks 124 along the one-or-more rays 126 radiating from the central axis of the ultrasound probe 104 in the plane of the probe head 108. Each hash mark of the hash marks 124 corresponds to a depth under the probe head 108 accessible by the needle 116 along an associated ray 126 at a needle-insertion angle with respect to the plane of the probe head 108. As further set forth above, the light pattern 122c includes the periodic concentric circular arcs 128 bound between the two-or-more rays 126 radiating from the central axis of the ultrasound probe 104 in the plane of the probe head 108. Each circular arc of the circular arcs 128 corresponds to a depth under the probe head 108 accessible by the needle 116 along an associated ray 126 at a needle-insertion angle with respect to the plane of the probe head 108.
As to guidance on the display 158 with reference to the one-or-more still or moving images 120, the one-or-more still or moving images 120 can show both the light pattern 122 in the spatial region about the probe head 108 and the needle 116 in relation to the light pattern 122 for guiding the needle 116 on the display 158. However, if the ultrasound probe 104 does not include the light-pattern projector 114, or if a clinician prefers not to use the light-pattern projector 114 of the ultrasound probe 104, the one-or-more still or moving images 120 can show the overlying pattern 160 lying thereover for guiding the needle 116 on the display 158. As set forth above, the overlying pattern 160a or 160b includes the periodic hash marks 124 along the one-or-more rays 126 radiating from the central axis of the ultrasound probe 104 in the plane of the probe head 108. Each hash mark of the hash marks 124 corresponds to a depth under the probe head 108 accessible by the needle 116 along an associated ray 126 at a needle-insertion angle with respect to the plane of the probe head 108. As further set forth above, the overlying pattern 160c includes the periodic concentric circular arcs 128 bound between the two-or-more rays 126 radiating from the central axis of the ultrasound probe 104 in the plane of the probe head 108. Each circular arc of the circular arcs 128 corresponds to a depth under the probe head 108 accessible by the needle 116 along an associated ray 126 at a needle-insertion angle with respect to the plane of the probe head 108.
Further as to guidance on the display 158 with reference to the one-or-more needle trajectories 138, the ultrasound images 136 can show the one-or-more needle trajectories 138 in accordance with one or more depths accessible by the needle 116 indicated by the light pattern 122 or overlying pattern 160 in the one-or-more still or moving images 120 for guiding the needle 116 on the display 158.
Notably, the foregoing method involves the ultrasound probe 104; however, the method can be modified for the ultrasound probe 204. In such a method, the ultrasound images 136 are displayed on the display 134 of the ultrasound probe 204, optionally in combination with the ultrasound images 136 and the one-or-more still or moving images 120 on the display 158 of the console 102. As set forth above, displaying the images on the display 134 of the ultrasound probe 204 allows a clinician to sustain spatial attention in the procedural field when establishing the insertion site with the needle 116 in the needle-inserting step, thereby obviating the need for the clinician to frequently switch his or her spatial attention between the procedural field, which includes the display 134, and another display (e.g., the display 158 of the console 102) as done with existing ultrasound systems.
While some particular embodiments have been disclosed herein, and while the particular embodiments have been disclosed in some detail, it is not the intention for the particular embodiments to limit the scope of the concepts provided herein. Additional adaptations or modifications can appear to those of ordinary skill in the art, and, in broader aspects, these adaptations or modifications are encompassed as well. Accordingly, departures may be made from the particular embodiments disclosed herein without departing from the scope of the concepts provided herein.
This application claims the benefit of priority to U.S. Provisional Patent Application No. 63/086,971, filed Oct. 2, 2020, which is incorporated by reference in its entirety into this application.
Number | Name | Date | Kind |
---|---|---|---|
5148809 | Biegeleisen-Knight et al. | Sep 1992 | A |
5181513 | Touboul et al. | Jan 1993 | A |
5325293 | Dorne | Jun 1994 | A |
5441052 | Miyajima | Aug 1995 | A |
5549554 | Miraki | Aug 1996 | A |
5573529 | Haak et al. | Nov 1996 | A |
5775322 | Silverstein et al. | Jul 1998 | A |
5879297 | Haynor et al. | Mar 1999 | A |
5908387 | LeFree et al. | Jun 1999 | A |
5967984 | Chu et al. | Oct 1999 | A |
5970119 | Hofmann | Oct 1999 | A |
6004270 | Urbano et al. | Dec 1999 | A |
6019724 | Gronningsaeter et al. | Feb 2000 | A |
6068599 | Saito et al. | May 2000 | A |
6074367 | Hubbell | Jun 2000 | A |
6129668 | Haynor et al. | Oct 2000 | A |
6132379 | Patacsil et al. | Oct 2000 | A |
6216028 | Haynor et al. | Apr 2001 | B1 |
6233476 | Strommer et al. | May 2001 | B1 |
6245018 | Lee | Jun 2001 | B1 |
6263230 | Haynor et al. | Jul 2001 | B1 |
6375615 | Flaherty et al. | Apr 2002 | B1 |
6436043 | Bonnefous | Aug 2002 | B2 |
6498942 | Esenaliev et al. | Dec 2002 | B1 |
6503205 | Manor et al. | Jan 2003 | B2 |
6508769 | Bonnefous | Jan 2003 | B2 |
6511458 | Milo et al. | Jan 2003 | B2 |
6524249 | Moehring et al. | Feb 2003 | B2 |
6543642 | Milliorn | Apr 2003 | B1 |
6554771 | Buil et al. | Apr 2003 | B1 |
6592520 | Peszynski et al. | Jul 2003 | B1 |
6592565 | Twardowski | Jul 2003 | B2 |
6601705 | Molina et al. | Aug 2003 | B2 |
6612992 | Hossack et al. | Sep 2003 | B1 |
6613002 | Clark et al. | Sep 2003 | B1 |
6623431 | Sakuma et al. | Sep 2003 | B1 |
6641538 | Nakaya et al. | Nov 2003 | B2 |
6647135 | Bonnefous | Nov 2003 | B2 |
6687386 | Ito et al. | Feb 2004 | B1 |
6749569 | Pellegretti | Jun 2004 | B1 |
6754608 | Svanerudh et al. | Jun 2004 | B2 |
6755789 | Stringer et al. | Jun 2004 | B2 |
6840379 | Franks-Farah et al. | Jan 2005 | B2 |
6857196 | Dalrymple | Feb 2005 | B2 |
6979294 | Selzer et al. | Dec 2005 | B1 |
7074187 | Selzer et al. | Jul 2006 | B2 |
7244234 | Ridley et al. | Jul 2007 | B2 |
7359554 | Klingensmith et al. | Apr 2008 | B2 |
7534209 | Abend et al. | May 2009 | B2 |
7599730 | Hunter et al. | Oct 2009 | B2 |
7637870 | Flaherty et al. | Dec 2009 | B2 |
7681579 | Schwartz | Mar 2010 | B2 |
7691061 | Hirota | Apr 2010 | B2 |
7699779 | Sasaki et al. | Apr 2010 | B2 |
7720520 | Willis | May 2010 | B2 |
7727153 | Fritz et al. | Jun 2010 | B2 |
7734326 | Pedain et al. | Jun 2010 | B2 |
7831449 | Ying et al. | Nov 2010 | B2 |
7905837 | Suzuki | Mar 2011 | B2 |
7925327 | Weese | Apr 2011 | B2 |
7927278 | Selzer et al. | Apr 2011 | B2 |
8014848 | Birkenbach et al. | Sep 2011 | B2 |
8050523 | Younge et al. | Nov 2011 | B2 |
8060181 | Rodriguez Ponce et al. | Nov 2011 | B2 |
8068581 | Boese et al. | Nov 2011 | B2 |
8075488 | Burton | Dec 2011 | B2 |
8090427 | Eck et al. | Jan 2012 | B2 |
8105239 | Specht | Jan 2012 | B2 |
8172754 | Watanabe et al. | May 2012 | B2 |
8175368 | Sathyanarayana | May 2012 | B2 |
8200313 | Rambod et al. | Jun 2012 | B1 |
8211023 | Swan et al. | Jul 2012 | B2 |
8228347 | Beasley et al. | Jul 2012 | B2 |
8298147 | Huennekens et al. | Oct 2012 | B2 |
8303505 | Webler et al. | Nov 2012 | B2 |
8323202 | Roschak et al. | Dec 2012 | B2 |
8328727 | Miele et al. | Dec 2012 | B2 |
8388541 | Messerly et al. | Mar 2013 | B2 |
8409103 | Grunwald et al. | Apr 2013 | B2 |
8449465 | Nair et al. | May 2013 | B2 |
8553954 | Saikia | Oct 2013 | B2 |
8556815 | Pelissier et al. | Oct 2013 | B2 |
8585600 | Liu et al. | Nov 2013 | B2 |
8622913 | Dentinger et al. | Jan 2014 | B2 |
8706457 | Hart et al. | Apr 2014 | B2 |
8727988 | Flaherty et al. | May 2014 | B2 |
8734357 | Taylor | May 2014 | B2 |
8744211 | Owen | Jun 2014 | B2 |
8754865 | Merritt et al. | Jun 2014 | B2 |
8764663 | Smok et al. | Jul 2014 | B2 |
8781194 | Malek et al. | Jul 2014 | B2 |
8781555 | Burnside et al. | Jul 2014 | B2 |
8790263 | Randall et al. | Jul 2014 | B2 |
8849382 | Cox et al. | Sep 2014 | B2 |
8939908 | Suzuki et al. | Jan 2015 | B2 |
8961420 | Zhang | Feb 2015 | B2 |
9022940 | Meier | May 2015 | B2 |
9138290 | Hadjicostis | Sep 2015 | B2 |
9155517 | Dunbar et al. | Oct 2015 | B2 |
9204858 | Pelissier et al. | Dec 2015 | B2 |
9220477 | Urabe et al. | Dec 2015 | B2 |
9257220 | Nicholls et al. | Feb 2016 | B2 |
9295447 | Shah | Mar 2016 | B2 |
9320493 | Visveshwara | Apr 2016 | B2 |
9357980 | Toji et al. | Jun 2016 | B2 |
9364171 | Harris et al. | Jun 2016 | B2 |
9427207 | Sheldon et al. | Aug 2016 | B2 |
9445780 | Hossack et al. | Sep 2016 | B2 |
9456766 | Cox et al. | Oct 2016 | B2 |
9456804 | Tamada | Oct 2016 | B2 |
9459087 | Dunbar et al. | Oct 2016 | B2 |
9468413 | Hall et al. | Oct 2016 | B2 |
9492097 | Wilkes et al. | Nov 2016 | B2 |
9521961 | Silverstein et al. | Dec 2016 | B2 |
9554716 | Burnside et al. | Jan 2017 | B2 |
9582876 | Specht | Feb 2017 | B2 |
9597008 | Henkel et al. | Mar 2017 | B2 |
9610061 | Ebbini et al. | Apr 2017 | B2 |
9636031 | Cox | May 2017 | B2 |
9649037 | Lowe et al. | May 2017 | B2 |
9649048 | Cox et al. | May 2017 | B2 |
9702969 | Hope Simpson et al. | Jul 2017 | B2 |
9715757 | Ng et al. | Jul 2017 | B2 |
9717415 | Cohen et al. | Aug 2017 | B2 |
9731066 | Liu et al. | Aug 2017 | B2 |
9814433 | Benishti et al. | Nov 2017 | B2 |
9814531 | Yagi et al. | Nov 2017 | B2 |
9861337 | Patwardhan et al. | Jan 2018 | B2 |
9895138 | Sasaki | Feb 2018 | B2 |
9913605 | Harris et al. | Mar 2018 | B2 |
9949720 | Southard et al. | Apr 2018 | B2 |
10043272 | Forzoni et al. | Aug 2018 | B2 |
10380919 | Savitsky et al. | Aug 2019 | B2 |
10380920 | Savitsky et al. | Aug 2019 | B2 |
10424225 | Nataneli et al. | Sep 2019 | B2 |
10434278 | Dunbar et al. | Oct 2019 | B2 |
10449330 | Newman et al. | Oct 2019 | B2 |
10524691 | Newman et al. | Jan 2020 | B2 |
10674935 | Henkel et al. | Jun 2020 | B2 |
10751509 | Misener | Aug 2020 | B2 |
10758155 | Henkel et al. | Sep 2020 | B2 |
10765343 | Henkel et al. | Sep 2020 | B2 |
10896628 | Savitsky et al. | Jan 2021 | B2 |
11062624 | Savitsky et al. | Jul 2021 | B2 |
11120709 | Savitsky et al. | Sep 2021 | B2 |
11311269 | Dunbar et al. | Apr 2022 | B2 |
11315439 | Savitsky et al. | Apr 2022 | B2 |
11600201 | Savitsky et al. | Mar 2023 | B1 |
20020038088 | Imran et al. | Mar 2002 | A1 |
20020148277 | Umeda | Oct 2002 | A1 |
20030047126 | Tomaschko | Mar 2003 | A1 |
20030060714 | Henderson et al. | Mar 2003 | A1 |
20030073900 | Senarith et al. | Apr 2003 | A1 |
20030093001 | Martikainen | May 2003 | A1 |
20030106825 | Molina et al. | Jun 2003 | A1 |
20030120154 | Sauer et al. | Jun 2003 | A1 |
20040055925 | Franks-Farah et al. | Mar 2004 | A1 |
20050000975 | Carco et al. | Jan 2005 | A1 |
20050049504 | Lo et al. | Mar 2005 | A1 |
20050165299 | Kressy et al. | Jul 2005 | A1 |
20050251030 | Azar et al. | Nov 2005 | A1 |
20050267365 | Sokulin et al. | Dec 2005 | A1 |
20060013523 | Childlers et al. | Jan 2006 | A1 |
20060015039 | Cassidy et al. | Jan 2006 | A1 |
20060020204 | Serra et al. | Jan 2006 | A1 |
20060079781 | Germond-Rouet et al. | Apr 2006 | A1 |
20060184029 | Haim et al. | Aug 2006 | A1 |
20060210130 | Germond-Rouet et al. | Sep 2006 | A1 |
20070043341 | Anderson et al. | Feb 2007 | A1 |
20070049822 | Bunce et al. | Mar 2007 | A1 |
20070073155 | Park et al. | Mar 2007 | A1 |
20070199848 | Ellswood et al. | Aug 2007 | A1 |
20070239120 | Brock et al. | Oct 2007 | A1 |
20070249911 | Simon | Oct 2007 | A1 |
20080021322 | Stone et al. | Jan 2008 | A1 |
20080033293 | Beasley et al. | Feb 2008 | A1 |
20080033759 | Finlay | Feb 2008 | A1 |
20080051657 | Rold | Feb 2008 | A1 |
20080146915 | McMorrow | Jun 2008 | A1 |
20080177186 | Slater et al. | Jul 2008 | A1 |
20080221425 | Olson et al. | Sep 2008 | A1 |
20080294037 | Richter | Nov 2008 | A1 |
20080300491 | Bonde et al. | Dec 2008 | A1 |
20090012399 | Sunagawa et al. | Jan 2009 | A1 |
20090143672 | Harms et al. | Jun 2009 | A1 |
20090143684 | Cermak et al. | Jun 2009 | A1 |
20090156926 | Messerly et al. | Jun 2009 | A1 |
20090306509 | Pedersen et al. | Dec 2009 | A1 |
20100020926 | Boese et al. | Jan 2010 | A1 |
20100106015 | Norris | Apr 2010 | A1 |
20100179428 | Pedersen et al. | Jul 2010 | A1 |
20100211026 | Sheetz et al. | Aug 2010 | A2 |
20100277305 | Garner et al. | Nov 2010 | A1 |
20100286515 | Gravenstein et al. | Nov 2010 | A1 |
20100312121 | Guan | Dec 2010 | A1 |
20110002518 | Ziv-Ari et al. | Jan 2011 | A1 |
20110071404 | Schmitt et al. | Mar 2011 | A1 |
20110295108 | Cox et al. | Dec 2011 | A1 |
20110313293 | Lindekugel et al. | Dec 2011 | A1 |
20120179038 | Meurer et al. | Jul 2012 | A1 |
20120197132 | O'Connor | Aug 2012 | A1 |
20120209121 | Boudier | Aug 2012 | A1 |
20120220865 | Brown et al. | Aug 2012 | A1 |
20120238875 | Savitsky et al. | Sep 2012 | A1 |
20120277576 | Lui | Nov 2012 | A1 |
20130041250 | Pelissier et al. | Feb 2013 | A1 |
20130102889 | Southard et al. | Apr 2013 | A1 |
20130131499 | Chan et al. | May 2013 | A1 |
20130131502 | Blaivas et al. | May 2013 | A1 |
20130150724 | Blaivas et al. | Jun 2013 | A1 |
20130188832 | Ma et al. | Jul 2013 | A1 |
20130218024 | Boctor | Aug 2013 | A1 |
20130324840 | Zhongping et al. | Dec 2013 | A1 |
20140005530 | Liu et al. | Jan 2014 | A1 |
20140031690 | Toji et al. | Jan 2014 | A1 |
20140036091 | Zalev et al. | Feb 2014 | A1 |
20140073976 | Fonte et al. | Mar 2014 | A1 |
20140100440 | Cheline et al. | Apr 2014 | A1 |
20140155737 | Manzke et al. | Jun 2014 | A1 |
20140180098 | Flaherty et al. | Jun 2014 | A1 |
20140188133 | Misener | Jul 2014 | A1 |
20140188440 | Donhowe et al. | Jul 2014 | A1 |
20140257104 | Dunbar et al. | Sep 2014 | A1 |
20140276059 | Sheehan | Sep 2014 | A1 |
20140276081 | Tegels | Sep 2014 | A1 |
20140276085 | Miller | Sep 2014 | A1 |
20140276690 | Grace | Sep 2014 | A1 |
20140343431 | Vajinepalli et al. | Nov 2014 | A1 |
20150005738 | Blacker | Jan 2015 | A1 |
20150011887 | Ahn et al. | Jan 2015 | A1 |
20150065916 | Maguire et al. | Mar 2015 | A1 |
20150073279 | Cai et al. | Mar 2015 | A1 |
20150112200 | Oberg et al. | Apr 2015 | A1 |
20150209113 | Burkholz et al. | Jul 2015 | A1 |
20150209526 | Matsubara et al. | Jul 2015 | A1 |
20150294497 | Ng et al. | Oct 2015 | A1 |
20150297097 | Matsubara et al. | Oct 2015 | A1 |
20150327841 | Banjanin et al. | Nov 2015 | A1 |
20150359991 | Dunbar et al. | Dec 2015 | A1 |
20160029995 | Navratil et al. | Feb 2016 | A1 |
20160029998 | Brister et al. | Feb 2016 | A1 |
20160058420 | Cinthio et al. | Mar 2016 | A1 |
20160100970 | Brister et al. | Apr 2016 | A1 |
20160101263 | Blumenkranz et al. | Apr 2016 | A1 |
20160113699 | Sverdlik et al. | Apr 2016 | A1 |
20160120607 | Sorotzkin et al. | May 2016 | A1 |
20160143622 | Xie et al. | May 2016 | A1 |
20160166232 | Merritt | Jun 2016 | A1 |
20160202053 | Walker et al. | Jul 2016 | A1 |
20160213398 | Liu | Jul 2016 | A1 |
20160278743 | Kawashima | Sep 2016 | A1 |
20160278869 | Grunwald | Sep 2016 | A1 |
20160296208 | Sethuraman et al. | Oct 2016 | A1 |
20160374644 | Mauldin, Jr. et al. | Dec 2016 | A1 |
20170079548 | Silverstein et al. | Mar 2017 | A1 |
20170086785 | Bjaerum | Mar 2017 | A1 |
20170100092 | Kruse et al. | Apr 2017 | A1 |
20170164923 | Matsumoto | Jun 2017 | A1 |
20170172424 | Eggers et al. | Jun 2017 | A1 |
20170188839 | Tashiro | Jul 2017 | A1 |
20170196535 | Arai et al. | Jul 2017 | A1 |
20170215842 | Ryu et al. | Aug 2017 | A1 |
20170259013 | Boyden | Sep 2017 | A1 |
20170265840 | Bharat et al. | Sep 2017 | A1 |
20170303894 | Scully | Oct 2017 | A1 |
20170367678 | Sirtori et al. | Dec 2017 | A1 |
20180015256 | Southard et al. | Jan 2018 | A1 |
20180116723 | Hettrick et al. | May 2018 | A1 |
20180125450 | Blackbourne et al. | May 2018 | A1 |
20180161502 | Nanan et al. | Jun 2018 | A1 |
20180199914 | Ramachandran et al. | Jul 2018 | A1 |
20180214119 | Mehrmohammadi et al. | Aug 2018 | A1 |
20180225993 | Buras et al. | Aug 2018 | A1 |
20180228465 | Southard et al. | Aug 2018 | A1 |
20180235576 | Brannan | Aug 2018 | A1 |
20180250078 | Shochat et al. | Sep 2018 | A1 |
20180272108 | Padilla et al. | Sep 2018 | A1 |
20180279996 | Cox et al. | Oct 2018 | A1 |
20180286287 | Razzaque | Oct 2018 | A1 |
20180310955 | Lindekugel et al. | Nov 2018 | A1 |
20180317881 | Astigarraga et al. | Nov 2018 | A1 |
20180366035 | Dunbar et al. | Dec 2018 | A1 |
20190060014 | Hazelton et al. | Feb 2019 | A1 |
20190069923 | Wang | Mar 2019 | A1 |
20190076121 | Southard et al. | Mar 2019 | A1 |
20190088019 | Prevrhal et al. | Mar 2019 | A1 |
20190105017 | Hastings | Apr 2019 | A1 |
20190117190 | Djajadiningrat et al. | Apr 2019 | A1 |
20190223757 | Durfee | Jul 2019 | A1 |
20190239850 | Dalvin et al. | Aug 2019 | A1 |
20190282324 | Freeman et al. | Sep 2019 | A1 |
20190298457 | Bharat | Oct 2019 | A1 |
20190307516 | Schotzko et al. | Oct 2019 | A1 |
20190339525 | Yanof et al. | Nov 2019 | A1 |
20190355278 | Sainsbury et al. | Nov 2019 | A1 |
20190365348 | Toume et al. | Dec 2019 | A1 |
20200041261 | Bernstein et al. | Feb 2020 | A1 |
20200069285 | Annangi et al. | Mar 2020 | A1 |
20200113540 | Gijsbers et al. | Apr 2020 | A1 |
20200129136 | Harding et al. | Apr 2020 | A1 |
20200188028 | Feiner et al. | Jun 2020 | A1 |
20200230391 | Burkholz et al. | Jul 2020 | A1 |
20210007710 | Douglas | Jan 2021 | A1 |
20210045716 | Shiran et al. | Feb 2021 | A1 |
20210166583 | Buras et al. | Jun 2021 | A1 |
20210307838 | Xia et al. | Oct 2021 | A1 |
20210353255 | Schneider et al. | Nov 2021 | A1 |
20210402144 | Messerly | Dec 2021 | A1 |
20220022969 | Misener | Jan 2022 | A1 |
20220031965 | Durfee | Feb 2022 | A1 |
20220039685 | Misener et al. | Feb 2022 | A1 |
20220039777 | Durfee | Feb 2022 | A1 |
20220096797 | Prince | Mar 2022 | A1 |
20220117582 | McLaughlin et al. | Apr 2022 | A1 |
20220160434 | Messerly et al. | May 2022 | A1 |
20220168050 | Sowards et al. | Jun 2022 | A1 |
20220172354 | Misener et al. | Jun 2022 | A1 |
20220211442 | McLaughlin et al. | Jul 2022 | A1 |
20220381630 | Sowards et al. | Dec 2022 | A1 |
20230113291 | de Wild et al. | Apr 2023 | A1 |
20230240643 | Cermak et al. | Aug 2023 | A1 |
20230389893 | Misener et al. | Dec 2023 | A1 |
20240008929 | Misener et al. | Jan 2024 | A1 |
20240050061 | McLaughlin et al. | Feb 2024 | A1 |
20240058074 | Misener | Feb 2024 | A1 |
20240062678 | Sowards et al. | Feb 2024 | A1 |
Number | Date | Country |
---|---|---|
2006201646 | Nov 2006 | AU |
114129137 | Sep 2022 | CN |
0933063 | Aug 1999 | EP |
1504713 | Feb 2005 | EP |
1591074 | May 2008 | EP |
3181083 | Jun 2017 | EP |
3530221 | Aug 2019 | EP |
2000271136 | Oct 2000 | JP |
2014150928 | Aug 2014 | JP |
2018175547 | Nov 2018 | JP |
20180070878 | Jun 2018 | KR |
20190013133 | Feb 2019 | KR |
2013059714 | Apr 2013 | WO |
2014115150 | Jul 2014 | WO |
2014174305 | Oct 2014 | WO |
2015017270 | Feb 2015 | WO |
2017096487 | Jun 2017 | WO |
2017214428 | Dec 2017 | WO |
2018026878 | Feb 2018 | WO |
2018134726 | Jul 2018 | WO |
2018206473 | Nov 2018 | WO |
2019232451 | Dec 2019 | WO |
2020002620 | Jan 2020 | WO |
2020016018 | Jan 2020 | WO |
2019232454 | Feb 2020 | WO |
2020044769 | Mar 2020 | WO |
2020102665 | May 2020 | WO |
2020186198 | Sep 2020 | WO |
2022031762 | Feb 2022 | WO |
2022072727 | Apr 2022 | WO |
2022081904 | Apr 2022 | WO |
2022203713 | Sep 2022 | WO
2022263763 | Dec 2022 | WO |
2023235435 | Dec 2023 | WO |
2024010940 | Jan 2024 | WO |
2024039608 | Feb 2024 | WO |
2024039719 | Feb 2024 | WO |
Entry |
---|
Stolka, P.J., et al. (2014). Needle Guidance Using Handheld Stereo Vision and Projection for Ultrasound-Based Interventions. In: Golland, P., Hata, N., Barillot, C., Hornegger, J., Howe, R. (eds.), Medical Image Computing and Computer-Assisted Intervention—MICCAI 2014. (Year: 2014). |
U.S. Appl. No. 17/380,767, filed Jul. 20, 2021 Non-Final Office Action dated Mar. 6, 2023. |
U.S. Appl. No. 17/393,283, filed Aug. 3, 2021 Non-Final Office Action dated Mar. 31, 2023. |
eZono, eZSimulator, https://www.ezono.com/en/ezsimulator/, last accessed Sep. 13, 2022. |
SonoSim, https://sonosim.com/ultrasound-simulation/, last accessed Sep. 13, 2022. |
U.S. Appl. No. 17/397,486, filed Aug. 9, 2021 Restriction Requirement dated Aug. 12, 2022. |
PCT/US2021/050973 filed Sep. 17, 2021 International Search Report and Written Opinion dated Nov. 7, 2022. |
U.S. Appl. No. 17/380,767, filed Jul. 20, 2021 Restriction Requirement dated Dec. 15, 2022. |
U.S. Appl. No. 17/393,283, filed Aug. 3, 2021 Restriction Requirement dated Jan. 12, 2023. |
U.S. Appl. No. 17/397,486, filed Aug. 9, 2021 Non-Final Office Action dated Jan. 23, 2023. |
U.S. Appl. No. 17/501,909, filed Oct. 14, 2021 Restriction Requirement dated Feb. 1, 2023. |
Ikhsan, Mohammad, et al., "Assistive technology for ultrasound-guided central venous catheter placement," Journal of Medical Ultrasonics, Japan Society of Ultrasonics in Medicine, Tokyo, JP, vol. 45, No. 1, Apr. 19, 2017, pp. 41-57, XP036387340, ISSN: 1346-4523, DOI: 10.1007/s10396-017-0789-2 [retrieved on Apr. 19, 2017]. |
Pagoulatos, N. et al., "New spatial localizer based on fiber optics with applications in 3D ultrasound imaging," Proceedings of SPIE, vol. 3976 (Apr. 18, 2000). |
PCT/US2021/042369 filed Jul. 20, 2021 International Search Report and Written Opinion dated Oct. 25, 2021. |
PCT/US2021/044419 filed Aug. 3, 2021 International Search Report and Written Opinion dated Nov. 19, 2021. |
PCT/US2021/055076 filed Oct. 14, 2021 International Search Report and Written Opinion dated Mar. 25, 2022. |
PCT/US2012/061182 International Search Report and Written Opinion dated Mar. 11, 2013. |
PCT/US2021/049294 filed Sep. 7, 2021 International Search Report and Written Opinion dated Dec. 8, 2021. |
PCT/US2021/049712 filed Sep. 9, 2021 International Search Report and Written Opinion dated Dec. 14, 2021. |
PCT/US2021/052055 filed Sep. 24, 2021 International Search Report and Written Opinion dated Dec. 20, 2021. |
U.S. Appl. No. 13/656,563, filed Oct. 19, 2012 Decision on Appeal dated Nov. 1, 2017. |
U.S. Appl. No. 13/656,563, filed Oct. 19, 2012 Examiner's Answer dated Nov. 16, 2015. |
U.S. Appl. No. 13/656,563, filed Oct. 19, 2012 Final Office Action dated Dec. 5, 2014. |
U.S. Appl. No. 13/656,563, filed Oct. 19, 2012 Non-Final Office Action dated Jul. 18, 2014. |
U.S. Appl. No. 15/650,474, filed Jul. 14, 2017 Final Office Action dated Jun. 2, 2020. |
U.S. Appl. No. 15/650,474, filed Jul. 14, 2017 Non-Final Office Action dated Dec. 16, 2019. |
U.S. Appl. No. 15/650,474, filed Jul. 14, 2017 Notice of Allowance dated Dec. 11, 2020. |
U.S. Appl. No. 15/650,474, filed Jul. 14, 2017 Notice of Allowance dated Mar. 1, 2021. |
U.S. Appl. No. 15/951,903, filed Apr. 12, 2018 Advisory Action dated Dec. 22, 2020. |
U.S. Appl. No. 15/951,903, filed Apr. 12, 2018 Examiner's Answer dated Jun. 3, 2021. |
U.S. Appl. No. 15/951,903, filed Apr. 12, 2018 Final Office Action dated Oct. 13, 2020. |
U.S. Appl. No. 15/951,903, filed Apr. 12, 2018 Non-Final Office Action dated May 22, 2020. |
U.S. Appl. No. 17/020,476, filed Sep. 14, 2020 Non-Final Office Action dated Feb. 9, 2022. |
Lu, Zhenyu, et al., "Recent advances in robot-assisted echography: combining perception, control and cognition," Cognitive Computation and Systems, The Institution of Engineering and Technology, Michael Faraday House, Six Hills Way, Stevenage, Herts., SG1 2AY, UK, vol. 2, No. 3, Sep. 2, 2020. |
PCT/US2021/045218 filed Aug. 9, 2021 International Search Report and Written Opinion dated Nov. 23, 2021. |
PCT/US2021/049123 filed Sep. 3, 2021 International Search Report and Written Opinion dated Feb. 4, 2022. |
PCT/US2021/053018 filed Sep. 30, 2021 International Search Report and Written Opinion dated May 3, 2022. |
PCT/US2021/060622 filed Nov. 23, 2021 International Search Report and Written Opinion dated Mar. 3, 2022. |
PCT/US2021/061267 filed Nov. 30, 2021 International Search Report and Written Opinion dated Mar. 9, 2022. |
PCT/US2021/061276 filed Nov. 30, 2021 International Search Report and Written Opinion dated Mar. 9, 2022. |
Sebastian Vogt, "Real-Time Augmented Reality for Image-Guided Interventions," Oct. 5, 2009, XP055354720, Retrieved from the Internet: URL: https://opus4.kobv.de/opus4-fau/frontdoor/deliver/index/docId/1235/file/SebastianVogtDissertation.pdf. |
U.S. Appl. No. 15/951,903, filed Apr. 12, 2018 Board Decision dated Apr. 20, 2022. |
U.S. Appl. No. 15/951,903, filed Apr. 12, 2018 Notice of Allowance dated May 2, 2022. |
William F. Garrett et al., "Real-time incremental visualization of dynamic ultrasound volumes using parallel BSP trees," Visualization '96 Proceedings, IEEE, Oct. 27, 1996, pp. 235 ff., XP058399771, ISBN: 978-0-89791-864-0, abstract, figures 1-7, pp. 236-240. |
State, A., et al. (Aug. 1996). Technologies for augmented reality systems: Realizing ultrasound-guided needle biopsies. In Proceedings of the 23rd annual conference on computer graphics and interactive techniques (pp. 439-446) (Year: 1996). |
U.S. Appl. No. 17/397,486, filed Aug. 9, 2021 Final Office Action dated Aug. 4, 2023. |
U.S. Appl. No. 17/501,909, filed Oct. 14, 2021 Non-Final Office Action dated Jun. 6, 2023. |
U.S. Appl. No. 17/832,389, filed Jun. 3, 2022 Restriction Requirement dated Jul. 13, 2023. |
PCT/US2023/024067 filed May 31, 2023 International Search Report and Written Opinion dated Sep. 15, 2023. |
PCT/US2023/027147 filed Jul. 7, 2023 International Search Report and Written Opinion dated Oct. 2, 2023. |
PCT/US2023/030160 filed Aug. 14, 2023 International Search Report and Written Opinion dated Oct. 23, 2023. |
Practical guide for safe central venous catheterization and management 2017, Journal of Anesthesia, vol. 34, published online Nov. 30, 2019, pp. 167-186. |
U.S. Appl. No. 17/380,767, filed Jul. 20, 2021 Notice of Allowance dated Aug. 31, 2023. |
U.S. Appl. No. 17/393,283, filed Aug. 3, 2021 Final Office Action dated Oct. 16, 2023. |
U.S. Appl. No. 17/397,486, filed Aug. 9, 2021 Advisory Action dated Oct. 5, 2023. |
U.S. Appl. No. 17/832,389, filed Jun. 3, 2022 Non-Final Office Action dated Oct. 6, 2023. |
U.S. Appl. No. 17/861,031, filed Jul. 8, 2022 Non-Final Office Action dated Sep. 14, 2023. |
U.S. Appl. No. 17/393,283, filed Aug. 3, 2021 Advisory Action dated Jan. 19, 2024. |
U.S. Appl. No. 17/478,754, filed Sep. 17, 2021 Restriction Requirement dated Jan. 22, 2024. |
U.S. Appl. No. 17/501,909, filed Oct. 14, 2021 Advisory Action dated Jan. 24, 2024. |
U.S. Appl. No. 17/501,909, filed Oct. 14, 2021 Final Office Action dated Nov. 21, 2023. |
U.S. Appl. No. 17/832,389, filed Jun. 3, 2022 Final Office Action dated Jan. 25, 2024. |
PCT/US2023/030347 filed Aug. 16, 2023 International Search Report and Written Opinion dated Nov. 6, 2023. |
U.S. Appl. No. 17/393,283, filed Aug. 3, 2021 Non-Final Office Action dated Feb. 29, 2024. |
U.S. Appl. No. 17/397,486, filed Aug. 9, 2021 Non-Final Office Action dated Mar. 1, 2024. |
U.S. Appl. No. 17/501,909, filed Oct. 14, 2021 Non-Final Office Action dated Mar. 21, 2024. |
U.S. Appl. No. 17/832,389, filed Jun. 3, 2022 Advisory Action dated Apr. 5, 2024. |
U.S. Appl. No. 17/832,389, filed Jun. 3, 2022 Notice of Allowance dated May 15, 2024. |
U.S. Appl. No. 17/861,031, filed Jul. 8, 2022 Final Office Action dated Mar. 15, 2024. |
U.S. Appl. No. 17/397,486, filed Aug. 9, 2021 Notice of Allowance dated Jul. 10, 2024. |
U.S. Appl. No. 17/478,754, filed Sep. 17, 2021 Non-Final Office Action dated Jul. 1, 2024. |
U.S. Appl. No. 17/501,909, filed Oct. 14, 2021 Final Office Action dated Aug. 5, 2024. |
U.S. Appl. No. 17/861,031, filed Jul. 8, 2022 Advisory Action dated Jun. 7, 2024. |
U.S. Appl. No. 17/861,031, filed Jul. 8, 2022 Notice of Allowance dated Jul. 3, 2024. |
U.S. Appl. No. 18/385,101, filed Oct. 30, 2023 Notice of Allowance dated Aug. 20, 2024. |
Publication Number | Publication Date | Country
---|---|---|
20220104886 A1 | Apr 2022 | US
Provisional Application Number | Filing Date | Country
---|---|---|
63086971 | Oct 2020 | US