Ultrasound imaging is a widely accepted tool for guiding interventional instruments such as needles to targets such as blood vessels or organs in the human body. In order to successfully guide, for example, a needle to a blood vessel using ultrasound imaging, the needle is monitored in real-time both immediately before and after a percutaneous puncture so that a clinician can determine the distance and the orientation of the needle to the blood vessel and ensure successful access thereto. However, through inadvertent movement of an ultrasound probe during the ultrasound imaging, the clinician can lose both the blood vessel and the needle, which can be difficult and time-consuming to find again. In addition, it is often easier to monitor the distance and orientation of the needle immediately before the percutaneous puncture with a needle plane including the needle perpendicular to an image plane of the ultrasound probe, whereas it is often easier to monitor the distance and orientation of the needle immediately after the percutaneous puncture with the needle plane parallel to the image plane. As with inadvertently moving the ultrasound probe, the clinician can lose both the blood vessel and the needle when adjusting the image plane before and after the percutaneous puncture, and the blood vessel and the needle can again be difficult and time-consuming to find. What is needed are ultrasound-imaging systems and methods thereof that can dynamically adjust the image plane to facilitate guiding interventional instruments to targets in at least the human body.
Disclosed herein are dynamically adjusting ultrasound-imaging systems and methods thereof.
Disclosed herein is an ultrasound-imaging system including, in some embodiments, an ultrasound probe, a console, and a display screen. The ultrasound probe includes an array of ultrasonic transducers. Activated ultrasonic transducers of the array of ultrasonic transducers are configured to emit generated ultrasound signals into a patient, receive reflected ultrasound signals from the patient, and convert the reflected ultrasound signals into corresponding electrical signals of the ultrasound signals for processing into ultrasound images. The console is configured to communicate with the ultrasound probe. The console includes memory with executable instructions and a processor configured to execute the instructions. The instructions are for dynamically adjusting a distance of the activated ultrasonic transducers from a predefined target or area, an orientation of the activated ultrasonic transducers to the predefined target or area, or both the distance and the orientation of the activated ultrasonic transducers with respect to the predefined target or area. The instructions are also for processing the corresponding electrical signals of the ultrasound signals into the ultrasound images. The display screen is configured to communicate with the console. The display screen is configured to display a graphical user interface (“GUI”) including the ultrasound images.
In some embodiments, the ultrasound probe further includes an array of magnetic sensors. The magnetic sensors are configured to convert magnetic signals from a magnetized medical device into corresponding electrical signals of the magnetic signals. The electrical signals are processed by the console into distance and orientation information with respect to the predefined target or area for display of an iconographic representation of the medical device on the display screen.
In some embodiments, the distance and orientation of the activated ultrasonic transducers is adjusted with respect to the predefined target or area when the medical device is brought into proximity of the ultrasound probe. An image plane is established by the activated ultrasonic transducers being perpendicular or parallel to a medical-device plane including the medical device for accessing the predefined target or area with the medical device.
In some embodiments, the distance and orientation of the activated ultrasonic transducers is adjusted with respect to a blood vessel as the predefined target. An image plane is established by the activated ultrasonic transducers being perpendicular or parallel to the blood vessel in accordance with an orientation of the blood vessel.
In some embodiments, the ultrasound-imaging system further includes a stand-alone optical interrogator communicatively coupled to the console or an integrated optical interrogator integrated into the console, as well as an optical-fiber stylet. The optical interrogator is configured to emit input optical signals, receive reflected optical signals, and convert the reflected optical signals into corresponding electrical signals of the optical signals for processing by the console into distance and orientation information with respect to the predefined target or area for display of an iconographic representation of a medical device on the display screen. The optical-fiber stylet is configured to be disposed in a lumen of the medical device. The optical-fiber stylet is configured to convey the input optical signals from the optical interrogator to a number of fiber Bragg grating (“FBG”) sensors along a length of the optical-fiber stylet. The optical-fiber stylet is also configured to convey the reflected optical signals from the number of FBG sensors back to the optical interrogator.
In some embodiments, the distance and orientation of the activated ultrasonic transducers is adjusted with respect to the predefined target or area when the medical device is brought into proximity of the ultrasound probe. An image plane is established by the activated ultrasonic transducers being perpendicular or parallel to a medical-device plane including the medical device for accessing the predefined target or area with the medical device.
In some embodiments, the distance and orientation of the activated ultrasonic transducers is adjusted with respect to a blood vessel as the predefined target. An image plane is established by the activated ultrasonic transducers being perpendicular or parallel to the blood vessel in accordance with an orientation of the blood vessel.
In some embodiments, the image plane includes a blood vessel as the predefined target or area and the medical device includes a needle, the image plane being perpendicular to the medical-device plane upon approach of the needle and parallel to the medical-device plane upon a percutaneous puncture with the needle.
In some embodiments, the array of ultrasonic transducers is a two-dimensional (“2-D”) array of ultrasonic transducers. The activated ultrasonic transducers are an approximately linear subset of ultrasonic transducers of the 2-D array of ultrasonic transducers activated by the console at any given time.
In some embodiments, the array of ultrasonic transducers is a movable linear array of ultrasonic transducers. The activated ultrasonic transducers are a subset of the ultrasonic transducers up to all the ultrasonic transducers in the linear array of ultrasonic transducers activated by the console at any given time.
In some embodiments, the ultrasound probe further includes an accelerometer, a gyroscope, a magnetometer, or a combination thereof configured to provide positional-tracking data to the console. The processor is further configured to execute the instructions for processing the positional-tracking data for the adjusting of the distance of the activated ultrasonic transducers from the predefined target or area, the orientation of the activated ultrasonic transducers to the predefined target or area, or both the distance and the orientation of the activated ultrasonic transducers with respect to the predefined target or area.
In some embodiments, the distance and the orientation of the activated ultrasonic transducers is maintained with respect to the predefined target or area when the ultrasound probe is inadvertently moved with respect to the predefined target or area.
Also disclosed herein is a method of an ultrasound-imaging system including a non-transitory computer-readable medium (“CRM”) having executable instructions that cause the ultrasound-imaging system to perform a set of operations for ultrasound imaging when the instructions are executed by a processor of a console of the ultrasound-imaging system. The method includes, in some embodiments, an activating operation, an adjusting operation, a first processing operation, and a first displaying operation. The activating operation includes activating ultrasonic transducers of an array of ultrasonic transducers of an ultrasound probe communicatively coupled to the console. With the activating operation, the ultrasonic transducers emit generated ultrasound signals into a patient, receive reflected ultrasound signals from the patient, and convert the reflected ultrasound signals into corresponding electrical signals of the ultrasound signals for processing into ultrasound images. The adjusting operation includes dynamically adjusting a distance of activated ultrasonic transducers from a predefined target or area, an orientation of the activated ultrasonic transducers to the predefined target or area, or both the distance and the orientation of the activated ultrasonic transducers with respect to the predefined target or area. The first processing operation includes processing the corresponding electrical signals of the ultrasound signals into the ultrasound images. The first displaying operation includes displaying on a display screen communicatively coupled to the console a GUI including the ultrasound images.
In some embodiments, the method further includes a converting operation, a second processing operation, and a second displaying operation. The converting operation includes converting magnetic signals from a magnetized medical device with an array of magnetic sensors of the ultrasound probe into corresponding electrical signals of the magnetic signals. The second processing operation includes processing the corresponding electrical signals of the magnetic signals with the processor into distance and orientation information with respect to the predefined target or area. The second displaying operation includes displaying an iconographic representation of the medical device on the display screen.
In some embodiments, the method further includes an adjusting operation in response to the magnetic signals. The adjusting operation includes adjusting the distance and orientation of the activated ultrasonic transducers with respect to the predefined target or area when the medical device is brought into proximity of the ultrasound probe. The adjusting operation establishes an image plane by the activated ultrasonic transducers perpendicular or parallel to a medical-device plane including the medical device for accessing the predefined target or area with the medical device.
In some embodiments, the method further includes an adjusting operation in response to an orientation of a blood vessel as the predefined target. The adjusting operation includes adjusting the distance and orientation of the activated ultrasonic transducers with respect to the orientation of the blood vessel. The adjusting operation establishes an image plane by the activated ultrasonic transducers perpendicular or parallel to the blood vessel.
In some embodiments, the method further includes optical signal-related operations, as well as a third processing operation and a third displaying operation. The optical signal-related operations include emitting input optical signals, receiving reflected optical signals, and converting the reflected optical signals into corresponding electrical signals of the optical signals by a stand-alone optical interrogator communicatively coupled to the console or an integrated optical interrogator integrated into the console. The optical signal-related operations also include conveying the input optical signals from the optical interrogator to a number of FBG sensors along a length of an optical-fiber stylet, as well as conveying the reflected optical signals from the number of FBG sensors back to the optical interrogator with the optical-fiber stylet disposed in a lumen of the medical device. The third processing operation includes processing the corresponding electrical signals of the optical signals with the processor into distance and orientation information with respect to the predefined target or area. The third displaying operation includes displaying an iconographic representation of a medical device on the display screen.
In some embodiments, the method further includes an adjusting operation in response to the optical signals. The adjusting operation includes adjusting the distance and orientation of the activated ultrasonic transducers with respect to the predefined target or area when the medical device is brought into proximity of the ultrasound probe. The adjusting operation establishes an image plane by the activated ultrasonic transducers perpendicular or parallel to a medical-device plane including the medical device for accessing the predefined target or area with the medical device.
In some embodiments, the method further includes an adjusting operation in response to an orientation of a blood vessel as the predefined target. The adjusting operation includes adjusting the distance and orientation of the activated ultrasonic transducers with respect to the orientation of the blood vessel. The adjusting operation establishes an image plane by the activated ultrasonic transducers perpendicular or parallel to the blood vessel.
In some embodiments, the establishing of the image plane is perpendicular to the medical-device plane upon approach of the medical device and parallel to the medical-device plane upon insertion of the medical device. The image plane includes a blood vessel as the predefined target or area and the medical-device plane includes a needle as the medical device.
In some embodiments, the activating operation includes activating an approximately linear subset of ultrasonic transducers of a 2-D array of ultrasonic transducers.
In some embodiments, the activating operation includes activating a subset of the ultrasonic transducers up to all the ultrasonic transducers in a movable linear array of ultrasonic transducers.
In some embodiments, the method further includes a data-providing operation and a fourth processing operation. The data-providing operation includes providing positional-tracking data to the console from an accelerometer, a gyroscope, a magnetometer, or a combination thereof of the ultrasound probe. The fourth processing operation includes processing the positional-tracking data with the processor for the adjusting operation.
In some embodiments, the method further includes a maintaining operation. The maintaining operation includes maintaining the distance and the orientation of the activated ultrasonic transducers with respect to the predefined target or area when the ultrasound probe is inadvertently moved with respect to the predefined target or area.
These and other features of the concepts provided herein will become more apparent to those of skill in the art in view of the accompanying drawings and following description, which describe particular embodiments of such concepts in greater detail.
Before some particular embodiments are disclosed in greater detail, it should be understood that the particular embodiments disclosed herein do not limit the scope of the concepts provided herein. It should also be understood that a particular embodiment disclosed herein can have features that can be readily separated from the particular embodiment and optionally combined with or substituted for features of any of a number of other embodiments disclosed herein.
Regarding terms used herein, it should also be understood that the terms are for the purpose of describing some particular embodiments, and the terms do not limit the scope of the concepts provided herein. Ordinal numbers (e.g., first, second, third, etc.) are generally used to distinguish or identify different features or steps in a group of features or steps, and do not supply a serial or numerical limitation. For example, “first,” “second,” and “third” features or steps need not necessarily appear in that order, and the particular embodiments including such features or steps need not necessarily be limited to the three features or steps. Labels such as “left,” “right,” “top,” “bottom,” “front,” “back,” and the like are used for convenience and are not intended to imply, for example, any particular fixed location, orientation, or direction. Instead, such labels are used to reflect, for example, relative location, orientation, or direction. Singular forms of “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
With respect to “proximal,” a “proximal portion” or a “proximal-end portion” of, for example, a catheter disclosed herein includes a portion of the catheter intended to be near a clinician when the catheter is used on a patient. Likewise, a “proximal length” of, for example, the catheter includes a length of the catheter intended to be near the clinician when the catheter is used on the patient. A “proximal end” of, for example, the catheter includes an end of the catheter intended to be near the clinician when the catheter is used on the patient. The proximal portion, the proximal-end portion, or the proximal length of the catheter can include the proximal end of the catheter; however, the proximal portion, the proximal-end portion, or the proximal length of the catheter need not include the proximal end of the catheter. That is, unless context suggests otherwise, the proximal portion, the proximal-end portion, or the proximal length of the catheter is not a terminal portion or terminal length of the catheter.
With respect to “distal,” a “distal portion” or a “distal-end portion” of, for example, a catheter disclosed herein includes a portion of the catheter intended to be near or in a patient when the catheter is used on the patient. Likewise, a “distal length” of, for example, the catheter includes a length of the catheter intended to be near or in the patient when the catheter is used on the patient. A “distal end” of, for example, the catheter includes an end of the catheter intended to be near or in the patient when the catheter is used on the patient. The distal portion, the distal-end portion, or the distal length of the catheter can include the distal end of the catheter; however, the distal portion, the distal-end portion, or the distal length of the catheter need not include the distal end of the catheter. That is, unless context suggests otherwise, the distal portion, the distal-end portion, or the distal length of the catheter is not a terminal portion or terminal length of the catheter.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by those of ordinary skill in the art.
As set forth above, ultrasound-imaging systems and methods thereof are needed that can dynamically adjust the image plane to facilitate guiding interventional instruments to targets in at least the human body. Disclosed herein are dynamically adjusting ultrasound-imaging systems and methods thereof.
Ultrasound-Imaging Systems
As shown, the ultrasound-imaging system 100 includes a console 102, the display screen 104, and the ultrasound probe 106. The ultrasound-imaging system 100 is useful for imaging a target such as a blood vessel or an organ within a body of the patient P prior to a percutaneous puncture with the needle 112 for inserting the needle 112 or another medical device into the target and accessing the target. Indeed, the ultrasound-imaging system 100 is shown in
The console 102 houses a variety of components of the ultrasound-imaging system 100, and it is appreciated that the console 102 can take any of a variety of forms. A processor 116 and memory 118 such as random-access memory (“RAM”) or non-volatile memory (e.g., electrically erasable programmable read-only memory [“EEPROM”]) are included in the console 102 for controlling functions of the ultrasound-imaging system 100, as well as executing various logic operations or algorithms during operation of the ultrasound-imaging system 100 in accordance with executable instructions 120 therefor stored in the memory 118 for execution by the processor 116. For example, the console 102 is configured to instantiate by way of the instructions 120 one or more processes for dynamically adjusting a distance of activated ultrasonic transducers 149 from a predefined target (e.g., blood vessel) or area, an orientation of the activated ultrasonic transducers 149 to the predefined target or area, or both the distance and the orientation of the activated ultrasonic transducers 149 with respect to the predefined target or area, as well as to process electrical signals from the ultrasound probe 106 into ultrasound images. Dynamically adjusting the activated ultrasonic transducers 149 uses ultrasound-imaging data, magnetic-field data, shape-sensing data, or a combination thereof received by the console 102 for activating certain ultrasonic transducers of a 2-D array of the ultrasonic transducers 148 or moving those already activated in a linear array of the ultrasonic transducers 148. A digital controller/analog interface 122 is also included with the console 102 and is in communication with both the processor 116 and other system components to govern interfacing between the ultrasound probe 106 and other system components set forth herein.
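For illustration only, and not as a description of the instructions 120 themselves, the following sketch shows one way such a process could choose an approximately linear subset of a 2-D transducer array so that the resulting image plane contains a predefined target; the array dimensions, element pitch, and distance-to-line selection rule are assumptions made solely for this example.

```python
# Minimal sketch (illustrative assumptions only): select an approximately linear
# subset of a hypothetical 2-D transducer array whose image plane passes through
# a target point on the transducer face.
import numpy as np

def element_positions(rows=32, cols=32, pitch_mm=0.3):
    """Return (rows*cols, 2) element centers of a hypothetical 2-D array, in mm."""
    ys, xs = np.mgrid[0:rows, 0:cols]
    return np.stack([(xs - (cols - 1) / 2) * pitch_mm,
                     (ys - (rows - 1) / 2) * pitch_mm], axis=-1).reshape(-1, 2)

def select_linear_aperture(elements_mm, target_xy_mm, direction_deg, half_width_mm=0.2):
    """Activate elements within half_width_mm of the line through target_xy_mm
    oriented at direction_deg, i.e., an approximately linear aperture whose
    image plane contains the target."""
    d = np.array([np.cos(np.radians(direction_deg)), np.sin(np.radians(direction_deg))])
    n = np.array([-d[1], d[0]])                      # in-plane normal of the line
    dist = np.abs((elements_mm - target_xy_mm) @ n)  # perpendicular distance to the line
    return dist <= half_width_mm                     # boolean mask of activated elements

elements = element_positions()
active = select_linear_aperture(elements, target_xy_mm=np.array([1.2, -0.6]), direction_deg=30.0)
print(f"activated {active.sum()} of {len(active)} elements")
```

In this sketch, re-aiming the image plane amounts to recomputing the boolean mask with a new target point or line direction.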
The ultrasound-imaging system 100 further includes ports 124 for connection with additional components such as optional components 126 including a printer, storage media, keyboard, etc. The ports 124 can be universal serial bus (“USB”) ports, though other types of ports can be used for this connection or any other connections shown or described herein. A power connection 128 is included with the console 102 to enable operable connection to an external power supply 130. An internal power supply 132 (e.g., a battery) can also be employed either with or exclusive of the external power supply 130. Power management circuitry 134 is included with the digital controller/analog interface 122 of the console 102 to regulate power use and distribution.
Optionally, a stand-alone optical interrogator 154 can be communicatively coupled to the console 102 by way of one of the ports 124. Alternatively, the console 102 can include an integrated optical interrogator integrated into the console 102. Such an optical interrogator is configured to emit input optical signals into a companion optical-fiber stylet 156 for shape sensing with the ultrasound-imaging system 100, which optical-fiber stylet 156, in turn, is configured to be inserted into a lumen of a medical device such as the needle 112 and convey the input optical signals from the optical interrogator 154 to a number of FBG sensors along a length of the optical-fiber stylet 156. The optical interrogator 154 is also configured to receive reflected optical signals conveyed by the optical-fiber stylet 156 reflected from the number of FBG sensors, the reflected optical signals indicative of a shape of the optical-fiber stylet 156. The optical interrogator 154 is also configured to convert the reflected optical signals into corresponding electrical signals for processing by the console 102 into distance and orientation information with respect to the target for dynamically adjusting a distance of the activated ultrasonic transducers 149, an orientation of the activated ultrasonic transducers 149, or both the distance and the orientation of the activated ultrasonic transducers 149 with respect to the target or the medical device when it is brought into proximity of the target. For example, the distance and orientation of the activated ultrasonic transducers 149 can be adjusted with respect to a blood vessel as the target. Indeed, an image plane can be established by the activated ultrasonic transducers 149 being perpendicular or parallel to the blood vessel in accordance with an orientation of the blood vessel. In another example, when a medical device such as the needle 112 is brought into proximity of the ultrasound probe 106, an image plane can be established by the activated ultrasonic transducers 149 being perpendicular to a medical-device plane including the medical device as shown in
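For illustration only, and not as a description of the processing performed by the optical interrogator 154 or the console 102, the following simplified, planar sketch shows how Bragg wavelength shifts from FBG sensors could be converted into strain, paired strains into curvature, and curvature integrated along a stylet to estimate a tip position and angle; the gauge factor, core offset, and sensor spacing are assumptions, and a practical system would reconstruct curvature in two planes at many stations.

```python
# Minimal planar sketch (illustrative assumptions only): FBG wavelength shifts ->
# strain -> curvature -> integrated stylet shape and tip angle.
import numpy as np

def strain_from_shift(d_lambda_nm, lambda0_nm=1550.0, k=0.78):
    """FBG relation d_lambda / lambda0 = k * strain (k ~0.78 for silica fiber)."""
    return d_lambda_nm / (lambda0_nm * k)

def curvature_from_strain_pair(eps_a, eps_b, core_offset_mm=0.05):
    """Two cores on opposite sides of the neutral axis: kappa = (eps_a - eps_b) / (2*r)."""
    return (eps_a - eps_b) / (2.0 * core_offset_mm)   # 1/mm

def integrate_planar_shape(kappas_per_mm, segment_mm):
    """Integrate per-segment curvature into 2-D points along the stylet (planar case)."""
    theta, p, pts = 0.0, np.zeros(2), [np.zeros(2)]
    for kappa in kappas_per_mm:
        theta += kappa * segment_mm                   # bend accumulated over the segment
        p = p + segment_mm * np.array([np.cos(theta), np.sin(theta)])
        pts.append(p.copy())
    return np.array(pts), theta                       # sampled shape and tip angle (rad)

# Example: three FBG stations, 20 mm apart, with paired wavelength shifts in nm.
shifts_a = np.array([0.010, 0.015, 0.012])
shifts_b = np.array([-0.008, -0.013, -0.010])
kappas = curvature_from_strain_pair(strain_from_shift(shifts_a), strain_from_shift(shifts_b))
shape, tip_angle = integrate_planar_shape(kappas, segment_mm=20.0)
print("estimated tip position (mm):", shape[-1], "tip angle (deg):", np.degrees(tip_angle))
```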
The display screen 104 is integrated into the console 102 to provide a GUI and display information for a clinician, such as one or more ultrasound images of the target or the patient P attained by the ultrasound probe 106. In addition, the ultrasound-imaging system 100 enables the distance and orientation of a magnetized medical device such as the needle 112 to be superimposed in real-time atop an ultrasound image of the target, thus enabling a clinician to accurately guide the magnetized medical device to the intended target. Notwithstanding the foregoing, the display screen 104 can alternatively be separate from the console 102 and communicatively coupled thereto. A console button interface 136 and control buttons 110 (see
The ultrasound probe 106 is employed in connection with ultrasound-based visualization of a target such as a blood vessel (see
The ultrasound probe 106 includes a probe head 114 that houses a mounted and movable (e.g., translatable or rotatable along a central axis) linear array of the ultrasonic transducers 148 or a 2-D array of the ultrasonic transducers 148, wherein the ultrasonic transducers 148 are piezoelectric transducers or capacitive micromachined ultrasonic transducers (“CMUTs”). When the ultrasound probe 106 is configured with the 2-D array of the ultrasonic transducers 148, a subset of the ultrasonic transducers 148 is linearly activated as needed for ultrasound imaging in accordance with ultrasound-imaging data, magnetic-field data, shape-sensing data, or a combination thereof to maintain the target in an image plane or switch to a different image plane (e.g., from perpendicular to a medical-device plane to parallel to the medical-device plane) including the target. (See, for example, the activated ultrasonic transducers 149 of
The probe head 114 is configured for placement against skin of the patient P proximate a prospective needle-insertion site where the activated ultrasonic transducers 149 in the probe head 114 can generate and emit the generated ultrasound signals into the patient P in a number of pulses, receive reflected ultrasound signals or ultrasound echoes from the patient P by way of reflection of the generated ultrasonic pulses by the body of the patient P, and convert the reflected ultrasound signals into corresponding electrical signals for processing into ultrasound images by the console 102 to which the ultrasound probe 106 is communicatively coupled. In this way, a clinician can employ the ultrasound-imaging system 100 to determine a suitable insertion site and establish vascular access with the needle 112 or another medical device.
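For illustration only, the following sketch shows one conventional step in turning a received echo line into displayable image samples, namely envelope detection followed by log compression; the pulse parameters, sampling rate, and dynamic range are assumptions, and the beamforming, filtering, and scan conversion performed by a real console are omitted.

```python
# Minimal sketch (illustrative assumptions only): envelope-detect and log-compress
# one synthetic RF echo line into brightness values in [0, 1].
import numpy as np
from scipy.signal import hilbert

def rf_to_bmode_line(rf, dynamic_range_db=60.0):
    """Envelope-detect an RF echo line and log-compress it to [0, 1]."""
    env = np.abs(hilbert(rf))                        # analytic-signal envelope
    env_db = 20.0 * np.log10(env / env.max() + 1e-12)
    return np.clip((env_db + dynamic_range_db) / dynamic_range_db, 0.0, 1.0)

# Synthetic echo line: a 5 MHz pulse reflected from two interfaces, sampled at 40 MHz.
fs, f0 = 40e6, 5e6
t = np.arange(0, 20e-6, 1 / fs)
rf = np.zeros_like(t)
for delay, amp in [(5e-6, 1.0), (12e-6, 0.4)]:       # two reflectors at different depths
    rf += amp * np.sin(2 * np.pi * f0 * (t - delay)) * np.exp(-((t - delay) / 0.3e-6) ** 2)

line = rf_to_bmode_line(rf)
print("brightest sample index:", int(line.argmax()))
```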
The ultrasound probe 106 further includes the control buttons 110 for controlling certain aspects of the ultrasound-imaging system 100 during an ultrasound-based medical procedure, thus eliminating the need for the clinician to reach out of a sterile field around the patient P to control the ultrasound-imaging system 100. For example, a control button of the control buttons 110 can be configured to select or lock onto the target (e.g., a blood vessel, an organ, etc.) when pressed for visualization of the target in preparation for inserting the needle 112 or another medical device into the target. Such a control button can also be configured to deselect the target, which is useful whether the target was selected by the control button or another means such as by holding the ultrasound probe 106 stationary over the target to select the target, issuing a voice command to select the target, or the like.
Also as seen in
Though configured here as magnetic sensors, it is appreciated that the magnetic sensors 150 can be sensors of other types and configurations. Also, though they are described herein as included with the ultrasound probe 106, the magnetic sensors 150 of the magnetic-sensor array 146 can be included in a component separate from the ultrasound probe 106 such as a sleeve into which the ultrasound probe 106 is inserted or even a separate handheld device. The magnetic sensors 150 can be disposed in an annular configuration about the probe head 114 of the ultrasound probe 106, though it is appreciated that the magnetic sensors 150 can be arranged in other configurations, such as in an arched, planar, or semi-circular arrangement.
Each magnetic sensor of the magnetic sensors 150 includes three orthogonal sensor coils for enabling detection of a magnetic field in three spatial dimensions. Such 3-dimensional (“3-D”) magnetic sensors can be purchased, for example, from Honeywell Sensing and Control of Morristown, NJ. Further, the magnetic sensors 150 are configured as Hall-effect sensors, though other types of magnetic sensors could be employed. Further, instead of 3-D sensors, a plurality of 1-dimensional (“1-D”) magnetic sensors can be included and arranged as desired to achieve 1-, 2-, or 3-D detection capability.
Five magnetic sensors for the magnetic sensors 150 are included in the magnetic-sensor array 146 so as to enable detection of a magnetized medical device such as the needle 112 in three spatial dimensions (e.g., X, Y, Z coordinate space), as well as the pitch and yaw orientation of the magnetized medical device itself. Detection of the magnetized medical device in accordance with the foregoing when the magnetized medical device is brought into proximity of the ultrasound probe 106 allows for dynamically adjusting a distance of the activated ultrasonic transducers 149, an orientation of the activated ultrasonic transducers 149, or both the distance and the orientation of the activated ultrasonic transducers 149 with respect to the target or the magnetized medical device. For example, the distance and orientation of the activated ultrasonic transducers 149 can be adjusted with respect to a blood vessel as the target. Indeed, an image plane can be established by the activated ultrasonic transducers 149 being perpendicular or parallel to the blood vessel in accordance with an orientation of the blood vessel. In another example, as shown among
As shown in
It is appreciated that a medical device (e.g., the needle 112) made of a magnetizable material can be magnetized by a magnetizer, if not already magnetized, and tracked by the ultrasound-imaging system 100 when the magnetized medical device is brought into proximity of the magnetic sensors 150 of the magnetic-sensor array 146 or inserted into the body of the patient P during an ultrasound-based medical procedure. Such magnetic-based tracking of the magnetized medical device assists the clinician in placing a distal tip thereof in a desired location, such as in a lumen of a blood vessel, by superimposing a simulated needle image representing the real-time distance and orientation of the needle 112 over an ultrasound image of the body of the patient P being accessed by the magnetized medical device. Such a medical device can be stainless steel such as SS 304 stainless steel; however, other suitable needle materials that are capable of being magnetized can be employed. So configured, the needle 112 or the like can produce a magnetic field or create a magnetic disturbance in a magnetic field detectable as magnetic signals by the magnetic-sensor array 146 of the ultrasound probe 106 so as to enable the distance and orientation of the magnetized medical device to be tracked by the ultrasound-imaging system 100 for dynamically adjusting the distance of the activated ultrasonic transducers 149, an orientation of the activated ultrasonic transducers 149, or both the distance and the orientation of the activated ultrasonic transducers 149 with respect to the magnetized medical device.
During operation of the ultrasound-imaging system 100, the probe head 114 of the ultrasound probe 106 is placed against skin of the patient P. An ultrasound beam 152 is produced so as to ultrasonically image a portion of a target such as a blood vessel beneath a surface of the skin of the patient P. (See
The ultrasound-imaging system 100 is configured to detect the distance and orientation of a medical device by way of the magnetic sensors 150 or the shape-sensing optical-fiber stylet 156. By way of example, the magnetic-sensor array 146 of the ultrasound probe 106 is configured to detect a magnetic field of the magnetized medical device or a disturbance in a magnetic field due to the magnetized medical device. Each magnetic sensor of the magnetic sensors 150 in the magnetic-sensor array 146 is configured to spatially detect the needle 112 in 3-dimensional space. (See
The distance or orientation of any point along an entire length of the magnetized medical device in a coordinate space with respect to the magnetic-sensor array 146 can be determined by the ultrasound-imaging system 100 using the magnetic-field strength data sensed by the magnetic sensors 150. Moreover, a pitch and yaw of the needle 112 can also be determined. Suitable circuitry of the ultrasound probe 106, the console 102, or other components of the ultrasound-imaging system 100 can provide the calculations necessary for such distance or orientation. In some embodiments, the needle 112 can be tracked using the teachings of one or more patents of U.S. Pat. Nos. 5,775,322; 5,879,297; 6,129,668; 6,216,028; and 6,263,230, each of which is incorporated by reference in its entirety into this application.
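For illustration only, and not as a restatement of the methods taught in the patents incorporated by reference above, the following sketch fits a simple magnetic point-dipole model to readings from five 3-axis sensors to estimate a magnetized needle's position and axis; the sensor layout, units, field model, and initial guess are assumptions made solely for this example.

```python
# Minimal sketch (illustrative assumptions only): least-squares fit of a point-dipole
# model to five 3-axis magnetic-sensor readings to estimate needle position and axis.
import numpy as np
from scipy.optimize import least_squares

MU0_4PI = 1e-7  # T*m/A

def dipole_field(sensor_pos, dipole_pos, moment):
    """Field of a point dipole at each sensor position (SI units)."""
    r = sensor_pos - dipole_pos
    d = np.linalg.norm(r, axis=1, keepdims=True)
    r_hat = r / d
    return MU0_4PI * (3.0 * r_hat * (r_hat @ moment)[:, None] - moment) / d**3

def fit_needle(sensor_pos, measured_b, x0):
    """Least-squares fit of dipole position (3 values) and moment vector (3 values)."""
    def residual(x):
        return (dipole_field(sensor_pos, x[:3], x[3:]) - measured_b).ravel()
    return least_squares(residual, x0).x

# Five sensors in an assumed annular layout around a hypothetical probe head (meters).
sensors = np.array([[0.02, 0.0, 0.0], [0.0, 0.02, 0.0], [-0.02, 0.0, 0.0],
                    [0.0, -0.02, 0.0], [0.0, 0.0, 0.01]])
true_pos, true_moment = np.array([0.01, 0.03, -0.02]), np.array([0.0, 0.05, 0.05])
b = dipole_field(sensors, true_pos, true_moment)     # synthetic, noiseless readings

x = fit_needle(sensors, b, x0=np.r_[0.008, 0.025, -0.018, 0.0, 0.04, 0.045])
print("estimated position (m):", x[:3])
print("estimated needle axis:", x[3:] / np.linalg.norm(x[3:]))
```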
The distance and orientation information determined by the ultrasound-imaging system 100, together with an entire length of the magnetized medical device, as known by or input into the ultrasound-imaging system 100, enables the ultrasound-imaging system 100 to accurately determine the distance and orientation of the entire length of the magnetized medical device, including a distal tip thereof, with respect to the magnetic-sensor array 146. This, in turn, enables the ultrasound-imaging system 100 to superimpose an image of the needle 112 on an ultrasound image produced by the ultrasound beam 152 of the ultrasound probe 106 on the display screen 104, as well as dynamically adjusting the activated ultrasonic transducers 149. For example, the ultrasound image depicted on the display screen 104 can include depiction of the surface of the skin of the patient P and a subcutaneous blood vessel thereunder to be accessed by the needle 112, as well as a depiction of the magnetized medical device as detected by the ultrasound-imaging system 100 and its orientation to the vessel. The ultrasound image corresponds to an image acquired by the ultrasound beam 152 of the ultrasound probe 106. It should be appreciated that only a portion of an entire length of the magnetized medical device is magnetized and, thus, tracked by the ultrasound-imaging system 100.
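For illustration only, the following sketch shows one way a tracked needle could be mapped onto the displayed image: the tip and axis are projected into an image plane defined by an origin and two in-plane unit vectors, and the signed out-of-plane distance is reported so an iconographic representation could be scaled or dimmed accordingly; the probe-frame geometry and coordinates are assumptions made solely for this example.

```python
# Minimal sketch (illustrative assumptions only): project a tracked needle tip and
# axis into an image plane so an icon can be drawn over the ultrasound image.
import numpy as np

def project_needle(tip_mm, axis_unit, plane_origin, u, v):
    """Return in-plane (x, y) of the tip, its signed out-of-plane distance, and the
    in-plane direction of the needle axis."""
    w = np.cross(u, v)                          # plane normal
    rel = tip_mm - plane_origin
    xy = np.array([rel @ u, rel @ v])           # in-plane position (mm)
    out_of_plane = rel @ w                      # signed distance from the image plane
    axis_in_plane = np.array([axis_unit @ u, axis_unit @ v])
    norm = np.linalg.norm(axis_in_plane)
    axis_in_plane = axis_in_plane / norm if norm > 0 else axis_in_plane
    return xy, out_of_plane, axis_in_plane

# Assumed image plane through the probe head: u along the array, v along depth.
o = np.zeros(3)
u, v = np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])
tip = np.array([6.0, 2.5, 18.0])                # tracked tip, mm, in probe coordinates
axis = np.array([0.4, -0.1, 0.9]); axis /= np.linalg.norm(axis)

xy, dz, direction = project_needle(tip, axis, o, u, v)
print(f"icon at {xy} mm, {dz:+.1f} mm out of plane, heading {direction}")
```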
Note that further details regarding structure and operation of the ultrasound-imaging system 100 can be found in U.S. Pat. No. 9,456,766, titled “Apparatus for Use with Needle Insertion Guidance System,” which is incorporated by reference in its entirety into this application.
Methods
Methods of the foregoing ultrasound-imaging systems include methods implemented in the ultrasound-imaging systems. For example, a method of the ultrasound-imaging system 100 includes a non-transitory CRM (e.g., EEPROM) having the instructions 120 stored thereon that cause the ultrasound-imaging system 100 to perform a set of operations for ultrasound imaging when the instructions 120 are executed by the processor 116 of the console 102. Such a method includes an activating operation, an adjusting operation, a first processing operation, and a first displaying operation.
The activating operation includes activating the ultrasonic transducers of the array of the ultrasonic transducers 148 of the ultrasound probe 106 communicatively coupled to the console 102. With the activating operation, the ultrasonic transducers 148 emit generated ultrasound signals into the patient P, receive reflected ultrasound signals from the patient P, and convert the reflected ultrasound signals into corresponding electrical signals for processing into ultrasound images. The activating operation can include activating an approximately linear subset of the ultrasonic transducers 148 of a 2-D array of the ultrasonic transducers 148. Alternatively, the activating operation can include activating a subset of the ultrasonic transducers 148 up to all the ultrasonic transducers 148 in the movable linear array of the ultrasonic transducers 148.
The adjusting operation includes dynamically adjusting a distance of the activated ultrasonic transducers 149 from a predefined target or area, an orientation of the activated ultrasonic transducers 149 to the predefined target or area, or both the distance and the orientation of the activated ultrasonic transducers 149 with respect to the predefined target or area. For example, the adjusting operation can be in response to an orientation of a blood vessel as the predefined target. The adjusting operation includes adjusting the distance and orientation of the activated ultrasonic transducers 149 with respect to the orientation of the blood vessel so as to establish an image plane by the activated ultrasonic transducers 149 perpendicular or parallel to the blood vessel.
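For illustration only, the following sketch derives an aperture direction from a blood vessel's orientation under an assumed probe-frame geometry: the vessel's direction (for example, taken from a segmented centerline) is projected onto the transducer face, and the activated line of transducers is aimed along it for a parallel (long-axis) view or rotated 90 degrees for a perpendicular (short-axis) view.

```python
# Minimal sketch (illustrative assumptions only): derive the in-plane angle of the
# activated line of transducers from a target vessel's direction vector.
import numpy as np

def aperture_direction_deg(vessel_direction_xyz, view: str = "parallel") -> float:
    """Return the in-plane angle (deg) for the activated line of transducers."""
    vx, vy, _ = vessel_direction_xyz              # drop depth; keep the in-plane component
    angle = np.degrees(np.arctan2(vy, vx))        # vessel course across the transducer face
    if view == "perpendicular":
        angle += 90.0                             # short-axis view across the vessel
    return angle % 180.0                          # a line of elements has no preferred sense

vessel = np.array([0.9, 0.3, 0.1])                # hypothetical centerline direction
print("parallel view:", aperture_direction_deg(vessel, "parallel"))
print("perpendicular view:", aperture_direction_deg(vessel, "perpendicular"))
```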
The first processing operation includes processing the corresponding electrical signals of the ultrasound signals into the ultrasound images.
The first displaying operation includes displaying on the display screen 104 communicatively coupled to the console 102 the GUI including the ultrasound images.
As to magnetic signal-related operations, the method can include a converting operation, a second processing operation, and a second displaying operation. The converting operation includes converting magnetic signals from a magnetized medical device (e.g., the needle 112) with the magnetic-sensor array 146 of the ultrasound probe 106 into corresponding electrical signals. The second processing operation includes processing the corresponding electrical signals of the magnetic signals with the processor 116 into distance and orientation information with respect to the predefined target or area. The second displaying operation includes displaying an iconographic representation of the medical device on the display screen 104.
The method further includes an adjusting operation in response to the magnetic signals. The adjusting operation includes adjusting the distance and orientation of the activated ultrasonic transducers 149 with respect to the predefined target or area when the medical device is brought into proximity of the ultrasound probe 106. The adjusting operation establishes an image plane by the activated ultrasonic transducers 149 perpendicular or parallel to the medical-device plane including the medical device for accessing the predefined target or area with the medical device. The establishing of the image plane can be perpendicular to the medical-device plane upon approach of the medical device and parallel to the medical-device plane upon insertion of the medical device. The image plane can include a blood vessel as the predefined target or area and the medical-device plane can include the needle 112 as the medical device.
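For illustration only, the following sketch reduces the plane-switching rule described above to a hypothetical insertion-depth threshold: the image plane is kept perpendicular to the medical-device plane while the needle approaches and is switched parallel to it once the percutaneous puncture has occurred; how a puncture is actually detected is not specified here, and the threshold value is an assumption.

```python
# Minimal sketch (illustrative assumptions only): choose the image-plane orientation
# relative to the medical-device plane from a simple needle state.
from dataclasses import dataclass

@dataclass
class NeedleState:
    insertion_depth_mm: float   # estimated depth of the tip below the skin surface
    approaching: bool           # True while the needle is tracked outside the body

def choose_image_plane(state: NeedleState, puncture_threshold_mm: float = 1.0) -> str:
    """Return which image-plane orientation to establish relative to the needle plane."""
    if state.approaching or state.insertion_depth_mm < puncture_threshold_mm:
        return "perpendicular"   # monitor the approach across the needle path
    return "parallel"            # follow the shaft after the percutaneous puncture

print(choose_image_plane(NeedleState(insertion_depth_mm=0.0, approaching=True)))    # perpendicular
print(choose_image_plane(NeedleState(insertion_depth_mm=7.5, approaching=False)))   # parallel
```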
As to optical signal-related operations, the method can include a number of optical signal-related operations, as well as a third processing operation and a third displaying operation. The optical signal-related operations include emitting input optical signals, receiving reflected optical signals, and converting the reflected optical signals into corresponding electrical signals of the optical signals by the optical interrogator 154. The optical signal-related operations also include conveying the input optical signals from the optical interrogator 154 to the number of FBG sensors along the length of the optical-fiber stylet 156, as well as conveying the reflected optical signals from the number of FBG sensors back to the optical interrogator 154 with the optical-fiber stylet 156 disposed in a lumen of the medical device. The third processing operation includes processing the corresponding electrical signals of the optical signals with the processor 116 into distance and orientation information with respect to the predefined target or area. The third displaying operation includes displaying an iconographic representation of a medical device on the display screen 104.
The method further includes an adjusting operation in response to the optical signals. The adjusting operation includes adjusting the distance and orientation of the activated ultrasonic transducers 149 with respect to the predefined target or area when the medical device is brought into proximity of the ultrasound probe 106. The adjusting operation establishes an image plane by the activated ultrasonic transducers 149 perpendicular or parallel to the medical-device plane including the medical device for accessing the predefined target or area with the medical device. Again, the establishing of the image plane is perpendicular to the medical-device plane upon approach of the medical device and parallel to the medical-device plane upon insertion of the medical device. The image plane includes a blood vessel as the predefined target or area and the medical-device plane includes the needle 112 as the medical device.
The method can further include a data-providing operation and a fourth processing operation. The data-providing operation includes providing positional-tracking data to the console 102 from the accelerometer 160, the gyroscope 162, the magnetometer 164, or a combination thereof of the ultrasound probe 106. The fourth processing operation includes processing the positional-tracking data with the processor 116 for the adjusting operation.
The method can further include a maintaining operation. The maintaining operation includes maintaining the distance and the orientation of the activated ultrasonic transducers 149 with respect to the predefined target or area when the ultrasound probe 106 is inadvertently moved with respect to the predefined target or area.
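For illustration only, the following sketch shows one simplified way positional-tracking data could support the maintaining operation: the gyroscope's yaw rate about the probe's long axis is integrated, and the activated aperture's in-plane direction is counter-rotated by the same amount, so an inadvertent twist of the ultrasound probe 106 does not carry the image plane off the target; the sampling rate and single-axis treatment are assumptions, and a practical system would fuse data from the accelerometer 160, the gyroscope 162, and the magnetometer 164.

```python
# Minimal sketch (illustrative assumptions only): counter-rotate the activated
# aperture direction by the integrated probe yaw to hold the image plane on target.
import numpy as np

def compensate_aperture_direction(initial_direction_deg, yaw_rates_dps, dt_s):
    """Counter-rotate the activated line's direction by the integrated probe yaw."""
    probe_yaw_deg = np.sum(np.asarray(yaw_rates_dps) * dt_s)   # rectangular integration
    return (initial_direction_deg - probe_yaw_deg) % 360.0

# The clinician inadvertently twists the probe ~9 degrees over 0.3 s at 100 Hz sampling.
rates = np.full(30, 30.0)                       # deg/s about the probe's long axis
new_direction = compensate_aperture_direction(initial_direction_deg=30.0,
                                              yaw_rates_dps=rates, dt_s=0.01)
print(f"re-aimed aperture direction: {new_direction:.1f} deg")
```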
While some particular embodiments have been disclosed herein, and while the particular embodiments have been disclosed in some detail, it is not the intention for the particular embodiments to limit the scope of the concepts provided herein. Additional adaptations and/or modifications can appear to those of ordinary skill in the art, and, in broader aspects, these adaptations and/or modifications are encompassed as well. Accordingly, departures may be made from the particular embodiments disclosed herein without departing from the scope of the concepts provided herein.
This application claims the benefit of priority to U.S. Provisional Patent Application No. 63/075,707, filed Sep. 8, 2020, which is incorporated by reference in its entirety into this application.
| Number | Name | Date | Kind |
|---|---|---|---|
| 3697917 | Orth et al. | Oct 1972 | A |
| 5148809 | Biegeleisen-Knight et al. | Sep 1992 | A |
| 5181513 | Touboul et al. | Jan 1993 | A |
| 5325293 | Dorne | Jun 1994 | A |
| 5349865 | Kavli et al. | Sep 1994 | A |
| 5441052 | Miyajima | Aug 1995 | A |
| 5549554 | Miraki | Aug 1996 | A |
| 5573529 | Haak et al. | Nov 1996 | A |
| 5775322 | Silverstein et al. | Jul 1998 | A |
| 5879297 | Haynor et al. | Mar 1999 | A |
| 5897503 | Lyon et al. | Apr 1999 | A |
| 5908387 | LeFree et al. | Jun 1999 | A |
| 5967984 | Chu et al. | Oct 1999 | A |
| 5970119 | Hofmann | Oct 1999 | A |
| 6004270 | Urbano et al. | Dec 1999 | A |
| 6019724 | Gronningsaeter et al. | Feb 2000 | A |
| 6068599 | Saito et al. | May 2000 | A |
| 6074367 | Hubbell | Jun 2000 | A |
| 6129668 | Haynor et al. | Oct 2000 | A |
| 6132379 | Patacsil et al. | Oct 2000 | A |
| 6216028 | Haynor et al. | Apr 2001 | B1 |
| 6233476 | Strommer et al. | May 2001 | B1 |
| 6245018 | Lee | Jun 2001 | B1 |
| 6263230 | Haynor et al. | Jul 2001 | B1 |
| 6375615 | Flaherty et al. | Apr 2002 | B1 |
| 6436043 | Bonnefous | Aug 2002 | B2 |
| 6498942 | Esenaliev et al. | Dec 2002 | B1 |
| 6503205 | Manor et al. | Jan 2003 | B2 |
| 6508769 | Bonnefous | Jan 2003 | B2 |
| 6511458 | Milo et al. | Jan 2003 | B2 |
| 6524249 | Moehring et al. | Feb 2003 | B2 |
| 6543642 | Milliorn | Apr 2003 | B1 |
| 6554771 | Buil et al. | Apr 2003 | B1 |
| 6592520 | Peszynski et al. | Jul 2003 | B1 |
| 6592565 | Twardowski | Jul 2003 | B2 |
| 6601705 | Molina et al. | Aug 2003 | B2 |
| 6612992 | Hossack et al. | Sep 2003 | B1 |
| 6613002 | Clark et al. | Sep 2003 | B1 |
| 6623431 | Sakuma et al. | Sep 2003 | B1 |
| 6641538 | Nakaya et al. | Nov 2003 | B2 |
| 6647135 | Bonnefous | Nov 2003 | B2 |
| 6687386 | Ito et al. | Feb 2004 | B1 |
| 6733458 | Steins | May 2004 | B1 |
| 6749569 | Pellegretti | Jun 2004 | B1 |
| 6754608 | Svanerudh et al. | Jun 2004 | B2 |
| 6755789 | Stringer et al. | Jun 2004 | B2 |
| 6840379 | Franks-Farah et al. | Jan 2005 | B2 |
| 6857196 | Dalrymple | Feb 2005 | B2 |
| 6979294 | Selzer et al. | Dec 2005 | B1 |
| 7074187 | Selzer et al. | Jul 2006 | B2 |
| 7244234 | Ridley et al. | Jul 2007 | B2 |
| 7359554 | Klingensmith et al. | Apr 2008 | B2 |
| 7534209 | Abend et al. | May 2009 | B2 |
| 7599730 | Hunter et al. | Oct 2009 | B2 |
| 7637870 | Flaherty et al. | Dec 2009 | B2 |
| 7681579 | Schwartz | Mar 2010 | B2 |
| 7691061 | Hirota | Apr 2010 | B2 |
| 7699779 | Sasaki et al. | Apr 2010 | B2 |
| 7720520 | Willis | May 2010 | B2 |
| 7727153 | Fritz et al. | Jun 2010 | B2 |
| 7734326 | Pedain et al. | Jun 2010 | B2 |
| 7831449 | Ying et al. | Nov 2010 | B2 |
| 7905837 | Suzuki | Mar 2011 | B2 |
| 7925327 | Weese | Apr 2011 | B2 |
| 7927278 | Selzer et al. | Apr 2011 | B2 |
| 8014848 | Birkenbach et al. | Sep 2011 | B2 |
| 8038619 | Steinbacher | Oct 2011 | B2 |
| 8060181 | Rodriguez Ponce et al. | Nov 2011 | B2 |
| 8075488 | Burton | Dec 2011 | B2 |
| 8090427 | Eck et al. | Jan 2012 | B2 |
| 8105239 | Specht | Jan 2012 | B2 |
| 8172754 | Watanabe et al. | May 2012 | B2 |
| 8175368 | Sathyanarayana | May 2012 | B2 |
| 8200313 | Rambod et al. | Jun 2012 | B1 |
| 8211023 | Swan et al. | Jul 2012 | B2 |
| 8228347 | Beasley et al. | Jul 2012 | B2 |
| 8298147 | Huennekens et al. | Oct 2012 | B2 |
| 8303505 | Webler et al. | Nov 2012 | B2 |
| 8323202 | Roschak et al. | Dec 2012 | B2 |
| 8328727 | Miele et al. | Dec 2012 | B2 |
| 8388541 | Messerly et al. | Mar 2013 | B2 |
| 8409103 | Grunwald et al. | Apr 2013 | B2 |
| 8449465 | Nair et al. | May 2013 | B2 |
| 8553954 | Saikia | Oct 2013 | B2 |
| 8556815 | Pelissier et al. | Oct 2013 | B2 |
| 8585600 | Liu et al. | Nov 2013 | B2 |
| 8622913 | Dentinger et al. | Jan 2014 | B2 |
| 8706457 | Hart et al. | Apr 2014 | B2 |
| 8727988 | Flaherty et al. | May 2014 | B2 |
| 8734357 | Taylor | May 2014 | B2 |
| 8744211 | Owen | Jun 2014 | B2 |
| 8754865 | Merritt et al. | Jun 2014 | B2 |
| 8764663 | Smok et al. | Jul 2014 | B2 |
| 8781194 | Malek et al. | Jul 2014 | B2 |
| 8781555 | Burnside et al. | Jul 2014 | B2 |
| 8790263 | Randall et al. | Jul 2014 | B2 |
| 8849382 | Cox et al. | Sep 2014 | B2 |
| 8939908 | Suzuki et al. | Jan 2015 | B2 |
| 8961420 | Zhang | Feb 2015 | B2 |
| 9022940 | Meier | May 2015 | B2 |
| 9138290 | Hadjicostis | Sep 2015 | B2 |
| 9199082 | Yared et al. | Dec 2015 | B1 |
| 9204858 | Pelissier et al. | Dec 2015 | B2 |
| 9220477 | Urabe et al. | Dec 2015 | B2 |
| 9295447 | Shah | Mar 2016 | B2 |
| 9320493 | Visveshwara | Apr 2016 | B2 |
| 9357980 | Toji et al. | Jun 2016 | B2 |
| 9364171 | Harris et al. | Jun 2016 | B2 |
| 9427207 | Sheldon et al. | Aug 2016 | B2 |
| 9445780 | Hossack et al. | Sep 2016 | B2 |
| 9456766 | Cox et al. | Oct 2016 | B2 |
| 9456804 | Tamada | Oct 2016 | B2 |
| 9468413 | Hall et al. | Oct 2016 | B2 |
| 9492097 | Wilkes et al. | Nov 2016 | B2 |
| 9521961 | Silverstein et al. | Dec 2016 | B2 |
| 9554716 | Burnside et al. | Jan 2017 | B2 |
| 9582876 | Specht | Feb 2017 | B2 |
| 9610061 | Ebbini et al. | Apr 2017 | B2 |
| 9636031 | Cox | May 2017 | B2 |
| 9649037 | Lowe et al. | May 2017 | B2 |
| 9649048 | Cox et al. | May 2017 | B2 |
| 9702969 | Hope Simpson et al. | Jul 2017 | B2 |
| 9715757 | Ng et al. | Jul 2017 | B2 |
| 9717415 | Cohen et al. | Aug 2017 | B2 |
| 9731066 | Liu et al. | Aug 2017 | B2 |
| 9814433 | Benishti et al. | Nov 2017 | B2 |
| 9814531 | Yagi et al. | Nov 2017 | B2 |
| 9861337 | Patwardhan et al. | Jan 2018 | B2 |
| 9895138 | Sasaki | Feb 2018 | B2 |
| 9913605 | Harris et al. | Mar 2018 | B2 |
| 9949720 | Southard et al. | Apr 2018 | B2 |
| 10043272 | Forzoni et al. | Aug 2018 | B2 |
| 10449330 | Newman et al. | Oct 2019 | B2 |
| 10524691 | Newman et al. | Jan 2020 | B2 |
| 10751509 | Misener | Aug 2020 | B2 |
| 11564861 | Gaines | Jan 2023 | B1 |
| 20020038088 | Imran et al. | Mar 2002 | A1 |
| 20030047126 | Tomaschko | Mar 2003 | A1 |
| 20030106825 | Molina et al. | Jun 2003 | A1 |
| 20030120154 | Sauer et al. | Jun 2003 | A1 |
| 20030135115 | Burdette et al. | Jul 2003 | A1 |
| 20030149366 | Stringer et al. | Aug 2003 | A1 |
| 20040015080 | Kelly et al. | Jan 2004 | A1 |
| 20040055925 | Franks-Farah et al. | Mar 2004 | A1 |
| 20040197267 | Black et al. | Oct 2004 | A1 |
| 20050000975 | Carco et al. | Jan 2005 | A1 |
| 20050049504 | Lo et al. | Mar 2005 | A1 |
| 20050165299 | Kressy et al. | Jul 2005 | A1 |
| 20050251030 | Azar et al. | Nov 2005 | A1 |
| 20050267365 | Sokulin et al. | Dec 2005 | A1 |
| 20060004290 | Smith et al. | Jan 2006 | A1 |
| 20060013523 | Childlers et al. | Jan 2006 | A1 |
| 20060015039 | Cassidy et al. | Jan 2006 | A1 |
| 20060020204 | Serra et al. | Jan 2006 | A1 |
| 20060047617 | Bacioiu et al. | Mar 2006 | A1 |
| 20060079781 | Germond-Rouet et al. | Apr 2006 | A1 |
| 20060184029 | Haim et al. | Aug 2006 | A1 |
| 20060210130 | Germond-Rouet et al. | Sep 2006 | A1 |
| 20070043341 | Anderson et al. | Feb 2007 | A1 |
| 20070049822 | Bunce et al. | Mar 2007 | A1 |
| 20070073155 | Park et al. | Mar 2007 | A1 |
| 20070167738 | Timinger et al. | Jul 2007 | A1 |
| 20070199848 | Ellswood et al. | Aug 2007 | A1 |
| 20070239120 | Brock et al. | Oct 2007 | A1 |
| 20070249911 | Simon | Oct 2007 | A1 |
| 20070287886 | Saadat | Dec 2007 | A1 |
| 20080021322 | Stone et al. | Jan 2008 | A1 |
| 20080033293 | Beasley et al. | Feb 2008 | A1 |
| 20080033759 | Finlay | Feb 2008 | A1 |
| 20080051657 | Rold | Feb 2008 | A1 |
| 20080108930 | Weitzel et al. | May 2008 | A1 |
| 20080125651 | Watanabe et al. | May 2008 | A1 |
| 20080146915 | McMorrow | Jun 2008 | A1 |
| 20080177186 | Slater et al. | Jul 2008 | A1 |
| 20080221425 | Olson et al. | Sep 2008 | A1 |
| 20080294037 | Richter | Nov 2008 | A1 |
| 20080300491 | Bonde et al. | Dec 2008 | A1 |
| 20090012399 | Sunagawa et al. | Jan 2009 | A1 |
| 20090012401 | Steinbacher | Jan 2009 | A1 |
| 20090074280 | Lu et al. | Mar 2009 | A1 |
| 20090124903 | Osaka | May 2009 | A1 |
| 20090137887 | Shariati et al. | May 2009 | A1 |
| 20090143672 | Harms et al. | Jun 2009 | A1 |
| 20090143684 | Cermak et al. | Jun 2009 | A1 |
| 20090156926 | Messerly et al. | Jun 2009 | A1 |
| 20090281413 | Boyden et al. | Nov 2009 | A1 |
| 20090306509 | Pedersen et al. | Dec 2009 | A1 |
| 20100010348 | Halmann | Jan 2010 | A1 |
| 20100211026 | Sheetz et al. | Aug 2010 | A2 |
| 20100249598 | Smith et al. | Sep 2010 | A1 |
| 20100286515 | Gravenstein et al. | Nov 2010 | A1 |
| 20100312121 | Guan | Dec 2010 | A1 |
| 20100324423 | El-Aklouk et al. | Dec 2010 | A1 |
| 20110002518 | Ziv-Ari et al. | Jan 2011 | A1 |
| 20110026796 | Hyun et al. | Feb 2011 | A1 |
| 20110071404 | Schmitt et al. | Mar 2011 | A1 |
| 20110074244 | Osawa | Mar 2011 | A1 |
| 20110087107 | Lindekugel et al. | Apr 2011 | A1 |
| 20110166451 | Blaivas et al. | Jul 2011 | A1 |
| 20110282188 | Burnside et al. | Nov 2011 | A1 |
| 20110295108 | Cox et al. | Dec 2011 | A1 |
| 20110313293 | Lindekugel et al. | Dec 2011 | A1 |
| 20120165679 | Orome et al. | Jun 2012 | A1 |
| 20120179038 | Meurer et al. | Jul 2012 | A1 |
| 20120179042 | Fukumoto et al. | Jul 2012 | A1 |
| 20120179044 | Chiang et al. | Jul 2012 | A1 |
| 20120197132 | O'Connor | Aug 2012 | A1 |
| 20120220865 | Brown et al. | Aug 2012 | A1 |
| 20120277576 | Lui | Nov 2012 | A1 |
| 20130041250 | Pelissier et al. | Feb 2013 | A1 |
| 20130102889 | Southard et al. | Apr 2013 | A1 |
| 20130131499 | Chan et al. | May 2013 | A1 |
| 20130131502 | Blaivas et al. | May 2013 | A1 |
| 20130150724 | Blaivas et al. | Jun 2013 | A1 |
| 20130188832 | Ma et al. | Jul 2013 | A1 |
| 20130197367 | Smok et al. | Aug 2013 | A1 |
| 20130218024 | Boctor et al. | Aug 2013 | A1 |
| 20130323700 | Samosky et al. | Dec 2013 | A1 |
| 20130338503 | Cohen et al. | Dec 2013 | A1 |
| 20130338508 | Nakamura et al. | Dec 2013 | A1 |
| 20140005530 | Liu et al. | Jan 2014 | A1 |
| 20140031694 | Solek | Jan 2014 | A1 |
| 20140066779 | Nakanishi | Mar 2014 | A1 |
| 20140073976 | Fonte et al. | Mar 2014 | A1 |
| 20140100440 | Cheline et al. | Apr 2014 | A1 |
| 20140114194 | Kanayama et al. | Apr 2014 | A1 |
| 20140180098 | Flaherty et al. | Jun 2014 | A1 |
| 20140180116 | Lindekugel et al. | Jun 2014 | A1 |
| 20140188133 | Misener | Jul 2014 | A1 |
| 20140188440 | Donhowe et al. | Jul 2014 | A1 |
| 20140276059 | Sheehan | Sep 2014 | A1 |
| 20140276069 | Amble et al. | Sep 2014 | A1 |
| 20140276081 | Tegels | Sep 2014 | A1 |
| 20140276085 | Miller | Sep 2014 | A1 |
| 20140276690 | Grace | Sep 2014 | A1 |
| 20140343431 | Vajinepalli et al. | Nov 2014 | A1 |
| 20140357994 | Jin et al. | Dec 2014 | A1 |
| 20150005738 | Blacker | Jan 2015 | A1 |
| 20150011887 | Ahn et al. | Jan 2015 | A1 |
| 20150065916 | Maguire et al. | Mar 2015 | A1 |
| 20150073279 | Cai et al. | Mar 2015 | A1 |
| 20150112200 | Oberg et al. | Apr 2015 | A1 |
| 20150209113 | Burkholz et al. | Jul 2015 | A1 |
| 20150209510 | Burkholz et al. | Jul 2015 | A1 |
| 20150209526 | Matsubara et al. | Jul 2015 | A1 |
| 20150297097 | Matsubara et al. | Oct 2015 | A1 |
| 20150359520 | Shan et al. | Dec 2015 | A1 |
| 20150359991 | Dunbar et al. | Dec 2015 | A1 |
| 20160000367 | Lyon | Jan 2016 | A1 |
| 20160029995 | Navratil et al. | Feb 2016 | A1 |
| 20160113699 | Sverdlik et al. | Apr 2016 | A1 |
| 20160120607 | Sorotzkin et al. | May 2016 | A1 |
| 20160157831 | Kang et al. | Jun 2016 | A1 |
| 20160166232 | Merritt | Jun 2016 | A1 |
| 20160202053 | Walker et al. | Jul 2016 | A1 |
| 20160213398 | Liu | Jul 2016 | A1 |
| 20160259992 | Knodt et al. | Sep 2016 | A1 |
| 20160278869 | Grunwald | Sep 2016 | A1 |
| 20160296208 | Sethuraman et al. | Oct 2016 | A1 |
| 20160374644 | Mauldin, Jr. et al. | Dec 2016 | A1 |
| 20170020561 | Cox | Jan 2017 | A1 |
| 20170079548 | Silverstein et al. | Mar 2017 | A1 |
| 20170143312 | Hedlund et al. | May 2017 | A1 |
| 20170164923 | Matsumoto | Jun 2017 | A1 |
| 20170172666 | Govari et al. | Jun 2017 | A1 |
| 20170215842 | Ryu et al. | Aug 2017 | A1 |
| 20170252004 | Broad et al. | Sep 2017 | A1 |
| 20170328751 | Lemke | Nov 2017 | A1 |
| 20170367678 | Sirtori et al. | Dec 2017 | A1 |
| 20180015256 | Southard et al. | Jan 2018 | A1 |
| 20180116723 | Hettrick et al. | May 2018 | A1 |
| 20180125450 | Blackbourne et al. | May 2018 | A1 |
| 20180161502 | Nanan et al. | Jun 2018 | A1 |
| 20180199914 | Ramachandran et al. | Jul 2018 | A1 |
| 20180214119 | Mehrmohammadi et al. | Aug 2018 | A1 |
| 20180228465 | Southard et al. | Aug 2018 | A1 |
| 20180235709 | Donhowe | Aug 2018 | A1 |
| 20180289927 | Messerly | Oct 2018 | A1 |
| 20180296185 | Cox | Oct 2018 | A1 |
| 20180310955 | Lindekugel et al. | Nov 2018 | A1 |
| 20180344293 | Raju et al. | Dec 2018 | A1 |
| 20190060001 | Kohli et al. | Feb 2019 | A1 |
| 20190060014 | Hazelton et al. | Feb 2019 | A1 |
| 20190125210 | Govari et al. | May 2019 | A1 |
| 20190200951 | Meier | Jul 2019 | A1 |
| 20190239848 | Bedi et al. | Aug 2019 | A1 |
| 20190307419 | Durfee | Oct 2019 | A1 |
| 20190307515 | Naito | Oct 2019 | A1 |
| 20190365347 | Abe | Dec 2019 | A1 |
| 20190365348 | Toume | Dec 2019 | A1 |
| 20200069929 | Mason et al. | Mar 2020 | A1 |
| 20200113540 | Gijsbers et al. | Apr 2020 | A1 |
| 20200163654 | Satir et al. | May 2020 | A1 |
| 20200200900 | Asami et al. | Jun 2020 | A1 |
| 20200230391 | Burkholz et al. | Jul 2020 | A1 |
| 20200281563 | Muller et al. | Sep 2020 | A1 |
| 20200359990 | Poland et al. | Nov 2020 | A1 |
| 20210059639 | Howell | Mar 2021 | A1 |
| 20210137492 | Imai | May 2021 | A1 |
| 20210161510 | Sasaki et al. | Jun 2021 | A1 |
| 20210186467 | Urabe et al. | Jun 2021 | A1 |
| 20210267570 | Ulman et al. | Sep 2021 | A1 |
| 20210315538 | Brandl et al. | Oct 2021 | A1 |
| 20220039777 | Durfee | Feb 2022 | A1 |
| 20220039829 | Zijlstra et al. | Feb 2022 | A1 |
| 20220096797 | Prince | Mar 2022 | A1 |
| 20220104791 | Matsumoto | Apr 2022 | A1 |
| 20220104886 | Blanchard et al. | Apr 2022 | A1 |
| 20220117582 | McLaughlin et al. | Apr 2022 | A1 |
| 20220160434 | Messerly et al. | May 2022 | A1 |
| 20220168050 | Sowards et al. | Jun 2022 | A1 |
| 20220172354 | Misener et al. | Jun 2022 | A1 |
| 20220330922 | Sowards et al. | Oct 2022 | A1 |
| 20220334251 | Sowards et al. | Oct 2022 | A1 |
| 20230107629 | Sowards et al. | Apr 2023 | A1 |
| 20230132148 | Sowards et al. | Apr 2023 | A1 |
| 20230135562 | Misener et al. | May 2023 | A1 |
| 20230138970 | Sowards et al. | May 2023 | A1 |
| 20230148872 | Sowards et al. | May 2023 | A1 |
| 20230201539 | Howell | Jun 2023 | A1 |
| 20230277153 | Sowards et al. | Sep 2023 | A1 |
| 20230277154 | Sowards et al. | Sep 2023 | A1 |
| 20230293143 | Sowards et al. | Sep 2023 | A1 |
| 20230397900 | Prince | Dec 2023 | A1 |
| Number | Date | Country |
|---|---|---|
| 102871645 | Jan 2013 | CN |
| 105107067 | May 2018 | CN |
| 0933063 | Aug 1999 | EP |
| 1504713 | Feb 2005 | EP |
| 1591074 | May 2008 | EP |
| 2823766 | Jan 2015 | EP |
| 3181083 | Jun 2017 | EP |
| 3870059 | Sep 2021 | EP |
| 2000271136 | Oct 2000 | JP |
| 2007222291 | Sep 2007 | JP |
| 2014150928 | Aug 2014 | JP |
| 2018175547 | Nov 2018 | JP |
| 20180070878 | Jun 2018 | KR |
| 102176196 | Nov 2020 | KR |
| 2010029521 | Mar 2010 | WO |
| 2010076808 | Jul 2010 | WO |
| 2013059714 | Apr 2013 | WO |
| 2014115150 | Jul 2014 | WO |
| 2015017270 | Feb 2015 | WO |
| 2016081023 | May 2016 | WO |
| 2017096487 | Jun 2017 | WO |
| 2017214428 | Dec 2017 | WO |
| 2018026878 | Feb 2018 | WO |
| 2018134726 | Jul 2018 | WO |
| 2019232451 | Dec 2019 | WO |
| 2020002620 | Jan 2020 | WO |
| 2020016018 | Jan 2020 | WO |
| 2019232454 | Feb 2020 | WO |
| 2020044769 | Mar 2020 | WO |
| 2020067897 | Apr 2020 | WO |
| 2020083660 | Apr 2020 | WO |
| 2020186198 | Sep 2020 | WO |
| 2021198226 | Oct 2021 | WO |
| 2022072727 | Apr 2022 | WO |
| 2022081904 | Apr 2022 | WO |
| 2022119853 | Jun 2022 | WO |
| 2022115479 | Jun 2022 | WO |
| 2022119856 | Jun 2022 | WO |
| 2022221703 | Oct 2022 | WO |
| 2022221714 | Oct 2022 | WO |
| 2023059512 | Apr 2023 | WO |
| 2023076268 | May 2023 | WO |
| 2023081220 | May 2023 | WO |
| 2023081223 | May 2023 | WO |
| 2023091424 | May 2023 | WO |
| Entry |
|---|
| Pagoulatos, N. et al. "New spatial localizer based on fiber optics with applications in 3D ultrasound imaging" Proceedings of SPIE, vol. 3976 (Apr. 18, 2000). |
| U.S. Appl. No. 17/020,476, filed Sep. 14, 2020 Final Office Action dated Jun. 9, 2022. |
| PCT/US2022/025082 filed Apr. 15, 2022 International Search Report and Written Opinion dated Jul. 11, 2022. |
| PCT/US2022/025097 filed Apr. 15, 2022 International Search Report and Written Opinion dated Jul. 8, 2022. |
| U.S. Appl. No. 17/020,476, filed Sep. 14, 2020 Advisory Action dated Aug. 19, 2022. |
| U.S. Appl. No. 17/020,476, filed Sep. 14, 2020 Non-Final Office Action dated Sep. 23, 2022. |
| U.S. Appl. No. 17/471,015, filed Sep. 9, 2021 Non-Final Office Action dated Aug. 16, 2022. |
| Lu, Zhenyu et al. "Recent advances in robot-assisted echography: combining perception, control and cognition" Cognitive Computation and Systems, The Institution of Engineering and Technology, Michael Faraday House, Six Hills Way, Stevenage, Herts. SG1 2AY, UK, vol. 2, No. 3, Sep. 2, 2020. |
| PCT/US2021/045218 filed Aug. 9, 2021 International Search Report and Written Opinion dated Nov. 23, 2021. |
| PCT/US2021/049123 filed Sep. 3, 2021 International Search Report and Written Opinion dated Feb. 4, 2022. |
| PCT/US2021/060622 filed Nov. 23, 2021 International Search Report and Written Opinion dated Mar. 3, 2022. |
| PCT/US2021/061267 filed Nov. 30, 2021 International Search Report and Written Opinion dated Mar. 9, 2022. |
| PCT/US2021/061276 filed Nov. 30, 2021 International Search Report and Written Opinion dated Mar. 9, 2022. |
| Sebastian Vogt: "Real-Time Augmented Reality for Image-Guided Interventions", Oct. 5, 2009, XP055354720, Retrieved from the Internet: URL: https://opus4.kobv.de/opus4-fau/frontdoor/deliver/index/docId/1235/file/SebastianVogtDissertation.pdf. |
| U.S. Appl. No. 15/951,903, filed Apr. 12, 2018 Board Decision dated Apr. 20, 2022. |
| U.S. Appl. No. 15/951,903, filed Apr. 12, 2018 Notice of Allowance dated May 2, 2022. |
| William F. Garrett et al: "Real-time incremental visualization of dynamic ultrasound volumes using parallel BSP trees", Visualization '96. Proceedings, IEEE, NE, Oct. 27, 1996, pp. 235-ff, XP058399771, ISBN: 978-0-89791-864-0; abstract; figures 1-7; pp. 236-240. |
| PCT/US2022/048716 filed Nov. 2, 2022 International Search Report and Written Opinion dated Feb. 24, 2023. |
| PCT/US2022/048722 filed Nov. 2, 2022 International Search Report and Written Opinion dated Feb. 24, 2023. |
| PCT/US2022/047727 filed Oct. 25, 2022 International Search Report and Written Opinion dated Jan. 25, 2023. |
| U.S. Appl. No. 17/020,476, filed Sep. 14, 2020 Final Office Action dated Jan. 5, 2023. |
| U.S. Appl. No. 17/471,015, filed Sep. 9, 2021 Non-Final Office Action dated Mar. 30, 2023. |
| U.S. Appl. No. 17/538,911, filed Nov. 30, 2021 Non-Final Office Action dated Mar. 2, 2023. |
| PCT/US12/61182 International Search Report and Written Opinion dated Mar. 11, 2013. |
| PCT/US2021/049294 filed Sep. 7, 2021 International Search Report and Written Opinion dated Dec. 8, 2021. |
| PCT/US2021/049712 filed Sep. 9, 2021 International Search Report and Written Opinion dated Dec. 14, 2021. |
| PCT/US2021/052055 filed Sep. 24, 2021 International Search Report and Written Opinion dated Dec. 20, 2021. |
| U.S. Appl. No. 13/656,563, filed Oct. 19, 2012 Decision on Appeal dated Nov. 1, 2017. |
| U.S. Appl. No. 13/656,563, filed Oct. 19, 2012 Examiner's Answer dated Nov. 16, 2015. |
| U.S. Appl. No. 13/656,563, filed Oct. 19, 2012 Final Office Action dated Dec. 5, 2014. |
| U.S. Appl. No. 13/656,563, filed Oct. 19, 2012 Non-Final Office Action dated Jul. 18, 2014. |
| U.S. Appl. No. 15/650,474, filed Jul. 14, 2017 Final Office Action dated Jun. 2, 2020. |
| U.S. Appl. No. 15/650,474, filed Jul. 14, 2017 Non-Final Office Action dated Dec. 16, 2019. |
| U.S. Appl. No. 15/650,474, filed Jul. 14, 2017 Notice of Allowance dated Dec. 11, 2020. |
| U.S. Appl. No. 15/650,474, filed Jul. 14, 2017 Notice of Allowance dated Mar. 1, 2021. |
| U.S. Appl. No. 15/951,903, filed Apr. 12, 2018 Advisory Action dated Dec. 22, 2020. |
| U.S. Appl. No. 15/951,903, filed Apr. 12, 2018 Examiner's Answer dated Jun. 3, 2021. |
| U.S. Appl. No. 15/951,903, filed Apr. 12, 2018 Final Office Action dated Oct. 13, 2020. |
| U.S. Appl. No. 15/951,903, filed Apr. 12, 2018 Non-Final Office Action dated May 22, 2020. |
| U.S. Appl. No. 17/020,476, filed Sep. 14, 2020 Non-Final Office Action dated Feb. 9, 2022. |
| PCT/US2022/049983 filed Nov. 15, 2022 International Search Report and Written Opinion dated Mar. 29, 2023. |
| PCT/US2023/014143 filed Feb. 28, 2023 International Search Report and Written Opinion dated Jun. 12, 2023. |
| PCT/US2023/015266 filed Mar. 15, 2023 International Search Report and Written Opinion dated May 25, 2023. |
| Saxena, Ashish et al. "Thermographic venous blood flow characterization with external cooling stimulation" Infrared Physics and Technology, Elsevier Science, GB, vol. 90, Feb. 9, 2018, pp. 8-19, XP085378852. |
| U.S. Appl. No. 17/020,476, filed Sep. 14, 2020 Notice of Allowance dated Apr. 28, 2022. |
| U.S. Appl. No. 17/534,099, filed Nov. 23, 2021 Non-Final Office Action dated Mar. 31, 2023. |
| U.S. Appl. No. 17/684,180, filed Mar. 1, 2022 Restriction Requirement dated May 19, 2023. |
| EP 20866520.8 filed Apr. 5, 2022 Extended European Search Report dated Aug. 22, 2023. |
| U.S. Appl. No. 17/471,015, filed Sep. 9, 2021 Final Office Action dated Oct. 12, 2023. |
| U.S. Appl. No. 17/534,099, filed Nov. 23, 2021 Final Office Action dated Sep. 29, 2023. |
| U.S. Appl. No. 17/538,911, filed Nov. 30, 2021 Final Office Action dated Sep. 13, 2023. |
| U.S. Appl. No. 17/684,180, filed Mar. 1, 2022 Non-Final Office Action dated Jul. 28, 2023. |
| U.S. Appl. No. 17/722,151, filed Apr. 15, 2022 Non-Final Office Action dated Sep. 7, 2023. |
| PCT/US2022/025097 filed Apr. 15, 2022 International Preliminary Report on Patentability dated Oct. 26, 2023. |
| PCT/US2023/030970 filed Aug. 23, 2023 International Search Report and Written Opinion dated Oct. 30, 2023. |
| U.S. Appl. No. 17/538,911, filed Nov. 30, 2021 Advisory Action dated Nov. 22, 2023. |
| U.S. Appl. No. 17/722,151, filed Apr. 15, 2022 Final Office Action dated Nov. 6, 2023. |
| U.S. Appl. No. 17/894,460, filed Aug. 24, 2022 Non-Final Office Action dated Nov. 6, 2023. |
| U.S. Appl. No. 17/471,015, filed Sep. 9, 2021 Advisory Action dated Feb. 2, 2024. |
| U.S. Appl. No. 17/534,099, filed Nov. 23, 2021 Advisory Action dated Dec. 8, 2023. |
| U.S. Appl. No. 17/538,943, filed Nov. 30, 2021 Non-Final Office Action dated Jan. 30, 2024. |
| U.S. Appl. No. 17/684,180, filed Mar. 1, 2022 Final Office Action dated Jan. 18, 2024. |
| U.S. Appl. No. 17/722,111, filed Apr. 15, 2022 Non-Final Office Action dated Dec. 22, 2023. |
| U.S. Appl. No. 17/722,151, filed Apr. 15, 2022 Advisory Action dated Jan. 2, 2024. |
| U.S. Appl. No. 17/894,460, filed Aug. 24, 2022 Final Office Action dated Jan. 31, 2024. |
| Number | Date | Country |
|---|---|---|
| 20220071589 A1 | Mar 2022 | US |
| Number | Date | Country |
|---|---|---|
| 63075707 | Sep 2020 | US |