Ultrasound imaging has a wide range of applications in the medical and scientific fields for diagnosis, treatment, and the study of internal objects within a body, such as internal organs or developing fetuses. An ultrasound probe typically includes an array of transducers that transmit and receive ultrasound signals that are used for these imaging techniques.
This summary is provided to introduce a selection of concepts that are further described below in the detailed description. This summary is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used as an aid in limiting the scope of the claimed subject matter.
In general, one or more embodiments of the disclosure relate to a handheld ultrasound system, the handheld ultrasound system comprising: an ultrasound device with a two-dimensional array of ultrasound transducers; and a smartphone or tablet comprising a processor that: configures the ultrasound device to obtain at least one first ultrasound image frame comprising a view of a desired anatomy, using a scan pattern defining an acoustic beam, automatically updates the scan pattern to optimize the view of the desired anatomy, wherein: when the handheld ultrasound system is operating in a cardiac imaging mode, the updating of the scan pattern comprises an adjustment of a combination of an azimuthal tilt and an elevational tilt of the acoustic beam, when the handheld ultrasound system is operating in a lung imaging mode, the updating of the scan pattern comprises an adjustment of a combination of an elevational tilt and a translation of an aperture of the two-dimensional array of ultrasound transducers emitting the acoustic beam, and configures the ultrasound device to obtain a second ultrasound image frame using the updated scan pattern.
In general, one or more embodiments of the disclosure relate to a method for operating a handheld ultrasound system, the method, performed by a processor of a smartphone or tablet of the handheld ultrasound system, comprising: configuring a two-dimensional array of ultrasound transducers disposed in an ultrasound probe of the handheld ultrasound system to obtain at least one first ultrasound image frame comprising a view of a desired anatomy, using a scan pattern defining an acoustic beam, automatically updating the scan pattern to optimize the view of the desired anatomy, wherein: when the handheld ultrasound system is operating in a cardiac imaging mode, the updating of the scan pattern comprises an adjustment of a combination of an azimuthal tilt and an elevational tilt of the acoustic beam, when the handheld ultrasound system is operating in a lung imaging mode, the updating of the scan pattern comprises an adjustment of a combination of an elevational tilt and a translation of an aperture of the two-dimensional array of ultrasound transducers emitting the acoustic beam, and configuring the two-dimensional array of ultrasound transducers to obtain a second ultrasound image frame using the updated scan pattern.
In general, one or more embodiments of the disclosure relate to a wearable ultrasound patch, comprising: a two-dimensional array of ultrasound transducers; and a processor that: configures the wearable ultrasound patch to obtain at least one first ultrasound image frame comprising a view of a desired anatomy, using a scan pattern defining an acoustic beam, automatically updates the scan pattern to optimize the view of the desired anatomy, wherein: when the wearable ultrasound patch is operating in a cardiac imaging mode, the updating of the scan pattern comprises an adjustment of a combination of an azimuthal tilt and an elevational tilt of the acoustic beam, when the wearable ultrasound patch is operating in a lung imaging mode, the updating of the scan pattern comprises an adjustment of a combination of an elevational tilt and a translation of an aperture of the two-dimensional array of ultrasound transducers emitting the acoustic beam, and configures the wearable ultrasound patch to obtain a second ultrasound image frame using the updated scan pattern.
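The mode-dependent adjustment recited in the embodiments above can be illustrated with a short sketch. This is a hypothetical illustration only: the `ScanPattern` fields, the mode names, and the `update_scan_pattern` helper are assumptions introduced here, not part of any disclosed implementation.

```python
# Hypothetical sketch of the mode-dependent scan pattern update; field names,
# units, and mode strings are illustrative assumptions.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class ScanPattern:
    azimuth_deg: float = 0.0      # azimuthal tilt of the acoustic beam
    elevation_deg: float = 0.0    # elevational tilt of the acoustic beam
    aperture_offset: int = 0      # translation of the emitting aperture (in element columns)

def update_scan_pattern(pattern: ScanPattern, mode: str,
                        d_az: float, d_el: float, d_aperture: int) -> ScanPattern:
    """Apply the adjustment combination appropriate for the imaging mode."""
    if mode == "cardiac":
        # Cardiac mode: combined azimuthal and elevational tilt adjustment.
        return replace(pattern,
                       azimuth_deg=pattern.azimuth_deg + d_az,
                       elevation_deg=pattern.elevation_deg + d_el)
    if mode == "lung":
        # Lung mode: combined elevational tilt and aperture translation.
        return replace(pattern,
                       elevation_deg=pattern.elevation_deg + d_el,
                       aperture_offset=pattern.aperture_offset + d_aperture)
    raise ValueError(f"unknown imaging mode: {mode}")
```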
Other aspects of the disclosure will be apparent from the following description and the appended claims.
Conventional ultrasound systems are large, complex, and expensive systems that are typically only purchased by large medical facilities with significant financial resources. Recently, less expensive and less complex ultrasound imaging devices have been introduced. Such devices may include ultrasonic transducers monolithically integrated onto a single semiconductor die to form a monolithic ultrasound device. Aspects of such ultrasound-on-a-chip devices are described in U.S. Pat. Application No. 15/415,434 titled “UNIVERSAL ULTRASOUND DEVICE AND RELATED APPARATUS AND METHODS,” filed on Jan. 25, 2017 (and assigned to the assignee of the instant application), published as U.S. Pat. Pub. No. 2017/0360397 A1 and issued as U.S. Pat. No. 10,856,840 (the ‘840 patent), which is incorporated by reference herein in its entirety. The reduced cost and increased portability of these new ultrasound devices may make them significantly more accessible to the general public than conventional ultrasound devices. The inventors have recognized and appreciated that although the reduced cost and increased portability of some ultrasound imaging devices, such as those described in the ‘840 patent, makes them more accessible to the general populace, people who could make use of such devices have little to no training in how to use them. Ultrasound examinations often include the acquisition of ultrasound images that contain a view of a particular anatomical structure (e.g., an organ) of a subject. Acquisition of these ultrasound images typically requires considerable skill.
In general, embodiments of the disclosure provide a method, a non-transitory computer readable medium (CRM), and a system for assisting in the acquisition of clinically usable ultrasound images. Embodiments of the disclosure provide technology for automatically adjusting a scan pattern to obtain a desired or target view of a specified anatomy without user intervention. In this manner, the ultrasound imaging devices may be productively employed even when the user is unable to reliably position or adjust the device to get the target view on their own.
This technique may be particularly helpful, for example, for cardiac imaging because the “cardiac windows” through the ribs and lungs can be hard to find, and may vary from patient to patient. Fine probe movements are typically required to find the correct view, and novices often struggle to obtain acceptable cardiac images or lung images. However, auto-steering could also be helpful for many other clinical applications. For example, auto-steering may be used to image organs such as the lungs, carotid, kidney, aorta, thyroid, bladder, etc., thus finding the best view (in the best preset) of the anatomy. Some embodiments of the present application may utilize or benefit from use of an ultrasound probe of the types described in the ‘840 patent, referenced above. An ultrasound probe having a 2D aperture with an electronically controllable scan pattern (e.g., controllable in azimuth, in elevation, in both, or in other manners) may facilitate automatic adjustment of the scanning and beneficial capture of target images. A detailed description is subsequently provided.
The ultrasound device (102) may be configured to generate ultrasound data. The ultrasound device (102) may be configured to generate ultrasound data by, for example, emitting acoustic waves into the subject (101) and detecting the reflected acoustic waves. The detected reflected acoustic wave may be analyzed to identify various properties of the tissues through which the acoustic wave traveled, such as a density of the tissue. The ultrasound device (102) may be implemented in any of a variety of ways. For example, the ultrasound device (102) may be implemented as a handheld device (as shown in
The ultrasound device (102) may transmit ultrasound data to the processing device (104) using the communication link (112). The communication link (112) may be a wired or wireless communication link. In some embodiments, the communication link (112) may be implemented as a cable such as a Universal Serial Bus (USB) cable or a Lightning cable. In these embodiments, the cable may also be used to transfer power from the processing device (104) to the ultrasound device (102). In other embodiments, the communication link (112) may be a wireless communication link such as a BLUETOOTH, WiFi, or ZIGBEE wireless communication link.
The processing device (104) may comprise one or more processing elements (such as a processor) to, for example, process ultrasound data received from the ultrasound device (102). Additionally, the processing device (104) may comprise one or more storage elements (such as a non-transitory computer readable medium) to, for example, store instructions that may be executed by the processing element(s) and/or store all or any portion of the ultrasound data received from the ultrasound device (102). It should be appreciated that the processing device (104) may be implemented in any of a variety of ways. For example, the processing device (104) may be implemented as a mobile device (e.g., a mobile smartphone, a tablet, or a laptop) with an integrated display (106) as shown in
The one or more ultrasonic transducer arrays (152) may take on any of numerous forms, and aspects of the present technology do not necessarily require the use of any particular type or arrangement of ultrasonic transducer cells or ultrasonic transducer elements. For example, multiple ultrasonic transducer elements in the ultrasonic transducer array (152) may be arranged in one-dimension, or two-dimensions. Although the term “array” is used in this description, it should be appreciated that in some embodiments the ultrasonic transducer elements may be organized in a non-array fashion. In various embodiments, each of the ultrasonic transducer elements in the array (152) may, for example, include one or more capacitive micromachined ultrasonic transducers (CMUTs), or one or more piezoelectric micromachined ultrasonic transducers (PMUTs).
In a non-limiting example, the ultrasonic transducer array (152) may include between approximately 6,000-10,000 (e.g., 8,960) active CMUTs on the chip, forming an array of hundreds of CMUTs by tens of CMUTs (e.g., 140 × 64). The CMUT element pitch may be between 150-250 um, such as 208 um, resulting in total array dimensions of between 10-50 mm by 10-50 mm (e.g., 29.12 mm × 13.312 mm).
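The example geometry quoted above can be checked with simple arithmetic. The sketch below assumes the array dimension is simply element count times pitch; the exact aperture convention of any particular device may differ.

```python
# Checking the example array geometry: a 140 x 64 grid of CMUT elements at a
# 208 um pitch. Dimension = element count * pitch is an illustrative assumption.
ELEMENTS_AZIMUTH = 140
ELEMENTS_ELEVATION = 64
PITCH_UM = 208

total_elements = ELEMENTS_AZIMUTH * ELEMENTS_ELEVATION      # 8,960 active CMUTs
width_mm = ELEMENTS_AZIMUTH * PITCH_UM / 1000.0             # 29.12 mm
height_mm = ELEMENTS_ELEVATION * PITCH_UM / 1000.0          # 13.312 mm
```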
In some embodiments, the TX circuitry (154) may, for example, generate pulses that drive the individual elements of, or one or more groups of elements within, the ultrasonic transducer array(s) (152) so as to generate acoustic signals to be used for imaging. The RX circuitry (156), on the other hand, may receive and process electronic signals generated by the individual elements of the ultrasonic transducer array(s) (152) when acoustic signals impinge upon such elements.
With further reference to
In some embodiments, the output range of a same (or single) transducer unit in an ultrasound device may be anywhere in a range of 1-12 MHz (including the entire frequency range from 1-12 MHz), making it a universal solution, in which there is no need to change the ultrasound heads or units for different operating ranges or to image at different depths within a patient. That is, the transmit and/or receive frequency of the transducers of the ultrasonic transducer array may be selected to be any frequency or range of frequencies within the range of 1 MHz-12 MHz. The universal ultrasound device (150) described herein may thus be used for a broad range of medical imaging tasks including, but not limited to, imaging a patient’s liver, kidney, heart, bladder, thyroid, carotid artery, lower venous extremity, and performing central line placement. Multiple conventional ultrasound probes would have to be used to perform all these imaging tasks. By contrast, a single universal ultrasound device (150) may be used to perform all these tasks by operating, for each task, at a frequency range appropriate for the task, as shown in the examples of Table 1 together with corresponding depths at which the subject may be imaged.
The power management circuit (168) may be, for example, responsible for converting one or more input voltages VIN from an off-chip source into voltages needed to carry out operation of the chip, and for otherwise managing power consumption within the ultrasound device (150). In some embodiments, for example, a single voltage (e.g., 12 V, 80 V, 100 V, 120 V, etc.) may be supplied to the chip and the power management circuit (168) may step that voltage up or down, as necessary, using a charge pump circuit or via some other DC-to-DC voltage conversion mechanism. In other embodiments, multiple different voltages may be supplied separately to the power management circuit (168) for processing and/or distribution to the other on-chip components.
In the embodiment shown above, all of the illustrated elements are formed on a single semiconductor die (162). It should be appreciated, however, that in alternative embodiments one or more of the illustrated elements may be instead located off-chip, in a separate semiconductor die (162), or in a separate device. Alternatively, one or more of these components may be implemented in a DSP chip, a field programmable gate array (FPGA) in a separate chip, or a separate application specific integrated circuit (ASIC) chip. Additionally, and/or alternatively, one or more of the components in the beamformer may be implemented in the semiconductor die (162), whereas other components in the beamformer may be implemented in an external processing device in hardware or software, where the external processing device is capable of communicating with the ultrasound device (150).
In addition, although the illustrated example shows both TX circuitry (154) and RX circuitry (156), in alternative embodiments only TX circuitry or only RX circuitry may be employed. For example, such embodiments may be employed in a circumstance where one or more transmission-only devices are used to transmit acoustic signals and one or more reception-only devices are used to receive acoustic signals that have been transmitted through or reflected off of a subject being ultrasonically imaged.
It should be appreciated that communication between one or more of the illustrated components may be performed in any of numerous ways. In some embodiments, for example, one or more high-speed busses (not shown), such as that employed by a unified Northbridge, may be used to allow high-speed intra-chip communication or communication with one or more off-chip components.
In some embodiments, the ultrasonic transducer elements of the ultrasonic transducer array (152) may be formed on the same chip as the electronics of the TX circuitry (154) and/or RX circuitry (156). The ultrasonic transducer arrays (152), TX circuitry (154), and RX circuitry (156) may be, in some embodiments, integrated in a single ultrasound probe. In some embodiments, the single ultrasound probe may be a hand-held probe including, but not limited to, the hand-held probes described below with reference to
A CMUT may include, for example, a cavity formed in a CMOS wafer, with a membrane overlying the cavity, and in some embodiments sealing the cavity. Electrodes may be provided to create an ultrasonic transducer cell from the covered cavity structure. The CMOS wafer may include integrated circuitry to which the ultrasonic transducer cell may be connected. The ultrasonic transducer cell and CMOS wafer may be monolithically integrated, thus forming an integrated ultrasonic transducer cell and integrated circuit on a single substrate (the CMOS wafer).
In the example shown, one or more output ports (164) may output a high-speed serial data stream generated by one or more components of the signal conditioning/processing circuit (160). Such data streams may be, for example, generated by one or more USB 3.0 modules, and/or one or more 10GB, 40GB, or 100GB Ethernet modules, integrated on the die (162). It is appreciated that other communication protocols may be used for the output ports (164).
In some embodiments, the signal stream produced on output port (164) can be provided to a computer, tablet, or smartphone for the generation and/or display of two-dimensional, three-dimensional, and/or tomographic images. In some embodiments, the signal provided at the output port (164) may be ultrasound data provided by the one or more beamformer components or auto-correlation approximation circuitry, where the ultrasound data may be used by the computer (external to the ultrasound device) for displaying the ultrasound images. In embodiments in which image formation capabilities are incorporated in the signal conditioning/processing circuit (160), even relatively low-power devices, such as smartphones or tablets which have only a limited amount of processing power and memory available for application execution, can display images using only a serial data stream from the output port (164). As noted above, the use of on-chip analog-to-digital conversion and a high-speed serial data link to offload a digital data stream is one of the features that helps facilitate an “ultrasound on a chip” solution according to some embodiments of the technology described herein.
Devices (150) such as that shown in
Reference is now made to the processing device (204). In some embodiments, the processing device (204) may be communicatively coupled to the ultrasound device (202) (e.g., (102) in
In some embodiments, the processing device (204) may be configured to process the ultrasound data received from the ultrasound device (202) to generate ultrasound images for display on the display screen (208). The processing may be performed by, for example, the processor(s) (210). The processor(s) (210) may also be adapted to control the acquisition of ultrasound data with the ultrasound device (202). The ultrasound data may be processed in real-time during a scanning session as the echo signals are received. In some embodiments, the displayed ultrasound image may be updated at a rate of at least 5 Hz, at least 10 Hz, at least 20 Hz, at a rate between 5 and 60 Hz, or at a rate of more than 20 Hz. For example, ultrasound data may be acquired even as images are being generated based on previously acquired data and while a live ultrasound image is being displayed. As additional ultrasound data is acquired, additional frames or images generated from more-recently acquired ultrasound data are sequentially displayed. Additionally, or alternatively, the ultrasound data may be stored temporarily in a buffer during a scanning session and processed in less than real-time.
In some embodiments, the processing device (204) may be configured to perform various ultrasound operations using the processor(s) (210) (e.g., one or more computer hardware processors) and one or more articles of manufacture that include non-transitory computer-readable storage media such as the memory (212). The processor(s) (210) may control writing data to and reading data from the memory (212) in any suitable manner. To perform certain of the processes described herein, the processor(s) (210) may execute one or more processor-executable instructions stored in one or more non-transitory computer-readable storage media (e.g., the memory (212)), which may serve as non-transitory computer-readable storage media storing processor-executable instructions for execution by the processor(s) (210).
The camera (220) may be configured to detect light (e.g., visible light) to form an image. The camera (220) may be on the same face of the processing device (204) as the display screen (208). The display screen (208) may be configured to display images and/or videos, and may be, for example, a liquid crystal display (LCD), a plasma display, and/or an organic light emitting diode (OLED) display on the processing device (204). The input device (218) may include one or more devices capable of receiving input from a user and transmitting the input to the processor(s) (210). For example, the input device (218) may include a keyboard, a mouse, a microphone, and/or touch-enabled sensors on the display screen (208). The display screen (208), the input device (218), the camera (220), and/or other input/output interfaces (e.g., speaker) may be communicatively coupled to the processor(s) (210) and/or under the control of the processor (210).
It should be appreciated that the processing device (204) may be implemented in any of a variety of ways. For example, the processing device (204) may be implemented as a handheld device such as a mobile smartphone or a tablet. Thereby, a user of the ultrasound device (202) may be able to operate the ultrasound device (202) with one hand and hold the processing device (204) with another hand. In other examples, the processing device (204) may be implemented as a portable device that is not a handheld device, such as a laptop. In yet other examples, the processing device (204) may be implemented as a stationary device such as a desktop computer. The processing device (204) may be connected to the network (216) over a wired connection (e.g., via an Ethernet cable) and/or a wireless connection (e.g., over a WiFi network). The processing device (204) may thereby communicate with (e.g., transmit data to or receive data from) the one or more servers (234) over the network (216). For example, a party may provide from the server (234) to the processing device (204) processor-executable instructions for storing in one or more non-transitory computer-readable storage media (e.g., the memory (212)) which, when executed, may cause the processing device (204) to perform ultrasound processes.
Further description of ultrasound devices and systems may be found in U.S. Pat. No. 9,521,991, the content of which is incorporated by reference herein in its entirety; and U.S. Pat. No. 11,311,274, the content of which is incorporated by reference herein in its entirety.
In some embodiments, the phases Φ1, Φ2, Φ3, Φ4 ... ΦN and/or time delays τ1, τ2, τ3, τ4 ... τN may be controlled to cause the ultrasound waves to interfere with one another so that the resulting waves add together constructively to reinforce the acoustic beam in a desired direction. The phases Φ1, Φ2, Φ3, Φ4 ... ΦN and/or time delays τ1, τ2, τ3, τ4 ... τN may be controlled with respective signal drivers, which may be implemented, for example, using transistors and/or diodes arranged in a suitable configuration. In at least some of the embodiments in which the ultrasound elements are disposed on a semiconductor substrate, the signal drivers may be disposed on the same semiconductor substrate.
In some embodiments, the phases may be adjusted to produce steering within a 3D field-of-view. This may be accomplished, for example, by adjusting the azimuth and elevation of the emissions. Accordingly, the acoustic beam may be steered through an entire volume at a desired angle with respect to the direction that is normal to the aperture of the transducer array. While the above discussion is for the transmit phase, similar methods may be applied during the receive phase. Accordingly, beamforming in accordance with embodiments of the invention may involve beamforming when transmitting, when receiving, or both.
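As a rough illustration of how per-element time delays produce such steering, the sketch below computes a plane-wave delay profile for a 2D aperture. The separable delay law, the speed-of-sound value, and the function name are textbook-style assumptions, not a description of any particular device's beamformer.

```python
# Minimal plane-wave transmit-delay sketch for steering a flat 2D aperture in
# azimuth and elevation. The separable delay law below is a common textbook
# approximation; 1540 m/s is a typical soft-tissue speed of sound.
import math

SPEED_OF_SOUND = 1540.0  # m/s

def transmit_delays(pitch_m, n_az, n_el, azimuth_deg, elevation_deg):
    """Per-element delays (seconds) that tilt a plane wave by the given angles."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    delays = [[0.0] * n_el for _ in range(n_az)]
    for i in range(n_az):
        for j in range(n_el):
            x = (i - (n_az - 1) / 2) * pitch_m   # element position, array-centered
            y = (j - (n_el - 1) / 2) * pitch_m
            delays[i][j] = (x * math.sin(az) + y * math.sin(el)) / SPEED_OF_SOUND
    # Shift so the earliest-firing element has zero delay.
    t0 = min(min(row) for row in delays)
    return [[t - t0 for t in row] for row in delays]
```

With both angles set to zero the delay profile is flat, corresponding to an unsteered beam normal to the aperture.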
While not shown in
Turning to
One or more steps shown in
Turning to
In Step 504, a test is performed to determine whether a request is received to initiate an auto-steer operation. The request may be based on a user request, or it may be an ultrasound device-internal request. The user request may be the pressing of a button, e.g., on the ultrasound probe, by the user. The ultrasound device-internal request, in one or more embodiments, is based on a suitability of an ultrasound image frame for performing a diagnostic action. Quality-based metrics or other metrics may be used. The metrics may be continuously or periodically monitored in order to make the suitability determination. Specifically, for example, if the quality of the ultrasound image frame falls below a threshold, then this may trigger the auto-steer operation. In one or more embodiments, quality is at least partially governed by the view of the desired anatomy. For example, a lower quality may result from significant occlusions of parts of the desired anatomy. Higher quality may result from fewer or no occlusions of the desired anatomy. A discussion of quality is provided below, following the description of the methods. Step 504 is optional. In other words, in some embodiments, the subsequently described steps may be executed without being explicitly triggered by Step 504.
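The quality-based trigger described in Step 504 might be sketched as follows; the metric scale, the threshold value, and the function name are illustrative assumptions.

```python
# Illustrative quality-based trigger for the auto-steer request of Step 504.
# The normalized quality score and the 0.6 threshold are made-up examples.
QUALITY_THRESHOLD = 0.6  # hypothetical normalized quality score in [0, 1]

def should_auto_steer(frame_quality: float,
                      user_requested: bool = False,
                      threshold: float = QUALITY_THRESHOLD) -> bool:
    """Trigger auto-steer on a user request or when frame quality drops too low."""
    return user_requested or frame_quality < threshold
```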
If no request was received, the method may proceed with another execution of Step 502. Accordingly, another ultrasound image frame may be obtained using the initial scan pattern.
If a request was received, a search mode or auto-steer operation is activated. Briefly summarized, the auto-steer operation, performed by executing subsequently described Steps 506-510, adjusts the scan pattern to be used to capture the ultrasound images, with the goal of identifying a scan pattern that provides better quality ultrasound image frames (or, more broadly, ultrasound images with a higher suitability for the purpose of performing a diagnostic action). For example, in one embodiment in which a scan pattern sequence defines a number of scan patterns, e.g., elevational and azimuthal steering angles for the acoustic beam, execution of Steps 506-510 may result in variation of the elevational and azimuthal steering angles. An updated scan pattern based on a combination of elevational and azimuthal steering angles may then be used to capture ultrasound image frames after completion of the auto-steer operations. Examples and a description of sequences of scan patterns are provided below, following the description of the methods. The user may be instructed to hold the ultrasound probe still while different steering angles are explored during the auto-steer operations.
In Step 506, triggered by the request, a search scan pattern is determined based on one or more of the subsequently discussed sequences of scan patterns. Specifically, for example, the search scan pattern may be set to include exploratory steering angles and/or positions for the acoustic beam, specified in a scan pattern sequence. A scan pattern sequence may include many definitions of scan patterns in the form of combinations of steering angles and/or positions of the aperture, and by repeatedly executing Steps 506-510, the method may step through some or all of these combinations.
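A scan pattern sequence of the kind described in Step 506 could be sketched as a generator over combinations of exploratory angles; the particular angle grid and field names below are made-up examples.

```python
# Hedged sketch of stepping through a scan pattern sequence (Step 506).
# The angle grid and dictionary keys are illustrative, not a disclosed sequence.
from itertools import product

def scan_pattern_sequence(elevation_angles, azimuth_angles):
    """Yield exploratory (elevation, azimuth) combinations, one per search frame."""
    for el, az in product(elevation_angles, azimuth_angles):
        yield {"elevation_deg": el, "azimuth_deg": az}

# Example: a 3 x 3 grid of exploratory steering angles.
sequence = list(scan_pattern_sequence([-10, 0, 10], [-10, 0, 10]))
```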
In Step 508, a search image frame is obtained. The search scan pattern is used to capture the search image frame. The ultrasound probe may have remained stationary or near-stationary between the execution of Steps 502 and 508.
In Step 510, a decision is made whether the auto-steer operation should be terminated. Different criteria may be used to make the decision. For example, the auto-steer operation may be terminated once a predetermined sequence of exploratory steering angles, specified in a scan pattern sequence, has been used to obtain search image frames with the corresponding search scan pattern. Additionally or alternatively, the auto-steer operation may be terminated once a search image frame obtained using a particular search scan pattern provides sufficient image quality, e.g., meeting specified quality standards or being higher quality than the ultrasound image frame obtained in Step 502. More generally, the auto-steer operation may be terminated when a search image frame is deemed sufficiently suitable for performing a diagnostic action.
If the updating of the search scan pattern is to be continued, the execution of the method may proceed with Step 506 to pick another scan pattern from the scan pattern sequence. If the updating of the search scan pattern is to be terminated, the method may proceed with Step 512.
In Step 512, an ultrasound image frame (or any number of ultrasound image frames) is obtained. An updated scan pattern is used for capturing the ultrasound image frame. The updated scan pattern may be set based on the search scan pattern that produced a highest suitability for the purpose of performing a diagnostic action, a highest quality, a quality above a threshold, etc., during the execution of Steps 506-510. The execution of Step 512 may continue for a prolonged time, for example, until the user decides that a sufficient number of ultrasound image frames has been collected. Eventually, the execution may stop, or alternatively, the execution of the method may repeat, for example, after the ultrasound probe has been moved to a different location.
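Taken together, Steps 506-512 can be sketched as a search loop; `capture_frame` and `frame_quality` are hypothetical stand-ins for device capture and quality estimation.

```python
# End-to-end sketch of the search loop of Steps 506-512: try each search scan
# pattern, score the resulting frame, and keep the best-scoring pattern.
# capture_frame and frame_quality are hypothetical callables supplied by the caller.
def auto_steer(patterns, capture_frame, frame_quality, good_enough=None):
    """Return (best_pattern, best_quality) after exploring the sequence."""
    best_pattern, best_quality = None, float("-inf")
    for pattern in patterns:                      # Step 506: next search scan pattern
        frame = capture_frame(pattern)            # Step 508: obtain search image frame
        quality = frame_quality(frame)
        if quality > best_quality:
            best_pattern, best_quality = pattern, quality
        # Step 510: optionally terminate early once quality is sufficient.
        if good_enough is not None and quality >= good_enough:
            break
    return best_pattern, best_quality             # Step 512 images with best_pattern
```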
Turning to
The following steps for an auto-steer operation may be performed based on a request, or without being explicitly triggered by a request. The request may be based on a user request, or it may be an ultrasound device-internal request. The user request may be the pressing of a button, e.g., on the ultrasound probe, by the user. The ultrasound device-internal request, in one or more embodiments, is suitability-based. In other words, if the suitability of the ultrasound image frame for the purpose of performing a diagnostic action is too low, an auto-steer operation may be triggered. Quality-based metrics or other metrics may be used. Specifically, for example, if the quality of the ultrasound image frame falls below a threshold, then this may trigger the auto-steer operation. In one or more embodiments, quality is at least partially governed by the view of the desired anatomy. For example, a lower quality may result from significant occlusions of parts of the desired anatomy. Higher quality may result from fewer or no occlusions of the desired anatomy. A discussion of quality is provided below, following the description of the methods.
Briefly summarized, the auto-steer operation, performed by executing subsequently described Steps 554-558, adjusts the scan pattern to be used to capture the ultrasound images, with the goal of identifying a scan pattern that provides better quality ultrasound image frames (or, more generally, a higher suitability of the ultrasound image frames for the purpose of performing a diagnostic action). In the embodiment as shown, execution of Steps 554-558 may result in an incremental updating of the scan pattern by interleaving search image frames between the ultrasound image frames. The user may be instructed to hold the ultrasound probe still while different steering angles are explored during the auto-steer operations.
For example, the execution of the method may begin with an imaging at elevational angle=0 and an azimuthal angle=0, but with search image frames added between the ultrasound image frames, the search image frames with a different angle every N (e.g., 3 or 4) frames, so as not to degrade the frame rate too much. The search image frames may be hidden from the user (i.e., they may not be displayed), but their quality may be computed. If a search image frame produces a better quality than the previously obtained ultrasound image frame, one of two things may occur. Either the user may be notified (visually, audibly, or otherwise) that a more suitable scan pattern was identified and asked whether they would like to switch to that scan pattern, or the ultrasound device may automatically update the scan pattern to produce better quality ultrasound image frames. Searching may continue, interleaving search frames at other angles. These operations are performed by repetitive execution of Steps 552-558.
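The interleaving scheme described above might be sketched as follows, with a hidden search frame occupying every Nth acquisition slot; all names and the quality model are illustrative assumptions.

```python
# Sketch of interleaved auto-steer (Steps 552-558): every Nth slot is a hidden
# search frame with an exploratory pattern; when a search frame scores better
# than the live imaging pattern, the imaging pattern is updated in place.
# capture_frame and frame_quality are hypothetical stand-ins.
def interleaved_auto_steer(search_patterns, n_frames, every_n,
                           capture_frame, frame_quality, initial_pattern):
    pattern = initial_pattern
    search_iter = iter(search_patterns)
    live_quality = frame_quality(capture_frame(pattern))
    for i in range(n_frames):
        if i % every_n == every_n - 1:            # every Nth slot: hidden search frame
            candidate = next(search_iter, None)
            if candidate is None:
                continue                          # scan pattern sequence exhausted
            q = frame_quality(capture_frame(candidate))
            if q > live_quality:                  # Step 556: better quality?
                pattern, live_quality = candidate, q   # Step 558: update scan pattern
        else:
            # Displayed imaging frame, captured with the current scan pattern.
            live_quality = frame_quality(capture_frame(pattern))
    return pattern
```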
In Step 554, triggered by the request, a search image frame is obtained using a search scan pattern. The search scan pattern may be determined by stepping through a scan pattern sequence that defines exploratory steering angles and/or positions for the acoustic beam, as previously discussed. Based on the selected search scan pattern, a search image frame is obtained. By repeated execution of Step 554 in a loop, many combinations of steering angles and/or positions of the aperture may be used to obtain search image frames.
In Step 556, a test is performed to determine whether the quality of the search image frame is better than the quality of the previously obtained ultrasound image frame. A comparison may be performed based on quality metrics, as further discussed below. If the quality of the search image frame is not better, then the execution of the method may proceed with Step 552 to repeat obtaining an ultrasound image frame, without updating the scan pattern, followed by producing a different search image frame using a different search scan pattern. If the quality of the search image frame is better, then the execution of the method may proceed with Step 558.
In Step 558, the scan pattern is updated based on the search scan pattern that resulted in the search image frame with the quality better than the quality of the ultrasound image frame. For example, the configuration for the initial scan pattern is replaced with the configuration of the search scan pattern. Subsequently, when returning to Step 552, a new ultrasound image frame may be obtained using the updated scan pattern.
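The loop of Steps 552-558 can be sketched in a few lines. The `acquire` and `quality` callables below are hypothetical stand-ins for the probe interface and image-quality metric; the sketch only illustrates the control flow of acquiring, comparing, and conditionally updating the scan pattern.

```python
def auto_steer(acquire, quality, pattern, search_patterns):
    """Sketch of the Steps 552-558 loop: acquire an imaging frame with the
    current pattern, acquire a search frame with the next exploratory
    pattern, and adopt the search pattern whenever its frame scores
    better. `acquire` and `quality` are hypothetical callables."""
    for search in search_patterns:
        image_frame = acquire(pattern)                    # Step 552: imaging frame
        search_frame = acquire(search)                    # Step 554: hidden search frame
        if quality(search_frame) > quality(image_frame):  # Step 556: compare quality
            pattern = search                              # Step 558: update scan pattern
    return pattern
```

Because the update only occurs when the search frame scores strictly better, the returned pattern is at least as good (under the chosen metric) as the starting one.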
Scan patterns establish possible combinations of scan parameters to be used for beam steering when executing the methods (500, 550). The following description provides definitions of scan patterns.
Any combination of the above definitions (and/or any definitions of translations, tilts, rotations, etc. of the acoustic beam) may be arranged in a sequence to form a scan pattern sequence, as further discussed below.
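One possible (hypothetical) encoding of such definitions is a small record per scan pattern, with a scan pattern sequence being simply an ordered list of such records. The field names below are illustrative assumptions, not terms from the disclosure.

```python
from dataclasses import dataclass

# Hypothetical encoding of one scan pattern: any combination of tilts,
# a translation of the emitting aperture, and a rotation of the acoustic
# beam about the probe axis may be specified.
@dataclass(frozen=True)
class ScanPattern:
    azimuth_tilt_deg: float = 0.0      # tilt in the azimuthal plane
    elevation_tilt_deg: float = 0.0    # tilt in the elevational plane
    aperture_shift_elems: int = 0      # translation of the emitting aperture
    rotation_deg: float = 0.0          # rotation about the probe axis

# A scan pattern sequence is an ordered list of such patterns, e.g. a
# small elevational sweep centered on the neutral beam.
elevation_sweep = [ScanPattern(elevation_tilt_deg=e) for e in (-10, -5, 0, 5, 10)]
```

A sequence like `elevation_sweep` could then be stepped through one entry per search frame, as in the method (550).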
Non-limiting examples of such image frame sequences are shown below in Tables 1 and 2. These tables present mini-sequences of imaging frames with interleaved search frames, e.g., as used by the method (550).
The scan pattern sequence in Table 2 differs from that in Table 3 in that the sequence of Table 2 adjusts the azimuthal angle only, while the sequence in Table 3 adjusts both azimuth and elevation angles. The results of the scan sequences illustrated in Tables 2 and 3 may be used to build a multi-dimensional numerical optimizer. Such an optimizer may be used to find an optimal imaging angle more efficiently.
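As a non-authoritative illustration of such a multi-dimensional optimizer, the sketch below runs a simple coordinate-descent search over (azimuth, elevation). The `score` callable is a hypothetical stand-in for the quality metric evaluated on interleaved search frames; the step size halves whenever no neighboring angle pair improves the score.

```python
def coordinate_descent(score, az=0.0, el=0.0, step=8.0, min_step=1.0):
    """Sketch of a two-dimensional optimizer over (azimuth, elevation)
    steering angles. `score` is a hypothetical callable returning the
    image-quality metric at a given angle pair; the step size halves
    whenever no neighbor improves on the current best."""
    best = score(az, el)
    while step >= min_step:
        improved = False
        for daz, del_ in ((step, 0), (-step, 0), (0, step), (0, -step)):
            q = score(az + daz, el + del_)
            if q > best:
                az, el, best = az + daz, el + del_, q
                improved = True
        if not improved:
            step /= 2.0   # refine the search around the current optimum
    return az, el
```

In practice each `score` evaluation would cost one hidden search frame, so this converges with far fewer frames than exhaustively sweeping both angles.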
The stepping through a scan pattern sequence when performing an auto-steer operation as described in reference to
While the described stepping through scan patterns in a scan pattern sequence is based on examples involving azimuth and elevation, this may be generalized to any type of scan patterns that may include elevation and/or azimuth angle changes, translations in elevation and/or azimuth directions, rotations about the ultrasound probe axis, etc.
In one or more embodiments, one or more ultrasound image frames/search image frames are scored to decide on suitability. Suitability may be governed by the diagnostic action being performed. In other words, a suitable ultrasound image frame/search image frame is likely or sufficiently likely to be useful for the diagnostic action, whereas a non-suitable ultrasound image frame/search image frame is not sufficiently likely to be useful for the diagnostic action. In one or more embodiments, suitability is at least partially governed by the view of the desired anatomy in the ultrasound image frame/search image frame. For example, a lower suitability may result from significant occlusions of parts of the desired anatomy. Higher suitability may result from fewer or no occlusions of the desired anatomy. A threshold may be applied to distinguish good/acceptable suitability from bad suitability. Examples for specific applications are provided below in reference to
In one embodiment, a quality of the ultrasound image frame / search image frame is evaluated to assess suitability. Various methods may be used to assess quality:
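As one purely illustrative example of a quality metric (not specified by the disclosure, which leaves the choice of method open), a simple gradient-based sharpness score is sketched below. Real systems might instead use learned models or view-classification confidence scores.

```python
import numpy as np

def frame_quality(frame):
    """One illustrative quality metric: mean gradient magnitude of the
    B-mode image, which tends to be higher when structures are in focus
    and unoccluded. This is an assumption for illustration only; many
    other metrics (e.g., learned scores) may be used instead."""
    img = np.asarray(frame, dtype=float)
    gy, gx = np.gradient(img)               # per-axis intensity gradients
    return float(np.mean(np.hypot(gx, gy))) # mean gradient magnitude
```

Such a scalar score is all the Step 556 comparison requires: two frames go in, and the one with the higher score wins.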
The above description assumes that the ultrasound probe is kept stationary or near-stationary during the execution of the method steps. However, even with the described capability to adjust the scan pattern to obtain an acceptable ultrasound image frame with the desired anatomy without user intervention, there may be cases where the ultrasound probe must be physically moved on the body to acquire the desired view. In a “cardiac” mode, for instance, the user might position the probe near the heart, then attempt to scan for the desired view. If the desired view is not accessible from the current probe position, the system, in one embodiment, alerts the user to “move the probe towards the patient’s head” via text instructions and/or graphical depictions. The system may then continue automatic scanning for the desired view, and after several probe movements, the desired view may be accessible with an appropriate scan pattern.
Further, embodiments of the disclosure address concerns regarding lung interference. For example, when operating in a cardiac scanning mode, the system may detect lung interference and alert the user to instruct the patient to breathe out and hold, which may move the lung out of the beam path.
Also, embodiments of the disclosure provide guidance to the user to perform more complex imaging operations. For example, in FAST (focused assessment with sonography in trauma) imaging, an ultrasound study that includes views of the heart, lungs, Morison’s pouch, etc. is conducted to search for free fluid indicating internal bleeding. In one or more embodiments, the ultrasound device instructs the user to move the probe to the patient’s right upper quadrant, followed by an automatic adjustment of the scan pattern to search for free fluid. Then, instructions may be provided to move the probe to near the patient’s heart, followed by an automatic adjustment of the scan pattern to search for free fluid, etc. In this scenario, the scan pattern may be adjusted to search for free fluid in 3-dimensional space via elevational sweeping, instead of merely trying to identify a single view of the desired anatomy.
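The FAST workflow described above can be sketched as nested loops: for each instructed probe position, sweep the beam elevationally to interrogate a 3-D volume. `steer_and_scan` is a hypothetical callable standing in for acquiring a frame at the given elevation and running free-fluid detection on it.

```python
def fast_exam_sweep(positions, steer_and_scan, elevations=range(-20, 21, 5)):
    """Sketch of the FAST workflow: for each instructed probe position,
    sweep the beam elevationally to search a 3-D volume for free fluid.
    `steer_and_scan(position, elevation)` is a hypothetical callable that
    acquires a frame and returns True when free fluid is detected."""
    findings = []
    for position in positions:        # e.g. "right upper quadrant", "cardiac"
        for elevation in elevations:  # elevational sweep through the volume
            if steer_and_scan(position, elevation):
                findings.append((position, elevation))
    return findings
```

The returned list of (position, elevation) pairs localizes any detections, which could then drive the user guidance described above.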
Embodiments of the disclosure may be implemented in different configurations. As previously described, a host device (e.g., a smartphone or a tablet computing device) may communicate with an AI model in the cloud. Alternatively, the AI model may be run locally on the ultrasound probe in an Edge AI configuration. In such a configuration, the algorithms may utilize the data generated by the devices themselves, and the devices may make independent decisions in a matter of milliseconds without having to connect to the Internet/cloud.
One or more embodiments of the disclosure support fast sequence changes.
Adjusting the scan pattern may require full re-computation of sensor delays and gains, as well as beamforming parameters used to reconstruct data along the new plane. This operation may require a non-negligible amount of time, e.g. 100-500 ms. Such a delay may introduce a lag while scanning. This lag may be undesirable or even unacceptable, especially if the search frames are interleaved with the imaging frames. To avoid this lag, ultrasound devices in accordance with embodiments of the disclosure may compute the new imaging parameters with a different processor than the one currently used for imaging. Alternatively, ultrasound devices in accordance with embodiments of the disclosure may pre-generate a “complete” sequence that contains many different scan patterns, allowing the host to quickly switch between them by pointing the system to different points in memory.
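The pre-generated "complete" sequence approach can be sketched as a one-time table build followed by constant-time lookups. `compute_delays` below is a hypothetical stand-in for the expensive 100-500 ms recomputation of delays, gains, and beamforming parameters; after precomputation, switching patterns amounts to pointing at a different entry.

```python
def precompute_sequence(patterns, compute_delays):
    """Build the complete sequence once: each candidate scan pattern maps
    to its precomputed imaging parameters. `compute_delays` stands in for
    the expensive delay/gain/beamforming computation."""
    return {pattern: compute_delays(pattern) for pattern in patterns}

def switch_pattern(sequence, pattern):
    """Fast switch: an O(1) lookup into the precomputed sequence, with no
    recomputation and hence no scanning lag."""
    return sequence[pattern]
```

The same idea underlies the "different points in memory" formulation above: the host selects among precomputed parameter sets rather than regenerating them on each switch.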
The following paragraphs illustrate non-limiting examples of applications of embodiments of the disclosure. Lung imaging and cardiac imaging are discussed. As the examples illustrate, in lung imaging, elevational steering may ensure that the imaging plane is perpendicular to the lung pleura, whereas in cardiac imaging, it may be used to get a better window through the ribs. Accordingly, while both cardiac and lung imaging use elevational autosteering, they do so for different purposes.
As explained above, aspects of the present application provide for automatically adjusting the scan pattern of an ultrasound imaging device, such as an ultrasound probe or patch, to obtain a desired or target view of the specified anatomy without user intervention, with such operation being beneficial in the context of cardiac imaging. While the technology described herein may be applied to imaging of various anatomical features, cardiac imaging presents particular challenges because the “cardiac windows” through the ribs and lungs can be hard to find, and may vary from patient to patient. Fine probe movements are typically required to find the correct view with conventional ultrasound probes, and thus novices often struggle to obtain acceptable cardiac images. That is, successful cardiac imaging often involves making fine adjustments in the position and angle of the ultrasound probe, and a suitable position and angle often depends on the anatomy, position, and breath state of the particular subject being imaged. Thus, application of the present technology is particularly beneficial for at least some cardiac imaging scenarios.
The probe is then tilted relative to the subject, such that the probe face is directed more downwardly toward the subject’s feet. The system detects that, absent adjusting the elevational steering angle, the resulting image quality drops below a threshold and therefore performs a steering angle sweep to identify the steering angle producing the best image quality for the given probe tilt.
Although
More generally, the described operations, including the operations of the methods described in
Although the disclosure has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this disclosure, will appreciate that various other embodiments may be devised without departing from the scope of the present disclosure. Accordingly, the scope of the disclosure should be limited only by the attached claims.
This application claims the benefit of priority under 35 U.S.C. § 119(e) to U.S. Provisional Pat. Application Serial No. 63/278,981, filed on Nov. 12, 2021 and U.S. Provisional Pat. Application Serial No. 63/290,243, filed on Dec. 16, 2021, which are hereby incorporated by reference herein in their entireties.
Number | Date | Country
---|---|---
63290243 | Dec 2021 | US
63278981 | Nov 2021 | US