The disclosure relates generally to the field of medical diagnostic ultrasound systems and methods and more particularly to apparatus and methods that provide ultrafast imaging.
Ultrasound imaging systems/methods are well known. See for example U.S. Pat. No. 6,705,995 (Poland) and U.S. Pat. No. 5,370,120 (Oppelt). All of the above-identified references are incorporated herein by reference in their entirety.
Conventional ultrasound imaging apparatus can have one or more transducers, transmit and receive beamformers, and various processing and display components used for generating and presenting the acquired images. A transmit beamformer supplies electrical waveform signals to the transducer arrays on the hand-held probe, which, in turn, generate the associated ultrasonic signals. Objects in the path of the transducer signals scatter ultrasound energy back to the transducers, which then generate receive electrical signals. The receive electrical signals are delayed by selected times specific to each transducer, so that ultrasonic energy scattered from selected regions adds coherently, while ultrasonic energy from other regions has no perceptible impact. Array processing techniques used for generating and processing received signals in this way are termed “beamforming” and are well known to those in the ultrasound imaging field.
Today's more advanced ultrasound systems can offer the sonographer a number of imaging options for obtaining image content that is suitable for patient assessment and diagnostics. Among options selectable by the operator or pre-programmed by the system designer are different types of beamforming algorithms. Selection of the beamforming sequence that best meets the requirements for a particular exam can be based on a number of factors, including imaging frame rate, relative noise levels, and various imaging characteristics.
Ultrasound imaging systems have typically used a conventional method of beamforming referred to in the literature as serial line-by-line imaging. In such a conventional system, imaging is usually performed by sequential insonification of the medium using focused beams. Each focused beam allows the reconstruction of one image line or a few lines (for example, up to 16 in multi-line imaging). A typical 2D image is made up of tens to a few hundred lines (64 to 512). The frame rate of the imaging mode is set by the time required to transmit a beam, receive and process the backscattered echoes from the medium, and repeat this for all the lines of the image. A number of clinical imaging modes have been developed on this architecture, and it remains the mainstay of imaging today. In these systems, the image formation approach implemented is commonly referred to as Delay and Sum (DAS) or delay/sum beamforming.
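By way of a non-limiting illustration, the following Python sketch outlines delay/sum beamforming for a single image line: per-element time-of-flight delays are computed for each depth along the line, and the delayed channel samples are summed coherently. The array geometry, sampling rate, and function names are illustrative assumptions only; a practical implementation would also apply apodization, interpolation, and gain compensation.

```python
# Minimal delay-and-sum (DAS) sketch for one image line (illustrative only).
import numpy as np

def das_line(channel_data, elem_x, depths, x_line, c=1540.0, fs=40e6):
    """Beamform one image line at lateral position x_line.

    channel_data : (n_elements, n_samples) RF data from one focused transmit
    elem_x       : (n_elements,) lateral element positions [m]
    depths       : (n_points,) axial sample positions along the line [m]
    """
    n_samples = channel_data.shape[1]
    line = np.zeros(len(depths))
    for i, z in enumerate(depths):
        t_tx = z / c                                         # transmit travel time to depth z
        t_rx = np.sqrt((elem_x - x_line) ** 2 + z ** 2) / c  # return time to each element
        idx = np.round((t_tx + t_rx) * fs).astype(int)       # per-element sample index
        valid = idx < n_samples
        line[i] = channel_data[valid, idx[valid]].sum()      # coherent (delay/sum) addition
    return line
```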
With advances in computing power, another ultrasound beamforming architecture has emerged, often referred to as Ultrafast imaging. An introduction and background of this ultrasound imaging technology is provided, for example, in M. Tanter and M. Fink, “Ultrafast Imaging in Biomedical Ultrasound,” IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, 61(1):102-119, January 2014, incorporated herein in its entirety.
In Ultrafast imaging, instead of forming an image line-by-line as in conventional imaging, the entire image can be formed from a limited number of transmitted beams that use plane or divergent waves. In such a system, the image frame rate is no longer limited by the number of lines reconstructed using focused waves, but by the time of flight taken for a single plane wave pulse to propagate through the medium and return to the transducer.
With an ultrafast imaging architecture, a variety of new transmit excitation schemes are used. These include the use of plane and diverging beams that insonify a large region of the tissue. On the receive side, to achieve ultrafast imaging, systems benefit from the ability to capture and store the raw RF channel data (i.e. data from each transducer element) to reconstruct the image pixel by pixel over the entire field of view. The channel data is processed by high-performance hardware such as GPUs (Graphics Processing Units), FPGAs (Field Programmable Gate Arrays), or DSPs (Digital Signal Processors), or a combination of these, to implement novel image reconstruction algorithms, typically referred to as pixel-based beamforming. Since no transmit beamforming is applied with the plane or diverging beam transmit sequences, image and contrast resolution suffer if the image is formed from a single transmit firing. To overcome this limitation, overlapping transmit beams are used that insonify a given position from multiple directions. As a result, dynamic transmit focusing can be achieved, wherein every location in the image is in focus, compared to one or a few locations in a conventional static transmit focus imaging paradigm.
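As a sketch of pixel-based beamforming under the ultrafast paradigm, the following Python fragment reconstructs every pixel of a 2D grid from the channel data of a single plane-wave transmission at normal incidence. The geometry, sampling rate, and names are assumptions for illustration; they do not describe any particular product architecture.

```python
# Pixel-based beamforming sketch for a single plane-wave transmit
# (normal incidence assumed); the whole field of view is reconstructed
# from one receive event.
import numpy as np

def planewave_image(channel_data, elem_x, grid_x, grid_z, c=1540.0, fs=40e6):
    """channel_data: (n_elements, n_samples); grid_x, grid_z: 1-D pixel axes [m]."""
    n_elem, n_samp = channel_data.shape
    img = np.zeros((len(grid_z), len(grid_x)))
    for iz, z in enumerate(grid_z):
        for ix, x in enumerate(grid_x):
            t_tx = z / c                                    # plane wave reaches depth z
            t_rx = np.sqrt((elem_x - x) ** 2 + z ** 2) / c  # echo back to each element
            idx = np.round((t_tx + t_rx) * fs).astype(int)
            valid = idx < n_samp
            img[iz, ix] = channel_data[valid, idx[valid]].sum()
    return img
```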
There are benefits and challenges with implementing ultrafast imaging on ultrasound systems, particularly in a commercial environment. At least one benefit of ultrafast imaging is substantially higher frame rates without compromising imaging resolution, along with the ability to implement quantitative and parametric imaging modes that are not possible without access to the raw channel data. On the other hand, transmit paradigms typically used in ultrafast imaging may not be well suited for some imaging modes. One example is harmonic imaging. First, the intensity of harmonic signals, particularly tissue harmonic signals, is very low relative to fundamental frequency signals, because harmonic generation is proportional to the square of the acoustic pressure at the target. Since the transmit beams (such as plane wave and diverging beams) used to best exploit the ultrafast imaging architecture are not as narrowly focused as conventional transmitted signals, the pressure of the acoustic energy field developed is far below that present at the focus of a standard focused transmit beam, and may even be insufficient to produce diagnostically useful harmonic images.
In clinical ultrasound systems, Harmonic imaging is typically used as part of multi-mode or duplex imaging sequences, where it is combined with other mode sequences (e.g. B-mode fundamental, Color, M-mode). With known methods, the entire duplex sequence, including the harmonic mode, is performed using a single beamforming approach, whereas it might be beneficial to perform duplex scanning using a combination of approaches (e.g. using ultrafast imaging for the non-harmonic mode while relying on DAS for harmonic imaging).
It can be appreciated that there can be significant value in providing ultrasound solutions that take advantage of multiple beamforming modes, while allowing flexibility and control for mode selection to the operator.
An object of the present disclosure is to advance the art of ultrasound imaging and overall system operation.
These objects are given only by way of illustrative example, and such objects may be exemplary of one or more embodiments of the invention. Other desirable objectives and advantages inherently achieved may occur or become apparent to those skilled in the art. The invention is defined by the appended claims.
According to one aspect of the disclosure, there is provided a method for ultrasound imaging comprising: generating an interleaved ultrasound beam pattern that alternates transmission between a focused ultrasound signal and a plane wave ultrasound signal during a scan; directing reflected signal data from the focused ultrasound signal to a first signal processing path that executes delay/sum processing and generates successive lines of image data; directing reflected signal data from the plane wave ultrasound signal to a second signal processing path that generates a full plane of image data; synchronizing pixel location and timing for data from the first signal processing path to the second signal processing path; and displaying, storing, or transmitting the combined, synchronized image data from the first and second signal processing paths.
The foregoing and other objects, features, and advantages of the invention will be apparent from the following more particular description of the embodiments of the invention, as illustrated in the accompanying drawings. The elements of the drawings are not necessarily to scale relative to each other.
The following is a detailed description of the preferred embodiments, reference being made to the drawings in which the same reference numerals identify the same elements of structure in each of the several figures.
Medical ultrasound (also known as diagnostic sonography or ultrasonography) is a diagnostic imaging technique based on the application of ultrasound, used to display internal body structures such as tendons, muscles, joints, vessels and internal organs.
Reference is hereby made to US Patent Application Publication No. 2015/0141821 by Yoshikawa et al. entitled “Ultrasonic Diagnostic Apparatus and Elastic Evaluation Method” incorporated herein by reference.
Ultrasound is sound wave energy with frequencies higher than those audible to the human ear. Ultrasonic images, also known as sonograms, are made by directing pulses of ultrasound into tissue using a probe. The sound echoes off the tissue, with different tissues reflecting sound to varying degrees. These echoes are recorded and displayed as an image to the operator.
Different types of images can be formed using sonographic instruments. The most well-known type is a B-mode image, which displays the acoustic impedance of a two-dimensional cross-section of tissue. Other types of images can display blood flow, motion of tissue over time, the location of blood, the presence of specific materials, the stiffness of tissue, or the anatomy of a three-dimensional region.
Accordingly, the system of
Clinical modes of ultrasound used in medical imaging include the following:
A-mode: A-mode (amplitude mode) is the simplest type of ultrasound. A single transducer scans a line through the body with the echoes plotted on screen as a function of depth. Therapeutic ultrasound aimed at a specific tumor or calculus also uses A-mode emission to allow for pinpoint accurate focus of the destructive wave energy.
B-mode or 2D mode: In B-mode (brightness mode) ultrasound, a linear array of transducers simultaneously scans a plane through the body that can be viewed as a two-dimensional image on screen. Sometimes referred to as 2D mode, this mode is effective for showing positional and dimensional characteristics of internal structures and is generally the starting point for exam types that use other modes.
C-mode: A C-mode image is formed in a plane normal to a B-mode image. A gate that selects data from a specific depth from an A-mode line is used; the transducer is moved in the 2D plane to sample the entire region at this fixed depth. When the transducer traverses the area in a spiral, an area of 100 cm² can be scanned in around 10 seconds.
M-mode: In M-mode (motion mode) ultrasound, pulses are emitted in quick succession. With each pulse, either an A-mode or B-mode image is acquired. Over time, M-mode imaging is analogous to recording a video in ultrasound. As the organ boundaries that produce reflections move relative to the probe, this mode can be used to determine the velocity of specific organ structures.
Doppler mode: This mode makes use of the Doppler effect in measuring and visualizing blood flow.
Color Doppler: Velocity information is presented as a color-coded overlay on top of a B-mode image. This mode is sometimes referred to as Color Flow or color mode.
Continuous Doppler: Doppler information is sampled along a line through the body, and all velocities detected at each point in time are presented (on a time line).
Pulsed wave (PW) Doppler: Doppler information is sampled from only a small sample volume (defined in 2D image), and presented on a timeline.
Duplex: a common name for the simultaneous presentation of 2D and (usually) PW Doppler information. (Using modern ultrasound machines, color Doppler is almost always also used; hence the alternative name Triplex.).
Pulse inversion mode: In this mode, two successive pulses with opposite sign are emitted and the resulting echoes are summed. Any linearly responding constituent then cancels, while gases with non-linear compressibility stand out. Pulse inversion may also be used in a similar manner as in Harmonic mode (see the illustrative sketch following this list of modes).
Harmonic mode: In this mode a deep penetrating fundamental frequency is emitted into the body and a harmonic overtone is detected. With this method, noise and artifacts due to reverberation and aberration are greatly reduced. Some also believe that penetration depth can be gained with improved lateral resolution; however, this is not well documented.
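The pulse-inversion principle noted above can be sketched as follows; the toy nonlinearity, pulse parameters, and names are illustrative assumptions only, not a model of actual tissue response.

```python
# Pulse-inversion sketch: echoes from a pulse and its inverted copy are
# summed, cancelling the linear response while retaining even-harmonic content.
import numpy as np

fs = 40e6
t = np.arange(0, 4e-6, 1 / fs)
f0 = 2e6
tx = np.sin(2 * np.pi * f0 * t) * np.hanning(t.size)   # windowed transmit pulse

def toy_echo(p):
    """Hypothetical scatterer response: linear term plus a small quadratic term."""
    return p + 0.1 * p ** 2

echo_pos = toy_echo(tx)           # echo from the pulse
echo_neg = toy_echo(-tx)          # echo from the inverted pulse
pi_signal = echo_pos + echo_neg   # linear terms cancel; 0.2 * tx**2 (2nd harmonic) remains
```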
A Sonographer, ultrasonographer, clinician, practitioner, or other clinical user is a healthcare professional (often a radiographer, but may be any healthcare professional with the appropriate training) who specializes in the use of ultrasonic imaging devices to produce diagnostic images, scans, videos, or 3D volumes of anatomy and diagnostic data.
Harmonic imaging relies on imaging the harmonic signal components that are generated, due to the non-linear properties of tissue, as the incident acoustic wave propagates through the tissue. The higher the acoustic pressure in the tissue, the more pronounced the harmonic signal content. Since conventional beamforming types typically rely on narrowly focused transmit signals, they are inherently better suited for harmonic imaging than plane wave and diverging beam approaches. However, there can be a number of instances in which plane wave and diverging beam beamforming types are advantageous. To help improve system operability, embodiments of the present disclosure address the problem of beamforming selection by beginning a scan using a first beamforming type, analyzing the data generated under the first beamforming paradigm as this data is generated, and dynamically determining whether or not to switch to another beamforming type. A novel aspect of the present disclosure is a proposed method wherein the optimal beamforming type for the specific clinical environment (considering factors such as anatomy, body habitus, and ultrasound system capability) is automatically selected, or is suggested to the operator, in order to provide a useful clinical outcome.
Diverging beam and plane wave beamforming allow high temporal resolution color flow data to be acquired. The same data, saved over a short time period, can be used to create a Spectral Doppler waveform. Advantageously, the acquired data is available over the entire image rather than at only a single location.
It is understood that the different ultrasound beamforming types and associated imaging algorithms have advantages as well as limitations. For example, diverging beam and plane wave beamforming signals used for ultrafast ultrasound processing provide high frame rates, but do not, in conventional implementations, provide focus for the transmitted signal, which can result in poorer spatial and contrast resolution. High frame rates, such as several hundred frames per second (FPS), are achievable with diverging and plane wave beamforming signal sequences, because these sequences allow reconstruction over larger regions with a relatively small number of signal emissions, such as a single transmission or a set of plane wave transmissions, each at a different angle with relation to the imaged tissue. In the context of the present disclosure, the phrase “diverging beam/plane wave” is used to represent emission using either divergent or plane beamforming types, since there is often no formal distinction needed between the types. This beamforming type is distinguished from conventional, serial delay/sum beamforming and processing.
Recent developments in beam transmission and signal processing have provided methods to overcome some of the shortcomings of plane wave and divergent beam imaging. Coherent plane wave compounding has been introduced, for example, wherein plane waves are transmitted into tissue from multiple overlapping directions and are coherently summed to improve image resolution by reducing side lobes. Conversely, serial delay-sum (or delay/sum) beamforming or more conventional beamforming schemes allow tightly focused beams, and therefore provide better spatial resolution for a targeted region of interest (ROI), but at lower frame rates. Some modes such as harmonic imaging, while feasible, are more challenging with diverging and plane wave beams.
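A sketch of coherent plane wave compounding, assuming a single-angle pixel beamformer such as the planewave_image() sketch given earlier (extended to accept a steering angle), is shown below; for a plane wave steered by angle theta, the transmit delay to a pixel at (x, z) would be (z*cos(theta) + x*sin(theta))/c. The function names and interfaces are assumptions for illustration.

```python
# Coherent plane-wave compounding sketch: per-angle images reconstructed on a
# common pixel grid are summed coherently to reduce side lobes and sharpen focus.
import numpy as np

def compound_image(acquisitions, angles, beamform_angle):
    """acquisitions   : list of (n_elements, n_samples) channel-data arrays,
                        one per steered plane-wave transmit
       angles         : matching list of steering angles [rad]
       beamform_angle : callable(channel_data, theta) -> 2-D image on the common grid
    """
    images = [beamform_angle(data, theta) for data, theta in zip(acquisitions, angles)]
    return np.sum(images, axis=0)   # coherent sum across transmit angles
```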
Some conventional ultrasound hardware has been adapted for use in ultrafast imaging applications. However, the conventional architecture of earlier systems makes it difficult to take advantage of the speed and accuracy that is available with ultrafast imaging techniques. Moreover, conventional systems adapted in this way do not offer the capability to interleave ultrafast processing with conventional serial delay/sum ultrasound processing, constraining the operator to choose either one or the other. In contrast, embodiments of the present disclosure allow the operator to combine conventional and ultrafast imaging without compromising either imaging capability.
The present disclosure describes an architecture and framework whereby interleaved signal modes, such as those noted above, can each be provided with the beamforming paradigm (such as serial imaging or ultrafast imaging) best suited to that case. By interleaving different types of ultrasound beams, an embodiment of the present disclosure enables the ultrasound system control logic to acquire image content in multiple modes and to synchronize the storage, transmission, and display of image data obtained using different types of beamforming signals.
A particular aspect of the disclosure relates to how the data for the individual sub-modes are gathered, processed, stored and displayed or rendered on the clinical display. Different beamforming schemes work best for the various clinical image modes described previously. To address the need for multiple mode imaging with suitable system performance and simplified workflow, embodiments of the present disclosure provide a hybrid architecture for interleaved signal imaging, as shown in the flow diagram of
As illustrated in
Applicants describe an example for Duplex imaging that provides Harmonic B-mode imaging along with Color mode imaging, wherein harmonic B-mode and Color mode data are generated and rendered on a frame-interleaved or line-interleaved basis. When Duplex imaging with Harmonic B-mode is initiated, the transmit sequence is set up with the appropriate pulses required for Harmonic B-mode and Color mode. In an exemplary sequence, Harmonic B-mode is effected using focused beams with the transmit focus placed at a narrow region within the tissue. Color mode imaging is typically best performed using ultrafast imaging techniques, with plane signal transmission. The harmonic B-mode and Color mode transmit signals are interleaved in a pre-determined manner, with the signal scheduler firing the appropriate signal pulses accordingly. For receive signal processing shown in
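A sketch of one possible transmit schedule for such a duplex sequence is shown below, interleaving focused Harmonic B-mode firings (routed to Path A) with plane-wave Color ensembles (routed to Path B). The event fields, counts, and interleave granularity are illustrative assumptions only, not a device specification.

```python
# Sketch of a frame-interleaved duplex schedule: focused Harmonic B-mode
# firings (Path A) followed by plane-wave Color ensembles (Path B).

def build_duplex_schedule(n_bmode_lines=128, n_color_angles=3, ensemble_len=8):
    events = []
    for line in range(n_bmode_lines):                    # focused harmonic B-mode firings
        events.append({"path": "A", "tx": "focused",
                       "mode": "harmonic_b", "line": line})
    for angle in range(n_color_angles):                  # ultrafast Color firings
        for shot in range(ensemble_len):                 # ensemble at each steering angle
            events.append({"path": "B", "tx": "plane_wave", "mode": "color",
                           "angle": angle, "ensemble_index": shot})
    return events
```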
Referring to the individual detailed steps in the signal delivery, acquisition, and processing sequence of
An imaging mode and beamforming type selection step S520 selects the imaging mode type and beamforming type for the analog front end and specifies the corresponding encoding, according to operator selection, or predetermined setup, of conventional or ultrafast processing. A decoding logic step S530 determines the appropriate processing path for the acquired interleaved signals obtained from the transducer, directing conventional delay/sum ultrasound signal content to Path A, and ultrafast ultrasound processing for the plane or divergent wave signal content to Path B, as described herein.
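The routing performed by the decoding logic can be sketched as follows, assuming each receive event carries a tag identifying its transmit type; the tag names and handler signatures are hypothetical.

```python
# Sketch of decode-logic routing (cf. step S530): tagged receive events are
# dispatched to the matching processing path.  Tags and handlers are assumptions.

def route_event(event, channel_data, path_a_delay_sum, path_b_ultrafast):
    if event["tx"] == "focused":      # conventional delay/sum content -> Path A
        return ("A", path_a_delay_sum(channel_data, event))
    else:                             # plane or divergent wave content -> Path B
        return ("B", path_b_ultrafast(channel_data, event))
```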
Path A processing has a delay/sum processing step S540 that performs conventional delay and sum beamforming, such as using FPGA or conventional software configuration for conventional ultrasound image extraction. A subsequent post-beamformation processing step S550 then executes the appropriate post-processing for the conventional delay/sum signal.
Path B in the
Following Path A and Path B processing, a frame combiner step S580 assembles image content from both the conventional signal and ultrafast ultrasound signal processing. A display step S590 then displays results of image processing for either or both processing paths.
For the processing sequence shown in
It is possible in certain embodiments that the decode logic and the delay/sum beamforming can be implemented on the same physical piece of hardware. This can depend on the complexity and capacity of the apparatus hardware and programmed logic.
Following beamforming, the data is processed using the post-beamforming signal and image processing functions described in
It is noted that although the acquisition is interleaved, the architecture presented does not require synchronization between the two paths A and B on a per data sample basis (typically 25 ns at 40 MHz). Instead, computing logic must be incorporated in the frame combiner circuitry, so that frame combiner step S580 waits until all of the data corresponding to a frame has arrived (from each of the processing paths) before presenting it to the display. Frame combiner step S580 provides both spatial and timing synchronization, as described subsequently.
Pixel Synchronization (Spatial Registration of Information from Paths A and B)
Pixel synchronization allows the two types of image content from delay/sum imaging using focused ultrasound beams and ultrafast imaging using plane wave or divergent wave signals to be readily processed, synchronized, and displayed. In the final display, the images formed by the two signal processing paths A and B in
(a) The spatial coordinates of the image pixels that contain the fused information from the two paths, typically as an 800 by 600 or equivalent 2D grid, are first defined as a look-up table. This can be referred to as the global image coordinate system.
(b) For Path A of
(c) Similarly for Path B, the acoustic signal processing steps can produce spatial data in the global coordinate system defined in Step (a). With the ultrafast imaging paradigm, the scan conversion step is not explicitly needed; rather, the output of the acoustic signal processing steps is directly a 2D spatial matrix.
(d) Once the data streams from (b) and (c) have been mapped to (a), decision logic can be applied, similar to the “priority” control in conventional ultrasound systems, that determines whether the data from Path A or B is presented on a given pixel. The decision can be made, for example, based on the energy content of the specific pixel, statistical variance, or other statistical parameters.
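Step (d) can be sketched as a simple per-pixel priority rule applied after both images have been mapped onto the global grid; the energy threshold and selection rule below are illustrative assumptions.

```python
# Per-pixel "priority" decision sketch between Path A and Path B data on the
# global image grid.  Threshold and rule are illustrative assumptions.
import numpy as np

def fuse_frames(img_a, img_b, energy_threshold=0.1):
    """Show the Path B pixel (e.g. Color flow) where its energy is significant;
    otherwise fall back to the Path A pixel (e.g. B-mode)."""
    fused = img_a.copy()
    show_b = np.abs(img_b) > energy_threshold
    fused[show_b] = img_b[show_b]
    return fused
```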
Just as the information from the two signal paths is synchronized spatially, it is also desirable to synchronize the posting of the information from each signal path. It is possible that, in some cases, the timing of the transmission is impacted due to the inability to completely flush the data from the signal processing path. An example transmit sequence is presented below. The example is based on focused-transmit B-mode imaging via Path A and color-flow imaging via Path B:
(a) The focused transmit beams for B-mode imaging via Path A are fired sequentially one after another. In response to each transmit signal, the receive data is routed through Path A, and the scan-converted data is stored in an image buffer.
(b) Interleaved with the focused transmission and acquisition, the transmit sequence to create an ultrafast imaging frame for Path B is fired. The emitted signal can be a plane or divergent wave signal. For a Color flow sequence, this could also entail a rapid series of transmissions at the same spatial location that forms the ensemble. The data for this excitation is routed via Path B, and processed to create a Color flow frame and stored in the image buffer as in (a).
(c) Once the corresponding image frame outputs from (a) and (b) are obtained, the data can be displayed on the screen. If latency prevents the output from (b) from reaching the image buffer in time, a logic component can decide either to discard that frame output and begin the next transmit sequence, or else to wait for the output from (b) to arrive before beginning the next transmit. Consequently, the image is updated only after the corresponding frames from both paths are available, maintaining timing synchronization.
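The frame combiner's timing behavior described in steps (a) through (c) can be sketched as follows; the buffer layout and the discard policy are illustrative assumptions.

```python
# Frame-combiner timing sketch: a display frame is posted only when the
# outputs of both paths for the same frame index are buffered; a late frame
# may optionally be discarded.

def combine_when_ready(buffer_a, buffer_b, frame_idx, fuse, discard_late=False):
    """buffer_a / buffer_b map frame index -> image; returns a fused frame or None."""
    if frame_idx in buffer_a and frame_idx in buffer_b:
        return fuse(buffer_a.pop(frame_idx), buffer_b.pop(frame_idx))
    if discard_late and frame_idx in buffer_a:
        buffer_a.pop(frame_idx)   # drop the orphaned frame and start the next sequence
        return None
    return None                   # otherwise wait for the missing path output
```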
The graph of
According to an embodiment of the present disclosure, the controller, control logic processor, or computer that controls ultrasound imaging operation provides a predefined, default set of combined modes and also allows the operator to override default settings and specify beamforming types to be used for any particular combination of modes. A table similar to that shown in
In addition to allowing operator selection, the control logic processor may also measure image quality and, based on data analysis, determine whether or not imaging with plane wave or divergent wave beamforming achieves acceptable results.
Data analysis can be performed to extract an image metric with information such as contrast resolution, spatial resolution, penetration, frame rate, and the like. A determination can be made as to whether or not the extracted metric is suitable, such as within an acceptable range of values, for the given anatomy and body part. If the metric is acceptable, scanning can continue using the same diverging beam/plane wave beamforming signal type. If one or more metrics are not suitable, it can be beneficial to switch to the delay-sum (focused beam) beamforming signal type or to another beamforming variant suitable for harmonic imaging or other imaging mode. Suitability can be determined by comparing the values of the metric(s) against one or more stored threshold values, for example.
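A sketch of such a metric check is given below, in which computed image metrics are compared against stored acceptance thresholds; the metric names and threshold values are assumptions for illustration only.

```python
# Metric-based check sketch: if any computed metric falls outside its stored
# acceptance range, switching (or prompting the operator) is suggested.

ACCEPTABLE_RANGES = {
    "contrast_resolution_db": (6.0, None),   # (minimum, maximum); None = unbounded
    "penetration_cm":         (8.0, None),
    "frame_rate_fps":         (30.0, None),
}

def beamforming_acceptable(metrics, ranges=ACCEPTABLE_RANGES):
    for name, (lo, hi) in ranges.items():
        value = metrics.get(name)
        if value is None:
            continue                          # metric not computed for this scan
        if (lo is not None and value < lo) or (hi is not None and value > hi):
            return False                      # suggest switching beamforming type
    return True                               # continue with the current type
```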
Optionally, instead of switching the beamforming type, the system could automatically adjust filter settings for the current beamforming type, based on image analysis. It should be noted that automated switching of the beamforming type can be optional; as an alternative, a message or prompt is provided to the operator allowing entry, confirmation, or override of the proposed change in beamforming type.
At least one mode for implementing the beam interleaving signal delivery and processing of the present disclosure includes a high performance medical ultrasound system where the architecture is designed to support both focused beam and plane wave/divergent wave signal paths concurrently.
In the
Several techniques can be used to make the decision about which imaging path to utilize. In one technique, the optimal choices are preprogrammed within the ultrasound system. As an example, a table may be created within the stored programming of the ultrasound system, containing the rules defining the paths to utilize. Thus continuing the example described previously, a rule may state that B-mode fundamental imaging may always use divergent or plane waves. Another rule may state that harmonic imaging may always use focused modes. An operator would not necessarily need to know what transmit paradigms or imaging path is being used.
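Such a preprogrammed rule table might be sketched as follows, mirroring the examples above (fundamental B-mode using divergent or plane waves, harmonic imaging using focused modes); the entries and the operator-override hook are illustrative assumptions.

```python
# Sketch of a stored rule table mapping imaging sub-mode to beamforming type
# and processing path.  Entries and override handling are assumptions.

DEFAULT_RULES = {
    "b_mode_fundamental": {"beamforming": "plane_or_divergent", "path": "B"},
    "b_mode_harmonic":    {"beamforming": "focused",            "path": "A"},
    "color":              {"beamforming": "plane_or_divergent", "path": "B"},
}

def select_beamforming(mode, operator_override=None):
    """Operator selections, when present, take precedence over the defaults."""
    if operator_override and mode in operator_override:
        return operator_override[mode]
    return DEFAULT_RULES[mode]
```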
On the other hand, the decision on which path to use may be selected by the user through a user interface. Thus, for example, the user may choose between divergent waves, plane waves or conventional focused modes for fundamental B-mode images. Alternatively, a preference from the user or the clinical application being imaged might, in turn, determine which imaging mode is to be invoked. For example, a certain range of frame rates might use serial imaging via
This disclosure describes a methodology/architecture to interleave multiple modes of imaging in a frame.
This disclosure proposes a framework wherein Color and Harmonic imaging are both implemented using the optimal ultrasound imaging path.
The schematic diagrams of
According to an embodiment of the present disclosure, an operator command on the user interface allows the viewer to select the image from either mode individually, such as for full screen display. Tabs or other on-screen utilities then enable the viewer to select which image displays at a particular time.
Embodiments of the present disclosure can be applied in hand-carried systems, where it has traditionally been considered expensive to incorporate high-performance hardware, or where limited power consumption and computing capability constrain extensive ultrafast imaging. In such architectures, it may be desirable to have embedded hardware/firmware that performs part of the processing using serial imaging, which is potentially less expensive, and to execute other functions using ultrafast imaging.
The present invention can be a software program. Those skilled in the art will recognize that the equivalent of such software may also be constructed in hardware. Because image manipulation algorithms and systems are well known, the present description will be directed in particular to algorithms and systems forming part of, or cooperating more directly with, the method in accordance with the present invention. Other aspects of such algorithms and systems, and hardware and/or software for producing and otherwise processing the image signals involved therewith, not specifically shown or described herein may be selected from such systems, algorithms, components and elements known in the art.
A computer program product may include one or more storage medium, for example; magnetic storage media such as magnetic disk (such as a floppy disk) or magnetic tape; optical storage media such as optical disk, optical tape, or machine readable bar code; solid-state electronic storage devices such as random access memory (RAM), or read-only memory (ROM); or any other physical device or media employed to store a computer program having instructions for controlling one or more computers to practice the method according to the present invention.
The invention has been described in detail, and may have been described with particular reference to a suitable or presently preferred embodiment, but it will be understood that variations and modifications can be effected within the spirit and scope of the invention. The presently disclosed embodiments are therefore considered in all respects to be illustrative and not restrictive. The scope of the invention is indicated by the appended claims, and all changes that come within the meaning and range of equivalents thereof are intended to be embraced therein.
This application claims the benefit of U.S. Provisional application U.S. Ser. No. 62/450,696, provisionally filed on Jan. 26, 2017, entitled “ULTRASOUND APPARATUS AND METHOD”, in the name of Ajay Anand et al. and incorporated herein in its entirety.