This invention relates generally to the medical ultrasound processing field, and more specifically to a new and useful system and method of dynamic processing in the medical ultrasound field.
The following description of the preferred embodiments of the invention is not intended to limit the invention to these preferred embodiments, but rather to enable any person skilled in the art to make and use this invention.
As shown in
Step S110 includes acquiring data and, more specifically, acquiring ultrasound data. Step S110 preferably includes the sub-steps of collecting data and preparing data. The step of collecting data functions to collect raw ultrasound data, such as from an ultrasound transducer or a device storing raw ultrasound data. The raw ultrasound data may be real or complex, demodulated or frequency shifted (e.g., baseband data), or in any other suitable representation. Preparing data functions to perform preliminary processing that converts the raw data into a suitable form, such as brightness mode (B-mode), motion mode (M-mode), Doppler, or any other suitable form of ultrasound data. The acquired data may alternatively be left as raw ultrasound data, or may be collected in a prepared data format from an outside device. In addition, pre- or post-beamformed data may be acquired. The acquired data may describe any suitable region (1D, 2D, or 3D) or any suitable geometric description of the inspected material. The acquired data is preferably from an ultrasound device, but may alternatively be from any suitable data acquisition system sensitive to motion. The acquired data may alternatively be provided by an intermediary device such as a data storage unit (e.g., hard drive), data buffer, or any other suitable device. The acquired data is preferably output as processing data and control data. The processing data is preferably the data that will be processed in Step S140. The control data is preferably used in motion calculation and for processing parameter control. The processing data and control data are preferably in the same format, but may alternatively be in different formats, as described above.
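For illustration only, a minimal sketch of the acquisition step in Python (NumPy) is given below. The function name acquire_data, the frame layout (axial samples by scan lines), and the choice to output the processing data and control data in the same format are assumptions made for this example and are not required by the method.

```python
import numpy as np

def acquire_data(frame_source):
    """Step S110 sketch: collect raw ultrasound frames and output
    processing data and control data.

    `frame_source` is assumed to be an iterable of 2D arrays
    (axial samples x scan lines), e.g. a transducer interface,
    a data buffer, or a file reader.
    """
    raw = np.stack([np.asarray(f, dtype=np.float32) for f in frame_source])
    processing_data = raw   # data to be transformed in Step S140
    control_data = raw      # data used for motion calculation (S120)
                            # and processing parameter control (S130)
    return processing_data, control_data
```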
Step S120, which includes calculating object motion, functions to analyze the acquired data to detect tissue movement, probe movement, and/or any other motion that affects the acquired data. Object motion preferably includes any motion that affects the acquired data, such as tissue motion, tissue deformation, probe movement, and/or any other suitable motion. The measured motion may be a measurement of tissue velocity, displacement, acceleration, strain, strain rate, or any suitable characteristic of probe motion, tissue motion, or tissue deformation. Object motion is preferably calculated using the raw ultrasound data, but may alternatively use any suitable form of ultrasound data. At least two data sets (e.g., data images) acquired at different times are preferably used to calculate 1D, 2D, or 3D motion. Speckle tracking is preferably used, but alternatively, Doppler processing, block matching, cross-correlation processing, lateral beam modulation, and/or any other suitable method may be used. The motion measurements may additionally be improved and refined using models of tissue motion. The object motion (or motion data) is preferably used as a parameter input in the modification of processing parameters in Step S130, but may alternatively or additionally be used directly in the processing Step S140.
As mentioned above, speckle tracking is a motion tracking method implemented by tracking the position of a kernel (section) of ultrasound speckles that are a result of ultrasound interference and reflections from scanned objects. The pattern of ultrasound speckles is fairly similar over small motions, which allows for tracking the motion of the speckle kernel within a search window (or region) over time. The search window is preferably a window within which the kernel is expected to be found, assuming normal tissue motion. Preferably, the search window is additionally dependent on the frame rate of the ultrasound data. A smaller search window can be used with a faster frame rate, assuming the same tissue velocity. The size of the kernel affects the resolution of the motion measurements. For example, a smaller kernel will result in higher resolution. Motion from speckle tracking can be calculated with various algorithms such as sum of absolute difference (SAD) or normalized cross correlation.
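A minimal sketch of speckle tracking by SAD block matching is given below, assuming two frames stored as 2D NumPy arrays and a kernel that stays inside both frames; the function name and the default kernel and search sizes are illustrative only.

```python
import numpy as np

def track_kernel_sad(frame_a, frame_b, center, half_kernel=16, half_search=8):
    """Estimate the displacement of a speckle kernel between two frames
    by minimizing the sum of absolute differences (SAD) over a search window.

    `center` is the (row, col) kernel center in `frame_a`; `half_kernel`
    and `half_search` are the kernel and search half-sizes in samples.
    The caller is assumed to keep the kernel and search region inside
    both frames.  Returns the (d_row, d_col) displacement with lowest SAD.
    """
    r, c = center
    ref = frame_a[r - half_kernel:r + half_kernel, c - half_kernel:c + half_kernel]
    best_sad, best_disp = np.inf, (0, 0)
    for dr in range(-half_search, half_search + 1):
        for dc in range(-half_search, half_search + 1):
            cand = frame_b[r + dr - half_kernel:r + dr + half_kernel,
                           c + dc - half_kernel:c + dc + half_kernel]
            sad = np.abs(ref - cand).sum()
            if sad < best_sad:
                best_sad, best_disp = sad, (dr, dc)
    return best_disp
```

Sub-sample displacement refinement (e.g., fitting around the SAD minimum) and normalized cross-correlation are omitted from this sketch.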
Step S130, which includes modifying processing parameter(s), functions to utilize object motion calculations to enhance or improve the data processing. The coefficients or control parameters of filters or signal processing operations are preferably adjusted according to parameter inputs that are related to the object motion calculated in Step S120. More preferably, the calculated object motion is used as the parameter input to modify the processing parameters. The parameter inputs may additionally or alternatively include other information, such as the data quality metrics discussed in further detail below. Step S130 may include variations depending on the data processing application. For example, data processing may include tissue motion calculation using speckle tracking. In this case, data windows are preferably increased in size and search regions decreased for speckle tracking in a region of static tissue. Conversely, data windows are preferably decreased in size and search regions increased for speckle tracking in regions of moving or deforming tissue. Another example of motion-controlled data processing is image frame registration. In this case, motion estimates can be used to resample and align B-mode or raw data samples for improved filtering, averaging, or any other suitable signal processing. Image resampling coefficients are preferably adjusted to provide frame registration. As another example, the parameter inputs may determine the coefficients, or alternatively a new coordinate system, used for processing ultrasound data, such as when resampling an ultrasound image. The modified processing parameters may additionally be used in the following applications: spatial and temporal sampling of various algorithms, including color-flow (2D Doppler), B-mode, M-mode, and image scan conversion; wall filtering for color-flow and Doppler processing; temporal and spatial filter programming (e.g., filter response cut-offs); speckle tracking window size, search size, and temporal and spatial sampling; setting parameters of speckle reduction algorithms; and/or any other suitable application.
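The window-size adaptation described above can be expressed as a simple rule; a hypothetical sketch, assuming the local motion magnitude is available in samples per frame and that the base sizes and threshold are tunable, is shown below.

```python
def adapt_tracking_parameters(local_motion, base_kernel=16, base_search=8,
                              motion_threshold=1.0):
    """Adjust speckle-tracking parameters from a local motion estimate
    (samples per frame): static tissue -> larger kernel, smaller search
    region; moving or deforming tissue -> smaller kernel, larger search
    region.  All values are illustrative.
    """
    if abs(local_motion) < motion_threshold:   # effectively static tissue
        kernel = base_kernel * 2
        search = max(base_search // 2, 1)
    else:                                      # moving or deforming tissue
        kernel = max(base_kernel // 2, 4)
        search = base_search * 2
    return kernel, search
```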
Step S140, which includes processing ultrasound data, functions to transform the acquired data for ultrasound imaging, analysis, or any other suitable goal. The step of processing preferably aids in the detection, measurement, and/or visualization of image features. After the processing of the ultrasound data is complete, the method preferably proceeds with outputting the processed data (i.e., transformed data) S148. The outputted data may be used for any suitable operation, such as being stored, displayed, or passed to another device. The step of processing may be any suitable processing task, such as spatial or temporal filtering (e.g., wall filtering for Doppler and color flow imaging), summing, weighting, ordering, sorting, or resampling, and may be designed for any suitable application. Preferably, Step S140 uses the data that was acquired in Step S110 and the parameters that were modified in Step S130. As an example, object motion data (calculated in Step S120) may be used in Step S130 to automatically identify or differentiate between object features, such as between blood and tissue. Depending on the situation, velocity, strain, or strain-rate calculations, or any other suitable calculation, may be optimized to target only the object features of interest. For example, strain calculations may ignore ultrasound data associated with blood as a way to improve the accuracy of tissue deformation measurements. The ultrasound data may be raw ultrasound data (e.g., RF data) or other suitable forms of data, such as raw data converted into a suitable form (i.e., pre-processed). Step S140 is preferably performed in real-time on the ultrasound data while the data is being acquired, but may alternatively be performed offline or remotely on saved or buffered data. As shown in
Step S142, which includes forming an ultrasound image, functions to output an ultrasound image from the ultrasound data acquired in Step S110. The ultrasound data from Step S110 is preferably converted into a format suitable for the processing operations. This step is optional and may be omitted, such as when the processing step operates on raw ultrasound data. An ultrasound image is preferably any spatial representation of ultrasound data or of data derived from ultrasound signals, including raw ultrasound data (i.e., radio-frequency (RF) data images), B-mode images (magnitude or envelope detected images formed from raw ultrasound data), color Doppler images, power Doppler images, tissue motion images (e.g., velocity and displacement), tissue deformation images (e.g., strain and strain rate), or any other suitable images.
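As one concrete example of forming a B-mode image from raw RF data, a minimal sketch using envelope detection (Hilbert transform along the axial axis) followed by log compression is given below; the dynamic range value and normalization are assumptions made for the example.

```python
import numpy as np
from scipy.signal import hilbert

def form_bmode_image(rf_frame, dynamic_range_db=60.0):
    """Form a B-mode image from a 2D RF frame (axial samples x scan lines):
    envelope detection via the Hilbert transform along the axial axis,
    normalization, and log compression to the given dynamic range (dB).
    """
    envelope = np.abs(hilbert(rf_frame, axis=0))
    envelope /= envelope.max() + 1e-12                 # normalize to [0, 1]
    bmode_db = 20.0 * np.log10(envelope + 1e-12)       # log compression
    return np.clip(bmode_db, -dynamic_range_db, 0.0) + dynamic_range_db
```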
Step S144, which includes resampling an ultrasound image, functions to apply the processing parameters based on the motion data to the processing of the ultrasound data. The resampling is preferably spatially focused, with temporal processing occurring in Step S146, but Step S144 and Step S146 may alternatively be implemented in substantially the same step. Ultrasound image refinements may be made using the motion data as a filter for image processing operations. For example, motion data may be used to identify areas of high tissue velocity and to apply image correction (sharpening or focusing) to account for distortion in the image resulting from the motion. Additionally or alternatively, resampling of an ultrasound image may include spatially mapping data, using measurements of the spatial transformation between frames to map data to a common grid. Spatially mapping data preferably includes shifting, and additionally warping, images by adaptively transforming image frames to a common spatial reference frame. This is preferably used cooperatively with the temporal processing of Step S146 to achieve motion compensated frame averaging.
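A minimal sketch of spatially mapping a frame to a common grid from a per-pixel displacement field is given below; it assumes the displacement field was produced by the motion calculation step and maps reference-grid positions to the corresponding positions in the frame being resampled. The interpolation order and boundary handling are illustrative choices.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def map_to_common_grid(frame, displacement):
    """Resample `frame` onto the common (reference) grid.

    `displacement` is assumed to be a (2, rows, cols) array giving, for each
    reference-grid pixel, the row and column offset of the corresponding
    content in `frame`.  Sampling `frame` at the displaced coordinates
    pulls that content back into alignment with the reference grid.
    """
    rows, cols = frame.shape
    grid_r, grid_c = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    coords = np.stack([grid_r + displacement[0], grid_c + displacement[1]])
    return map_coordinates(frame, coords, order=1, mode="nearest")
```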
Step S146, which includes performing temporal processing, functions to apply time-based processing of successive ultrasound data images. Temporal processing preferably describes the frame-to-frame (i.e., time series) processing. Additionally, the step of performing temporal processing may be performed according to a parameter controlled by the object motion calculation. Temporal processing may include temporal integration, weighted summation (finite impulse response (FIR) filtering), and weighted summation of frame group members with previous temporal processing outputs (infinite impulse response (IIR) filtering). The simple method of frame averaging is described by an FIR filter with constant weighting for each frame. Frame averaging or persistence may be used to reduce noise. Frame averaging is typically performed assuming no motion. Temporal processing can additionally take advantage of the spatial mapping of data performed in Step S144 to enhance frame averaging. For example, with a system that acquires data at 20 frames per second (i.e., a 50 ms inter-frame time) and an object with an object stability time (i.e., the time over which the underlying object can be considered constant) of 100 ms, only two frames may be averaged or processed without image quality degradation. Using measurements of the spatial transformation between frames, the data can be mapped to a common grid prior to temporal processing to compensate for object motion, providing larger temporal processing windows and ultimately improved image quality from the increase in signal-to-noise ratio. In this example, assume the object stability time increases by a factor of 10 (to 1 second) when the probe and object motion is removed. Now, 20 frames can be averaged without degradation, improving the signal-to-noise ratio by a factor greater than 3 relative to the two-frame average (√10 ≈ 3.2, assuming white noise).
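The FIR and IIR temporal filters mentioned above can be sketched as follows, assuming the frames have already been mapped to a common grid (Step S144) when motion compensated averaging is desired; the weighting values are illustrative.

```python
import numpy as np

def fir_frame_average(frames, weights=None):
    """FIR temporal filter: weighted sum of the frames in a group.
    Constant weights (the default) reduce to simple frame averaging.
    `frames` is assumed to be a sequence of aligned 2D arrays.
    """
    frames = np.asarray(frames, dtype=np.float64)
    if weights is None:
        weights = np.full(len(frames), 1.0 / len(frames))
    return np.tensordot(weights, frames, axes=1)

def iir_persistence(new_frame, previous_output, alpha=0.8):
    """IIR temporal filter (persistence): weighted sum of the new frame
    with the previous temporal-processing output; `alpha` controls how
    strongly history is retained.
    """
    return alpha * previous_output + (1.0 - alpha) * new_frame
```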
As shown in
Step S220, which includes calculating object motion, functions to analyze the acquired data to detect tissue movement, probe movement, and/or any other motion that affects the acquired data. Step S220 is preferably substantially similar to Step S120 described above, but Step S220 may additionally contribute to calculating the data quality metrics of Step S225. As explained below, speckle tracking performed with normalized cross-correlation produces a quantity referred to as the data quality index (DQI) that can be used as a DQM. Normalized cross-correlation is preferably performed by acquiring ultrasound radio frequency (RF) images or signals before and after deformation of an object. Image regions, or windows, of the images are then tracked between the two acquisitions using the cross-correlation function. The cross-correlation function measures the similarity between two regions as a function of the displacement between the regions. The peak magnitude of the correlation function corresponds to the displacement that maximizes signal matching. This peak value is preferably referred to as the DQI.
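A minimal sketch of computing the DQI as the peak of the normalized cross-correlation between a pre-deformation window and a post-deformation search region is given below; the exhaustive search and the normalization details are illustrative and not the only way to implement the measure.

```python
import numpy as np

def dqi_from_ncc(ref_window, search_region):
    """Return the peak normalized cross-correlation between `ref_window`
    (pre-deformation) and all same-sized windows in `search_region`
    (post-deformation).  Values near 1 indicate well-matched speckle;
    this peak value serves as the data quality index (DQI).
    """
    kr, kc = ref_window.shape
    ref = (ref_window - ref_window.mean()) / (ref_window.std() + 1e-12)
    best = -1.0
    for r in range(search_region.shape[0] - kr + 1):
        for c in range(search_region.shape[1] - kc + 1):
            cand = search_region[r:r + kr, c:c + kc]
            cand = (cand - cand.mean()) / (cand.std() + 1e-12)
            best = max(best, float((ref * cand).mean()))
    return best
```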
Step S225, which includes calculating a data quality metric, functions to aid in the optimization of data processing by determining a value reflecting the quality of the data. The DQM preferably relates to the level of assurance that the data is valid. Data quality metrics are preferably calculated for each sample, for a sub-set of samples of an image region, and/or for each pixel, forming a DQM map. The DQM is preferably obtained from calculations related to tissue velocity, displacement, strain, and/or strain rate, or, more specifically, from peak correlation, temporal and spatial variation (e.g., derivatives and variance) of tissue displacement, and spatial and temporal variation of correlation magnitude. The data quality metric (DQM) is preferably calculated from a parameter(s) of the speckle tracking method and is more preferably the DQI described above. The DQI is preferably represented on a 0.0 to 1.0 scale, where 0.0 represents low quality data and 1.0 represents high quality data. However, any suitable scale may be used. The DQI of data associated with tissue tends to have higher values than the DQI of data in areas that contain blood or noise. As described below, this information can be used in the processing of ultrasound data for segmentation and signal identification. The DQM is preferably used in Step S230 as a parameter input to modify processing parameters.
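As an illustration of how a DQM map might gate or weight downstream measurements, a hypothetical sketch is given below; the threshold value and the choice of masking with NaN are assumptions made for the example, not part of the method.

```python
import numpy as np

def apply_dqm(measurement_map, dqm_map, threshold=0.75):
    """Weight a measurement map (e.g., velocity or strain, float-valued)
    by a per-pixel DQM map on a 0.0 (low quality) to 1.0 (high quality)
    scale, and mask out samples whose quality falls below `threshold`.
    """
    weighted = measurement_map * dqm_map                              # DQM-weighted values
    masked = np.where(dqm_map >= threshold, measurement_map, np.nan)  # reject low-quality samples
    return weighted, masked
```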
The DQM may be used individually to modify the processing parameters (
Step S230, which includes modifying processing parameter(s), functions to utilize object motion calculations and/or DQM to enhance or improve the data processing. The coefficients or control parameters of filters or signal processing operations are preferably adjusted according to the parameter inputs related to object motion measured in Step S220 and/or the DQM of Step S225. The modification of processing parameters may be based directly on DQM (
The use of the DQM preferably enables a variety of ways to control the processing of data. For example, measurements such as B-mode, velocity, strain, and strain rate may be weighted or sorted (filtered) based on the DQM. The DQM preferably admits multiple interpretations. The DQM may be interpreted as a quantized assessment of the quality of the data, and data that is not of high enough quality can be filtered from the ultrasound data. As an example, ultrasound-derived velocity measurements for a section of tissue may suffer from noise (shown in
Step S240, which includes processing ultrasound data, functions to transform the acquired data for ultrasound imaging, analysis, or any other suitable goal. The processing of ultrasound data preferably uses the modified processing parameters provided in Step S230. Preferably, Step S240 uses the data that was acquired in Step S210 and the parameters that were modified in Step S230. After the processing of the ultrasound data is complete, the method preferably proceeds with outputting the processed data (i.e., transformed data) S248. The outputted data may be used for any suitable operation, such as being stored, displayed, or passed to another device. The processing of ultrasound data may include multiple sub-steps as described for Step S140, and modified processing parameters based on motion information and/or the DQM may be used for any of these sub-steps. As shown in
As shown in
As shown in
In an additional alternative shown in
As shown in
As a person skilled in the art will recognize from the previous detailed description and from the figures and claims, modifications and changes can be made to the preferred embodiments of the invention without departing from the scope of this invention defined in the following claims.
This application claims the benefit of U.S. Provisional Application No. 61/145,710, filed 19 Jan. 2009, which is incorporated in its entirety by this reference.
Number | Date | Country
---|---|---
61/145,710 | 19 Jan. 2009 | US