The present application claims priority from Japanese Patent Application No. 2023-160130, filed Sep. 25, 2023, the content of which is hereby incorporated by reference into this application.
The present disclosure relates to a three-dimensional ultrasound image processing apparatus, and particularly to an apparatus that performs filter processing on two-dimensional ultrasound image data constituting three-dimensional ultrasound image data.
There is an ultrasound diagnostic apparatus that acquires B-mode image data of a plurality of frames at different positions in a subject to acquire three-dimensional ultrasound image data. Such an ultrasound diagnostic apparatus has a two-dimensional array probe as an ultrasound probe, and repeatedly acquires the B-mode image data while performing electronic scanning not only in a major axis direction but also in a minor axis direction (a direction intersecting the scanning surface). Each B-mode image data is acquired, for example, by scanning with a transmission beam on a plane. The B-mode image data of one frame is generated by receiving the reflected ultrasound arriving from each direction in which the transmission beam scanned on one plane is directed. The B-mode images of the plurality of frames connected in the minor axis scanning direction constitute a three-dimensional ultrasound image.
In the ultrasound diagnostic apparatus, the next ultrasound is not transmitted until the ultrasound transmitted earlier has made a round trip and its echo has been received. Therefore, there is an upper limit on the number of times of transmission and reception per unit time, and there is also an upper limit on a frame rate (also referred to as a volume rate in three-dimensional imaging). In order to increase the frame rate and reduce the time for acquiring the three-dimensional ultrasound image data, the scanning interval of the transmission beam is increased as compared with a case of acquiring the B-mode image data of only one plane. That is, the number of transmission beams per frame in a case of acquiring the three-dimensional ultrasound image data is decreased as compared with a case of generating the B-mode image data of only one plane. Further, a spatial resolution of the B-mode image is improved by forming a plurality of reception beams for one transmission beam and generating the B-mode image data based on a plurality of reception signals obtained from the plurality of reception beams, that is, signals obtained by parallel reception phasing.
It should be noted that JP2016-87302A discloses parallel reception phasing. JP2020-69304A discloses that, as a technology related to the present disclosure, filter processing is performed on a spatial frequency distribution for two axes of a depth direction and an azimuthal angle direction of an ultrasound image.
In the ultrasound image, such as the B-mode image or the Doppler image, generated by the parallel reception phasing, the pixel value undulates in the depth direction in which the ultrasound is transmitted and received, as well as in a scanning direction of the transmission beam (the azimuthal angle direction in sector scanning), so that spot-like speckles extending in the scanning direction may be generated. As a result, a stripe pattern extending in the scanning direction may appear as an artifact. The stripe pattern tends to be more remarkable as the scanning interval of the transmission beam is larger.
An object of the present disclosure is to suppress an artifact caused by a pixel value undulating in a depth direction of the ultrasound image for a plurality of ultrasound images constituting a three-dimensional ultrasound image.
An aspect of the present disclosure relates to a three-dimensional ultrasound image processing apparatus comprising: an information processing unit that executes processing of acquiring spatial frequency distribution data in a depth direction for one frame in two-dimensional ultrasound image data of a plurality of frames constituting three-dimensional ultrasound image data, and processing of performing filter processing, which is based on filter characteristics determined in accordance with the spatial frequency distribution data and characteristics of transmitted ultrasound in a case in which each two-dimensional ultrasound image data is acquired, on each two-dimensional ultrasound image data.
In one embodiment, the information processing unit searches for a spatial frequency corresponding to a maximal value of a distribution indicated by the spatial frequency distribution data in a search range determined in accordance with a pulse width of the transmitted ultrasound, and obtains, as the filter characteristics, characteristics for suppressing a level of the distribution indicated by the spatial frequency distribution data in a spatial frequency band including the spatial frequency that is searched for.
In one embodiment, the information processing unit changes the filter characteristics in a case in which the maximal value of the distribution indicated by the spatial frequency distribution data does not satisfy a predetermined condition for the spatial frequency distribution data acquired from the two-dimensional ultrasound image data on which the filter processing has been performed.
In one embodiment, the information processing unit generates the two-dimensional ultrasound image data in sequence with elapse of time, and performs the filter processing, which is the same as the filter processing on the two-dimensional ultrasound image data of one frame generated earlier, on the two-dimensional ultrasound image data of one frame generated later.
In one embodiment, the information processing unit performs the filter processing in which the filter characteristics are changed on the two-dimensional ultrasound image data of one frame generated still later in a case in which a condition related to an artifact is not satisfied for the two-dimensional ultrasound image data of the one frame generated later.
In one embodiment, the two-dimensional ultrasound image data is obtained by transmitting and receiving the ultrasound while performing scanning with a transmission beam of the ultrasound at a predetermined scanning interval, and the information processing unit performs the filter processing on the two-dimensional ultrasound image data in a case in which the scanning interval exceeds a predetermined scanning interval threshold value.
In one embodiment, the scanning interval threshold value is determined based on the characteristics of the transmitted ultrasound.
In one embodiment, the filter characteristics are characteristics for making an attenuation amount in a spatial frequency band corresponding to a predetermined structure be equal to or less than an attenuation amount limit value.
In one embodiment, the two-dimensional ultrasound image data is data indicating a B-mode image on which a blood flow image is superimposed.
According to the aspect of the present disclosure, it is possible to suppress the artifact caused by the pixel value undulating in the depth direction of the ultrasound image.
Embodiments of the present disclosure will be described with reference to the respective drawings. The same components shown in a plurality of drawings will be denoted by the same reference numerals, and the description thereof will be omitted. It should be noted that, in the present specification, for a term “P image data” for specifying certain image data, an image indicated by the P image data will be referred to as a P image.
The information processing unit 10 may be configured by one or a plurality of processors that execute a program to realize functions of the controller 40, the B-mode image generation unit 46, the image processing unit 50, and the filter processing unit 52. The program may be stored in the memory 44.
The controller 40 performs overall control on the ultrasound diagnostic apparatus 100. The operation unit 20 includes a keyboard, a mouse, a lever, a button, and the like, and outputs information related to an operation of a user to the controller 40. In a case in which the display unit 18 includes a touch panel on a display screen, the operation unit 20 may include the touch panel. The controller 40 may control the ultrasound diagnostic apparatus 100 in accordance with an operation on the operation unit 20.
The operation of the ultrasound diagnostic apparatus 100 will be described. The ultrasound probe 14 is in a state of being in contact with a surface of the subject 54. The ultrasound probe 14 comprises a plurality of oscillating elements 22. The transmission unit 12 outputs a transmission signal to each oscillating element 22 of the ultrasound probe 14 based on control via a beam controller 42 provided in the controller 40. As a result, the ultrasound is transmitted from the ultrasound probe 14. The beam controller 42 forms a transmission beam in the ultrasound probe 14 to scan the subject 54 with the transmission beam by controlling the transmission unit 12. That is, the transmission unit 12 adjusts a delay time or a level of each transmission signal in accordance with the control of the beam controller 42, forms the transmission beam in the ultrasound probe 14, and scans the subject 54 with the transmission beam.
In a case in which the ultrasound reflected in the subject 54 is received by each oscillating element 22 of the ultrasound probe 14, each oscillating element 22 outputs an electric signal corresponding to the received ultrasound to the reception unit 16. The reception unit 16 performs processing, such as amplification, detection, and frequency band limitation, on the reception signal output from each oscillating element 22 in accordance with the control of the beam controller 42. The reception unit 16 further performs phasing addition on the reception signals output from the respective oscillating elements 22 to generate a post-phasing reception signal. As a result, the post-phasing reception signals in which the phases are adjusted and added such that the reception signals based on the ultrasound received from a specific direction reinforce each other are generated, and a reception beam is formed in the specific direction. The reception unit 16 outputs the post-phasing reception signal to the B-mode image generation unit 46.
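For reference only, the following Python sketch illustrates one simple form of phasing addition (delay-and-sum) of the reception signals; the function name, the array layout, the integer-sample delay approximation, and the use of the numpy library are assumptions of this sketch and do not limit the configuration of the reception unit 16 described above.

```python
import numpy as np

def phasing_addition(element_signals, delays_s, sampling_rate_hz):
    """Minimal delay-and-sum sketch: each element's reception signal is shifted by
    its focusing delay (rounded to whole samples) and the shifted signals are
    summed, so that echoes arriving from the chosen direction add coherently.
    element_signals: array of shape (n_elements, n_samples)."""
    n_elements, n_samples = element_signals.shape
    summed = np.zeros(n_samples)
    for signal, delay in zip(element_signals, delays_s):
        shift = int(round(delay * sampling_rate_hz))  # delay expressed in samples
        shifted = np.zeros(n_samples)
        if shift >= 0:
            shifted[shift:] = signal[:n_samples - shift]
        else:
            shifted[:n_samples + shift] = signal[-shift:]
        summed += shifted
    return summed  # post-phasing reception signal for one reception beam direction
```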
The B-mode image generation unit 46 generates the B-mode image data based on the post-phasing reception signal obtained in each reception beam direction, and outputs the B-mode image data to the image processing unit 50. The B-mode image data based on one scanning with the transmission beam and the reception beam is image data for one frame, and corresponds to one B-mode image.
The ultrasound probe 14 has a two-dimensional array structure, and swings a scanning surface scanned with the transmission beam and the reception beam in a direction intersecting the scanning surface, for example, a direction orthogonal to both the azimuthal angle direction and the depth direction. The ultrasound probe 14 swings the scanning surface in that direction by a predetermined step width, for example, each time the B-mode image data of one scanning surface is generated. The B-mode image generation unit 46 generates the B-mode image data as two-dimensional ultrasound image data of one scanning surface each time the ultrasound probe 14 swings the scanning surface by the step width, and outputs the B-mode image data to the image processing unit 50. The image processing unit 50 generates three-dimensional ultrasound image data composed of the B-mode image data (two-dimensional ultrasound image data) of a plurality of frames acquired at different positions in the subject 54. The three-dimensional ultrasound image data is data representing each pixel value of a plurality of voxels (pixels) arranged in three axial directions.
The image processing unit 50 selects the B-mode image data of one frame in the B-mode image data of the plurality of frames based on the operation of the operation unit 20, generates a video signal for displaying an image indicated by the selected B-mode image data, and outputs the video signal to the display unit 18. The display unit 18 displays the B-mode image based on the video signal. In addition, the image processing unit 50 may generate volume rendering image data in which the three-dimensional ultrasound image is represented as a two-dimensional image in a stereoscopic manner. The image processing unit 50 generates the video signal for displaying the volume rendering image, to output the video signal to the display unit 18. The display unit 18 displays the volume rendering image based on the video signal.
It should be noted that, in the ultrasound diagnostic apparatus 100 according to the present embodiment, the three-dimensional ultrasound image data representing the plurality of voxels arranged in the three axial directions is acquired by swinging the scanning surface via the ultrasound probe 14. In addition to such processing, processing of transporting the ultrasound probe 14 in a linear or curved manner to acquire the three-dimensional ultrasound image data may be executed.
For example, it is assumed that the plurality of oscillating elements 22 are two-dimensionally arranged along a major axis direction and a minor axis direction of a substantially rectangular surface of the ultrasound probe 14 in contact with the subject 54. In this case, a surface including an imaginary major axis extending in the major axis direction may be defined as the scanning surface, and the ultrasound probe 14 may be transported in the minor axis direction in a linear manner. The ultrasound probe 14 is transported in the minor axis direction by a predetermined step width, for example, each time the B-mode image data of one frame is generated. The B-mode image generation unit 46 generates the B-mode image data as the two-dimensional ultrasound image data of one frame each time the ultrasound probe 14 is transported by the step width, and outputs the B-mode image data to the image processing unit 50. The image processing unit 50 generates three-dimensional ultrasound image data composed of the B-mode image data (two-dimensional ultrasound image data) of a plurality of frames acquired at different positions in the subject 54.
In order to maintain a frame rate of the B-mode image data generated by the ultrasound diagnostic apparatus 100 at a certain magnitude, it is necessary to reduce the time for scanning one scanning surface with the transmission beam. Therefore, the number of transmission beams per frame in a case of acquiring the three-dimensional ultrasound image data is smaller than the number of transmission beams in a case of generating the B-mode image data only on one scanning surface. Further, a plurality of reception beams are formed for one transmission beam, and the B-mode image data is generated based on a plurality of reception signals obtained from the plurality of reception beams, that is, signals obtained by parallel reception phasing.
The parallel reception phasing will be described. The reception unit 16 forms a plurality of (M) reception beams for one transmission of the ultrasound pulse. That is, the reception unit 16 performs parallel reception phasing on the plurality of reception signals output from the plurality of oscillating elements 22 to form the M reception beams. In a case in which linear scanning with the transmission beam is performed, the reception unit 16 forms the M reception beams having the same direction as the transmission beam direction and arranged parallel to each other. In a case in which sector scanning with the transmission beam is performed, the reception unit 16 forms the M reception beams arranged at equal azimuthal angle intervals. Here, the azimuthal angle refers to an angle that defines a direction as seen from the center of the sector scanning.
The reception unit 16 generates the post-phasing reception signal for each of the M reception beams, to output the post-phasing reception signal generated for each of the M reception beams to the B-mode image generation unit 46. The B-mode image generation unit 46 generates the B-mode image data of a region in which M reception beams are disposed in each direction of the scanned transmission beam.
In the B-mode image indicated by the B-mode image data generated by the parallel reception phasing, spot-like speckles extending in the scanning direction may be caused by the pixel value undulating in the depth direction as well as the scanning direction of the transmission beam. Here, the scanning direction is a direction perpendicular to the transmission beam in the linear scanning and is an azimuthal angle direction in the sector scanning. Such speckles may cause a stripe pattern extending in the scanning direction to appear as an artifact. This stripe-like artifact tends to be more remarkable as a scanning interval is larger.
Therefore, in the ultrasound diagnostic apparatus 100 according to the present embodiment, the filter processing unit 52 determines filter characteristics for suppressing the stripe-like artifact based on the B-mode image data. The filter processing unit 52 performs the filter processing based on the filter characteristics, on the B-mode image data to generate filter-processed B-mode image data. The image processing unit 50 displays, on the display unit 18, a filter-processed B-mode image or the volume rendering image based on the filter-processed B-mode images of a plurality of frames. Hereinafter, specific processing is shown.
The image processing unit 50 outputs the B-mode image data to the filter processing unit 52. The filter processing unit 52 extracts depth direction line data representing each pixel value of a plurality of pixels arranged in the depth direction for each of a plurality of azimuthal angle directions from the B-mode image data.
The filter processing unit 52 selects one set of a plurality of sets of depth direction line data 32 generated for the plurality of azimuthal angle directions as sample line data. One set of sample line data corresponds to one set of depth direction line data 32 corresponding to one azimuthal angle direction. The filter processing unit 52 performs spatial Fourier transformation on the sample line data to obtain the spatial frequency distribution data.
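As a minimal sketch of how the spatial frequency distribution data of one set of sample line data could be obtained, the following Python fragment is shown for reference; the function name, the pixel pitch parameter, and the use of numpy are assumptions of this sketch and not part of the described configuration.

```python
import numpy as np

def depth_spectrum(sample_line, pixel_pitch_mm):
    """Spatial Fourier transform of one depth-direction line (one azimuthal
    direction). Returns spatial frequencies [1/mm] and the component levels."""
    levels = np.abs(np.fft.rfft(sample_line))                   # component levels
    freqs = np.fft.rfftfreq(len(sample_line), d=pixel_pitch_mm)  # cycles per mm
    return freqs, levels
```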
The filter processing unit 52 searches for the speckle spatial frequency fs corresponding to the maximal value of the spatial frequency distribution of the sample line data, to define the speckle spatial frequency band Bs including the speckle spatial frequency fs. The filter processing unit 52 further performs the filter processing of suppressing the spatial frequency component of the speckle spatial frequency band Bs on the B-mode image data. Specifically, the filter processing unit 52 performs the filter processing of suppressing the spatial frequency component of the speckle spatial frequency band Bs on the plurality of sets of depth direction line data 32 corresponding to the plurality of azimuthal angle directions extending over the entire region of the B-mode image. As a result, the filter processing unit 52 generates the filter-processed B-mode image data to output the filter-processed B-mode image data to the image processing unit 50. The filter-processed B-mode image data is composed of a plurality of sets of filter-processed depth direction line data obtained by performing the filter processing on the plurality of sets of depth direction line data 32 corresponding to the plurality of azimuthal angle directions extending over the entire region of the B-mode image.
The processing of searching for the speckle spatial frequency fs, defining the speckle spatial frequency band Bs, and obtaining the filter characteristics will be described.
It is known that the speckle spatial frequency fs is close to a reference spatial frequency fs0 obtained by multiplying the sound velocity c by a reciprocal of Sr, as shown in (Expression 2).
The filter processing unit 52 acquires the spatial frequency component level corresponding to the spatial frequency while increasing the spatial frequency in sequence from the reference spatial frequency fs0 by a step width δ within the search range including the reference spatial frequency fs0 (within the spatial frequency band determined in accordance with the pulse width of the transmitted ultrasound). In addition, the filter processing unit 52 acquires the spatial frequency component level corresponding to the spatial frequency while decreasing the spatial frequency in sequence from the reference spatial frequency fs0 by the step width δ. The filter processing unit 52 determines the spatial frequency at which the spatial frequency component level is maximal, as the speckle spatial frequency fs.
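A possible sketch of this search within the range around the reference spatial frequency fs0 is shown below for reference; the names and the half-width parameter are assumptions of this sketch, and the step width δ corresponds to the bin spacing of the discrete spectrum.

```python
import numpy as np

def search_speckle_frequency(freqs, levels, fs0, search_halfwidth):
    """Find the spatial frequency at which the component level is maximal within
    the search range fs0 - search_halfwidth to fs0 + search_halfwidth."""
    in_range = (freqs >= fs0 - search_halfwidth) & (freqs <= fs0 + search_halfwidth)
    masked = np.where(in_range, levels, -np.inf)  # exclude bins outside the range
    return freqs[np.argmax(masked)]               # speckle spatial frequency fs
```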
The filter processing unit 52 determines a spatial frequency higher than the speckle spatial frequency fs by a predetermined upper limit spatial frequency width Δb as an upper limit spatial frequency fb of the speckle spatial frequency band Bs, and determines a spatial frequency lower than the speckle spatial frequency fs by a predetermined lower limit spatial frequency width Δa as a lower limit spatial frequency fa of the speckle spatial frequency band Bs.
The filter processing unit 52 forms a band suppression filter in which the speckle spatial frequency band Bs equal to or higher than the lower limit spatial frequency fa and equal to or lower than the upper limit spatial frequency fb is a suppression band width. The band suppression filter has, for example, unnecessary component suppression filter characteristics for suppressing the level by 6 dB or higher in the frequency band equal to or higher than the lower limit spatial frequency fa and equal to or lower than the upper limit spatial frequency fb. The unnecessary component suppression filter characteristics may be characteristics for attenuating the level by 6 dB or higher, preferably 20 dB or higher, at the speckle spatial frequency fs.
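A minimal sketch of such a band suppression filter, applied to one set of depth direction line data in the spatial frequency domain, is shown below for reference; the attenuation value, the names, and the FFT-based implementation are assumptions of this sketch.

```python
import numpy as np

def band_suppress(line, pixel_pitch_mm, fa, fb, attenuation_db=20.0):
    """Attenuate the spatial frequency components between fa and fb [1/mm] by
    attenuation_db (e.g. 6 dB or more, preferably 20 dB at fs) and transform back."""
    spectrum = np.fft.rfft(line)
    freqs = np.fft.rfftfreq(len(line), d=pixel_pitch_mm)
    gain = np.ones_like(freqs)
    gain[(freqs >= fa) & (freqs <= fb)] = 10.0 ** (-attenuation_db / 20.0)
    return np.fft.irfft(spectrum * gain, n=len(line))  # filter-processed line data
```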
The filter processing unit 52 performs the filter processing based on the unnecessary component suppression filter characteristics for each of the plurality of sets of depth direction line data 32 corresponding to the plurality of azimuthal angle directions extending over the entire region of the B-mode image, and generates the plurality of sets of filter-processed depth direction line data corresponding to the plurality of azimuthal angle directions extending over the entire region of the B-mode image. The filter processing unit 52 forms the filter-processed B-mode image data based on the plurality of sets of filter-processed depth direction line data, and outputs the filter-processed B-mode image data to the image processing unit 50. The image processing unit 50 generates the video signal for displaying the filter-processed B-mode image, to output the video signal to the display unit 18. The display unit 18 displays the filter-processed B-mode image based on the video signal. In addition, the image processing unit 50 may generate a video signal for displaying the volume rendering image based on the filter-processed B-mode image of the plurality of frames, and may output the video signal to the display unit 18. In this case, the display unit 18 displays the volume rendering image based on the video signal.
With such processing, as the characteristics of the transmitted ultrasound, the reference spatial frequency fs0 is obtained in accordance with (Expression 2) based on the pulse width of the transmission pulse. Then, the speckle spatial frequency fs at which the level is maximal in the spatial frequency distribution is searched for on the high-band side and the low-band side of the reference spatial frequency fs0. Further, the spatial frequency higher than the speckle spatial frequency fs by the predetermined upper limit spatial frequency width Δb is determined as the upper limit spatial frequency fb, the spatial frequency lower than the speckle spatial frequency fs by the predetermined lower limit spatial frequency width Δa is determined as the lower limit spatial frequency fa, and the unnecessary component suppression filter characteristics in the filter processing unit 52 are thereby determined. As a result, the processing of determining the unnecessary component suppression filter characteristics for suppressing the stripe-like artifact is performed quickly.
In the processing, the filter processing unit 52 determines the spatial frequency higher than the speckle spatial frequency fs by the predetermined upper limit spatial frequency width Δb as the upper limit spatial frequency fb, and determines the spatial frequency lower than the speckle spatial frequency fs by the predetermined lower limit spatial frequency width Δa as the lower limit spatial frequency fa. Instead of this processing of determining the spatial frequencies separated from the speckle spatial frequency fs by fixed widths as the upper limit spatial frequency fb and the lower limit spatial frequency fa, the following processing may be executed.
That is, the filter processing unit 52 may determine, as the lower limit spatial frequency fa, the spatial frequency on the low-band side at which the level of the spatial frequency distribution is decreased by a predetermined ratio with respect to the maximal value at the speckle spatial frequency fs. In addition, the filter processing unit 52 may determine, as the upper limit spatial frequency fb, the spatial frequency on the high-band side at which the level of the spatial frequency distribution is decreased by the predetermined ratio. As will be described below, the filter processing unit 52 may also execute processing of adaptively determining the lower limit spatial frequency fa and the upper limit spatial frequency fb.
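For reference, a sketch of this level-drop determination of the band edges is shown below; the drop ratio and the names are assumptions of this sketch.

```python
import numpy as np

def band_from_level_drop(freqs, levels, fs_index, drop_ratio=0.5):
    """Walk from the peak at fs toward lower and higher spatial frequencies and
    take, as fa and fb, the first frequencies at which the level has fallen to
    drop_ratio times the maximal value (drop_ratio is an illustrative value)."""
    threshold = levels[fs_index] * drop_ratio
    lo = fs_index
    while lo > 0 and levels[lo] > threshold:
        lo -= 1
    hi = fs_index
    while hi < len(levels) - 1 and levels[hi] > threshold:
        hi += 1
    return freqs[lo], freqs[hi]  # lower limit fa, upper limit fb
```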
The filter processing unit 52 selects a plurality of (K) sets among a plurality of sets of depth direction line data 32 generated for the plurality of azimuthal angle directions as the K sets of sample line data. The filter processing unit 52 performs the spatial Fourier transformation on each of the K sets of sample line data to obtain K types of spatial frequency distributions. The filter processing unit 52 searches for the speckle spatial frequency fs for each of the K types of spatial frequency distributions via the same processing as the processing based on the one set of sample line data. The filter processing unit 52 obtains the minimum speckle spatial frequency fs among the K speckle spatial frequencies fs obtained from the K types of spatial frequency distributions as the lower limit spatial frequency fa, and obtains the maximum speckle spatial frequency fs as the upper limit spatial frequency fb.
The filter processing unit 52 may obtain one spatial frequency distribution by obtaining a statistical value such as an average value, a median value, a maximum value, and a minimum value of K spatial frequency component levels at each spatial frequency, from the K types of spatial frequency distributions obtained from the K sets of sample line data. The filter processing unit 52 may execute the same processing as the processing based on the one set of sample line data for one spatial frequency distribution obtained from the K types of spatial frequency distributions. That is, the filter processing unit 52 may search for the speckle spatial frequency fs for one spatial frequency distribution obtained from the K types of spatial frequency distributions, to obtain the lower limit spatial frequency fa and the upper limit spatial frequency fb.
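For reference, a sketch of the K-sample variant in which the minimum and maximum of the K speckle spatial frequencies are taken as fa and fb is shown below; the names and parameters are assumptions of this sketch.

```python
import numpy as np

def band_from_k_samples(sample_lines, pixel_pitch_mm, fs0, search_halfwidth):
    """For each of K sample lines, find the speckle spatial frequency in the search
    range around fs0; use the minimum and maximum of the K results as fa and fb."""
    speckle_freqs = []
    for line in sample_lines:
        levels = np.abs(np.fft.rfft(line))
        freqs = np.fft.rfftfreq(len(line), d=pixel_pitch_mm)
        in_range = (freqs >= fs0 - search_halfwidth) & (freqs <= fs0 + search_halfwidth)
        speckle_freqs.append(freqs[np.argmax(np.where(in_range, levels, -np.inf))])
    return min(speckle_freqs), max(speckle_freqs)
```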
An image of a structure of biological tissue, such as a blood vessel, a heart wall, or a heart valve, may appear in the B-mode image. In this case, the structure may become difficult to see in the filter-processed B-mode image in which the stripe-like artifact is suppressed. The spatial frequency components contributed to the sample line data by the structure may be lower than the speckle spatial frequency. Therefore, the filter processing unit 52 may execute processing of limiting the attenuation amount in the spatial frequency band corresponding to the structure.
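For reference, a sketch of limiting the attenuation amount in the spatial frequency band corresponding to the structure is shown below, applied to a gain array such as the one built inside the band suppression sketch above; the limit value and the names are assumptions of this sketch.

```python
import numpy as np

def limit_structure_attenuation(gain, freqs, f_struct_lo, f_struct_hi, limit_db=3.0):
    """Clamp the filter gain so that the attenuation inside the structure band
    [f_struct_lo, f_struct_hi] stays at or below the attenuation amount limit value."""
    floor = 10.0 ** (-limit_db / 20.0)                     # minimum allowed gain
    in_struct = (freqs >= f_struct_lo) & (freqs <= f_struct_hi)
    gain[in_struct] = np.maximum(gain[in_struct], floor)
    return gain
```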
The B-mode image generation unit 46 outputs the B-mode image data to the image processing unit 50 in sequence with the elapse of time. The image processing unit 50 generates the B-mode image data in sequence with the elapse of time to output the B-mode image data to the filter processing unit 52. The filter processing unit 52 performs the filter processing on each of the B-mode image data output from the image processing unit 50 in sequence with the elapse of time, and outputs the filter-processed B-mode image data to the image processing unit 50 in sequence with the elapse of time.
The filter processing unit 52 may perform the filter processing (hereinafter, also referred to as adaptive filter processing) in which filter characteristics are adaptively changed in accordance with the filter-processed B-mode image data generated in the past, on the B-mode image data output in sequence from the image processing unit 50. Further, the filter processing unit 52 may output the filter-processed B-mode image data to the image processing unit 50 in sequence with the elapse of time.
Here, the adaptive filter processing will be described. In the adaptive filter processing, processing of changing the filter characteristics is repeated in a case in which the maximal value of the distribution indicated by the spatial frequency distribution data does not satisfy a predetermined condition, for the spatial frequency distribution data acquired from the filter-processed B-mode image data. Here, the predetermined condition is a condition for suppressing the stripe-like artifact.
The filter processing unit 52 performs the filter processing based on the unnecessary component suppression filter characteristics changed based on the earlier B-mode image data, on the B-mode image data of one frame output later from the image processing unit 50, and outputs the filter-processed B-mode image data to the image processing unit 50. In a case in which R<A is satisfied again in the spatial frequency distribution obtained for one set of sample line data in the filter-processed B-mode image data of one frame output to the image processing unit 50 (where A is the maximal value at the speckle spatial frequency fs and R is a target value), the filter processing unit 52 changes the unnecessary component suppression filter characteristics such that the attenuation amount at the speckle spatial frequency fs is 20 log(A/R) dB or more.
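For reference, a sketch of this adaptive change of the attenuation amount at the speckle spatial frequency fs is shown below; the function name is an assumption of this sketch, and A and R are the maximal value and the target value described above.

```python
import numpy as np

def updated_attenuation_db(maximal_value_A, target_value_R, current_db):
    """If R < A still holds after filtering, raise the attenuation at fs so that it
    is at least 20*log10(A/R) dB; otherwise keep the current characteristics."""
    if maximal_value_A > target_value_R:
        required_db = 20.0 * np.log10(maximal_value_A / target_value_R)
        return max(current_db, required_db)
    return current_db
```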
In this way, the filter processing unit 52 sets the unnecessary component suppression filter characteristics based on the earlier data in the filter-processed B-mode image data generated in sequence with the elapse of time, and performs the filter processing based on the determined filter characteristics, on the B-mode image data of the next frame. The filter processing unit 52 executes such processing in sequence on the B-mode image data generated in sequence with the elapse of time, so that the unnecessary component suppression filter characteristics converge to ideal characteristics.
From another viewpoint, the filter processing unit 52 executes the following processing. That is, the filter processing unit 52 performs the same filter processing as the filter processing on the B-mode image data of one frame generated earlier, on the B-mode image data of one frame generated later. In a case in which the condition related to the artifact is not satisfied for the B-mode image data of one frame generated later, the filter processing unit 52 performs the filter processing in which the filter characteristics are changed, on the B-mode image data of the frame generated still later. The condition related to the artifact is, for example, a condition in which the maximal value A and the target value R at the speckle spatial frequency fs satisfy A≤R for the spatial frequency distribution obtained for one set of sample line data.
The filter processing unit 52 searches for the speckle spatial frequency fs by using the reference spatial frequency fs0, to determine the unnecessary component suppression filter characteristics (S105). The filter processing unit 52 performs the filter processing on the B-mode image data based on the determined unnecessary component suppression filter characteristics, to generate the filter-processed B-mode image data (S106).
The filter processing unit 52 extracts the sample line data from the filter-processed B-mode image data, and determines whether or not the obtained spatial frequency distribution of the sample line data satisfies the condition related to the artifact (S301). In a case in which the condition related to the artifact is satisfied, the filter processing unit 52 terminates the processing while maintaining the unnecessary component suppression filter characteristics.
On the other hand, in a case in which the condition related to the artifact is not satisfied, the filter processing unit 52 changes the unnecessary component suppression filter characteristics (S302), and returns the processing to step S107. In step S107 from the second time onwards, the filter processing unit 52 performs the filter processing based on the changed unnecessary component suppression filter characteristics, on the filter-processed B-mode image data generated earlier (S106).
With such processing, in steps S301 and S302, the unnecessary component suppression filter characteristics are repeatedly changed for the filter-processed B-mode image data generated in sequence with the elapse of time until the condition related to the artifact is satisfied. As a result, the unnecessary component suppression filter characteristics converge to characteristics in which the condition related to the artifact is satisfied for the filter-processed B-mode image data.
Whether or not the filter processing is performed on the B-mode image data may be determined in accordance with the scanning interval. For example, the filter processing may be performed in a case in which the scanning interval in the azimuthal angle direction exceeds a predetermined scanning interval threshold value. In a case in which the scanning interval exceeds the predetermined scanning interval threshold value, the controller 40 causes the image processing unit 50 and the filter processing unit 52 to execute the filter processing. On the other hand, in a case in which the scanning interval is equal to or less than the scanning interval threshold value, the controller 40 does not cause the image processing unit 50 and the filter processing unit 52 to execute the filter processing on the B-mode image data, and displays the B-mode image based on the B-mode image data on which the filter processing has not been performed or the volume rendering image, on the display unit 18.
The scanning interval threshold value may be determined based on a theoretical value of the magnitude of the speckles 56 in the scanning direction. An average value Sθ [mm] of the widths of the speckles 56 in the scanning direction is approximately represented as (Expression 3). Here, λ is a wavelength [mm] of the ultrasound, z is a measurement depth [mm], and D is an opening width [mm]. The opening width D is a length in which the plurality of oscillating elements 22 provided in the ultrasound probe 14 are arranged on a surface scanned with the ultrasound beam.
In a case in which the width of the speckle 56 in the scanning direction is expressed by the azimuthal angle, a scanning interval threshold value θt [rad] may be obtained in accordance with (Expression 4).
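(Expression 3) and (Expression 4) are not reproduced in this text. For reference only, the following sketch assumes the commonly used approximation Sθ ≈ λz/D for the speckle width in the scanning direction and θt ≈ Sθ/z for the threshold expressed as an azimuthal angle; these assumed forms, the function name, and the parameters are illustrations of this sketch and are not asserted to be the expressions of the disclosure.

```python
def needs_filter_processing(scan_interval_rad, wavelength_mm, depth_mm, aperture_mm):
    """Decide whether the scanning interval exceeds the scanning interval threshold
    value theta_t derived from the speckle width in the scanning direction."""
    s_theta_mm = wavelength_mm * depth_mm / aperture_mm   # assumed form of (Expression 3)
    theta_t_rad = s_theta_mm / depth_mm                   # assumed form of (Expression 4)
    return scan_interval_rad > theta_t_rad                # True: perform filter processing
```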
In the ultrasound diagnostic apparatus 100, a man-machine interface that changes the unnecessary component suppression filter characteristics in accordance with the operation of the user may be configured. The man-machine interface may be configured by the image processing unit 50, the display unit 18, and the operation unit 20. That is, the unnecessary component suppression filter characteristics may be set based on the operation of the operation unit 20 performed by the user who refers to the display unit 18. For example, the controller 40 may read, in accordance with the operation of the operation unit 20, the attenuation amounts at the speckle spatial frequency fs, the lower limit spatial frequency fa, and the upper limit spatial frequency fb and in the speckle spatial frequency band, the attenuation amounts at the structure spatial frequency fsT and in the spatial frequency band STR including the structure spatial frequency fsT, and the like. In this case, the image processing unit 50 may display at least one of the spatial frequency distribution obtained for the sample line data or the unnecessary component suppression filter characteristics on the display unit 18, and may present the at least one of the spatial frequency distribution or the unnecessary component suppression filter characteristics to the user. The unnecessary component suppression filter characteristics may be represented by a graph in which a horizontal axis is the spatial frequency and a vertical axis is the attenuation amount, the phase rotation amount, or the like.
A graphical user interface displayed on the display unit 18 in a case in which the unnecessary component suppression filter characteristics are set may include, for example, a button, an icon, or the like on which “none”, “weak”, “medium”, “strong”, or the like is displayed. In addition, the image processing unit 50 may display such a graphical user interface for setting the unnecessary component suppression filter characteristics on the display unit 18, and the unnecessary component suppression filter characteristics may be set via the operation unit 20.
The operation in the Doppler mode will be described. In the Doppler mode, the transmission unit 12 outputs the transmission signal to each of the oscillating elements 22 of the ultrasound probe 14 a plurality of (N) times such that ultrasound pulses are transmitted N times at repetitive time intervals T in one transmission beam direction. The reception unit 16 generates N types of post-phasing reception signals for the N ultrasound pulses reflected in the subject 54 and received by the ultrasound probe 14.
The reception unit 16 forms a plurality of (M) reception beams for one transmission of the ultrasound pulse. That is, the reception unit 16 performs parallel reception phasing on the plurality of reception signals output from the plurality of oscillating elements 22 to form the M reception beams. In a case in which linear scanning with the transmission beam is performed, the reception unit 16 forms the M reception beams having the same direction as the transmission beam direction and arranged parallel to each other. In a case in which sector scanning with the transmission beam is performed, the reception unit 16 forms the M reception beams arranged at equal azimuthal angle intervals.
The reception unit 16 generates N types of post-phasing reception signals corresponding to the ultrasound pulses transmitted N times for one transmission beam for each of the M reception beams, and outputs the N types of post-phasing reception signals for each of the M reception beams to the blood flow image generation unit 48.
The blood flow image generation unit 48 obtains a blood flow velocity v(r) in the depth direction at each depth r of each of the M reception beams based on the N types of post-phasing reception signals generated for each of the M reception beams, for example, by the autocorrelation processing described in JP2016-87302A.
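For reference, a minimal sketch of an autocorrelation-based velocity estimate at one depth sample is shown below; the lag-1 autocorrelation form is a common formulation used here as an assumption, it is not asserted to be the exact processing of JP2016-87302A, and the names and the default sound velocity are illustrative.

```python
import numpy as np

def blood_flow_velocity_mm_s(iq_samples, prf_hz, f0_hz, c_mm_s=1.54e6):
    """Estimate the axial blood flow velocity at one depth from N complex (IQ)
    post-phasing reception signals acquired at the repetition interval T = 1/PRF.
    The sign distinguishes flow toward or away from the ultrasound probe."""
    r1 = np.sum(iq_samples[1:] * np.conj(iq_samples[:-1]))   # lag-1 autocorrelation
    mean_doppler_hz = np.angle(r1) * prf_hz / (2.0 * np.pi)  # mean Doppler shift
    return c_mm_s * mean_doppler_hz / (2.0 * f0_hz)          # v = c * fd / (2 * f0)
```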
The blood flow image generation unit 48 generates blood flow image data based on the blood flow velocity v(r) in the depth direction at each depth r of each of the M reception beams. A blood flow image indicated by the blood flow image data is an image indicating a color in accordance with the blood flow velocity v(r) at a position of each depth r on each reception beam. That is, the blood flow image indicates a distribution of the blood flow velocity v(r) in a region in which the M reception beams are disposed by color. In the blood flow image, for example, a blue color is added to a region in which the blood flows in a direction away from the ultrasound probe 14, a red color is added to a region in which the blood flows in a direction toward the ultrasound probe 14, and the brightness is higher in a region in which the blood flow velocity is higher. The blood flow image generation unit 48 outputs the blood flow image data to the image processing unit 50.
The color Doppler ultrasound diagnostic apparatus 102 executes the measurement in each of the B-mode and the Doppler mode in time division within a time corresponding to one frame, and generates color Doppler image data as data indicating a color Doppler image in which the B-mode image of one frame and the blood flow image of one frame are superimposed on each other.
The B-mode image generation unit 46 generates, for example, the B-mode image data of one scanning surface each time the ultrasound probe 14 swings the scanning surface by the step width, and outputs the B-mode image data to the image processing unit 50. The blood flow image generation unit 48 generates the blood flow image data of one scanning surface each time the ultrasound probe 14 swings the scanning surface by the step width, and outputs the blood flow image data to the image processing unit 50.
The image processing unit 50 generates the Doppler image data acquired at the different positions at intervals of the step widths based on the B-mode image data and the blood flow image data generated each time the ultrasound probe 14 swings the scanning surface by the step width. The image processing unit 50 generates three-dimensional ultrasound image data composed of the Doppler image data of a plurality of frames acquired at different positions in the subject 54.
The image processing unit 50 selects the Doppler image data of one frame in the Doppler image data of the plurality of frames based on the operation of the operation unit 20, generates a video signal for displaying an image indicated by the selected Doppler image data, and outputs the video signal to the display unit 18. The display unit 18 displays the Doppler image based on the video signal. In addition, the image processing unit 50 may generate volume rendering image data in which the three-dimensional ultrasound image is represented as a two-dimensional image in a stereoscopic manner. The image processing unit 50 generates the video signal for displaying the volume rendering image, to output the video signal to the display unit 18. The display unit 18 displays the volume rendering image based on the video signal.
In the B-mode image represented by the B-mode image data generated by the parallel reception phasing, a stripe pattern extending in the scanning direction may appear as the artifact. This stripe-like artifact tends to be more remarkable as a scanning interval is larger.
In addition, in order to maintain a frame rate of the B-mode image data generated by the color Doppler ultrasound diagnostic apparatus 102 at a certain magnitude, it is necessary to reduce a time for scanning one scanning surface. Therefore, in the color Doppler ultrasound diagnostic apparatus 102, the scanning interval of the transmission beam in a case in which the measurement in each of the B-mode and the Doppler mode is executed in time division is made larger than the scanning interval in a case in which the measurement is executed only in the B-mode. As a result, the stripe-like artifact appearing in the image displayed on the display unit 18 is remarkable.
Therefore, in the color Doppler ultrasound diagnostic apparatus 102, the image processing unit 50 outputs the B-mode image data to the filter processing unit 52. The filter processing unit 52 performs the filter processing based on the unnecessary component suppression filter characteristics, on the B-mode image data to generate filter-processed B-mode image data, and outputs the filter-processed B-mode image data to the image processing unit 50. The filter processing executed by the filter processing unit 52 is the same as the filter processing executed by the filter processing unit 52 of the ultrasound diagnostic apparatus 100.
The image processing unit 50 generates the Doppler image data acquired at the different positions with the intervals of the step widths based on the filter-processed B-mode image data and the blood flow image data. The image processing unit 50 generates three-dimensional ultrasound image data composed of the Doppler image data of a plurality of frames acquired at different positions in the subject 54.
In this way, the filter processing based on the unnecessary component suppression filter characteristics is performed on the B-mode image data that is a source for generating the Doppler image data, whereby the stripe-like artifact appearing in the image displayed on the display unit 18 based on the three-dimensional ultrasound image data is suppressed.
A three-dimensional ultrasound image processing apparatus comprising: an information processing unit that executes processing of acquiring spatial frequency distribution data in a depth direction for one frame in two-dimensional ultrasound image data of a plurality of frames constituting three-dimensional ultrasound image data, and processing of performing filter processing, which is based on filter characteristics determined in accordance with the spatial frequency distribution data and characteristics of transmitted ultrasound in a case in which each two-dimensional ultrasound image data is acquired, on each two-dimensional ultrasound image data.
The three-dimensional ultrasound image processing apparatus according to configuration 1, in which the information processing unit searches for a spatial frequency corresponding to a maximal value of a distribution indicated by the spatial frequency distribution data in a search range determined in accordance with a pulse width of the transmitted ultrasound, and obtains, as the filter characteristics, characteristics for suppressing a level of the distribution indicated by the spatial frequency distribution data in a spatial frequency band including the spatial frequency that is searched for.
The three-dimensional ultrasound image processing apparatus according to configuration 2, in which the information processing unit changes the filter characteristics in a case in which the maximal value of the distribution indicated by the spatial frequency distribution data does not satisfy a predetermined condition for the spatial frequency distribution data acquired from the two-dimensional ultrasound image data on which the filter processing has been performed.
The three-dimensional ultrasound image processing apparatus according to any one of configurations 1 to 3, in which the information processing unit generates the two-dimensional ultrasound image data in sequence with elapse of time, and performs the filter processing, which is the same as the filter processing on the two-dimensional ultrasound image data of one frame generated earlier, on the two-dimensional ultrasound image data of one frame generated later.
The three-dimensional ultrasound image processing apparatus according to configuration 4, in which the information processing unit performs the filter processing in which the filter characteristics are changed on the two-dimensional ultrasound image data of one frame generated much later in a case in which a condition related to an artifact is not satisfied for the two-dimensional ultrasound image data of the one frame generated later.
The three-dimensional ultrasound image processing apparatus according to any one of configurations 1 to 5, in which the two-dimensional ultrasound image data is obtained by transmitting and receiving the ultrasound while performing scanning with a transmission beam of the ultrasound at a predetermined scanning interval, and the information processing unit performs the filter processing on the two-dimensional ultrasound image data in a case in which the scanning interval exceeds a predetermined scanning interval threshold value.
The three-dimensional ultrasound image processing apparatus according to configuration 6, in which the scanning interval threshold value is determined based on the characteristics of the transmitted ultrasound.
The three-dimensional ultrasound image processing apparatus according to any one of configurations 1 to 7, in which the filter characteristics are characteristics for making an attenuation amount in a spatial frequency band corresponding to a predetermined structure be equal to or less than an attenuation amount limit value.
The three-dimensional ultrasound image processing apparatus according to any one of configurations 1 to 8, in which the two-dimensional ultrasound image data is data indicating a B-mode image on which a blood flow image is superimposed.