The present invention relates to an ultrasound diagnostic device, and more particularly to image processing of an ultrasound image.
Techniques for enhancing a boundary of a tissue, for example, in an ultrasound image obtained by transmitting and receiving ultrasound waves are known (see Patent Documents 1 and 2).
Tone curve modification and unsharp masking are typical examples of conventionally known boundary enhancement techniques. With these techniques, however, not only the boundaries for which enhancement is desired but also parts for which enhancement is unnecessary, such as noise, may be enhanced, and parts that already have sufficient contrast may be enhanced further, resulting in excessive contrast.
Patent Document 3 describes a method for improving the image quality of an ultrasound image by multiresolution decomposition with respect to the image.
Patent Document 1: JP 3816151 B
Patent Document 2: JP 2012-95806 A
Patent Document 3: JP 4789854 B
In view of the background art described above, the inventors of the present application have conducted continued research and development on techniques for enhancing boundaries within an ultrasound image, and have paid particular attention to image processing to which multiresolution decomposition is applied.
The present invention was made in the process of the research and development, and is aimed at providing a technique of enhancing a boundary within an ultrasound image using multiresolution decomposition.
To achieve the above-described aim, in accordance with one preferred aspect, an ultrasound diagnostic device comprises a probe configured to transmit and receive ultrasound; a transmitter/receiver unit configured to control the probe to obtain a received signal of ultrasound; a resolution processing unit configured to perform resolution conversion processing with respect to an ultrasound image obtained based on the received signal, to thereby generate a plurality of resolution images having different resolutions; and a boundary component generation unit configured to generate a boundary component related to a boundary included in an image by non-linear processing applied to a differential image obtained by comparing the plurality of resolution images, wherein a boundary-enhanced image is generated by applying enhancement processing to the ultrasound image based on the boundary component which is obtained.
In a preferable specific example, the boundary component generation unit performs non-linear processing with different properties for a positive pixel value of the differential image and for a negative pixel value of the differential image.
In a preferable specific example, the boundary component generation unit performs non-linear processing such that a pixel value of the differential image having a greater absolute value is suppressed by a greater amount before being output.
In a preferable specific example, the boundary component generation unit applies, to the differential image having been subjected to the non-linear processing, weighting processing in accordance with a pixel value of the resolution image which has been used for comparison for obtaining the differential image, thereby generating the boundary component image.
In a preferable specific example, the resolution processing unit forms a plurality of resolution images having a plurality of resolutions which differ from each other stepwise, and the boundary component generation unit obtains one boundary component based on two resolution images having resolutions which differ from each other by only one step, thereby generating a plurality of boundary components corresponding to a plurality of steps, and the ultrasound diagnostic device further comprises a summed component generation unit configured to generate a summed component of an image based on a plurality of boundary components corresponding to a plurality of steps; and a summation processing unit configured to add the summed component which is generated to the ultrasound image, to thereby generate the boundary-enhanced image.
In a preferable specific example, the boundary component generation unit generates one differential image based on two resolution images having resolutions which differ from each other by only one step, and applies non-linear processing to a plurality of differential images corresponding to a plurality of steps to generate a plurality of boundary components.
The present invention provides a technique for enhancing a boundary within an ultrasound image using multiresolution decomposition. In accordance with a preferred aspect of the invention, the visibility of boundaries of a tissue can be increased without impairing information inherent in an ultrasound image.
When the ultrasound beam is scanned within an area including the subject for diagnosis and the echo data along the ultrasound beam, that is, line data, is collected by the transmitter/receiver unit 12, an image processing unit 20 forms ultrasound image data based on the collected line data. The image processing unit 20 forms image data of a B mode image, for example.
When forming an ultrasound image (image data), the image processing unit 20 enhances the boundaries of a tissue of the heart or the like within the ultrasound image. In order to enhance the boundaries, the image processing unit 20 has functions of multiresolution decomposition, boundary component generation, non-linear processing, weighting processing, and boundary enhancement processing. The image processing unit 20 applies resolution conversion processing to an ultrasound image obtained based on the received signal, to thereby generate a plurality of resolution images having different resolutions. The image processing unit 20 further applies non-linear processing to a differential image obtained by comparing the plurality of resolution images, to thereby generate a boundary component related to a boundary included in the image. Enhancement processing is then applied to the ultrasound image based on the generated boundary component, so that a boundary-enhanced image is generated. The image processing unit 20 then generates a plurality of image data items representing the heart, which is a subject for diagnosis, for a plurality of frames, and outputs the image data items to a display processing unit 30.
The image processing in the image processing unit 20 may be executed after processing such as wave detection and logarithmic transformation is applied to the signal obtained from the transmitter/receiver unit 12, and may be followed by coordinate transformation processing executed by a digital scan converter. Alternatively, the boundary enhancement processing in the image processing unit 20 may be applied to the signal obtained from the transmitter/receiver unit 12 before processing such as wave detection and logarithmic transformation, or the image processing in the image processing unit 20 may be executed after the coordinate transformation processing in the digital scan converter.
The display processing unit 30 applies coordinate transformation processing for transforming the scanning coordinate system of ultrasound to the display coordinate system of an image to the image data obtained by the image processing unit 20, for example, and further adds a graphic image and the like, as necessary, to form a display image including an ultrasound image. The display image formed in the display processing unit 30 is displayed on a display unit 40.
Among the structures (function blocks) described above, those other than the probe 10 may be implemented by hardware, by software, or by a combination of the two. The overall structure of the ultrasound diagnostic device is as described above; the processing executed by the present ultrasound diagnostic device, particularly by the image processing unit 20, will now be described in detail.
The image processing unit 20 compares a plurality of resolution images corresponding to different resolutions, e.g. the images G0 to G3, with each other to obtain differential images. Specifically, each resolution image Gn is compared with an image Ex (Gn+1) obtained by upsampling the resolution image Gn+1, which is one step lower in resolution, and the difference between the two is used as a differential image Ln.
In an ultrasound image, a cardiac muscle portion of the heart reflects properties of a cardiac muscle tissue (structure), e.g. fine recesses and projections on a tissue surface or within a tissue. Therefore, when a pixel on a cardiac muscle surface or within a cardiac muscle is defined as a pixel of interest, a relatively large difference in luminance appears between the pixel of interest and surrounding pixels in the resolution image Gn having a relatively high resolution. A change in the luminance is particularly noticeable at the boundary of the cardiac muscle.
In the resolution image Ex (Gn+1), which is a dull (blurred) image compared to the ultrasound image Gn due to low-resolution processing (downsampling processing), the difference in luminance between the pixel of interest and the surrounding pixels is smaller than that in the ultrasound image Gn.
Accordingly, the greater the difference in luminance between the pixel of interest and the surrounding pixels in the ultrasound image Gn, the more the pixel of interest in the resolution image Ex (Gn+1) deviates from its value in the ultrasound image Gn. This deviation is particularly large at the boundary of the cardiac muscle, resulting in a greater pixel value (greater difference in luminance) in the differential image.
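The decomposition and comparison described above can be illustrated with a short sketch. The blur-and-downsample pyramid, the bilinear upsampling, and the parameter values below are illustrative assumptions; the embodiment does not prescribe these particular filters:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def build_resolution_images(g0, levels=3, sigma=1.0):
    """Generate resolution images G0..G(levels) by repeated blur and 2x downsampling."""
    images = [np.asarray(g0, dtype=np.float64)]
    for _ in range(levels):
        blurred = gaussian_filter(images[-1], sigma)
        images.append(blurred[::2, ::2])              # next, lower-resolution image Gn+1
    return images

def differential_images(images):
    """Ln = Gn - Ex(Gn+1): compare Gn with the upsampled lower-resolution image."""
    diffs = []
    for gn, gn1 in zip(images[:-1], images[1:]):
        factors = (gn.shape[0] / gn1.shape[0], gn.shape[1] / gn1.shape[1])
        ex_gn1 = zoom(gn1, factors, order=1)          # Ex(Gn+1), same size as Gn
        diffs.append(gn - ex_gn1)                     # large values near tissue boundaries
    return diffs
```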
For generating a summed component, the image processing unit 20 applies non-linear processing to pixels forming each differential image Ln. The image processing unit 20 further applies weighting processing with reference to the pixels of the resolution images Gn to the pixels forming each differential image Ln which have been subjected to the non-linear processing. The non-linear processing and the weighting processing to be applied to the differential image Ln will be described in detail below.
The image processing unit 20 then consecutively sums the plurality of differential images Ln having been subjected to the non-linear processing and the weighting processing while applying upsampling (US) processing in a stepwise manner. For the summation, weighting for summation (×Wn) may be executed. Thus, the image processing unit 20 generates a summed component based on the plurality of differential images Ln.
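The stepwise upsampling and weighted summation might be sketched as follows, where processed_diffs are the Ln images after the non-linear processing and weighting processing described below, and the per-level weights Wn are assumed parameters:

```python
from scipy.ndimage import zoom

def summed_component(processed_diffs, weights):
    """Accumulate the processed Ln images from the coarsest level upward,
    upsampling (US) the running sum to the next finer grid at each step."""
    acc = weights[-1] * processed_diffs[-1]                       # coarsest level first
    for ln, wn in zip(processed_diffs[-2::-1], weights[-2::-1]):  # next finer levels
        factors = (ln.shape[0] / acc.shape[0], ln.shape[1] / acc.shape[1])
        acc = wn * ln + zoom(acc, factors, order=1)               # weighted summation (xWn)
    return acc                                                    # the summed component
```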
The processing which is executed in the present ultrasound diagnostic device (particularly, the image processing unit 20) is summarized as described above. A specific example structure of the image processing unit 20 for implementing the processing described above will now be described.
The summed component generation unit 31 calculates a summed component Edge through the processing which will be described below. The summed component Edge which is calculated is input to the weighted summation unit 12-1 along with the diagnosis image Input.
The weighted summation unit 12-1 executes weighted summation with respect to the diagnosis image Input and the summed component Edge, to form the boundary-enhanced image Enh. The weighted summation is preferably performed using a parameter Worg according to the following equation, but is not limited to this example. The boundary-enhanced image Enh which is calculated is input, along with the diagnosis image Input, to the selector unit 13-1.
Enh=Worg·Input+Edge [Mathematical Formula 1]
The selector unit 13-1 receives the diagnosis image Input and the boundary-enhanced image Enh which are input, and performs selection such that the image selected by the user on the device is output as an output image Output. The selected image Output is output to the display processing unit 30.
The noise reduction filter unit 51 applies an edge-preserving filter, for example a filter known as a Guided Filter, to remove noise while preserving boundary information. This structure can reduce the noise information incorporated in the summed component Edge which is to be calculated through the processing described below. The filter is not limited to the edge-preserving filter of this specific example; a non-edge-preserving filter, such as a Gaussian filter, may also be used.
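As one illustration of such an edge-preserving filter, a minimal self-guided variant of the guided filter (guide image equal to the input image) can be written with simple box averaging; the window radius and regularization constant below are assumptions, not values specified by the embodiment:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def self_guided_filter(img, r=4, eps=1e-2):
    """Edge-preserving smoothing of img, using img itself as the guide image."""
    img = np.asarray(img, dtype=np.float64)
    size = 2 * r + 1
    mean_i = uniform_filter(img, size)
    var_i = uniform_filter(img * img, size) - mean_i * mean_i
    a = var_i / (var_i + eps)        # near 1 at strong boundaries, near 0 in flat areas
    b = (1.0 - a) * mean_i
    return uniform_filter(a, size) * img + uniform_filter(b, size)
```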
The data calculated by the noise reduction filter unit 51 are input, along with the data calculated by the sample direction DS unit 41, to the selector unit 13-2, which outputs data selected by the user on the device to a summed component calculation unit 101.
The summed component calculation unit 101 calculates a boundary image through the processing which will be described below, and inputs the boundary image to a sample direction US (upsampling) unit 61. The sample direction US (upsampling) unit 61 applies upsampling processing to the boundary image in the sample direction according to the method described below to calculate a summed component Edge having the same size as that of the diagnosis image Input which is input to the summed component generation unit 31. The summed component Edge thus calculated is input to the weighted summation unit 12-1.
The Gn components calculated in the multiresolution decomposition unit 111 are input, along with Gn+1 components, to corresponding boundary component calculation units 112-1, 112-2, and 112-3, which calculate Ln′ components having been subjected to non-linear processing, through the processing which will be described below. The calculated Ln′ components are input to a boundary component add-up unit 113, which generates a boundary image Ln″ component through the processing which will be described below.
While in the specific example described above multiresolution decomposition is performed three times to generate a Gaussian pyramid formed of the Gn components (0≤n≤3) and to calculate the Ln′ components (0≤n≤2), the present invention is not limited to this example.
While in the above specific example the highest hierarchical level is set to 3, the present invention is not limited to this example, and multiresolution decomposition may be performed over a range from level 0 to level n (n≥1). Further, while in the above specific example the multiresolution decomposition unit is configured to perform Gaussian pyramid processing, it may be modified to perform multiresolution decomposition using a discrete wavelet transform, a Gabor transform, a bandpass filter in the frequency domain, or the like.
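As an illustration of the wavelet alternative mentioned above, a multiresolution decomposition could also be obtained with a multi-level 2-D discrete wavelet transform; PyWavelets and the "db2" wavelet are used here only as illustrative choices, not components of the embodiment:

```python
import numpy as np
import pywt  # PyWavelets, used here only as an illustrative tool

def wavelet_decompose(image, levels=3, wavelet="db2"):
    """Multi-level 2-D discrete wavelet transform as an alternative
    multiresolution decomposition (approximation + per-level detail coefficients)."""
    return pywt.wavedec2(np.asarray(image, dtype=np.float64), wavelet, level=levels)
```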
The Gn component obtained in the multiresolution decomposition unit 111 is further input, along with a Gn+1 component, to the boundary component calculation unit 112, which calculates an Ln component from these two components.
In the case of normal Gaussian and Laplacian pyramids, an Ln component is output as a high frequency component, and calculation of a summed component using this Ln component as an output would result in a summed component Edge including excessive addition and subtraction. Accordingly, in the present embodiment, the Ln component is further subjected to non-linear processing in a non-linear transformation unit 121, to calculate an Ln′ component.
In the present embodiment, the Ln component may have either a positive value or a negative value. A negative value as used herein functions to impair information originally contained in the diagnosis image. Accordingly, in order to provide a desirable diagnosis image based on the information inherent in the original diagnosis image, it is desirable to apply non-linear processing having different properties for positive values and for negative values of the Ln component, such that negative values are suppressed more strongly than positive values.
Further, in the non-linear processing in the non-linear transformation unit 121, it is preferable to vary the parameters for each level n of the Ln component, which is a high frequency component.
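One possible non-linear curve with the properties described above (stronger suppression of larger absolute values, stronger suppression of negative values than positive values, and parameters varied per level) is sketched below; the soft-clipping function and the parameter values are assumptions, not the curve defined by the embodiment:

```python
import numpy as np

def nonlinear_transform(ln, level, saturation=(8.0, 12.0, 16.0), negative_gain=0.3):
    """Compute Ln' from Ln: a soft clip that suppresses larger absolute values
    by a greater amount, with negative values attenuated more strongly."""
    t = saturation[level]                    # parameter varied for each level n
    out = t * np.tanh(ln / t)                # nearly linear near the zero-crossing
    return np.where(out < 0.0, negative_gain * out, out)
```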
While in the specific example described above non-linear processing is applied in the non-linear transformation unit 121, the present invention is not limited to this example, and a structure may be adopted in which several threshold values are provided and linear transformation is performed between each pair of adjacent threshold values (i.e., piecewise linear transformation).
As described above, the non-linear processing applied to the Ln component makes it possible to suppress excessive addition and subtraction while sufficiently maintaining the boundary component near the zero-crossing. In the present embodiment, it is further desirable to multiply the component having been subjected to the above-described non-linear processing by a weight determined with reference to the Gn component, thereby adjusting the component, in order to reduce the excessive addition and subtraction that arises when significant addition and subtraction is applied to a portion already having sufficient contrast, such as a high luminance portion, and that causes glare in a posterior wall, for example.
While in the specific example described above a weight to the Ln component is determined with reference to the luminance value of the Gn component, the present invention is not limited to this example. For example, a weight may be determined with reference to a feature other than the luminance value, such as by setting a weight for a portion with a high edge intensity to 1 and setting a weight for a portion with a low edge intensity to 0, with reference to the boundary intensity.
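A sketch of the luminance-referenced weight of the main example, assuming 8-bit-like luminance values in Gn and a simple linear ramp that de-emphasizes high-luminance regions; the ramp endpoints are illustrative assumptions:

```python
import numpy as np

def luminance_weight(gn, low=60.0, high=200.0):
    """Weight near 1 for low-luminance pixels of Gn and falling toward 0 for
    high-luminance pixels, so that already high-contrast portions are not boosted."""
    return np.clip((high - gn) / (high - low), 0.0, 1.0)

# Adjusted boundary component: Ln' multiplied element-wise by the Gn-referenced weight.
# ln_adjusted = luminance_weight(gn) * ln_prime
```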
The L2′ component which is input is subjected to upsampling in an US (upsampling) unit 6101-2-1, and is then input, as an Ex (L2′) component, to a weighted summation unit 12-2 and an US (upsampling) unit 6101-2-2.
The weighted summation unit 12-2 applies weighted summation to the L1′ component and the Ex (L2′) component to generate an L1″ component. The weighted summation in the weighted summation unit 12-2 is preferably performed by a calculation using a parameter W2, according to the following formula, which is not limiting:

L″1=L′1+W2·Ex(L′2) [Mathematical Formula 2]
The component calculated in the weighted summation unit 12-2 is further upsampled in an US (upsampling) unit 6101-1, and is input, as an Ex (L1″) component, to a weighted summation unit 12-3.
The Ex (L2′) component input to the US unit 6101-2-2 is subjected to further upsampling processing to form an Ex (Ex (L2′)) component having the same image size as that of the L0′ component, which is then input to a high frequency control unit 131.
The high frequency control unit 131 removes a noise component from the L0′ component including a relatively large amount of noise, while leaving the boundary component remaining therein. More specifically, the high frequency control unit 131 calculates weighting such that, when the value of the Ex (Ex (L2′)) component is large, it is assumed that the component is a component close to the boundary and the weight is set to be close to 1, whereas when the value of the Ex (Ex (L2′)) component is small, it is assumed that the component is information of a position distant from the boundary of a large structure, and the weight is set toward 0. Further, the weighted value which is calculated is multiplied by the L0′ component, thereby reducing the noise component included in the L0′ component. The L0′ component with the noise component being reduced is input to the weighted summation unit 12-3.
While in the specific example described above the processing for reducing the noise in the L0′ component with reference to the Ex (Ex (L2′)) component has been described, the present invention is not limited to this example, and noise reduction processing may be performed with reference to a component having a lower resolution than the Ln′ component which is noted, for example.
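The high frequency control described above might be sketched as follows, assuming the weight is simply the magnitude of the Ex (Ex (L2′)) component normalized by an illustrative constant and clipped to the range [0, 1]:

```python
import numpy as np

def high_frequency_control(l0_prime, ex_ex_l2_prime, scale=10.0):
    """Suppress noise in L0' at positions far from large-structure boundaries:
    the weight is near 1 where |Ex(Ex(L2'))| is large and near 0 where it is small."""
    weight = np.clip(np.abs(ex_ex_l2_prime) / scale, 0.0, 1.0)
    return weight * l0_prime
```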
The weighted summation unit 12-3 performs weighted summation with respect to the L0′ component having been subjected to noise reduction processing in the high frequency control unit 131 and the Ex (L1″) component obtained from the US unit 6101-1, to thereby generate the boundary image L0″. The weighted summation in the weighted summation unit 12-3 is preferably performed by calculation using parameters W0 and W1, according to the following formula, which is not limiting:
L″0=W0·L′0+W1·Ex(L″1) [Mathematical Formula 3]
The component calculated in the weighted summation unit 12-3, that is, the boundary image L0″, is upsampled in the sample direction US (upsampling) unit 61 to form the summed component Edge.
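Putting the pieces of the summed component calculation unit together, a sketch of the combination of the Ln′ components into the boundary image L0″ might look as follows; the bilinear upsampling, the default weights, and the form used for the first combination step are assumptions consistent with the formulas above:

```python
from scipy.ndimage import zoom

def upsample_to(img, ref):
    """Bilinear upsampling of img to the pixel grid of ref."""
    return zoom(img, (ref.shape[0] / img.shape[0], ref.shape[1] / img.shape[1]), order=1)

def boundary_image(l0_controlled, l1_prime, l2_prime, w0=1.0, w1=1.0, w2=1.0):
    """Combine the Ln' components into the boundary image L0''.
    l0_controlled is L0' after the high frequency control described above."""
    l1_pp = l1_prime + w2 * upsample_to(l2_prime, l1_prime)             # L1'' (assumed form)
    return w0 * l0_controlled + w1 * upsample_to(l1_pp, l0_controlled)  # L0''

# The summed component Edge is then obtained by upsampling L0'' in the sample
# direction to the size of the diagnosis image Input.
```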
As described above with reference to the specific examples, the image processing unit 20 decomposes the diagnosis image Input into a plurality of resolution images, applies non-linear processing and weighting processing to the differential images obtained from those resolution images, and adds the resulting summed component Edge to the diagnosis image Input, thereby generating the boundary-enhanced image Enh.
In the field of circulatory organs, particularly in ultrasonography of the heart, for example, evaluation of the properties and forms of a tissue is regarded as significant, and an increase in the visibility of the tissue boundaries on the endocardial surface has therefore been desired. Conventional techniques, however, have a problem in that boundary enhancement not only enhances the endocardial surface but also increases noise in the heart cavity and glare in the posterior wall, producing an image that is not suitable for diagnosis.
The ultrasound diagnostic device according to the present embodiment described above, on the other hand, adds, to an ultrasound image of the examinee, a boundary image that is calculated from that ultrasound image and controlled so as not to introduce incongruity, so that a diagnosis image with increased visibility of the tissue boundaries and without incongruity can be generated.
While a preferred embodiment of the present invention has been described, the embodiment described above is only an example and does not limit the scope of the invention. The invention includes various modifications which do not depart from the nature of the invention.
10 probe, 12 transmitter/receiver unit, 20 image processing unit, 30 display processing unit, 40 display unit.
Number | Date | Country | Kind |
---|---|---|---|
2013-243475 | Nov 2013 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2014/080702 | 11/13/2014 | WO | 00 |